Winter 2005
Expanded Review System for ABLE Major Workshops
Anne L. Cordon
Over the last couple of years, the ABLE executive has been discussing ways to get even better feedback on conference workshops. Although we encourage participants to complete evaluations in each Major and Mini session, we realize that more in-depth comments would be very helpful to ABLE and individual presenters alike. Because the Proceedings Editor would work with presenters to incorporate applicable comments into the final edit, labs published by ABLE would have a stronger assessment component, further extending ABLE's credibility. We are working on the assessment rubric and on the mechanics for distributing and collecting these in-depth comments from conference participants.

It is not too late to voice your thoughts! Please send your comments and suggestions to Anne Cordon. We are particularly interested in your answers to the following questions:

1. Would you be willing to provide an in-depth review of some (one/two/three) of the workshops you attend at the ABLE conference? If yes:
2. How much time would you be willing to spend per review?
3. Selection of reviewers: drafted versus volunteer?
4. Timing of reviews?
5. Format of reviews?

During ABLE 2004 (Bowling Green State University), we polled conference participants' willingness. On the back of the usual evaluation form, we asked whether they would be willing to complete an in-depth review of that workshop. We also asked when they would complete the comments (during or after the conference), how much time they would be willing to spend, and what format the evaluation should take (electronic or paper).

Table 1. Summary of the responses to participating in in-depth peer review of ABLE workshops (columns: FORMAT of assessment; TIME commitment for assessment; EXPERTISE CRITERIA for assessment)