Labstracts
Winter 2005


Expanded Review System for ABLE Major Workshops

Anne L. Cordon
University of Toronto Mississauga
anne.cordon@utoronto.ca

 

Over the last couple of years the ABLE executive has been discussing ways to get even better feedback on conference workshops. Although we encourage participants to complete evaluations in each Major and Mini session, we realize that more in-depth comments would be very helpful to ABLE and individual presenters alike. Because the Proceedings Editor would work with presenters to incorporate applicable comments into the final edit, labs published by ABLE would have a stronger assessment component to further extend ABLE’s credibility.

We are working on the assessment rubric and the mechanics for the distribution and collection of these in-depth comments from conference participants. It is not too late to voice your thoughts! Please send your comments and suggestions to Anne Cordon. We are particularly interested in your answers to the following questions:

1. Would you be willing to provide in-depth reviews of some (one/two/three) of the workshops you attend at the ABLE conference? If yes:
___Major
___Mini
___either

2. How much time would you be willing to spend per review?
___ less than 30 minutes
___ 30-60 minutes
___ 60-120 minutes

3. Selection of reviewers (drafted versus volunteer):
___Do you think conference participants should be asked to review particular workshops?
___Should we leave it open for people to volunteer?

4. Timing of reviews:
___During the conference
___After the conference
___Both options available
___Doesn’t matter

5. Format of reviews:
___paper
___electronic
___either

During ABLE2004 (Bowling Green State University), we polled conference participants about their willingness to take part. On the back of the usual evaluation form, we asked whether they would be willing to complete an in-depth review of that workshop. We also asked when they would complete the comments (during or after the conference), how much time they would be willing to spend, and what format the evaluation should take (electronic or paper).

Table 1. Summary of responses to participating in in-depth peer review of ABLE workshops

FORMAT of assessment:
Online after conference: 53% (18)
Paper at conference: 12% (4)
Either: 12% (4)
No response: 26% (9)

TIME commitment for assessment:
Many people said 30-60 minutes, though some would spend longer.

EXPERTISE:
Most people preferred to review labs they know the most about.

CRITERIA for assessment:
A considerable range of comments, including: investigative, practical, economical, adaptable to many student populations, general biology topic (not too specific), scientific merit, novel approach.
