Critical Thinking and Biology
Ask any five university instructors what university “is”, and you’ll probably receive five different answers. Personally, I think it’s a place to learn to engage with arguments (not the aggressive kind; rather, the reasoning behind whether something is true) and to dissect, reject, rebut, and synthesize your own counterargument. This is “higher order thinking”, and it is not easily (if at all) measured by fill-in-the-blank or multiple-choice examinations. At ABLE, our raison d’être is largely to provide laboratories where students are challenged to think and troubleshoot. “Cookbook” labs are re-worked to create open-ended investigations. Students actually do science in ABLE science labs.
How, then, can you measure thinking skills? I believe one route (I’ll go out on a limb and say the best route) is to have students not only do the work of a scientist, but also write about it, again like a scientist. First-year students often have poor communication skills, particularly when it comes to essay-style (open-ended) writing. When I ask my students when they last wrote an essay, they often say it was more than a year ago. Students who come to my class after having taken a year or more off since high school might be three, four, or more years “out of practice”. My own surveys of student writing skills starkly show the range of student abilities, with the average work being hard to comprehend and incomplete. Workshops, assigned readings (e.g. Pechenik’s A Short Guide to Writing About Biology), and online resources (e.g. Labwrite: http://www.ncsu.edu/labwrite/) did little to improve student work. What did they need? Practice!
Ah, but there’s the rub. Instructors are often unwilling to grade papers intensively (that is, correctly and with lots of feedback), particularly when the work is, quite simply, bad. To become good, students have to write. They won’t write unless it’s graded (and to be honest, if I were a student asked to write “because it’s good for me”, I wouldn’t – like students today, I had competing demands on my time and energy). Students’ reluctance to write also has roots in the fact that they find it hard to do. It’s a cycle – students won’t write because it’s hard; they get better if they write, and so it becomes less hard; but they won’t write to get better because …
My lecturer colleagues seem to agree that student writing needs attention. When I floated the idea of their assigning and grading writing, they balked at the prospect of marking 150–300 essays (I wonder why?). Many classes have Teaching Assistants (TAs) who can help share the burden of marking, but in many cases the result is hurried feedback, because TAs often have very full schedules and many demands as well. In biology, grammatically correct, clear written communication may not be the forte of our instructional staff, and so the ability to give good, correct feedback may be limited.
Robert Hodson, a friend and colleague I met through ABLE, shares my concerns about the logistics of getting students to write and of helping shape their skills. We both agree that the optimal teaching situation is to have a competent instructor give correct feedback on student writing. However, we have differing resources: my laboratory instructors have workloads that preclude marking multiple papers (one high-resolution evaluation per student is what I ask for and receive), while Bob has a steady crop of instructors with the workload of multiple markings built in. In the last few years, we’ve come up with some opinions and methods for helping students come to grips with clear scientific communication.
Because I lack manpower, I use Calibrated Peer Review (CPR). This is an online system that gets students to create open-ended work. Participating students comment on and grade one another’s work anonymously, guided by a clever calibration algorithm, and excellent, good, and poor work is modeled. It’s a far cry from having a professional grader work on the material, but the sheer practice this system evokes has yielded demonstrably better final papers from students. CPR allows students to refine their writing skills through three exercises and then apply what they’ve learned to a final paper that is evaluated by an instructor. With these exercises inserted before the final paper, the quality of the final project is obviously better, although we haven’t tried to quantify this (it’s the opinion of instructors who’ve taught the course over the last few years with and without the CPR activities).
Bob’s experience is that TAs are sometimes unsure about how to grade work. He’s instituted “norming sessions” in which the instructors all work on the same paper and together establish marking criteria. His very scientific approach, which included an analysis of the efficacy of CPR, showed that the norming sessions had a large positive impact on grading efficiency and consistency (which is critical to grading any open-ended assignment). His work with CPR showed that it did not detract from students’ ability to write correctly; in one instance, there was a statistically significant improvement in student writing ability. Once again, though, students benefited from repeated practice attempts at the exercise.
Clear writing opens a window into the minds of students as they try to construct a model of science principles. Coaching them to do this well requires feedback, reward, and effort. Individual instructor biases can be attenuated with norming sessions, yielding more consistent evaluations of student work. While having instructors provide feedback to students is the best situation, sometimes alternative strategies, such as online practice activities, must be employed. Finding ways to allow students to practice and receive feedback allows them to create better end products, as one might expect, but the resources to allow these extra practice attempts are sometimes not easy to find.