SABER 2013 Impressions, Day 2

I am not alone! This is the realization I came away with after this morning's seminars and conversations. I have taken so many notes that I have used up my cell phone battery. In what ways do I mean that I am not alone? First, I am on track with how I am thinking about biology education research, and it aligns with approaches that other people are using. I am not coming out of left field. I kind of knew this, but it's nice to have reinforcement. Other instructors have the same questions that I do, and the approaches I am broadly considering seem to be legitimate. Second, I have taken part in several discussions about entrenched resistance to changing educational approaches in research institutions; this seems to be a problem from Texas to California and north to Vancouver. It is very validating to know that I share these challenges with other people and that solutions can potentially emerge from the many brains engaged in these questions.

This morning, I focused on attending talks that discussed classroom-level assessment strategies and their efficacy. There are some practices I would like to adopt in my own classroom. Here are a few that stand out:

Ella Tour is implementing a teaching approach aimed at enhancing critical thinking about primary literature at the Master's level. This ties directly into the research question that I described in yesterday's post. She identified that the most difficult aspects of interpreting literature are data interpretation, understanding terminology, and understanding methodology. I have frequently seen students struggle with methodology and designed a somewhat silly but very successful exercise for lower-division students to expose the importance of understanding methodology. To assess the impact on learning, Ella's students took a critical thinking skills test and a self-evaluation. What I found intriguing about her data was that student performance was not significantly different between the pre- and post-tests; however, students reported that they had acquired more skills. This is very similar to the pattern I observed in my own research for one of the course sections that we studied. It demonstrates one of the challenges in this type of research: why do we see an increase in confidence, or a perception of improvement, but no clear increase in skills? Do we need better tools to measure change?

David Gross described how his blended learning approach enhances student performance on exams and attitudes towards his upper-division biochemistry course. He compared student performance under two approaches: a standard lecture format vs. online lectures combined with in-class active learning (e.g., peer-to-peer instruction, team-based learning, iClicker polls). He found that the blended learning approach consistently enhanced exam performance and improved student attitudes towards the class. He mentioned that the approach seems to target students in the middle of the pack. He reported that the initial time investment is huge, but once in place the course takes about the same amount of time to instruct as a standard lecture course. One of the audience members brought up an interesting and potentially loaded question: how do we deal with grade inflation as we improve our teaching?

I sometimes hear students ask for “the right answer” or say “tell me the facts I need to know”. Thus, it was refreshing to hear Nancy Ruggeri’s description of using scientific uncertainty to engage students and enhance critical thinking, so as to overcome the common misconception that science is a fixed set of certain facts. This is especially important for topics in Nancy’s teaching portfolio, which includes evolution and anthropogenic climate change, and it extends into my interests in health claims and public misconceptions surrounding vaccination. In both cases, uncertainty is distorted by the media. Here is a funny cartoon about that. Nancy asks students to identify uncertainty and to analyze evidence that could influence a larger theoretical model. Her approach has helped students realize the need to be skeptical about data and models when evaluating scientific arguments.
Non sequitur: I had no idea that the platypus does not have a stomach!

I have been seeking an approach to help students model conceptual processes for a fourth-year Immunology course that I am teaching this fall. Elena Bray Speth described an approach that I might be able to adapt to my course. Her goal for her introductory biology class was to get students to think about why and how things happen in biology by asking them to build conceptual models using box-and-arrow diagrams. She uses the structure-behavior-function (SBF) theoretical framework described by Goel et al. (1996). She took the models produced by the students and looked for patterns in students’ perceptions. This allowed her to expose areas in which students hold misconceptions (e.g., in linking genotype to phenotype, students do not know where alleles come from and cannot articulate the connection between the two). I am planning to look into this approach a little more deeply and hope that I can adapt it to my own practice.

Nienke van Houten