Showcasing iClicker Epidemiology at the Teaching and Learning Symposium

For the first time in three years, the SFU Teaching and Learning Centre offered a symposium on Leading Change @ SFU. It took place on May 16th and 17th and comprised plenary sessions, workshops, and panel discussions at the Burnaby campus.

Nienke vH. and Mark L. hosted a workshop entitled "iClicker epidemiology: giving students hands-on experience in the scientific method with student-directed data sets and ideas". The goal of the workshop was to demonstrate iClicker as a tool for generating questions and datasets that students then analyzed using the scientific method. The workshop was attended by ~20 participants from departments including Mathematics, Health Sciences, Psychology, SIAT, and others; many (63%) had no experience using iClicker in their classes and most (90%) were not familiar with the data-filtering tool in iClicker. Participants had the opportunity to test iClickers themselves and observe the data collection and filtering process.

Context and rationale for exercise

The iClicker scientific method exercise was developed for HSCI 100 – Human Biology, an introductory course with no prerequisites or lab. Approximately 80% of the students enrolled in the course are from other faculties, and this represents one of their few experiences in science. Thus, the goal of the exercise is to reinforce the theme of scientific methodology in the absence of a lab.

Nienke developed the exercise for the Fall 2011 offering of HSCI 100. The exercise was inspired by conversations with Charlie G., who uses student-directed datasets in his statistics courses, an approach that is commonly used in statistics teaching (Stedman, 1993; McGowan and Vaughan, 2011). Since iClicker was already being used in the class, it was a natural extension for collecting student data and an engaging way to apply the theme of 'scientific thinking'. Nienke has not been able to find reports that describe this approach for biology, so it is possible that this is a novel application of iClicker.

In Spring 2012, Mark refined the exercise by adding a pre-assessment component and increasing the amount of class time devoted to the exercise. To prepare for the workshop, Nienke and Mark systematically reviewed the student responses to the exercise and combined their experiences to showcase the method.

iClicker epidemiology: the exercise

Flow of exercise

Both Nienke and Mark provided the students with a big-picture question; specifically, they asked about independent variables that could influence exam scores. Students offered suggestions in class or tutorial, and the most frequent ideas were used. Variables suggested by students included: commute time, volume of coffee consumed, year in school, total hours spent studying, hours of exercise per week, whether they had taken Biology 12, ESL status, lecture attendance, number of courses taken, and hours worked per week. The series of questions was then entered into the demographics question list in iClicker. Click for iClicker epidemiology instructions.

In class, students first answered the 'big picture' question. Figure 1 shows the resulting distribution of average exam scores for the first two midterms in Fall 2011.

What was your average exam score for the first two midterms?

  1. 85 – 100%
  2. 75 – 85%
  3. 65 – 75%
  4. 50 – 65%
  5. Below 50%

Students then answered questions based on the independent variables that they had selected.

For example, how frequently did you attend lecture?

  1. Most classes
  2. Once per week
  3. Once every two weeks

A similar chart was generated for lecture attendance, and the two charts were then merged using the data-filtering option in the iClicker software (shown in Figure 2). This process was repeated for each independent variable. Data figures were distributed in tutorial for analysis by students, guided by a list of questions based on the steps of the scientific method. Click for Sample questions PDF.
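Conceptually, the merge step performed by the data-filtering tool is a cross-tabulation of two polling questions: each student contributes one (exam score, attendance) pair, and the merged chart counts students per combination. As an illustration only (this is not the iClicker software, and the response data below are hypothetical), a minimal sketch:

```python
from collections import Counter

# Hypothetical clicker responses: one (exam_score_bin, attendance_bin)
# pair per student, as would be collected by the two questions above.
responses = [
    ("65-75%", "Most classes"),
    ("85-100%", "Most classes"),
    ("50-65%", "Once per week"),
    ("65-75%", "Most classes"),
    ("Below 50%", "Once every two weeks"),
]

# Cross-tabulate: count students in each (score, attendance) cell,
# analogous to merging the two charts with the data filter.
crosstab = Counter(responses)

for (score, attendance), n in sorted(crosstab.items()):
    print(f"{score:>10}  |  {attendance:<20}  |  {n}")
```

Each row of the output corresponds to one bar in a merged chart; with real class data, patterns such as higher score bins clustering under "Most classes" would become visible for students to interpret.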

Nienke’s impressions

When I reviewed the student reports from the exercise, I observed that students misread graphs, made comparisons that were not supported by the data, confused the control and experimental groups, and used vague terms such as 'better', 'some' or 'most'; only one group mentioned the importance of sample size when analyzing the results. Thus, the exercise became a tool for revealing gaps in instruction and learning, which will significantly inform my future practice for this course.

Both TAs and students enjoyed the exercise, and students appeared to be genuinely interested in the data set since they had taken part in its generation. They also seemed more capable of spotting weaknesses in the data collection methodology because they had taken part in the process. Future iterations of this exercise could include groups of 'control students' who work with datasets in which they have no personal investment, to assess whether the student-directed approach is more effective.

Mark’s impressions

My experience directly benefitted from Nienke's experience in the Fall semester as well as input from an FHS T&L conversation. Based on this, I introduced a 'pre-assignment' that was a condensed version of the actual assignment for the iClicker scientific method exercise, made possible by using data obtained by Nienke and her Fall class. The 'pre-assignment' was given to students the week before the actual exercise and was intended to help them prepare and think about the parameters of the scientific method they were soon to employ. It was not graded, both to provide a no-risk setting for learning and to give us a 'before' and 'after' view of students' understanding of the scientific method.

While students could easily list the steps of the scientific method and describe possible flaws such as bias in the 'pre-assignment', they had more trouble interpreting data figures and drawing fair conclusions. Most also had difficulty stating a null hypothesis. On the actual exercise a week later, with the data collected in class on their own questions, students had a much better grasp of the figures and could make more reasoned interpretations. In addition, they displayed a good appreciation of control groups and of the limitations and flaws in our approach, and many offered excellent suggestions for improving the iClicker survey design as well as more sophisticated studies or experiments to test their hypotheses. Especially notable was the gain in stating a valid null hypothesis. In the future, I would like to obtain more quantitative results on student learning and the gain in understanding before and after using iClicker to apply the scientific method.

TA and student impressions

Teaching assistants and students have given encouraging feedback on using iClickers to apply and learn the scientific method in class. A student and a TA described their experience with the method at the workshop and indicated that the hands-on experience and the relevance to their own lives were important aspects that gave the approach more credibility than '2-dimensional' textbook descriptions. Perhaps the biggest drawback was that they felt even more time could be devoted to integrating the scientific method via iClickers in the course; this possibility will be explored in upcoming semesters.


References

McGowan, H. M., and J. Vaughan. 2011. Testing a student generated hypothesis using student data. Teaching Statistics 34(2): 61–64.

Stedman, M. E. 1993. Statistical pedagogy: Employing student generated data sets in introductory statistics. Psychological Reports 72(3): 1036–1038.

By N. van Houten and M. Lechner