The Educational Revolution and Some Tips for Using ChatGPT in University Courses

Let me begin by saying I don’t know much about educational practices and theory in China or India or Nigeria, etc.: these comments apply only to Western settler North American pedagogical theory and practice.

Educational specialists here have been pushing teachers to be the "guide on the side," not the "sage on the stage," since the 1990s. However, in North American higher education, student-centred learning and experiential learning are more often given lip service than put into practice: honoured and advertised, but not widely implemented. A minority of instructors have committed to unlearning the ways they were themselves taught and implementing classroom practices more likely to lead to deep learning and understanding rather than a temporary and superficial acquisition of knowledge. We (I count myself in this group) have read, talked, supported each other, and experimented in our classrooms with what works best for our own fields and disciplines. At the same time, many of our colleagues have clung to the lecture as a delivery system for knowledge and as a way to model disciplinary knowledge acquisition.

In the world of Wikipedia, YouTube, and now ChatGPT, though, students can find information and models everywhere. If they don’t understand a professor’s explanation of Derridean deconstruction or solenoid circuits, they’ll google it and watch a few videos at 1.25 speed made by other faculty and students. Universities rely on the cachet of the PhD and the expertise of the instructor to keep students paying to come to listen to a live version of the information they’ll need to become literary critics or electrical engineers. Because salaries for continuing faculty with PhDs are large, many universities welcome the idea of online courses in which faculty make their own videos and students watch them on their own time, hoping that they can increase class sizes and eventually sell the course without paying the instructor/content creator.

But why should students pay massive amounts of money to listen to information they can get easily and quickly elsewhere and at any time? Education then becomes only about the degree, so why not cut corners and avoid classes, use AI to do the rote work, and get the degree? If all we’re offering is a slightly different take on available knowledge or a promise that in your graduate work you’ll begin creating new knowledge yourselves, then no, students shouldn’t be paying massive amounts of money for a BA or BSc.

In the AI revolution, education in the K-12 systems needs to be more hands-on and experiential. Homework will either disappear or become more creative and knowledge-generating, so that students can’t use ChatGPT to do it for them. AI literacy education will need to start in the middle school grades. And, in higher education, we similarly need to use our contact hours for hands-on work, for discussion and debate, for individual attention from the expert instructor coaching student learners who are themselves producing new knowledge—designing new experiments or writing original criticism, and publishing the results. Instructors will be advisors, checkers, editors, motivators, coaches helping students develop skills. The lecture is moribund, though many people will want to give it palliative care rather than assisted death. Essays and other out-of-class assignments will need to be more creative, require critical thought and originality, be individualized to students’ contexts, and anticipate the use of AI assistance.

I personally welcome this big revolution in how the average university instructor designs classroom activities and assessments! The trouble is, to do this well means smaller class sizes, not the cost-saving mega-lectures and online courses that institutional administrations would prefer.

At the moment I see two main paths forward: 1) we teach more effectively in order to give a more useful learning experience, which means smaller classes and higher tuition—and, without more government subsidy, a more elitist and less accessible education!—or 2) we become another YouTube channel, offering videos from behind a paywall, which means larger classes and lower tuition, but the learning students need to do becomes much harder to achieve because they’re left on their own to understand the material and develop skills … or they just use AIs to produce their assignments and never learn anything. Like the business and future tech author of this blog post and like many folks in Education, I see the best path as the one with small classes and good wages for teachers, but as a pessimist, I don’t see us getting there easily. Let’s be really vocal about the need for good education for as many people as possible, and for original critical thought in the copy-paste world that AIs give us. Institutionalized education is not dead, but it needs to change more quickly and more completely than it has for the past thirty years. Vive la révolution!

I've been experimenting with ChatGPT to find out what it can and can't do—for details, see the preceding blog post. I’ve also been watching videos such as this one and reading reactions from teachers like these, trying to get my head around how to design classroom activities and assessments when our students have access to powerful AI programs that can write essays and write code. And here are some suggestions I’ve gleaned from others (including the pieces I’ve linked to, other online sources, and some confidential list-serv discussion) and then made more detailed and specific.

Help students develop AI literacy

Note from Mar 14: I found out today that to use ChatGPT in class you’re supposed to have a PIA (privacy impact assessment). PIAs have been Canadian law since Nov 2021, but there are four levels of complexity; for demonstrating an external, web-based tool (not in Canvas) in class without inputting any personal information, approval is just an email to the right person.

  • after teaching what a good essay looks like in your discipline, demonstrate ChatGPT in class and have a discussion about what it does well (grammar, organization) and what it doesn’t do well (style, conclusions, original thought, citations)
  • have ChatGPT write a paper, project it on screen or post it for the class in Canvas, then have students critique it and see if they can use it to develop a better and more original argument (or if it’s just too boring and unoriginal)
  • compare a list of research articles generated by ChatGPT to one from the MLA International Bibliography (or whatever database is best in your discipline)
  • discuss how academic integrity relates to using ChatGPT, Wikipedia, Chegg, tutors, etc. in your discipline
  • teach students how to read an academic article in your discipline, then have students read an academic article outside of class; afterwards, use ChatGPT in class to summarize the article and have students critique its summary
  • have students do some in-class writing, have them do peer feedback, then have them ask ChatGPT to give them general feedback and see how that compares; next, have them ask ChatGPT to identify their grammar errors and see what happens (chances are good their peer feedback on writing will be better, but ChatGPT will find more grammar errors)
  • make students aware of the ways in which ChatGPT is embedded in systemic racism: specifically, how it defines "correct" English and how it reproduces widespread white ideologies of various kinds (I asked it to write a one-act play about refugees with three characters, two of whom were refugees, and got a wince-inducing piece about a white female landlady and two Middle Eastern immigrants who are excessively grateful to her.)

Change how you design written assignments

  • scaffold written assignments so students are working in class on drafts at different stages and getting feedback from peers or the TA/instructor at each stage that they must take into account in the next stage (an unscrupulous student could still ask ChatGPT to "revise using the following criteria")
  • add a required “acknowledgments” section to all written assignments, so that students can say they began with Wikipedia and ChatGPT to do their initial research, had a research librarian's help in finding better sources, got idea X from a peer reviewer, and had their mother proofread their paper
  • offer more creative, personal, and/or original assignments, and avoid assignments that merely ask for a summary of existing knowledge
  • consider using some form of portfolio grading, oral exams, or ungrading, keeping in mind that students have access to AI anywhere there’s an internet connection
  • find ways to publish students’ completed, revised, and polished work online at the end of term, so they’re doing authentic work and face more consequences if it later turns out they used AI to produce it (at SFU, the library’s Digital Publishing department will publish work from your course; I’ve worked with them on several projects)

Change how you do in-class work

  • have students work in class on written assignments while you circulate through the room looking at sentences and asking students questions about them
  • have discussions, not lectures; keep track of discussion points on screen and, at the end, comment on how the discussion relates to or builds on current thinking on the subject
  • use peer reviews, small group discussions, think-pair shares, jigsaw groups, debates, role playing exercises, etc., etc.

For more, here's a link to a useful list of short-term fixes from Cynthia Alby, co-author of Learning that Matters.

******************************************

Oh, and I asked ChatGPT "what are the best classroom techniques to integrate ChatGPT into university courses and make students aware of the limitations of ChatGPT-generated essays?" Here's what it gave me:

Here are a few strategies that could be effective in incorporating ChatGPT into university courses and making students aware of its limitations:

  1. Use ChatGPT as a tool, not a replacement: ChatGPT can be a useful tool for generating ideas and helping students get started on writing assignments. However, it should not be used as a replacement for student writing. It's important to make it clear to students that the essays generated by ChatGPT should not be submitted as their own work.

  2. Discuss the limitations of ChatGPT: Make sure to discuss the limitations of ChatGPT with your students. This can include the fact that it is not capable of original thought, and that the essays it generates may contain errors or inconsistencies.

  3. Use ChatGPT in conjunction with other AI tools: Consider incorporating other AI tools, such as grammar checkers or plagiarism detectors, into your coursework to give students a well-rounded understanding of the capabilities and limitations of AI in education.

  4. Encourage critical thinking: Encourage students to think critically about the essays generated by ChatGPT and to consider how they might be improved. This can help students develop their own writing skills and become more aware of the limitations of AI-generated content.

  5. Stay up to date: Make sure to stay current on the latest developments in the field of AI and how it is being used in education. This will help you and your students stay informed about the latest capabilities and limitations of ChatGPT and other AI tools.