Hi Erin,
I love this idea! I'm definitely adopting it.
Best, Ronda
Dr. Ronda Arab
Associate Professor of English
Simon Fraser University
pronouns: she/her

From: Erin Barley
Sent: 29 December 2022 11:34:44
To: Toby Donaldson; Ronda Arab; Steve DiPaola
Cc: Behraad Bahreyni; Nicky Didicher; James Fleming; Sam Black; academic-discussion@sfu.ca
Subject: Re: Some advice please re. ChatGPT

Hi all,
One tool I have found helpful for addressing academic integrity is asking students to write an Acknowledgements section for their assignments. This is distinct from a reference list, and could include things like: website X helped me understand concept Y, my friend Q helped me brainstorm ideas, my mother proofread a draft, etc. Academic integrity involves a bit of a grey area, and my intention was to ask students to be transparent about that grey area. (Context: I introduced this during COVID, when students had open-book assignments but were expected to write 'original' answers.)
The Acknowledgements section proved to be very helpful. I had several cases where students either copied from each other or from sources like Chegg and did not list this in their Acknowledgements section. This made it easier for me to deal firmly with these cases: they had very clearly been dishonest. I also learned from the Acknowledgements sections how much students were helping each other out in social media groups. There was no penalty (they were honest), but I used this as an opportunity to further clarify for the class the boundary between how much help was OK and not OK. An Acknowledgements section is not a full solution to this complex problem, but it could be a useful tool in the academic integrity toolkit.
Erin
From: Toby Donaldson <toby_donaldson@sfu.ca>
Sent: Wednesday, December 28, 2022 10:36 PM
To: Ronda Arab; Steve DiPaola
Cc: Behraad Bahreyni; Nicky Didicher; James Fleming; Sam Black; academic-discussion@sfu.ca
Subject: Re: Some advice please re. ChatGPT

Hi all,
I've had a bit of experience with Github Copilot, which is similar to ChatGPT,
but trained specifically for programming. Copilot does remarkably well: it
regularly gives perfect answers to common basic programming questions.
It can also (with varying degrees of success):
- write documentation, examples, and test cases
- convert from one programming language to another
- "clean up" code to make it more readable
- explain a piece of code in step-by-step English (useful for beginners)
Another example: I cut and pasted my introductory programming midterm into it, and
it scored 35/37 (maybe my midterm is too easy!). A year ago this was science fiction.
And now it's routine, just another feature in my text editor.
Of course, it's not perfect, and does make mistakes. But overall I find that it saves me
time, and I've learned a lot from its suggestions.
I'm excited to see how teachers and students use Copilot. I think it could
improve the quality of introductory programming courses by giving students more
feedback more quickly.
Copilot has about a year head-start on ChatGPT as a practical tool, and may be a
sign of things to come for Word and PowerPoint. Features like this could become
standard: "summarize this paragraph", "polish this paragraph", "add more
examples", "write this paragraph in plain English", ....
I am less excited to see what scammers and spammers will do with ChatGPT-like
systems. Junk email and text messages may become a lot harder to recognize.
-- Toby
Appendix
========
For those who want more details of Copilot, here's an example. In a text editor
in a C++ program file, I typed this:
// count all s, f, and u characters in a string
int
In about 1 second it suggested this:
// count all s, f, and u characters in a string
int count_sfu(const string& s) {
    int count = 0;
    for (int i = 0; i < s.size(); i++) {
        if (s[i] == 's' || s[i] == 'f' || s[i] == 'u') {
            count++;
        }
    }
    return count;
}
That's a great answer. It even chose a good name and formatting style.
It can also create examples. I typed this after the above function:
// count_sfu("
Then it suggested the following, a line at a time:
// count_sfu("sfsu") == 3
// count_sfu("sfu") == 3
// count_sfu("sfs") == 3
// count_sfu("sf") == 2
// count_sfu("s") == 1
// count_sfu("f") == 1
// count_sfu("u") == 1
// count_sfu("") == 0
// count_sfu("abc") == 0
All correct and reasonable test cases that you might put into a real program,
and I did little more than press the TAB key to get the next line.
Another trick that Copilot can do is summarize code. If I highlight the above
program, it gives me this summary:
Here is the explanation for the code above:
1. The function `count_sfu` takes a string as a parameter and returns an
integer.
2. The function declares an integer variable `count` and initializes it to
0.
3. The function uses a for loop to iterate through the string.
4. The function increments the count variable if the character at the
current index is an 's', 'f', or 'u'.
5. The function returns the `count` variable.
While hardly perfect, this is serviceable text that could be very useful for
beginners. They can get near-instant English explanations of *any* code without
needing to talk to a teacher or TA.
--
Toby Donaldson
School of Computing Science
Simon Fraser University

From: Ronda Arab <ronda_arab@sfu.ca>
Sent: Wednesday, December 28, 2022 5:35:17 PM
To: Steve DiPaola
Cc: Behraad Bahreyni; Nicky Didicher; James Fleming; Sam Black; academic-discussion@sfu.ca
Subject: Re: Some advice please re. ChatGPT

Thanks, Steve,
That’s a helpful and interesting overview. I appreciate the perspective.
Best,
Ronda
Sent by magic
On Dec 28, 2022, at 5:05 PM, Steve DiPaola <sdipaola@sfu.ca> wrote: