Do Chatbots like ChatGPT Encourage Students to Cheat?
Since its launch last November, ChatGPT—a chatbot created by an artificial intelligence research lab called OpenAI—has dominated news headlines. With just a few prompts, it generates text that is indistinguishable from what a person would write. And, well, it has raised a red flag.
Many are worried that text-generating AI will have such a wide-reaching impact that it will severely compromise a person’s ability to write coherently, creating a world of—as one Fortune headline descriptively put it—“lazy plagiarists.” So, where does that leave us? Ursinus Magazine asked two faculty members for their quick thoughts on the ChatGPT revolution.
Chris Tralie is an assistant professor of math and computer science. Talia Argondezzi is the director of Ursinus’s writing and speaking program. Here are their thoughts:
So, should we be worried about ChatGPT?
Argondezzi: ChatGPT and other AI tools can produce pretty impressive simulacra of human writing, but there’s no reason to take a newly adversarial approach to student writing; most students are genuinely earnest in their desire to learn. There’s no big cause for alarm. For as long as I’ve been teaching, there have always been a certain number of students who look for ways to cut corners. But most students don’t, and any new technology is unlikely to change that proportion.
But how accurate is it? And can it be detected?
Tralie: ChatGPT and other large language models have been called “stochastic parrots” because they are fed most of the Internet, and they spit back out a random traversal through related words and sentences that sound convincing given a particular prompt. Convincing is the key word here. I personally find it hard to trust the outputs of these models because they were not necessarily trained to be truthful or trustworthy, but to model some kind of “average consensus”—with randomness—of what happened to be on the Internet when they were trained.
Argondezzi: This technology does make it harder to prevent and detect plagiarism. It’s not just that students can easily use AI to generate papers. Savvy students can instruct the AI to include the kinds of mistakes that will make it seem plausible the papers were written by an ordinary person. It will also become more difficult to “plagiarism-proof” assignments. Including personal angles and current events might help, but the AI currently can fabricate personal anecdotes relatively well and is able to read whatever contemporary articles or issues you ask students to interact with.
Tralie: There may be a ceiling to how well systems are trained to generate “intelligent” answers. Right now, the technology still lacks nuance, and organizations are working to improve these models using reinforcement learning from human feedback. I do believe that these technologies will continue to improve.
Are you concerned that students will use AI to write papers?
Argondezzi: The best deterrent of plagiarism of any kind is to offer students assignments they care about. Ursinus is in a great position to help students navigate this technology and maintain their engagement with our course content and their own writing. The very best way to prevent students from using AI to write their papers is to make essay-writing meaningful to them. Most students will put time into tasks they consider “worth it.” It seems like a good idea to talk with our students about AI-generated writing—how they use it, what uses they think are helpful and ethical within the context of a college education and in other contexts, and whether they think faculty should encourage it, forbid it, or something in between.
Tralie: If the goal is to get students engaged and not just mindlessly parrot a prompt into ChatGPT and turn in the answer, we must make writing prompts incredibly bespoke. I think CIE [the Ursinus Common Intellectual Experience] is already naturally set up well for this because we brainstorm new prompts every semester. We should begin to have as many discussions as possible with students, both about the value of learning the skills for themselves and about having them explain the work they have submitted.