AI plagiarism is spreading in US colleges. It’s left professors feeling confused and exhausted.
A professor says AI cheating is like a “virus” in US colleges.
When Darren Hick, a philosophy professor, first came across an AI-generated essay in late 2022, he knew it was just the start of something bigger.
Almost two years later, Hick says the use of AI among students has become a “virus.”
“All plagiarism has become AI plagiarism at this point,” Hick, who teaches philosophy at Furman University, told B-17. “I look back at the sort of assignments that I give in my classes and realize just how ripe they are for AI plagiarism.”
Students were among the earliest adopters of AI text generators once they realized the tools could produce essays from scratch and help with assignments.
This quickly resulted in a rise in plagiarism, false accusations of cheating from educators, and a new atmosphere of distrust between students and professors.
Schools and universities initially tried to combat the flood of AI-generated plagiarism by banning the technology outright. Now, many are trying to incorporate it into their curricula and encourage students to use it responsibly.
But the mixed messaging and lack of guidance have left professors and students fatigued and uncertain about how they should be using the tech, professors say. Widespread use of AI, they add, is becoming harder to spot and even harder to police.
“There’s no top-down national guidance,” Christopher Bartel, a philosophy professor at Appalachian State University, said. “There isn’t even, at the university level, top-down guidance on it.”
“It comes down to departments or sometimes individual instructors — there’s a lot of confusion over that,” he added.
Back to basics
Some professors have radically transformed their teaching to try to stamp out the use of AI among students.
Bartel said he has re-introduced in-class, handwritten assessments.
“Over the past two years, I’ve changed the way that I grade things. I have gone back to book exams,” he said.
Before OpenAI launched ChatGPT in late 2022, Bartel said 100% of his classes consisted of take-home essays. Now, only around 30% of his classes are essay-based. Even then, he usually reserves take-home essays for the upper-level students doing more refined work.
“The thing that I’ve been most surprised with is how happy the students are with it. They didn’t push back or get upset about it,” he said. “I told them the problems that I was having with students using ChatGPT, and they were very reasonable.”
Spotting AI-generated content is also becoming increasingly difficult as models improve, Hick said.
“A lot of the tells are gone now. There are other little things that are giveaways, but they’re subtler now,” he said. The AI detectors on offer are also not perfect and run the risk of penalizing students who have done nothing wrong.
In its FAQs for educators, OpenAI acknowledges that there is no surefire way to distinguish between AI and human-made content. In answer to a question about whether AI detectors work, OpenAI wrote, “In short, no.”
“While some (including OpenAI) have released tools that purport to detect AI-generated content, none of these have proven to reliably distinguish between AI-generated and human-generated content,” the company previously said.
‘Spinning plates’
It has been almost two years since ChatGPT arrived on the scene, but colleges across the US are no closer to solving the issues surrounding AI plagiarism.
In some fields, there is a growing acceptance of the use of AI.
“It’s like the Wild Wild West right now because the technology is still evolving, and the use cases are still evolving,” Adam Nguyen, founder of tutoring company Ivy Link, said. “Universities are still trying to figure out what to do with it, so they’re leaving it to the individual departments and professors to come up with their own policies in many cases.”
Hick said he had been surprised by how many teachers have embraced AI and are trying to make the most of it in their classes.
While he acknowledged there are some acceptable uses, such as students with English as a second language using the tech to polish their essays, he said the shift is still a minefield for professors trying to determine what is and isn’t plagiarism.
“It’s a big change because there hasn’t really been a technology that fundamentally changed the way that we work since Google,” Bartel said.
The detective work involved in uncovering AI plagiarism is also adding hours of extra work to educators’ already crowded schedules.
“It feels like we’re spinning plates or something,” Hick said. “We’re going to get more tired. You can only keep those plates in the air for so long before you’re just exhausted.”