Turnitin launched its AI detection tool in April 2023 and has since reviewed more than 200 million student papers, finding that 10.3% included at least 20% AI-generated content. Additionally, 3% (more than 2 million papers or other written assignments) consisted of at least 80% AI-generated content.
An independent, Turnitin-sponsored survey of educators and students on AI use in higher education also found that between spring 2023 and fall 2023, the number of students who said they used AI at least once a month increased by 22 percentage points, rising from 27% to 49% of respondents.
These findings are consistent with other recent data on AI use, including a recent survey of college students from Intelligent.com, which found that 37% of students used AI and that 29% of those students used it to generate entire papers.
Anecdotally, I've noticed a steady upward trend in the number of AI-generated papers I see in the introductory English courses I teach.
I recently spoke with Patti West-Smith, Senior Director of Customer Engagement and Experience at Turnitin and a former director. She discussed what this recent Turnitin data on AI cheating means and what we as teachers can do to protect academic integrity and, more importantly, the student learning that happens through writing.
<h2 id="are-1-in-10-students-really-using-ai-to-cheat-xa0″>Do 1 in 10 Students Really Use ai to Cheat?
Not quite. Although Turnitin data found that about 1 in 10 submitted papers contained at least 20% AI-generated content, West-Smith isn't particularly concerned about those papers, because that level of AI writing in a paper could stem from legitimate use.
“Students who have language difficulties might be looking for a little help or have used it for research, and potentially didn't know to cite it depending on the instructor, the institution and their requirements,” she says.
<h2 id="what-about-the-3-of-papers-that-were-more-than-80-ai-generated-xa0″>What about the 3% of articles that were more than 80% generated by ai?
These 2+ million papers are more concerning. “That indicates that AI is replacing the student's own thinking,” says West-Smith.
This is a problem for several reasons. “You don't want a student to receive credit for work they didn't complete, and from an assessment perspective, that's really important,” she says.
But more important than academic integrity is how the student is shortchanging their own learning process, West-Smith says. “Writing is a tool for thinking. It is the way the brain makes sense of information. And if you outsource that to AI on a regular basis at a large scale, like 80% of the writing, then what that tells me is that as a student, you're completely disconnected from that learning process. You've essentially outsourced it to a contractor.”
<h2 id="has-ai-cheating-replaced-other-forms-of-cheating-xa0″>Has ai cheating replaced other forms of cheating?
I've written about how the prevalence of AI-generated work in my classroom has cost me a lot of time. The upside, however, is that I have started to see fewer cases of traditional plagiarism. Unfortunately, this appears to be a fluke.
“We theorized that we would potentially see this dramatic drop in the more classic cases of plagiarism,” says West-Smith. So far, Turnitin's data has not borne this out. “We are seeing a lot of text similarity. I think one reason is that, in some cases, text similarity is not intentional plagiarism. There is a huge skill deficit that leads to it: students who do not know how to paraphrase correctly, students who don't understand quoting.”
<h2 id="what-can-i-do-to-prevent-ai-writing-in-my-class-xa0″>What can I do to avoid ai writing in my classroom?
This is one of the most pressing issues in education. For her part, here are West-Smith's suggestions:
- Institutions must have clear guidelines on the use of AI that are communicated to teachers and from teachers to students.
- Institutions should also have clear guidance on whether and which AI detection tools are used, and on what educators can do with the information these tools provide. Because of the possibility of false positives, educators should treat AI detection scores as just one data point in evaluating whether a student used AI.
- Educators should educate themselves about AI and know the strengths and weaknesses of the tools.
- Instructors should communicate specific class policies around AI, since many instructors allow some AI use cases and prohibit others, and these policies can vary from class to class.
Ultimately, West-Smith says the communication component is critical and very easy to overlook.
“A mistake we sometimes make as instructors is that we assume that students have the same value systems we do, and that they will implicitly know what is right or wrong from our perspective. And in my experience, that is almost always not true,” she says. “The moment you assume that students believe the same things you believe, you are already at a level of miscommunication.”