‘The only winning move is not to play’

June 5, 2023

UM-Dearborn’s decision to opt out of a controversial AI-writing detector earlier this year was not the first time the university took the lead in protecting students’ digital rights.

Photo of students working on an assignment in the CASL Atrium
Students often gather in the CASL Atrium to work on final projects and papers.

Turnitin – the plagiarism-detecting software used by UM-Dearborn and thousands of other schools and universities worldwide – made a big announcement in February: it would soon integrate an AI-writing detector into its platform. The rationale for the add-on was clear. Since the release of ChatGPT in November 2022, generative AI tools have taken off. Like the advent of the World Wide Web in 1990, this potentially society-altering technology was released into the world before regulators could formulate or coordinate responses. The implications for higher education were obvious: reports of students using the new tools to write exam answers, or even entire essays, soon followed.

Still, when UM-Dearborn Director of Digital Education Christopher Casey saw Turnitin’s announcement, his first instinct was, “Whoa.” Like the technology it was designed to catch, the AI-detection software was being rolled out quickly and would inevitably have flaws. Indeed, the company’s own press release stated the software had a one percent error rate. The Office of the Provost estimates that UM-Dearborn faculty run 20,000 student writing samples through Turnitin each semester. At a one percent error rate, up to 200 of those assignments could be incorrectly flagged for an academic infraction each semester – and that was assuming the company’s projection was accurate.

What’s more, the “black box” nature of the detector means that the software does not explain how it arrives at its conclusions. For any student whose work was called out, it would be their word against the machine’s. “There’s no hard evidence,” said Casey. “It’s just this mystery. If a student says, ‘No, I didn’t do that,’ how do they refute that? It is just a student versus this detector.”

Photo of Chris Casey
Director of Digital Education Christopher Casey

Casey’s colleagues in the Office of the Provost and the Hub for Teaching and Learning shared his concerns. So, when the add-on launched in April, UM-Dearborn joined several universities in the U.S. and abroad in requesting an opt-out. Turnitin did not grant every opt-out request, but it granted UM-Dearborn’s – in part, Casey believes, because the university was proactive and persistent. The move garnered national media attention, with Associate Provost for Undergraduate Programs and Integrative Learning Mitchel Sollenberger quoted in a Washington Post article that made the issue clear right in its headline: “We tested a new ChatGPT-detector for teachers. It flagged an innocent student.”

This was not the first time UM-Dearborn made news for being out front in protecting students’ digital rights. As online exams proliferated during the pandemic, the university received attention for being one of the first to issue a policy prohibiting the use of remote proctoring software. That decision was based on a range of concerns, including unequal access to the necessary technologies, privacy issues and growing evidence that algorithm-based proctoring was discriminatory. For instance, it sometimes flagged innocent eye and head movements as suspicious – with particular implications for students with certain disabilities – and it could have trouble recognizing darker skin tones.

Ultimately, the Office of the Provost determined that online proctoring ran counter to the university’s student-focused mission. “It's like immediately going and creating this conflict between students and the university,” Casey said. “We're not starting from a place of being partners in learning. We're starting from a place of being adversaries and that's not the place we want to start from.” After the policy took effect, Casey and his colleagues at the Hub published a research paper sharing what they’d learned.

The university’s stances on AI detection and online proctoring were informed by growing concern nationally and internationally about the algorithmic tools designed to catch academic infractions. In 2020, the Electronic Frontier Foundation, one of the leading organizations working to protect citizens’ digital rights, noted that most remote proctoring tools were “effectively indistinguishable from spyware, which is malware that is commonly used to track unsuspecting users’ actions on their devices and across the Internet.”

That same year, U.S. Senator Richard Blumenthal led an inquiry into three of the largest virtual testing companies after receiving reports of “alarming equity, accessibility and privacy issues faced by students using the platforms.” Last year, a federal judge ruled in favor of a student who alleged that a room scan taken as part of online proctoring at Cleveland State University was unconstitutional.

Casey said the Office of the Provost recognizes the extreme challenges faculty face in the current digital environment, and that it and the Hub are working swiftly to provide guidance and support. In February, the Hub released a Generative AI at UM-Dearborn guide outlining some of the benefits and concerns the university identified early on. Currently, Casey is leading a UM-Dearborn task force on generative AI composed of faculty and staff representing all four colleges. The task force is meeting every other week and expects to release a report later this summer. Associate Professor of Electrical and Computer Engineering Paul Watta represents UM-Dearborn on a tri-campus U-M task force on generative AI, which is also expected to release a report this summer. In addition, faculty and staff are welcome to join several cross-campus learning communities on this topic.

Casey acknowledges that, with the technology evolving so quickly, as soon as UM-Dearborn’s report is issued, it will be outdated. The key, he said, will be facing that reality. “Those generative AI services are here. They’re not going away,” he said. “We're beyond the point of blocking it. You're standing in front of a rising tide and you think you're going to stop the rising tide? You're not. So that's what our task force is really now grappling with.”

One strategy Casey anticipates the task force will recommend is redesigning assignments so that, for instance, students don’t just answer questions or respond to prompts but show how they arrived at their conclusions. The Hub has been supporting faculty with such redesigns since the start of the pandemic, and the more hands-on, show-your-work approaches it recommends are inherent in the university’s practice-based learning model.

It’s important to recognize that generative AI has great potential to be used as a learning tool, Casey stressed. For instance, in coding, it might take over basic functions and free up users for more complex tasks. (Read three campus experts’ thoughts on generative AI.)

Developing a thoughtful, coordinated response to higher ed’s new digital reality will be a process, Casey said. “It's not going to be an instantaneous change,” he explained. “Instead of doing these stop-gap, quick measures that really aren't super effective in the immediate term and definitely don't have a lot of benefit long term, we're looking to the future and saying, ‘What should this look like? And let’s just start moving there.’ We need to go toward the end game.” 

“Much like the AI in the 1983 movie ‘WarGames’ figured out about global thermonuclear war, we continue to believe that when it comes to AI detection, the only winning move is not to play,” he added.

One thing Casey now knows for sure: the university’s concerns about Turnitin’s AI detector were on point. Two weeks ago, the company issued an update on the software. Among the flaws it has identified since the rollout? Higher-than-estimated rates of false positives in certain scenarios.

Article by Kristin Palm.