
Artificial Intelligence Opinion

The AI Cheating Crisis: Education Needs Its Anti-Doping Movement

Three reasons to reject ‘AI doping’ as the new normal
By Noor Akbari — February 28, 2024 5 min read

Since launching in November 2022, OpenAI’s ChatGPT, a generative artificial intelligence chatbot, has been compared to “steroids” numerous times. “ChatGPT is like steroids for your skills,” one Reddit user wrote. It’s “Google on steroids,” said a journalism professor. AI will be like Photoshop but “on steroids,” OpenAI CEO Sam Altman told a U.S. Senate subcommittee last year.

Perhaps “steroids” is the right term. Like steroids in sports, generative AI could create a global cheating crisis that undermines the purpose and value of an education.

International sport had its ChatGPT moment in 1998, when French customs agents found narcotics, testosterone, amphetamines, growth hormones, and syringes inside the car of Willy Voet, a caretaker for the Festina cycling team, on his way to the Tour de France. Further investigation revealed that the entire Festina team took banned substances in coordination with their management and doctors. All three podium finishers of the 1998 Tour de France were later found to have been doping.

Subsequent doping scandals in Major League Baseball and other major athletic leagues further tarnished the reputation of sports as a fair and meritocratic institution.

The rise of “AI doping” is strikingly similar. In a Study.com survey of 1,000 college-age students, 89 percent of respondents admitted to using ChatGPT to complete a homework assignment. Another 48 percent admitted to using it on at-home tests or quizzes, and 53 percent had the bot write an essay. Yet, 72 percent of the students reported believing ChatGPT should be banned from campus networks.

Are they hypocrites? No. When enough players in a competitive game can cheat with a high upside and low risk of consequences, other players will feel forced to cheat as well. As Lance Armstrong told the French newspaper Le Monde in 2013, several months after publicly admitting he had used performance-enhancing drugs, it was “impossible to win the Tour de France without doping.”

If enough students improve their grades using ChatGPT, their peers may conclude that it’s “impossible” to compete unless they cheat, too. In cycling, not doping meant losing competition winnings and sponsorships. In education, not using generative AI could mean losing out on college admissions, scholarships, and career opportunities.

At the height of the sports doping crisis, a common argument was to let doping happen. If no substance is banned, isn’t the playing field level? The counterargument is that substances are banned because they pose a health risk to athletes. If sporting organizations not only allowed but tacitly encouraged athletes to dope, the resulting biochemical arms race would have a sure loser: athletes and their well-being.

Likewise, the normalization of AI doping would create an arms race among students, resulting in several consequences for them and society:

1. Unchecked use of AI renders education pointless. We fund public education as a common good because it empowers citizens to live fulfilling lives and contribute to their communities. If the point of education is merely to get a diploma, then who cares if a student or AI does the work? The true point of an education, however, is to train a person’s mind and character.
Claiming that students no longer need to learn skills like writing—because AI does it—is like arguing that no one should strength train because carts and forklifts move heavy stuff for us. This conflates means and ends. People lift weights for the inherent benefits to their mind and body. Likewise, we learn to write for the inherent benefits to our cognition and communication skills.

2. AI threatens to undermine academic integrity, the foundation for professional credibility. We trust our surgeon, certified public accountant, or lawyer because we trust the institutions that test, certify, and employ them.
Well, researchers have found that ChatGPT can pass the U.S. medical licensing exam, the CPA exam, and other professional credentialing tests. You might argue that cheating on those exams is almost impossible. But what about an online nursing exam, an online certification in cybersecurity, or an online degree in social work? A person who cheats for the credential in those cases could become a danger to others.

3. The struggle to maintain academic credibility could produce a two-tiered education system that is even more inequitable than the current one. Elite colleges with full-time professors and graduate students have the resources to design assignments in which AI provides no edge. Community colleges and online education platforms don’t have that luxury. A class with hundreds of students and one part-time instructor cannot convert every digital test into an original research project or in-person test with pencils and paper—not without raising costs considerably.
Efforts to democratize education will be laughable if the only credible degrees come from private, in-person institutions that cost students tens of thousands of dollars per year.

K-12 schools will experience similar inequities, particularly between expensive private schools and crowded public schools. 69ý with lower student-teacher ratios are better positioned to design assignments that limit students’ reliance on AI than schools with more limited staff and resources.

So how do we address AI’s threat to academic integrity and an affordable education?

Forget watermarking AI-generated text and AI detectors; they’re easily duped. And forget academic “honor codes.” In my birth country of Afghanistan, the strict honor codes of Islam that forbid corruption didn’t stop the country from becoming one of the world’s most corrupt after the United States’ 2001 occupation injected billions of dollars. As with Lance Armstrong, no one struggled to justify corrupt behavior when everyone else was doing it, too.

Exams, whether in person or online, must be proctored such that no one can cheat using AI. That said, to prepare students for the working world, schools should teach generative AI in classrooms using versions with limited capabilities.

The silver lining of the 1998 Tour de France was that the International Olympic Committee formed the World Anti-Doping Agency the following year. Though far from perfect, WADA created a unified list of banned substances and standards for detecting them. In other words, the organization defined what “doping” means in sports. Soon enough, education systems may need a WADA-like organization to define cheating in the AI age and set standards for preventing and detecting it.

Doping in sports undermined the fairness and meritocracy of a beloved institution, until that institution took the threat seriously. It’s time we take AI doping in schools seriously.


A version of this article appeared in the March 13, 2024 edition of Education Week as Education Needs Its Anti-Doping Movement
