Artificial Intelligence

Deepfakes Expose Public School Employees to New Threats

By Olina Banerji — May 08, 2024
Signage is shown outside on the grounds of Pikesville High School, May 2, 2012, in Baltimore County, Md. The most recent criminal case involving artificial intelligence emerged in late April 2024, from the Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.

A recent, AI-generated “deepfake” audio recording of a principal making hateful comments has laid bare an uncertain landscape for educators—a bleak one that could consist of costly investigations, ruined reputations, and potential defamation cases.

On April 25, the Baltimore County Police Department charged Dazhon Darien, 31, the athletic director at Pikesville High School in Baltimore County, Md., with theft, stalking, and disrupting school operations. Police say Darien created and circulated a faked audio clip of Pikesville’s principal, Eric Eiswert, making racist and antisemitic remarks about students and colleagues. The audio clip, which surfaced in January, quickly went viral and divided the school’s community over its veracity.

For more than a year, U.S. schools and districts have grappled with the wide-reaching implications of AI technology for teaching and learning. What happened to Eiswert, who has since been absolved of wrongdoing, shows that AI can also be weaponized against school officials—and that most districts are ill-equipped to handle that threat.

School leaders have noticed—and they believe something similar could just as easily happen to them or their staff.

“I was very alarmed to see that AI could be used in this way. As someone who is considered by law to be a public figure, you are always open to criticism of this nature,” said Kimberly Winterbottom, principal of Marley Middle School in Glen Burnie, Md. “Someone could click a picture across a parking lot and put it up on social media. But this is a whole new level.”

A lack of policy

After the deepfake audio recording came out, Eiswert was put on administrative leave between January and April while the county police and the school district investigated. He isn’t coming back to Pikesville High this school year, said Myriam Rogers, the superintendent of Baltimore County Public Schools, in a statement. The district is “taking appropriate action regarding the arrested employee’s conduct, up to and including a recommendation for termination.”

Rogers did not say whether Eiswert will return for the new school year.

The faked audio clip, circulated more than 27,000 times, roiled the Baltimore County school community, prompting demands that Eiswert be removed as principal. Eiswert received phone calls and messages threatening his physical safety and that of his family.

Long before the incident, though, there were growing undercurrents in schools of AI tools being misused to target students and educators alike.

Male students have used apps to fake pornographic images and videos of female students; in March 2023, a group of high school students in Carmel, N.Y., created a deepfake video of a middle school principal in the district shouting angry, racist slurs and threatening violence against his Black students.

Such cases have shone a light on the yawning gap between a rapidly evolving technology and the policies needed to govern it.

“We definitely need some adaptation to bring the laws up to date with the technology being used. For instance, the charge of disrupting school activities only carries a maximum sentence of 6 months,” said Scott Shellenberger, the Baltimore County state’s attorney, at a press conference held after Darien’s arrest.

Principals are vulnerable because of their positions

It’s still unclear what specific tool Darien used to create the deepfake. A report by the Baltimore Banner said that Darien had used the school’s internet to search for OpenAI’s tools and large language models that could process data to produce conversational results.

As authority figures who must take disciplinary action from time to time, principals contend that they are more susceptible to backlash and vengeful reactions, which can now easily take the form of believable-yet-fake video and audio clips. They fear that the technology will progress to a point where it will be difficult to distinguish between real and fake.

The relative ease with which Darien faked the audio has principals thinking closely about how they communicate with their staff, students, and the parent community.

For one thing, it doesn’t take a lot of data for an AI tool to be able to replicate a voice from an audio clip.

“I have a colleague who sends out a voice message to her student community. I told her she should stop that,” said Melissa Shindel, the principal of Glenwood Middle School in Glenwood, Md. “It could need less than a two-minute audio clip. And you can’t always trace the origin.”

Shindel said she’s been cautioning other school leaders about their unbridled support of AI in their schools.

“People are in denial about the harm it could do,” she said. “Deepfakes are more damaging than negative social media posts. You believe what you see or hear, over what you read.”

A troubling notion, exemplified in the Eiswert case, is that a grievance could spin into AI-fueled revenge—a parent unhappy about how a child was disciplined, a student or staff member angry about a decision. Shortly before Darien created and spread the fake audio clip, Eiswert had been investigating Darien’s alleged misuse of school funds.

“We are vulnerable. Everything that happens in the school funnels to me. Credit and blame,” Shindel added.

Winterbottom, the principal from Glen Burnie, said Darien’s extreme actions made her revisit the impact she has on people, especially when disciplinary issues are involved. But the fact that Darien was charged has made Winterbottom hopeful that the case sends the right message.

“I’m ecstatic that they were able to trace it [the audio]. The precedent is that you’re going to get caught,” she said.

She hopes it will make people think twice before they jump to conclusions when they encounter an accusation that could be fake.

Districts can be proactive on AI use, but can’t prevent misuse

Eiswert, though pilloried on social media and put on administrative leave by his district, had the support of the Council of Advisory and Supervisory Employees, which represents school administrators. CASE’s executive director, William Burke, said in an email that CASE has maintained the audio was AI-generated from the time it surfaced.

CASE engaged AI experts to assess if the audio was real, and put Eiswert through a polygraph test, the results of which, Burke said, showed conclusively that Eiswert had not “made the statements on the audio.” The evidence from the AI experts and the results of the polygraph were shared with the police, Burke said.

Eiswert did not respond to several requests for comment sent via CASE, which is handling media inquiries related to the incident.

Such investigations can prove expensive and time-consuming for unions, schools, and district administrations, especially if they have to investigate multiple cases.

“School and district administrators will be given what looks like real evidence [of wrongdoing]. If they don’t know the risks associated with a tool like AI, they may believe the evidence even if it’s falsified,” said Adam Blaylock, a lawyer who works with school districts in Michigan.

Blaylock fears that the lack of awareness about AI could put districts at risk of lawsuits. “If you end a young administrator’s career based on something that’s not true, it opens up the district to a huge risk of litigation,” said Blaylock.

As for victims of AI fraud, states’ defamation laws typically put the burden of proof on the victim, who would have to engage experts to prove it’s not them on the audio or video, an expensive proposition.

“A principal, like in Pikesville’s case, may feel personally harmed. But if there was no firing or demotion, they will be hard pressed to show damages,” Blaylock said.

Blaylock is keen to help school districts avoid the pitfalls of AI-generated deepfakes. His advice is to build defenses that help identify AI-generated content.

Updating student and employee handbooks with specific clauses about AI is one idea.

“We have one districtwide policy about technology misuse, which covers cellphones. It should now have language specific to AI,” said Winterbottom.

School and district teams should have one member who is either an AI expert or someone who keeps pace with the technology’s rapid developments. “There are some telltale signs of AI-generated content. People in AI videos will sometimes have six fingers. The expert on the team will be familiar with these indicators,” said Blaylock.

Ultimately, there is no Turnitin for AI deepfakes yet, Blaylock said, referring to the popular tool used to detect plagiarism in student work. District administrators can only hedge the risk—maintain a list of approved generative-AI tools, train individuals on the appropriate use of AI, and comply with data-protection standards when dealing with any contractor who will use the school’s data.

Blaylock encourages individual school leaders to maintain a healthy skepticism about media that seems unbelievable, while not backing away from muscular leadership.

“The risk is going to be focused on how we review information. But I can’t ask school leaders not to do what’s best for their kids … because they’re scared of a deepfake.”

A version of this article appeared in the May 29, 2024 edition of Education Week as Deepfakes Expose Public School Employees to New Threats
