
Artificial Intelligence

Los Angeles Unified’s AI Meltdown: 5 Ways Districts Can Avoid the Same Mistakes

By Alyson Klein — July 08, 2024 10 min read

Back in March, the Los Angeles Unified School District was held up as a trailblazer for its embrace of artificial intelligence, when it unveiled a custom-designed chatbot.

Superintendent Alberto Carvalho called the chatbot a “game changer” that would “accelerate learning at a level never seen before.”

But in just five months, LAUSD has gone from enviable AI pioneer to cautionary tale.

The district has temporarily shelved “Ed.” That decision appears to have been prompted by upheaval at AllHere, the company LAUSD hired to create the tool.

AllHere has furloughed most of its staff. Its CEO and founder, Joanna Smith-Griffin, is no longer with the company. And a company whistleblower has raised alarms about how the tool handled student data.

LAUSD has now become the poster district for what not to do in harnessing AI for K-12 education.

For starters, the district didn’t appear to have tightly defined the problem it was trying to fix with the tool, experts said. Plus, the district selected an inexperienced vendor and set an overly ambitious timetable for the project without proper protections for student data—all signs that its leaders bought uncritically into the AI hype.

“There’s a dream that AI is just more or less automatically going to solve all or many problems [of K-12],” said Ashok Goel, a professor of computer science and human-centered computing in the School of Interactive Computing at Georgia Institute of Technology. “It’s overhyped. That is not how learning and education works.”

Even so, LAUSD has made it clear it is not giving up on “Ed” or AI. In a statement, a district spokesperson said that the tool “belongs” to LAUSD and that the district will ensure whatever company acquires AllHere “will continue to provide this first-of-its-kind resource to our students and families.”

AllHere did not respond to Education Week’s requests for comment.

This marks the second time the district has bungled a cutting-edge tech initiative. LAUSD, under different leadership, rolled out a 1-to-1 computing iPad program in 2013 that was a complete disaster.

It’s unclear what will happen next with AI in LAUSD. But the challenges the nation’s second largest school district faced in developing an AI tool offer important lessons for other school systems.

Those lessons include:

1. Be clear about what problem you’re trying to solve with AI

LAUSD had hoped “Ed”—a chatbot shaped like the sun—would serve as a one-stop shop for students and parents seeking information on everything from bus schedules to upcoming tests. The district also wanted the tool to boost students’ academic skills, enhance their social and emotional development, and improve attendance.

That’s a broad to-do list for a single tool, suggested Katy Knight, the executive director of the Siegel Family Endowment. (Editorial Projects in Education, the publisher of Education Week, receives support from the endowment for its coverage of AI and other technology issues. The media organization retains sole editorial control over that coverage.)

“The notion you can get everything you need from a chatbot, it’s sexy,” Knight said. “But I look at this and I think, OK, this is what happens when you let the technology lead you instead of figuring out what you actually need. And then finding technology to serve that need.”

Before purchasing—or in LAUSD’s case, commissioning the design of an AI product— districts must articulate what need or challenge they are addressing and decide if AI is really the best way to go.

“You need a clear understanding of what you want,” said Rick Gay, the executive director of business services for the Fort Bend Independent School District in Sugar Land, Texas. Before even looking for a vendor, answer questions such as: “How do you want [the tool] to look? How do you want it to perform? What does success look like?”

And if a district decides to deploy AI, leaders need to explain the technology to anyone who will use the tool, including principals, teachers, students, and parents. The district must clearly communicate AI’s limitations, including bias and a tendency to spit out completely wrong information.

“There’s an eyes-wide-open component to this,” said Bree Dusseault, the managing director for the Center on Reinventing Public Education, a research organization. “Once you choose to go this route, you have to build everyone’s capacity” on AI.

2. Vet ed-tech companies carefully

Many ed-tech companies purporting to be AI experts are startups without much of a track record in general, much less in K-12 schools, Gay said.

“They spring up out of somebody’s basement,” he said. “You have to vet these guys pretty carefully.” That means asking for references from school district officials who have worked with the company on similar projects and asking it to demonstrate similar tools it has built, Gay said.

LAUSD may have rolled the dice in choosing a startup like AllHere, experts said.

A more established company might not have had so much difficulty delivering the tool and adhering to the principles necessary for it to work in K-12 schools. But a big-name ed-tech business would also likely have been pricier, and less attentive to the district’s specific needs.

“With a bigger player, the cost would have been higher, but the risks would have been lower,” Goel said. But many districts “don’t pay as much attention to risks” when assessing a new technology, he added. “They pay most of the attention to benefits and cost analysis,” Goel said.

To be sure, AllHere, a Boston-based company best known for creating a text-messaging system to help improve student attendance, appeared by some measures to be a thriving organization. It had been recognized on industry lists (ranking 74 out of 250 on one). The company’s website says it has worked with more than 8,000 schools in 34 states.

That may speak to another AI challenge: The technology is so new that it’s hard to discern which vendors have the capacity and expertise to make a customized product like “Ed” work, Dusseault said.

“I was struck by the fact that this organization was on lists and was recommended,” Dusseault said. That means it’s possible that “there’s no standard of quality for vendors right now. It’s really unpredictable who is truly equipped to succeed.”

In fact, Jeremy Roschelle, the co-executive director of learning science research for Digital Promise, a nonprofit organization that works on equity and technology issues in schools, estimates only about a third of the new ed-tech startups marketing AI-powered products will be around a year from now.

“I say that with respect, because everything’s changing fast. The failure rate of new companies is high” even when investors and developers aren’t trying to capitalize on a hyped-up new technology, Roschelle added.

3. Consider starting small and working on a reasonable timetable

LAUSD had big ambitions for “Ed” from the start—and may have been unrealistic about the amount of time and testing it would take to achieve them, experts said.

“No district has done this type of work before, at the level that I know LAUSD wanted to do it,” Dusseault said. “I think that there is some element of biting off more than you can chew quickly.”

The district put out a request soliciting a company to build “Ed” in February 2023. It piloted the tool with about 1,000 students before unveiling it with great fanfare in March. The district then made the bot available to an additional 55,000 students at select schools for the remainder of the school year.

And LAUSD planned to extend the tool to all students and teachers in the district by the beginning of the 2024-25 school year, a timeline that does not appear to have been revised in light of AllHere’s turmoil.

Compare LAUSD’s experience with “Ed” to a similar chatbot that Goel is developing and testing to assist adult learners at Georgia Tech.

That project began in 2016. About eight years later, the bot is still being put through its paces. It’s only been used in about 60 or 70 of the institution’s roughly 3,000 classes. Instead of going big quickly, Goel and his team have methodically used teacher feedback to improve the tool.

Although LAUSD probably had more resources to direct to a project like this than a researcher like Goel, the timetable was likely too aggressive, experts say.

Companies and school districts alike need to be careful about “setting expectations around how fast and how much time” it can take to create and properly test new AI tools, Roschelle said.

But some vendors are reluctant to acknowledge the long game, he added. When Roschelle listened to companies pitch AI tools at a recent ed-tech conference, “I wasn’t [hearing] ‘this is going to take a while,’” he said.

That desire to set a speedy pace without considering the consequences prompted LAUSD to roll out “Ed” to all students. But experts say the district likely would have been better off targeting a limited number of schools.

“I’m not saying don’t use the technology, but don’t jump all in with something which then you have egg on your face when things don’t go right,” said Punya Mishra, a professor at Mary Lou Fulton Teachers College at Arizona State University. “Think small pockets of innovation. Support those. Promote those and I think that you’d get more bang for the buck.”

4. Make data privacy a top priority

LAUSD has laid aside “Ed” for now. But that move may not put to rest questions about how the tool handled student data.

A former AllHere software engineer warned district and state officials that the chatbot—which required access to troves of student information to function—was violating the district’s privacy rules and putting some student data at risk.

In a statement to Education Week, LAUSD said it takes data concerns seriously and will continue to protect students’ data.

“Throughout the development of the ‘Ed’ platform, Los Angeles Unified has closely reviewed it to ensure compliance with applicable privacy laws and regulations, as well as Los Angeles Unified’s own data security and privacy policies, and AllHere is contractually obligated to do the same,” a district spokesperson said in an emailed statement. “Any student data belonging to the district and residing in the Ed platform will continue to be subject to the same privacy and data security protections, regardless of what happens to AllHere as a company.”

Goel hopes districts will take this aspect of LAUSD’s experience to heart.

“You need to start worrying about data privacy from Day 1,” Goel said. “One of the big problems right now is who owns the data? Where is it stored? Who has access to it?”

Districts need to make the answers to those questions crystal clear in their contracts with vendors, said Mishra of ASU. And they need to spell out “penalties if we find out that our data has been [misused],” he added.

Otherwise, vendors “could play loose and fast with” student information, he said. And if a district is building a new AI learning tool, it shouldn’t just rely on the company’s stated privacy policy, even if it’s comprehensive.

“Companies shift ownership,” Mishra said. “Terms of agreement suddenly vanish.”

5. Don’t let LAUSD’s experience completely sour you on AI

Back in 2013, LAUSD was among the first districts to embrace a 1-to-1 computing initiative, distributing iPads to students across a broad swath of schools. The effort was plagued by poor implementation and possible conflicts of interest—leaving districts around the country wary of going 1-to-1.

But districts interested in exploring AI shouldn’t expect that their own experience will mirror LAUSD’s—particularly if they proceed cautiously.

“I hope that LAUSD moving so fast, in such a big way, and so publicly when there was not much there [doesn’t] discourage districts from ever thinking that they should do this,” Dusseault said.

The technology is here to stay. Understanding and using it properly will be important in the future workforce, she said.

“LAUSD is a cautionary tale on how to not address AI,” Dusseault said. “But it does not mean that districts shouldn’t be contemplating how to support their students and staff with AI-enabled strategies.”

Arianna Prothero, Assistant Editor, and Lauraine Langreo, Staff Writer, contributed to this article.
A version of this article appeared in the August 14, 2024 edition of Education Week as Los Angeles Unified’s AI Meltdown: 5 Ways Districts Can Avoid The Same Mistakes
