The ed-tech world has been buzzing for months about ChatGPT, the powerful artificial intelligence tool that can respond to seemingly any prompt. Observers have speculated about how it might change the nature of teachers’ jobs and personalize education for students.
Now, some in ed tech are betting it can improve tutoring—a strategy that has boomed as schools struggle to help kids make up lost ground after months of disrupted learning.
In March, the online learning platform Khan Academy debuted a new AI chatbot—Khanmigo—designed to tutor and coach students in one-on-one interactions. Saga Education, a nonprofit that works with districts to design and implement tutoring programs, is starting to use AI to analyze tutors’ performance and offer feedback. And Varsity Tutors, a platform that provides private tutoring and also works with schools, is rolling out AI-generated lesson plans.
Developing and staffing the kind of tutoring that research has shown is most effective—often referred to as high-quality, or high-impact, tutoring—is complex, time-consuming, and expensive. Tutors meet with students at least three times a week, in small groups or one-on-one. Sessions should target a specific subject, align to a high-quality curriculum, and build strong tutor-tutee relationships.
“High-impact tutoring is not homework help. They’re not sporadically dropping in,” said Carly Robinson, a senior researcher at Stanford University’s Graduate School of Education who works with the National Student Support Accelerator, a group promoting research-based tutoring programs.
Experts say AI can take over some pieces of this puzzle, but there are potential pitfalls, too.
An AI-enabled tutoring program can give students immediate, personalized feedback, said Helen Crompton, an associate professor of instructional technology at Old Dominion University in Norfolk, Va. And the kinds of questions that students ask of the AI can offer teachers and tutors an insight into their thinking.
But there’s also the possibility that a virtual tutor could present students with incorrect information, or reinforce bias, Crompton said. And then, there’s one crucial element of good tutoring that experts agree an AI can’t replace: the student-tutor relationship.
“AI is all brain and no heart,” Crompton said. “There’s a human aspect that should always be added in.”
How tutoring companies are adapting
Since ChatGPT exploded on the scene at the end of 2022, educators have debated how the tool, which generates its answers based on taking in huge amounts of written text, could be used in education. Still, the tool is far from foolproof: It plagiarizes, often makes up information, and has trouble with certain tasks, like solving math problems.
For now, in education, “we’re at the stage of the early adopters,” said Rob Moore, an assistant professor of educational technology at the University of Florida. Some of these early adopters are tutoring companies.
The most well-known of these might be Khan Academy, which debuted its AI tutor chatbot early this year. Its name, Khanmigo, is a blend of the Spanish words conmigo (“with me”) and amigo (“friend”). The tool combines the technology that powers ChatGPT with tutoring-specific instructions and relevant information from Khan Academy’s platform, said Kristen DiCerbo, the company’s chief learning officer.
“If you just go to ChatGPT, and type in something, that’s going to get you a different experience than what we do,” she said.
Say, for example, that a student types “I’m stuck” into the chatbot. “We don’t just send, ‘I’m stuck,’ to the model,” DiCerbo said. Khanmigo also gets a short prompt about how to be a Socratic tutor, asking students questions to get to the root of their misunderstanding, rather than just giving them the answer.
It also has access to the Khan Academy site—what course the student is in, what unit the class is working on now, and the specific skills that the student has or hasn’t mastered. It uses all of that information to craft its response, DiCerbo said.
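The wrapping DiCerbo describes—prepending tutoring instructions and course context to a student’s message before anything reaches the underlying model—can be sketched in a few lines. Everything here is an illustrative placeholder, not Khan Academy’s actual code or prompts:

```python
# Illustrative sketch of the prompt-assembly pattern DiCerbo describes:
# the student's raw message is wrapped with Socratic-tutoring instructions
# and course context before being sent to a chat-style language model.

SOCRATIC_INSTRUCTIONS = (
    "You are a Socratic tutor. Do not give answers directly. "
    "Ask guiding questions to uncover the student's misunderstanding."
)

def build_tutor_messages(student_message, course, unit, mastered, unmastered):
    """Assemble the message list a chat model would receive."""
    context = (
        f"Course: {course}. Current unit: {unit}. "
        f"Mastered skills: {', '.join(mastered) or 'none'}. "
        f"Skills not yet mastered: {', '.join(unmastered) or 'none'}."
    )
    return [
        {"role": "system", "content": SOCRATIC_INSTRUCTIONS + " " + context},
        {"role": "user", "content": student_message},
    ]

# A student typing "I'm stuck" never goes to the model alone; it arrives
# alongside the tutoring instructions and the student's course standing.
messages = build_tutor_messages(
    "I'm stuck",
    course="Algebra 1",
    unit="Two-step equations",
    mastered=["one-step equations"],
    unmastered=["two-step equations"],
)
```

The point of the pattern is that the model’s behavior is shaped by the system message, so the same student input produces a different experience than typing it into ChatGPT directly.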
Khan Academy is now piloting Khanmigo in schools. Most teachers are using it in a classroom context to provide students with individualized support while a teacher is present, rather than in a separate designated tutoring block, DiCerbo said.
In Khan Academy’s model, the AI is the tutor: students ask Khanmigo questions and the bot helps them work through the problem. Other companies are using the ChatGPT technology to support the processes around tutoring—creating lesson plans, writing session notes, and offering tutors feedback—rather than as the tutor itself.
Varsity Tutors, which works with about 200 school districts, is rolling out AI-generated lesson plans, which will be designed to align to standards and reflect individual students’ learning progressions, in the 2023-24 school year, said Anthony Salcito, the chief institutional business officer at Nerdy, the parent company for Varsity Tutors.
Right now, Varsity Tutors is testing AI-produced session notes: feeding the tool transcripts of tutoring sessions and asking it to provide a summary. “That’s exciting, because it brings us consistency,” Salcito said. “We can consume all the notes and then track patterns over time.”
Finally, Saga Education, a tutoring nonprofit that works with school districts, is using AI to give the tutors feedback. The organization has partnered with the University of Colorado Boulder, using AI technology developed by researchers there to analyze recordings of tutoring sessions, said Krista Marks, chief technology officer at Saga Education.
The AI has been trained on rubrics of effective tutoring, measuring recordings of the sessions against those markers and offering feedback. Tutoring site directors can then review feedback from multiple sessions, and use that to identify strengths and weaknesses for coaching sessions with the tutors.
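The review workflow Marks describes—scoring recorded sessions against a rubric, then letting site directors look across sessions for patterns—might look like this in outline. The rubric dimensions and scores below are invented placeholders, not Saga Education’s or CU Boulder’s actual system:

```python
from collections import defaultdict

# Illustrative placeholder: aggregate per-dimension rubric scores that an
# AI analysis step might emit for each recorded tutoring session, so a
# site director can spot strengths and weaknesses without sitting in on
# every session.

def aggregate_feedback(session_scores):
    """session_scores: list of {rubric_dimension: score} dicts, one per
    session. Returns the mean score per rubric dimension."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for scores in session_scores:
        for dimension, value in scores.items():
            totals[dimension] += value
            counts[dimension] += 1
    return {dim: totals[dim] / counts[dim] for dim in totals}

# Two hypothetical sessions scored 0-5 on three made-up rubric markers.
averages = aggregate_feedback([
    {"relationship-building": 4, "targeted-questioning": 2, "pacing": 3},
    {"relationship-building": 5, "targeted-questioning": 3, "pacing": 3},
])
```

In this toy example, a consistently low average on one dimension (here, targeted questioning) is what would surface as a coaching topic for that tutor.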
The tool can offer site directors a way to personalize coaching conversations with their staff, without having to spend all day sitting in on different tutoring sessions, Marks said.
Marks, Salcito, and DiCerbo said that the next frontier is more personalization.
Next year, Saga Education will pilot technology that offers tutors real-time recommendations about the next steps to take in a tutoring session, adapting suggestions to individual student interests—offering a two-step equation math problem about basketball, for example.
Khan Academy is hoping to tailor Khanmigo’s assistance more closely to student needs. Right now, DiCerbo said, the AI knows where kids are in a unit and their past performance. But it doesn’t know exactly why students are struggling, or how to pinpoint and address prerequisite skills that they need but might not have mastered.
“It’s not about, I need to just keep banging my head against the wall on this grade level thing,” DiCerbo said. “I need to fill in these gaps on things I haven’t learned yet.”
For now, Khanmigo can’t do that, DiCerbo said: “It’s shaky at best.”
When AI gets the answers wrong
There are other, more immediate issues with AI tutoring technologies, too. Khanmigo, for example, struggles with math—a problem ChatGPT has as well.
“If you just ask it to do the problem, it usually does it correctly,” DiCerbo said. Sometimes, though, it gets the wrong answer. And it has difficulty with math conversations, where it has to discuss problem-solving steps and strategies with students.
Khan Academy offers teachers and students an option to give feedback and flag interactions where the AI gives the wrong information. Khanmigo also asks kids to explain their reasoning when its answer doesn’t match a student’s, DiCerbo said. “It allows the model to check its own work and work through that,” she said.
Pointing out these flaws to students can help them understand that AI isn’t infallible, said Moore, of the University of Florida. “You can flip that and make that a teaching moment.”
Still, errors could introduce new challenges in tutoring, where a student is already struggling—and might not have a good enough grasp on the material to recognize an AI’s misstep.
Not every student is going to be well-served by an AI tutor, said Robinson, the Stanford researcher. Beyond these technical limitations, there’s also a big question of motivation. Would the students who are in most need of support actually use the tool as intended?
When districts have provided optional, on-demand programs in recent years, most students haven’t actively engaged with them, she said. The kids who do take advantage of them are more likely to be higher-achieving students—not typically the demographic that schools were most hoping to reach, she added.
“Delivering a scripted curriculum that can be responsive to students’ answers is something that AI could take on. Where we might actually need a human is where it comes to the relationships and the motivation and the engagement aspects of tutoring,” Robinson said.
All of the company representatives said that AI is meant to complement teachers and tutors—not replace them. “If we do not have teachers, tutors, caring adults to guide students in how to ask good questions … I think we’re lost,” said Marks, of Saga Education.
‘We haven’t even scratched the surface’ on ethics
In interactions that Khan Academy has shared, the bot is encouraging—gently prompting students to persevere and praising them when they get the right answer.
These cheery interactions with students are part of the programming, Crompton said. “It doesn’t actually have a personal relationship with students. It doesn’t actually have that investment in how a student is doing.”
And that’s a concern for some researchers: How an AI acts is based on the source material and instructions it’s given—a fact that introduces “a lot of ethical concerns that we haven’t even scratched the surface on,” said Robinson.
AI tools can be designed to have a positive and helpful attitude, “but they could just as easily be designed to bully,” Robinson said.
Crompton worries about the potential for bias. The technology that powers ChatGPT pulls information from the internet, so it’s going to reflect the dominant perspectives and ideas in published materials, she said. Take social studies, a discipline whose subjectivity has repeatedly sparked fierce debates over which narratives should be taught.
“Those textbooks, those notes, all those have bias in them. So we are still going to have a biased product in the end,” she said.
These fallibilities mean that districts need to carefully vet AI tutoring tools, the same way they would vet other materials and programs that they bring into schools, Crompton said.
“We’ll need to do our due diligence and recognize that there will be a lot out there provided,” she said. “And some AI programs will be good, and some won’t be as good.”