The U.S. Department of Education isn’t exactly known for its facility with metaphors. But a vivid image in a 71-page report epitomizes the agency’s central contention that teachers need to have the ultimate power over how the technology is used in schools.
“We envision a technology-enhanced future more like an electric bike and less like robot vacuums,” the department wrote in the report, released May 23. “On an electric bike, the human is fully aware and fully in control, but their burden is less, and their effort is multiplied by a complementary technological enhancement. Robot vacuums do their job, freeing the human from involvement or oversight.”
In other words: While AI has great potential to help students learn more efficiently and make teachers’ lives easier by creating lesson plans, bridging achievement gaps through intelligent tutoring, or making recommendations about how to help individual students grasp a concept, educators should understand its limitations and be empowered to decide when to disregard its conclusions. The report calls this keeping “humans in the loop.”
“We are seeing a dramatic evolution in ed tech,” said Roberto Rodriguez, the assistant secretary for planning, evaluation, and policy development at the U.S. Department of Education. “Educators have to be proactive in helping to shape policies, systems, and being engaged as AI is introducing itself into society in a more major way.”
That means teachers need to be just as aware of AI’s potential pitfalls as they are of its promise, the report contends. AI can absorb biases embedded in the data used to train the technology. For instance, a voice-recognition program used to measure reading fluency might give an incorrect picture of a student’s ability because it hasn’t been trained on their regional accent.
The technology is evolving quickly, Rodriguez said. He doesn’t want to see school districts fall behind in planning for it.
“I am worried that we are not moving quickly enough [in setting school level policies and district level policies] that both capture the powerful potential that AI provides, but also minimize the risks of these tools in classrooms and in learning for students,” Rodriguez said.
The report was informed by four listening sessions conducted last summer and attended by more than 700 experts and educators.
Other recommendations include:
Align AI models to a shared vision for education. Like any tool used to improve student achievement or manage classrooms, AI-powered technology needs to be based on evidence and aligned with what educators are trying to accomplish in the classroom.
Design AI using modern learning principles. AI tools need to build on learners’ strengths and help students develop so-called “soft skills” like collaboration and communication, as well as include supports for English learners and students in special education, the report contends.
Inform and involve educators. Teachers need to be at the table when developers create AI-powered technologies aimed at K-12 schools. Educators also must understand that AI can make mistakes, so they need to be encouraged to rely on their own judgment. “Sometimes people avoid talking about the specifics of models to create a mystique,” the report says. “Talking as though AI is unbounded in its potential capabilities and a nearly perfect approximation to reality can convey an excitement about the possibilities of the future. The future, however, can be oversold. … We need to know exactly when and where AI models fail to align to visions for teaching and learning.”
Prioritize strengthening trust. Educators haven’t had a universally positive experience with learning technology. If school districts want to take advantage of the promise of AI tools, they need to build trust in the tech, while making clear it’s not infallible. During the listening sessions, the department found that “constituents distrust emerging technologies for multiple reasons,” the report said. “They may have experienced privacy violations. The user experience may be more burdensome than anticipated. Promised increases in student learning may not be backed by efficacy research. Unexpected costs may arise.”