The U.S. Department of Education has released new resources for schools on artificial intelligence that include recommendations on a range of potentially thorny issues, including the use of AI detection tools that may falsely accuse students of plagiarism and how to build educators' AI literacy skills.
The two reports come at a time when educators are still puzzling through how to approach this powerful and fast-advancing technology. Many teachers are hesitant to use AI in the classroom, surveys show. Meanwhile, students are increasingly using AI tools. A recent survey found that half of teens have used an AI text generator, 34 percent have used an image generator, and 22 percent have used a video generator.
Taken together, the department resources detail both the potential pitfalls that could stem from AI and the opportunities it offers for K-12 education.
Even though many states are crafting AI guidance for schools, federal guidance on AI is still needed, said Pat Yongpradit, the chief academic officer at Code.org.
"We really need to move beyond AI is bad [or] AI is good, and get super nuanced about the proper and improper uses of AI in education," he said.
These resources, he said, give schools a good starting point to have those conversations.
The Education Department's office for civil rights report, which was released last week, focuses on schools' civil rights obligations and details several scenarios where schools' use of, or response to, AI could trigger an OCR investigation.
Among the examples:
- A teacher uses an AI detection tool to determine if students used a generative AI program like ChatGPT to write an assignment. Unbeknownst to the teacher, the tool has a much higher false-positive rate with students who are learning English, meaning English learners are falsely flagged and accused of cheating while their native English-speaking peers are not. (Some research has found that this happens.)
- School administrators don't respond aggressively enough after being tipped off that a student is creating "deepfake" nude images of their female classmates.
- A school uses an AI tool to create the schedule for sports practices and games, and female teams are assigned worse times and days to play. The school does not respond to the student-athletes鈥 complaints.
- A school district purchases facial-recognition technology that misidentifies Black students and incorrectly flags them as known criminals from a database.
Those are a sampling of the potential issues OCR has identified that might arise from schools overrelying on AI and not keeping real people in the decisionmaking loop. But they're not purely hypothetical: Schools are already dealing with some of these issues, such as students making sexually explicit deepfake images of their classmates.
"The examples that they have in the document are quite real," said Yongpradit. "These are not two-sentence descriptions of a potential action. These sound like they are already happening. And it should be a wake-up call when it comes to the risks of AI in schools. There's actual discrimination that could be exacerbated or created because of improper use of AI in schools. And it really alludes to the need for comprehensive AI literacy."
However, he said, the takeaway shouldn't be that AI is bad, and education leaders shouldn't react by trying to ignore or disengage from it.
That's where the second resource, an AI tool kit released in October, comes in. While a portion of the tool kit is devoted to the risks of AI, it also offers practical tips on approaching topics like evaluating AI interventions and updating school technology policies for AI.
The tool kit was developed with support from Digital Promise, a nonprofit group that focuses on equity and technology issues in schools. A group of 16 teachers, principals, superintendents, and other educators also contributed their insights.
The tool kit comprises eight modules that address three broad themes: AI risk mitigation, strategies for integrating AI into instruction, and effective use and evaluation of AI.
For example, that last theme includes a module on building AI literacy. It gives an overview of what AI literacy looks like for educators, why it matters, and which topics AI literacy professional development initiatives should cover, including the technology's history and origins, data, and machine learning.
So, how should school leaders approach these reports? Yongpradit recommends using the resources to open up discussions in faculty meetings.
"The tool kit is more directive: the modules are set up as book club readings or practical activities that teachers can do," he said. "The office for civil rights guidance is really focused on discussion and picking apart the scenarios and reflection on whether the school is proactively addressing the potential for discrimination, or if the school is doing some of these things, or if teachers are putting themselves at risk and their learners at risk."