It's been more than a year since ChatGPT's ability to produce astonishingly humanlike writing sparked fundamental questions about the role of artificial intelligence in K-12 education.
Yet most school districts are still stuck in neutral, trying to figure out the way forward on issues such as plagiarism, data privacy, and ethical use of AI by students and educators.
More than three-quarters of educators (79 percent) say their districts still do not have clear policies on the use of artificial intelligence tools, according to an EdWeek Research Center survey of 924 educators conducted in November and December.
District leaders want to help schools chart the right course on the potentially game-changing technology, but many feel "overwhelmed and overloaded," said Bree Dusseault, principal and managing director at the Center on Reinventing Public Education, a research organization at Arizona State University's Mary Lou Fulton Teachers College.
The lack of clear direction is especially problematic given that the majority of educators surveyed (56 percent) expect the use of AI tools to increase in their districts over the next year, according to the EdWeek Research Center survey.
And while experts are encouraging schools to teach their students to use AI appropriately, banning the tools for students is still a relatively common practice in K-12 education, the survey found.
One in five educators surveyed said that their district prohibits students from using generative AI, such as ChatGPT, although teachers are permitted to use it. Another 7 percent of educators said the tools were banned for everyone, including staff.
When district officials and school principals sidestep big questions about the proper use of AI, they are inviting confusion and inequity, said Pat Yongpradit, the chief academic officer for Code.org and leader of TeachAI, an initiative aimed at helping K-12 schools use AI technology effectively.
"You can have, in the same school, a teacher allowing their 10th grade English class to use ChatGPT freely and getting into AI ethics issues and really preparing their students for a future in which AI will be part of any industry," Yongpradit said. "And then literally, right down the hall, you can have another teacher banning it totally, going back to pencil and paper writing because they don't trust their kids to not use ChatGPT. Same school, different 10th grade English class."
The new "digital divide will be an AI divide," Yongpradit said.
'Policy is always behind technology'
It's not hard to understand why most district leaders aren't eager to make big decisions about how their schools will use the technology.
Many educators worry that if students are exposed to generative AI, they'll employ it to cheat on assignments. Plus, AI tools can spit out false information and magnify racial and socioeconomic biases. AI also develops, or "gets smarter," as some would say, by consuming data, opening the door to potential student-data-privacy nightmares.
The vast majority of educators don't have the capacity to cope with those complications on top of their other responsibilities, the survey found.
More than three-quarters of educators surveyed (78 percent) said they don't have the time or bandwidth to teach students how to think about or use AI because they are tied up with academic challenges, social-emotional learning, safety considerations, and other higher priorities.
What's more, AI is changing so rapidly that any policy a district or state crafts could be outdated the moment it is released.
That's typical when it comes to new technologies, said Kristina Ishmael, who until late last year served as the deputy director of the U.S. Department of Education's office of educational technology.
"Policy is always behind technology," said Ishmael, who is now a strategic advisor at Ishmael Consulting. In some cases, that's "very intentional, because it's policy; once you put it in, it's hard to take it off."
But AI requires a shift in thinking, she pointed out.
AI policy and guidance need to be "living, breathing documents, because the technology is changing so quickly," Ishmael said. "It's not something like a continuous improvement plan where your school is looked at every couple of years, then the binder sits on the shelf."
Another stumbling block, she said: Some district leaders are hitting the pause button to see if their state or Washington policymakers establish AI parameters. The federal Education Department has pledged to release resources, including an AI policy toolkit, this year. Members of Congress have introduced legislation on select AI issues, such as AI literacy.
But it's not clear if more significant action is on the horizon, Ishmael said.
"Folks are waiting to see what happens at the federal level," said Ishmael. But she recommends districts avoid delay.
"I'd tell them to start working on things now," she said. "This is a brand-new tool that is impacting our classrooms and our lives. There needs to be some sort of baseline parameters for students to be able to use [it]."
'We're all entering this innovative environment with a lot of unknowns'
Most educators see value in understanding AI.
Two-thirds of those surveyed by the EdWeek Research Center say students will need knowledge of AI because the technology already features so heavily in the products and services that are part of their daily lives. Sixty percent say that employers are looking for people who can work with AI tools to do their jobs more efficiently.
Nearly half said students will need AI skills to be successful in college, and nearly a third believe younger students will need them to do well academically in the upper grades.
That's motivated some district leaders to move quickly.
"I think the driver for me is really looking at the jobs of the future and looking at it through the economic lens," said Jerry Almendarez, the superintendent of the Santa Ana Unified school district, which he describes as a largely "blue collar" Southern California community. "I see this as a window of opportunity for communities like mine, to catch up to the rest of society by giving [students] skills and access to a technology that has never been at their fingertips before," he said.
District and school leaders who want to help their students navigate this technology "should know that they're not alone, if they don't know where to start," Almendarez said. "That's OK. None of us really do. We're all entering this innovative environment with a lot of unknowns."
Almendarez suggested districts turn to entities that have already sketched out what AI policy guidance could look like. That includes the six states that have released official AI guidance for schools, North Carolina among them, as well as districts that were early out of the gate on AI policy, such as one near Seattle.
Nonprofit organizations have also stepped up, publishing guidance that offers suggestions for mitigating the privacy risks of AI tools, tactics for ensuring students use the technology to inform their assignments rather than to cheat, and tips on how to train staff to use AI appropriately.
Last fall, the Council of the Great City 69传媒 and the Consortium for School Networking released a 93-question checklist to help educators think through policies around generative AI. The list includes queries such as: Does your district have a dedicated point person on the role of artificial intelligence in K-12 education? Are you requiring vendors that use AI algorithms in their products to ensure they are free from bias?
That kind of direction is what district leaders are searching for, said Dusseault of the Center on Reinventing Public Education.
"We've heard superintendents say, 'I would like to see support, and it doesn't have to come from my state. It could come from a trusted nonprofit,'" she said.
Some districts are taking it a step further. New York City, which reversed an initial ban on ChatGPT, and Santa Ana are launching AI policy shops whose work can inform the broader field.
'What really is the purpose of having kids take world literature or biology and physics?'
Much of the early discussion around the use of generative AI in K-12 classrooms centered on how students might use the technology to cheat, Dusseault said.
"One of the big questions that I know out the gate was kind of scary and put some of the districts on their back foot was plagiarism, this idea that ChatGPT is going to end up giving students the ability to plagiarize and not represent their work," Dusseault said.
But district and state leaders' thinking has evolved over the past year, she said.
"Now, a year later, we're seeing: 'We are probably going to all be using some large language model or something like ChatGPT into the future, so students may need to actually have skill building on how to use it appropriately.'"
One state in the vanguard of this approach: North Carolina, whose AI guidance, released last month, includes a clear outline of possibilities for using AI on assignments without the technology encouraging cheating or plagiarism.
As generative AI gets ever more adept at the kinds of assignments teachers regularly give students (write an essay on bird imagery in Shakespeare's "Macbeth," explain the differences between igneous and metamorphic rocks), educators will need to rethink long-held tenets of teaching and learning, said Catherine Truitt, North Carolina's superintendent of public instruction.
They will have to ask themselves: "What really is the purpose of having kids take world literature or biology and physics? What should kids be getting out of these courses?" she said.
Educators "are going to have to start having hard conversations" about what it really means to teach content or help students develop critical-thinking and analytical skills, she said. "The consequences of us ignoring it and sticking our heads in the sand is that students will game the system."
Data analysis for this article was provided by the EdWeek Research Center.