
Special Report
Artificial Intelligence | Reported Essay

No, AI Won’t Destroy Education. But We Should Be Skeptical

Artificial intelligence is a reminder of the importance of teaching students how to learn
By Lauraine Langreo — August 31, 2023 | 9 min read
Illustration of stylized teacher student relationship with AI represented between them as layered screens.

Type “artificial intelligence news” into any search engine and you would likely see the same sort of doomsday headlines I’ve seen over the last few months.

Even search engines’ AI-powered autocomplete features finished the sentence “Will artificial intelligence …” with similar apocalypses: … replace humans? … take away jobs? … take over the world?

Artificial intelligence and its use are not new. Computer scientists have been working on improving the technology for decades, and a lot of the tools we use daily—navigation apps, facial recognition, social media, voice assistants, search engines, smartwatches—run on AI. And beyond that, most—if not all—industries are already using AI one way or another. It’s in health care, transportation, the military, finance, telecommunications, drug research, education, and more.


But since the arrival of ChatGPT almost a year ago, AI has captivated the public’s attention and reignited discussions about how it could transform the world. In the K-12 space, educators have been debating what role AI should play in instruction, and how big that role should be, especially as AI experts say today’s students need to learn to use it effectively in order to be successful in future jobs.

Right now, educators are unsure what to make of AI’s superpowers. When the EdWeek Research Center asked a nationally representative group of 1,301 educators over the summer what they thought the impact of AI would be on teaching and learning in their school or district over the next five years, 49 percent said AI will have an “equally negative and positive” impact, 28 percent said “mostly negative,” 13 percent said “mostly positive,” and 10 percent said “no impact.”

About This Project

This story is part of a special project called Big Ideas in which EdWeek reporters ask hard questions about K-12 education’s biggest challenges and offer insights based on their extensive coverage and expertise.

Even with a healthy dose of skepticism, several educators have told me that schools need to accept that ChatGPT and other AI tools like it are here to stay. 69ý, they argue, need to find ways to use the technology for the benefit of teaching and learning while staying aware of its potential downsides.

“AI is calling for a fundamental reevaluation” of what the goal of education is, said Chad Towarnicki, an 8th grade English teacher in the 4,800-student Wissahickon school district in Pennsylvania.

What’s different with the arrival of ChatGPT

AI technologies mimic human thinking by training computer systems to do tasks that simulate some of what the human brain can do. They rely on systems that can actually learn, usually by analyzing vast quantities of data and searching out new patterns and relationships. These systems can improve over time, becoming more complex and accurate as they take in more information. (Or they can become less accurate if they’re pulling from faulty data.)
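That learn-from-examples loop is easier to see in miniature than to describe. The sketch below is purely illustrative and assumes Python with scikit-learn and NumPy installed: it trains a simple model on a toy dataset, shows accuracy improving as the training set grows, and then shows it degrading when half the labels are deliberately corrupted, the “faulty data” problem in miniature.

```python
# Minimal, illustrative sketch of pattern learning from data (not any product's code).
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
import numpy as np

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# More training data usually means better predictions on unseen examples.
for n in (100, 500, len(X_train)):
    model = LogisticRegression(max_iter=5000).fit(X_train[:n], y_train[:n])
    print(f"trained on {n:4d} examples -> accuracy {model.score(X_test, y_test):.2f}")

# "Faulty data": randomly scramble half the training labels and retrain.
rng = np.random.default_rng(0)
bad_labels = y_train.copy()
half = rng.choice(len(bad_labels), size=len(bad_labels) // 2, replace=False)
bad_labels[half] = rng.integers(0, 10, size=len(half))
noisy = LogisticRegression(max_iter=5000).fit(X_train, bad_labels)
print(f"trained on corrupted labels      -> accuracy {noisy.score(X_test, y_test):.2f}")
```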

ChatGPT is an AI-powered tool from the research laboratory OpenAI that can hold humanlike conversations and instantly answer seemingly any prompt. And instead of having to learn a computer programming language to talk to the chatbot, people can communicate with it in their natural language.
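To make that concrete, here is a minimal, hypothetical sketch of what “talking to the chatbot in plain English” looks like even for a developer using OpenAI’s Python client. The library’s interface has changed over time, so treat this as one plausible version rather than the definitive call; the model name and prompt are examples, not specifics from this article.

```python
# Hypothetical sketch: sending a plain-English prompt to a chat model via
# OpenAI's Python client (v1-style interface). Requires the `openai` package
# and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name; others work the same way
    messages=[
        {"role": "user",
         "content": "Explain photosynthesis to a 7th grader in three sentences."}
    ],
)

print(response.choices[0].message.content)
```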

“People began to realize something is different,” said Glenn Kleiman, a senior adviser at the Stanford Graduate School of Education whose research focuses on the potential of AI to enhance teaching and learning. “Suddenly, the capabilities became available to everybody and easily accessible.”

Now, people are using AI to plan trips, draft emails, organize essays, summarize research papers, and write code. In K-12 classrooms, teachers have used ChatGPT to plan lessons, put together rubrics, provide students feedback on assignments, respond to parent emails, and write letters of recommendation.

It’s easy to get wrapped up in the hype surrounding the transformative powers of this next generation of AI—many technology CEOs have been quick to talk up its groundbreaking potential. With its new capabilities, AI can become our co-author, co-pilot, or personal assistant. Sam Altman, the CEO of OpenAI, believes the technology will help people become way more efficient and productive at their jobs. He sees it as an engine for new job creation.

Doomsday scenarios aren’t likely but ‘not impossible’

But many people are also raising cautionary flags about generative AI. Thousands of executives, researchers, and engineers who work in the AI field have sounded the alarm more than once in recent months, warning that AI poses a risk of extinction for the human race and calling for stronger guardrails. Even Altman has said he is somewhat scared of AI and conceded that the technology could go badly wrong.

What happens when the superpowers of AI fall into the wrong hands? Or what if militaries around the world, which already have some autonomous weapons, face competitive pressure to build ever more sophisticated autonomous weapons, to the point where they become uncontrollable and unpredictable?

Hal Abelson, a professor and researcher of computer science and artificial intelligence at the Massachusetts Institute of Technology, told me that while many of the doomsday scenarios we hear about in the media about AI aren’t likely, they’re also “not impossible.”

And beyond those scenarios, “there’s a whole long list of concerns,” Abelson said. “We don’t even know what they are yet because this is merely just starting.”

Generative AI tools are trained at a certain time, and the datasets they are trained on are not updated regularly, so these tools can provide outdated information or can fabricate facts when asked about events that occurred after they were trained. For instance, the free version of ChatGPT doesn’t have training beyond events and information available in 2021.

We’re at a point in time where “it’s very hard to identify what’s false, and a lot of people believe it,” Abelson said. “Does that mean that as a society we are no longer even aware that there’s such a thing as [objective] truth? What does that mean for our society?”


And because the datasets these AI tools are trained on contain racist, sexist, homophobic, violent, and otherwise biased material, that bias can show up in the responses the tools generate. In fact, when you first log into ChatGPT, it warns you that it may “occasionally generate incorrect or misleading information and produce offensive or biased content.”

These AI tools—if left unchecked—could amplify harmful stereotypes about people who are already disadvantaged, according to Yeshimabeit Milner, the founder and CEO of Data for Black Lives, a nonprofit organization whose mission is to use data and technology to improve the lives of Black people.

To combat the inaccuracies that come with using these AI models, some education organizations are focusing on a version of the technology some are calling “walled-garden AI.” A walled-garden AI is trained only on content vetted by its creator, rather than on the unchecked content scattered across the internet. One example is Stretch, a (not-yet-publicly-available) chatbot trained only on information created or vetted by the International Society for Technology in Education and the Association for Supervision and Curriculum Development. There’s also Khanmigo, a chatbot developed by the nonprofit Khan Academy that acts like a tutor.

These more focused bots could be an excellent model for K-12 schools because they’re (theoretically) more tailored to the needs of educators and students. Some experts warn, though, that these models will still have to work to keep bad information out.
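Neither Stretch nor Khanmigo has published its internals, so the snippet below is only a toy sketch of the walled-garden idea: answers come exclusively from a small vetted library, and when nothing relevant is found the bot declines rather than improvising. A real system would pair a vetted corpus with retrieval and a language model; the topics and matching logic here are invented for illustration.

```python
# Illustrative toy sketch only: the vetted "library" and matching logic below are
# hypothetical stand-ins for the general walled-garden idea.
VETTED_LIBRARY = {
    "digital citizenship": "Vetted guidance: teach students to check sources, "
                           "protect personal data, and think before they post.",
    "lesson pacing": "Vetted guidance: split a class period into a short warm-up, "
                     "guided practice, independent work, and a closing check.",
}

def walled_garden_answer(question: str) -> str:
    """Answer only from the vetted library; decline instead of guessing."""
    q = question.lower()
    # Naive retrieval: look for a vetted topic mentioned in the question.
    for topic, content in VETTED_LIBRARY.items():
        if topic in q:
            return f"[{topic}] {content}"
    # Unlike an open chatbot, a walled-garden bot refuses rather than fabricates.
    return "I can only answer from my vetted library, and I don't have material on that topic."

print(walled_garden_answer("Any tips on lesson pacing for a block schedule?"))
print(walled_garden_answer("Who will win the next election?"))
```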

What students need to succeed in an AI-powered world

With all that in mind, it’s imperative for the K-12 system to prepare students to be successful in the age of AI.

ChatGPT has made it “painfully obvious that teaching the old ways and teaching the old curriculum is going to be out of date,” said Hadi Partovi, the CEO of Code.org, a nonprofit organization dedicated to expanding access to computer science education in schools. “How we work is going to change, and it also means how we prepare students for living in a digital world is going to change.”

To prepare for a future where AI is everywhere and in everything, students will still need foundational skills in reading, math, science, and history. But schools will also need to be more explicit about teaching students how to learn, rather than just what to learn, because that will help them become much better problem-solvers.

“We need [education] to evolve for a world of lifelong learning,” Partovi said. “And knowing that in every job and every year you’re going to be learning new things using digital access to information and AI tools to help you along the way. That’s really a different format of learning than what most of us learned in K-12.”

69ý will need to examine information with a critical and skeptical lens. If a chatbot says something that sounds fishy, “students should be able to say, ‘Well, maybe it’s not true,’” Abelson said. “That’s a skill that everybody’s going to need, as these AI systems permeate the environment.”

69ý will also need to learn how to use AI as a tool, as an assistant and an adviser, so they can become better decisionmakers. 69ý already teach the importance of teamwork and collaboration among students, but “the tweak I would make to that is that teams now should include some computer programs and some people,” said Tom Mitchell, a professor and researcher in machine learning and artificial intelligence at Carnegie Mellon University.

AI should be high up on school districts’ priority lists

The K-12 education system tends to be a slow-moving bureaucratic machine that struggles to respond quickly to change. Change in the field is careful and methodical, and some would argue that’s a good thing because it keeps schools from jumping on every trendy, bandwagon idea that comes along.

And, sure, K-12 education has a lot on its plate right now. Staffing shortages have worsened. Student academic achievement and mental health have plummeted. Staff morale and student motivation are low in many schools across the country.

But this isn’t the first time the education system has had to deal with big changes. The most recent example is the pandemic, when schools had to suddenly switch to remote learning. District leaders had to adapt quickly to attend to the needs of their students and staff.

This moment with AI shouldn’t be any different. Every day, there are new, groundbreaking developments in AI. It is in our faces, forcing all of us—particularly schools—to take a hard look at what education in the age of AI should look like and what it shouldn’t. And schools need to do this now, before they fall further behind and risk leaving kids unprepared for their future.

Think about it. Big existential questions are already being raised about the role of AI in education, such as: Does the use of AI defeat the purpose of education?

The answer is a resounding no—at least from me and the people I talked to. In fact, everyone I interviewed argued that the existence of AI makes the purpose of education even more clear: to learn how to learn in an ever-changing, increasingly complex world.

“If we’re questioning the whole point of education, then it’s like we’re just sitting back and letting AI take control of everything instead of being the ones that are able to control it,” said Stephanie Harbulak, the curriculum, instruction, and assessment specialist at Meeker & Wright Special Education Cooperative in Minnesota. “Education doesn’t go away. It just needs to change.”


A version of this article appeared in the September 13, 2023 edition of Education Week as “No, AI Won’t Destroy Education. But We Should Be Skeptical.”
