Google is one of the big ed-tech players adding generative artificial intelligence to existing products already popular in the K-12 world. The head of education impact for the tech company says the new technology can help solve some of the problems schools are facing right now, but recognizes its limitations too.
Last month, Google announced that Gemini—its generative AI model—will be available as an add-on for educational institutions using its Workspace for Education product. Educators can now access generative AI features in Google’s Workspace apps, such as Docs, Sheets, Slides, and Gmail, as well as the Gemini chatbot.
Other ed-tech companies have also announced AI features for their education products. Microsoft announced that Copilot—its AI chatbot—is now part of Microsoft 365 apps such as Teams, Word, and PowerPoint, and can generate materials for educators. Khan Academy has Khanmigo, an AI assistant for students and for teachers. And OpenAI, the creator of ChatGPT, now has ChatGPT Edu, though that’s mostly for those in higher education.
The Google announcement and tech developments at other companies are happening as more educators are trying out AI-driven tools. At the same time, schools are struggling to figure out how they’ll use the technology in instruction and school operations, given the big concerns about data privacy and academic integrity that come with AI use.
During the International Society for Technology in Education conference here, Education Week spoke with Jennie Magiera, Google’s global head of education impact, about the role of AI in education, the technology’s limitations, and educators’ concerns about it. Before working at Google, Magiera taught in the Chicago public schools for a decade. She was also a chief innovation officer at a district outside of Chicago for two years.
This interview has been edited for brevity and clarity.
How do you see Google’s AI features solving some of the problems educators are facing?
What we’re trying to do at Google is elevate educators and help them personalize learning for every student, and we’ve been trying to do that for years. Now that AI technologies are becoming more advanced and more prevalent through all of our products, that hope and that dream is becoming more real than ever.
One of the products I’m super stoked about is practice sets. When I was in the classroom, I spent all this time trying to close the instruction-differentiation-assessment-reteach loop, and that would take me many weeks. What the team has done with practice sets is that a young person could be doing an activity or an exploration and, in real time in that moment of cognitive dissonance, find out exactly what wasn’t working for them, and then get reteaching materials all within moments. As an educator, it’s not only saving me time, it’s accelerating my students’ progress and making them feel success sooner, building their confidence.
What are some problems educators are facing that these AI features probably won’t be able to solve right now?
I don’t think generative AI can ever solve every problem that educators have, and it shouldn’t. We need that human element, we need that teacher in the loop—the human in the loop.
What is Google’s philosophy when it comes to developing AI tools for K-12?
Technology is a vehicle to help educators be even better at what they’re doing and to help students meet their goals as well. It’s not about the technology, it’s about the technology being an invisible and ubiquitous tool that anyone can access as needed.
We do this by respecting the user. We spent hours and hours and hours talking with educators at all levels from all over the world to find out: What are the needs that they have? What are the transformational opportunities that they wish for? We even sometimes call it a magic-wand wish: if you had a magic wand and could make one thing appear, what would it be? And then our teams try to dream up ways to make those a reality.
Is Google helping provide educators with professional development on AI?
We do that in two main ways. One is our communities of practice, which bring educators together to celebrate them, elevate them, and help them elevate and celebrate each other.
The other is enabling educators by giving them training and professional-learning support. We recently published two free online courses about AI: one we did in collaboration with MIT RAISE [Responsible AI for Social Empowerment and Education]. That’s the Generative AI for Educators course. That course is product agnostic; the goal is for it to be a zero-entry, super easily accessible course.
The other course that we have is Getting Started with Gemini for Workspace. It is literally like, this is what this button does. This is what the feature can do. This is the power of it. This is how you access it.
We also have a series on YouTube Shorts with our Google for Education champions. These are our most energized, passionate, imaginative, creative educators around the world, who create YouTube Shorts about how we might use AI.
Some educators are worried AI could degrade the critical thinking skills of students and teachers. They also say the human touch is better than AI. What’s your response to that?
When I hear educators say that there’s a fear of AI, or that the concern is that AI will replace humans or that AI will bring down the human or creativity level, I hear you. I thought that, too. I felt that, also.
When I dig into my use of AI and talk to more people, what I’m finding is that it’s actually enhancing my humanity. It’s elevating my creativity. The reason is that I found I spend so much of my time doing rote tasks that I don’t get to the creative part of my brain, or the creative part of my work product, because I’m doing all the other pieces.
For an educator, you’re spending all day grading papers or coming up with that one slightly different version of a lesson plan. But imagine you could have AI cut through all that work and then give you a prompt once your brain is fried and you have writer’s block. It becomes almost like a spaghetti-at-the-wall thought partner.
So, in a lot of ways, I have become a more creative, more human version of myself, because I’ve left all of the robotic tasks to AI, and then asked AI to help instigate my creativity so that I can go further with it.
Some teachers say AI doesn’t make their jobs easier because it’s more effort to rethink assignments and to double check the work of the generative tool. What do you think?
I think it’s [about] expectation setting. If you’re expecting AI to spit out a perfect lesson plan or a perfect prompt for your students, that’s not what we want. I don’t want a world where AI can do that because I want a world where it’s human first, where it is a human teacher in the loop.
If we take a step back and think of it as a starter, instead of expecting it to give us the whole thing, then it does save time. I cannot count how many times I would be sitting in front of a blank screen on a Sunday night, thinking about what I was going to teach on Monday, spending hours just staring at it and thinking, “I just don’t even want to get started.”
But with AI, I can say, “You know what, why don’t you just give me a shell of this lesson, a shell of a differentiation, a shell of that essay, and then I’m going to use my human brain, my human heart, my spirit, my experiences to make it something truly fabulous.” And that not only will save me a ton of time, but I believe the end product will be even better as well.
What do schools need to know and do to combat AI’s imperfections?
Just because the technology exists doesn’t mean you need to use it in all ways. So I think a lot of it is going to be about schools supporting their educators and asking the critical questions of how they’re applying the technology: When they use AI, how do they use it?
And being judicious about it. Just because it exists doesn’t mean it needs to overtake every single practice we have in the classroom. We need to consider where the opportunities are, and we need to lead with the need.
Is Google doing any research on AI’s effects on teaching and learning?
We’re always exploring how we can learn more about how our products are used and what the impact is. We’re also really trying to maintain a level of humility: What is our expertise? What are we good at? And where should we play a role in researching long-term learning impacts?
We know there are other organizations that are doing really powerful work there and have a long history and résumé of that kind of action research. We’ve recently joined multiple consortiums that are digging deeply into AI policy and practice, such as TeachAI and the EdSafe AI Alliance. We also work really closely with ISTE/ASCD and Digital Promise.
What we try to do is stay a collaborative thought partner and friend, so that we can share what we’re learning, learn from them, and then integrate those learnings into the way we go about our work.
What’s your prediction for how AI will be integrated into K-12 education?
I hope that this gives us a moment of inflection to approach a powerful technology in our schools in a way that’s more thoughtful, equitable, and intentional than we have in the past.
I was part of the big 1-to-1 [computing] phase in schools. At the time, it felt like: get the devices in front of kids, get them open, have them on, and use them. It was kind of like time on technology versus time on task.
What I hope now with AI is that it’s less about time on tech and more about who has access to it, how they’re using it, and whether it’s the right use. I see a lot of organizations doing that, ours included, and we’re making those choices in which features we’re building, how we’re building them, and how we’re making them accessible.