
Artificial Intelligence

Can Schools and Vendors Work Together Constructively on AI? A New Guide May Help

The Education Department outlines key steps on AI development for schools
By Alyson Klein | July 08, 2024 | 4 min read

Educators need to work with vendors and tech developers to ensure that artificial intelligence-driven innovations for schools go hand in hand with managing the technology's risks, recommends guidance released July 8 by the U.S. Department of Education.

The guidance includes extensive recommendations for both vendors and school district officials.

In some ways, companies and tech developers are in a tough spot with AI. Many want to move cautiously, developing tools with educator feedback that are properly tested and don't amplify societal bias or deliver inaccurate information.

On the other hand, developers also want to serve the current market and don't want to get left behind the competition. And they want to find new and exciting uses for AI.

It's not an either-or, the department's guidance argues. Vendors and educators can try new things with AI, like enabling teachers to use it to write emails, if they consider important questions such as: Who will ensure that students' private information isn't shared? Or, as districts use AI-powered "early warning" systems that could identify students at risk of dropping out, how can they evaluate whether the system is instead magnifying biases or even violating students' civil rights?

The guidance also recommends that AI should not be allowed to make decisions unchecked by educators, and that developers need to design AI tools based on evidence-based practices, incorporating educator input and feedback, while safeguarding students' data and civil rights.

The document grew out of the Biden administration's AI bill of rights, which was released in October 2022, and its executive order on artificial intelligence, which was put forth a year later and called on the Education Department to develop AI policy resources for school districts.

And it builds on the department's own report on AI in K-12 schools, released in May 2023. That document helped form the basis of some states' and school districts' approaches to AI policy.

The new guidance is not regulatory, meaning no developer or district is obligated to adhere to it. Instead, it is intended to shape educators鈥 and developers鈥 thinking.

One key point the department is making simply by releasing the guidance now: Developers and school districts don't need sweeping, official federal action, like regulation or new legislation, to get started on creating responsible AI tools.

"We don't have to wait," said Jeremy Roschelle, the co-executive director of learning science research for Digital Promise, a nonprofit organization that works on equity and technology issues in schools. He was one of the experts who informed the guidance. "We have some solid principles already, and they can fly."

For example, Roschelle said "if there is algorithmic discrimination in a product and it's blocking students' opportunity to learn or unfairly disciplining students ... that can be a civil rights issue."

Plus, he said "we have a ton of laws on privacy. They're not perfect. They need an update. But we can get going. And we have a lot of evidence [on what works in education]. Those are three really strong stakes in the ground" to guide AI development.

Here are the five key recommendations in the report for vendors and educators:

1. Design with teaching and learning in mind

This encompasses steps such as ensuring that humans, namely teachers, are the final decisionmakers on any AI recommendations, and that products are deployed ethically.

2. Show how evidence-backed principles were used to create products

Developers need to be clear about how they use evidence-backed principles to design their products, as well as which student outcomes their tools are meant to improve. They also need to help educators understand which types of students their tools are for and tested on, and under what conditions.

3. Take steps to remove or mitigate bias

AI can amplify societal biases, a dangerous risk when it's being deployed for education. Developers need to ask themselves questions, the guidance notes, such as: "What steps can we take to audit and remove the potential bias or algorithmic discrimination in our product, with special attention to mitigating any impacts for vulnerable or underserved populations?"

4. Protect student privacy

For starters, developers need to know the major laws and regulations governing student privacy, including the Family Educational Rights and Privacy Act (FERPA) and the Children's Online Privacy Protection Act (COPPA). To be sure, cybersecurity has been a major responsibility of districts and vendors for years. But AI privacy violations can go beyond the typical cybersecurity woes. For instance, students and even educators have used AI to create "deepfakes" of their classmates and colleagues. Developers need to probe questions, the guidance says, such as: "What are we hearing as top safety and security concerns from educators regarding our product or similar products?"

5. Be transparent about how products are designed

Developers need to be clear with educators about how their AI-powered products actually work. That could include steps such as providing demonstrations, training for educators on using the products, and guaranteeing that if something goes wrong, it will be fixed quickly.

The guidance recommends that developers ask themselves: "How can our organization contribute to AI literacy in the broader ed-tech ecosystem?"

The new guidance was crafted, in part, through an "extensive series" of public listening sessions with students, parents, and educators as well as developers, industry associations, and nonprofit organizations. Department officials also met with a smaller group of developers, drawn from the initial participants.
