As personalized learning and adaptive learning tools spread through K-12 schools, researchers and educators are raising questions: What’s behind the algorithms that determine students’ growth and progress, and how can students’ personal and academic data be stored safely?
A report from the University of Colorado’s National Education Policy Center (NEPC) probes these questions, calling for stronger regulations around the use of algorithms in personalized learning and the collection and storage of student data.
The researchers suggest that the swift and enthusiastic adoption of personalized and adaptive learning in some schools has come at the expense of student privacy. The algorithms that power ed-tech software need to be publicly available for educators and researchers to review, and companies and districts need to be held accountable if they violate students’ privacy, the report argues.
“Science and education both are supposed to be open processes, and open to discussion and evaluation,” said Faith Boninger, a research associate at NEPC and the lead researcher on the report.
When personalized learning software uses proprietary algorithms, the report contends, developers are obscuring what data is used to evaluate students and how that evaluation process works. Educators have to trust that the criteria and methods set up by the company are pedagogically and ethically sound.
This “black box” could pose a problem, the researchers wrote.
Algorithmic bias, and the limitations of adaptive technology in general, are concerns researchers have raised before.
Though it’s common to think of algorithms as neutral, factual tools, these formulas are designed by people and shaped by their judgments. The algorithms in educational software, the report reads, “reflect the assumptions and biases of their developers and are subject to limitations in what the software can actually sort and measure.”
To combat this, NEPC suggests that legislation should require third-party evaluation of all software powered by adaptive technology. Any technology should be assessed for “validity and utility” before it’s introduced in the classroom, the report argues.
Policies like these would slow down the tech adoption process so that discriminatory algorithms or holes in companies’ data privacy policies could be identified. Boninger said NEPC wants to avoid students playing the role of “guinea pigs.”
“A better way to try to weed out unintended consequences,” she said, “is to carefully examine what you’re using before you start to use it.”
The report also calls for school-level policies that outline what data will be used for, how it will be protected, and when and how it will be disposed of.
“There isn’t a hurry, really, to get these applications into schools,” said Boninger. “What is really important is to protect the kids.”