All school year, Kaylee Carrell has been watching online math videos using a free software platform called Algebra Nation.
What the Florida 8th grader didn’t know: The software was also watching her.
As part of her nightly homework, Carrell might start a video, watch an instructor explain a concept, rewind to review, press pause when she was ready to solve a problem, and post messages to the Facebook-style “wall” if she needed help. Occasionally, a brief survey might pop up.
Behind the scenes, the software was diligently tracking all that activity, anonymously logging the clicks and keystrokes of Carrell and more than 200,000 other students. As part of an $8.9 million federal grant project, researchers then used machine-learning techniques to search for patterns. Their ultimate goal: improve student learning by teaching the software to pinpoint when children are feeling happy, bored, or engaged.
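To make the mechanics concrete, here is a minimal sketch, in Python, of how anonymized click events might be reduced to a handful of features and paired with occasional pop-up survey responses to train a simple engagement classifier. The event fields, features, labels, and model choice are all assumptions made for illustration; the Algebra Nation researchers have not published their pipeline in this form.

```python
# Illustrative only: anonymized click events collapsed into simple counts,
# with labels taken from hypothetical pop-up surveys (1 = engaged, 0 = bored).
from dataclasses import dataclass
from sklearn.linear_model import LogisticRegression

@dataclass
class ClickEvent:
    session_id: str   # random session token, no student identity attached
    action: str       # "play", "pause", "rewind", or "post_to_wall"
    timestamp: float  # seconds since the session started

def session_features(events):
    """Collapse one session's events into counts a model can use."""
    pauses = sum(e.action == "pause" for e in events)
    rewinds = sum(e.action == "rewind" for e in events)
    posts = sum(e.action == "post_to_wall" for e in events)
    duration = max((e.timestamp for e in events), default=0.0)
    return [pauses, rewinds, posts, duration]

# Two toy sessions, labeled from imagined survey answers.
session_a = [ClickEvent("a", "play", 0.0), ClickEvent("a", "rewind", 40.0),
             ClickEvent("a", "pause", 90.0), ClickEvent("a", "post_to_wall", 120.0)]
session_b = [ClickEvent("b", "play", 0.0), ClickEvent("b", "pause", 15.0)]

X = [session_features(session_a), session_features(session_b)]
y = [1, 0]  # 1 = reported "engaged", 0 = reported "bored"
model = LogisticRegression().fit(X, y)
print(model.predict(X))  # predicted engagement label per session
```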
It’s just one example of a growing push to use educational technology to measure, monitor, and modify students’ emotions, mindsets, and ways of thinking.
The trend is provoking strong and conflicting reactions throughout the K-12 world.
“I can see how it could be really helpful,” Carrell said after a reporter explained the mechanics of the Algebra Nation research in which she was unknowingly taking part.
“But home is also supposed to be a safe space. You don’t want to feel like your computer is watching you.”
Personalized Learning for the ‘Whole Child’
For years, there’s been a movement to personalize student learning based on each child’s academic strengths, weaknesses, and preferences. Now, some experts believe such efforts shouldn’t be limited to determining how well individual kids spell or subtract. To be effective, the thinking goes, schools also need to know when students are distracted, whether they’re willing to embrace new challenges, and if they can control their impulses and empathize with the emotions of those around them.
To describe this constellation of traits and abilities, education experts use a host of often-overlapping terms, such as social-emotional skills, non-cognitive abilities, character traits, and executive functions.
For many parents and teachers, it’s common sense: Kids do well when they pay attention, work hard, and get along with others.
An emerging body of research backs that intuition, tying these non-academic factors to improved school achievement, future workplace success, and long-term well-being. Learning scientists are also increasingly convinced these traits and abilities can be improved with practice.
The result has been a groundswell of interest. A major international test of students’ social-emotional skills will be unveiled in 2019. The recently passed federal education law, known as the Every Student Succeeds Act, tried to push states and schools to broaden their definition of success. And major philanthropies and venture-capital firms are lining up to support the movement with hundreds of millions of dollars.
Sensing opportunity, a fresh crop of companies has sprouted up.
One of the most popular now administers online surveys to more than 7 million students a year, generating a massive database about children’s “grit” and “growth mindset.”
Others claim they can improve children’s “impulse control” through video games; provide parents with a “high-dimensional psychometric profile” of their preschoolers; and allow school staffers to use smartphones to continually record their observations of students’ feelings.
Meanwhile, cutting-edge researchers are also exploring facial recognition, eye-tracking, wearable devices, and even virtual reality as ways to better gauge what students are feeling.
Proponents view this push to understand and respond to the “whole child” as a promising path to dramatically improving student learning—and an antidote to the K-12 world’s longstanding focus on standardized tests.
Critics fear an Orwellian surveillance state, in which government and corporations alike invade students’ privacy, encroach on their individual liberty, and try to manipulate their behavior based on spurious measurements.
Recent fiascoes involving misuse of sensitive data harvested by consumer platforms such as Facebook have only amplified the stakes.
Add it all up, and the K-12 sector finds itself walking a tightrope.
“Social-emotional learning is really about building relationships and communities,” said Jeremy Taylor, the director of assessment for the nonprofit Collaborative for Academic, Social, and Emotional Learning. “We hope ed tech can support that, but we have to make sure things are done in a way that is responsible and not outpacing people’s comfort with what is happening in their schools.”
More Than Just a Test Score
Among the most prominent groups seeking to make sense of this new landscape is the Chan Zuckerberg Initiative.
A private venture-philanthropy organization founded by Facebook CEO Mark Zuckerberg and his wife, pediatrician Priscilla Chan, CZI plans to devote hundreds of millions of dollars per year to “whole-child personalized learning.”
As part of that work, the group teamed up in May with the Bill & Melinda Gates Foundation to scour the field for the latest insights on how to measure and support students’ “executive functions,” such as the ability to focus and filter out distractions.
Jim Shelton, who heads Chan Zuckerberg’s education efforts, explained his thinking in an interview.
“Kids have physical development, mental-health development, identity development, cognitive-skills development, and social-emotional skills development, as well as academic-skill development,” Shelton said. “Unless you understand how they are relating to each other, it is very difficult to optimize any of them.”
Creating that kind of 360-degree view of each individual child is the key to reaching millions of students who have been failed by the existing education system, Shelton believes. Technology may eventually help make such a holistic understanding possible at scale, he said, but efforts to develop such tools are still in their infancy.
Nevertheless, real companies are already working to pursue various parts of what Shelton described.
Take Panorama Education. Founded in 2012, the data-analytics company has raised $32 million in venture capital (including multiple rounds from CZI).
It already works with 500 school systems, including the 29,000-student Spokane, Wash., district.
In addition to questions about school safety and their relationships with teachers, all of Spokane’s 4th to 12th graders now take online surveys asking things like:
• In school, how possible is it for you to change how easily you give up?
• How often do you stay focused on the same goal for several months at a time?
• During the past 30 days, how carefully did you listen to other people’s points of view?
“We have a bigger responsibility than just cranking out kids who can pass tests.”
—Travis Schulhauser,
director of assessment and instructional technology, Spokane Public 69´«Ă˝
Panorama stores the responses in a central database, then feeds the information back to Spokane administrators and educators via customized dashboards.
As part of this year’s pilot, a small group of Spokane teachers was able to see how each individual child in their classrooms scored on a 0-5 scale for the social-emotional domain corresponding to each question: grit, growth mindset, and social awareness.
Principals and counselors also analyzed the Panorama data alongside other information, such as class rosters, to identify students with an attitude that might help them thrive in Advanced Placement courses, even if they didn’t have the highest PSAT scores. And district leaders examined how their schools compared with national benchmarks—which Panorama develops by analyzing the anonymized data of millions of students who take its surveys each year.
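For a sense of how survey answers might become the 0-5 domain scores and benchmark comparisons described above, here is a small, purely illustrative Python sketch. The item-to-domain mapping, the simple averaging, and the benchmark values are invented for the example and are not Panorama’s actual scoring method.

```python
# Hypothetical scoring sketch: average a student's 0-5 answers by domain,
# then compare against made-up national benchmarks.
from collections import defaultdict
from statistics import mean

# Each response is (domain, answer), with answers already on a 0-5 scale.
responses = [
    ("grit", 4), ("grit", 3),
    ("growth_mindset", 5), ("growth_mindset", 2),
    ("social_awareness", 3), ("social_awareness", 4),
]

by_domain = defaultdict(list)
for domain, answer in responses:
    by_domain[domain].append(answer)

student_scores = {d: round(mean(vals), 1) for d, vals in by_domain.items()}

# Invented benchmark values, used only to show the comparison step.
benchmarks = {"grit": 3.4, "growth_mindset": 3.6, "social_awareness": 3.2}
for domain, score in student_scores.items():
    delta = score - benchmarks[domain]
    print(f"{domain}: {score} ({delta:+.1f} vs. benchmark)")
```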
Spokane officials describe their multi-pronged use of Panorama data as key to making better decisions.
“As we get more and more information from kids, I think we’ll respond to each student better and better,” said Travis Schulhauser, the district’s director of assessment and instructional technology.
Plenty of schools are embracing social-emotional learning without using technology or trying to measure individual students’ development.
New research has raised questions about whether classroom-based interventions around concepts like growth mindset are likely to have a significant impact.
And some of the leading pioneers in the field have warned about the limits of using surveys and questionnaires to measure social-emotional skills. Even Shelton said CZI invested in Panorama primarily because of its surveys related to school climate and culture—and that he wasn’t familiar enough with its “entire product suite” to say whether it provides “a valid and reliable measure of a developmental area.”
Regardless, Spokane is just one of hundreds of districts embracing the digital tools and data they’re currently being offered.
“We have a bigger responsibility than just cranking out kids who can pass tests,” Schulhauser said. “We take seriously our responsibility to develop students who are well-rounded and able to achieve their hopes and dreams.”
‘Just Un-American’
Why wouldn’t someone want schools to look beyond test scores, or use technology to encourage kids to persevere in class and build healthy relationships with peers?
“Sorry, but I don’t want the government knowing how my child’s mind works,” said Jane Robbins, an attorney and senior fellow with the American Principles Project Foundation, a think tank that promotes individual liberty.
Like many conservative and libertarian parents and activists, Robbins is strongly opposed to using technology to track student emotions and mindsets.
An individual teacher encouraging students to try hard is one thing, she said. But surveys like Panorama’s are “borderline mental-health assessments,” she believes, and they encourage a view of children as potential patients in need of treatment.
In addition, Robbins argued, public schools are an extension of government. And government should not be trying to monitor and mold what individual citizens feel.
Plus, some critics on the right view the social-emotional learning movement as a thinly disguised effort to promote ostensibly left-wing causes, such as gay and transgender rights. (“Might a student’s ‘relationship skills’ be deemed deficient if, in keeping with the influence of his family and faith, he rejects the LGBT agenda such as same-sex marriage and normalization of gender dysphoria?” asked conservative activist Karen Effrem in a 2016 essay, “69´«Ă˝ Ditch Academics for Emotional Manipulation.”)
And all those concerns get amplified dramatically when large-scale digital data collection is added to the mix.
“You don’t think these databases are going to be interesting down the line to employers, or prosecutors?” Robbins asked.
Given such opposition, it’s not hard to imagine the reaction to newer entries into the social-emotional learning space, such as San Francisco-based startup Emote.
The company’s core service is a mobile app that makes it easier for a wide range of school staff, from bus drivers to teachers, to record and share their observations of when students appear sad, anxious, angry, and frustrated.
Launched in 2016 with $120,000 in support from Y Combinator, one of the most high-profile “accelerators” in Silicon Valley, Emote is already used by 16 districts in more than a dozen states.
How does it work?
Imagine a student gets in an argument with his parent before school, said Julian Golder, the company’s 32-year-old CEO. That morning, the boy stomps into the school building. A staff member at the front desk notices, so she enters the observation directly into the Emote app on her smartphone, selecting from a menu of keywords (“sad”) and a color-coded scale (“blue”).
That, in turn, prompts a notification to be sent to each of the student’s teachers.
If those teachers then notice something similar—the student has his head down in 1st period, or seems disconnected in 3rd period—they also record their observations in the app.
If a pattern emerges, Golder said, the student may be at risk of “escalation,” such as getting into a fight or flunking his 5th period math test.
Despite their best efforts, he said, schools often miss these kinds of developments in the moment, leaving them to respond to the trouble after it happens.
But Emote helps adults notice and communicate about kids’ feelings and behaviors in real time, Golder said. That way, they can respond before something bad happens.
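A rough sketch of that observe-notify-flag flow, with invented field names and thresholds (this is not Emote’s actual code), might look like this:

```python
# Hypothetical sketch: a staff member logs a mood observation, the student's
# teachers are notified, and repeated observations in one day raise a flag.
from dataclasses import dataclass, field

@dataclass
class Observation:
    student_id: str
    keyword: str   # e.g. "sad", "anxious", "angry", "frustrated"
    color: str     # color-coded scale, e.g. "blue"
    where: str     # "front desk", "1st period", ...

@dataclass
class StudentDay:
    teachers: list
    observations: list = field(default_factory=list)

    def record(self, obs: Observation):
        self.observations.append(obs)
        for teacher in self.teachers:
            print(f"notify {teacher}: {obs.keyword} ({obs.color}) at {obs.where}")
        # Three or more observations in one day suggests a pattern (invented rule).
        if len(self.observations) >= 3:
            print("flag: possible escalation risk, loop in a counselor")

day = StudentDay(teachers=["Ms. Lopez", "Mr. Park"])
day.record(Observation("s-1", "sad", "blue", "front desk"))
day.record(Observation("s-1", "sad", "blue", "1st period"))
day.record(Observation("s-1", "frustrated", "blue", "3rd period"))
```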
“Sorry, but I don’t want the government knowing how my child’s mind works.”
—Jane Robbins,
attorney and senior fellow, American Principles Project Foundation
The company also allows schools to track students’ feelings longitudinally: Maybe the student feels angry and disconnected every Monday morning because there’s some underlying issue at home that needs to be addressed? Or perhaps the school’s African-American boys are consistently feeling disconnected after a particular class, and some observations of the teacher are warranted?
“I think this is a really exciting vision of what school can look like,” Golder said.
“There’s more interest than we can handle at this point.”
Still, Robbins of the American Principles Project Foundation isn’t buying it.
“The idea of telling children that even their feelings are not private, and that we’re going to constantly surveil them and analyze them, is just un-American,” Robbins said. “The only good thing I can see about this company is that they’re not hooking kids up to wearables.”
‘Expand the Realm of the Possible’
Broadly speaking, however, that’s exactly where the field appears to be heading.
In 2016, the World Economic Forum and the Boston Consulting Group issued a report on the future of ed tech and social-emotional learning. It highlighted wearable devices as a way to “expand the realm of the possible.”
Among the companies trying to deliver on that promise: Cambridge, U.K.-based Tinylogics, which is currently testing a wearable it calls FOCI. When clipped to a waistband, the device will track users’ breathing patterns, then tell them when they’re feeling focused, relaxed, fatigued, or stressed. Tinylogics describes an accompanying app as a “focus-enhancing mind coach.” A recent press release from the company billed the FOCI as a tool for managing digital distractions in the classroom.
Prominent ed-tech venture-capitalists are also touting virtual reality as an “empathy technology.” A company called Mursion, for example, has been working with the Alexandria, Va., public schools to use immersive VR simulations as part of an effort to help students with autism develop their social-emotional learning skills.
And the World Economic Forum and the Boston Consulting Group also touted the field of “affective computing,” in which machines are trained to recognize, interpret, and simulate human emotions.
Which brings the story back to Algebra Nation and students like Kaylee Carrell in West Palm Beach, Fla.
The effort to use clickstream data from the online platform in order to identify student emotions and engagement is being led by University of Colorado Boulder professor Sidney D’Mello, a national leader in the affective-computing field.
So far, all the information harvested by the team on the Algebra Nation project has been collected anonymously, in a way that is “intentionally insufficient to build affective profiles for individual students (which is not our goal at all),” D’Mello wrote in an email.
Because of that approach, the institutional review board tasked with approving the project determined that no parental consent was yet needed.
But that could change soon.
Beginning in spring 2019, pending the review board’s approval of a series of new consent protocols, some students may continue to use the current version of Algebra Nation, while others interact with a version that analyzes their individual click patterns in real time to identify when they’re becoming bored or frustrated. If those predictions prove sufficiently accurate, the software will respond in the moment with personalized prompts or supports. The researchers will then examine whether students learn algebra better when the software adapts to their emotional states.
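What such in-the-moment adaptation could look like, in a deliberately simplified form: the sketch below stands in for the trained model with crude rules over recent click activity and surfaces a supportive prompt past a threshold. The features, thresholds, and messages are hypothetical, not the project’s actual design.

```python
# Hypothetical sketch of real-time adaptation driven by recent clicks.
def affect_scores(recent_actions):
    """Stand-in for a trained model: crude rules over recent actions."""
    rewinds = recent_actions.count("rewind")
    idle = recent_actions.count("idle")
    return {
        "frustrated": min(1.0, rewinds / 4),  # lots of rewinding
        "bored": min(1.0, idle / 3),          # long stretches of inactivity
    }

def maybe_intervene(recent_actions, threshold=0.75):
    scores = affect_scores(recent_actions)
    if scores["frustrated"] >= threshold:
        return "This one is tricky. Want a worked example before the next problem?"
    if scores["bored"] >= threshold:
        return "Ready to try a quick practice question?"
    return None  # otherwise, stay out of the student's way

print(maybe_intervene(["play", "rewind", "rewind", "rewind", "rewind"]))
```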
At Conniston Middle School, Kaylee Carrell and her classmates had nuanced reactions, calling the idea both “cool” and “creepy.”
The one place where students consistently drew a line: facial recognition.
“I would feel like I’m being watched, like someone is spying on me,” said 8th grader Merlin Aguilar.
But even there, related technology is already being used in K-12.
Through the Emotive Computing Lab that D’Mello runs, for example, he’s conducting a separate research project using eye-trackers and webcams to track consenting students’ eye movements and facial expressions. Algorithms examine the resulting data to determine when kids’ minds are wandering. If a student is zoning out, the software will intervene.
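As a rough illustration of the idea (not D’Mello’s actual system), a mind-wandering detector might watch how much of the recent gaze has drifted off the reading area and interrupt with a check-in question. The screen region, timing window, and intervention below are all assumptions.

```python
# Hypothetical gaze-based mind-wandering check.
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float        # seconds
    on_text: bool   # whether gaze fell inside the reading region

def is_mind_wandering(samples, window=10.0, off_text_ratio=0.8):
    """True if most recent gaze samples fall outside the reading region."""
    recent = [s for s in samples if s.t >= samples[-1].t - window]
    off = sum(not s.on_text for s in recent)
    return bool(recent) and off / len(recent) >= off_text_ratio

samples = [GazeSample(float(t), on_text=(t < 3)) for t in range(12)]
if is_mind_wandering(samples):
    print("Quick check-in: can you summarize the last paragraph?")
```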
D’Mello acknowledged the privacy concerns associated with such technologies. But he said some level of risk is necessary in order to pursue a promising new vision of education.
“If I could always have teachers that are adapting to me, looking at my mistakes, giving me motivation and supports, and then backing away and giving me room, I think I’d be a much better learner,” he said.
Clearly, that view has support from high places. The Algebra Nation project, for example, is being funded with an $8.9 million grant from the federal Institute of Education Sciences, the research arm of the U.S. Department of Education.
But recent controversies in other sectors have started to cast the use of technology to track student emotions in a critical new light.
Take the recent Facebook-Cambridge Analytica scandal, in which sensitive information about the preferences and personalities of millions of users was misused as part of an effort to influence their votes during the 2016 presidential election.
The resulting headlines and Congressional hearings have helped make the K-12 world more aware that similar technologies are already making their way into schools, said Ben Williamson, a lecturer at the University of Stirling in the United Kingdom.
That, in turn, is heightening awareness of the possibility of unintended consequences.
Data breaches are the most obvious potential downside, Williamson said, although he wasn’t aware of any such incidents involving emotional or affective information collected from students.
But more pernicious, Williamson believes, is the potential for “psycho-compulsion and behavior modification.”
“If you generate detailed information about students’ feelings, then it becomes possible to target them in sophisticated ways in order to nudge them to behave in ways that conform with a particular, idealized model of a ‘good student,’” Williamson said.
For supporters, that’s part of the promise: The possibility that technology might help students develop grit and focus—and improve their grades, job prospects, and long-term health—is reason to invest millions now.
For critics, though, it’s the wrong end of a slippery slope: Government agencies and Silicon Valley companies deciding how students should be thinking and what they should be feeling—then collecting massive amounts of data and deploying invisible algorithms to enact that agenda—is something to be fought now, before the horse is all the way out of the barn.
The one thing both sides agree on?
“The technology is powerful,” Williamson said, “and it could have real consequences.”