“What if we could read students’ brains and see what they’re thinking?”
That was the question posed to a group of education reporters last week by John Anderson, a professor of psychology and computer science at Carnegie Mellon University, where a cross-disciplinary team of researchers is seeking to push the boundaries of adaptive educational software.
Whether you find the idea exciting or creepy (or both), an early version of the technology is already here: Anderson’s most recent paper, “Hidden Stages of Cognition Revealed in Patterns of Brain Activation,” was published this week in the journal Psychological Science.
The basic premise: Researchers can now use brain-imaging techniques to identify the mental stages humans go through while solving math problems. From there, they can use machine-learning algorithms to find the connections between patterns of human brain activity and patterns in the data generated by students as they interact with math software. Armed with that information, the researchers hope, they can build better educational software programs capable of quickly detecting how students are attempting to solve a given problem, then responding in a personalized way.
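To make that pipeline concrete, here is a minimal sketch in Python (using scikit-learn) of the classification step: learning a mapping from brain-activation features to problem-solving stages. Everything here is a stand-in; the arrays are synthetic, the feature and stage counts are invented, and the actual research relies on far more sophisticated fMRI analysis.

```python
# A minimal sketch of the classification step described above, using
# synthetic stand-ins for real fMRI data. Feature counts, labels, and
# shapes are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Pretend each row is one brain-activity snapshot (e.g., averaged
# activation across regions) and each label is the problem-solving
# stage the student was in at that moment.
STAGES = ["encoding", "planning", "solving", "responding"]
X = rng.normal(size=(400, 20))          # 400 snapshots, 20 activation features
y = rng.choice(len(STAGES), size=400)   # stage labels (random here)

# Learn the activation-pattern-to-stage mapping; with random data this
# performs at chance, but it shows the shape of the approach.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated stage-classification accuracy: {scores.mean():.2f}")
```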
Imagine, for example, a student trying to solve a complex math problem through a “brute force” approach of random calculations, rather than by developing a formula. Instead of idling while the student flounders, such software would be able to recognize the student’s (lack of a) problem-solving strategy and quickly intervene, perhaps by redirecting the student to problems with built-in “scaffolds” that would help him or her learn how to develop the necessary formula.
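As a hypothetical illustration of that intervention logic, the sketch below flags a student whose answer log looks like unstructured guessing. The `AttemptLog` class, the heuristic, and its threshold are all invented for this example, not drawn from Carnegie Mellon’s software.

```python
# Hypothetical sketch of the strategy-detection idea above; the class,
# heuristic, and threshold are invented, not from any real tutoring system.
from dataclasses import dataclass, field


@dataclass
class AttemptLog:
    answers: list[float] = field(default_factory=list)

    def record(self, answer: float) -> None:
        self.answers.append(answer)

    def looks_like_brute_force(self, min_attempts: int = 5,
                               spread_limit: float = 50.0) -> bool:
        # Many attempts whose values swing wildly suggest guessing
        # rather than working from a formula. A real system would model
        # strategies far more carefully than this.
        if len(self.answers) < min_attempts:
            return False
        return max(self.answers) - min(self.answers) > spread_limit


log = AttemptLog()
for guess in [3, 97, 12, 250, 41, 8]:
    log.record(guess)

if log.looks_like_brute_force():
    print("Redirect to scaffolded problems that build toward the formula.")
```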
Carnegie Mellon has long been at the forefront of the adaptive-learning field. Two decades ago, researchers at the Pittsburgh-based university pioneered the software that eventually became known as Cognitive Tutor. Now owned by Carnegie Learning, an independent spinoff company, the adaptive tools are in use by roughly half a million K-12 students per year.
The CMU team’s calling card has always been what they dub “cognitive models.” Essentially, the idea is that good software, like a good teacher, needs a deep understanding not only of each step students must go through to solve a problem, but also of the common misunderstandings and wrong turns they’re likely to take along the way.
Historically, building those cognitive models and incorporating them into software has been a labor-intensive process: content-area and learning-science experts construct a schema, developers bake it into the software, researchers collect extensive data on how students interact with the software, the schema is refined, and the whole cycle starts over again.
Brain scans to build ‘cognitive models’
Introducing brain-activity scanning into the equation holds the potential to streamline that process and extend it to much deeper, more creative problem-solving exercises than is currently possible, Anderson believes.
Up until 2012, he told reporters, he and his colleagues were using such techniques to identify students’ brain-activity patterns when they were solving problems for which the researchers had well-established cognitive models. To use a rough analogy, the researchers already knew when a student was taking a left turn; the goal was to figure out what taking a left turn looked like in the brain. Big picture, the idea was to map out the connections between the math problem-solving patterns researchers already knew and the brain activity they could now observe.
Over the past few years, though, Anderson and his team have developed the capability to use brain-imaging technology to find entirely new problem-solving patterns. In other words, in situations where the researchers previously only knew that a student had gotten lost, they might now be capable of figuring out exactly where a student made the wrong turn, as well as mapping the unknown territory in which the student now finds himself or herself.
In the new research paper, for example, the team was able to identify the brain-activity patterns associated with four distinct stages students pass through while solving a particular type of complex math problem: encoding, planning, solving, and responding.
After identifying those stages, the researchers measured how long each study participant spent planning how to solve a given problem.
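Conceptually, that measurement amounts to totaling how many seconds of inferred activity fall into each stage. A toy sketch, with a made-up per-second label sequence standing in for real imaging output:

```python
# Toy sketch of dwell-time measurement: total the seconds spent in each
# inferred stage. The per-second labels below are made up; real ones
# would come from the brain-imaging analysis.
from collections import Counter

STAGES = ["encoding", "planning", "solving", "responding"]
per_second_labels = (
    ["encoding"] * 3 + ["planning"] * 9 + ["solving"] * 14 + ["responding"] * 2
)

dwell = Counter(per_second_labels)
for stage in STAGES:
    print(f"{stage:>10}: {dwell[stage]:2d} s")
```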
Previously, the invisible mental processes that people used to solve such problems were a “total mystery,” Anderson said in a statement issued by the university to announce the publication of the new research study.
“Now, when students are sitting there thinking hard, we can tell what they are thinking each second,” he said.
In reality, that might be overstating it a bit, at least for now.
At the moment, Anderson told reporters, researchers are limited to measuring how much time students are spending in a given problem-solving stage.
But in time, he believes, they will be able to recognize each arithmetic computation the brain is engaged in, and possibly even the specific numbers that a student is thinking about.
And the “bleeding-edge, future work,” Anderson said, will involve tracking and responding in real time to students’ actual brain activity, as opposed to the indirect process that happens now. With the advent of cheaper new tools such as high-quality commercial eye-tracking technology and even wearable EEG-reading devices, that future might not be as far off as it sounds.
Researchers could also include in their models information gleaned from tracking students’ emotional, or affective, state, such as frustration or excitement.
And is leveraging the power to see into students’ minds to build better software necessarily a good thing?
“One can always wonder whether a technology could be misused,” Anderson said in an interview.
“But so many educational interactions now are characterized by a teacher or software not understanding what students are really trying to do. I think many students would be only too happy to have some means of communicating what they’re doing.”