
Special Report: Assessment

Digital Simulations Emphasize Problem Solving

By Benjamin Herold — March 10, 2014 9 min read

Multiple-choice exams measuring the breadth of students’ content knowledge have hardly disappeared.

But the creators of a leading international exam, the U.S. government agency behind “the nation’s report card,” and the makers of new classroom learning tools are all turning their attention to something new: digital simulations aiming to measure how students solve problems, communicate, and work with others.

Examples include simulations that ask students to collaborate with a computer avatar to keep a tankful of tropical fish alive, help a virtual village fix its broken water pump, develop a model of an atom, and help an online city reduce pollution while growing its economy.

“In the real world, there is often more than one route to the right answer, or more than one right answer,” said Michael Barber, the chief education strategist for education publishing giant Pearson and a strong backer of the new assessment trend. “With simulations, you can have complex problems and allow different students to find different ways through.”

The use of simulations for assessment isn’t new: For decades, for example, airline companies and the military have used simulators to train and screen prospective pilots. Many educators have also had a long-standing interest in more traditional forms of low-tech simulated performance assessments, in which students are required to complete a sophisticated task, such as designing and conducting a science experiment.

But questions of cost, reliability, validity, and scalability have been stumbling blocks in the past to widespread adoption of both simulations and performance assessments, said Eva L. Baker, a research professor in the graduate school of education and information studies at the University of California, Los Angeles. She said those same challenges remain with today’s emerging technologies.

“We all want to avoid the gee-whiz factor,” said Ms. Baker, the director of the National Center for Research on Evaluation, Standards, & Student Testing, or CRESST. “We have to make sure there is adequate evidence these things work and are not simply the next big thing.”

Real-Time Feedback

Proponents maintain that digital simulations, which experts in the field say generally involve less narrative, design, and purely fun elements than digital games, can increase student engagement and provide real-time feedback to educators. They also believe the torrents of data that simulations generate can be mined for evidence of deeper learning—not just whether a student selects the correct canned response, but whether he or she asks the right questions, uncovers all the necessary information to make a smart choice, and shares that information effectively with a partner.

“Knowing the right answer is just half of what we should be looking for,” said Peggy G. Carr, the associate commissioner of the assessments division at the National Center for Education Statistics, which this spring is field-testing its new simulation-heavy technology and engineering literacy assessment with 20,000 U.S. 8th graders.

But while the TEL exam—like the Program for International Student Assessment, or PISA, which will begin incorporating digital simulations in 2015—is a summative test that aims to provide an overview of student performance, the real power of simulations lies inside the classroom, said Christopher Dede, a professor of learning technologies in Harvard University’s Graduate School of Education.

“If we can develop better formative assessments that provide diagnostic information that is used to go back and change the course of instruction,” Mr. Dede said, “we can have a much greater impact.”

Some groups are working on exactly that goal: Pearson, which is based in London and has U.S. headquarters in New York City, is playing a key support role in the development and rollout of SimCityEDU, a classroom adaptation of the classic urban-planning computer game. And PhET, a research and development laboratory at the University of Colorado at Boulder, is currently expanding the formative-assessment tools built into its 128 digital math and science simulations.

Barriers remain: It takes about six months and $50,000 to build a single PhET simulation, for example, and parents and privacy advocates are likely to worry about how all the student data that simulations generate is secured. On a practical level, the nation’s schools are likely to be occupied for the foreseeable future with the challenge of moving even more traditional assessments online.

“I don’t think this is going to change the face of education in a two-, three-, or even five-year period,” said Mr. Barber of Pearson. “But I think in 10 years, it will be transformative.”

Digital simulations are making their way into a wide variety of assessments, used for an equally wide variety of purposes. Four prominent players in this emerging field:

1. National Center for Education Statistics

How do you measure students’ ability to understand, use, and improve technology?

Test-takers must determine the best way to fix a broken well in a simulation for the new technology and engineering literacy exam by the NCES.

With online “task-based scenarios,” say officials of the NCES, which administers the National Assessment of Educational Progress.

In one such task on the agency’s new technology and engineering literacy exam, students are asked to assume the role of an engineer who must figure out why a village’s water well has stopped pumping water. The item gauges test-takers’ troubleshooting skills: Do they access and use the most relevant information? Can they quickly diagnose the problem?

“It’s intended to provide an assessment that’s more authentic for students,” said Peggy G. Carr, the associate commissioner of the assessments division at the NCES, a branch of the U.S. Department of Education.

The exam is currently going through its second large field test, although no live administration has yet been scheduled. The NCES is investing in the new assessment, Ms. Carr said, in part because science, technology, engineering, and mathematics skills often cut across academic content areas and are untouched by subject-specific tests.

The reliability and validity of digital simulations remains a concern, but she said an earlier field test of the new exam helped: The NCES feels confident, for example, that it has identified the most efficient way for test-takers to troubleshoot the broken pump. Test-takers who pursue that path will get the highest score. “There has to be a way to score these tasks such that you can put them on a scale with higher or lower levels of proficiency,” Ms. Carr said.

2. Organization for Economic Cooperation and Development

A leading global assessment used to compare the reading, mathematics, and science skills of students in 70 countries will use digital simulations to measure students’ problem-solving ability—both alone and with others—beginning in 2015.

Beginning in 2015, students taking a new PISA exam will collaborate with a computer avatar to keep a tank of virtual tropical fish alive.

In science, students taking the Program for International Student Assessment, or PISA, next year will encounter “experimental situations” in which they must manipulate multiple variables in order to perform a real-world task, such as figuring out how to make the most efficient zeer pot (a clay refrigerator that doesn’t require electricity).

PISA will also include in 2015 a new exam solely dedicated to measuring students’ collaborative problem-solving ability. In one sample item, the challenge is to keep a tankful of tropical fish alive. In order to succeed, students will need to interact with a computer avatar via a chat window to uncover necessary information, devise and implement a plan of action, and communicate the results of their experiment.

“Simulations test students’ thinking skills and creativity in far greater ways than we can do with pencil and paper,” said Michael Davidson, the head of the early-childhood education and schools division of the Paris-based OECD, which administers PISA. “If countries agree that these are skills 15-year-olds should have, then having a summative assessment like PISA to determine if those skills are present is useful.”

3. PhET Interactive Simulations Project

For some, digital simulations are less about determining students’ proficiency than about helping them learn.

“Our simulations are providing dynamic feedback so students can build an understanding of cause-and-effect relationships, develop mental models, ask questions, generate evidence, and engage in the scientific process at the same time they’re learning core content ideas,” said Kathy Perkins, the director of PhET, based at the University of Colorado at Boulder.

A popular simulation from PhET lets science students manipulate the structure of atoms.

Since 2004, the PhET team has created 128 interactive digital simulations in physics, biology, chemistry, earth science, and math. All told, Ms. Perkins said, the simulations have been used more than 45 million times, primarily in middle and high schools.

In one popular example, students can drag and drop protons, neutrons, and electrons into a simulated atom.

Using the PhET simulations for formative assessment requires a lot from teachers, who generally must infer for themselves how students are interacting with the tools, have the skills to get students talking about what they’re learning, and know how to adjust classroom instruction based on the resulting insights.

To ease that burden, PhET is working to expand its professional-development resources for teachers and working with external partners such as Pearson and the Princeton, N.J.-based Educational Testing Service to think about new tools and functions that can be added to the simulations.

“We want to create tools that the broader education community can use and leverage,” Ms. Perkins said.

4. Pearson

Pearson has long been a titan in the standardized-testing industry, but officials from the London- and New York City-based company say they’re most excited about the potential of digital games and simulations to provide formative support inside classrooms. “I think [simulations] have a place in summative assessments, but you have to take their limitations into account,” said Kristen DiCerbo, a senior research scientist for the company.

Students can show their “systems thinking” skills by reducing pollution while improving the economy in the virtual city in SimCityEDU, a new classroom assessment tool from GlassLab.

Ms. DiCerbo has worked extensively with GlassLab—a team of researchers, designers, developers, and learning scientists making games for assessment, in which Pearson is a participant—on the development of the recently released SimCityEDU. A classroom adaptation of the classic urban-planning computer game, SimCityEDU—which Ms. DiCerbo describes as more game than simulation because of the level of design, narrative, and just plain fun built in—aims to promote and assess students’ “systems thinking,” or grasp of multiple overlapping cause-and-effect relationships.

As students try to reduce pollution while also improving the economy in a virtual city, their every digital action is tracked: 3,000 points of data or more in a 10-minute play cycle.

Ms. DiCerbo leads the effort to mine that information, searching for patterns that suggest evidence of learning. How to safely secure all that student data remains a sensitive issue, and there have been surprises: What to make, for example, of the 5 percent to 10 percent of players who decide the most efficient way to meet SimCityEDU’s goals is simply to bulldoze big sections of the city?


