Understanding the mathematical formulas used to calculate lift and thrust is still a long way from designing a 747, and the U.S. Department of Education is trying to help students cross that bridge by developing a new way to gauge how well they both understand and apply technology and engineering principles.
The National Center for Education Statistics is nearing completion of a 15,000-student pilot test—the largest in the history of the National Assessment of Educational Progress—to craft a new technology- and engineering-literacy test, or the TEL.
“What we’re talking about here is trying to put the ‘T-E’ in STEM,” said NCES Commissioner Sean P. “Jack” Buckley, referring to the common term for science, technology, engineering, and mathematics. “We’ve been assessing the [science and math] for some time, but it’s been much harder to figure out the framework for an actual, practical, functional field assessment for technology and engineering components.”
The current pilot, on track to be finished by the end of this month, targets 8th graders. In 2014, a final version of the test is slated to be administered to a nationally representative sample of 20,000 such students, with results expected in 2015. Eventually, the TEL will cover the 4th, 8th, and 12th grades.
“This is really important, and I’m glad to see it,” said Adam Gamoran, a member of the National Board of Education Sciences, the Education Department’s research advisory group, and the director of the Wisconsin Center for Education Research at the University of Wisconsin-Madison.
While a few curricula, such as the International Baccalaureate program, include engineering and technology courses, Mr. Gamoran noted there is little research on how well even today’s “digital native” generation understands technology and engineering.
“From my vantage point as a sociology researcher, I suspect there remains a substantial digital divide—that children from different backgrounds will have vastly different experiences with these questions about technology,” he said. “This will provide evidence of something we have many suspicions about but virtually no evidence.”
New Direction
The TEL represents a significant shift for the battery of tests commonly dubbed “the nation’s report card.” It will be NAEP’s first entirely computer-based test and the first in which a majority of the questions are interactive, scenario-based tasks.
International assessments, in particular the Program for International Student Assessment, already gauge proficiency with more comprehensive, applied-science questions—a focus that experts say helps explain why American students’ performance on PISA tends to lag behind that of students in other countries.
More than 2,000 engineering and technology professionals from around the United States contributed to the development of the test’s framework, which covers three interconnected areas: the design process and principles of dealing with technology in daily life; information and communication-systems technology, such as computer networks and mobile devices; and the social and ethical implications of technology’s effects on the natural world.
“We’re pretty good at assessing students in science, but how do we assess the difference between a scientific solution—some sort of global, perfect universal solution—and engineering, which is a lot more about trade-offs and constraints in a given situation to get a solution that works?” Mr. Buckley said.
Testing ‘Scenarios’
The solution, he said, is to include “much more complex and higher-order-thinking items” than have previously been used in NAEP.
Roughly 20 percent of the test’s questions will cover concrete facts and information. The rest will use a new kind of question that requires students to work through interactive engineering or technology “scenarios,” applying the critical-thinking and problem-solving approaches associated with engineering.
Each scenario is 10, 20, or 30 minutes long and gauges a student’s mastery of engineering practices, such as systematically using technology, tools, and skills to solve a problem or achieve a specific goal, or using technology to communicate and collaborate with a team and consult experts.
For example, a student may be asked to collaborate with a simulated “boss” via videoconference to improve the consumer “life cycle” of a toaster.
“NAEP is not alone in the world of large-scale standardized assessment in trying to come up with ways to better assess how people work collaboratively,” Mr. Buckley said.
Moreover, the test will begin to use student-activity data to report on and evaluate how each student solves a problem. NAEP’s writing test, for example, already collects information about how students use the in-test word-processing software to check spelling and edit sentences, but it does not use that information to evaluate students’ performance. In the TEL, a student might earn more points for solving a problem efficiently and making the best use of the tools available.
“The intent is to be much more authentic and closer to a real project,” Mr. Buckley said.
During the test’s administration, the NCES will also collect data on students’ access to technology at home and teachers’ use of technology in the classroom.