
Assessment

NAEP Crafts Plans to Deploy Tablets for Testing

By Sean Cavanagh — April 22, 2014 8 min read

The architects of one of the most highly regarded gauges of student achievement—the National Assessment of Educational Progress—are preparing for a dramatic expansion of technology-based assessment, while relying on a strikingly different approach from the one that will be used to give online common-core exams in the states.

Federal officials say the plan for administering NAEP, often called the “nation’s report card,” is for the government to rent tablet computers from a private contractor for the test, distribute them to the sample of participating schools, and retrieve those devices after the exam.

That strategy has been used in NAEP’s earlier, less-ambitious forays into computer-based tests, with the goal of ensuring that the tests are delivered securely, reliably, and consistently. It stands in contrast to the method that will soon become familiar in schools nationwide: About three dozen states have agreed to give tests aligned to the Common Core State Standards, exams that are scheduled to begin a year from now. Those assessments will be given using the eclectic array of computing devices currently in place in schools—expected to include a mix of desktop and laptop computers and tablets, with different features and operating systems, as long as they are compatible with test requirements.

Testing officials familiar with the two assessment programs say the contrasting strategies are a reflection of the tests’ very different purposes and needs.

The common-core tests will become a central part of state accountability systems. As such, they will be taken by almost all students in participating states, and the results will carry potentially high stakes for schools.

The NAEP, by contrast, is given to a much smaller sample of schools and students, one designed to produce a nationally representative set of results. Since 1969, its role has been to provide a common benchmark for gauging student performance over time, at the national level and across states and subgroups of students.

Inevitable Shift

Renting and temporarily distributing the devices makes more sense than buying them, given the relatively small number of schools and students involved and the limited testing window, federal officials say. They also believe that using the same stock of devices, rather than the different ones in place in K-12 districts, will help ensure that students have a standardized testing experience.

The officials who direct and oversee the NAEP have been planning for full-scale technology-based testing for years. They say plans for computer-based expansion represent a dramatic, and in some ways inevitable, shift for an exam that seeks to serve as a model in the testing field.

[Sidebar graphic: Computerized Testing Comes to the 'Nation's Report Card.' Key dates for the National Assessment of Educational Progress' move to technology-based testing, although future developments are subject to change. SOURCE: National Center for Education Statistics]

“It’s a sea change in terms of the face of the NAEP and what it’s going to be able to do,” said Peggy G. Carr, an associate commissioner for the National Center for Education Statistics, which administers the NAEP. “We need to be a leader, an innovator, in this area.”

But some crucial unknowns surround the new approach, such as the cost. Neither NCES nor the contractor assigned to help distribute the computing devices would provide an estimate to Education Week of the total projected price tag of using the tablets for testing, saying that those costs could involve proprietary information from individual contractors.

The NAEP’s evolution into technology-based testing is part of a broader, nationwide shift from paper-and-pencil exams to assessments delivered with computing devices, one that is playing out at the state level and in individual classrooms.

A growing number of states have begun using computer-based assessments in recent years—about two-thirds of them conduct some form of online testing, according to the Assessment Solutions Group, a Danville, Calif.-based company that works with states and districts.

But the plans to deliver tests aligned to the common-core standards via computing devices represent a major acceleration of the online testing movement. Over the next few months, the two consortia of states designing exams tied to the standards—the Partnership for Assessment of Readiness for College and Careers and the Smarter Balanced Assessment Consortium—are staging field tests of those assessments. When that process ends, more than four million students will have taken near-final versions of the tests in English/language arts and mathematics.

NAEP tests are given to a nationally representative sample of schools and students, designed to ensure that the assessment reflects the student population. They produce a portrait of achievement at many levels, including across the nation and states, and among subgroups.

For instance, in an average state, about 2,500 students in approximately 100 public schools are assessed per grade, per subject, on NAEP tests to gauge state performance, according to recent federal estimates.

By comparison, individual states each typically test a much larger number of students—several hundred thousand—as part of their mandatory assessments.

Over the past few years, computer-based testing has been integrated into a number of NAEP tests, including in science and writing. But those tests were relatively small-scale compared with what’s ahead, noted Ms. Carr.

Phasing In All Subjects

After two years of pilot-testing, by 2017, tablets will be used for all reading, math, and writing tests for the “main” NAEP, which provides state-by-state comparisons of scores and national results. And by 2020, federal officials want to have all NAEP tests in all subjects delivered with computing devices, she said.

The current plan is for NCES to rent tablets from Westat, the Rockville, Md.-based sampling and data-collection contractor for the NAEP. Westat will bring the devices to the schools for the testing periods—as it has with paper-and-pencil and early computer-based tests—arrange for trained staff members to oversee that process, and then take back the devices when the testing ends. In 2017, each technology-based field team is likely to bring about 26 tablets to each school, Ms. Carr explained.

According to the plan, student responses will be sent to, and stored on, a test administrator’s computing device provided to each data-collection team, Ms. Carr said. Results will then be transmitted electronically and securely to another contractor, Pearson, which is charged with scoring the tests, she said.
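
The broad shape of that workflow (responses captured locally on an administrator's device, bundled, and then transmitted securely for scoring) can be illustrated with a brief sketch. This is a hypothetical illustration only: the class names, fields, and checksum step below are invented for clarity and do not describe NAEP's or its contractors' actual software.

```python
# Hypothetical sketch of the described flow: responses are recorded on a local
# administrator device during the session and packaged for secure transmission
# to the scoring contractor. Names and fields are invented for illustration.
import json
import hashlib
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class StudentResponse:
    item_id: str            # identifier for the test question
    answer: str             # the student's recorded answer
    elapsed_seconds: float  # timing data a tablet can capture alongside the answer

@dataclass
class SessionStore:
    """Local store on the administrator's device; batches responses for upload."""
    school_id: str
    responses: List[StudentResponse] = field(default_factory=list)

    def record(self, response: StudentResponse) -> None:
        self.responses.append(response)

    def package_for_upload(self) -> dict:
        """Serialize the batch and attach a checksum so the receiving system
        can verify that the payload arrived intact."""
        payload = json.dumps([asdict(r) for r in self.responses], sort_keys=True)
        return {
            "school_id": self.school_id,
            "payload": payload,
            "sha256": hashlib.sha256(payload.encode()).hexdigest(),
        }

# Example: two responses recorded during one session, then packaged.
store = SessionStore(school_id="demo-001")
store.record(StudentResponse("math-07", "B", 42.5))
store.record(StudentResponse("math-08", "C", 61.0))
print(store.package_for_upload()["sha256"])
```

In the actual program, the transmission and any encryption would be handled by the contractors' own systems; the sketch only shows the batching-and-verification idea.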

Federal officials plan to use Microsoft Surface Pro 2 tablets, with attachable keyboards, for the testing, Ms. Carr added, though it’s possible the device could change after the pilot assessments. Microsoft currently lists the base cost of the Pro 2 tablets at $899 apiece.

Dianne Walsh, a vice president for Westat, referred questions about the project’s total cost to NCES, which declined to provide it. But she said distributing computing devices on a temporary basis makes sense, because it will spare schools from having to free up their own devices during NAEP testing and provide greater assurance of a consistent test experience across schools. Early experience distributing laptops to schools for NAEP testing has shown that the model will work, Ms. Walsh said.

Moving from paper-and-pencil to technology-based tests brings myriad benefits, many testing experts say. Among the ones cited most often: scoring the tests is faster; students take tests using devices similar to those they’re likely to be using in their daily classes and at home; and potentially, tests can be designed to collect more sophisticated information about students’ knowledge, in some cases by tailoring questions based on students’ earlier responses.
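
The last of those benefits, tailoring questions to students' earlier responses, is the idea behind computerized adaptive testing. The toy sketch below illustrates the concept in its simplest form; real adaptive assessments rely on psychometric models such as item response theory, and the items and rules here are invented.

```python
# Toy illustration of adaptive item selection: step to a harder item after a
# correct answer and an easier one after a mistake. Invented items and rules;
# not how NAEP or the consortia actually select questions.
ITEM_BANK = {
    # difficulty level -> (question, correct answer)
    1: ("7 + 5 = ?", "12"),
    2: ("12 x 8 = ?", "96"),
    3: ("What is 15% of 240?", "36"),
}

def next_difficulty(current: int, was_correct: bool) -> int:
    """Move up one level after a correct answer, down one after an incorrect
    answer, staying within the bounds of the item bank."""
    step = 1 if was_correct else -1
    return max(min(current + step, max(ITEM_BANK)), min(ITEM_BANK))

# Simulate a short session with scripted answers.
difficulty = 2
for given_answer in ["96", "36", "35"]:
    question, correct = ITEM_BANK[difficulty]
    was_correct = given_answer == correct
    print(f"[level {difficulty}] {question} -> {given_answer} "
          f"({'correct' if was_correct else 'incorrect'})")
    difficulty = next_difficulty(difficulty, was_correct)
```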

Rich Results

Those potential payoffs were evident in NAEP’s nascent attempts at computer-based testing, such as in science, in which students were led through interactive tasks via computer, recalled Mary Crovo, the deputy executive director for the National Assessment Governing Board, which sets policy for the NAEP.

“Students were extremely engaged, and the information collected was rich and useful,” Ms. Crovo said. Students’ interest in the test questions appeared to be strong, even among 12th graders, a group that federal officials have sometimes struggled to convince to take the NAEP seriously.

The goal is not to take paper-and-pencil questions and “throw them on a computer screen,” Ms. Crovo said, but to craft questions that produce more information about how students think and solve problems.

Gary Phillips, a vice president for the American Institutes for Research, a Washington-based research and evaluation organization, said NAEP’s shift to online testing is “a big change and a necessary change” given the overall shift to online testing, and the benefits it brings.

But distributing tablets is not the “most efficient” way to administer the tests, said Mr. Phillips, who favors using schools’ computing devices, the strategy being used by the two main state testing consortia. (AIR has contracts with Smarter Balanced to help develop and administer its tests.)

That said, Mr. Phillips predicted that many students would be receptive to using tablets and would respond more positively to them on test day than they would to laptops or desktops.

While NAEP’s turn to technology-based assessment is receiving much less attention than the two consortia’s plans for online testing, it’s crucial for the national assessment to make that switch, and deliver the tests effectively, said Mr. Phillips, a former acting commissioner of NCES.

Soon, some states will be judging their performance on tests designed by Smarter Balanced, while others will be using PARCC-designed tests, and still others that rejected both consortia will be producing their own assessments. Policymakers and the public will need an objective way of evaluating student performance that rises above those different approaches, Mr. Phillips argued.

“The role of the NAEP is even more important than it has been in the past,” he insisted. “NAEP is the only source of comparable data. When things are in flux, it’s important that you have an independent barometer of student performance.”

A version of this article appeared in the April 23, 2014 edition of Education Week as NAEP Outlines Plans to Deploy Tablets for Tests
