Assessment

What 150 Years of Education Statistics Say About Students Today

By Sarah D. Sparks | November 16, 2017 | 6 min read

Long before there was an independent federal education department (before many states had school systems, in fact), there was a federal education statistics agency.

Today, the National Center for Education Statistics celebrates its 150th anniversary (albeit without a permanent commissioner in place). Though the agency long predates the Education Department itself, its work has laid the bedrock for education policy in the United States, from large-scale testing to tracking students over time to using surveys and local administrative data to understand changes in schools.

"NCES, even if people aren't aware of it, has played a huge role in shaping education research," said Sean P. "Jack" Buckley, a former commissioner of NCES. "The idea of standardized assessments in longitudinal studies ... really all grew out of NCES and IES [the Institute of Education Sciences], and it drives so much research now that probably more than half of researchers aren't aware of where that came from."

Early Years

In 1867, Congress passed a law creating the first federal education department, focused primarily on collecting education statistics; it was soon reduced to an office within the Department of the Interior. (The Department of Education did not exist as a full-scale, cabinet-level agency until 1979.) The statistics agency's first goal was "collecting such statistics and facts as shall show the condition and progress of education in the several States and Territories."

"You were looking at a growing nation and people were just trying to get a handle on the scope of education in the country," said Thomas Snyder, the current program director for annual reports and information at NCES.

Yet some of the key questions from those early reports would sound very familiar to education watchers today. The early annual reports often drew comparisons with international education systems in France or Prussia, though one noted with pride that the United States spent more per pupil than any other country.

Equity was also an issue from the start. In its 1870 report, for example, the agency detailed the limited schooling available for newly freed black students: the "achievement gap" at that time meant that 80 percent of black adults and 20 percent of white adults couldn't read or write their own names.

An 1872 print published by the American Chromo Co. depicts a classroom scene in which a boy is admonished by both his teacher and a woman, possibly his mother.

Data was hand-collected and varied wildly. For example, New Mexico did not provide any data in the first collection, because citizens voted against establishing any public schools that year. In Greeley, Colo., officials of a newly established school district complained that they were struggling to build a common curriculum "... with as great a variety of textbooks as there were number of pupils."

"The quality of the data was very dependent on what the states were able to put together," Snyder said of the agency's early work. "A lot of states just weren't able to provide data because they didn't have the resources. In a way, our use of administrative data has come full circle today."

Throughout the first part of the 20th century, the agency continued to expand in scope (though its staff never rose much above 120 people) as policymakers' interest in education grew. After World War I, the agency began to study more vocational and career training, and after the G.I. Bill passed and returning war veterans began heading to colleges and universities, it added significantly more studies of postsecondary access.

The agency began its first major longitudinal studies in the late 1970s and '80s.

Emerson Elliott, the first NCES commissioner to hold that title and a presidential appointment, took charge of the agency in 1985, just as NCES faced a blistering evaluation by the National Academy of Sciences. The academy criticized NCES for slow turnaround and for lacking established standards for its statistical practices.

"It was a wonderful report from the perspective that it said, in very authoritative terms, what we had been saying for years," Elliott said. The academy report argued that if NCES could not be turned around, it should be "folded up, and the responsibility sent to the [National Science Foundation]," recalled Elliott, now the director of special projects at the Council for the Accreditation of Educator Preparation.

Instead, the agency developed explicit standards for data quality and privacy, along with guidelines for how to plan studies to answer educators' and policymakers' questions, guidelines that remain largely in use today. "By the time I left, there was a consistent feeling that the center was a respectable member of the federal statistical community," he said. "You could trust the data."

Building NAEP

Elliott also expanded the National Assessment of Educational Progress from a single long-term trend study to state-level studies of math and reading, and national and internationally benchmarked studies of academic subjects, from civics and social studies to technology and engineering.

"I do remember that perceptions were very different" in NAEP's early years, Elliott said. He recalled standing next to Washington Gov. Booth Gardner, then president of the National Governors Association, just before the announcement of NAEP results in 1990. "He said, 'Ah, this won't make any difference because all the states [results] will be the same.' He couldn't possibly have been more wrong."

Peggy Carr, the current acting NCES commissioner and an NCES staff member since the early 1980s, said the agency learned a lot as NAEP expanded. For example, anomalies in NAEP reading scores led to a massive investigation and panels with other researchers. In the end, officials learned that brown ink in one of the test booklets was read incorrectly by the automated grading machines, skewing the results. "The color of ink makes a difference. ... Things like that matter," Carr said. "We have learned from those nuance errors that we have to be methodical."

Today, the agency is working to move NAEP from pencil-and-paper to computer-adaptive testing, and to shift from rapidly shrinking surveys to studies that integrate more of the demographic, programming, and other data that schools collect for general reporting and accountability purposes (commonly called administrative data) into research surveys. But those changes, too, raise new issues for the agency.

Mark Schneider, a vice president and an institute fellow at the American Institutes for Research and the NCES commissioner from 2005 to 2008, recalled that in the mid-2000s, a CD of data was being shipped on a delivery truck that got into a wreck. The CD, containing potentially personally identifiable information for more than 17,000 students, was lost; NCES had a backup of the data but had to contact every family to let them know what had happened. While merging electronic data can protect against physical losses like that one, he said he worries that large merged data sets could become bigger targets for hackers. It will be a constant balance between ensuring that policymakers and researchers have the information they need and protecting the privacy of the students who provide the data, he said.

"The key thing is the administrative data is unstructured; you have data collected for X and you need to repurpose it for Y," Schneider said.

Schneider believes NCES will be grappling with how best to use administrative data for years to come. "We had 100 years of experience in making surveys better and better; there was a whole science around question ordering, question writing ... and what do we have on administrative data? Ten years, 20, maybe? It's a new world."

A version of this article appeared in the November 29, 2017 edition of Education Week as "Happy Birthday, NCES! Agency Turns 150."
