School districts will soon have opportunities to compare and learn from each other’s methods of collecting and managing data through technology, when the lessons from one of the largest studies of district data practices are unveiled in June.
The study, which focuses on 71 districts of all sizes and demographics, is a joint project run by the Austin, Texas-based Data Quality Campaign and APQC, a nonprofit education and research group in Houston.
While some previous studies have examined technology and data management in individual school districts, this effort aims to draw universal lessons from all the districts studied, says Aimee Guidera, the director of the Data Quality Campaign.
“As we keep increasing the ability and capacity of states and districts to produce data, there’s an increasing hunger surrounding what to do with that data and how to create an environment to use that data to improve school processes and student achievement,” Guidera says.
In the past, there have been small studies of district data technology and what Guidera calls “anecdotal islands of excellence,” in which an isolated district or school had success. But the goal of the new study, she says, is to look closely at data and technology practices and extract information that can benefit all districts.
“We want to create a model for how districts can think about using data,” she says. “It’s moving us from these islands of success into a way to understand how to do this and transfer it to another environment, another district, another school.”
Daniel Grayson, a project manager at APQC, formerly known as the American Productivity and Quality Center, is guiding the research. Through interviews and other research, Grayson and his colleagues identified “best practice” districts that featured unique or innovative data practices. Some received visits from the research team, often joined by representatives of other districts looking to learn more for their own data efforts. The visits included conference calls and webcasts of presentations so that officials from other districts who could not attend could still gather knowledge and ask questions. APQC also asked districts to fill out a comprehensive survey about their data practices.
Avoiding Pitfalls
All that information will be used to build customized reports for the participating districts, designed to assess their strengths and weaknesses in collecting and using data. The districts will also be able to compare their techniques with those of other districts. Ultimately, the reports will be available online to all school districts, Grayson says.
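How the reports will compute those comparisons is not spelled out; as a hedged illustration, one simple calculation such a report might include is a percentile-style standing on a survey metric relative to the other participating districts. In the Python sketch below, the metric and every score are invented.

```python
# Hypothetical sketch of one benchmarking comparison a customized report
# might draw: a district's standing on a survey metric relative to the
# other participants. The metric and all scores are invented.

peer_scores = [58, 61, 67, 72, 74, 80, 85, 90]  # e.g., "data-use maturity" ratings
district_score = 74

# Share of participating districts scoring at or below this one.
standing = sum(score <= district_score for score in peer_scores) / len(peer_scores)
print(f"{standing:.0%} of participating districts score at or below this district.")
```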
In addition, a knowledge-transfer session is scheduled for July 9-10 in Houston, where representatives from each of the participating districts can take part in roundtable discussions that highlight tactics in data-driven decisionmaking.
“There’s a tremendous amount districts can learn from each other, especially districts at different places in terms of technology options,” Grayson says. “They can find huge shortcuts or pitfalls to avoid.”
Mathew K. Fail is the director of quality for the 20,000-student Iredell-Statesville, N.C., school district, which was chosen as a “best practice” district. Over several years, the district has built a vast data warehouse and worked with a vendor to customize software so the system can collect and analyze data, then present it to those who need it in a way tailored to the district and the individual user.
District data are presented in a way that’s easy to understand for teachers and administrators and doesn’t require much additional calculation. The district is working to put all school and student data collected into one system that can sort and analyze everything from demographic data, to the results of predictive assessments—which help determine whether students are struggling with subject matter before being tested—to attendance and state-testing data.
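Fail does not describe the warehouse’s internal design; purely as an illustration of the kind of consolidation he describes, the sketch below joins hypothetical demographic, predictive-assessment, and attendance records on a shared student identifier so they can be sorted and analyzed together. All field names and values are invented, not Iredell-Statesville’s actual schema.

```python
# Illustrative sketch of consolidating district data sources on a student ID.
# Field names and data are invented, not any district's actual schema.
import pandas as pd

demographics = pd.DataFrame({
    "student_id": [101, 102],
    "grade": [7, 7],
    "school": ["East Middle", "East Middle"],
})
predictive = pd.DataFrame({
    "student_id": [101, 102],
    "math_on_track": [True, False],  # flags likely strugglers before the state test
})
attendance = pd.DataFrame({
    "student_id": [101, 102],
    "days_absent": [2, 11],
})

# One merged view lets staff sort and analyze across all three sources at once.
unified = (demographics
           .merge(predictive, on="student_id")
           .merge(attendance, on="student_id"))
print(unified[~unified["math_on_track"]])  # students who may need extra help
```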
“It’s a growing journey with that particular piece of software, but it’s been a wonderful tool,” Fail says.
The Iredell-Statesville district also has cut the time it takes to get the results of district student assessments into teachers’ hands to just a few days. Teachers can then evaluate the data and adjust their teaching styles and methods accordingly, and much faster than before.
“When we first started doing predictive assessments, two weeks later the teachers would get all the information back and have to do a lot of manipulation of the data and then analyze their strategies,” Fail says. The current system, he points out, returns the data, already analyzed for the most part, within four days.
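How that analysis is performed isn’t described; as a rough sketch of what “already analyzed” results might look like, the example below rolls raw item responses up to per-class, per-standard mastery rates, so a teacher can see at a glance where to reteach. The names and the 60 percent threshold are assumptions for illustration.

```python
# Hypothetical sketch of pre-analyzed assessment results: raw responses
# rolled up per teacher and learning standard. Names and the mastery
# threshold are invented for illustration.
import pandas as pd

responses = pd.DataFrame({
    "teacher": ["Ms. Lee", "Ms. Lee", "Ms. Lee", "Ms. Lee"],
    "student_id": [101, 101, 102, 102],
    "standard": ["fractions", "ratios", "fractions", "ratios"],
    "correct": [1, 0, 1, 0],
})

# Percent correct per class and standard; flag standards below 60% mastery
# so teachers can adjust instruction without manipulating raw data themselves.
summary = responses.groupby(["teacher", "standard"])["correct"].mean().reset_index()
summary["needs_reteach"] = summary["correct"] < 0.60
print(summary)
```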
But Fail says there are steps he would have taken differently in building the system, and he hopes other districts can learn from his district’s journey.
For example, he says, Iredell-Statesville learned over time that customizing software or processes was crucial; now the district has a policy that it will not work with a key vendor if the vendor is not willing to customize and work collaboratively.
“If we had set that standard up front, it would have made it much easier,” Fail says. “We learned that when you’re beginning to work with the vendor, we should have been demanding up front.”