School districts can choose from a dizzying array of software products to analyze student-achievement data. Now, three researchers have published a paper to help them make an educated choice.
The paper, by Jeffrey C. Wayman and Samuel Stringfield of Johns Hopkins University in Baltimore and Mary Yakomowski of that city's public school system, reviews the range of commercially available software products. In addition, the authors have set up a Web site to provide ongoing updates of software reviews.
Although schools now gather vast quantities of data, the authors note, the information is typically not stored in ways that are useful to teachers and principals. But that’s changing rapidly with the advent of easy-to-use and efficient software programs.
"To improve achievement, we're going to need to get data down to the classroom level, and we feel that means getting data into the hands of teachers," said Mr. Wayman, an associate research scientist at Johns Hopkins' Center for the Social Organization of Schools.
“Teachers can do this,†he added. “These things are user-friendly. If you can order a book off of amazon.com, you can look at your students’ data.â€
Selecting from among the commercial packages is not easy, though, because they offer a wide range of features. While some offer strengths in data efficiency, for example, others specialize in data presentation or graphics.
"No one piece of software accomplishes everything," the researchers write, "and some features are more common than others."
For instance, most companies offer already-formatted reports of student data that can be generated with the click of a mouse. Fewer companies permit users to perform a customized query of the data and then save that query for future use. Only two of the 13 companies reviewed offer storage and retrieval of student work samples or electronically accessible portfolios of such work.
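To picture the "saved query" feature the authors single out, here is a minimal sketch in Python; the table and its column names (grade_level, reading_score, lunch_status) are invented for illustration and are not drawn from any of the reviewed products.

```python
# Illustrative sketch only: a reusable "saved query" over a hypothetical student-data table.
# All column names here are invented for the example.
import pandas as pd

students = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "grade_level": [4, 4, 5, 5],
    "reading_score": [62, 88, 71, 55],
    "lunch_status": ["free", "paid", "free", "free"],
})

# A saved query is simply a named, reusable filter definition.
saved_queries = {
    "grade4_below_70": lambda df: df[(df.grade_level == 4) & (df.reading_score < 70)],
}

# Rerun the stored query against current data with one call.
report = saved_queries["grade4_below_70"](students)
print(report)
```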
'Messier Than You Think'
[Image caption: An example from a commercial software package that schools might use to see whether they meet federal education goals.]
The researchers identified existing software through literature and Web searches and through conversations with knowledgeable individuals, including people in the industry.
They included software in their evaluation if it enabled many different users to analyze existing student data. The researchers excluded software that permitted only district personnel to view such information. They also eliminated software that focused solely on school management issues, that was developed to generate assessment data, or that required ongoing data entry by teachers. Companies were included only if they could provide interactive demonstrations of their products.
Although the paper describes three well-respected, district-generated initiatives—in Broward County, Fla., Cleveland, and Houston—the authors believe that most districts would be wiser to rely on commercial vendors than to take the do-it-yourself route.
"School data is always, always messier than you think it is," said Mr. Wayman. The authors argue that unless a district is certain it has the expertise to deal with data problems quickly and efficiently in-house, the experience outside vendors bring to the process is well worth the cost.
Phyllis Chasser, a senior data-warehouse analyst for the Broward County schools, said the district developed its own data-gathering and -analysis tools for several reasons, beginning with the fact that others were not available when the Florida district launched its warehouse in 1996.
Commercial products, she believes, can't handle the volume or the specialized needs of large districts. "Some of the software that is available is built for small districts" that don't have the expertise to build their own, Ms. Chasser said. "Some are smaller than the size of one of our high schools."
Any software Broward might buy, she added, would have to be customized, which would likely end up being more costly.
All the programs reviewed are Web-based, so that users can access them from any Internet connection. Each of the programs also can produce reports on student performance by subgroup, such as low-income or minority students, as mandated in the federal No Child Left Behind Act. All also offer some form of ongoing technical support.
Each evaluation starts with an overview of the company and the focus of its software. The reviewers then describe the preformatted reports available and the tools teachers can use to pose questions of their own, along with an assessment of how easy those tools are to use.
In addition, the researchers address whether the software meets data-exchange standards that permit software packages to communicate with each other, a feature known as the Schools Interoperability Framework, or SIF. The Baltimore-based group advises schools to ensure that the software they use can communicate with other products, as a way of protecting their investment. Finally, the reviews address any special features of the software, such as the ability to present and analyze data longitudinally.
In general, the researchers advise, good software for student-data analysis should be user-friendly, have a relatively rapid response time, offer query tools for less sophisticated users, and enable users to "drill down" to the level of individual grades, classrooms, and students.
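To make the "drill down" idea concrete, here is a minimal sketch, assuming a hypothetical table of test scores, of how a query tool might move from a district-wide average down to a single classroom. The field names (school, grade, classroom, score) are placeholders, not features of any reviewed product.

```python
# Minimal drill-down sketch over a hypothetical scores table.
# Field names are placeholders invented for this example.
import pandas as pd

scores = pd.DataFrame({
    "school":    ["Adams", "Adams", "Adams", "Burke"],
    "grade":     [3, 3, 4, 3],
    "classroom": ["3A", "3B", "4A", "3A"],
    "student":   ["s1", "s2", "s3", "s4"],
    "score":     [78, 85, 90, 66],
})

# Drill down: district -> school -> grade -> classroom averages.
print(scores["score"].mean())                                            # district level
print(scores.groupby("school")["score"].mean())                          # school level
print(scores.groupby(["school", "grade"])["score"].mean())               # grade level
print(scores.groupby(["school", "grade", "classroom"])["score"].mean())  # classroom level
```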
What’s more, they assert, users should be able to access the software from home or the workplace and have the information available through a variety of means, including quick snapshots, query tools, and preformatted reports.
Making a Choice
In choosing a vendor, Mr. Wayman, Mr. Stringfield, and Ms. Yakomowski suggest that districts consider a variety of factors.
For example, officials should inventory the data currently stored by their district, assess the quality of that information, and estimate the cost of correcting or "cleaning" the data. Schools also need to evaluate their data and analytical needs. And they must weigh how much outside or third-party help they require to set up the system, including inventorying the initial data, cleaning the data, merging data sets, and housing the data over time. Districts also need to consider how long it will take to get the system up and running, as well as its costs.
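The "cleaning" and merging steps the authors mention can be pictured with a small sketch: below, two hypothetical data sets (enrollment and test results) are reconciled on a shared student ID, and out-of-range or unmatched records are flagged for review. The files and fields are invented for illustration only.

```python
# Illustrative data-cleaning and merging sketch; all tables and fields are hypothetical.
import pandas as pd

enrollment = pd.DataFrame({
    "student_id": ["001", "002", "003"],
    "school": ["Adams", "Adams", "Burke"],
})
test_results = pd.DataFrame({
    "student_id": ["001", "002", "004"],   # "004" has no enrollment record
    "math_score": [82, 199, 75],           # 199 is out of the valid range
})

# "Cleaning": flag scores outside the expected 0-100 range.
test_results["flag_bad_score"] = ~test_results["math_score"].between(0, 100)

# Merge on the shared ID; an outer join exposes records that fail to match.
merged = enrollment.merge(test_results, on="student_id", how="outer", indicator=True)
print(merged[merged["_merge"] != "both"])          # records needing manual review
print(merged[merged["flag_bad_score"] == True])    # suspect scores
```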
Although costs depend on many variables, including the size of the district and the features it wants, they range from as low as $2 per student per year to more than $10, the authors note, sometimes with a higher cost in the first year of implementation.
"Schools and districts would be well advised to contact other schools using the software, query them thoroughly, and visit as many as possible to get practical feedback on the types of products and levels of services each vendor offers," the authors say.
Given the rapidly changing market, they add, schools also need to consider the long-term viability of the companies.
Denis P. Doyle, the co-founder of SchoolNet, one of the companies whose software is reviewed in the paper, said such independent, third-party evaluations are important to both school people and vendors.
"Superintendents, in particular, I think almost without exception feel their ability to make informed judgments is limited in this market because it's so technical and so complex," he said. "They're scared to death of making the wrong decisions, so they really welcome some kind of independent verification."
Coverage of technology is supported in part by the William and Flora Hewlett Foundation.