Every day, Kenneth Grover, the principal of the 175-student Innovations High School in Salt Lake City, wades through printed ads and emails pushing everything from computers to lighted pens.
“If you read the brochures with beautiful and happy kids on them, you’re thinking, ‘Wow, this is what I’ve been looking for,’” Mr. Grover said.
In his experience, though, vendors cite research in their promotions only 20 percent of the time, and, upon investigation, only about half of that research was conducted by an independent party or self-administered under strict guidelines.
School districts are bombarded with marketing materials from companies claiming their products can help change the face of education and raise student achievement. Yet a common complaint is the lack of significant data to back up the slick slogans.
As educators try to balance a desire for evidence with the need for innovation, at a time when standards are rising and technologies are advancing at breakneck speed, many find themselves running rapid-fire pilot projects to determine which products and services best suit their districts.
“If you’re waiting for all the evidence to be fully baked, you’re going to be waiting a long time,” said Kenneth Zeff, the chief strategy and innovation officer for the 95,000-student Fulton County, Ga., school system in the Atlanta metropolitan area.
Independent research is often seen as the gold standard for authenticating effectiveness claims, and the U.S. Department of Education’s What Works Clearinghouse is considered the leading source of scientific evidence of what works in education.
But many small companies simply don’t have the budgets to pay for such research, and the largest, most respected studies cost a lot and can take years to complete. (Mathematica Policy Research, for instance, recently completed a multiyear study for the Knowledge Is Power Program, or KIPP, charter school network, which runs 125 schools in 20 states, serving 41,000 students.)
Even so, small companies with modest resources can demonstrate efficacy, according to the Education Industry Association, a trade group based in Vienna, Va.
The association has begun encouraging members to seek out independent validation to set themselves apart from their competitors during the relatively recent “explosion of entrepreneurship into the K-12 marketplace,” said Steve Pines, the association’s executive director.
“What’s missing,” he said, “is some third-party documentation that can separate the wheat from the chaff.”
A Discriminating Consumer
That absence of independent research about certain products can be particularly difficult for smaller districts with even fewer resources.
The 7,000-student Henry County school system in Collinsville, Va., deals with that situation, in part, by empowering teachers to experiment with free apps, about 10 to 15 per month, which helps the district do its own research before purchasing a product or service. Administrators there also look for positive results from districts of comparable size and demographics before deciding to implement a program, according to Janet Copenhaver, the district’s director of technology and innovation.
Even large education providers struggle with technological advances often outpacing the speed of rigorous research.
“That becomes a real challenge for school leaders, because being able to move quickly and accurately to make decisions in real time is critical,” said Joseph Olchefske, the president of Mosaica Online at New York City-based Mosaica Education Inc. The private company manages 75 schools in the United States and overseas, serving 19,000 students.
Instead, Mr. Olchefske directs his company to look at research for broad direction, then commit to its own continuous review, analysis, and evaluation of student performance. He pointed to that commitment when responding to recent criticism of Mosaica over lower-than-average student-achievement data and higher-than-average disciplinary problems, among other issues.
Mosaica examines its results quarterly and makes midcourse corrections, Mr. Olchefske said, which allows for improvements to be made even without waiting for the golden seal of approval from top-quality research.
“You can be a discriminating consumer, but what you can’t really do is get to a place where there’s a definitive conclusion,” he said. “At some point, you have to take a risk. Research never does away with the need for judgment.”
The 39,000-student Cherokee County school district in Canton, Ga., puts more weight on its own standards than on independent studies.
“It really is about our own research,” said Bobby Blount, the district’s assistant superintendent for accountability, technology, and strategic planning.
That philosophy grew more out of necessity than choice nearly 10 years ago, when the district wanted to begin using new interactive whiteboards, a market that researchers had not yet scrutinized.
“All we had to go on at that point were sales people telling us how great and wonderful their product was,” Mr. Blount recalled.
The Cherokee County system decided to perform its own test, installing whiteboards in about a dozen classrooms in various grades. The whiteboards proved to be effective, improving both teaching and learning.
When it was time to make the larger investment in a districtwide rollout, “we had two vendors come in and pretty much do a dog-and-pony show,” Mr. Blount said, which led district officials at the time to choose one company’s product for elementary classrooms and the other company’s product for middle and high school classrooms.
School districts want to know the products and services they buy will be worth the investment, and solid research can help them make that judgment. But how should administrators evaluate the studies companies cite when trying to land a sale?
Here’s some advice on what to ask, courtesy of Ellen Bialo, the president of New York City-based Interactive Educational Systems Design, which specializes in market and product research and analysis; Rob Foshay, a senior partner with the Foshay Group, a Dallas-based training and education company; and Kenneth Zeff, the chief strategy and innovation officer for the Fulton County, Ga., schools:
» Was the study conducted in a district that will allow school officials to observe the intervention in action? The opportunity to meet with those doing the implementation, as well as firsthand observations, can clarify nuances and success factors that would be lost in a written report.
» Do the players in the study, both students and teachers, represent what your district looks like? If they don’t have the same socioeconomic, cultural, and educational backgrounds, the findings may not be transferable.
» Can the company easily explain the product or service, and the confirming research, to a variety of stakeholders? If the methodology is too obscure, or the program seems counterintuitive, it will be harder to rally the support that is an important predictor of success.
» How meaningful are the measures used for each benefit claimed? For example, before-and-after gains are relevant only if both measurements are done with the same test, or tests designed to be compared. Also, a state-test passing rate or score may not be sensitive enough to measure what the product or service is designed to teach or facilitate.
» The study claims gains in achievement, but compared to what? If there’s no comparison group, you can’t tell if the product or service improved on what a district was already doing. And the comparison is meaningful only if both groups were similar at the start of the study, or if statistical adjustments were made to compensate for differences. (See the sketch after this box.)
» Was the study conducted, written, and released or published according to professional standards for design integrity and research ethics? Ask the company how well the study conforms to guidelines from the American Educational Research Association, the American Evaluation Association, the Software and Information Industry Association, and the What Works Clearinghouse.
» What type of effectiveness research has been done by a third party? For supplemental products, has a white paper been done to tie it to other research? A case study is nice for anecdotal research, but is it also backed by ample data?
SOURCE: Education Week
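To make the comparison-group question above concrete, here is a minimal, purely illustrative Python sketch of the kind of back-of-the-envelope check a district analyst might run on pilot data. All scores and classroom counts here are hypothetical, and a real evaluation would use a proper statistical test and adjust for baseline differences between the groups.

from statistics import mean, stdev

# Hypothetical pre- and post-test scores on the SAME assessment
# (per the advice above) for eight pilot and eight comparison classrooms.
pilot_pre = [61, 58, 70, 65, 59, 72, 63, 68]
pilot_post = [66, 65, 74, 69, 66, 76, 67, 73]
comp_pre = [62, 60, 69, 64, 58, 71, 66, 67]
comp_post = [65, 62, 73, 66, 62, 73, 68, 70]

pilot_gains = [post - pre for pre, post in zip(pilot_pre, pilot_post)]
comp_gains = [post - pre for pre, post in zip(comp_pre, comp_post)]

# The checklist's question: did the pilot improve on what the district
# was already doing, not merely improve in absolute terms?
diff = mean(pilot_gains) - mean(comp_gains)
print(f"Mean gain, pilot classrooms:      {mean(pilot_gains):.1f} points")
print(f"Mean gain, comparison classrooms: {mean(comp_gains):.1f} points")
print(f"Advantage of the pilot:           {diff:.1f} points")

# A crude effect size (difference in mean gains over an averaged
# standard deviation); a real study would also test significance.
pooled_sd = (stdev(pilot_gains) + stdev(comp_gains)) / 2
print(f"Approximate effect size:          {diff / pooled_sd:.2f}")

The point of the sketch is the structure, not the numbers: without the comparison rows, the pilot classrooms’ gains alone would say nothing about whether the product outperformed business as usual.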
Cherokee is now piloting five types of math software this school year, while examining findings about the software from other districts.
“We rely on each other quite a bit,” Mr. Blount said. “That lends more credence than anything else.”
‘Practical Considerations’
When the 28,000-student Colorado Springs School District 11, in Colorado, was considering a couple of years ago whether to buy ST Math software from the MIND Research Institute, a nonprofit education research company based in Irvine, Calif., the institute’s own data-collection process gave administrators a good first impression.
A study conducted in collaboration with the University of California, Irvine, sweetened the pot. Then visits to see the program in action at schools in Anaheim, Calif., and Chicago sealed the deal.
David Sawtelle, the math facilitator for the Colorado Springs district, has learned over time to press vendors who claim little more than that their products are “research-based.”
“What that turns out to mean is that a product is designed in accordance with research around best practices, and then there’s a citation of a study that was done in which that practice was potentially effective,” he said. “We’ve become more discriminating. We ask, ‘If you’re research-based, how is your research validated?’”
The right answer to that and other critical questions, such as how the program was implemented, what kind of professional development is needed, and what environment it needs to succeed, depends on each district’s needs.
“Context is very important,” said Steven M. Ross, the evaluation director for the Center for Research and Reform in Education at Johns Hopkins University’s school of education in Baltimore. “It’s not like picking a prescription out of a box. You have to be much more nuanced in your selection.”
Smaller companies without the budgets to perform independent studies can take heart, experts say: research only goes so far, and for educators, what happens during and after implementation seems to count the most.
Isaak Aronson, the president and chief executive officer of SmartStart Education, an education and training provider based in New Haven, Conn., said his company’s reliance on case studies and other self-generated research hasn’t kept it from impressing clients with in-house statistics.
“Perhaps we sacrifice a bit of scientific rigor,” Mr. Aronson said, “but I’m always cognizant of practical considerations versus valid and reliable research.”
Other companies are trying to establish mutually beneficial research partnerships with schools.
Zane Education, a New Zealand company that provides subtitled videos to schools and has a U.S. office in Thousand Oaks, Calif., has started approaching schools about collaborating.
“We’re going to them and saying, ‘Hey, would you like to work with us on this research? We’ll provide you with those results at no cost,’” said the company’s director, Nicholas Tee.
In the end, even companies that can afford top-level research sometimes don’t measure up as well as expected.
Mr. Grover, with Innovations High School in Utah, felt a bit frustrated in 2012 by what he describes as “a big company that must do a billion dollars of business a year.”
Though the company assured him it would provide a seamless transition to a digital curriculum, he said, there were obvious problems a month after implementation. The learning management system needed to run the curriculum wasn’t finished as promised, and Mr. Grover accused the company’s representative of failing to deliver.
“They made a strong effort to fix it with patches, and it’s working now,” Mr. Grover said. “But the point is, had I not asked the questions that I did, how much more would they have hooked me?”