As online learning has entered the mainstream—with roughly a third of the nation’s high school students enrolled in at least one online course, according to a report released in June 2011—more states have created policies, procedures, and even organizations for evaluating the quality of such courses and other online content available to students.
But instituting those quality-control measures is not without challenges.
For starters, the creation of such systems often requires legislation, which must be crafted by lawmakers who aren’t always familiar with online learning’s nuances and may not be eager to launch new programs in tight budget times.
Further, the peculiarities of individual states’ education policies—especially on standards and materials approval—mean the kind of oversight that might work effectively in one state might not function well in another.
“There’s no state we could look at and say they’ve got it all figured out,” said John Watson, the founder of the Evergreen Education Group, the Durango, Colo.-based research firm that puts out the annual reports measuring growth and changes in K-12 online learning. Taking a functional system in one state and applying it to the policy realities of another would be “fairly difficult,” Mr. Watson added.
Even the California Learning Resource Network, or CLRN, and the Texas Virtual School Network, or TXVSN, which partnered to write a second edition of the iNACOL National Standards for Quality Online Courses, released last October, have considerably different authority, responsibilities, and procedures. (The Vienna, Va.-based International Association for K-12 Online Learning, or iNACOL, is a leading advocacy group for proponents of online and blended learning for students.)
The Houston-based TXVSN, for example, must approve all courses offered to regular public or charter school students in Texas, whether the classes are a supplement to brick-and-mortar coursework or part of a full-time virtual program.
By contrast, the Modesto-based CLRN acts in a more advisory role to districts and charter schools seeking to ensure that they purchase high-quality content, said CLRN Director Brian Bridges. The organization can’t deny districts or charters the right to use their online courses of choice. However, CLRN was set this month to announce a new partnership with the University of California system in which only CLRN-approved online courses will be accepted as valid when state universities evaluate high school students’ transcripts.
Further, school districts and state universities are the main creators of the content reviewed by the TXVSN, while CLRN can review content created by districts, universities, or private companies. And unlike CLRN, the Texas group takes ownership of the courses it reviews and acts as the vendor for interested districts or charter schools.
Even the organizations’ roots are different; legislators in 2007 created the TXVSN explicitly for reviewing and approving online-course content. By contrast, the California group evolved into an organization focused primarily on reviewing online courses from its earlier role, which it played for more than a decade, of analyzing and reviewing a wider variety of supplemental digital-content items.
“Along the way, we saw the transition from supplemental resources that were relatively small, meeting maybe 15 or 20 standards, to monsters—we called them tanks—meeting hundreds of standards,” Mr. Bridges said. “That was the big ‘aha!’ moment for us where we saw that’s what is happening in this industry.”
Yet despite their differences, the two groups were among the first such organizations with the foresight to see the coming influence of online courses, and they became effective partners in leading the creation of the second version of iNACOL’s national standards for such courses. The 46-page document, issued in October of last year, includes 52 standards on which to grade courses across the categories of content, instructional design, student assessment, technology, and course evaluation and support.
Those standards have served as one of two templates in CLRN’s review process for online courses. In evaluating courses, the reviewers, teams of three outside teachers hired a Saturday at a time, now also grade courses against any applicable Common Core State Standards. Reviews of English and math courses began last October at a pace of roughly four a month total, with similar reviews of science, social studies, and foreign-language online courses having begun this spring, Mr. Bridges said.
Several other states have followed California and Texas into online-course review based on the iNACOL standards or some variation on them. Ohio enacted legislation in 2010 to establish its own distance-learning clearinghouse that began with the review of approved Advanced Placement courses for all students in Ohio public schools. Georgia and Idaho launched similar clearinghouses this year. Washington state also has for several years had an agency similar in structure to the TXVSN, Mr. Bridges said.
Critical Questions
Still, about 30 percent of CLRN’s website traffic comes from out-of-state users seeking the organization’s public reviews, a sign, Mr. Bridges said, that states following the examples of CLRN and the TXVSN remain in the minority. And iNACOL’s chief operating officer, Matthew Wicks, said even states that have taken those steps are mostly measuring inputs, or dimensions inherent in a course’s composition, rather than outcomes, or measures of a course’s effectiveness.
“We feel like the quality standards produced in the past are a good starting point, … but they themselves don’t guarantee quality,” said Mr. Wicks, who added that there is growing pressure to focus on outcomes. “This has become a hotter topic, in part because of increased activity in online learning, and in part because of the reports that have been critical of full-time online learning programs.”
Some critics of virtual education have suggested that for-profit providers have worked to stop legislators from enacting policies that would more accurately measure the quality of online-course outcomes, because those measures would reflect poorly on providers trying to deliver courses at the lowest possible cost to the company. Gene V. Glass, a senior researcher for the National Education Policy Center, or NEPC, in Boulder, Colo., suggests most outcome-related accountability measures for online courses are vulnerable to abuse and manipulation, such as the use of computer login time as a measurement in states with seat-time requirements, or the administration of unproctored and unsupervised exams. For-profit providers oppose measures that are less vulnerable to abuse, he said, because those measures typically require more human involvement and cost more. Mr. Glass is one of several researchers associated with the NEPC who have issued recent reports criticizing for-profit online learning.
“All of those moves have costs associated with them, and the cost of providing courses is [otherwise] so low,” he said. “You can see why the providers don’t want any part of that stuff. They’re going to fight it tooth and nail.”

But public and media pressure is prompting some states to take a deeper look at how to measure the outcomes of online learning effectively. While Arizona Gov. Jan Brewer, a Republican, in May vetoed a bill that would have included several new accountability measures for online courses, including requiring live proctors for exams, Colorado Gov. John Hickenlooper, a Democrat, signed a measure the same month commissioning a study to explore how his state’s online learning courses can be more effective.
Part of the study will focus on what measures could be implemented to more accurately measure the effectiveness of those online courses and programs, as well as the progress of a population that often includes a high number of mobile students and students who enter an online course well behind grade level.