

A 'Good News' Story: Curriculum Can Help Students Better Evaluate Online Information

By Stephen Sawchuk | January 29, 2020 | 4 min read

It can be difficult to look at the increasing volume of online misinformation, and its consequences for our civic life, with anything other than despair.

But all is not lost. There's a small beacon in new research concluding that students really can become more critical consumers of online information, a key skill in distinguishing legitimate news and sources of information from slickly produced ones designed to mislead.

The research, recently released by the Stanford History Education Group, is based on an empirical study of its own Civic Online Reasoning curriculum. SHEG has made the curriculum freely available online. (You must register to download it.)

Key elements of the curriculum include learning to use "lateral reading," in the mold of professional fact-checkers, by opening up multiple browser windows to cross-check the source of the information. (Students could investigate, for example, whether something they're asked to evaluate was produced by a legitimate news outlet or by an advocacy group or other source that might raise questions about its accuracy.)

You may remember SHEG's work from last fall: in a previous study, it found that over half of the high schoolers it tested took at face value a video purporting to show ballot-stuffing, and concluded it was strong evidence of voter fraud in U.S. elections. (The video actually showed footage from Russia.)

But in the new study, students taught with the SHEG materials improved in their ability to evaluate online sources critically by about two and a half points on a 14-point scale, compared with just over half a point of growth among those who didn't use the materials.

"I'll just say we are experiencing a time of profound pessimism of our ability to do something about the rapid misinformation and disinformation that envelops us every time we turn on a device and look at the screen," said Sam Wineburg, the founder of SHEG and an education professor at Stanford. "The idea we can move the middle with a fairly minimum investment is a finding we believe we can celebrate."

Filling in Research Holes

One reason the study matters is that media literacy is still, all things considered, a pretty nascent field, and research is catching up, noted Cyndy Scheibe, a professor in the department of psychology at Ithaca College. She also runs a media-literacy group at the college that offers curriculum and training. (Scheibe did not contribute to the SHEG research.)

In general, "I think the robustness of the [media literacy] research and the quality of the research varies a little bit. Some of it is qualitative in its assessment more than quantitative," Scheibe said. "Unlike other things we measure that may be relatively easy to assess, the issue with media literacy is if what you're trying to do is look at how people interpret media messages or analyze media messages, ... there isn't one right answer, typically.

"What you're really looking for is the depth and the probing of people's responses and whether they can give evidence to back up their conclusions."

To that point, outcomes like self-reports or multiple-choice questions don't tend to do a good job of measuring students' media-evaluation skills. And as my colleague Sarah Schwartz reported last year, when the RAND Corporation reviewed the literature, it found few studies of specific teaching approaches or programs.

The SHEG research, on the other hand, tests its own curriculum, which explicitly teaches lateral reading and other skills. The study is based on a sample of about 460 high school juniors or seniors taking a civics or government class in six high schools in an unnamed midwestern district. Researchers randomly assigned half the schools to use the SHEG materials.

Teachers in the treatment schools incorporated six of the lessons into their classes, while those in the comparison high schools received their normal civics and government programming. Students took a pre- and post-test at either end of the semester requiring them to evaluate online sources.

Each pair of schools generally had similar demographics, and the researchers controlled for characteristics that tend to impact measures of learning.

The study also found some preliminary evidence that black students and students who don't speak English at home did not improve as much as their peers. (The civics education community is increasingly concerned about this so-called "civics gap." Groups that have had to fight the hardest to exercise their civic rights in the United States are often the least likely to be taught about those rights and the tools they can use.)

Even though the study was not done by independent researchers, Scheibe praised it as soundly designed and an important addition to the literature. Still, nearly all the research in the field focuses on high school students, even though the consensus is that students need to be taught media-literacy skills far earlier for them to become an automatic habit of mind.

"We have a long way to go, including how do you teach this effectively, at different grade levels, and different curriculum areas," she said. "For us, we see media literacy as literacy. And therefore you can't just start teaching it in high school; you have to start teaching it in preschool, and then every year."

The study is currently being submitted for publication.

Clarification: This post has been updated to clarify that Sam Wineburg is an education professor at Stanford University, but holds a courtesy affiliation with its history department.

Image credit: Syahrir Maulana/Getty

A version of this news article first appeared in the Teaching Now blog.