Privacy & Security

69传媒 Are Deploying Massive Digital Surveillance Systems. The Results Are Alarming

By Benjamin Herold | May 30, 2019 | 17 min read

Last December, early on a Sunday morning, Amanda Lafrenais tweeted about her cats.

"I would die for you," the 31-year-old comic book artist from Clute, Texas, wrote.

To human eyes, the post seems innocuous.

But in an age of heightened fear about mass school shootings, it tripped invisible alarms.

The local Brazosport Independent School District had recently hired a company called Social Sentinel to monitor public posts from all users, including adults, on Facebook, Twitter, and other social media platforms. The company's algorithms flagged Lafrenais's tweet as a potential threat. Automated alerts were sent to the district's superintendent, chief of police, director of student services, and director of guidance. All told, nearly 140 such alerts were delivered to Brazosport officials during the first eight months of this school year, according to documents obtained by Education Week.

Among the other "threats" flagged by Social Sentinel:

Tweets about the movie "Shooter," the "shooting clinic" put on by the Stephen F. Austin State University women's basketball team, and someone apparently pleased their credit score was "shooting up."

A common Facebook quiz, posted by the manager of a local vape shop.

A tweet from the executive director of a libertarian think tank, who wrote that a Democratic U.S. senator "endorses murder" because of her support for abortion rights.

And a post by one of the Brazosport district's own elementary schools, alerting parents that it would be conducting a lockdown drill that morning.

"Please note that it is only a drill," the school's post read. "Thank you for your understanding. We will post in the comment section when the drill is over."
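
The article does not describe how Social Sentinel's algorithms actually work. But even a crude, hypothetical keyword filter, a minimal sketch rather than the company's real method, shows how benign posts like the examples above could trip automated alarms. The keyword list and posts below are paraphrased illustrations, not actual system output.

```python
# Illustrative sketch only: assumes naive keyword matching, a hypothetical
# stand-in for whatever models social media monitoring vendors actually use.
THREAT_KEYWORDS = {"shoot", "shooting", "shooter", "kill", "die", "lockdown"}  # invented list

def flag_post(text: str) -> list[str]:
    """Return any 'threat' keywords found in a public post."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return sorted(words & THREAT_KEYWORDS)

# Paraphrased from the benign examples reported above.
posts = [
    "I would die for you",                                  # a tweet about cats
    "Great shooting clinic with the basketball team",       # a practice session
    "My credit score is shooting up",                       # personal finance
    "We will be conducting a lockdown drill this morning",  # the school's own post
]

for post in posts:
    hits = flag_post(post)
    if hits:
        print(f"ALERT ({', '.join(hits)}): {post}")
```

Every one of those posts triggers an alert, which is the basic problem administrators describe throughout this story: the signal is buried in false positives.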

Such is the new reality for America鈥檚 schools, which are hastily erecting a massive digital surveillance infrastructure, often with little regard for either its effectiveness or its impact on civil liberties.

Social media monitoring companies track the posts of everyone in the areas surrounding schools, including adults. Other companies scan the private digital content of millions of students using district-issued computers and accounts. Those services are complemented with tip-reporting apps, facial-recognition software, and other new technology systems.

Florida offers a glimpse of where it all may head: Lawmakers there are pushing for a state database that would combine individuals鈥 educational, criminal justice, and social-service records with their social media data, then share it all with law enforcement.

Across the country, the results of such efforts are already far-reaching.

The new technologies have yielded just a few anecdotal reports of thwarted school violence, the details of which are often difficult to pin down. But they've also shone a huge new spotlight on the problems of suicide and self-harm among the nation's children. And they've created a vast new legal and ethical gray area, which harried school administrators are mostly left to navigate on their own.

"It's similar to post-9/11," said Rachel Levinson-Waldman, a lawyer with the liberty and national security program at the Brennan Center for Justice at the New York University law school. "There is an understandable instinct to do whatever you can to stop the next horrible thing from happening. But the solution doesn't solve the problem, and it creates new issues of its own."

Monitoring 69传媒' Online Lives

Why the growing push to monitor students' online lives?

Consider the trail of digital footprints left by Nikolas Cruz, the disturbed teenager accused of killing 17 people and injuring 17 others at Marjory Stoneman Douglas High School in Parkland, Fla., in February 2018.

Before the shooting rampage, Cruz took to Instagram to post pictures of weapons and write that "I wanna f---ing kill people." He searched the internet using phrases like "is killing people easy" and "good songs to play while killing people." Cruz used his phone to record videos of himself planning the massacre. And he allegedly used school computers to look up instructions on how to build a nail bomb.

"If you're responsible for the safety and security of a school, you have to pay attention to the places where harm is being foreshadowed," said Gary Margolis, the CEO of Social Sentinel, which claims "thousands" of K-12 schools in 30 states are using its service.

See Also

A broad school safety law passed in the wake of last year's mass shooting at Marjory Stoneman Douglas High School in Parkland has had Florida officials working to create a database that would share vast amounts of sensitive data in an effort to prevent school shootings. The project has been delayed by legal questions and bureaucratic snafus.
Mike Stocker/South Florida Sun-Sentinel via AP

Margolis said it's unfair to focus on the false positives that may slip through a company's monitoring system. Any harms pale in comparison to the benefits of what is caught. He pointed to a recent incident in which Social Sentinel flagged a college student who threatened on Twitter to shoot his professor for scheduling an early morning exam. (The student, who said he intended no harm, was arrested.)

Margolis also noted that school shootings remain statistically rare, emphasizing instead Social Sentinel's work around more prevalent issues of suicide and self-harm.

But it's high-profile mass tragedies such as Columbine, Sandy Hook, and Parkland that are driving the national conversation and a lot of decision making around school safety and security. And technology companies are clearly taking note.

After the Columbine attack 20 years ago, for example, there was a dramatic increase in the percentage of schools using security cameras to monitor their buildings, federal data show.

More recently, the trend has shifted towards vacuuming up digital data and scanning them for possible warning signs.

The embrace of such tools by parents and K-12 administrators alike has led to a fresh boom in the school safety technology market, with a handful of established companies and a growing crop of startups now competing to offer ever-more comprehensive surveillance capabilities.

In April, for example, a company called Securly was at the annual conference of the Consortium for School Networking, pitching K-12 school technology officials on its rapidly expanding suite of services.

How Much Are 69传媒 Spending on Surveillance?

Part of the appeal of the new digital surveillance technologies deployed by schools is their relatively low sticker price.

In Michigan, for example, the 17,000-student Grand Rapids district this school year is paying Gaggle a little less than $71,000 to monitor its network traffic and alert staff members to troubling content.

Texas's 12,300-student Brazosport Independent School District, meanwhile, is paying $18,500 per year to Social Sentinel for its social media monitoring services. That cost of about $1.50 per student appears to be broadly typical of what the company charges.
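
The per-student figures follow directly from the contract amounts and enrollments reported above; a quick back-of-the-envelope calculation, sketched below using only those reported numbers, confirms them.

```python
# Quick arithmetic on the approximate contract figures reported above.
contracts = {
    "Grand Rapids (Gaggle)":        (71_000, 17_000),  # ~annual fee, enrollment
    "Brazosport (Social Sentinel)": (18_500, 12_300),
}

for name, (annual_cost, students) in contracts.items():
    print(f"{name}: ${annual_cost / students:.2f} per student per year")

# Grand Rapids (Gaggle): $4.18 per student per year
# Brazosport (Social Sentinel): $1.50 per student per year
```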

The low fees belie the value of the service Social Sentinel offers, said CEO Gary Margolis.

When Securly launched in 2013, its lone offering was a web filter to block students' access to obscene and harmful content. The federal Children's Internet Protection Act requires most schools to use such tools.

A year later, though, Securly also began offering "sentiment analysis" of students' social media posts, looking for signs they might be victims of cyberbullying or self-harm.

In 2016, the company expanded that analysis to students' school email accounts, monitoring all messages sent over district networks. It also created an "emotionally intelligent" app that sends parents weekly reports and automated push notifications detailing their children's internet searches and browsing histories, according to a presentation delivered at the conference.

Then, in 2017, Securly also began monitoring all that information for potential signs of violence and attacks. It added a tip line, plus a layer of 24-hour human review of flagged threats that schools can opt into.

"Kids cry out for help at all times," said Mike Jolley, Securly's director of K-12 safety. "You don't ever shut off caring for your students."

That kind of language is now pervasive throughout the industry, said Amelia Vance, the director of education privacy at the Future of Privacy Forum, a Washington think tank.

Vance said it's meant to deliver a clear message to schools:

"You're safer if you have us watching everything."

'Privacy Went Out the Window'

In her 2019 book, "The Age of Surveillance Capitalism," scholar and activist Shoshana Zuboff described the new engine driving America's economy: the ability to translate people's online behavior into digital data that can be used to make predictions about what they'll do next.

That model allowed companies like Google and Facebook to quickly become multibillion-dollar behemoths, before the broader societal implications of their business models could be fully considered.

Something similar is now happening in the K-12 security market.

A Bloomington, Ill.-based company called Gaggle offers a window into what the trend looks like in practice.

Every day, Gaggle monitors the digital content created by nearly 5 million U.S. K-12 students. That includes all their files, messages, and class assignments created and stored using school-issued devices and accounts.

The company鈥檚 machine-learning algorithms automatically scan all that information, looking for keywords and other clues that might indicate something bad is about to happen. Human employees at Gaggle review the most serious alerts before deciding whether to notify school district officials responsible for some combination of safety, technology, and student services. Typically, those administrators then decide on a case-by-case basis whether to inform principals or other building-level staff members.
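
That tiered flow, automated scan, human review of the most serious alerts, then notification of district officials, can be sketched in a few lines of code. The keyword rules, severity labels, and routing below are invented for illustration; Gaggle's actual models and criteria are not described in this article.

```python
# A minimal, hypothetical sketch of the tiered review flow described above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Alert:
    student_id: str
    source: str     # e.g. school email, stored file, class assignment
    excerpt: str
    severity: str   # "minor", "serious", or "imminent" (labels invented here)

def auto_scan(student_id: str, source: str, text: str) -> Optional[Alert]:
    """Automated first pass; crude keyword rules stand in for the vendor's models."""
    t = text.lower()
    if any(k in t for k in ("end my life", "hurt myself", "kill")):
        return Alert(student_id, source, text[:80], "imminent")
    if any(k in t for k in ("fight", "abused me", "raped")):
        return Alert(student_id, source, text[:80], "serious")
    if any(k in t for k in ("shit", "bastard", "pussy")):
        return Alert(student_id, source, text[:80], "minor")
    return None

def route(alert: Alert) -> str:
    """Serious alerts get human review and go to district officials;
    minor ones land in summary reports for administrators to sift later."""
    if alert.severity in ("serious", "imminent"):
        return f"human review -> notify district officials: {alert.excerpt!r}"
    return f"logged for summary dashboard: {alert.excerpt!r}"

# Example drawn from an incident type described later in this article.
alert = auto_scan("student-123", "stored file", "Odyssey Essay ... bastard ...")
if alert:
    print(route(alert))  # logged for summary dashboard: 'Odyssey Essay ... bastard ...'
```

Even in this toy version, the design choice that matters is visible: everything hinges on which words make the lists and where the severity thresholds sit, decisions the vendors make and districts largely inherit.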

While schools are typically quiet about their monitoring of public social media posts, they generally disclose to students and parents when digital content created on district-issued devices and accounts will be monitored. Such surveillance is typically done in accordance with schools' responsible-use policies, which students and parents must agree to in order to use districts' devices, networks, and accounts.

Hypothetically, students and families can opt out of using that technology. But doing so would make participating in the educational life of most schools exceedingly difficult.

It鈥檚 just the way the world works now, said Gaggle CEO Jeff Patterson.

"Privacy went out the window in the last five years," he said. "We're a part of that. For the good of society, for protecting kids."


Earlier this year, the company released a report detailing its results between June and December of 2018. The report said Gaggle had successfully flagged 5,100 incidents that "required immediate attention for imminent and serious issues." Of those, 577 reportedly involved imminent threats of someone planning an attack or violence against others.

Documents obtained from Gaggle's K-12 clients, along with interviews of administrators in those districts, illuminate the messy realities behind those numbers.

Take the 17,000-student Grand Rapids school district in Michigan.

A public relations consultant for Gaggle referred Education Week to the district, suggesting the company had helped prevent planned violence against a school there.

Indeed, last December, local news outlets were abuzz with reports of a school shooting threat involving a 15-year-old student.

In an interview, Larry Johnson, the Grand Rapids district's director of safety, described the incident. Threatening messages were initially posted on Snapchat, he said. The student involved then used the district's network to send emails about those posts to friends. Gaggle flagged the emails, leading the company to alert district officials, who in turn called the Grand Rapids police.

The student was arrested before the next school day. The teen was later expelled.

But when asked if there had been a credible plan to attack the school, Johnson demurred.

The student "took it as a joke," he said. "We have a criminal justice system in place that gets the opportunity to determine what is serious."

Now, put the incident in context.

The shooting threat/joke was just one of nearly 3,000 incidents in Grand Rapids schools flagged by Gaggle between August and February of this school year, according to a dashboard summary provided by the district.

More than 2,500 of those were minor violations, mostly involving profanity.

And files obtained from the district via a public-records request offer a granular look at the details behind hundreds of incidents caught by Gaggle's system:

  • More than three dozen Grand Rapids students were flagged for potential suicide or self-harm, usually for storing files or sending messages including words such as "hate myself," "hurt myself," and "end my life."
  • More than two dozen students were flagged for storing or sending offensive or pornographic images or videos.
  • 69传媒 were flagged for possible violence towards others for storing files containing the words "abused me" and "raped."

And among those flagged for possible profanity & hate speech:

  • At least a dozen students who stored or sent files containing the word "gay."
  • A student who stored a file named "biology project" with the word "shit" in it.
  • A student who stored a file named "Poetry Portfolio" with the word "pussy" in it.
  • A student who stored a file named "Odyssey Essay" with the word "bastard" in it.

How does a district balance the benefits, costs, and burdens of reviewing and following up on such a torrent of alerts, especially when they range from alarming to ambiguous to ridiculous?

Johnson, the Grand Rapids safety director, acknowledged the challenge.

The system can be a real time-suck. And he鈥檚 concerned about students鈥 rights.

But any such downsides pale in comparison to getting thanks from parents grateful that the technology alerted them that their child was contemplating suicide, he said.

"I think it's a necessary evil," Johnson said.

'Big Brother Is Watching'

That's exactly the mindset that Chad Marlow wants to challenge.

"Does it make sense to say we are going to hurt millions of students in an effort to prevent one child from being harmed?" said Marlow, the senior advocacy and policy counsel for the American Civil Liberties Union.

An ACLU report outlines what the organization considers to be the real threats related to school surveillance: chilling students' intellectual freedom and free-speech rights. Undermining their reasonable expectations of privacy. Traumatizing children with false accusations. And systematically desensitizing a generation of kids to pervasive surveillance.

The experiences of other K-12 Gaggle clients help illuminate such concerns.

Evergreen Public 69传媒 in Washington state, for example, started using the company's service this school year. Between September and mid-March, the system flagged more than 9,000 incidents in the 26,000-student district.

The overwhelming majority, 84 percent, were for minor violations, such as profanity.

A handful helped the district prevent fights and get help for kids thinking of hurting themselves, said Shane Gardner, the district's director of safety and security. None could reasonably be considered to have prevented violence against a school.

"We haven't ever unraveled an incident where it was, 'Boy, good thing we caught this kid, because he had a gun in his guitar case,'" Gardner said.

Dozens of other alerts, however, have left Evergreen officials scrambling to figure out on the fly how to best respond to a wide range of situations they hadn鈥檛 anticipated.

What are the implications, for example, when a teen is flagged multiple times for "inappropriate" language in a college admissions essay that describes his difficult upbringing? What about when students are flagged for offensive language in plays or journal entries they've written as class assignments?

Evergreen eventually decided to turn off Gaggle's filters for profanity and hate speech, Gardner said.

Then there are the alerts generated by vague messages between friends. How is a school district supposed to respond when one student writes to another, "Tomorrow it will all be over?"

In that case, Gardner said, the Evergreen district sent local police to a family's home in the middle of the night to conduct a welfare check. It ended up being a "breakup situation" that wasn't serious.

And perhaps most troubling, what are the legal and ethical considerations for schools when students plug their personal devices into district-issued computers, leading Gaggle's filters to automatically suck up and scan their private photos and videos?

That's happened numerous times in Evergreen schools, Gardner said.

One student was flagged for having photos of himself taking bong hits. Other students were flagged for personal photos showing fights and nude images that could be considered child pornography. Evergreen school administrators responded by notifying parents, police, and the National Center for Missing and Exploited Children.

Marlow of the ACLU described such situations as outrageous.

There's a constitutional amendment barring the government from policing speech, he noted. There's a reason it comes first in the Bill of Rights.

What about the students in a culturally conservative community who are questioning their sexuality, he asked, or the Trump supporters in a liberal community who are exploring their political beliefs? Is their freedom to research new identities and ideas compromised when principals and parents are alerted to everything they type and search?

In addition, Marlow asked, how do schools and companies know they're not making things worse? If students know that administrators and parents are going to be alerted when they discuss self-harm or suicide with friends, for example, might that actually deter them from seeking help?

And schools should never monitor private digital content, Marlow said. Period.

"It should not be incumbent on students and families to figure out when they're being placed at risk and adjust for it," he said. "There are automatic adverse consequences when there is state surveillance."

That's not exactly the message Evergreen schools are delivering to their students and community, though.

"Every time we talk to kids, we remind them that Big Brother is watching," said Gardner, the district safety director.

Accepting Constant Surveillance?

Many students seem well aware of that new reality.

Sometimes students with a concern simply email themselves, with the expectation that algorithms will flag the message for adults, said Jessica Mays, an instructional technology specialist for Texas's Temple Independent School District, another Gaggle client.

One student "opened a Google Doc, wrote down concerns about a boy in class acting strange, then typed every bad word they could think of," Mays said. At the end of the note, the student apologized for the foul language, but wrote that they wanted to make sure the message tripped alarms.

For proponents, it's evidence that students appreciate having new ways of seeking help.

But for Levinson-Waldman, the lawyer for the Brennan Center for Justice, it raises a bigger question.


"Are we training children from a young age to accept constant surveillance?" she asked.

So far, at least, that's a conversation that K-12 officials don't seem to be having.

Determined not to become the next Columbine or Parkland or Sandy Hook, schools are eagerly searching out new technologies. Companies feed those fears, then respond by offering new services. The systems are then deployed with minimal forethought or oversight.

One more example.

Earlier this year, Social Sentinel flagged another social media post for the Brazosport school district in Texas. A recent graduate of the district tweeted a photo of herself pointing a shotgun at the sky, along with the message "I shot my first flyer, and it was my first time shooting a gun," followed by smiley-face and clapping-hands emojis.

Brazosport school officials, who did not respond to multiple requests for comment, do not appear to have taken any action in response to the post.

Margolis, the Social Sentinel CEO, said he failed to see the possible harm.

"Why would it have a chilling effect if the superintendent of the school might see something that slips through the system about someone who went hunting?" he asked. "There's no threat."

Last month, however, a lawsuit was filed over a similar situation that turned out differently. A New Jersey school district suspended two high school students for using Snapchat to share pictures of legally owned guns used during a weekend at a private shooting range.

The case is still making its way through the courts.

Will these powerful new surveillance tools become entrenched in schools, before any kind of carefully considered consensus can be reached?

That's where recent history and current trends seem to point.

Even Patterson, the Gaggle CEO, acknowledged feeling conflicted about the dynamic.

He prays to never see another school shooting. He wants to believe Gaggle will benefit students whose cries for help are too often ignored. He hopes that can be done in ways that still allow kids to make normal childhood mistakes, without suffering life-altering consequences.

But the demands of the market could work against those wishes.

Five years ago, Patterson said, Gaggle would never have considered adding a social-media monitoring service. It was too invasive.

Now, he sees it as inevitable.

"I know I would have rebelled against some of my own products," Patterson said. "But the world has changed."

Research assistance provided by Librarian Maya Riser-Kositsky.
A version of this article appeared in the June 05, 2019 edition of Education Week as 69传媒 Deploy Massive Digital Surveillance
