A whopping 96 percent of the apps schools require or recommend aren't safe for children, primarily because they share information with third parties or contain ads, concludes a K-12 edtech report by Internet Safety Labs.
Apps that allow tech providers, marketers, and advertisers access to personal information about children and their families can, at minimum, be used to create highly targeted ads aimed at kids, the report says.
These apps are "monetizing your data, selling it to data brokers that are building these ever-growing portfolios on you," explained Lisa LeVasseur, the executive director of Internet Safety Labs. She is a former software engineer and an author of the report.
Worse, when personal information is abused, it can expose kids to predators, cause emotional trauma, and, if location information is shared, perhaps even put them in physical danger, the report warns.
To get their arms around the sheer number of apps used in schools, researchers examined a random sample of 13 schools in each state and the District of Columbia, for a total of 663 schools serving about 456,000 students. Those schools collectively used 1,722 apps.
Apps that get the 'Do Not Use' label
The researchers labeled an app 'Do Not Use' if it contained any advertising, had deeply embedded software registered to a data broker, or shared information, whether in ways that are difficult to detect or more explicitly, with one of a number of big tech companies that profit from advertising and internet sales, including Amazon, Facebook, and Twitter.
Seventy-eight percent of the apps studied fell into that category. Another 18 percent were considered "high risk" because of similar, though slightly less pronounced, privacy and information-sharing problems; LeVasseur would not recommend those for schools either.
While those criteria may seem to set a high bar, LeVasseur said it is an appropriate one where children are concerned.
LeVasseur said people often joke that these days, because of technology, "we all have no privacy. Haha, isn't this funny? It's really not funny. It's really gross. It's really harmful. And, you know, it's really quite damaging."
What's more, custom-built apps that districts use to communicate with families often have even more potential privacy red flags than off-the-shelf apps. And some of the educational apps that districts recommend students use weren't built with kids and their privacy needs in mind, LeVasseur said.
In fact, more than a quarter of the apps that districts recommend, 28 percent, weren't developed primarily with children in mind.
Another eyebrow-raising finding: More than two-thirds of the apps the organization studied send data to Google.
LeVasseur's advice to school districts? "Fight the urge" to take on dozens of apps. "Less is more," she said. "You really have to scrutinize this stuff. And you have to vendor manage. You have to get in there and demand a lot more information" from companies selling apps.
And echoing educators, she said school districts also need to build capacity for vetting technology used in schools, considering just how much is out there and how difficult it can be to figure out what privacy protections a particular platform has.
69ers "don't have the resources that they need," LeVasseur said. "If this is the scale of the thing, they need more support."