
Privacy & Security

FTC Complaint Focuses on Data Privacy of Internet-Connected Toys

By Michelle R. Davis — December 06, 2016

An international coalition of organizations concerned about data privacy says internet-connected toys are recording and storing the audio of young children without parents’ knowledge or consent.

The groups on Tuesday filed a complaint with the Federal Trade Commission alleging that an interactive doll named Cayla and a robot named I-Que are recording the voices of the children who interact with them. Genesis Toys, the parent company that makes the products, is sharing those audio files with a third-party company unbeknownst to parents, in violation of the federal Children’s Online Privacy Protection Act, or COPPA, as well as FTC rules on unfair and deceptive business practices, according to the complaint.

In addition, the complaint claims that subtle advertising messages from the Walt Disney Company are being promoted through the popular My Friend Cayla doll.

Also on Tuesday, privacy and consumer advocates filed similar complaints with the European Union’s data-protection agency and individual grievances in seven European countries. Research and initial concerns about the toys, My Friend Cayla and I-Que Intelligent Robot, were sparked by a Norwegian consumer group.

“A child who plays with a toy as if it is a friend might say all kinds of personal things that are then being recorded and distributed to third parties that do who knows what with it,” said Claire T. Gartland, the director of the Electronic Privacy Information Center’s Consumer Privacy Project, one of the organizations filing the complaint, in an interview.

Genesis Toys and Disney have not yet responded to requests for comment about the complaint regarding My Friend Cayla and I-Que Intelligent Robot.

In the FTC filing, the American groups allege that the interactive, electronic dolls are using personal information without the consent of children’s parents. COPPA protects the privacy of children under 13 by requiring that companies post their privacy policies, notify parents of their information-collection practices, and get verifiable parental consent, among other steps, before collecting personal data from children.

The American groups raising the issues, including EPIC, the Center for Digital Democracy, and the Campaign for a Commercial-Free Childhood, describe the process that Genesis Toys and the third-party company receiving the audio files, Burlington, Mass.-based Nuance Communications, use as deceptive, saying it fails to seek parental consent.

In a post on its web site, a Nuance spokesman said the company takes data privacy seriously and had not received an inquiry from the FTC. The company emphasized that it does not sell voice data for marketing or advertising purposes or share that voice data with any of the company’s other customers.

The complaint brought to the FTC is not surprising, given that tech companies’ ability to create more interactive, sophisticated toys that rely on web connectivity is constantly growing, said N. Rao Machiraju, the co-director of the Center for Applied Human Reasoning and the Internet of Things at the University of Southern California.

As toys gain the capability to capture and recognize children’s voices, the risk grows that the information can be used for marketing and other purposes, Machiraju said. That’s why terms of use and privacy protections need to be written for adult gatekeepers, particularly parents, in language they can understand, he added.

When the 18-inch Cayla, for example, is purchased, a parent downloads a companion app onto a smartphone or tablet that allows the doll to activate a Bluetooth microphone and speaker and connect to the internet, according to the FTC complaint. The app allows the doll to speak and interact with children in a way that is similar to Siri, the “intelligent assistant” on an iPhone. The doll can answer questions by searching the internet, play games, or read to children.
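
For readers unfamiliar with how such toys are typically put together, the sketch below illustrates the general data flow the complaint describes: the companion app bridges the doll’s Bluetooth microphone to a cloud speech service and routes an answer back through the doll’s speaker. This is a minimal, hypothetical Python sketch; the function names and the canned transcript are placeholders, not Genesis’ or Nuance’s actual code or APIs.

    # Hypothetical sketch of the data flow described in the complaint:
    # doll microphone -> companion app -> cloud speech service -> doll speaker.
    # All names here are illustrative placeholders, not real APIs.

    def record_from_doll_microphone() -> bytes:
        # Stand-in for audio captured over the doll's Bluetooth microphone.
        return b"what is the capital of France"

    def send_audio_to_cloud_speech_service(audio: bytes) -> str:
        # In a real toy, this step uploads the child's raw voice recording to a
        # remote speech-to-text service, which is the step privacy groups object to:
        # the recording leaves the home and may be retained by a third party.
        # Here it returns a canned transcript so the sketch runs offline.
        return audio.decode("utf-8")

    def answer_by_searching_internet(transcript: str) -> str:
        # Stand-in for the web lookup the doll uses to answer questions.
        return "Here is what I found about: " + transcript

    def play_through_doll_speaker(text: str) -> None:
        # Stand-in for text-to-speech played back over the doll's Bluetooth speaker.
        print("Doll says:", text)

    if __name__ == "__main__":
        audio = record_from_doll_microphone()
        transcript = send_audio_to_cloud_speech_service(audio)  # audio leaves the device here
        play_through_doll_speaker(answer_by_searching_internet(transcript))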

When parents download the app, the company seeks permission to access the hardware, storage, microphone, and WiFi connections on a user’s device, the organizations say. These terms of service arrive as a pop-up on the screen and are more than 3,800 words long, written in small font. Genesis advises consumers to “print a copy of the terms for future reference,” the FTC letter says.

While the terms of service state that the company may use speech data to improve its toys and products, they also refer users to the Nuance privacy policy page, according to the complaint.

However, the FTC filing notes that the doll records children’s voices and sends audio files to Nuance, a third-party software provider for Genesis Toys.

Nuance’s privacy policy says information collected can be used to “develop, tune, enhance, and improve our products and services, and for advertising and marketing,” according to the FTC complaint. As part of its business, Nuance develops speech recognition software and voice biometric solutions for industries ranging from healthcare to law enforcement. The company says it has over 30 million voiceprints in its biometric system, according to the FTC filing.

While it’s unclear how Nuance may be using the audio collected by the toys, that’s part of the problem, said Josh Golin, the executive director of the Campaign for a Commercial-Free Childhood. The terms of service and privacy policy are unclear at best, he said.

“We promote creative play and these scripted toys are the antithesis of that,” he said in an interview. “Just because you can hook something up to the internet doesn’t mean you should.”

The FTC complaint also claims the companies do not secure adequate parental consent, as required by COPPA. To accept the terms of service, the Cayla app asks users to solve a simple math problem, 11 plus 16, as a way to verify that an adult, not a child, is providing that consent, Gartland said.
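
To make the objection concrete, the sketch below shows what that kind of arithmetic gate amounts to (11 plus 16 is 27) and why critics consider it weak: it confirms only that whoever is holding the device can answer 27, not that a parent has seen or agreed to the data practices. This is a hypothetical illustration, not the app’s actual implementation.

    # Hypothetical sketch of the arithmetic "age gate" described in the complaint.
    # Not the app's actual code; it only illustrates why such a check is weak.

    def arithmetic_consent_gate(answer: str) -> bool:
        # COPPA's "verifiable parental consent" is meant to confirm that a parent,
        # not a child, is agreeing to data collection. A one-question math check
        # verifies neither identity nor age: any child who can add (or guess,
        # or ask an older sibling) passes it.
        return answer.strip() == str(11 + 16)  # i.e., "27"

    if __name__ == "__main__":
        print(arithmetic_consent_gate("27"))  # True: consent recorded, parent or not
        print(arithmetic_consent_gate("25"))  # False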

“Companies collecting personal data need to make sure that parents receive direct notice that completely and clearly describes the data practices of the company,” she said. “In this case, it’s hard to find the terms of service, the privacy policy is incomplete and vaguely worded. There is no direct notice.”

Jennifer Babich of Teaneck, N.J., gave her then-four-year-old daughter the doll last year. In an interview, she said her daughter found the doll entertaining and liked reading with her and playing games. However, more often now, her daughter plays with the doll just as a doll, not as an internet-connected toy. Babich is not involved in the FTC complaint.

Babich said she remembers accepting some sort of waiver when she first started using the toy, but knew nothing about the doll recording audio and sending off the files. “It’s unnerving,” she said. “I was having faith in a child-friendly company.”

Researchers looking at the My Friend Cayla doll also found the toy is pre-programmed with phrases that reference both Disney World and Disney movies, for example, saying that her favorite movie is Disney’s “The Little Mermaid” or telling children she loves going to Disneyland, the FTC complaint notes. It’s unclear whether there is any official connection or agreement between Genesis and Disney, and the complaint asks the FTC to investigate.

Babich said she and her daughter spoke with Cayla about Disney, but the conversation was not initiated by the doll. Her family had just returned from a trip there, and her daughter asked Cayla about the theme park.

“It didn’t feel like it was product placement,” she said.

Critics also say the Bluetooth connection used with the doll and robot is not secure. Research by the Norwegian group found that when the doll and robot were on, but not already paired with a device, anyone within Bluetooth range could connect to the toys without a password or other authentication.

One alternative to creating cloud-connected “smart” internet toys is for companies to fashion devices that rely on an “intranet” or platforms that don’t need to connect with the web in order to make voice-recognition and other features work, said Machiraju, who has extensive experience in tech research and development in the private sector and higher education.
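
As a rough illustration of the alternative Machiraju describes, a toy could match a child’s request against a small vocabulary stored on the device instead of shipping recordings to a cloud service. The Python sketch below is hypothetical and simplified (real offline voice recognition would still need an on-device speech model), but it shows the design point: nothing the child says has to leave the toy.

    # Hypothetical sketch of the "no cloud needed" approach: a request that has
    # already been transcribed on the device is matched against a small local
    # vocabulary, so no audio or text leaves the toy. Illustrative only.

    LOCAL_RESPONSES = {
        "tell me a story": "Once upon a time, a curious robot learned to count the stars.",
        "play a game": "Let's play I spy! I spy something round.",
        "what is your name": "My name is Toy. What's yours?",
    }

    def respond_locally(request_text: str) -> str:
        # Look up a reply on the device itself; unknown requests get a safe default.
        return LOCAL_RESPONSES.get(
            request_text.strip().lower(),
            "I don't know that one yet, but I'd love to play something else!",
        )

    if __name__ == "__main__":
        print(respond_locally("Tell me a story"))
        print(respond_locally("What's the weather?"))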

Ultimately, companies need to ensure that their testing cycles for new products account for privacy questions before they bring new tools to consumers, he said. Machiraju said that the push-and-pull between privacy advocates, regulators, and companies can result in better products, with stronger privacy protections, coming into the market.

“It will create awareness,” he said. “This is good for companies and good for consumers. It brings those checks and balances to the process.”

Writers Sean Cavanagh and Michele Molnar contributed to this report.



A version of this news article first appeared in the Digital Education blog.