Facebook has been in the hot seat in recent weeks over revelations about its handling of user data, but national concern over privacy may now be focusing on another tech giant: Google.
YouTube, a Google subsidiary, was accused on Monday of improperly collecting data, without parental consent, from users under the age of 13 who log on and watch videos.
A coalition of advocacy groups focused on privacy, children’s rights and consumer rights filed a complaint against YouTube with the Federal Trade Commission over the company’s practices. The complaint contends that the site is in violation of the Children’s Online Privacy Protection Act, which requires websites to obtain parental permission before collecting or using data from minors, including facts like their location and viewing preferences.
YouTube requires users to be at least 13 in order to register an account, noting in its terms of service that the site “is not intended for children under 13.”
“If you are under 13 years of age, then please do not use the Service. There are lots of other great web sites for you. Talk to your parents about what sites are appropriate for you,” the terms of service state.
Yet YouTube does not require an account in order to watch videos, and many videos on the site are specifically geared toward a young audience, the complaint noted. Though YouTube also has a separate YouTube Kids app, videos available on the app are still available through the regular YouTube site, and a majority of children watch videos on YouTube instead of the Kids app.
According to a Smarty Pants 2017 Brand Love Study cited in the complaint, YouTube is the “most-loved and most-recognized” brand among kids ages 6-12 in the U.S., and 80% of children in that age group use YouTube daily.
Among the groups taking part in the complaint are the Center for Digital Democracy, the Campaign for a Commercial-Free Childhood, Consumer Federation of America, Electronic Privacy Information Center and Parents Across America.
The complaint alleges that YouTube collects data, including geolocation information and IP addresses, from all of its users, including those who are underage, and uses that data to target ads, among other things. COPPA specifically prohibits collecting such data from minors without parental consent.
Officials at YouTube are also fully aware that the site is used by children, the complaint contended. Under COPPA, a website counts as being “child-directed” and subject to regulation if the site “has actual knowledge it is collecting personal information directly from users of a child-directed site, and continues to collect that information,” the complaint stated.
“YouTube knows children are watching content on their site, and has created content channels specifically aimed at them, but does not appear to obtain the required parental consent before collecting information about them,” Katie McInnis, policy counsel for the Consumers Union, one of the groups behind the complaint, told the Guardian.
Ed Mierzwinski, senior director of the federal consumer program at the U.S. Public Interest Research Group, one of the groups that filed the complaint, noted in an interview with Mic that while data protections in the U.S. are largely done by sector, children’s protections are “one of the strongest parts of data protection in the U.S.”
“You and I don’t have very strong protection, but kids under 13 do,” Mierzwinski said. “And that’s part of the message we’re sending here. Even when children under 13 are protected by a strong law, the companies have been ignoring them. That’s why we need a strong action by the FTC.”
In an emailed statement, a YouTube spokesperson said: “We are reviewing the complaint and will evaluate if there are things we can do to improve. Protecting kids and families has always been a top priority for us. Because YouTube is not for children, we’ve invested significantly in the creation of the YouTube Kids app to offer an alternative specifically designed for children.”
The company also noted that it prohibits advertisers from targeting personalized ads to users under 13 in its ads policies, and its ads systems do not allow for targeting to underage users. YouTube creators also have the ability to restrict personalized advertising on their channels, and users can adjust their own ad settings and watch history.
The FTC complaint comes at a moment of reckoning for data protection, as Facebook came under fire amid reports that at least 87 million users’ data was collected by data firm Cambridge Analytica. Facebook CEO Mark Zuckerberg is scheduled to appear before Congress on Tuesday and Wednesday to address the company’s data protection policies, and the company has taken steps to address the scandal in recent weeks.
U.S. internet users still report a fairly high level of trust in Google, however. According to a Reuters/Ipsos poll released in the wake of the Cambridge Analytica scandal, 62% of Americans said they trust Google to obey privacy laws, compared with only 41% who said the same about Facebook.
The company’s collection and use of user data, however, is particularly widespread. According to the Princeton Web Transparency and Accountability Project, 76% of websites contain hidden Google trackers, and the company is set to control 37.2% of the digital ad market in 2018, according to market research company eMarketer.
Google, which is owned by parent company Alphabet, has also faced its share of past privacy battles. In 2011, the company agreed to be audited by the FTC for 20 years over its use of deceptive privacy tactics in its short-lived Buzz social network, and in 2013 the company was forced to implement a new privacy program after conceding that its Street View mapping project had violated people’s privacy. A 2012 case over Google’s circumvention of privacy settings in Apple’s Safari browser resulted in the company paying a $22.5 million fine, the largest civil penalty ever levied by the FTC, according to the New York Times.
Google, of course, has continued to collect and use data despite these legal challenges. With the spotlight now on data protection, though, is this the moment to bring about real change that goes beyond Facebook?
Tech companies worldwide are currently reassessing their data protection policies as the European Union prepares to begin enforcing a sweeping new privacy law, the General Data Protection Regulation, in May. The law imposes strict data protection requirements on companies and hefty fines on those that fail to comply. Facebook has said that it will implement its GDPR protocols worldwide, strengthening data protection for users even outside the E.U., and the current moment could pressure other companies to similarly roll out stronger protections worldwide.
Lawmakers and privacy advocates, however, hope that new data protections won’t just be limited to Europe and the GDPR’s residual effects. Google CEO Sundar Pichai has been invited to testify before the Senate Judiciary Committee in April along with Zuckerberg and Twitter CEO Jack Dorsey, and in an interview on CBS’s Face the Nation Sunday, Republican Sen. John Kennedy suggested Congress could possibly impose regulations on Facebook in the wake of the Cambridge Analytica scandal. Such regulations could have a broad reach that would affect Google and other major tech companies.
“I think that we shouldn’t let this moment deny us from getting better protections,” Mierzwinski said. “We have all said, ‘Yeah, it’s great Mark Zuckerberg is promising to comply worldwide with the European rules ... but why doesn’t every company have to comply worldwide with strong rules that are written on the books in the United States?’”
“The question for Congress is will Congress pass a strong rule for privacy in the United States, and before that, will the FTC have the guts to impose many, many millions of dollars in penalties to these companies?”