YouTube under investigation over allegations it violates kids' privacy

YouTube has attracted a massive audience of younger viewers. Photo/Getty Images

The US government is in the late stages of an investigation into YouTube over its handling of children's videos, according to four people familiar with the matter, a probe that threatens the company with a potential fine and has already prompted the tech giant to reevaluate some of its business practices.
The Wall Street Journal first reported Wednesday that YouTube was considering moving all children's content off the service and into a separate app, YouTube Kids, to better protect younger viewers from problematic material, a change that would be difficult to implement because of the sheer volume of content on YouTube and that could prove costly to the company in lost advertising revenue. A person close to the company said that option was highly unlikely, but that other changes were on the table.
YouTube Kids draws a tiny fraction of YouTube's audience, which tops 1.9 billion logged-in users each month. Kids tend to switch from YouTube Kids to the main platform around the age of seven, Bloomberg reported this week.
The internal conversations come after years of complaints by consumer advocates and independent researchers that YouTube had become a leading conduit for political disinformation, hate speech, conspiracy theories and content threatening the well-being of children. The prevalence of preteens and younger children on YouTube has been an open secret within the technology industry and repeatedly documented by polls even as the company insisted that the platform complied with a 1998 federal privacy law that prohibits the tracking and targeting of those under 13.

The FTC has been investigating YouTube's treatment of kids based on multiple complaints, dating back to 2015, arguing that both YouTube and YouTube Kids violate federal law, according to the people familiar with the investigation. The exact nature and status of the inquiry are not known, but one of the sources said it is in advanced stages, suggesting that a settlement, and a fine depending on what the FTC determines, could be forthcoming.
"Google has been violating federal child privacy laws for years," said Jeffrey Chester, executive director of the Center for Digital Democracy, one of the groups that has repeatedly complained about YouTube.
Major advertisers have also pushed YouTube and other platforms to clean up their content amid waves of controversy over the past two years.
A report last month by PwC, a consulting group, said that Google had an internal initiative called Project Unicorn that sought to make the company's products comply with the federal child privacy law, the Children's Online Privacy Protection Act, known by its acronym COPPA.
The company that commissioned the PwC report, SuperAwesome, helps technology companies provide services without violating COPPA or European child-privacy restrictions against the tracking of children.
"YouTube has a huge problem," said Dylan Collins, chief executive of SuperAwesome. "They clearly have huge amounts of children using the platform, but they can't acknowledge their presence."
He said the steps being considered by YouTube would help, but "they're sort of stuck in a corner here, and it's hard to engineer their way out of the problem."
Earlier this month, YouTube made its biggest change yet to its hate speech policies, banning direct assertions of superiority over protected groups, such as women, veterans and minorities, and banning users from denying that well-documented violent events took place. Previously, the company prohibited users from making direct calls for violence against protected groups, but stopped short of banning other forms of hateful speech, including slurs. The changes were accompanied by a purge of thousands of channels featuring Holocaust denial and white supremacist content.
The company also recently disabled comments on videos featuring minors and banned minors from live-streaming without an adult present in the video. Executives have also moved to limit the platform's algorithms from recommending content in which a minor is featured in a sexualized or violent situation, even if that content does not technically violate the company's policies.
- Washington Post