An app designed to make surfing on YouTube safer for children has come under fire for linking to “inappropriate” content.
A number of consumer advocacy groups sent a letter to the FTC Tuesday, asking for an investigation into Google’s YouTube Kids app. The groups allege the app includes ads that deceive its young users in ways that violate the FTC’s policies on how products can be marketed to children.
The letter, which was signed by 10 different consumer and child advocacy groups, contends the app disguises ads for toys and other products as “user generated” videos and fails to disclose the relationship between those users and the manufacturers.
YouTube said it would remove any inappropriate videos flagged to it.
The complaint, filed by the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, claims that the groups found links to videos with explicit sexual language, jokes about paedophilia and drug use and adult discussions about violence, pornography and suicide.
“Google promised parents that YouTube Kids would deliver appropriate content for children, but it has failed to fulfil its promise,” Aaron Mackey, a lawyer representing the groups, told the Wall Street Journal.
A YouTube spokesperson told the BBC: “We work to make the videos in YouTube Kids as family friendly as possible and take feedback very seriously.
“We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed.”
Parents can also turn off the search function in the app, which limits the content children can access.
YouTube Kids launched in February, in the US only, claiming to offer specially curated video content suitable for children.
It found itself in hot water in April when a group of child safety experts complained that the app mixed programming with branded videos from companies such as McDonald’s, Mattel and Hasbro.
Via: BBC, Mashable