An app designed to make browsing YouTube safer for children has come under fire for linking to "inappropriate" content.
Two child advocacy groups have flagged up videos that they say "would be extremely disturbing for young children to view".
They have lodged a complaint with the US regulator, the Federal Trade Commission, according to the Wall Street Journal.
YouTube said any inappropriate videos flagged up to it would be removed.
The complaint, filed by the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, claims that the groups found links to videos with explicit sexual language, jokes about paedophilia and drug use, and adult discussions about violence, pornography and suicide.
"Google promised parents that YouTube Kids would deliver appropriate content for children, but it has failed to fulfil its promise," Aaron Mackey, a lawyer representing the groups told the Wall Street Journal.
A YouTube spokesperson told the BBC: "We work to make the videos in YouTube Kids as family-friendly as possible and take feedback very seriously.
"We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed."
Parents can also turn off the app's search function, which limits what content children can access.
YouTube Kids launched in February, in the US only, promising specially curated video content suitable for children.
It found itself in hot water in April when a group of child safety experts complained that the app mixed programming with branded videos from companies such as McDonald's, Mattel and Hasbro.
(BBC)