Trolls allegedly uploading hidden suicide messages in YouTube Kids cartoons



Parents have been issued a chilling warning after trolls uploaded cartoons to YouTube Kids spliced with a hidden message encouraging children to kill themselves.
“This video was intentionally planted on YouTube Kids to harm our children. He waited until parents’ guards were down. How can anyone do this?” fumed one horrified mom, whose son watched the vile video.
It is not known who uploaded the sick clips, and YouTube sometimes left them up for days before they were taken down.
The now-deleted video, spliced into an episode of the popular kids' cartoon "Splatoon," features a clip from YouTube prankster Filthy Frank.
He appears on screen smirking and wearing sunglasses, describes how kids can harm themselves, then signs off with "End it."
Filthy Frank, who has more than 6 million subscribers, racks up millions of views with "anti-PC, anti-social and anti-couth" videos.
Some show him bathing in noodles or eating raw squid; one showed him pretending to be a One Direction fan committing suicide.
He was unavailable for comment when approached by The Sun, but there is no suggestion he played any part in the trolls' editing of the cartoon.
Free N. Hees, a pediatrician, reported the video to YouTube and got the platform to take it down.
She believes these videos could lead to a rise in child suicide.
“Exposure to videos, photos and other self-harm and suicidal promoting content is a huge problem that our children are facing today,” said Hees, who also blogs on child cyber safety under the pseudonym PediMom.
“We need to fight to have the developers of social media platforms held responsible when they do not assure that age restrictions are followed.”
The National Society for the Prevention of Cruelty to Children (NSPCC) in Britain was appalled when The Sun brought the videos to its attention and slammed YouTube and Google for failing children.
“The very fact that these videos are still there, and can be watched, shows that Google and YouTube are not doing enough to safeguard kids,” a spokesperson for the charity said.
“They’re taking them down when they’re eventually alerted — but it shouldn’t have to be up to organizations like The Sun Online to tell them to take these videos down. We don’t know yet what impact these videos will have on kids — but what we do know is that they will be highly distressing for any young child that sees them. It’s massively concerning; these things should not be available on sites like YouTube.”
The video in question has now been removed from YouTube Kids, an app the platform markets as a safe place for youngsters online.
“We created YouTube Kids to make it safer and simpler for children to explore the world through online video — from their favorite shows and music to learning how to build a model volcano (or make slime), and everything in between,” says a blurb on the website.
“There’s also a whole suite of parental controls, so you can tailor the experience to your family’s needs,” the site continues.
This is not the first time sick content has been spliced with children’s cartoons on YouTube.
In 2017, Peppa Pig cartoons were edited to show distressing scenes — with Peppa getting her teeth pulled out, characters having sex and others being violently attacked.
The recent suicide videos were revealed after an investigation by The Sun showed kids as young as 8 are being targeted by predators and bombarded with sexually explicit messages on a new social media app.
TikTok, which lets users create and share short videos with music and camera effects, has been branded a “magnet for pedophiles” by concerned campaigners and parents.
Meanwhile, another mom has spoken out after her 7-year-old son told other kids they would be “killed in their beds” while playing the deadly “Momo” challenge.
The sick suicide game has swept the web and is already believed to have caused the tragic deaths of two teenagers in Colombia.
The NSPCC has called on YouTube to improve its technology before easily influenced youngsters start harming themselves.
“We’re really concerned because these videos are being specifically targeted towards children with very dark content,” said Tony Stower, head of child safety online for the NSPCC.
“What worries us is that the YouTube algorithm isn’t clever enough for the site to notice that these videos are not acceptable for kids. As a result they are not putting them behind an appropriate age gate.
“We are pushing the government to introduce a duty of care on sites like YouTube and Facebook. They need to prioritize child protection online. But with products like YouTube Kids, all of that safeguarding is an afterthought. Before they start marketing, they need to make sure their sites are safe for children,” he added.
A YouTube spokesperson said the majority of offensive videos are taken down before they are viewed.
“We work hard to ensure YouTube is not used to encourage dangerous behavior and we have strict policies that prohibit videos which promote self-harm.
“Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views.”
