YouTube ads pulled over 'paedophile ring'
arista
22-02-2019, 04:32 PM
[Several companies, including Nestle and Fortnite creator Epic Games, are pulling advertisements from YouTube over concerns that paedophiles have been leaving disturbing comments on videos of children.
Key points:
- YouTube's recommendations algorithm makes it easy to find videos of young children in compromising poses
- Many comments left on these videos are paedophilic in nature, often leaving timestamps for revealing moments or links to other unlisted videos
- YouTube has disabled comments on tens of millions of videos and big advertisers have suspended their ads
A video from popular YouTuber MattsWhatItIs and a report from Wired showed that some users had been making unseemly comments on innocuous videos of kids. MattsWhatItIs said he had discovered a "softcore paedophile ring" operating on the website.]
What a mess
Good they are sorting it out.
https://www.abc.net.au/news/2019-02-22/youtube-ads-pulled-over-paedophiles-commenting-on-videos/10838800
Cherie
22-02-2019, 04:55 PM
Disgusting pigs. Glad it is getting sorted.
Oliver_W
22-02-2019, 05:01 PM
Between this and the grooming on TikTok, I really wish parents would make more effort in monitoring their children's internet use...
Morgan.
22-02-2019, 05:13 PM
Yeah, they've been disgustingly demonetising family vlog channels for no reason other than that paedophiles exist.
I agree with most of the new rules and monitoring; I just think what I mentioned is a bit uncalled for.
Crimson Dynamo
22-02-2019, 05:23 PM
Between this and the grooming on TikTok, I really wish parents would make more effort in monitoring their children's internet use...
like how?
Oliver_W
22-02-2019, 05:23 PM
To be fair, most family vlogs probably wouldn't be monetised anyway; they need to meet a certain number of views and subscribers.
Morgan.
22-02-2019, 05:30 PM
To be fair, most family vlogs probably wouldn't be monetised anyway; they need to meet a certain number of views and subscribers.
The one I mentioned was The Ballinger Family, 1.2M subs and over 100k views per video. They're meeting the quota, and they showed that the demonetisation came from YouTube's content-review team after the videos had already been approved.
Oliver_W
22-02-2019, 05:56 PM
like how?
Not giving smartphones to under-12s might help!
But generally, stuff like:
- setting YouTube comments to "approval mode", so they won't show up until approved (rough sketch of the API route below, for anyone comfortable with scripting)
- setting up notifications so messages also go to the parents' email, so they can see what's being sent
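For that first one, the approval queue can also be checked programmatically if you don't want to keep logging into the dashboard. Here's a rough Python sketch using the YouTube Data API v3. It assumes you've enabled the API in a Google Cloud project, downloaded OAuth desktop credentials as client_secret.json, and installed google-api-python-client and google-auth-oauthlib; the channel ID is just a placeholder:
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

# Reading held-for-review comments needs the force-ssl scope.
SCOPES = ["https://www.googleapis.com/auth/youtube.force-ssl"]

# client_secret.json is assumed to be OAuth desktop credentials from
# a Google Cloud project with the YouTube Data API v3 enabled.
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
youtube = build("youtube", "v3", credentials=flow.run_local_server(port=0))

# List comment threads sitting in the "held for review" queue.
# "UC_EXAMPLE_CHANNEL_ID" is a placeholder, not a real channel.
threads = youtube.commentThreads().list(
    part="snippet",
    allThreadsRelatedToChannelId="UC_EXAMPLE_CHANNEL_ID",
    moderationStatus="heldForReview",
    maxResults=50,
).execute()

for t in threads.get("items", []):
    c = t["snippet"]["topLevelComment"]["snippet"]
    print(c["authorDisplayName"], "->", c["textDisplay"][:100])
Total overkill for most parents, obviously; the same queue is visible in YouTube Studio anyway, so this is only for people who like doing things from a terminal.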
Oliver_W
22-02-2019, 06:00 PM
The one I mentioned was The Ballinger Family, 1.2M subs and over 100k views per video. They're meeting the quota, and they showed that the demonetisation came from YouTube's content-review team after the videos had already been approved.
Pfft, that's pathetic. What could be more wholesome than a Christian family singing songs?
If anything, the comments should have been disabled, as it sounds like the issue is paedos either saying creepy **** or giving timestamps to leer over?
user104658
22-02-2019, 06:31 PM
like how?
TBF it's not really that hard with something like TikTok. My daughter uses it, but her videos only publish to friends/family, and comments and PMs are blocked.
I personally wouldn't want her posting to YouTube, but the easiest safeguard there is that her Google account is linked to one of our e-mail addresses, so it would alert us.
That said... she is 9, and I'm not naive enough to think that it'll be easy to police her net access as a young teenager (or at all at 15+), so really all you can do is educate them on staying safe IMO: not sharing too much personal detail, being aware that they might not be talking to who they think they're talking to, etc.
I actually think it's far more sensible to do that early than to try to "keep them away from it", because if you do that, when they reach their later teens and you can't stop them getting access, they're going to be MORE vulnerable to the risks.