ThisisBigBrother.com - UK TV Forums

-   -   Twitter gets sued for refusal to remove CP, claims it doesn't violate their policies (https://www.thisisbigbrother.com/forums/showthread.php?t=373046)

Calderyon 22-01-2021 04:33 PM

Twitter gets sued for refusal to remove CP, claims it doesn't violate their policies
 
Quote:

Twitter refused to remove child porn because it didn’t ‘violate policies’: lawsuit

By Gabrielle Fonrouge

January 21, 2021 | 10:35am
Twitter refused to take down widely shared pornographic images and videos of a teenage sex trafficking victim because an investigation “didn’t find a violation” of the company’s “policies,” a scathing lawsuit alleges.

The federal suit, filed Wednesday by the victim and his mother in the Northern District of California, alleges Twitter made money off the clips, which showed a 13-year-old engaged in sex acts and are a form of child sexual abuse material, or child porn, the suit states.

The teen — who is now 17 and lives in Florida — is identified only as John Doe and was between 13 and 14 years old when sex traffickers, posing as a 16-year-old female classmate, started chatting with him on Snapchat, the suit alleges.

Doe and the traffickers allegedly exchanged nude photos before the conversation turned to blackmail: If the teen didn’t share more sexually graphic photos and videos, the explicit material he’d already sent would be shared with his “parents, coach, pastor” and others, the suit states.

Doe, acting under duress, initially complied and sent videos of himself performing sex acts and was also told to include another child in his videos, which he did, the suit claims.

Eventually, Doe blocked the traffickers and they stopped harassing him, but at some point in 2019, the videos surfaced on Twitter under two accounts that were known to share child sexual abuse material, court papers allege.

Over the next month, the videos would be reported to Twitter at least three times — first on Dec. 25, 2019 — but the tech giant failed to do anything about it until a federal law enforcement officer got involved, the suit states.

Doe became aware of the tweets in January 2020 because they’d been viewed widely by his classmates, which subjected him to “teasing, harassment, vicious bullying” and led him to become “suicidal,” court records show.

While Doe’s parents contacted the school and made police reports, he filed a complaint with Twitter, saying there were two tweets depicting child pornography of himself and they needed to be removed because they were illegal, harmful and were in violation of the site’s policies.

A support agent followed up and asked for a copy of Doe’s ID so they could prove it was him and after the teen complied, there was no response for a week, the family claims.

Around the same time, Doe’s mother filed two complaints to Twitter reporting the same material and for a week, she also received no response, the suit states.

Finally on Jan. 28, Twitter replied to Doe and said they wouldn’t be taking down the material, which had already racked up over 167,000 views and 2,223 retweets, the suit states.

“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” the response reads, according to the lawsuit.

“If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”

In his response, published in the complaint, Doe appeared shocked.

“What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down,” the teen wrote back to Twitter.

He even included his case number from a local law enforcement agency, but still the tech giant allegedly ignored him and refused to do anything about the illegal child sexual abuse material — as it continued to rack up more and more views.

Two days later, Doe’s mom was connected with an agent from the Department of Homeland Security through a mutual contact who successfully had the videos removed on Jan. 30, the suit states.

“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children,” states the suit, filed by the National Center on Sexual Exploitation and two law firms.

“This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children.”

The disturbing lawsuit goes on to allege Twitter knowingly hosts creeps who use the platform to exchange child porn material and profits from it by including ads interspersed between tweets advertising or requesting the material.

Early Thursday, Twitter declined comment to The Post but later in the day, reversed course and sent a statement by email.

“Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy,” a Twitter spokesperson wrote.

“Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm — both on and offline.”
https://nypost.com/2021/01/21/twitte...mpression=true

Nicky91 22-01-2021 04:33 PM

fake news :idc:

Vicky. 22-01-2021 04:34 PM

Ongoing issue. I have reported SO MUCH of this **** and always been told it does not violate terms. See also actual death threats to women/young teens, and targeted harassment ongoing for long periods. Actual harassment, by multiple people, not 'the odd nasty message'.
But yeah, the child porn stuff is much worse. Twitter is currently a cesspit, and any action taken against them, I fully support.

Tom4784 22-01-2021 04:36 PM

It's a disgrace and I hope the lawsuit works out. There is absolutely no reason to keep stuff like that up and Twitter is big enough and profitable enough to be able to afford to keep on top of this ****. It's an outrage.

Nicky91 22-01-2021 04:38 PM

Quote:

Originally Posted by Dezzy (Post 10989530)
It's a disgrace and I hope the lawsuit works out. There is absolutely no reason to keep stuff like that up and Twitter is big enough and profitable enough to be able to afford to keep on top of this ****. It's an outrage.

as if, who made these claims? some far-right bigot trying to avenge Trump by trying to bring down Twitter with fake claims of CP being allowed on Twitter

Vicky. 22-01-2021 04:49 PM

Quote:

Originally Posted by Nicky91 (Post 10989537)
as if, who made these claims? some far-right bigot trying to avenge Trump by trying to bring down Twitter with fake claims of CP being allowed on Twitter

You clearly have no idea what you are talking about on this topic, as this has long been deemed an issue by people of all political persuasions... not 'just Trump supporters'. It's been ongoing for bloody ages.

Part of the problem with having social media where users upload without mod approval is that some CP and various illegal content will inevitably make its way on there, BUT when reported it should be removed instantly. Not excused. This issue is actually quite similar to the Pornhub thing, in that people report VERY dodgy stuff for ages, and nowt happens. Which kind of says it's the policies that are the main issue, if this stuff really is deemed fine.

Nicky91 22-01-2021 04:51 PM

Quote:

Originally Posted by Vicky. (Post 10989552)
You clearly have no idea what you are talking about on this topic, as this has long been deemed an issue by people of all political persuasions... not 'just Trump supporters'. It's been ongoing for bloody ages.

umm no, they use other social media platforms, not twitter

Vicky. 22-01-2021 04:51 PM

Quote:

Originally Posted by Nicky91 (Post 10989553)
umm no, they use other social media platforms, not twitter

?

DouglasS 22-01-2021 04:53 PM

They are banning Hopkins and Trump, yet don’t even know their own rules and actively allow child porn to be promoted, thereby promoting pedophilia :skull:

Priorities..

Nicky91 22-01-2021 04:53 PM

Quote:

Originally Posted by Vicky. (Post 10989556)
?

snapchat, and chatrooms in general

lesser security over there

but of course the biased NY Post needs to make a vile dig towards Twitter in light of the whole Trump having gotten banned thing, Trump is the NY Post's hero or something

Vicky. 22-01-2021 05:00 PM

Quote:

Originally Posted by Nicky91 (Post 10989559)
snapchat, and chatrooms in general

lesser security over there

but of course the biased NY Post needs to make a vile dig towards Twitter in light of the whole Trump having gotten banned thing, Trump is the NY Post's hero or something

It's an issue on a few social media platforms for sure.

But that does not absolve Twitter of blame in this. Twitter should be held MORE to account than others really, given it's such a huge platform!

Oliver_W 22-01-2021 05:04 PM

Whatever one thinks of the likes of Katie Hopkins, banning actual CP and those involved in trafficking should come waaaaay before banning a random gobshyte.

But this whole thing is reason number 221856458 why social media is a cesspit and nothing of value would be lost.

I'd have thought bots (as in actual bots, not Russian Drumpf Bots lol) would have the capacity to recognise and automatically moderate such material? Or at least flag it up for human review.

A way to avoid such trash might be to ban all nudity and pornography? That way, the accounts sharing the material would be banned anyway...

Vicky. 22-01-2021 05:07 PM

Quote:

Originally Posted by Oliver_W (Post 10989567)

I'd have thought bots (as in actual bots, not Russian Drumpf Bots lol) would have the capacity to recognise and automatically moderate such material? Or at least flag it up for human review.

A lot of it is flagged this way. It's impossible though to have a foolproof system with bots, and people constantly find ways around it too. It needs constant upkeep AND human interaction. A LOT of human oversight, and even that CAN fail in odd cases. But this... is them actively reviewing stuff and saying 'nah, it's fine', which is a different kettle of fish to it finding its way onto the site, then being deleted.
Quote:

A way to avoid such trash might be to ban all nudity and pornography? That way, the accounts sharing the material would be banned anyway...
Well yeah, but 'muh boobies'?1?!?!?!

In fact, from memory, this open acceptance of 'MAPs' and such happened when Instagram banned pornography (and 'anime porn' stuff), or something. They migrated to Twitter. And have been accepted, seemingly. Might be wrong on that though... as that's just memory from a while back.

An easy-ish way I see that would catch much of it is... images have to be approved before being posted. BUT, that would **** up the 'instantly available' part of social media too...

Oliver_W 22-01-2021 05:42 PM

Quote:

Originally Posted by Vicky. (Post 10989571)
A lot of it is flagged this way. It's impossible though to have a foolproof system with bots, and people constantly find ways around it too. It needs constant upkeep AND human interaction. A LOT of human oversight, and even that CAN fail in odd cases. But this... is them actively reviewing stuff and saying 'nah, it's fine', which is a different kettle of fish to it finding its way onto the site, then being deleted.

Well yeah, but 'muh boobies'?1?!?!?!

In fact, from memory, this open acceptance of 'MAPs' and such happened when Instagram banned pornography (and 'anime porn' stuff), or something. They migrated to Twitter. And have been accepted, seemingly. Might be wrong on that though... as that's just memory from a while back.

An easy-ish way I see that would catch much of it is... images have to be approved before being posted. BUT, that would **** up the 'instantly available' part of social media too...

Accepting people who call themselves MAPs is ridiculous tbh. As for "muh boobies" there's no need for that kind of content to be on twitter, there are plenty of specialised sites people can go to...

Vicky. 22-01-2021 05:54 PM

Quote:

Originally Posted by Oliver_W (Post 10989617)
Accepting people who call themselves MAPs is ridiculous tbh. As for "muh boobies" there's no need for that kind of content to be on twitter, there are plenty of specialised sites people can go to...

You would think. I foresee a ****storm though, if they banned this kind of thing. In reality, I cannot see a way out for them at this stage that does not involve banning the content wholesale really.

Oliver_W 22-01-2021 06:01 PM

Quote:

Originally Posted by Vicky. (Post 10989629)
You would think. I foresee a ****storm though, if they banned this kind of thing. In reality, I cannot see a way out for them at this stage that does not involve banning the content wholesale really.

I really don't see why it shouldn't be :shrug: Like it's not exactly a free speech thing is it :laugh:

Marsh. 22-01-2021 07:54 PM

Imagine Katie Hopkins being the first person to come to your mind when discussing this. :crazy:

bitontheslide 22-01-2021 08:30 PM

Something has to be done about social media; it is having a corrosive effect across the entirety of society

Scarlett. 22-01-2021 09:15 PM

I wonder if some reports are dealt with via bots?

DouglasS 23-01-2021 01:03 AM

Quote:

Originally Posted by Marsh. (Post 10989728)
Imagine Katie Hopkins being the first person to come to your mind when discussing this. :crazy:

Exactly my point: they are wasting time on completely trivial matters over the important issues going on on Twitter. Thanks for proving my point :clap1:

Marsh. 23-01-2021 01:18 AM

Quote:

Originally Posted by DouglasS (Post 10989911)
Exactly my point: they are wasting time on completely trivial matters over the important issues going on on Twitter. Thanks for proving my point :clap1:

Didn't ask.

Oliver_W 23-01-2021 09:21 AM

Quote:

Originally Posted by Marsh. (Post 10989728)
Imagine Katie Hopkins being the first person to come to your mind when discussing this. :crazy:

Who else would I say? Trump should have been banned for basically soft-condoning the rioters' actions at the Capitol, while Hopkins' only crime was being "problematic".

Marsh. 23-01-2021 05:14 PM

Quote:

Originally Posted by Oliver_W (Post 10990013)
Who else would I say? Trump should have been banned for basically soft-condoning the rioters' actions at the Capitol, while Hopkins' only crime was being "problematic".

That your response immediately falls to Katie Hopkins when discussing child porn on twitter is... strange at best.

She has nothing to do with it. People/users are banned from it everyday.

arista 23-01-2021 05:28 PM

Quote:

Originally Posted by bitontheslide (Post 10989780)
Something has to be done about social media; it is having a corrosive effect across the entirety of society


Maybe Biden may go too far, though.

Denver 23-01-2021 05:29 PM

Quote:

Originally Posted by Scarlett. (Post 10989820)
I wonder if some reports are dealt with via bots?

Yea, I doubt Jack is looking into the hundreds of thousands of complaints they probably get a day



Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2024, vBulletin Solutions Inc.