
Massive Scandal Causes Them to Jump Ship Like Rats

According to Elon Musk, before he took over, “the real CEO was the head of Trust & Safety.” That would be Yoel Roth. He’s gone. As the dump of Twitter Files continues, three members of his controversial team jumped ship, like the rats they are. Among the “inmates running the asylum” who just went into hiding is Lesley Podesta, niece of John Podesta. He is a slimy character of world-class renown. It’s enough to know, for now, that he was “former campaign chairman for Hillary Clinton.” The reason they’re scrambling for cover is that the core issue the big “censorship” scandal swirls around is child pornography and pedophilia as much as political shadow-banning and censorship.

No trust or safety for kids

Mainstream media isn’t reporting that three ranking members of Twitter’s Trust and Safety Council resigned on December 9. Anne Collier “was founder and executive director of The Net Safety Collaborative.”

Eirliani Abdul Rahman is best known as “co-founder of Youth, Adult Survivors & Kin In Need (YAKIN).” Finally, there is Lesley Podesta, “an advisor to the Young and Resilient Research Center at Western Sydney University.” Nobody is mentioning that she’s John Podesta’s niece.

Twitter’s Trust and Safety Council, “formed in 2016, consists of several dozen people and independent organizations.” Former Twitter management claimed they were there to help “advocate for safety and advise us as we develop our products, programs, and rules.” The team is furious that Musk will actually allow people to say things that offend liberals.

What they aren’t talking about are the things they refused to censor. Things like kiddie porn. There’s a good chance that Twitter made a fortune off exploiting innocent victims. Child pornography was a lot more prevalent on the platform than responsible people would imagine.

“We are announcing our resignation from Twitter’s Trust and Safety Council because it is clear from research evidence that, contrary to claims by Elon Musk, the safety and wellbeing of Twitter’s users are on the decline,” they wrote in a press release. Collier authored the statement, which quickly went viral on Twitter.

She cites dubious statistics from “two watchdogs, the Center for Countering Digital Hate and the Anti-Defamation League, that recently reported a sharp increase in hate speech — including slurs against black people and gay men, as well as antisemitic posts.” The actual data proves the exact opposite. The former team is obviously in panic mode, now that the truth is trickling out.

Let the robots do it

The new head of trust and safety, Ella Irwin, confirmed to Reuters that “the company has gotten rid of some manual reviews for content moderation, instead relying heavily on automation.” Robots follow the rules. Humans like Anne Collier bend them to suit their needs.

As Collier writes, “you really need human review on a lot of abuse reports because they can be very nuanced and highly contextual to offline life, and the platforms don’t really have that context, so it’s really hard for machine learning algorithms to detect all of it or make decisions on all of it.”

That’s funny. Back on January 21, 2021, the New York Post published a big exposé about how “Twitter refused to take down widely shared pornographic images and videos of a teenage sex trafficking victim.” We can trust the bots to do what the humans didn’t. Lesley Podesta went on record last week to claim, “the safety and protection of all users was always paramount.” Tell that to “John Doe.”


He was 17 when the Post story came out. He “was between 13 and 14 years old when sex traffickers, posing as a 16-year-old female classmate, started chatting with him on Snapchat.” After exchanging nude photos, he was blackmailed into providing more, with a second victim on camera at the same time. Eventually, the videos ended up on Twitter, where his classmates saw them.

The boy and his mom quickly contacted Twitter, only to be told by Trust and Safety, after a week-long “investigation,” that there was nothing wrong with the reported videos and images. “On January 28, Twitter replied to Doe and said they wouldn’t be taking down the material, which had already racked up over 167,000 views and 2,223 retweets,” their lawsuit alleges.

“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time. If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”

Doe and his mother sent back a copy of the local police report. “What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down.” The so-called “trust and safety” team did nothing. Eventually, Homeland Security got involved.

“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center for Missing and Exploited Children. This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children.” This rabbit hole is going to go a whole lot deeper before it hits bottom.
