Elon Musk says he can stop child abuse on Twitter. So far, he’s undercut the effort and fired its watchdogs.

Less than a month after taking control of Twitter, Elon Musk announced that addressing child sexual abuse content on the social media platform was “Priority #1.”

But according to four former employees, one current employee, internal company records, and interviews with people working to stop child abuse content online, there is little evidence that the company has taken more aggressive steps under his management or devoted more resources to the platform’s longstanding problem with child sexual exploitation content.

Meanwhile, Musk has turned the issue of online safety into part of an effort to humiliate Twitter’s previous leaders and to portray his ownership of the company as part of a sociopolitical battle against what he calls the “woke mind virus,” a swipe at center-left ideals. The shift comes as Musk increasingly embraces far-right online rhetoric, which often includes false allegations of child sexual abuse.

“It is a crime that they refused to take action against child abuse for years!” Musk tweeted Friday in response to a resignation letter from a member of the company’s Trust and Safety Council, which worked on child abuse issues.

“This is wrong,” former CEO Jack Dorsey replied.

Under Musk’s management, Twitter has said that thanks to new partnerships with unnamed organizations and new “detection and enforcement methods” the company hasn’t disclosed, it suspended more accounts for child sexual abuse content in November than in any other month of 2022.

Meanwhile, in the wake of layoffs, firings and resignations, Twitter’s resources for tackling child sexual exploitation content (sometimes called child pornography or child sexual abuse material) have been depleted.

As staff numbers continue to fluctuate at Twitter, internal records obtained by NBC News and CNBC show that as of early December, approximately 25 employees held titles related to “Trust and Safety,” out of the approximately 1,600 employees still with the company. That total includes more than 100 people whom Musk has authorized to work at Twitter but who are employed at his other companies Tesla, SpaceX and The Boring Co., as well as various investors and advisers.

A former Twitter employee who worked in child safety said they know of “a handful” of people at the company still working on the issue, but that most or all of the product managers and engineers on the team are no longer there. The employee spoke on condition of anonymity for fear of retaliation for discussing the company.

Twitter’s headcount was more than 7,500 at the end of 2021. Former employees said some layoffs would likely have happened even if Musk hadn’t taken over the business.

Twitter did not respond to requests for comment.

Under Musk’s direction, the company has also withdrawn from some of its external commitments to child safety groups.

Twitter, for example, dissolved its Trust and Safety Council on Monday; the council included 12 groups that advised the company on its efforts to address child sexual abuse.

The National Center for Missing & Exploited Children (NCMEC), the organization tasked by the U.S. government with tracking online reports of child sexual abuse material, said that under Musk’s leadership, little has changed so far in terms of Twitter’s reporting practices.

“Despite the rhetoric and some of what we’ve seen people post online, the CyberTipline numbers are nearly identical to the numbers before Musk stepped in,” NCMEC representative Gavin Portnoy said, referring to the organization’s central CSAM reporting system.

Portnoy noted that one change the group noticed was that Twitter did not send a representative to the organization’s annual social media roundtable.

“The person who previously attended was one of those who resigned,” Portnoy said. Asked whether Twitter was invited to send a replacement, Portnoy said the company declined.

More recently, Musk has used the topic of child sexual exploitation content to attack former Twitter employees, particularly Yoel Roth, who led the company’s trust and safety efforts and whom Musk praised when he took over the company in October. Roth left Twitter a few weeks after the U.S. midterm elections.

Musk suggested that Roth’s doctoral dissertation about the LGBTQ dating app Grindr advocated the sexualization of children, when the opposite is true. In the dissertation, Roth, who is openly gay, specifically stated that Grindr was not a safe space for minors, and discussed how age-appropriate content and spaces could be created for LGBTQ youth to connect online and through apps.

Musk’s misleading claims exposed Roth to widespread online harassment, including on Twitter, where users hurled homophobic and antisemitic threats, memes and insults at him. On Monday, it was reported that Roth had fled his home because of threats that followed Musk’s baseless accusations against him.

Musk’s provocative tweets targeting Roth fit into a growing pattern of far-right attacks on LGBTQ people using false claims of “grooming,” said Laurel Powell, deputy director of communications for programs at the Human Rights Campaign, a nonprofit LGBTQ advocacy organization.

“This grooming rhetoric really recycles hate speech toward LGBTQ+ people in many ways,” Powell said. “It’s a really dangerous moment we’re in, with someone who has as broad a platform as Mr. Musk amplifying this debunked, false rhetoric.”

Twitter’s flawed efforts to combat child sexual exploitation content are well documented. In 2013, the company said it would introduce PhotoDNA, a technology that prevents the publication of child sexual abuse material (CSAM) already in a database of known CSAM. That technology, however, cannot detect newly created material.

The company’s transparency reports, which detail issues such as legal requests and account removals, show that more than 1 million accounts were removed in 2021 for violating its rules against child sexual exploitation content.

In 2021, Twitter reported detecting 86,666 instances of CSAM on its platform. Portnoy said the number should be higher. “We’ve always felt that there should be more reports coming from Twitter, no matter how you cut it, given the sheer number of users there,” he said.

Child sexual abuse content remains a problem for Twitter, as it does for most major social media platforms, which continue to grapple with it in various ways.

Some advertisers left Twitter earlier this year after their ads were found appearing next to problematic content. Twitter was sued in 2021 by a victim of child sexual abuse and the victim’s mother, who alleged that the company didn’t act quickly enough when it was warned about a video of the child circulating on the platform. A second child was later added to the case, which is now pending in the 9th U.S. Circuit Court of Appeals.

Moderation of this content typically relies on a combination of automated detection systems, expert internal teams and external contractors to identify and remove child abuse content. Twitter’s policy defines the content as “images and videos referred to as child pornography, as well as written requests and other material that promotes the sexual exploitation of children.”

According to people familiar with the situation and internal records, layoffs, firings and resignations have cut Twitter’s engineering staff by more than half, including many employees and leaders who worked on trust and safety features and improvements to the platform. Musk has also cut contractors, and the company is leaning more heavily on automation for its moderation needs, Twitter’s current head of trust and safety, Ella Irwin, told Reuters.

“You would tend to think that more bodies means more safety,” Portnoy said. “I mean, that’s discouraging.”

It’s unclear how many Twitter employees are left to work on child safety issues.

A LinkedIn search for current Twitter employees who say they work on child safety turned up only a handful of accounts. Bloomberg and Wired have each previously reported that layoffs and terminations at Twitter, directed by Musk, have reduced the number of people working on content moderation there, including those focused on child sexual abuse content.

Despite this, Musk has claimed he is reorienting the company to prioritize child safety.

Twitter has turned to at least one outside researcher for help: Andrea Stroppa, an Italian cybersecurity researcher who says he is friendly with Musk and who has praised Musk online since he took over Twitter.

Stroppa, who has previously analyzed bots and propaganda on Twitter, told NBC News that he is working with company employees to find and remove child sexual abuse content and accounts. Irwin thanked Stroppa on Twitter for his “partnership and commitment.”

Stroppa, who remains an independent researcher, said Twitter felt its previous efforts were lacking and is now moving quickly to find and suspend accounts that post child sexual abuse content. He said the company has changed its approach from removing individual tweets to immediately banning accounts found to be violating its policies.

“I think this is a radical change,” he said in a phone interview.

Marijke Chartouni, a survivor of human trafficking and exploitation who now works to draw attention to the issue, said she had previously gotten good results by flagging problematic accounts and content to Twitter starting in 2020.

The platform wasn’t perfect, she said, but it wasn’t as negligent as Musk has claimed. “The ‘old’ Twitter responded quickly and closed accounts,” she said in an email. “I felt like I was making some progress.”
