Australia slams Twitter (now X) for 80% cut in trust and safety engineers
Australia has called out X, formerly Twitter, for slashing its online trust and safety resources and hindering its ability to respond to concerns about harmful content on its platform.
The Australian eSafety Commissioner released a transparency report detailing cuts the microblogging site has made to its trust and safety teams since it was acquired in October 2022. This is the first time specific figures have been given on where the cuts were made, following a legal notice served to X Corp seeking information on what it was doing to comply with Australia’s online safety rules.
The transparency report summarizes X’s response, eSafety said, noting that the data provided revealed “deep cuts” to its safety and public policy staff.
Globally, X had reduced its trust and safety staff by 30%; in the Asia-Pacific region, the cut was 45%.
The number of engineers dedicated to trust and safety issues was slashed by 80% globally, while content moderators hired by X were cut by 52%, according to eSafety.
Public policy personnel were cut by 78% globally and 73% in Asia-Pacific, with its Australian outfit losing its entire team.
“Companies with low numbers of trust and safety personnel may have reduced capacity to respond to online hate, as well as other online harms,” the government agency said. “The result is that the burden for safety tends to fall on the user or group experiencing the abuse, rather than the platform taking responsibility for harmful content and conduct on their service.”
It noted that X's median response time to user reports of posts had slowed by 20% since its acquisition, while its median response time to reports about direct messages had slowed by 75%.
“Prompt action on user reports is particularly important given that X relies solely on user reports to identify hateful conduct in direct messages,” eSafety said.
As of May 2023, automated tools designed to detect volumetric attacks in breach of X’s targeted harassment policy were not in use on the platform. In addition, X did not block hyperlinks to websites containing harmful content.
The social media platform further reinstated 6,103 previously banned accounts between November 2022 and May 2023, which eSafety believes referred to accounts in Australia rather than globally. It cited media reports that estimated more than 62,000 previously suspended accounts were reinstated worldwide.
Of the 6,103 reinstated accounts believed to relate to Australia, 194 accounts previously were suspended for hateful conduct violations, eSafety said. It added that X did not implement additional scrutiny on reinstated accounts.
“It’s almost inevitable that any social media platform will become more toxic and less safe for users if you combine significant reductions to safety and local public policy personnel with thousands of account reinstatements of previously banned users,” said eSafety Commissioner Julie Inman Grant. “You’re really creating a bit of a perfect storm.”
She also pointed to X’s diminished ability to respond quickly to user reports of online hate on its platform, with some users waiting up to 28 hours for a response to reports about direct messages.
eSafety said it has issued a notification to X Corp confirming the company’s failure to comply with the notice under the Online Safety Act.
The government agency last month also commenced civil penalty proceedings against X Corp for its alleged failure to comply with an earlier reporting notice issued in February 2023, on how it was meeting the country’s online safety rules concerning child sexual exploitation and abuse material and activity.
The proceedings followed an infringement notice of AU$610,500 served on X Corp in September 2023 over its failure to comply with the February 2023 notice.
X Corp has not paid the infringement notice and instead has sought judicial review of eSafety’s reliance on the transparency notice, the government agency said. It added that it has asked for the judicial review to be heard in tandem with the civil penalty proceedings.
ZDNET sent questions to X and received this presumably automated email response: “Busy now, please check back later.”
eSafety in February also served legal notices to several social media platforms, including Twitter, Google, TikTok, Twitch, and Discord, seeking answers on the steps each was taking to address child sexual exploitation and abuse, sexual extortion, and the promotion of harmful content by its algorithms.
Australian Broadcasting Corporation (ABC) last August culled all but four of its accounts on X, citing trust issues and a need to move to where its audiences are.