X, formerly Twitter, is trying to reassure lawmakers about the app's safety measures ahead of a Big Tech congressional hearing on Wednesday, which will focus on how companies like X, Meta, TikTok and others are protecting kids online. Over the weekend, the social media company announced via Bloomberg that it would staff a new "Trust and Safety" center in Austin, Texas, which will include 100 full-time content moderators. The move comes more than a year after Elon Musk acquired the company, after which he drastically reduced head count, including trust and safety teams, moderators, engineers and other staff.

In addition, Axios earlier reported that X CEO Linda Yaccarino had been meeting last week with bipartisan members of the Senate, including Sen. Marsha Blackburn, in advance of the upcoming hearing. The executive was said to have discussed with lawmakers how X was combating child sexual exploitation (CSE) on its platform.

As Twitter, the company had a difficult history with properly moderating for CSE, something that was the subject of a child safety lawsuit in 2021. Although Musk inherited the problem from Twitter's former management, along with many other struggles, there has been concern that the CSE problem has worsened under his leadership, particularly given the layoffs of trust and safety team members.

After taking the reins at Twitter, Musk promised that addressing the issue of CSE content was his No. 1 priority, but a 2022 report by Business Insider indicated that there were still posts where people were requesting the content. That year, the company also added a new feature to report CSE material. However, in 2023, Musk welcomed back an account that had previously been banned for posting CSE imagery, leading to questions around X's enforcement of its policies. Last year, an investigation by The New York Times found that CSE imagery continued to circulate on X's platform even after the company was notified, and that widely circulated material that's easier for companies to identify had also remained online. This report stood in stark contrast to X's own statements claiming the company had aggressively approached the issue with increased account suspensions and changes to search.

Bloomberg's report on X's plan to add moderators was light on key details, like when the new center would open, for instance. However, it did note that the moderators would be employed full-time by the company.

"X does not have a line of business focused on children, but it's important that we make these investments to keep stopping offenders from using our platform for any distribution or engagement with CSE content," an executive at X, Joe Benarroch, told the outlet.

X also published a blog post on Friday detailing its progress in combating CSE, noting that it suspended 12.4 million accounts in 2023 for CSE violations, up from 2.3 million in 2022. It also sent 850,000 reports to the National Center for Missing and Exploited Children (NCMEC) last year, more than eight times the number sent in 2022. While these metrics are meant to show an increased response to the problem, they could also indicate that those seeking to share CSE content are increasingly using X to do so.
