YouTube said it was always working to strike a balance between allowing free expression and protecting online and offline communities from harm. Nicole Bell, a company spokeswoman, said YouTube removed six videos flagged by Media Matters for violating its policies and canceled a channel for uploading content from a banned creator. But most of the more than two dozen videos flagged by Media Matters did not violate the platform’s rules, she said.
Last year, the International Fact-Checking Network, which represents more than 80 organizations, warned in a letter to YouTube that the platform was “one of the leading conduits for disinformation and misinformation online around the world” and that it was failing to address the issue.
The consequences of softening the fight against misinformation have been made clear on Twitter. A new report by two advocacy groups, the Network Contagion Research Institute and the Combat Antisemitism Movement, found an increase in antisemitic content after Musk took over.
The report described a campaign organized by extremists who had previously been banned from the platform. One, Tim Gionet, who used the name Baked Alaska online, was recently convicted and sentenced to 60 days in prison for his role in the Jan. 6 riot at the Capitol. In a tweet this month, he put forward what he called a conspiracy theory: “Twitter unbanned us all because their engagement was collapsing without us.”
“It’s true that the trust and safety efforts we’ve had to date have really broken down, but at least there were efforts,” said Mr. Finkelstein, an author of the report. “And there was a baby in that bathwater.”
Despite Musk’s stated commitment to unrestricted speech on the platform, he has also suspended accounts, such as Kanye West’s, after a series of antisemitic comments.
Nora Benavidez, a senior adviser at Free Press, a digital rights and accountability group, said the experience at Twitter showed that moderating offensive content remains important to the viability of the platforms, regardless of economic considerations.
“Content moderation is good for business and it’s good for democracy,” she said. “Companies aren’t doing it because they seem to think they don’t have a big enough role to play, so they’re turning their backs on it.”