Under pressure from critics who say Substack is profiting from newsletters that promote hate speech and racism, the company's founders said Thursday they would not ban Nazi symbols and extremist rhetoric on the platform.
“I just want to make it clear that we don't like Nazis either — we wish no one held those views,” Hamish McKenzie, a co-founder of Substack, said in a statement. “But some people do hold those and other extreme views. Given that, we don't think that censorship (including through demonetizing publications) makes the problem go away; in fact, it makes it worse.”
The response came weeks after The Atlantic found that at least 16 Substack newsletters featured “overt Nazi symbols” in their logos or graphics, and that white supremacists had been allowed to publish on, and profit from, the platform. Hundreds of newsletter writers signed a letter opposing Substack's position and threatening to leave. About 100 others signed a letter supporting the company's stance.
In the statement, McKenzie said he and the company's other founders, Chris Best and Jairaj Sethi, had concluded that censoring or demonetizing posts would not make the problem of hate rhetoric go away.
“We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power,” he said.
That stance sparked waves of outrage and criticism, including from popular Substack writers who said they didn't feel comfortable working with a platform that allows hateful rhetoric to fester or flourish.
The debate has renewed questions that have long plagued technology companies and social media platforms about how content should be moderated, if at all.
Substack, which takes a 10 percent cut of revenue from writers who charge for newsletter subscriptions, has faced similar criticism in the past, particularly after it allowed some writers to use transphobic and anti-vaccine language.
Nikki Usher, a communications professor at the University of San Diego, said many platforms face what is known as “the Nazi problem”: if an online forum is available long enough, extremists will eventually show up there.
Substack is positioning itself as a neutral content provider, Professor Usher said, but that also sends a message: “We're not going to try to police this issue because it's complicated, so it's easier not to take a position.”
More than 200 writers who publish newsletters on Substack have signed a letter opposing the company's passive approach.
“Why do you choose to promote and allow the monetization of sites that traffic in white nationalism?” the letter said.
The writers also asked if part of the company's vision for success included giving a platform to hateful people, like Richard Spencer, a prominent white nationalist.
“Let us know,” the letter said. “From there, each of us can decide if we still want to be here.”
Some popular writers on the platform have already vowed to leave. Rudy Foster, who has more than 40,000 subscribers, wrote on Dec. 14 that readers often tell her they “can't stand paying for Substack anymore” and that she feels the same way.
“So here’s to a 2024 where none of us do that!” she wrote.
Other writers have defended the company. A letter signed by about 100 Substack writers argued that it is best to let writers and readers moderate content, not social media companies.
Elle Griffin, who has more than 13,000 subscribers on Substack, wrote in the letter that while “there is a lot of hateful content on the Internet,” Substack has “landed on the best solution yet: giving writers and readers freedom of speech without surfacing that speech to the masses.”
She argued that subscribers receive only the newsletters they sign up for, so they are unlikely to see hateful content unless they seek it out. That is not the case with X and Facebook, Griffin said.
She and the others who signed the letter supporting the company emphasized that Substack is not really one platform, but rather thousands of individualized platforms with unique and curated cultures.
Alexander Hellene, who writes science fiction and fantasy stories, signed Griffin's letter. In a post on Substack, he said a better approach to content moderation was to “take matters into your own hands.”
“Be an adult,” he wrote. “Block people.”
In his statement, McKenzie, the Substack co-founder, also defended his decision to feature Richard Hanania, president of the Center for the Study of Partisanship and Ideology, on Substack's podcast “The Active Voice.” The Atlantic reported that Hanania had previously described Black people on social media as “animals” who should be subject to “more policing, incarceration and surveillance.”
“Hanania is an influential voice for some in American politics,” McKenzie wrote, adding that “it is valuable to know his arguments.” He said he had not been aware of Hanania's writings at the time.
McKenzie also argued in his statement that censoring ideas that are considered hateful only causes them to spread.
But research in recent years suggests the opposite is true.
“Deplatforming appears to have a positive effect in decreasing the spread of far-right propaganda and Nazi content,” said Kurt Braddock, a communications professor at American University who has researched violent extremist groups.
When extremists are barred from one platform, they often move to another, but much of their audience does not follow them and their income eventually declines, Professor Braddock said.
“I can appreciate someone's dedication to free speech rights, but free speech rights are dictated by the government,” he said, noting that companies can choose the types of content they host or ban.
While Substack says it does not allow users to call for violence, even that distinction can be murky, Professor Braddock said, because racists and extremists can come close to the line without crossing it overtly. Their rhetoric can still inspire others to violence, he said.
Allowing Nazi rhetoric on a platform also normalizes it, he said.
“The more they use the kind of rhetoric that dehumanizes or demonizes a certain population,” Professor Braddock said, “the more it is okay for the general population to follow it.”