Hamas is banned from Facebook, removed from Instagram and expelled from TikTok. However, posts supporting the group that carried out terrorist attacks in Israel this month continue to reach mass audiences on social media, spreading horrifying images and political messages to millions of people.
Several accounts sympathetic to Hamas have gained hundreds of thousands of followers on social platforms since the war between Israel and Hamas began on October 7, according to a New York Times review.
An account on Telegram, the popular messaging app with little moderation, reached more than 1.3 million followers this week, up from 340,000 before the attacks. That account, Gaza Now, is aligned with Hamas, according to the Atlantic Council, a research group focused on international relations.
“We’ve seen Hamas content on Telegram, such as body camera footage of terrorists shooting at Israeli soldiers,” said Jonathan A. Greenblatt, the chief executive of the Anti-Defamation League. “We have seen images not only on Telegram but on other platforms of dead and bloody soldiers.”
These types of posts are the latest challenge for technology companies as they try to curb the spread of false or extremist content while preserving posts that do not violate their rules. In past conflicts, such as the genocide in Burma or previous flare-ups between Palestinians and Israel, social media companies struggled to strike the right balance, and watchdog groups criticized their responses as too limited or, at times, overzealous.
Experts said Hamas and Hamas-linked social media accounts were now exploiting those challenges to evade moderation and share their messages.
Most online platforms have long banned terrorist organizations and extremist content. Facebook, Instagram, TikTok and YouTube all prohibit Hamas outright.
Gaza Now had more than 4.9 million followers on Facebook before it was banned last week, shortly after The Times contacted Meta, Facebook’s parent company, about the account. Gaza Now did not publish the type of appalling content found on Telegram, but it did share accusations of wrongdoing against Israel and encouraged its Facebook followers to subscribe to its Telegram channel.
Gaza Now also had a combined total of more than 800,000 followers on other social media sites before many of those accounts were removed last week. Its YouTube account had more than 50,000 subscribers before it was suspended on Tuesday.
In a statement, a YouTube spokesperson said Gaza Now violated the company’s policies because the channel’s owner had previously operated an account on YouTube that was terminated.
Telegram has become the clearest launching pad for pro-Hamas messages, experts said. Accounts there have shared videos of captured prisoners, dead bodies and destroyed buildings, and followers often respond with the thumbs-up emoji. In one case, users urged one another to upload gruesome images of Israeli civilians being shot to platforms such as Facebook, TikTok, Twitter and YouTube. The comments also included suggestions for altering the footage so that social media companies would have a harder time finding and removing it.
Telegram also hosts an official account for the Al-Qassam Brigades, the military wing of Hamas. Its follower count has tripled since the conflict began.
Pavel Durov, Telegram’s chief executive, wrote in a post last week that the company had removed millions of pieces of “obviously harmful content” from its public platform. But he indicated that the app would not ban Hamas entirely, saying its accounts “serve as a unique source of first-hand information for researchers, journalists and fact-checkers.”
“While it would be easy for us to destroy this source of information, doing so risks exacerbating an already dire situation,” Durov wrote.
X, owned by Elon Musk, was inundated with falsehoods and extremist content almost as soon as the conflict began. Researchers at the Institute for Strategic Dialogue, a political advocacy group, found that in a 24-hour period, a collection of posts on X supporting terrorist activities received more than 16 million views. The European Union said it would examine whether X violated a European law requiring major social networks to stop the spread of harmful content. X did not respond to a request for comment.
However, accounts not directly tied to Hamas present thornier challenges for social media companies, and users have criticized the platforms for being overzealous in removing pro-Palestinian content.
Thousands of Palestinian users said Facebook and Instagram had deleted or suppressed their posts, even when the messages did not violate the platforms’ rules. Others reported that Facebook had removed accounts calling for peaceful protests in cities across the United States, including planned sit-ins in the San Francisco area over the weekend.
Meta said in a blog post on Friday that Facebook may have inadvertently removed some content as it worked to respond to a spike in reports of posts that violated the site’s policies. Some of those posts were hidden because of a bug in Instagram’s systems that prevented pro-Palestinian content from appearing in its Stories feature, the company said.
Masoud Abdulatti, the founder of MedicalHub, a health care services company, who lives in Amman, Jordan, said Facebook and Instagram had blocked his posts supporting Palestinians, so he had taken to LinkedIn to share his support for civilians trapped in Gaza amid the conflict.
“The people of the world ignore the truth,” Abdulatti said.
Eman Belacy, a copywriter living in Egypt’s Sharkia governorate, said she normally used her LinkedIn account only for business networking but had started posting about the war after feeling that Facebook and Instagram were not showing the full picture of the devastation in Gaza.
“It may not be the place to share war news, but excuse us, the amount of injustice and hypocrisy is unbearable,” Belacy said.
The challenges reflect the blunt content moderation tools that social networks have increasingly relied on, said Kathleen Carley, a researcher and professor at Carnegie Mellon University’s CyLab Institute for Security and Privacy.
Many companies, she said, rely on a combination of human moderators, who can be quickly overwhelmed during a crisis, and computer algorithms, with little coordination between platforms.
“Unless you moderate content consistently, for the same story across all major platforms, you’re just playing Whac-a-Mole,” Carley said. “It’s going to resurface.”
Sheera Frenkel contributed reporting.