Meta's Oversight Board has now expanded its scope to include the company's newest platform, Instagram Threads. Designed as an independent appeals body that hears cases and then makes precedent-setting content moderation decisions, the board has to date ruled on cases such as Facebook's ban of Donald Trump, COVID-19 misinformation, the removal of breast cancer photos and more.
Now the board has begun hearing cases arising from Threads, Meta's Twitter/X competitor.
This is a major point of differentiation between Threads and rivals like X, where Elon Musk and other users rely heavily on crowdsourced Community Notes fact-checks to supplement the platform's otherwise light-touch moderation. It is also very different from how decentralized solutions, such as Mastodon and Bluesky, manage moderation on their platforms. Decentralization allows community members to establish their own servers with their own set of moderation rules and gives them the option to defederate from other servers whose content violates their guidelines.
Startup Bluesky is also investing in stackable moderation, meaning community members can create and run their own moderation services, which can be combined with others to create a personalized experience for each individual user.
Meta's decision to hand over tough decisions to an independent board that could override the company and its CEO, Mark Zuckerberg, was intended to solve the problem of Meta's centralized authority and control over content moderation. But as these startups have shown, there are other ways to do it that give users more control over what they see, without stepping on the rights of others to do the same.
On Thursday, the Oversight Board announced that it would hear its first case arising from Threads.
The case involves a user's reply to a post containing a screenshot of a news article in which Japanese Prime Minister Fumio Kishida made a statement about his party's alleged underreporting of fundraising income. The reply included a caption criticizing him for tax evasion, derogatory language, including a term for someone who wears glasses, and the phrase "drop dead." Because of the "drop dead" component and hashtags calling for death, a human reviewer at Meta decided that the post violated the company's violence and incitement rule, despite it sounding much like a run-of-the-mill X post these days. After the appeal was denied a second time, the user appealed to the Board.
The Board says it selected this case to examine Meta's content moderation policies and its enforcement practices for political content on Threads. This is timely, considering that it is an election year and that Meta has stated it will not proactively recommend political content on Instagram or Threads.
The Board's case will be the first involving Threads, but it will not be the last. The organization is already preparing to announce another package of cases tomorrow focused on nationality-based criminal charges. Meta referred these latter cases to the Board, but the Board will also receive and evaluate appeals from Threads users, as it did with Prime Minister Kishida's case.
The decisions the Board makes will influence whether Threads as a platform defends users' ability to express themselves freely, or moderates content more closely than Twitter/X does. Ultimately, that will help shape public opinion of the platforms and influence users to choose one over the other, or perhaps a startup experimenting with new, more personalized ways to moderate content.