Meta’s Oversight Board has now expanded its jurisdiction to cover Instagram Threads, Meta’s latest platform. As an independent appeals board, the Oversight Board reviews cases and establishes precedent-setting content moderation decisions. Its past cases include Facebook’s ban of Donald Trump, COVID-19 misinformation, and the removal of breast cancer photos, among others.
Now, the board is addressing cases from Threads, Meta’s competitor to Twitter/X.
This arrangement sets Threads apart from rivals like Twitter/X, which relies on Community Notes for crowdsourced fact-checking alongside otherwise minimal moderation. It also contrasts sharply with how decentralized platforms, such as Mastodon and Bluesky, handle moderation. Decentralization lets community members run their own servers with their own moderation rules and defederate from servers whose guidelines don’t align with theirs.
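As a rough illustration of what defederation means in practice, the sketch below models a fediverse-style server that keeps a list of blocked domains and simply drops activity from them. The names (`ServerPolicy`, `acceptsActivityFrom`, the example domains) are hypothetical and not any real Mastodon API; this is only a conceptual sketch of the idea described above.

```typescript
// Hypothetical sketch of defederation: a server-level domain block list.
// Types and function names are illustrative assumptions, not a real API.

interface ServerPolicy {
  domain: string;                // this server's own domain
  blockedDomains: Set<string>;   // servers it has chosen to defederate from
}

function acceptsActivityFrom(policy: ServerPolicy, senderDomain: string): boolean {
  // Posts, follows, and replies from a defederated server are dropped,
  // so each community's moderation rules bind only its own members.
  return !policy.blockedDomains.has(senderDomain.toLowerCase());
}

const policy: ServerPolicy = {
  domain: "cats.example",
  blockedDomains: new Set(["spam.example", "harassment.example"]),
};

console.log(acceptsActivityFrom(policy, "art.example"));   // true
console.log(acceptsActivityFrom(policy, "spam.example"));  // false
```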
Bluesky is also exploring stackable moderation, which lets users build and run their own moderation services and combine them to craft a customized experience for each user.
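To make the "stackable" idea concrete, here is a minimal sketch of a client merging verdicts from several user-chosen moderation services into one per-post decision. The types, service names, and the strictest-verdict-wins merge rule are assumptions for illustration, not Bluesky's actual AT Protocol labeler API.

```typescript
// Illustrative sketch of stackable moderation: multiple subscribed
// moderation services each label a post, and the client combines them.

type Verdict = "show" | "warn" | "hide";

interface LabelerRuling {
  labeler: string;   // a moderation service the user subscribes to (hypothetical)
  verdict: Verdict;
}

// Assumed merge rule: the strictest subscribed service wins. Users change
// the outcome by adding or removing services, not by appealing to one
// central policy.
function combine(rulings: LabelerRuling[]): Verdict {
  const rank: Record<Verdict, number> = { show: 0, warn: 1, hide: 2 };
  return rulings.reduce<Verdict>(
    (acc, r) => (rank[r.verdict] > rank[acc] ? r.verdict : acc),
    "show",
  );
}

console.log(
  combine([
    { labeler: "spam-watch.example", verdict: "show" },
    { labeler: "nsfw-filter.example", verdict: "warn" },
  ]),
); // "warn"
```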
Meta’s strategy of delegating difficult decisions to an independent board that can overrule both the company and CEO Mark Zuckerberg was intended to address concerns about Meta’s centralized control over content moderation. However, these startups illustrate alternative approaches that give users more control over what they see without infringing on others’ right to do the same.
On Thursday, the Oversight Board announced it would hear its first case from Threads.
The case involves a user’s reply to a post featuring a screenshot of a news article in which Japanese Prime Minister Fumio Kishida commented on his party’s alleged underreporting of fundraising revenues. The caption accused him of tax evasion, included derogatory language and the phrase “drop dead,” and used an offensive term for people who wear glasses. A Meta reviewer determined that the post violated the Violence and Incitement rule because of the “drop dead” phrase and hashtags advocating death, even though the post resembled content commonly found on Twitter/X. After Meta repeatedly rejected the user’s appeal, they brought the case to the Board.
The Board chose this case to scrutinize Meta’s content moderation policies and practices regarding political content on Threads. This is particularly relevant during an election year, given Meta’s stance against proactively recommending political content on Instagram or Threads.
The Board’s examination of this case will be the first for Threads, but not the last. The organization is set to announce another group of cases tomorrow, addressing criminal accusations based on nationality. Those cases were referred by Meta, but the Board will also take on appeals from Threads users, as in the case involving Prime Minister Kishida.
The Board’s decisions will influence how Threads handles users’ freedom of expression and whether it adopts stricter content moderation than Twitter/X. That, in turn, will shape public perception of the platforms, guiding users’ choices between them or toward startups experimenting with personalized approaches to content moderation.