Why Frameworks Will Help Social Media Giants Address Self-Regulation - International Policy Digest

Two weeks ago, the Facebook Oversight Board released its most highly anticipated decision so far: whether Facebook should reinstate former President Donald Trump’s Facebook and Instagram accounts. For those of us who have been closely following the messy and controversial geopoliticization of the Internet, this decision was a major test of how the Oversight Board would handle complicated and difficult content moderation questions. For everyone else, this decision put the Oversight Board on the map.

In a move that found little enthusiastic support, the Oversight Board did not rule on whether Facebook must reinstate or terminate Trump’s accounts but instead pushed the decision back to Facebook. The ruling rested on the Board’s determination that the ban was appropriate but that an indefinite ban is not permissible because it does not follow a clear, standard procedure such as removing violating content, imposing a suspension of specified length, or permanently deleting an account. The Board requested that Facebook re-review the case and apply an appropriate standard penalty within six months.

The decision touches only briefly on political speech, making it clear that political leaders cannot rely on political speech protections to promote violence on social media. Likewise, the Board affirmed that Facebook was within its rights to de-platform any user, even a sitting head of state. It also set a key precedent that elected officials and other influential users should be held to a higher standard than ordinary users because of their power to create harm.

Although many are disappointed that the decision does not take a stronger position on platforms’ obligations in managing political leaders’ actions and speech (one way or the other), the Board did take a powerful stance, just not the one we were expecting. Specifically, the ruling calls out Facebook for attempting to use the Board as a shield to avoid its own responsibility for determining whether to reinstate Trump.

“In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities. The Board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.”

The Board is clearly not interested in taking on Facebook’s work of making difficult decisions about the development and application of community standards. The Board’s role is not to define policies, standards, processes, or penalties that Facebook is required to adopt; it can merely suggest that Facebook do so while holding it accountable for acting within existing rules. These qualifications highlight the structural limits on the Board’s ability to effect broader change in Facebook’s content policies. Facebook is unlikely to be happy to find itself back in the position of ultimate responsibility for deciding how to handle tough topics in grey areas such as the moderation of political figures. (Facebook’s Vice President of Global Affairs and Communications Nick Clegg has already stated that he finds the Board’s decision contradictory.)

This outcome brings us to the larger issue of self-regulation. By establishing an oversight authority, Facebook attempted to build an independent mechanism that holds it accountable for content moderation decisions and ensures consistency and accuracy in the application of community standards. However, the Board’s Trump decision underscores that a defined set of governing guidelines is necessary for a successful self-regulation model, and creating those guidelines is simply not in the Board’s (or Facebook’s) job description.

As Facebook CEO Mark Zuckerberg has previously called for, government bodies must develop stricter guidance for companies in areas such as online harms, political speech, privacy, and child safety to help guide more viable regulatory and governance models. Self-regulation through bodies such as the Oversight Board is only possible to the extent that there is a consistent and universal framework against which individuals and organizations can govern.

As such, to improve the efficacy of self-regulatory bodies and prevent future scenarios such as the deadly January 6 attack on the U.S. Capitol, Facebook and its peer organizations should:

  • Collaborate with government bodies to develop a cohesive set of regulatory principles for online content and content moderation.
  • Investigate the key principles and success factors of effective self-regulatory models, such as that of the video game industry, to update current mechanisms.
  • “Stress test” existing community standards and policies with hypothetical scenarios (or ones faced by other platforms) to determine their weaknesses and gaps in coverage and application.
  • Establish consistent and transparent processes for moderating influential users, such as political leaders and elected officials, who present a heightened risk of harm.

Unfortunately, Facebook still faces a tough choice as it stares down its near-term dilemma. On the one hand, it can keep Trump’s accounts suspended (or permanently delete them) and alienate business-friendly conservatives and staunch supporters of freedom of expression. On the other, it can reinstate his accounts and alienate its socially conscious employees and liberals interested in tackling online harms, misinformation, and societal division. Either way, Facebook and its peers will continue to find themselves in these difficult positions until there is an agreed-upon framework that platforms and government bodies can jointly use to tackle the growing problem of harmful online content.

