
Opinion: We need governance of social media to stop violence and hate - Devex

Photo by: dole777 on Unsplash

We know social media content can lead to violence, but is there a plan to stop it? Facebook has tried to improve its content moderation after it was used as a tool in spreading online incitement of violence in Myanmar. But recent violence in France and Ethiopia — which has drawn attention in the United States Congress — shows that there is work to be done.

Meanwhile, YouTube, WhatsApp, Twitter, and TikTok are increasingly facing dangerous posts on their platforms. We need governance solutions for social media, and we need them now. And independent oversight is the only path to real change.

In many parts of the global south, Facebook depends heavily on third-party content flaggers, such as civil society organizations and other local groups. Take Myanmar as an example: there, Facebook relies on CSOs to identify dangerous content, which in some countries is more effective than its own detection mechanisms. Staffed by locals who speak the language and understand the country's social and political context, CSOs are better placed to catch the linguistic and cultural nuances in online posts and comments.

Facebook has made strides in how it consults with civil society: For high-profile events in high-profile countries, like the 2020 Myanmar elections, it holds group consultations involving multiple CSOs.

However, reports suggest Facebook is not always transparent with these CSOs. Perhaps the company would cite potential legal trouble for sharing user information, but a structural examination would show that as a profit-driven company, Facebook has no real transparency obligations to these organizations.


Either way, the CSOs that do the heavy lifting on content moderation may never learn what happens inside those moderation systems. Once they report content, it can take time for Facebook to respond, and in some cases CSOs never find out what action was taken on flagged posts, how those decisions were made, or where the data is stored.

The lack of transparency isn’t exclusive to the global south: Internal policy processes at Facebook aren’t always clear. A 2020 survey into Facebook’s internal processes revealed ad hoc decision-making and an environment where employees say they “are making rules up.”

In short, no one really knows how Facebook — or other social media companies — makes content decisions, and given the potential harms, this has to change.

Unfortunately, these companies cannot fix themselves. They have no reason to be transparent and so they struggle to hold themselves accountable. The most meaningful attempt at real change thus far is the Facebook Oversight Board, which has just started taking up cases.

But its powers are too limited: To our knowledge, its only real authority is issuing one-off decisions on content that arises through the appeals process, and the board is unable to oversee Facebook’s data practices, its rule-making processes, the enforcement of the Community Standards at scale, or its algorithms for content detection. As others have noted, it isn’t set up to address the relevant issues.

If self-regulation is off the table, what about oversight from governments? This is not a great idea either, because of a clear conflict of interest: this year alone has seen multiple cases of governments pressuring these private companies with demands. The Thai government ordered Facebook to block a group deemed critical of the country's monarchy, and a U.S. House committee pressured Twitter to disclose information on a data breach involving alleged Saudi Arabian spies.

While the merits of these cases are different, they illustrate the same idea: Governmental interests clash with responsible governance, whether it is a politician’s reputation or national security. We cannot — and should not — expect social media companies to be completely transparent to states.

We propose a brand of oversight that is independent, collaborative, and accountable. Social media companies should be monitored by a body made up of civil society, multilateral organizations, and researchers, rather than one staffed with those very companies’ picks.

Its powers should be broad: auditing platforms’ enforcement of speech standards, algorithms, human reviewers, privacy practices, and internal policy processes, among other things.

This ideal oversight body should have an array of expertise, from international law to software engineering to local socio-political context in various countries. It should be able to tap into global networks of civil society and grassroots organizations. It should center a human rights approach, free of competing governmental interests. And of course, it cannot be a profit-maximizing initiative: To hold social media accountable, its first responsibility must be to good governance.

Crucially, this body would enable coordination. If multiple platforms are faced with threats in similar regions, they can communicate on risk mitigation — communication that is deeply needed in countries like Myanmar.

It would also harmonize communication among civil society groups, both within and across countries. For content decisions, there isn't always a single right answer to whether a post is a violation.

We are not suggesting that Facebook and Twitter must always make the same decisions as one another when faced with identical pieces of content. But with coordinated independent review, platforms would be guided by similar frameworks and evaluated by similar metrics, all starting from a place of local knowledge: civil society.

Just because governments won't monitor platforms directly doesn't mean they shouldn't get involved. An independent body could still set standards that inform law-making.

If the body were running well, regulators could require social media companies to be audited by it, adding yet another layer of coordination to social media oversight. We have not taken enough steps toward transparency and accountability, and the stakes are too high to keep waiting.
