
Social Media Needs an Election Declaration of Conscience - The New York Times


On election night in 2012, Barack Obama trailed Mitt Romney by some 30,000 votes at the moment Mr. Obama was projected to win his re-election bid. By the time the votes were tallied, Mr. Obama had five million more votes than Mr. Romney.

In 2016, when Donald Trump accepted Hillary Clinton’s concession, he was leading the popular vote count by nearly one million votes. When all the votes were counted, he lost the popular vote by nearly three million.

This longstanding and well-documented phenomenon, known as the “blue shift,” opens the door to a troubling scenario on election night this year.

Imagine: It’s midnight, and the electoral map looks quite red. But news networks and election officials aren’t calling the swing states, as this year’s record numbers of mail-in and absentee ballots have yet to be fully counted. Mr. Trump, leading in the popular vote, decides he’s seen enough. He takes to his social media platforms and declares that he has won re-election and will accept no other result. He tells his tens of millions of followers that the Democrats and the press will try to change the result and steal the election. The door to unrest and constitutional crisis swings wide open.

Facebook, Twitter and YouTube have all pledged to crack down on misinformation around voting and electoral outcomes. Perhaps in the above scenario they append a label to the president’s posts saying that the information is disputed and that the results are not in. They could introduce friction into the algorithms to slow the reach of the posts.

But pro-Trump lawmakers and pundits most likely would have picked up the argument by then, amplifying the president’s message. What started as one prominent piece of voter disinformation easily could become widespread in the Republican Party and among a large segment of Americans. What would the platforms do then?

That one cannot answer that question with confidence, weeks before Election Day, is alarming. The platforms’ content moderation decisions are often arbitrary and, where public officials are concerned, left up to the judgment of a few top executives. Their corporate desire to avoid accusations of bias and appear politically neutral, however admirable in principle, breaks down in an information environment in which the political parties do not share the same relationship to the truth and to democratic norms.

To prevent such a nightmare scenario across social media, technology’s biggest platforms need to create a clear, explicit framework for what qualifies as electoral misinformation and disinformation. They must determine exactly what they will not tolerate and what the penalties will be for violating those rules. Then they ought to make those rules public.

Even better: The tech companies could form a consortium to formalize these standards across platforms. For a template, they could look to the work of the Election Integrity Partnership, which built a framework for grading Big Tech’s election security policies and has determined “that few platforms have comprehensive policies on election-related content.”

Such a united front wouldn’t just be symbolic. “If you had the platforms together making a statement of their values, then when they take action, it creates a permission structure for reticent platform executives to make difficult decisions quickly,” David Kaye, former United Nations special rapporteur on freedom of opinion and expression, told the editorial board. Such a move would also be a strong public signal of the gravity of the moment.

There’s precedent for this type of collaboration. In 2016, Facebook, Twitter, Google and Microsoft came together to combat extremist content. The companies created a shared database using unique digital fingerprints to flag videos, pictures and memes promoting terrorist activity and ideologies. Domestic political disinformation poses different challenges than terrorist threats, but both are urgent matters of national security.
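To make the mechanism concrete: the 2016 effort rested on hash sharing, in which each platform computes a fingerprint of a piece of media and checks it against a jointly maintained list. The sketch below is purely illustrative, assuming an exact-match hash and invented helper names; the companies’ actual system reportedly uses more robust perceptual matching, and nothing here reflects their real implementation.

```python
import hashlib

# Hypothetical shared database of fingerprints contributed by all platforms.
# The entry below is only an example value, not real data.
shared_fingerprints = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(media_bytes: bytes) -> str:
    """Return a fingerprint for a piece of media (here, a plain SHA-256 hash)."""
    return hashlib.sha256(media_bytes).hexdigest()

def is_flagged(media_bytes: bytes) -> bool:
    """Check whether this media matches a fingerprint already in the shared list."""
    return fingerprint(media_bytes) in shared_fingerprints

# Usage: a platform checks an upload against the shared list before it spreads.
if is_flagged(b"uploaded video bytes"):
    print("Match found: route to review or removal.")
```

The point of the shared list is coordination, not sophistication: once one company flags a piece of content, every participant can recognize it without re-litigating the decision.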

A public, transparent effort from the platforms would offer additional accountability for those spreading disinformation in the weeks and months after the election. “Be very clear and publish a database the public can access,” Mr. Kaye urged the platforms. “Say, ‘These are the accounts we took action against. Here’s why.’ It doesn’t need to be a legal opinion, just a list. If there are privacy concerns, they can redact names.”

There are very few tools now for parsing how messages spread across social media. Three days after the 2016 election, Facebook purchased the best one, a tool called CrowdTangle, which tracks online engagement with social media posts. It is the best available method to understand what is popular on the platform, though Facebook argues that tracking engagement is not a reliable indicator of how many people saw a post. At this pivotal moment for American democracy, Facebook owes it to the American public to provide metrics to evaluate that claim.

Facebook, Twitter and Google will most likely argue that they’re doing plenty of this work behind the scenes. In a recent interview, Nick Clegg, Facebook’s head of global affairs, said the company was war-gaming election night scenarios. “There are some break-glass options available to us if there really is an extremely chaotic and, worse still, violent set of circumstances,” he said. But Mr. Clegg stopped short of offering specifics.

Such vagueness is worrisome, especially since Mr. Clegg admitted that “any high-stakes decisions will fall to a team of top executives” like himself and Facebook’s chief operating officer, Sheryl Sandberg, with Mark Zuckerberg, Facebook’s chief executive, “holding the right to overrule positions.”

These platforms have consolidated power to control the flow of information to billions of people. The power to judge which content is harmful to democracy on election night rests with a handful of tech executives. That Mr. Zuckerberg, the ultimate arbiter at Facebook, is accountable to no one, including his company’s board, is even more alarming.

A handful of unelected entrepreneurs — most of whom have no formal education on matters of election and First Amendment law or online extremism and misinformation — are asking a frightened and anxious American public to trust them. To trust that they’ll put the interests of the country over those of their corporations. To trust that they’ll remain politically unbiased.

The trust they’re asking for has not yet been earned. It’s time these companies came together and pledged their commitment to the public interest.

