Arbiters of Truth

In the aftermath of the presidential election, many Americans have blamed Facebook for exerting undue influence during the campaign season. The company has been accused of contributing to the spread of misinformation, racist language, and “alt-right” memes employed by many Trump supporters. It has also been criticized for creating a so-called “filter bubble,” in which users’ newsfeeds connected them only to others who held the same political views, distorting their impressions of the level of support for each candidate.

In a lengthy status update, Facebook founder Mark Zuckerberg addressed the community’s call for curation of content, stating that while Facebook could find ways to tell the online community “what content is most meaningful,” it needed to be cautious about becoming an arbiter of truth itself.

Zuckerberg brought up a point that is becoming increasingly important in discussions of free speech: the role of online communities. While previously monitored, albeit to varying extents, by states around the world, social networks now provide users with a platform that transcends the borders of any single nation. Turkey can charge newspapers and television stations critical of President Erdogan's policies with tax fraud and impose massive fines, but it cannot levy such fines on an international, US-based company such as Facebook. Similarly, the Hungarian government’s media authority can impose restrictions on local media content, but it can do little to restrict information passed through social media.

Digital communication is reshaping the world, something to which Facebook’s 1.8 billion users can attest. And although the state can still monitor traditional sources of news, the barriers to information are far lower today than they were in the past. Furthermore, state attempts to monitor online interactions can be thwarted easily through the use of a proxy server or VPN.

Inaction

Part of the criticism Facebook faces is due to its inaction on the fake news circulating on its website in the lead-up to the election. One widely shared article claimed that Pope Francis had broken a tradition of nonpartisanship by endorsing Donald Trump for President. Despite the fact that the Pope, a staunch advocate for refugees, had done nothing of the sort, the post was shared almost a million times on Facebook, while its correction was barely heard. The article came from one of many sites masquerading as legitimate news sources; it was only through the platform provided by Facebook that these sites were able to disseminate such false information for financial and ideological gain.

The ease and speed with which information can be shared on social media make its potential to shape political events immense. Facebook proudly lauds its role in the Arab Spring, also dubbed the “Twitter revolutions” in reference to the use of social networks by protesters and demonstrators. Despite government efforts to shut down Facebook pages related to the protests, the social network was critical for spreading information and coordinating mass actions. The grassroots nature of the revolution was made possible by the existence of a social network on which protesters could contact each other directly, immediately, and on a large scale. International news organizations banned from reporting in Tunisia were able to pull videos and reports from Facebook and broadcast them around the world, drawing attention to events that would otherwise have been obscured by censorship. This rapid transfer of information facilitated a chain reaction of revolutions in the Middle East as citizens realized their discontent with government was not isolated, but part of a larger pattern.

However, the power of social media networks to inspire collective action has also been exploited by terrorist and other radical organizations to recruit members and encourage hate crimes. A study published by the Brookings Institution reported that at least 46,000 Twitter accounts were operated by ISIS. The group has used Twitter extensively to share videos of hostage executions, spread hate propaganda, and even recruit new members internationally. Since Twitter’s platform has more than 288 million users, the potential for ISIS to spread its message across national borders is enormous. Accordingly, international security concerns have made regulation and censorship of tweets a priority. Despite Twitter’s continuing efforts to regulate its user accounts, it has still faced heavy criticism for not having acted sooner.

Interference

The above situations describe the extreme ends of the spectrum: social media can be used to either catalyze or obstruct the flow of information. However, the extent to which Facebook should curate the information it publishes remains unclear.

Consider the following two attempts to influence the US presidential election using Facebook. The Saturday before the election, a fake news site dubbed “The Denver Guardian” published an article full of negative messages about Hillary Clinton, including a claim that the FBI agent connected to her email disclosures had murdered his wife and shot himself.[1] On Election Day, a Republican mayor posted the following Facebook status: “Remember the voting days: Republicans vote on Tuesday, 11/8 and Democrats vote on Wednesday, 11/9.”

In which of these situations should Facebook have interfered, and to what extent? Part of the dilemma faced by the company concerns its self-perception. Facebook remains dedicated to crafting a public image of its product as a blank canvas that depends on users to personalize it. Users’ newsfeeds, photos, and even the advertisements they receive are all shaped by their preferences and previous behavior on the website. Facebook even dismissed its Trending Topics team, which formerly summarized trending headlines and delivered proprietary, bite-sized summaries of news stories to users; the employees were replaced with algorithms. Facebook executives said the change would remove the inevitable personal bias of news stories written by human beings. Any censorship of fake news, by contrast, would tarnish this carefully fashioned public image of neutrality.

Unfortunately, this reluctance to alter its image has also limited the social network’s growth and progress. Facebook is not registered as a media company and does not see itself as one; it argues that it is under no moral or professional obligation to act as an arbiter of truth. However, it is also one of the largest distributors of news online. A study by the Pew Research Center concluded that nearly half of American adults get news from Facebook. People have traditionally trusted the news they read to be reliable. Facebook’s insistence on leaving its content untouched and its sources unvetted has allowed false information to propagate. This has of course proved damaging to political figures. More troublingly, the rapid spread of outlets like “The Denver Guardian” erodes people’s faith in traditional news organizations.

Facebook’s vehement advocacy of free speech is undermined, though, by recent reports that the social network is developing software to suppress posts from appearing in user newsfeeds in specific geographic areas. A claim of supporting free speech for ethical reasons begins to sound hypocritical, since the software was reportedly created to curry favor with China’s notoriously strict censorship regime and gain access to its enormous market.

Maintaining its status as a carrier company frees Facebook from the legal suits a media company is subject to, such as claims of defamation, invasion of privacy, or copyright infringement. However, if Facebook wishes to present itself as a reliable news source, some level of editorial oversight is needed. Mark Zuckerberg may claim that the staff at Facebook are not “arbiters of truth” right now, but the rampant misinformation of this election season has made one fact clear: they need to be.