Private Actors, Public Consequences

In its early days, social media was lauded as a tool that could bring down authoritarian regimes and usher in a new era of democratization. A decade later, not only have these hopes failed to bear fruit, but social media has helped repressive regimes surveil their citizens and manipulate popular opinion for their own strategic gain. Meanwhile, companies like Facebook, Google, and Twitter—each of which has a user base larger than the population of most countries—remain largely unregulated in the United States, in part by opportunistically claiming or rejecting the role of "publisher." But as long as these tech giants are able to elude classification (and therefore regulation), savvy governments will continue to use social media to incite violence, sow chaos, and increase oppression.

Introduction

"Without Twitter, the people of Iran would not have felt empowered and confident to stand up for freedom and democracy…With Twitter, they now shout hope with a passion and dedication that resonates not just with those on their street, but with millions across the globe."1

In his July 2009 op-ed for The Christian Science Monitor, Mark Pfeifle, a former deputy national security advisor at the National Security Council, characterized Twitter as a powerful tool for democracy, one that would help citizens resist and challenge the actions of oppressive regimes. Twitter never won the Nobel Peace Prize Pfeifle imagined. Nearly a decade later, the tech giant has instead become mired in a range of controversies, including election meddling and terrorism.2 In September of 2018, Twitter CEO Jack Dorsey found himself testifying before the US Congress, as the company was once again called upon to explain its practices on topics ranging from political bias in content moderation to notifying users about fake or malicious accounts.3

Twitter and Mr. Dorsey are not alone. Since the 2016 US presidential election, representatives from major technology companies like Google, Facebook, Amazon, and Apple have been called to testify before US lawmakers dozens of times. More than once, their testimony has centered on charges of Russian interference in the 2016 US elections, interference believed to have been carried out largely through fake accounts and propaganda disseminated via social media.

Though social media companies have occasionally acknowledged shortcomings in the protections their systems offered against misuse—as well as their own responses to potential threats—their business and product development activities remain largely unregulated. As new projects like Google's proposed re-entry into China with a pre-censored search product and Amazon's sale of facial recognition software to law enforcement agencies illustrate, the tools and platforms these companies control have the capacity to influence both public opinion and policy in the US and abroad.4,5 This powerful technology deserves increasing scrutiny as corporate interests threaten to conflict with fundamental principles of political non-interference and national security. The extent to which the technical protocols and moderation priorities of social networks shape and even drive public debate and policy also has substantial consequences for international relations. As Mr. Dorsey himself acknowledged before the US Congress in September of 2018: "I do believe there is growing concern around [the] power companies like ours hold."6

Network Nations

With over 2 billion active users as of 2018, Facebook alone has a larger "population" than any single country on earth. Yet despite their size, the governance structures of Facebook and other social media companies—several of which boast user bases the size of countries—are both limited and opaque. The bulk of Facebook's terms and policies focuses on commercial activities (e.g. advertising) and intellectual property issues (e.g. copyright violations and others' use of Facebook's resources, such as its Application Programming Interfaces and branded content).7 While Facebook claims that its mission is "to give people the power to build community and bring the world closer together,"8 its "Community Standards" is the only section of its terms that even begins to address the way the platform defines acceptable norms and behaviors. Yet even this brief section provides only the most cursory guidance as to what may be allowed on the platform, and the guidance it does offer falls far short of anything that could be termed "governance." Though Facebook offers a handful of explicit rules for using the platform (e.g. one cannot create an account using a "false" name), there is neither an explanation of how violations are weighted (e.g. is an account with a false name more or less serious than posting hate speech?), nor a delineation of what the penalties may be for one or more such violations. Facebook also fails to offer explicit commitments to enforcement or redress. On the contrary, perhaps the most concrete piece of these terms is an almost unequivocal disclaimer of liability:

Our aggregate liability arising out of or relating to these Terms or the Facebook Products will not exceed the greater of $100 or the amount you have paid us in the past twelve months.9

In other words, almost no matter what happens on Facebook, legally it is not the company's fault.

Section 230 Protection

This special immunity enjoyed by Facebook and other social media companies can be traced back to what is often referred to simply as "Section 230" of the US Communications Decency Act of 1996.10 In early US lawsuits over online content, lower courts in the mid-1990s found that Internet Service Providers (ISPs) that exerted "editorial control" (e.g. filtering, deleting, or moderating content) over the content posted on their services were acting as "publishers," and so could be held liable for libelous statements posted there, even by users.11 In granting "interactive computer service providers" immunity under Section 230, lawmakers hoped to encourage providers to develop tools and policies for identifying and removing problematic content, like pornography.12

Though Section 230 was designed to incentivize the creation of tools to filter and classify content, critics argue that the immunity it grants social media platforms has had precisely the opposite effect. In fact, the law has allowed social media platforms to become petri dishes of misinformation, violence, and hate. Though the companies often cite a commitment to "free speech" as justification for declining to remove problematic users and content, this is, paradoxically, the same argument they use when defending their right to block both platform features and individual users.13,14

Unlike traditional media publishers, who bear significant legal and financial liability for the material they publish, social media companies are immune to such liability. Furthermore, since they simultaneously command the vast majority of the global online advertising market, they can efficiently monetize any interaction.15 In other words, whether or not the items in an outrageous Facebook feed are true, the platform still makes money. In fact, it's quite possible that the toxic, fantastical, and fanatical are even more profitable than "authentic" content, as some research suggests that messages evoking anger and anxiety can prompt further information-seeking.16 Since every interaction can be monetized, the ad-driven business model of most social media companies means that there is no such thing as a "bad" click.

The Social Media Counterpoint?

Even a cynical view of social media does not dispel the reality that these platforms retain the potential to provide an important alternative to traditional media outlets, especially in regions where traditional media—such as newspapers, broadcast television, and radio—are subject to direct or indirect state control. While the idea that Twitter should be awarded the Nobel Peace Prize might seem naive today, there is little doubt that social media can provide citizens with a tool for challenging "official" versions of events—but only insofar as the platforms are willing to genuinely commit to the values of free speech and equal access that helped make them so powerful in the first place.

At times, platforms like Twitter have done just that. Twitter actively resisted US law enforcement attempts to obtain metadata about a user's location during a 2011 protest.17 For many years, the platform's ubiquity and independence made it an essential alternative to other forms of dissemination. During the Gezi Park protests in 2013, the national Turkish news channel NTV declined to broadcast the BBC program "Dunya Gundemi" (World Agenda), which included a segment on the decline of press freedoms in Turkey.18 As a result, the BBC terminated its broadcast relationship with NTV. Yet, thanks to social media, the end of the broadcast agreement did not mean the end of access to BBC programming in Turkey, as the news service made clear in its statement on the decision:

BBC Türkçe will continue to cover global events–including the events in Turkey–on all platforms…[and] will continue to engage with its audiences via social media, on Facebook and Twitter.19

In addition to providing alternative channels for traditional publishers to reach audiences, social media platforms have also fostered the growth of high-quality grassroots citizen-journalism efforts. The open-source investigation group Bellingcat, for example, relies heavily on social media both for conducting its investigations and for raising awareness of its findings.20 In Turkey, the group @140journos, which began in 2012 in response to the lack of mainstream coverage of bombings near Turkey's border with Iraq, relies on citizen journalists using Twitter, Instagram, and WhatsApp to report on events around Turkey, many of which are covered in only a cursory way—if at all—by mainstream media outlets.21 In a country that has imprisoned more journalists than any other, the potential for social media to serve as a source of information suppressed in traditional media is crucial.22

Opportunity Lost

Yet the Twitter of today is not the Twitter of 2009. The global rise of social media companies was made possible by the permissive legal environment of the United States, and it appeared early on that their international expansion would bring with it core US values of genuine free speech and expression. Instead, however, these companies are increasingly complying with the type of repressive local laws that compound the effects of governmental media control. Just as the Turkish government uses both direct and indirect political and economic leverage to influence newspaper and broadcast coverage, Turkey has long been the world's leader in requests to withhold content locally via Twitter's "Country Withheld Content" (CWC) tool, through which authorities demand the blocking of certain content within their jurisdiction.23,24 In Turkey, it is illegal to post content that "mocks" or "insults" President Recep Tayyip Erdoğan. Moreover, Twitter has increasingly complied with removal requests since 2015, including instances of blocking the accounts of what it calls "verified" journalists and media organizations.25 In the latter half of 2017, Twitter withheld 148 accounts in Turkey—nearly three-quarters of the global total of withheld accounts during that period.26 Far from offering an alternative to traditional media narratives where state control is most pronounced, social media has become yet another way to identify and stifle dissent.

In repressive regions, social media not only fails to be a safe haven for free speech, it has been increasingly weaponized against both professional and citizen journalists. Although Twitter was a crucial tool for protesters during the Gezi Park demonstrations, multiple journalists—many of whom have since been fired from their professional positions—reported receiving hateful and threatening messages via social media that repeated the same words, phrases, and sentences. Several of them tied the appearance of these campaigns to the hiring of several thousand "AKP trolls" who, they believed, were being paid to aggressively target them on social media in an effort to censor their ongoing work. Their experiences echo those of journalists around the world who have reported substantial harassment on social media platforms, as well as the findings of organizations like Freedom House, which tracks the use of such tactics in 60 countries.27,28

This weaponization of social media goes well beyond the censoring effects experienced by journalists. In March 2018, for example, a United Nations fact-finding mission concluded that Facebook had played a "determining role" in the crisis facing Rohingya Muslims in Myanmar, an estimated 650,000 of whom have fled to Bangladesh since 2017.29 Despite years of warnings from NGOs and human rights groups about how the platform was being used to incite violence and spread misinformation, it was not until August 2018 that Facebook moved to ban the accounts and pages at the center of the ethnic cleansing movement.30 By that time, the rise of social media-fueled mob violence had become commonplace enough for the UN to call for the owners of those accounts to be tried for genocide.31,32

Far from acting as platforms to challenge authoritarian regimes or official accounts, social media companies have largely allowed themselves to become tools for authoritarian regimes to manipulate and forge public opinion. As long as these companies continue to provide mechanisms to surveil and manipulate discourse, their most avid users will naturally be the kind of commercial and political entities that stand to benefit from public persuasion on a large scale. In the case of Myanmar, the impact of social media was so profound that the UN identified Facebook as an essential weapon for the military's violent campaign against the Rohingya:

Facebook has been a useful instrument for those seeking to spread hate, in a context where for most users Facebook is the Internet...The extent to which Facebook posts and messages have led to real-world discrimination and violence must be independently and thoroughly examined.33

In addition to contributing to global violence and instability, the ease with which social media can be manipulated creates serious problems for global policymakers, for whom distinguishing genuine public sentiment from manipulated online representations is increasingly difficult. In late 2017, for example, the pro-democracy group Freedom House released a report detailing the existence of pro-government "bot armies" in 30 of the 65 countries surveyed. As the report concludes, "It can be hard to distinguish propaganda from actual grassroots nationalism, even for seasoned observers."34

Platforms or Publishers?

In testimony before the US Congress, major social media companies continue to deny that they are "publishers," yet in legal proceedings in both the United States and the United Kingdom, Facebook and Google have attempted to claim exemptions on the grounds that they are "publishers" or doing "journalistic" work.35 In mid-2018, lawyers for Facebook defended the company's right to remove developer access to some data by aligning the company's rights and activities with those of traditional media companies, saying that Facebook had

"the publisher discretion [which] is a free speech right irrespective of what technological means is used. A newspaper has a publisher function whether they are doing it on their website, in a printed copy or through the news alerts."36

In a "Right to be Forgotten" case in the UK a few months earlier, Google also cast itself as a publisher in an attempt to avoid a court order to de-list URLs in its search engine service. Claiming that the service should be exempted as "journalistic" because it was providing access to journalism, Google's plea was ultimately rejected by the judge as both invalid and "parasitic."37

Unsurprisingly, perhaps, policymakers around the world have begun to impose much more stringent rules on companies that trade principally in user data, social media companies foremost among them. In Europe, the General Data Protection Regulation (GDPR), which demands uniform, global protection of European residents' data, began enforcement in May 2018. Google and Facebook were immediately faced with almost $9 billion in lawsuits.38 In California, a law passed in September 2018 requires that "bots" involved in commercial or political activities declare themselves as such when communicating with humans.39 The bill, introduced by Assemblyman Marc Levine shortly after a New York Times article detailed the world of fake social media followers, aims, according to Levine, to "bring accountability for unregulated and misleading use of social media." In his view, "it's clear that self-regulation is failing society and damaging our democracy."40,41

Whether they are publishers or platforms, and whether they remain self-regulated or face national legislation, the fact remains that the choices made by social media companies have increasingly costly and problematic real-world effects. And without adequate governance, either self- or externally imposed, they will continue to foster both global extremism and instability, in a way that neither traditional media nor public policy will be able to effectively combat.

Susan E. McGregor

Susan McGregor is Assistant Director of the Tow Center for Digital Journalism & Assistant Professor at Columbia Journalism School. Her main areas of research interest are information security, privacy, knowledge management and alternative forms of digital distribution.

Notes

1. Mark Pfeifle, "A Nobel Peace Prize for Twitter?," Christian Science Monitor, July 2009, https://www.csmonitor.com/Commentary/Opinion/2009/0706/p09s02-coop.html.

2. Cyrus Farivar, "Appeals court: Twitter can't be sued for 'material support' of terrorism," Ars Technica, February 2018, https://arstechnica.com/tech-policy/2018/02/appealscourt-twitter-cant-be-sued-for-material-support-of-terrorism/.

3. Cecilia Kang et al., "Twitter's Dorsey Avoids Taking Sides in Partisan House Hearing," The New York Times, September 2018, https://www.nytimes.com/2018/09/05/technology/facebook-twitter-congress.html.

4. Cate Cadell and Joseph Menn, "Google plans return to China search market with censored app: sources," Reuters, August 2018, https://www.reuters.com/article/us-chinagoogle/google-plans-return-to-china-search-market-with-censored-app-sourcesidUSKBN1KN09C.

5. Kate Conger, "Amazon Workers Demand Jeff Bezos Cancel Face Recognition Contracts With Law Enforcement," Gizmodo, June 2018, https://gizmodo.com/amazon-workersdemand-jeff-bezos-cancel-face-recognitio-1827037509.

6. Kang et al., "Twitter's Dorsey Avoids Taking Sides in Partisan House Hearing."

7. Terms of Service, Facebook, https://www.facebook.com/legal/terms.

8. Ibid.

9. Ibid.

10. Communications Decency Act, U.S. Code 47 (1996), §230.

11. Stratton Oakmont v. Prodigy, 23 Media L Rep 1794 (1995).

12. Although other portions of the CDA were later found unconstitutional, Section 230 remains very much in force.

13. Issie Lapowsky, "Chuck Johnson's Twitter Free Speech Suit is Probably DOA," Wired, January 2018, https://www.wired.com/story/chuck-johnson-twitterfree-speech-lawsuit/.

14. Sam Levin, "Is Facebook a publisher? In public it says no, but in court it says yes," The Guardian, July 2018, https://www.theguardian.com/technology/2018/jul/02/facebookmark-zuckerberg-platform-publisher-lawsuit.

15. Matthew Garrahan, "Google and Facebook dominance forecast to rise," The Financial Times, December 2017, https://www.ft.com/content/cf362186-d840–11e7-a039-c64b1c09b482.

16. Timothy J. Ryan, "What Makes Us Click? Demonstrating Incentives for Angry Discourse with Digital-Age Field Experiments," The Journal of Politics, no. 4 (November 2012), https://www.journals.uchicago.edu/doi/abs/10.1017/S0022381612000540.

17. Russ Buettner, "A Brooklyn Protester Pleads Guilty After His Twitter Posts Sink His Case," The New York Times, December 2012, https://www.nytimes.com/2012/12/13/nyregion/malcolm-harris-pleads-guilty-over-2011-march.html.

18. BBC, "Gezi Park: Turkish protesters vow to stay put," BBC News, June 2013, https://www.bbc.com/news/world-europe-22917616.

19. Statement regarding BBC and NTV, Turkey, BBC, June 2013, https://www.bbc.co.uk/media-centre/statements/bbc-ntv.html.

20. Noor Nahas, "How to Collect Sources from Syria If You Don't Read Arabic," Bellingcat, July 2018, https://www.bellingcat.com/resources/how-tos/2018/07/10/how-to-collectsources-from-syria-if-you-dont-read-arabic/.

21. Piotr Zalewski, "Meet the Man Transforming Journalism in Turkey," Time.com, May 2015, http://time.com/collection-post/3896544/ngl-engin-onder/.

22. Elana Beiser, "Record number of journalists jailed as Turkey, China, Egypt pay scant price for repression," Committee to Protect Journalists, December 2017, https://cpj.org/reports/2017/12/journalists-prison-jail-record-number-turkey-china-egypt.php.

23. Onur Ant, "Half of All Requests to Remove Twitter Posts Come From Turkey," Bloomberg, March 2017, https://www.bloomberg.com/news/articles/2017–03–22/halfof-tweet-removal-requests-come-from-turkey-twitter-says.

25. Efe Kerem Sozeri, "Twitter refuses to explain censorship of verified journalist accounts in post-coup Turkey," The Daily Dot, August 2016, https://www.dailydot.com/layer8/twitter-censorship-journalists-turkey-coup/.

26. Removal requests, Twitter.

27. Gina Masullo Chen et al., "'You really have to have a thick skin': A cross-cultural perspective on how online harassment influences female journalists," Journalism, April 2018, dx.doi.org/10.1177/1464884918768500.

28. Manipulating Social Media to Undermine Democracy, Freedom House, December 2017, https://freedomhouse.org/report/freedom-net/freedom-net-2017.

29. Reuters, "Myanmar: UN blames Facebook for spreading hatred of Rohingya," The Guardian, March 2018, https://www.theguardian.com/technology/2018/mar/13/myanmarun-blames-facebook-for-spreading-hatred-of-rohingya.

30. Rhett Jones, "Facebook Bans Myanmar Leaders, Admits It Was 'Too Slow' to Stop Posts Promoting Genocide," Gizmodo, August 2018, https://gizmodo.com/facebook-bansmyanmar-leaders-admits-it-was-too-slow-t-1828625302.

31. Timothy McLaughlin, "How Facebook's Rise Fueled Chaos and Confusion in Myanmar," Wired, July 2018, https://www.wired.com/story/how-facebooks-rise-fueled-chaos-andconfusion-in-myanmar/.

32. Report of the independent international fact-finding mission on Myanmar, Thirty-ninth session, United Nations Human Rights Council, September 2018, https://www.ohchr.org/Documents/HRBodies/HRCouncil/FFM-Myanmar/A_HRC_39_64.pdf.

33. Ibid.

34. Manipulating Social Media to Undermine Democracy, Freedom House, December 2017, https://freedomhouse.org/report/freedom-net/freedom-net-2017.

35. "Zuckerberg: We're a tech company, not a publisher," PBS NewsHour, April 2018, https://www.youtube.com/watch?v=4HTae-X757g.

36. Levin, "Is Facebook a publisher? In public it says no, but in court it says yes."

37. Chava Gourarie, "Google seeks to limit 'right to be forgotten' by claiming it's journalistic," Columbia Journalism Review, April 2018, https://www.cjr.org/innovations/googlejournalistic-right-to-be-forgotten-by-claiming-its-journalistic.php.

38. Russell Brandom, "Facebook and Google hit with $8.8 billion in lawsuits on day one of GDPR," The Verge, May 2018, https://www.theverge.com/2018/5/25/17393766/facebookgoogle-gdpr-lawsuit-max-schrems-europe.

39. Dave Gershgorn, "A California law now means chatbots have to disclose they're not human," Quartz, October 2018, https://qz.com/1409350/a-new-law-means-californiasbots-have-to-disclose-theyre-not-human/.

40. Nicholas Confessore et al., "The Follower Factory," The New York Times, January 2018, https://www.nytimes.com/interactive/2018/01/27/technology/social-mediabots.html.

41. John Myers, "California would require rules on social media 'bots' under new legislation," The Los Angeles Times, January 2018, http://www.latimes.com/politics/essential/la-pol-ca-essential-politics-updates-california-would-require-rules-on-social-1517265261-htmlstory.html.
