Sizing up social media’s threat to democracy: What it will take to fend off extremism
A New Zealand flag and flowers are laid down at a fence in the Botanical Gardens for the victims of an attack on two mosques in Christchurch, New Zealand in March (Photo: Peter Godfrey/picture alliance via Getty Images)

By Brad Smith and Carol Ann Browne

For decades one of the strengths of the world’s republics had been the ability to use open communication and public discussion to ensure broad and even bipartisan understanding, support for foreign policy issues, and a commitment to democratic freedoms. It was seldom an easy task, but as Franklin Roosevelt proved, new communication technologies, like the radio in his time, could be used to build public support for difficult measures such as supporting the United Kingdom before the United States entered World War II. And the United States used everything from radio to the fax machine to spread information and nurture democracy in the closed societies in Central and Eastern Europe in the decades that followed.

But now others have turned the tables on this very strength of a free and open society. Hacked emails might be the tip of Russia's new spear, but their ambitions have a far broader reach. Cable news and then social media have created increasingly separate information bubbles in Western democracies, especially in the United States. What if information—true or otherwise—could be spread using platforms like Facebook and Twitter to rile up various factions and undermine political candidates likely to be more hostile to Russia's interests? What if teams of technologists and social scientists joined forces in Saint Petersburg and Moscow to influence the American political and social narrative with the same creativity and speed as the creators of the platforms they were exploiting? And what if no one in the United States was paying enough attention to see what was happening?

In late 2018, a team from Oxford University and the American analytics firm Graphika analyzed subpoenaed data that Facebook, Instagram, Twitter, and YouTube provided to the Senate Intelligence Committee. The team was the first to document thoroughly how Russia’s Internet Research Agency (IRA) had “launched an extended attack on the United States by using computational propaganda to misinform and polarize US voters.” These disinformation efforts typically peaked near key dates on the American political calendar, a strategy that played into the interactive and viral nature of social media platforms. As the report found, between 2015 and 2017 more than thirty million users “shared the IRA’s Facebook and Instagram posts with their family and friends, liking, reacting to, and commenting on them along the way.”

Tools and Weapons: The Promise and the Peril of the Digital Age

By manipulating American-made technology, the Russians were able to reach into and stir the US political pot. This foreign influence spilled over into the real world, notably during the IRA's successful effort in 2016 to organize a synchronized protest and counterprotest in Houston, which ended with neighbors shouting at neighbors, unknowingly egged on by people in Saint Petersburg, Russia.

By late 2017, that reality had become increasingly clear. Still, when reports of Russian disinformation efforts on Facebook emerged, most people in the tech sector, including Mark Zuckerberg, were skeptical that the activity was widespread or that it could have much effect. But that soon changed. By the fall of 2017, Facebook found itself in the crosshairs of officials around the world. The social media giant was under more public scrutiny than any tech company since the antitrust cases against Microsoft almost two decades earlier. Having lived through those years at Microsoft, I appreciated the important and rising government demands on Facebook. I also understood the enormous difficulties the company faced. Facebook had not designed its services as a platform for foreign governments to use to disrupt democracy, but neither had it put in place measures that could prevent or even recognize such activity. No one at the company—or in the tech sector or the US government, for that matter—had anticipated such a phenomenon until Russia turned Facebook against the very country that had given it life.

I was especially struck by the world’s focus on Facebook when we attended the Munich Security Conference in February 2018. Founded in 1963 and now led by respected former German diplomat Wolfgang Ischinger, the annual summit brings together defense ministers and other military and national leaders from around the world to discuss international security policy. In 2018, the attendee list included some of my peers from the information technology industry.

As I made my way through the wall of high-ranking military service officers in the packed lobby of the Bayerischer Hof hotel, I felt a bit out of place. It was a homecoming of sorts as I squeezed into the elevator next to Eric Schmidt, then chairman of Google, and his team. It was an odd place to run into someone from Silicon Valley.

“Have you been here before?” he asked.

“Actually, it had never really occurred to me before that I needed to be here,” I replied.

But times had changed and in 2018, it was important that we were both in Munich.

Much of the discussion that week focused on the weaponization of information technology. At a lunch with CEOs, International Monetary Fund head Christine Lagarde was asked why she had come to a defense conference. She explained that she wanted to understand how information technology was being used to harm democratic processes, so she could think about how it might be misused to attack financial markets. It was a sobering conversation, but I was reassured by her foresight.

While the conversations were difficult and a bit heavy, I couldn't help but feel some sympathy for Facebook's chief security officer, Alex Stamos, who was on the defensive throughout the conference. During a panel we sat on together, he was peppered with sharp questions by a rising Dutch member of the European Parliament. Later that evening during dinner with the Atlantic Council, government officials and other indignant attendees challenged him repeatedly, asking how Facebook "had allowed all this to happen."

While the concerns were understandable, I was increasingly exasperated by the discussion. Everyone was pointing a finger at Facebook, but no one was pointing a finger at the prime culprit. It was like yelling at the person who forgot to lock the door without talking about the thief who broke in.

The bigger question for Facebook, the United States, the world’s democratic republics, and the entire tech sector was what to do. The reaction of some in government was to cast blame on Facebook and other social media companies, insisting that they solve the problem. While the companies who invented this technology indeed held much of the responsibility, such an approach seemed incomplete. The answer would require a mix of action by governments and by the tech sector itself.

In the spring of 2018, as Mark Zuckerberg testified before Congress, the tech sector had changed its tune on the magnitude of the problem and the response it required. "My position is not that there should be no regulation," Zuckerberg said. "I think the real question, as the internet becomes more important in people's lives, is what is the right regulation, not whether there should be or not."

As his statement reflected, it's one thing to recognize the obvious and acknowledge that regulation is needed. It's quite another to figure out what type of social media regulation makes sense.

One person leading the charge to answer the latter question is Mark Warner of Virginia, a former telecommunications executive who has served in the United States Senate since 2009. In the summer of 2018, Warner released a white paper with a series of proposals designed in part to address disinformation campaigns through new legislation. He acknowledged some of the technical and privacy issues associated with these proposals and called for more discussion.

As Warner reflected in his paper, an emerging issue for social media on the internet is the growing disquiet with its current immunity under the US Communications Decency Act. Congress passed the legislation in 1996 to nurture the internet's growth, in part by shielding providers of "interactive computer services" from many of the legal responsibilities that more traditional publishers face. For example, unlike television and radio, social media services bear no legal responsibility in the United States under state and many federal laws for illegal content published on their sites.

But the internet is no longer in its infancy, and its impact today is globally ubiquitous. As nation-states, terrorists, and criminals exploit social media sites for nefarious purposes, political leaders increasingly are joining traditional publishers in questioning whether social media sites should continue to get a legal pass. Warner points to the expected spread of "deep fakes," or "sophisticated audio and image synthesis tools that can generate fake audio or video files falsely depicting someone saying or doing something," as an additional reason to impose new legal responsibilities on social media sites to police their content.

As the world has witnessed more horrifying acts amplified on social media, political pressure has grown. A decade from now, we may look back at March 2019 as an inflection point. As Kevin Roose wrote in the New York Times, the horrific terrorist slaying of fifty-one innocent Muslims on March 15 in two mosques in Christchurch, New Zealand, in some ways "felt like a first—an internet-native mass shooting, conceived and produced entirely within the irony-soaked discourse of modern extremism." As he described, "The attack was teased on Twitter, announced on the online message board 8chan and broadcast live on Facebook. The footage was then replayed endlessly on YouTube, Twitter, and Reddit, as the platforms scrambled to take down the clips nearly as fast as new copies popped up to replace them."

Just two weeks later we were in Wellington, New Zealand's capital, on a trip that had been months in the making. New Zealand's Prime Minister, Jacinda Ardern, who had handled the shock and crisis with extraordinary judgment and grace, had given a speech that captured a marked shift toward social media. "We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published," she said. She then referred to social media sites even more emphatically: "They are the publisher, not just the postman. It cannot be a case of all profit, no responsibility."

When we met with Ardern and her cabinet members in New Zealand, I could not disagree. The episode demonstrated that tech companies needed to do more, Microsoft included, across our own services such as Bing, Xbox Live, GitHub, and LinkedIn. And, more broadly, a regulatory regime established nearly a quarter century ago suddenly seemed insufficient to address the threats to the public from hostile nations and terrorists alike.

While there are clear distinctions between the exploitation of social media platforms by terrorists and state-sponsored attackers, there are similarities. Both involve intentional efforts to undermine the social stability on which societies depend. And, as it turns out, politically the responses to both problems may reinforce each other, pushing governments to move toward a new regulatory model for social media sites.

There remains a big difference, however, between needing something new and knowing precisely what's needed. It seems impossible for social media sites to follow the prepublication editorial review processes that are used by traditional print, radio, or television outlets. Imagine if every photo on Facebook or entry on LinkedIn needed to be reviewed by a human editor before it could be viewed by others. It would "break the mold" that makes it possible for hundreds of millions or even billions of users around the world to upload content and share it with family, friends, and colleagues.

This is a problem that needs to be solved with a scalpel rather than a meat cleaver. It's not an easy challenge, especially in moments of political pressure. It was in part to avoid a hasty legislative reaction that in 2018 Warner encouraged a conversation with social media platforms—only to receive little or no feedback from some of the most prominent companies. Worried about mounting Russian exploitation of social media, he offered a menu of more tailored approaches. One of his ideas, which was taken further by the Australian government, is to obligate social media platforms to prevent users from continuing to re-upload illegal content, effectively increasing the legal responsibility to act once a problem has been verified. A more general variant was proposed by the British government just two weeks after the Australians acted, recommending a new "statutory duty of care to make companies take more responsibility for the safety of their users" and backing this with oversight by an independent regulator. Warner also proposed rules that would impose a duty on social media platforms to determine the origin of accounts or posts, identify bogus accounts, and notify users when bots are spreading information.

Ultimately, there are two broader lessons that emerge. First, initiatives from the public and private sectors will likely need to move forward together and complement each other. And, second, despite the novelty of current technology, there is a lot to learn from the challenges of the past.

Interestingly, foreign interference in democracy is almost as old as the United States itself. A democratic republic by its very nature is subject to efforts—both foreign and domestic—to undermine confidence and sway public opinion. The first person to realize this was an early French ambassador to the United States named Edmond Charles Genêt. He arrived in America in early April 1793, just a few weeks before President George Washington officially declared the United States' neutrality in the expanding war between France and the United Kingdom. Genêt was on a mission to tip the young republic toward supporting France, including by persuading the United States to accelerate the repayments of its debt to his country and by enabling attacks on British shipping by armed privateers operating from US ports. If required, he was prepared to spark an attempt to overthrow the country's young government.

Genêt's arrival sparked increasing tension within Washington's cabinet, with Thomas Jefferson sympathetic to the French and Alexander Hamilton sympathetic to the British. Genêt sought to appeal directly to the American public for his cause, a move that, in the words of one historian, did more than spark the origins of our two-party system. "Political dialogue was impassioned, street brawls were not uncommon, and old friendships were severed." In 1793, Washington and the members of his cabinet overcame their differences and united in demanding Genêt's recall to France.

That outcome has a lesson for our own generation. Foreign interference with democratic processes can be met successfully only if the stakeholders in a republic set aside enough of their differences to work together to respond effectively. It may be difficult to remember that the differences between Jefferson and Hamilton and their respective supporters were as passionate as the disagreements between Republicans and Democrats today. But the Broadway musical Hamilton provides a powerful reminder that at least today our politicians no longer resort to armed duels. The reality is that passionate divisions and even vitriol are an inherent risk and constant challenge for any democratic republic.

It was against this backdrop and continuing French attempts to tamper with American politics that Washington used his farewell address in 1796 to warn against the risks of foreign influence. "A free people," he said, "ought to be constantly awake, since history and experience prove that foreign influence is one of the most baneful foes of republican government."

Brad Smith and Carol Ann Browne are the co-authors of "Tools and Weapons: The Promise and the Peril of the Digital Age," from which this article is excerpted.
