Hard Questions: What Effect Does Social Media Have on Democracy?

By Samidh Chakrabarti, Product Manager, Civic Engagement
This post is part of a series on social media and democracy. 

Around the world, social media is making it easier for people to have a voice in government — to discuss issues, organize around causes, and hold leaders accountable. As recently as 2011, when social media played a critical role in the Arab Spring in places like Tunisia, it was heralded as a technology for liberation.

A lot has changed since then. The 2016 US presidential election brought to the fore the risks of foreign meddling, “fake news” and political polarization. The effect of social media on politics has never been so crucial to examine.

All of this raises an important question: what effect does social media have on democracy?

As the product manager in charge of civic engagement on Facebook, I live and breathe these issues. And while I’m an optimist at heart, I’m not blind to the damage that the internet can do to even a well-functioning democracy.

That’s why I’m dedicated to understanding these risks and ensuring the good far overshadows the bad.

With each passing year, this challenge becomes more urgent. Facebook was originally designed to connect friends and family — and it has excelled at that. But as unprecedented numbers of people channel their political energy through this medium, it’s being used in unforeseen ways with societal repercussions that were never anticipated.

In 2016, we at Facebook were far too slow to recognize how bad actors were abusing our platform. We’re working diligently to neutralize these risks now.

We can’t do this alone, which is why we want to initiate an open conversation on the hard questions this work raises. In this post, I’ll share how we are thinking about confronting the most consequential downsides of social media on democracy, and also discuss how we’re working to amplify the positive ways it can strengthen democracy, too.

Foreign Interference

Let’s start with the elephant in the room. Around the 2016 US election, Russian entities set up and promoted fake Pages on Facebook to influence public sentiment — essentially using social media as an information weapon.

We didn’t know it at the time, but we have since discovered that these Russian actors created 80,000 posts that reached around 126 million people in the US over a two-year period. This kind of activity goes against everything we stand for. It’s abhorrent to us that a nation-state used our platform to wage a cyberwar intended to divide society. This was a new kind of threat that we couldn’t easily predict, but we should have done better.

Now we’re making up for lost time. The Russian interference worked in part by promoting inauthentic Pages, so we’re working to make politics on Facebook more transparent. We’re making it possible to visit an advertiser’s Page and see the ads they’re currently running. We’ll soon also require organizations running election-related ads to confirm their identities so we can show viewers of their ads who exactly paid for them. Finally, we’ll archive electoral ads and make them searchable to enhance accountability.

As critical as this plan is, it poses challenges. How, for example, do we avoid putting legitimate activity at risk? Many human rights organizations commonly use Facebook to spread educational messages around the world. The wrong kind of transparency could put these activists in real danger in many countries.

But we’re committed to this issue of transparency because it goes beyond Russia. Without transparency, it can be hard to hold politicians accountable for their own words. Micro-targeting can enable dishonest campaigns to spread toxic discourse without much consequence. Democracy then suffers because we don’t get the full picture of what our leaders are promising us. This is an even more pernicious problem than foreign interference. But we hope that by setting a new bar for transparency, we can tackle both of these challenges simultaneously.

False News

But foreign interference isn’t the only means of corrupting a democracy. We recognize that the same tools that give people more voice can sometimes be used, by anyone, to spread hoaxes and misinformation. There is active debate about how much of our information diet is tainted by false news — and how much it influences people’s behavior. But even a handful of deliberately misleading stories can have dangerous consequences.

To take just one example, in Australia a false news story claimed that the first Muslim woman to be a Member of Parliament had refused to lay a wreath on a national day of remembrance. This led people to flood her Facebook Page with abusive comments.

In the public debate over false news, many believe Facebook should use its own judgment to filter out misinformation. We’ve chosen not to do that because we don’t want to be the arbiters of truth, nor do we imagine this is a role the world would want for us.

Instead, we’ve made it easier to report false news and have taken steps in partnership with third-party fact checkers to rank these stories lower in News Feed. Once our fact checking partners label a story as false, we’re able to reduce future impressions of the story on Facebook by 80%. We’re also working to make it harder for bad actors to profit from false news, eliminating their incentive to create this content in the first place.
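
To make the mechanic concrete, here is a minimal, purely illustrative sketch of how demoting fact-checked stories could work in principle. The function names, scoring model and numbers below are hypothetical and are not Facebook’s actual News Feed code; the only detail carried over from above is that a “false” label from a fact-checking partner sharply reduces a story’s future distribution.

# Illustrative sketch only: a toy feed ranker that demotes stories
# labeled false by third-party fact checkers. All names and values
# here are hypothetical, not Facebook's actual ranking system.

FALSE_LABEL_DEMOTION = 0.2  # keep roughly 20% of the original reach (~80% reduction)

def rank_feed(stories, fact_check_labels):
    """Sort candidate stories by score, demoting those labeled false.

    stories: list of dicts like {"id": ..., "base_score": ...}
    fact_check_labels: dict mapping story id to "false" or None
    """
    def score(story):
        s = story["base_score"]
        if fact_check_labels.get(story["id"]) == "false":
            s *= FALSE_LABEL_DEMOTION  # flagged stories surface far less often
        return s

    return sorted(stories, key=score, reverse=True)

# Example: the flagged story drops below an otherwise lower-scoring one.
stories = [{"id": "a", "base_score": 0.9}, {"id": "b", "base_score": 0.8}]
print([s["id"] for s in rank_feed(stories, {"a": "false"})])  # ['b', 'a']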

Finally, since the best deterrent will ultimately be a discerning public, we’ve started sharing more context about the news sources people see on Facebook. By helping people sharpen their social media literacy, we can help society be more resilient to misleading stories.

Even with all these countermeasures, the battle will never end. Misinformation campaigns are not amateur operations. They are professionalized and constantly try to game the system. We will always have more work to do.

Echo Chambers

One of the most common criticisms of social media is that it creates echo chambers where people only see viewpoints they agree with — further driving us apart.

That’s a legitimate concern, but it’s more complex than it is sometimes portrayed. Compared with the media landscape of the past, social media exposes us to a more diverse range of views. A recent Reuters Institute Digital News Report found that 44% of people in the US who use social media for news end up seeing sources from both the left and the right — more than twice the rate of people who don’t use social media.

The deeper question is how people respond when they encounter these differing opinions — do they listen to them, ignore them, or even block them?

Think about how our minds work. It’s natural to seek out information that confirms what we already believe — a phenomenon social scientists call “confirmation bias.” Walter Quattrociocchi, Antonio Scala and Cass Sunstein found evidence last year that social media users are drawn to information that strengthens their preferred narratives and reject information that undermines them.

That makes bursting these bubbles hard because it requires pushing against deeply ingrained human instincts. Research shows that some obvious ideas — like showing people an article from an opposing perspective — could actually make us dig in even more.

A better approach might be to show people many views, not just the opposing side. We recently started testing this idea with a feature called Related Articles that shows people articles with a range of perspectives on the news they’re already reading about. We’ll see if it helps, and we’re eager to share our findings.
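
As a rough illustration of that design choice, and not the actual Related Articles implementation, one could imagine selecting related coverage so that it spans a spectrum of perspectives rather than pairing a story with a single opposing view. Everything below, from the invented perspective scores to the selection rule, is a hypothetical sketch.

# Hypothetical sketch: choose related articles that span a range of
# perspectives on the same story, rather than only the opposing side.
# Perspective scores (-1.0 to +1.0) are invented for illustration.

def pick_related(candidates, k=3):
    """Pick k articles spread across the perspective spectrum.

    candidates: list of dicts like {"title": ..., "perspective": float in [-1, 1]}
    """
    ordered = sorted(candidates, key=lambda a: a["perspective"])
    if len(ordered) <= k:
        return ordered
    # Sample evenly spaced positions across the ordered list so the
    # selection covers left, center and right rather than one pole.
    step = (len(ordered) - 1) / (k - 1)
    return [ordered[round(i * step)] for i in range(k)]

# Example usage
candidates = [
    {"title": "Op-ed A", "perspective": -0.8},
    {"title": "Analysis C", "perspective": -0.2},
    {"title": "Wire report", "perspective": 0.0},
    {"title": "Op-ed B", "perspective": 0.7},
]
print([a["title"] for a in pick_related(candidates)])  # spans left, center, right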

Political Harassment

While we want Facebook to be a safe place for people to express themselves politically, we need to make sure no one is bullied or threatened for their views.

To make matters more complex, governments themselves sometimes engage in such harassment. In one country we recently visited, a citizen reported that after he had posted a video critical of the authorities, the police paid him a visit to inspect his tax compliance. As more countries write laws that attempt to criminalize online discourse, the risk grows that states will use their power to intimidate their critics. That could have a chilling effect on speech.

Even in more open societies, we’re seeing cases where government officials write hateful posts that make enforcing our Community Standards challenging. So far, we’ve kept such posts up on our platform since we view them as newsworthy information that citizens deserve to know. We’ve also found these posts often become important magnets for counter-speech, but we recognize reasonable people may disagree with this policy.

Our concerns with political hate speech aren’t limited to the online sphere — we also need to be vigilant that social media doesn’t facilitate offline violence.

Policing this content at a global scale is an open research problem since it is hard for machines to understand the cultural nuances of political intimidation. And while we are hiring over 10,000 more people this year to work on safety and security, this is likely to remain a challenge.

Unequal Participation

While foreign meddling, misinformation, echo chambers and hate speech get the headlines, what worries me most is how social media can distort policymakers’ perception of public opinion. People on Facebook tend to represent every walk of life, but not everyone is using their voice equally. Take women. They represent a majority of the population, yet are under-represented in public political dialogue on Facebook.

If politicians mistake the views of a few with the views of many, that can make for bad public policy. Vulnerable populations could end up ignored, and fringe groups could appear mainstream.

We’re trying to move the needle on this by studying, for example, why women participate less in political discourse online. In some of our civic features, we’ve incorporated these lessons and pioneered new privacy models that help increase women’s participation. It still isn’t on par with men’s, but we’re getting closer. This is proof in my eyes that research-driven design can make social media a better medium for democracy.

Giving Voice

Clearly, there is no shortage of challenges at the convergence of social media and democracy. But there are also many bright spots that keep me coming to work every day.

First, social media has enormous power to keep people informed. According to the Pew Research Center, two-thirds of US adults consume at least some of their news on social media. Since many people are happening upon news they weren’t explicitly seeking out, social media is often expanding the audience for news.

More importantly, people aren’t just reading news — they’re actively discussing it. The implications for civic engagement are profound. It has long been observed that when people discuss the news, they’re more likely to be involved in their community, whether by volunteering or reaching out to elected officials. There is growing evidence that this is also true for social media — especially among young people.

Social media platforms are driving people not just to learn about issues but to take action. During the 2016 US election alone, we estimate our voter registration efforts on Facebook led more than 2 million people to register to vote.

Even more encouraging is that we’re seeing how social media can help people be more knowledgeable voters. During the last US election, we created Voting Plan, a tool to preview your local ballot and discuss it with friends. Millions of people used it. On average, this increased people’s knowledge of their ballot by over 6%. That’s equivalent to raising the average ballot knowledge of the entire US Facebook community by a few grade levels.

But perhaps what inspires me most of all is that with social media, people can have a voice in their government every day, not just on election day. Some 87% of governments around the world have a presence on Facebook. And they’re listening — and responding — to what they hear.

In Iceland, for example, when someone moves to a new neighborhood, the first thing they often do is join their community’s Facebook group. They tag their representatives in posts and push for the issues they want taken to Parliament. Conversations like these are quietly reinvigorating local governance around the world.

To bring this experience to more people, in 2016 we built a feature that makes it simple to follow all your elected representatives on Facebook with a single click. When we launched it in the US, it doubled the number of connections between people and their government. We’ve since seen a similar level of impact in other places like Germany and Japan.

This means that for the first time in history, people can keep up with their government as easily as they keep up with their friends. This is unlocking new waves of latent civic energy and putting power into more hands.

So, What Effect Does Social Media Have on Democracy?

If there’s one fundamental truth about social media’s impact on democracy, it’s that it amplifies human intent — both good and bad. At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy.

I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t. That’s why we have a moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible.

This is a new frontier and we don’t pretend to have all the answers. But I promise you that my team and many more here are dedicated to this pursuit. We’ll share what we learn and collaborate with you to find the answers.

What gives me hope is that the same ingenuity that helped make social media an incredible way to connect with friends can also be applied to making it an effective way to connect with the public square.

In the end, that’s why I believe that a more connected world can be a more democratic one.

Samidh Chakrabarti is a product manager at Facebook, where he is responsible for politics and elections products globally. Before coming to Facebook, he was the product lead for Google’s civic engagement initiative. His background is in both technology and public policy, and he’s spent his career working to combine them in service of the common good.