Facebook’s decision was a litmus test for social media platforms’ capacity to draw the line between the protection of free speech and public safety. But American social media users were deeply polarized long before Trump took office, and they will continue to antagonize each other in the wake of this decision. Is there anything that we, the citizens of social media, can do to prevent this decision from launching yet another round of partisan warfare?
That question may seem wrongheaded. By popular accounts, it was social media companies that trapped us inside ideological echo chambers, ignored misinformation campaigns that divided us even further and built algorithms that radicalize us for profit. But what if I told you that the evidence for each of these claims is surprisingly thin?
Four years ago, I founded the Polarization Lab at Duke University. We use the tools of computational social science to research the key drivers of political polarization and build new technology to help social media users implement insights from our research.
Social media companies are by no means blameless for our current situation. But the latest research indicates that most people are not stuck inside political echo chambers, misinformation can have surprisingly little impact on our views and algorithms probably only radicalize a tiny fraction of people.
These findings may seem surprising, but they are actually quite consistent with decades of research about public opinion. Most people don’t care very much about politics, and those who do usually have very strong views that are difficult to change. The small group of people who follow politics closely enough to erect strong echo chambers around themselves also see and share the vast majority of fake news.
Though we might like to think that Facebook, Twitter or other platforms could simply tweak some code to save us from our current predicament, these studies hint at a much more unsettling truth: the root cause of political polarization on our platforms is us. And it’s not going away until we find a way to solve it.
Americans are deeply divided, and many are not yet willing to cross the chasms that separate us — especially on divisive issues such as race. But what about the majority of Americans who want political compromise? What would a bottom-up movement to counter polarization look like for them?
Though there is no single solution to defeat political polarization, my colleagues and I have identified three things everyone can do to form better habits — and we have created new technologies to assist in this process.
First, we can learn to combat false polarization, or our tendency to exaggerate extremity on the other side and minimize radicalism on our own side — making us think that political polarization is more pervasive than it really is.
False polarization existed long before social media, but social media users have sent the process into hyperdrive. A recent report from the Pew Research Center revealed that about 6% of all Twitter users generate 73% of posts about national politics, and a majority of these prolific users hold extreme views. Meanwhile, the more moderate majority of Twitter users rarely post about politics, making it seem as if they do not exist in the political sphere at all.
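The Pew finding describes a familiar pattern: a small, vocal minority producing most of the political content. A toy calculation on invented post counts (the numbers below are hypothetical, not Pew's data) shows how easily this concentration arises:

```python
# Hypothetical illustration: 6 prolific users and 94 quiet ones.
# The counts are invented to show concentration, not to match Pew's figures.
post_counts = [200] * 6 + [3] * 94  # political posts per user

total = sum(post_counts)
top_share = sum(sorted(post_counts, reverse=True)[:6]) / total
print(f"Top 6 of 100 users produce {top_share:.0%} of political posts")
```

Even though 94% of this imaginary population posts occasionally, the handful of prolific posters dominate what everyone else sees.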
That’s why it’s so important to learn to see Twitter trolls for what they really are. And we have developed a tool to do just that. Our troll-o-meter, built after tracking the language and characteristics of trolls on Twitter, helps to identify social media users who not only have extreme views, but engage in the type of highly uncivil behavior that can make political compromise seem impossible.
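The details of the troll-o-meter are not spelled out here, so the following is only a loose sketch of the general idea: score a user by how often their recent posts contain uncivil language. The marker list and threshold are invented for illustration and are far cruder than any real model.

```python
# Hypothetical sketch of an incivility scorer in the spirit of a
# "troll-o-meter". The marker list and threshold are invented.
UNCIVIL_MARKERS = {"idiot", "traitor", "moron", "evil", "destroy"}

def incivility_score(posts: list[str]) -> float:
    """Fraction of posts containing at least one uncivil marker."""
    if not posts:
        return 0.0
    flagged = sum(
        any(marker in post.lower() for marker in UNCIVIL_MARKERS)
        for post in posts
    )
    return flagged / len(posts)

def looks_like_troll(posts: list[str], threshold: float = 0.5) -> bool:
    """Flag accounts whose recent posts are mostly uncivil."""
    return incivility_score(posts) >= threshold
```

A real classifier would use many more signals than word matching, but even this sketch captures the core intuition: extremity plus sustained incivility, not mere disagreement, is what marks a troll.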
Second, we must all become more introspective about our own behavior on social media — and whether it contributes to false polarization. Becoming a more reflective social media user does not simply mean tamping down our inner trolls. Instead, it requires asking ourselves more fundamental questions about how our behavior shapes the bigger picture. If we are part of the moderate majority of Americans who never post about politics, for example, we must consider whether our lack of engagement helps fuel the fire.
Of course, becoming more reflective about our own behavior is extremely difficult — especially with the seemingly infinite distractions of social media. This is why we built tools that help people see what their posts say about their politics. Our technology will place you on a spectrum that ranges from “very liberal” to “very conservative” so that you can get a sense of how other people might perceive you — and verify that your online persona reflects your offline views.
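The article does not describe how the spectrum placement works, so here is one hypothetical way such a tool might operate: average known ideology scores of the accounts a user interacts with. The account names, scores, and cutoffs below are all invented for illustration.

```python
# Toy sketch (not the Lab's actual model): place a user on a
# liberal-conservative spectrum by averaging hypothetical ideology
# scores (-1 = very liberal, +1 = very conservative) of accounts
# the user follows. All values are invented.
ACCOUNT_SCORES = {
    "progressive_news": -0.8,
    "centrist_wonk": 0.0,
    "conservative_daily": 0.8,
}

def spectrum_position(followed: list[str]) -> float:
    """Mean ideology score of the followed accounts we recognize."""
    scores = [ACCOUNT_SCORES[a] for a in followed if a in ACCOUNT_SCORES]
    return sum(scores) / len(scores) if scores else 0.0

def label(position: float) -> str:
    """Map a position to a coarse label from 'very liberal' to 'very conservative'."""
    bins = [(-0.6, "very liberal"), (-0.2, "liberal"),
            (0.2, "moderate"), (0.6, "conservative")]
    for cutoff, name in bins:
        if position < cutoff:
            return name
    return "very conservative"
```

The point of such a mirror is not precision; it is letting users check whether the persona their posts project matches the views they actually hold.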
Third, we can learn to find moderate voices on the other side more effectively. The idea of a moderate on social media can sometimes seem like an oxymoron, but this is because the loudest voices drown out those in the middle. In addition to learning to avoid extremists — and not feed the trolls — we also need help learning how to see the middle.
Once again, technology can help us get there. Our Polarization Lab studies patterns in the content liked by a large group of Republicans and Democrats to build models that produce a “Bipartisanship Leaderboard,” where you can find public figures and organizations who appeal to both sides, along with bots that retweet their messages. These messages will not please everyone — but they can help you begin to turn up the volume of people whose more moderate views are so urgently needed to pull our conversation back toward a more rational and pragmatic middle.
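The Lab's actual ranking method is not described here, but one simple way to reward cross-party appeal is to score each account by the smaller of its approval shares among Democrats and Republicans, so that ranking highly requires appealing to both sides. The account names and shares below are invented:

```python
# Toy "bipartisanship" score (hypothetical, not the Lab's real method):
# an account's score is the SMALLER of its approval shares among
# Democrats and Republicans, so one-sided appeal scores poorly.
def bipartisanship(dem_share: float, rep_share: float) -> float:
    return min(dem_share, rep_share)

accounts = {  # invented (Democratic share, Republican share) pairs
    "partisan_firebrand": (0.90, 0.05),
    "local_weather": (0.70, 0.75),
    "policy_wonk": (0.55, 0.50),
}

leaderboard = sorted(
    accounts, key=lambda a: bipartisanship(*accounts[a]), reverse=True
)
print(leaderboard)  # most bipartisan accounts first
```

Using the minimum rather than the average is a deliberate design choice in this sketch: an account adored by one side and ignored by the other averages out respectably but fails the minimum test.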
Keeping our eyes trained on the middle will be even more important during the maelstrom of hot takes, incivility and anger over Facebook’s decision about Trump. Each of our decisions about what to post, share or like in the coming days will determine whether we continue to fan the flames of partisanship or begin to have the difficult conversations about how to put ourselves back together again.
Though a bottom-up movement to counter polarization on social media will not solve all of our problems, our current predicament is not sustainable. Content moderation by the platforms has an important role to play — but we’ve spent too much of our time focused on rooting out bad behavior on our platforms and far too little thinking about how to incentivize civility and compromise.
This article has been updated to reflect the decision by Facebook’s oversight board not to readmit former President Donald Trump.