Kate Starbird is an associate professor at the University of Washington and a co-founder of the Center for an Informed Public, which researches the spread of misinformation and disinformation on social-media platforms. She has been studying social media, especially its use during crises, since 2009, the year that Facebook became the dominant online social-media site.
The Wall Street Journal talked to Prof. Starbird about what her research says about the impact of social media in 2020, and the work that still needs to be done to realize the benefits that social media can bring. Edited excerpts follow.
WSJ: What would you say is the most important takeaway from the social-media world in 2020?
PROF. STARBIRD: We’re really focused right now on a lot of the negatives with social media. But there are still positive dimensions. We’re still seeing people being able to reach out and connect with other people at a time when we can’t connect in person.
Also, social media seem to be taking a lot of heat that should perhaps be focused more broadly on our media and ourselves.
Sometimes we think of social media as something separate, but it is actually integrated into this complex information space. Social-media content gets picked up and put into other kinds of media. We see news articles being based on things that might have circulated on social media. We see content move back and forth between social media and cable news and hyperpartisan news outlets as well. Because of social media, people have a new ability to participate in and shape this broader information system, and yet that brings a lot of responsibilities that I don’t think we’ve yet recognized for ourselves—not just as information consumers, but as information participants.
WSJ: Is misinformation on social media a problem? If so, what is the problem?
PROF. STARBIRD: The problem of misinformation was relatively small back in 2009. In fact, we would say, “Oh, don’t worry too much about it. Most of the information we see is true and from people who are well-meaning.”
Over the last year, researchers have seen a massive acceleration of misinformation, starting with rumors and theories about Covid-19 and running right through the 2020 election. Communities forming around misinformation and disinformation are growing and pulling more people into them.
WSJ: When you talk about misinformation on social media, what does that really refer to?
PROF. STARBIRD: The terminology around misinformation and disinformation is still in flux. A lot of people use misinformation as an umbrella term that includes false information that is accidental as well as intentional.
In academia, the definition we use for misinformation is information that is false but not necessarily intentionally so, whereas disinformation is information that is false or misleading and intentional, with some political or financial objective behind it.
WSJ: What is the most prevalent type of misinformation on social media?
PROF. STARBIRD: In March, when Covid-19 hit, our research team started pouring all our resources into research on rumors around Covid-19—everything from the lockdown rumors to the home-remedy rumors to those friend-of-a-friend-type rumors.
In the spring and summer, we began to shift into more politicized rumors and misinformation that was misinterpreting the science, taking the uncertainty around the science and then misrepresenting that to tell certain kinds of stories that map to political objectives.
The antivaccine movement then became more central in the conversation. Some of these rumors started to become conspiracy theories and started to map to older conspiracy theories—for example, about rich philanthropists who are trying to control the world through microchips.
We started seeing those conspiracy theories take root and communities organized around them begin to connect. Those networks came together in ways you might not have expected and were spreading Covid-related misinformation and disinformation.
Then, through the summer, Covid-19 ran straight into the election, and we began to see misinformation become even more politicized and wrapped into particular political narratives by people trying to shape political outcomes.
In August, we shifted gears. The Center for an Informed Public partnered with the Stanford Internet Observatory and other groups to form the Election Integrity Partnership to conduct real-time analysis of misinformation and disinformation, particularly around voting.
We started tracking claims about mail-in ballots and voter fraud. That work culminated in election week, and the weeks after, with, for example, false claims that Sharpie pens were used to disenfranchise voters, and assorted theories, based on bad statistics, pushing false claims of massive voter fraud.
WSJ: Who is behind the most viral misinformation and disinformation on social media?
PROF. STARBIRD: Antivaccine groups, conspiracy-theory groups, hyperpartisan news outlets and President Trump’s
WSJ: What is the most effective way for social-media users to reduce the spread of rumors and disinformation?
PROF. STARBIRD: Slow down, tune in to your emotions, and recognize your own responsibility. It’s so easy to say and so hard to do.
One of the most important things to learn is that misinformation and especially disinformation don’t just affect us cognitively, they affect us emotionally. They often trigger an emotional response. They can make us outraged or angry, and self-righteous. And this can activate us to do something, like press the share button. Tuning in to our emotions, to how information makes us feel, can help us recognize when someone might be trying to manipulate us into becoming part of their disinformation campaign.
WSJ: Can we combat misinformation and disinformation on social media with facts?
PROF. STARBIRD: When we see misinformation spreading online, it’s not necessarily about informing someone. It’s about demonstrating your identity. It’s about building social connections with others who might share the same kinds of worldviews. So trying to combat that with facts and sending people to Snopes [a fact-checking site] doesn’t really work. People will say, “Oh, the fact-checkers don’t share my values, so they’re not someone I can trust.”
WSJ: Should platforms take measures to reduce the amount of misinformation and/or disinformation?
PROF. STARBIRD: Yes. The platforms were doing almost nothing about misinformation prior to around eight months ago. Now they have put policies in place around medical misinformation and Covid-19, and some policies around election disinformation, most related to voting procedures and election results as opposed to misinformation about policy.
While we can commend some platforms for the actions they’ve taken and the policies they’ve put in place over the last year, there’s still work to be done.
WSJ: What does success look like?
PROF. STARBIRD: That we, as people who are using these platforms, as participants in the public square, still have the ability to share a variety of ideas. We can talk about differences of opinion on policy, and about different ideologies and different religions and different kinds of things. We can still have those conversations, and yet those conversations become more anchored in some shared sense of reality, a shared sense that, at the end of the day, there are certain facts, certain things that happened in the world. Success looks like our shared realities coming back, growing closer together.
Ms. Kranhold is a writer in California. She can be reached at [email protected]
Copyright ©2020 Dow Jones & Company, Inc. All Rights Reserved.