This is the first blog post by student science writer Mary Magnuson.
It’s easy for anyone monitoring the pandemic through months of 24-hour news cycles to pick up on false information or conspiracy theories.
We talked to University of Wisconsin–Madison experts to figure out why global events like the COVID-19 pandemic might give rise to misinformation and conspiracy theories on social media – and how to avoid sharing false information.
As the pandemic continues to affect people around the globe, conspiracy theories about the virus have spread through social media and the internet — most notably a 26-minute video called “Plandemic.”
In this video, a discredited scientist shares debunked conspiracies about how a group of elites used the virus and potential vaccines for profit. Experts deemed the information in the video demonstrably false, and sites like YouTube, Facebook and others worked to take down every iteration of the video.
But questions remain. How do conspiracies like this spread, especially in times of uncertainty, like a pandemic? And what can we do to stop them?
Two UW–Madison professors, Dietram Scheufele and Ajay Sethi, helped provide some answers. Scheufele works in the Department of Life Sciences Communication (LSC), where he studies public attitudes around science and science policy. Sethi works in the School of Medicine and Public Health, where he studies the spread of infectious diseases.
How does what you typically study inform your expertise during the pandemic?
Scheufele: A lot of our work with the scimep group here in LSC tries to figure out how we all make sense of complex emerging science that we — in most cases — know little about. COVID-19 is exactly that. Not only are most of us not experts in virology, epidemiology or public health, but the science on COVID-19 is very much in flux, with new findings constantly proving yesterday’s science wrong.
Sethi: I’m an infectious disease epidemiologist. My research focuses on factors associated with the transmission and natural history of infectious diseases, including HIV and healthcare associated infections. Although I have not previously studied coronaviruses, common methods used in infectious disease epidemiology can be applied to the study of most if not all pathogens.
How do I know if the information I’m reading about COVID-19 is accurate and trustworthy?
Scheufele: The fact that much of the science on COVID-19 is far from conclusive at this point doesn’t mean that there is not good expert advice to go by. The CDC (Centers for Disease Control and Prevention), WHO (World Health Organization) and many other organizations maintain websites devoted to COVID-19. Those include advice and best practices related to wearing masks, social distancing, whether it’s safe to get takeout, and so on.
In spite of the bad rap they sometimes get, social media are also a great tool for learning from some of the best experts on COVID-19. Journalists like Helen Branswell or Maryn McKenna (who have actually been both science writers in residence here at UW) have spent their careers writing on and researching infectious diseases and routinely share their work on Twitter. I follow them there not just for their own work, but also because they do a great job vetting and contextualizing the constant stream of information that’s coming our way on corona.
Sethi: First, it is important to recognize that there is a lot of new information about COVID-19 coming out all the time. New knowledge learned is subject to change as the science and study of COVID-19 advances. So, what we thought was true yesterday is not necessarily so tomorrow. That can make it challenging to know whether what you are reading about COVID-19 is accurate. It’s important to evaluate the source to be sure it is reputable and unbiased. Look for peer-reviewed information when possible. When reading information found on a website, I suggest evaluating the website for its credibility, and there are a number of checklists and tools available to do that.
What determines what information people are drawn to consuming and sharing?
Scheufele: That’s a complicated question. We live in a time that is very paradoxical when it comes to the information we all receive. On the one hand, the internet has made it easier than ever before to find the best information quickly, no matter where we are and with little effort. What would have required a trip to the Library of Congress even just 25 years ago, is now one click and a couple of swipes away on our smartphones. On the other hand, apps and algorithms have also made it easier than ever before to avoid any information we don’t want to see or that doesn’t fit our worldview.
Sethi: We are all susceptible to living in bubbles, and getting comfortable in our echo chambers. It can be human nature to surround ourselves with people and ideas that confirm what we believe to be true about the world, which in turn makes us feel good about ourselves and reinforces our worldview.
Are outbreaks like this especially ripe for conspiracy theories?
Scheufele: There’s little systematic evidence that we’re seeing more or fewer conspiracy theories about COVID-19 than we normally do. Of course, it seems like they’re everywhere, but we also need to realize that there is very little news other than COVID-19 right now, and we’re all spending a lot more time online and on our phones than we usually do. But loony ideas like the claim that the Gates Foundation is promoting vaccines for population control or economic gain have been around for years. COVID-19 has just given them new visibility.
But it’s also important to keep in mind that this is a time of almost unprecedented uncertainty and unpredictability for most of us. We have little control over the emergence of viruses like COVID-19. We don’t know what our future holds. And there is no good way out of the crisis that doesn’t require disruptions to our way of life. As a result, it is not particularly surprising that many of us are trying to find ways of making sense of this highly uncertain and deeply unpredictable situation. In the 1940s, social psychologists Fritz Heider and Marianne Simmel showed clips of animated geometric shapes to participants, only to find that many of them attributed human characteristics, motivations and intentions to what were randomly moving circles and triangles. That human tendency to attribute structure and meaning to fairly random sets of events is also what explains the intuitive appeal of movies like “Plandemic”: They give the appearance of meaning and convey a sense of control during a pandemic that likely emerged somewhat randomly and has left us with limited control over the spread of a deadly virus.
Sethi: Yes, and there are many examples in history. During times of uncertainty and fear, we can have feelings of losing control. Denialism can also be a reaction. To make sense of stressful situations that develop suddenly with no signs of going away, like the COVID-19 crisis, we may be drawn to explanations to help us feel better about the realities of what we are facing.
I am not a psychologist by any means, but I read research related to the psychology of adopting and perpetuating conspiracies to include in my course, Conspiracies in Public Health. I also find it is useful to read the literature to keep myself from adopting misinformed views.
Why does misinformation about the virus spread so quickly?
Scheufele: There’s little social science that suggests that misinformation about COVID-19 spreads any faster or slower than correct information. In fact, I think we need to be very careful about how we talk about misinformation.
Of course there are things that are clearly wrong. Neither snorting cocaine nor injecting bleach will cure or prevent corona. And they were debunked pretty quickly on both social and legacy media.
What makes things more complicated for science during the current pandemic is what I would call the corona Catch-22: In the public arena, we can get predictive modeling right or mitigation right, but not both. The more successful we are at mitigation, the more inaccurate initial models will appear in hindsight. In other words, looking back, people will think that initial models of how COVID-19 would spread got it wrong, precisely because those models encouraged the right policies that helped us avoid worst-case scenarios.
The second problem is that there is little settled science on COVID-19. Much of the scientific work on the virus, its spread, and the effectiveness of different interventions is in flux, to say the least. New science constantly proves previous findings wrong. And that’s the way science is supposed to work. It’s supposed to self-correct and iterate toward the best possible explanations. During normal times, that’s just fine. Science plays out over long periods of time, with policy following in due course. For COVID-19, science and policy are emerging at the same time and with breakneck speed. This raises two problems: (a) The uncertainties surrounding science and policy end up overlapping in public perception, and science gets blamed for the inevitable missteps of public policy. (b) Battling misinformation on COVID-19 with science that itself might turn out to be wrong is not a winning proposition for the scientific community. We wrote about that here.
What should I do if someone I know shares or promotes misinformation or a conspiracy theory about the virus?
Scheufele: Debunking is a double-edged sword. It typically requires repeating and — especially on social media — giving additional visibility to misinformation. Some research suggests that this can reinforce rather than debunk inaccurate beliefs or even conspiracy thinking. This doesn’t mean that there is no value in pointing your friends or social media contacts to Snopes.com or any credible resource that debunks misinformation. The idea is to do it in a way that’s constructive, and to keep in mind that we’ve all shared misinformation at some point, even if we don’t remember it.
But all of that is based on the assumption that we’re sharing misinformation because we cannot tell that it’s fake. Sometimes that is true. But often, we share information without checking because it fits what we already believe. If I don’t like Trump, I am motivated to find information that makes him look bad. A tweet quoting President Trump as saying that “HUNDREDS of Governors” were calling him made the rounds on my social media feeds recently and was retweeted by many of my academic friends. It was fake, of course, and a three-second Google search would have shown that. So it’s not that people couldn’t tell it was fake. They didn’t care, because it so perfectly fit their expectations and prior attitudes toward Trump. One of our doctoral students and I wrote about many of the motivations that often make us believe misinformation in an open-access article in PNAS (Proceedings of the National Academy of Sciences).
Sethi: I think it depends on your comfort level and how well you know the person. Some people might choose to avoid confrontation, which is understandable. If their actions cause your blood pressure to go up, it would be best to calm yourself down before saying anything. I also think it’s important to re-visit why you disagree with what they shared just to make sure you have your own facts straight. Things are rarely clear-cut.
So, after all that, if you decide to engage with them, I think it starts with active listening. As an aside, for a while after college (a long time ago!), I was a volunteer crisis counselor. So, my own instincts were formed from that training and experience. I’m no expert, but again, I know listening is important. So is asking questions. And then listening some more. Understand where they are coming from. Identify shared interests and emotions. You may or may not choose to volunteer your own views on the subject. It depends if you are asked for them and if you have established trust with them. That can take time to build, maybe many conversations. Avoid launching into explanations or proving how knowledgeable you are. It causes people to stop listening.
Science is filled with uncertainty, while misinformation often promotes concrete “facts” and “solutions.” Is there a way responsible science communication can achieve both?
Sethi: Understand your audience each time and start by asking what people want to learn from you. Go from there. Always be honest about what you know and what you don’t know. Be consistent, and don’t overstate findings. Learn to communicate nuance artfully. Avoid “dumping” information on people.
What can governments or corporations do to halt misinformation or conspiracies? In an ideal society, what should their respective roles be to curb conspiracies?
Sethi: All institutions have to decide when the spread of misinformed opinions and conspiracies requires intervention. It’s important to respect people’s autonomy and rights to express themselves, but we should not tolerate the proverbial “shouting fire in a crowded theater.” I have my ideas as to where to draw the line and what institutions could do, but when I begin to apply them situation-to-situation, I realize it’s not an easy problem to solve.
What role does higher education play in creating citizens equipped to evaluate information? How will the pandemic inform your teaching going forward?
Scheufele: My colleague Dominique Brossard has written extensively on the idea of deference toward scientific authority. Why do we have faith in experts? What is it about science as a way of producing knowledge that makes us follow it more than other ways of knowing? Is it peer review? The scientific process? Her work shows nicely that our faith in scientific institutions is strongly related to K-12 and even K-16 schooling. In other words, education is partly about learning facts, but those facts change over time, especially for COVID-19. Instead, the power of education comes from building faith in science as our best way of knowing.
We actually talk about that in my large undergraduate lecture course in Science, Media and Society. It enrolls students from five or six different colleges at UW who major in genetics, politics, business, engineering, and communication, to name just a few. And COVID-19 already ended up being a large part of this past semester, even before we shifted to online teaching after spring break. How do we all make sense of this global pandemic? How can societies navigate very difficult trade-offs between economic considerations, public health, and individual rights as we’re trying to contain its spread? And what does it mean for Google to work with government and academia to track citizens’ cell phones to model and monitor new infections? LSC 251 is going to be offered again this summer and in the fall, and I am pretty sure that COVID-19 will be a permanent and probably growing part of what we’ll be talking about.
Sethi: Institutions of higher education are places where ideas and knowledge are learned and exchanged. It’s where “sifting and winnowing” occurs. It begins with teaching and reminding ourselves how to be objective, curious learners.
I began teaching Conspiracies in Public Health three years ago because I grew increasingly concerned about the unraveling of longstanding public health achievements and how previously innocuous topics suddenly became hot-button issues. Learning about popular and less popular conspiracies is not the focus of the course. I created the class so that students could explore the psycho-social basis for conspiracy thinking and develop or refine their skills in listening and talking to people with differing views on health and public health topics. Misinformation and conspiracies about COVID-19 provide opportunities for me to fortify the class with contemporary material and for students to draw connections between course content and what we are reading in the news every day.