Misinformation and Myth During the Pandemic

Tim Caulfield, Professor in the Faculty of Law and the School of Public Health, and Research Director of the Health Law Institute at the University of Alberta, speaks to Vardit Ravitsky about the spread of misinformation on social media during the pandemic, and how we can address this issue through action and policy changes.

 

 

EPISODE TRANSCRIPT

Vardit Ravitsky: Living through a pandemic was a first for all of us. It was also a first for the social media phenomenon. Never before had the world experienced a deadly pandemic during which practically anyone with a smartphone could discuss the situation, promote any theory at all, and even garner a substantial following. Tim Caulfield, a Professor in the Faculty of Law and the School of Public Health and Research Director of the Health Law Institute at the University of Alberta, followed the results with fascination.

 

Tim Caulfield: As you know Vardit, I do a lot of health law and science policy stuff. And increasingly it really has focused on how science, health, pseudoscience, how these things are represented in the public sphere. And I mean that broadly. I mean in the context of pop culture, in the scientific literature, on social media. And so our team, I have this fantastic interdisciplinary team, we do empirical work on that. We do policy work on that. And look, I'll tell you, a year ago, I had no idea how bad it was going to be during the pandemic. I had a sense it was going to be bad. And by that I mean the spread of misinformation. This is even worse than I anticipated. This is a huge issue. I mean, battling misinformation, I think, has become one of the defining challenges of our time. Now, that sounds like hyperbole, but I really don't think it is. I really don't think it is. If you look at the impact that misinformation has had, not just in the context of politics, and we all know how that's played out, not just in the context of the marketing of misinformation, but look at the harm it has done over the past year: deaths, hospitalizations, skewed health and science policy, and an increasingly chaotic information environment. So this really is, I think, one of the defining challenges of our time.

 

Vardit Ravitsky: I couldn't agree more at this point in time. Misinformation kills. It's not just an undesirable social phenomenon. It kills. So to kick off our conversation about how misinformation is spread and what you've learned about how to fight it, I want to play you an excerpt.

EXCERPT from "President Trump says he is taking hydroxychloroquine" (YouTube)

A lot of good things have come out about the hydroxy. A lot of good things have come out. You’d be surprised at how many people are taking it, especially the frontline workers, before you catch it. The frontline workers, many, many are taking it. I happen to be taking it. I happen to be taking it. I’m taking it. Hydroxychloroquine. Right now, yeah. Couple of weeks ago, I started taking it. Because I think it’s good, I’ve heard a lot of good stories.

 

Vardit Ravitsky: How does that exemplify for you what misinformation can do and how it is spread? 

 

Tim Caulfield: Just listening to that, it is a trigger. I can't believe it's already been a year since that kind of nonsense was spreading. But it is a fantastic example of the impact of misinformation and also how misinformation can spread.

 

We have to remember, all this noise about hydroxychloroquine, what I call the hydroxychloroquine debacle, started with a small preprint study in France. So you have that study, and you have comments from Donald Trump and a few other prominent individuals like Elon Musk. And the debacle is born. And because of his statement, we saw an incredible spike in interest in hydroxychloroquine, not just among the public, but we saw prescriptions go up two thousand percent. So this had a real damaging impact. Number one, it created the misperception that there was an effective drug out there to combat COVID. Not true. Number two, it resulted in a shortage of this drug for individuals that actually needed it, that had a clinical indication. And then another interesting thing happens, Vardit. Belief in hydroxychloroquine takes on an ideological spin. Who would have ever guessed that a pharmaceutical product, hydroxychloroquine, could become an ideological flag? So if you are of a particular ideological leaning, this is one of the cluster of beliefs that you're supposed to adopt: hydroxychloroquine is effective. And, you know, the amazing thing is, despite the fact that we now have, I would say, a really robust body of evidence, with clinical studies and observational studies, this does not work, right? Hydroxychloroquine does not work in the context of COVID. I think we can say that really definitively. Despite that, because of this ideological component to the story, there are many people that still believe hydroxychloroquine works.

 

Vardit Ravitsky: Still, still now?

 

Tim Caulfield: Vardit, I get hate mail about it. We could run a real-time experiment right now: I could go post something about hydroxychloroquine on my Twitter feed, and instantaneously there'd be people saying it works, and they'd point me to these studies that allegedly support them. Now, the other really important part of the hydroxychloroquine story, I think, is this. It demonstrates also how misinformation spreads. We know that this is largely, not entirely, but largely a social media phenomenon. And what happened, of course, is Donald Trump started talking about this, and people that follow him, or even people on the margins of that kind of ideological movement, started sharing this content. So it really highlights how prominent individuals can shape public discourse.

 

Vardit Ravitsky: So this is a great opportunity to ask you to tell us about your #ScienceUpFirst project.

 

Tim Caulfield: So just over a year ago, we got a couple of big research grants to explore how misinformation about COVID is being spread. I think there's enough research now, coming at it from different methodological directions, that we can say with some certainty that this is largely a social media phenomenon. And social media really does have an impact on the spread of misinformation and on people believing misinformation. So we thought it was essential, absolutely essential, to create really a movement. Vardit, we want this to be a movement that counters misinformation in those spaces. Right. We want to go where the misinformation resides. So Twitter, Facebook, Instagram, and soon on TikTok also. And we wanted to create good content. We wanted to share good content in those spaces. And in addition to that, we really wanted to, as I said, create this movement, #ScienceUpFirst, that allows people to kind of embrace this idea of accuracy and credibility. The other thing about this movement is we know this works. So building on the research that we've done and research other people have done, like Gordon Pennycook at the University of Regina, we want to make sure that the messaging is evidence based. So in other words, the strategies that we're using to counter misinformation are evidence based. The content is scientifically accurate and the messaging is based on the best available evidence. So we're trying to do all of that. And it's been really successful, and it hasn't been out long. We started late January and already have tens of millions of interactions. We have thousands of people joining the team. It has been fantastic.

 

Vardit Ravitsky: So give us an example of a good way and a bad way of fighting misinformation on social media. Are you, for example, in favor of complete transparency? Should we repackage things to make them easier to understand? What works?

 

Tim Caulfield: Debunking does work, and we have good empirical evidence to back it up. And I think we should think of it like a public health intervention, because you want to measure the impact on a population level. And so what does a good debunk look like? Number one, you want to use good, credible sources of information and refer to the body of evidence. And there is evidence to suggest that really does work. So you talk about the scientific consensus. There's work from the climate change area, there's work from GMOs, there's work from vaccine hesitancy that suggests doing that can have an impact. Number two, you want to highlight the rhetorical tricks that are used to push misinformation. So what do I mean by those two? Let me give you an example. Someone will say, “Tim, I hear that these vaccines can change your DNA,” or a better one maybe, “Tim, I hear that vaccines cause infertility,” and you can go, “Well, you know, there's actually no evidence to back that up. And here is the scientific consensus from these scientific organizations, these professional organizations. And by the way, the person that is pushing that misinformation is using an anecdote. They're using a testimonial. They're misrepresenting risk. They're relying on a conspiracy theory.” You put those two things together and you really can have an impact. The other thing I think it's really important to do is to be nice, to be empathetic, to be humble. It can be really hard to do that. Also, you should always aim at the general public. That's your audience, not those hardcore deniers. And this goes to that question you had about “What's the wrong way to do this?” Well, the wrong way, I think, is to get into a fight with a troll on your social media, to feed them. It's just a waste of time and psychic energy. You're just giving oxygen to this individual. Really think of the general public.
You can use absurd statements from conspiracy theorists and deniers and celebrities as a sort of pop culture moment to talk about what the real science says. But don't get pulled into that vortex. Really think about the general public, or perhaps a particular community. Don't waste your time on the denier.

 

Vardit Ravitsky: We'll talk about social media in a minute. But I want to ask you a personal question. You talked about hate mail, you talked about trolls. How do you cope with working in a domain that is so volatile, so explosive, that exposes even you personally, and your family, to so much public attention and hate?

 

Tim Caulfield: It is ridiculous, isn't it, that people are this polarized, that they put this much energy into hate, really? And I do get a ridiculous amount of hate mail. I got another death threat just days ago. You get involved in a lawsuit from an anti-vaxxer. It is exhausting. I don't think this is the experience that most people have. It is exhausting. But I think this highlights a really important point. And this is something that I'm thinking about doing more work on in the future. It's really important, I think, now to support individuals that are doing this work and who are fighting misinformation. The World Health Organization has asked more scientists and clinicians to step up and fight misinformation. Governments have done that, institutions have done that, and if we're going to make that request, we have to make sure those individuals have that support. So that means that universities, hospitals, whatever the institution is, they have to reward people, to incentivize them for doing this. So it's got to be part of their job. We have to give them the training if they want it. You and I coauthored a piece for the Royal Society of Canada on science communication. And that was one of our recommendations, right, that more individuals need to get out there. In fact, we even said that it's the responsibility of the scientific community, at some level, to counter misinformation. But we also say in that report that it's also the responsibility of these institutions to back up individuals who are doing this work.

 

Vardit Ravitsky: Tim, let's talk a little bit about social media, because you kept referring to its prominence in the spread of misinformation. What role do you think social media companies should play to address this issue? Are they doing enough? Are they meeting their ethical responsibility here? Who should be the referee? Are we dealing with a threat of censorship and threats to freedom of expression, or is this free-for-all reality just too dangerous and has to be controlled?

 

Tim Caulfield: Wow. This is a complex, complex topic. You know, at the beginning I said the spread of misinformation is one of the great challenges of our time. And I think what you just described is one of the policy challenges of our time: are social media platforms doing enough to battle misinformation? And I think the short answer is no. The good news is every platform now recognizes the dominant role that they play in the context of this issue. And so we are seeing more and more action by all of the social media platforms. Recently, you saw Facebook get very aggressive about misinformation in the context of vaccines. They need to do something, right? They absolutely need to do something, because this is where the misinformation is spreading. So then the next question is, are the strategies that they're utilizing effective? They're doing a broad brush stroke. They're sending out warnings, flagging misinformation, posting redirects. They're inviting you, Twitter does this, to read the article right before you post it. All of those things, I think, are good. And I think in the aggregate, if you look at the literature, they do seem to help. But we need to be careful. I think we need to do more research on this. Are there unintended consequences? For example, there was some interesting work, again done by my colleague Gordon Pennycook, where he found that flagging something as misinformation might, and we need more research on this, have the unintended consequence of making something that does not have a flag seem more accurate than it really is. So I think we need more empirical research on what kind of social media interventions actually work and what are the potential downsides of these interventions. I get asked a lot about these bans and de-platforming. I think short term they are a good thing.
And you get this interesting rights language that emerges around, you know, freedom of expression and censorship. But this is really not a good representation of what's going on. You don't have a human right to be on Twitter. I think people think that, you know; what's fascinating is the people making that claim: you're censoring me, freedom of expression. But I think that we do need, ideally, I would like to see, and this isn't going to happen, but sort of an independent oversight entity that was accountable to the public, monitoring the behavior of these social media platforms. And it would have to be international in scope. How would you ever do that? So I don't know what the answer is. I really don't know what the answer is, because there are sort of freedom of expression issues here. We're asking private companies to make decisions about what we see. And because it's private platforms that are involved, yes, this probably doesn't really trigger freedom of expression in the context of Charter issues. But these are big questions, and how we resolve them, I'm not sure.

 

Vardit Ravitsky: So you talked about the empirical research showing what an important role social media plays in general in the spread of misinformation. But you also mentioned those super spreaders, right? Celebrities, politicians, people with huge presence that, once they push something, it gets visibility out of all proportion. Some people have proposed to ban super spreaders from social media platforms. What do you think about that proposal?

 

Tim Caulfield: OK, so my heart says, yes, do it, please. So that's what my heart says. And I also think that there is some evidence to suggest it works, right? That it does work. It slows the spread of misinformation. We saw that with the de-platforming of Trump. It did slow the spread. So I think in the aggregate, if the goal is slowing the spread of misinformation, it probably works, especially for those who are the movable middle. What do I mean by that? Take people who are vaccine hesitant as an example. They're not all hardcore deniers, obviously; they're on a continuum. And the ones that we really want to try to target with our messaging are that movable middle, right? And I think when you de-platform people, the group that benefits the most is that movable middle, because they're the ones that aren't going to see the misinformation as much. Because we've de-platformed it, it's not going to wash over them in their daily life. And that's really who you're aiming at. So that takes me to the potential downside of de-platforming, because, again, I think we need empirical evidence to see if this is true. But I think it makes sense that there's a concern it's going to lead to further polarization, where these individuals are going to go to different platforms and it's going to create a really intense echo chamber. And those that are sort of more hardcore deniers are going to be even more in an echo chamber. And they're going to feel that the de-platforming plays to their conspiracy theory worldview, right? They may see the de-platforming itself as a conspiracy. So I think that there's that concern. But I still say in the aggregate it's probably beneficial, because it's very difficult to change the minds, as I said earlier, of those hardcore deniers, right? So if you think of fighting misinformation as a public health intervention, de-platforming is the right move.
But I do think we need to talk about what that means for long-term policy. Is this really the right way to deal with misinformation? But short term, in the middle of a pandemic, when you're trying to get people vaccinated, I think it makes sense.

 

Vardit Ravitsky: So interesting what you just said, that at least now we're all present in the same space and we talk to each other. And you've painted this really dark scenario of the two groups not even interacting because they have their own platforms. And you have to kind of peek into the other world to see what is being told there. Thank you so much for all your insights. This has been tremendously eye-opening and fascinating. Thank you, Tim.

 

Tim Caulfield: Thank you very much. And thank you for all the stuff that you do.