Why We’re Losing the Misinformation War

With Claire Wardle - Co-founder of the Information Futures Lab at Brown University


Episode description:

Eight years ago, Brexit and the US Presidential election showed how misinformation enables real-world damage. Since then, researchers, content moderators, regulators, journalists and others have sprung into action to counter misinformation, and yet misinformation pollution is even worse. Why?

Claire Wardle has some ideas. She's been in the fight since the beginning. In 2015, she founded the pioneering research and training organization First Draft News. She has led teams working on misinformation and verification at the BBC, Columbia Journalism School, and the UN, among others. She's now the co-founder of the Information Futures Lab at Brown University.

Claire and Eric discuss the backlash against content moderation; the perverse incentives that undermine collaboration in countering misinformation; journalists' role in the rising mistrust of media; artificial intelligence and falsehood; and everyone's personal responsibility for standing up for truth.

 

Topics

  • The Role of Information in Public Health
  • Encouraging Collaboration and Cross-Disciplinary Work
  • Community-Centered Approach to Addressing Misinformation
  • The Role of Media in Information Pollution
  • Journalism’s Responsibility and Trust Decline
  • Misinformation in Officialdom: Florida Surgeon General
  • Undermining of Expertise and Trust in Science
  • Individual Responsibility and Media Literacy
  • The Need for Regulation and Oversight
  • The Challenges of AI and Content Moderation
  • The Role of Courts in Addressing Social Media Harms
  • Hope for Regulation and Oversight
  • The Importance of Curating Newsfeeds and Avoiding Information Bubbles

 

Transcript

Eric Schurenberg (00:01.454)
Claire Wardle, welcome to In Reality.

Claire Wardle (00:04.718)
Thanks for having me.

Eric Schurenberg (00:06.896)
Claire, tell us about the work you’re doing at the Information Futures Lab. And why is it part of Brown University’s public health school?

Claire Wardle (00:16.354)
Great question, because I think some people find this quite strange. 2016 was a moment when, as we know, we had an election in the Philippines, with Duterte, that people weren't really expecting. And then we had Brexit, and then we had the unexpected Trump election. And of course, everybody became very concerned all of a sudden about misinformation, but it was very much in the context of democracy and elections. And so lots of university labs have really been housed in political science departments. That's the way it's been framed. Of course…

…the pandemic showed us that misinformation does not only affect elections. And so I was brought to the Brown University School of Public Health because the Dean, Ashish Jha, basically said, you know, we, and by we I mean public health, did not take communication and information seriously enough. We felt that our challenge was to get all the vaccines to people, and we never considered that they wouldn't want to take the vaccines. It was just, can we invent and then deploy?

And so there was a big learning curve there. So we are here in a school of public health, but we don't just do health, right? Because what we know is politics and health and climate have all become interwoven. And certainly those who are trying to discredit health information are also trying to discredit election information and climate information. So we're really trying to say that the silos by which universities are designed are not the way people themselves think about their lives, right? If we continue in that vein, we're actually doing ourselves a disservice.

Eric Schurenberg (01:49.55)
Okay, I guess you're right. There's a kind of chaos or disinformation industrial complex in which conspiracy theories just kind of blend across borders. Which is an interesting point, because Claire, you have been involved in cleaning up the information environment for a long time, since the beginning, it seems, when the problem began to be recognized. And…

From the beginning, you've been concerned about the fragmentation and siloing of the many people who are working on countering misinformation. Whereas, as we just said, it seems like conspiracy theorists all kind of like to work together, because they share a particular suspicious view of the world. But let's talk about why the people on our side of the controversy are so fragmented. Why is it? What are the forces that… divide us, even though we're all working on the same problem?

Claire Wardle (02:50.734)
I mean, I ultimately think we have to look at incentive structures, right? So we published a paper in November where we did a review of all of the different studies around COVID health misinformation. And we found we couldn't actually say what works, because every research team was using different definitions. They were using different stimuli. They were measuring different things. And that's because if you're an academic, you're trying to show that you're doing something new. If you're a nonprofit…

…do you want to collaborate with another nonprofit? Well, you probably do, but you actually want to show the funders that your nonprofit is the best nonprofit and deserves all the money, right? If you're a university department, you want to showcase that you are the department that should be getting all of the funds, because you and your discipline are the best. So it's all about incentive structures, unfortunately. And until we really tackle that and say to funders, listen, you know, the way that we're set up against each other is not helping the problem, or the way that universities are…

…you know, designed, how they decide whether or not somebody gets tenure. All of these different things, unfortunately, around wicked problems like this one, or homelessness, or any of the other number of wicked problems we have, it's not helped by the way that our systems are set up.

Eric Schurenberg (04:01.744)
Hmm, hmm, hmm. Well, that is a good segue to a lot of the work of the Information Futures Lab, which is about working cross-discipline and encouraging collaboration. Talk a little bit about some of the projects that you're funding and supporting right now that you find encouraging.

Claire Wardle (04:23.118)
Yeah, I mean, one amazing thing about being in a school of public health is that everything a school of public health does really is rooted in community. Public health is about population health, and so that's about working with communities. So when we arrived and we started the lab, I was very quickly schooled in, like, that's the way that we do things here. And it was very helpful for me, I think, after over 10 years of focusing on the content, like…

…what do we do with the content? Do we downrank it? Do we flag it? Do we do media literacy around the content? And it really made me think about, well, how do we flip this and think about what it is that communities need, and are we serving those communities? So a lot of the work at our lab is saying, okay, yes, there's definitely conspiracy content out there. No doubt. There are definitely people who believe pretty outrageous claims, right? Harmful claims.

But our position is that the number of people who actually truly believe that kind of content is relatively small. And it doesn't mean that that isn't a problem, but actually the kind of things we need to do to bring people back out of a rabbit hole are a form of de-radicalization. And our perspective is, well, let's take all of the people who are saying things like, hey, I've heard about this thing, myocarditis, right? I've got a nine-year-old kid. Should I be worried about that?

And they go to the CDC, and the CDC is not giving a straight answer, because the CDC is like, we need to wait for more evidence, which is understandable. But in the absence of good information from the CDC, the conspiracy theorists swoop in and say, I'll tell you about myocarditis. Other people with valid questions say, listen, just tell me about fertility. What do we know? What do we actually know? Well, let's not say anything right now because we don't know. You know, all of the ways that we, and by we I mean public health departments, newsrooms, scientists, universities…

…we didn't communicate effectively during the pandemic. We communicated like it was 1996. We used press releases. We said no comment. We haven't updated our communication practices for this moment. But you know who's really good at using our contemporary networked information ecosystem? Disinformation actors, anti-vaxxers, stop-the-steal communities. They understand that you get people to feel heard. You give them a task. You make them feel like they're participating in something. So the…

Eric Schurenberg (06:19.76)
Mm-hmm.

Claire Wardle (06:43.256)
…vast irony in the work that we're doing now is we're not playing whack-a-mole with the bad stuff. We're saying, how can we, those of us who do have some element of impact on the information ecosystem, be better? Right? I think it's easier to blame the Russian trolls and the teenage boys in their basements, because we don't want to look at the way that we are, I think, failing communities. I don't think that we are getting information to people where they are, in languages that they speak, in ways that are culturally relevant, in formats that people trust…

Eric Schurenberg (07:04.72)
Hmm.

Claire Wardle (07:12.558)
…quickly enough. We’re just not. So I think that’s a lot of the work that we’re doing at the lab.

Eric Schurenberg (07:14.736)
Uh huh, uh huh.

We talk a lot about media here on In Reality, and your work at First Draft addressed a lot of the media, and your background is in journalism. So let's talk about the media and their role in the pollution of the information ecosystem. What are journalists doing wrong? To what extent are they responsible, as an industry, as a profession, for the decline in trust, and what should they be doing differently?

Claire Wardle (07:48.43)
So I got into this space around 2007, 2008, because the BBC asked me, how do we know what's true? We're getting more content from the audience, and we don't know what's true. And I ended up leaving my academic job to go and work with the BBC to develop a training course for all journalists on verification. Right? We weren't even talking about misinformation at that time. It was, how do we prevent ourselves from being hoaxed? And I would speak to journalists who would say, oh, you know what, this is tricky for us. We've only ever really dealt with the stuff that was true.

Eric Schurenberg (08:10.256)
Mm-hmm.

Claire Wardle (08:17.934)
If something's false, we never had to report it, right? That wasn't what journalism was about. So now we're being told, post-2016, do fact checks, report on the falsehoods, report on the people who are pushing the falsehoods. And journalists say, this is not natural for us. Because the truth is, and you know this, Eric, the major paradigm of journalism is this idea that sunlight is a disinfectant, right? If there's a problem, if there's corruption, by pointing a light at it, you are helping the problem go away…

…But in the last 10 years that has been weaponized by bad actors, many of whom do not have an audience. Right? So they're like, okay, I can post on Telegram or 4chan, but I don't have very many followers. So what do I need? I need a megaphone. Who's got a megaphone? Journalists. Okay. So how do I get journalists to talk about these issues? And the number one thing to do is to fool the journalists into reporting something that's false. But many journalists are very good at verification. So they don't report it as truth…

…but they report it, and they say, I'm going to dig into this and I'm going to fact-check it. But what we know from psychology is that the mere act of fact-checking it means that you are giving it oxygen, you're giving it some light. And that, unfortunately, is what the bad actors are looking for. And then you layer that reality onto the fact that many newsrooms are really struggling financially, and this kind of content drives eyeballs. People might not remember this, but in 2012…

…there was a Canadian university class where the final assignment was to create a fake video. And these students used computer-generated techniques to create a video of what looked like an eagle stealing a baby in a park. And it went viral. I mean, these kids got an A-plus; it got 40 million views or something crazy. But I remember the Guardian running a fact check of the eagle video, where they broke it down frame by frame. I remember speaking to somebody there and saying…

…don't you feel like by doing that, you're just kind of highlighting the bad stuff? And he was like, hey, that debunk video got twice the traffic, because the Guardian had run it as fake. Right? And that was 2012. And I have never forgotten that conversation, because what he said was true. The process of debunking often gets you more eyeballs, right? Especially if you do it well and you give it a sensational headline, like, did you see [insert name of viral video]? We'll tell you why you were wrong.

Eric Schurenberg (10:30.128)
Yes.

Claire Wardle (10:39.726)
And so I think we have to recognize that there are those dynamics around more sunlight, but there are also the dynamics of the clickbait headline that reinforces the myth while also driving traffic to newsrooms that are struggling.

Eric Schurenberg (10:52.016)
Yeah, that is interesting. Yes, there is inevitably, in the misinformation campaign cycle that Joan Donovan talks about, this point where the campaign leaps from, you know, the small dark corner of the internet to public view, usually with the help of a useful idiot whose motivations may be innocent, like the journalists you describe, or less innocent, as in the case of…

…Tucker Carlson, just to pick a random name. It is interesting. Well, let me bring up an example of where misinformation that had been confined to dark corners of the internet leapt not only into public discourse, but into officialdom. And that is the recent declarations by the Florida Surgeon General about how to treat measles in a Florida school,

in which he said, in defiance of standard medical practice, that parents whose kids were infected could use their own judgment about whether to send them back to school or not, rather than quarantining them. The Florida Surgeon General, Joseph Ladapo, had been associated with other kinds of non-standard medical ideas for a while, but this has kind of gone viral in the sense that everyone is talking about it.

How do you feel about it? And how alarmed are you by this leap into the ranks of the Florida administration?

Claire Wardle (12:31.576)
It's really depressing. And actually, the former Florida Surgeon General, Scott Rivkees, is here at the school. And talking to him, you can imagine how he feels about his office, you know, working in a way that's completely against medical advice. I think what's so upsetting for me is to see in this moment the narrative of freedom essentially being drawn on, this idea that as a parent, you know best,

Eric Schurenberg (12:38.416)
Ah, ah.

Claire Wardle (13:00.078)
this idea that you can make your own decisions. I mean, the way that that has been weaponized since the pandemic. There were decisions made at the time around school closures and all sorts of things. Of course, with a post-mortem, we'd all say, you know what, we probably acted a little bit too quickly, or not enough evidence was in, but we were doing what we thought was right. But there's been no recognition of the extraordinary circumstances we lived through four years ago and the decisions that we made.

And instead, any tiny misstep, or even large misstep, has been seen through the eyes of, well, people were trying to cause us harm. And so this idea of, what do these public health professionals know? You know best, you're a parent. I mean, this extraordinary undermining of expertise that comes from the scientific method, I don't know where this takes us. And my deep fear here is that we do end up seeing some very serious consequences.

And yet we still don't learn from this, right? I remember in the early days, when people were denying COVID, I was like, this breaks my heart, but when people lose friends and family, they will realize that COVID is real. I couldn't in my mind imagine the way that people would do these kinds of somersaults with their brains to say, oh no, Bob died with COVID, not of COVID. I couldn't imagine that we would at this point have all these people, some people who lost…

…30 to 50 friends and family, who still do not believe that COVID was a thing. So my fear here with the measles outbreaks is that this is the start of something really serious. And we're seeing as well in my own country, the UK, a disease that was eradicated, and now we're back in a situation where, you know, you watch mothers on TikTok saying, if you love your children, don't send them to school. And you say, how on earth has that become as valid a form of information as…

…well, you can't even say the Florida Department of Health, because actually, if you go looking for it, they're putting out none of that information officially. So we're in a very troubling time.

Eric Schurenberg (15:00.912)
Yes. Yes.

What does a profession do about that?

Claire Wardle (15:07.79)
I mean, the profession has to recognise that the loss of trust over the last four years has been so consequential. We need to repair it, and we need to recognise that we're not going to repair it in a couple of years. This is not nibbling around the edges. This needs to be a whole profession getting back out into the community, apologising, listening, recognising the harms that were caused, explaining why decisions were made, but with a recognition that perhaps it wasn't the right decision.

And to say that if we do not as a society understand the importance of science, and if we do not understand the consequences of these kinds of individual decisions, then where does that take us? Yes, we can all make individual decisions about certain things in our lives, but there are other things where this has to be from a community perspective. We have to understand where community comes in, and on something like contagious disease and vaccines, we have to see this through a community lens.

Eric Schurenberg (15:56.142)
Mm-hmm.

Eric Schurenberg (16:06.192)
Yes, yes, absolutely. I find the sort of takeover of the Florida public health department as distressing as you do, and part and parcel of a kind of ascension of disinformation throughout the world. Among people on our side of the debate, I feel a sense of exhaustion…

…and retreat. So, I mean, just to tick off a few things: Twitter, which used to at least cooperate in part with researchers on misinformation, has now become a safe harbor for trolling and hate speech, and content moderation teams have been cut at that organization and at many other social media platforms as well. Attorneys general in Texas and Florida have acted to…

…ensure that misinformation can't be stopped in their states. And then you have anti-vaxxers capturing public health departments, and a majority of Republicans believing the 2020 election was stolen. What is your evaluation of the information battlescape right now?

Claire Wardle (17:28.942)
The one thing I think we have to recognise is that we are living in extremely precarious times. So when we think about all the things you described, we're not seeing any of this in isolation. The things that people believe, the groups that they're siding with, the identity politics that are happening, this is partly because the world seems exceptionally wobbly, to put it mildly, right? So people are trying to cling on to a sense of clarity. And the truth is…

Eric Schurenberg (17:53.84)
Mm-hmm.

Claire Wardle (17:58.798)
…a lot of the uncertainty around science, particularly science, means that that feels uncomfortable for people. I think we also have to acknowledge that we're coming out of this extraordinary pandemic situation, and so we are overreacting, right? So I think what we're seeing with Twitter is part and parcel of that. During COVID, many, many people in my field, I think we did make some decisions that went too far.

I think we did try and label too much content, and we said things were misinformation that weren't. It was unsettled science. But I also think we have to have some recognition that there was a period of time when many people hadn't seen other people for months. We were all terrified. I mean, the decisions were being made by humans who were themselves living through this pandemic. But at the platforms, nonprofits and others, there were many people who were like, oh my goodness, we've got to label this speech. This speech is harmful.

We've got to do whatever we can to protect people. But then it's had this blowback. So now we have this idea that any form of content moderation is suppression, is censorship, which it is not, but I can see how that happened. Right? So we've had this push and pull, and I think the elastic band is springing back right now. And that's the period that we're living through. And then I think at some point we will come back to a kind of equilibrium around speech, and we will say…

Eric Schurenberg (19:13.84)
Mm-hmm.

Claire Wardle (19:21.87)
…the platforms do need to do content moderation, but we need more oversight, we need more transparency, we need all of those things. So yeah, I think we're very much living through a reaction to the horrors of the pandemic. Some of that was very human, and some of it involved content moderation, and we're now, I think, feeling the repercussions.

Eric Schurenberg (19:41.232)
Unfortunately, the elastic band has overstretched in the other direction, against content moderation, at a time of many consequential elections around the world, prominently the 2024 presidential election here. How distressing is it to you that content moderation is so weakened at this particular moment?

Claire Wardle (20:04.91)
So we've got the double whammy, right? We've got weakened content moderation, and then we've got essentially new, faster versions of Ferraris being put on the road every single week through AI, right? So you put those two things together. I mean, two weeks ago, when they launched Sora, the idea that you can give a sentence prompt and you will get a perfect 60-second video. I just kept thinking, well, what does that look like when it's a video of a ballot box being stuffed, or a video of somebody voting who is made to look like an illegal immigrant? I mean…

Eric Schurenberg (20:21.84)
Hmm.

Claire Wardle (20:33.614)
…whatever, you need two minutes with a whiteboard and you can imagine all these things. And I thought, it's just out there on Twitter. And we're like, ooh, no watermarking, no nothing, when we've known full well that this was coming. You know, I drew the analogy with the fast car, but it does feel like we're putting these insanely fast cars on the road and saying, oh, nobody needs to wear seat belts, no need to follow the road signs. I mean, we haven't done anything to say, you know, these elements are going to cause real problems. So…

On one hand, I think there's a lot of catastrophizing about, oh my God, AI is going to ruin all of these elections. I don't think it's just that. I think it's a mixture of people's fears about AI, and the fact that they are not going to trust anything, plus the lack of content moderation. It's a mixture of all of these things coming at a time when we've got, you know, over 50 elections around the world. It's just terrible timing that these things would happen at the same time.

Eric Schurenberg (21:15.76)
Mm-hmm.

Eric Schurenberg (21:24.528)
In many of your public comments, you've stressed individuals' responsibility for not contributing to information pollution and for developing a healthy skepticism about information. Most recently, we had Peter Adams, the research director of the News Literacy Project, on the podcast. I respect their work and cheer it on, but it is focused on schoolchildren. So it's going to be a while before this…

…hopefully skeptical new generation of news consumers hits the market and hits the ballot box. Is there anything that can be done to increase media literacy and judgment about news in the adult population, the voting population?

Claire Wardle (22:10.946)
Yeah, I mean, we do need, there's a phrase, cradle-to-grave literacy. Because there's the eight-year-old who gets access to a smartphone and actually very quickly figures out how to edit a video. But there's also my mom, you know, at the other end of the spectrum, who has only ever lived with gatekeepers, and so can't really imagine that anything online would not be true. And then there's everything in between.

Eric Schurenberg (22:33.166)
Mm-hmm.

Claire Wardle (22:35.598)
So I think it is about us saying everybody needs a different set of skills. And that's not just how do you tell whether something's true or not. It's, do you know how algorithms work, right? Do you know that because you've watched a bit of this kind of material before, you're getting served even more of it? Do you know how that works, or do you think that you see everything? Then there's, do you understand how your data is being collected and how ads are being targeted to you? Do you understand outrage, and how…

…you're seeing something because lots of other people have liked it, and therefore you're more likely to see it because it's full of emotion and sensationalism? So there are all of those elements of being a digital citizen in 2024 that we need everybody to understand, as well as, how do you figure out whether or not this video is AI-generated? We need all of those things. But nobody wants to invest in it, right? I hear of governments spending millions of dollars building social listening dashboards. But then when you ask, what's your budget…

…for media literacy, it's a fifth of that, if that, because, exactly to your point, it's going to take too long. Well, if we had started and really invested in this 10 years ago, we wouldn't be out of the woods, but we would be further along. You know, Peter's organization does great work, but like you say, it's underfunded and really just focused on news literacy, and only really funded for, you know, middle school and high school kids. So yeah, we need far, far more investment in this.

Eric Schurenberg (23:40.656)
Yes.

Claire Wardle (24:01.388)
But nobody thinks literacy education is sexy. So for funders, it's not at the top of their list.

Eric Schurenberg (24:06.864)
Education is not sexy. Maybe regulation is sexy. I know you've done quite a bit of thinking, and of convening people, about the regulations that would help. And so far, from these shores, it appears that the most effective work in regulation, the Digital Services Act and the Digital Markets Act, is happening in Europe. What do those regulations do, and how much hope do you hold out for them?

And how much hope do you hold out for their making their way to the U.S. in any kind of effective regulation?

Claire Wardle (24:41.614)
So I'm glad we're having the conversation about regulation, because to me it is critical. It's important because sometimes people hear regulation and they immediately say, oh my goodness, government's going to decide what I see or not. That is not the kind of regulation we need. But what we do need is far more oversight of what these platforms are doing. And we're already seeing it with the Digital Services Act. I mean, a very basic thing was demanding that the platforms say how many content moderators they have in different languages.

And you saw with Twitter, it was something like thousands in English, 56 in Arabic, 20 in Portuguese. And you realize very quickly how lopsided all of this is. So we need that. And I think the Digital Services Act is going to force platforms to be much more transparent, but also to prepare much more. When Ukraine happened, all the platforms were on the back foot about, well, what are our policies? Can you say hateful content about Russia? I don't know.

Meta made a decision and changed its mind. I mean, that's the kind of thing they should have figured out ahead of time; we should have prepared for what a conflict in Europe involving Putin looks like, you know. So I think that's a good thing. I don't think we're going to see anything at the federal level in the US, for all the reasons that we know: Congress is just completely dysfunctional right now. I do think there are interesting things happening at the state level. And in fact, when we consider deepfakes, as we know…

…deepfakes and their impact on the election are important. But the number one problem with deepfakes right now is pornography and women's images being used. That's the 98% problem, right? And right now, we don't have good jurisdictional regulation around that. If you call the police, they're like, where did this crime take place? On the internet, you say. So we're seeing more states moving on AI-generated content, but it still seems slow in comparison to the speed at which this technology is moving. So we also need good regulation that…

Eric Schurenberg (26:13.552)
Yes.

Claire Wardle (26:35.278)
…will still be relevant irrespective of where we are in two years' time, when we're wearing contact lenses, you know, who knows. But that's part of the regulatory challenge: building policies that will make sense irrespective of the technology. So we do need more oversight. We do need powers, not just oversight. You know, the EU does have the power to go to Silicon Valley and demand certain types of data, and that's a good thing. But ultimately, the platforms still have very, very smart lawyers who can run rings around…

…really lovely European bureaucrats. And ultimately, there's a lot of money at stake here for these platforms. I mean, at the moment there's lots of discussion about kids and online harms. You know, Zuckerberg stood up in Congress last month, turned around and apologized. But that same day, an outlet, I can't remember which, reported the leaked fact that Nick Clegg, two years previously, had asked for just 45 more staff to work on teenage mental health on Instagram, and it got turned down by Mark Zuckerberg. For all of his apologies,

Eric Schurenberg (27:14.768)
Mm-hmm.

Claire Wardle (27:33.934)
those 45 staff were too expensive. And that's what he said in the memo: this is not something that we can justify because of costs. When we've got these companies where every single quarter it's not just make a profit, it's make more profit than the quarter before, these decisions are never going to come down to anything other than, how can I make more money? That's the way it's set up, and shareholders demand it.

Eric Schurenberg (27:57.404)
Just recently, the Supreme Court heard cases from Texas and Florida about what social media platforms can and can't do. Do you have any expectation that this is the way the US will settle social media harms, through the court system? And what's your expectation for how those court cases will turn out?

Claire Wardle (28:24.142)
I mean, I don't think I have a good answer to this. It seems kind of extraordinary to me that we're in this moment of individual court cases from individual states, when these companies are global, right? And like, Texas is going to say, oh, I've got an idea, and then the courts decide. I mean, this to me is partly, I think, because the platforms, even though they would say things publicly and write op-eds like, oh, we want to be regulated…

…they themselves, much earlier on, should have been doing more to say, how can we prevent harms, and how can we think about this and bring the public along with us? Really think and work with different types of stakeholders, whether civil rights groups, human rights groups, business leaders. You know, I think they just didn't think about it themselves. So the idea that this gets decided in courts in this moment, when…

…how long does that stand? How does it work? How does it get actioned? I have to say, I haven't been following those court cases as much, partly because I'm like, really, is this the thing that changes the way a person in Myanmar accesses content? It just seems very strange to me. So yeah, I haven't been watching them as closely, partly because of the insanity of watching Brussels, and then watching what's happening in Texas, and then watching it go to the Supreme Court. It just seems an extraordinary way to respond to these platforms.

Eric Schurenberg (29:48.048)
Hmm. Yes. Quite, quite insufficient. I sometimes feel like we are losing the misinformation war. We, meaning truth, or reasonable approximations of the truth, or evidence-based reality, are losing the misinformation war. I wonder, Claire, if you agree with that, or even if you do, whether there are positive notes, optimistic reasons for us to carry forward from this conversation, that we can point to and that give us at least a sliver of hope.

Claire Wardle (30:29.454)
So I definitely think the next year or so is going to be extremely important in the history of the internet. One of the big fears around generative AI, and we've already heard people say this, is that probably by the end of this year, 25% of links on Google will be AI-generated, basically spammy, slimy sites. We're already seeing, of course, real fears about how you can tell what's true when it comes to video footage or photography. We're seeing people…

Eric Schurenberg (30:47.312)
Mm.

Claire Wardle (30:57.41)
…respond to graphic content from the Middle East by immediately saying, oh, that's so awful, that must be a deepfake. But there are a couple of interesting projects. One's called Project Origin, which is being pushed by the New York Times and the BBC, and which is about creating, for want of a better word, a watermark, but like on the blockchain, so you can't just Photoshop it out. That means that if you saw something and it said it was from the New York Times, you could actually go back and say, yes, it was filmed with a New York Times camera or a BBC camera.

Eric Schurenberg (31:17.36)
Hmm.

Claire Wardle (31:27.118)
And so I think probably in the next two to three years, the rest of the internet will get much more like a cesspool. And the irony is, I think we'll go back to a time that will probably look a little bit more like 1996. We can wallow in the cesspool and we can scroll through TikTok and be entertained and not really care whether it's true or not. But then I think there'll be a smaller number of places you go when you absolutely need to know that it's true. And that will be…

…does it have that watermark technology? Do I know that this was actually from one of those institutions? So on one hand, I feel sad about that, because the joy of the internet is that it allowed us to hear from anybody, and we suddenly heard new stories and new sources that we hadn't heard before. But I don't know how we continue to have that kind of internet when we will also have so much that will just be…

…self-generated, that will just be spammy, slimy, horrible stuff. So it might be that we go back to a time when there are just fewer sources that we trust. But I think we're going to have to have a system like that. And that's better than having no trust in anything. So I'll take that.

Eric Schurenberg (32:30.672)
Hmm.

Eric Schurenberg (32:36.432)
Yes. Well, that could be hopeful for the media institutions, and other sort of evidence-based institutions, that they may, out of the chaos, emerge as the trusted places, because people finally have no other options. I do know that in my own media literacy training work, the question that I'm asked all the time is, where do you get your news? Or, can you give us a list of trustworthy places where we can get ours, because we have no way of knowing what is really true out there?

Claire, I might ask you just to follow up on that particular question. Part of the act of becoming a wise news consumer is getting out of your own information bubble and understanding, and I think this is a phrase that you have used yourself, that information gathering is not a rational act, it's an emotional act. And so you and I…

…who are more knowledgeable than many people about how the media work, are also subject to emotional decisions. How do you make sure that you are curating your newsfeed in a way that gets you to the closest approximation of the truth and doesn't consign you to a particular information bubble?

Claire Wardle (34:03.246)
So I like to do something with my students. I assigned them a reading, a really clever study from a couple of years ago where people who watch Fox News all day long were paid to watch CNN every day for 30 days. And the study found that people didn't change their views, but what they realized is that there was a whole set of stories that they had never heard, right?

Eric Schurenberg (34:17.454)
Mm-hmm.

Claire Wardle (34:27.566)
And the students go, really? And I say, yeah, let's go this morning to MSNBC, Fox, CNN, the New York Times. And almost every morning, it looks completely different. It's not just the same story framed slightly differently; it's the choices of which stories to cover. And so my feed is probably similar to yours. I still, for my sins, consume on Twitter, I don't post, and I'm on Threads and LinkedIn and other things. And I follow a real mixture of sources as well as people…

…journalists and others who have different opinions. I listen to different types of podcasts. And also, being in these spaces when we see major newsrooms making mistakes, the hospital bombing, the New York Times, I mean, that did such harm. I hear my students say, well, we'll never trust the New York Times again. I mean, that was just an extraordinary mistake. But by having a mix, when there is a mistake, you also very quickly hear people call each other out. So I'm like you, there is no one source.

Nobody is perfect. We live at a time when, I mean, the amount of news every day, you can't keep up; you need a wider diet to hear all of the things. So yeah, it feels time-consuming. And it's a constant battle, I think, for you too, of like, what am I missing? Am I getting too safe in my own bubble? I mean, even within any bubble, you realize there are factions. So yeah. When I grew up, I grew up in the UK, and my parents got The Guardian delivered, and we'd read the newspaper cover to cover…

…that seems like such a luxury. And I didn't realize at the time that I was in my bubble, right? So even though we talk about the bubble social media creates, we've always been in bubbles when it comes to information consumption.

Eric Schurenberg (36:06.192)
Hmm, hmm, hmm. Yes, we just didn't realize it. Claire, thank you so much for this conversation, and thank you for the work you're doing at the Information Futures Lab. That is a necessary thing, bringing people together across the institutional-knowledge and evidence- and reality-based community, and I really appreciate you.

Claire Wardle (36:27.054)
Thanks for the conversation.


Created & produced by: Podcast Partners / Published: Mar 26 2024

