You Are Not Thinking As Clearly As You Think

With David Robert Grimes - Trinity College's Author of 'Good Thinking'


Episode description:

Welcome to In Reality—the podcast about truth, disinformation, and the media with Eric Schurenberg, long-time journalist and media executive, now the founder of the Alliance for Trust in Media.

Here is the question of our information-saturated age: why, when people have access to more information than ever in history, do so many believe things that are demonstrably untrue? This is not just the gullible, not just “those people”—but also all of us who pride ourselves on thinking clearly.

Today’s guest has spent his career trying to answer that question. David Robert Grimes earned his PhD as a physicist, but switched over to public health and is now an assistant professor at Trinity College School of Medicine in Dublin Ireland. But it’s his parallel career—as a science journalist and author—that brings him to In Reality.

His book Good Thinking—published in the UK as The Irrational Ape—is one of the most readable guides to why human reasoning fails us. In 2014 he won the John Maddox Prize for standing up for science in the face of adversity. In Grimes’ case, that included death threats and campaigns to have him fired from his university post. He kept writing anyway.

In Grimes’ view the barrier is the cognitive architecture we all share—the confirmation biases, the motivated reasoning, the deep human need to protect our identity even at the cost of the truth. In this conversation we’ll dig into why people believe what they believe, what even the most respected journalism institutions get wrong, what AI means for the information ecosystem—and what we can do about it.

 

Transcript

 

Eric (00:01.218)
David Robert Grimes, welcome to In Reality.

DRG (00:04.565)
Eric, thank you very much for having me.

Eric (00:07.182)
You are a physicist and a public health expert, assistant professor at Trinity College in Dublin. And now you spend a lot of your professional life writing about why people believe things that aren’t true. That is all very worthy, but there are also some random elements to your background, so let’s clear those up first. You grew up between Dublin and Riyadh in Saudi Arabia. You were an actor and a musician before you became a scientist.

How do those pieces all fit together for you?

DRG (00:40.013)
I think chaotically is probably the best answer. But the thing about narratives is that they are things we put on our lives afterwards; we chart them retrospectively. Most things in life are random events, and we kind of follow where they go. I was born in Ireland, into a family where my father was the first to get an education, and he managed to get that education just when there were no jobs for engineers. So the Middle East is where we went.

DRG (01:05.645)
And as for my hobbies over there: I loved science, but I also loved, you know, attention. Hence why I’m talking now. So music and theater were big loves of mine. They still are. I still play an awful lot and I still act. I don’t do it for money anymore because I don’t have the time, but I do it as a passion, as a hobby.

And I consider myself a lapsed physicist. I did my primary degree, and arguably part of my PhD, in physics. But once I got interested in public health and cancer science and things like that, it kind of just takes you on a journey. So I think you’re always following your interests in life, and we put the narrative on afterwards. But I will admit straight off the bat that it’s entirely chaotic. It’s Brownian motion. I’m bouncing around like a billiard ball on a pool table. And that’s okay. I think most of us are.

Eric (01:51.983)
All right, good. Yeah, I’m sure I could trace the same kind of Brownian motion in my career too, as most people can. One of the things about your career as a writer and crusader for logic and information integrity is that you were awarded the Maddox Prize, which is given to people who have promoted scientific thinking despite the hostility they faced for doing so. And you received death threats, campaigns to have you fired. What did you do to trigger that kind of hostility?

DRG (02:32.268)
There’s a flippant answer and there’s an honest answer, and I’ll give you both. The flippant answer is that I seem to irritate people, though that’s not intentional. The deeper answer, and this is what got me very interested in what eventually became my first book, is that when I started doing science communication (it’s a strange term), when I started writing for newspapers about topics like vaccination or water fluoridation,

I was kind of naive. I was coming at it from the perspective that this was an information-deficit problem, that people didn’t have reliable information. And that’s part of the problem, absolutely. But there is a huge underlying psychological and ideological issue that at the time I didn’t appreciate enough. I very quickly came to appreciate it, and eventually got very interested in why that was. So I would write a piece saying, actually, these anti-vaccine protestors are misinformed because of X, Y, and Z. And then I would be subjected to this barrage of hate mail.

And I would naively take these as good-faith conversations, and then suddenly find myself in hotter and hotter water all the time. I was in my early-to-mid 20s then. And I very quickly realized that you have to appreciate the psychological aspect of this. You also have to recognize what is a good-faith engagement versus a bad-faith engagement, and my mistake was treating them all the same. That was very negative initially.

Eric (03:59.406)
Mm.

DRG (04:04.96)
The first time I got a real threat, I was about 26. There was an Irish debate on water fluoridation (we’re another country that fluoridates our water), and every few years, cyclically, you get some kind of protest movement reanimating the same misinformation from decades ago. So I did a piece on that, and they got very annoyed about it, and they started harassing my university, which was Oxford at the time. They started harassing me. They started just being unpleasant people.

And I remember thinking, this is really fascinating, because it shows you something: they’re not arguing with what I’m saying, they’re arguing with me. They’re trying to diminish me rather than the argument. And that’s when I started getting interested in the psychology and the ideology of why people hold certain beliefs. Because I think if you want to unpack the puzzle and have a society that’s more evidence-based, you’ve got to understand that we’re not always rational in how we react to things. And that was my trial-by-fire introduction to it.

Eric (05:03.438)
Well, great. Your book, which goes by the American title Good Thinking and a different title in the UK and Ireland, is, I would describe it, an argument or a plea for people to apply scientific thinking to their lives. And you make the point, which you’ve just made eloquently in conversation, that that kind of thinking doesn’t come naturally to people. Irrationality isn’t just an affliction of the gullible, or the low-IQ, or people who believe the things that I don’t believe. It’s something that we’re all prone to. So just in the broad sense, what is going on in our Homo sapiens brains, our primate brains, that makes us so susceptible to misbelief?

DRG (05:56.797)
I think the biggest thing is this. The European title of the book was The Irrational Ape, and obviously in America, where there are still debates about evolution in certain quarters, I can see why my publisher went for a less inflammatory title. But one of the things about it is that we assume rationality because we’re told it’s a virtue, and we have these weird models of it. It is a virtue, to an extent, but we forget that we’re emotive creatures and social animals first. We emote first and we think later. We react before we reflect. And that happens whether you’re the average Joe on the street or a professor in a university. It takes a hell of a lot of training to realize, actually,

you know, my intuitive, instinctive thinking, which has probably served us well on an evolutionary basis, doesn’t always apply to all situations. That’s one of the factors. I wrote a bit about that, and of course Daniel Kahneman wrote great stuff about it, so that’s well established. But the other thing I was really fascinated with is ideological drivers.

We often have a conclusion that fits our worldview, our partisan or emotional worldview. And what we tend to do is a form of confirmation bias, where we bend evidence to fit our existing narrative. That’s because we put our identities into our cognition. This is an idea called identity-protective cognition: the idea that we become our beliefs.

So if I say your idea is wrong, then on paper, objectively, that shouldn’t cause any great upset. If you come here and say, you know, I love this local sports team, and I go, yeah, but they’re not very good, look at their statistics. That shouldn’t cause any great passion. But it does, because people put their beliefs into their identity. And when your ideas become your identity, that’s a problem. Ideas should be changed

DRG (07:48.661)
promiscuously and regularly. Someone said ideas are like diapers: they should be changed often, and for the same reasons. There’s a level of truth in that. But because we become emotionally invested in our beliefs and ideas, we are resistant to changing them. Part of scientific thinking is overcoming that resistance, because you have to be able to say, I will jettison this based on evidence. But as I point out in the book, and I do a lot of my current research on research integrity, a lot of scientists can’t even do that.

They become obsessed with their pet theories and don’t realize that we should be agnostic about them and just go with the evidence. And that’s a very Enlightenment idea. It is not natural to humans.

Eric (08:29.571)
Yes.

DRG (08:29.78)
And you see it. As we’re speaking in 2026, we have seen in the last 10 years a resurgence of nativism and extreme political ideologies that aren’t necessarily evidence-shaped at all, but rely on people’s feelings, their perceptions of persecution, or their anger. And it becomes very easy to manipulate us when that is happening.

I’ve given a very non-committal answer there, but there are several different factors acting in concert. And some of it, by the way, is just the information deficit. Sometimes we simply don’t know. But it’s often more complicated than that.

Eric (09:08.046)
Yes, I suppose you could say that some conspiracy theories, like the 9/11 inside-job story, are created by a vacuum of knowledge. Or some of the early conspiracy theories that arose around COVID-19: again, people didn’t quite understand what was going on, and so, you know, conspiracies rush in to fill that vacuum.

On the other hand, you know, over the years, I think we’ve all seen that fact-checking, which was thought to be the antidote to misbelief, is not only overwhelmed by volume but also not very effective. There is another technique that some of the people in the field with you, like Sander van der Linden, would say is more effective: pre-bunking.

DRG (10:00.013)
Absolutely, and he’s right, he is.

Eric (10:04.856)
Tell me how that works and why it might be more effective than fact-checking.

DRG (10:08.588)
So, I would say these are all complementary. It’s about having different tools in your toolkit, and they all work. And if you want to take the disease analogy (actually, I’m working with Sander on something about this at the moment, so hopefully it’ll be in the public domain in the next while): if you look at misinformation, and I’m going to use that as an umbrella term (we can break down mis-, dis-, and malinformation a little more later on), information isn’t neutral. There’s a whole idea that information is neutral. It’s not.

Eric (10:12.238)
Mm-hmm.

DRG (10:36.722)
Information can infect us, misinformation particularly can affect us, and we can infect others with it. So it has direct analogies to pathogenic spread. I always think that things going viral on social media are really well named, because like their pathogenic namesake, that’s often exactly what’s happening. And this has a direct analogy to vaccination. If I know you’re going to be exposed to a pathogen that could cause you harm, or cause you to vector it further,

Eric (10:42.638)
Mm-hmm.

Eric (10:52.856)
Mm-hmm.

DRG (11:06.54)
well then, if I immunize you by saying, you’re going to see this claim, and it isn’t true, but you’re going to see it, then I’ve given you a form of mental immunity. That’s called pre-bunking. And the related opposite is when you’ve already been exposed and I’m trying to cure you. That’s what fact-checking is doing: you’ve already been exposed to the falsehood, and here is the fact-check, which helps debunk it. The idea of pre-bunking is that we get ahead of it if we know it’s coming down the road. If I worked for a pharma company, or even a research group, and we were coming up with a new vaccine,

you should already know that it’s going to be subjected to ideologically driven disinformation. So one of your strategies should be, okay, we have to get all our ducks in a row and get the information out there first, which is easier said than done. That’s one example. But these two strategies are complementary and you need both. Ideally, we immunize people: prevention is better than cure, so mental immunity is great. And sometimes that’s as simple

as training people to be aware of source validation. There’s another, intimately related concept called information hygiene. This is the idea that we should train people, when they see information online, particularly if it makes them emotional, to go, okay, before I accept this, I’m just going to do some lateral reading and check whether it’s validated. Is there a reliable tertiary source that backs this up? And consider the quality of those sources too. For example, the New York Times is probably more reliable

Eric (12:08.343)
Yes.

DRG (12:31.136)
than your racist uncle’s Facebook page. So we need to train people’s minds to recognize that these things come from different places, and that people have different objectives for sharing information with you.

Eric (12:33.326)
Mm-hmm.

DRG (12:41.298)
So this is media literacy training 101. These are all complementary. Pre-bunking is incredibly effective because if you’re not infected, less effort is required to cure you. That’s not to say fact-checking isn’t effective. It can be, but it’s only effective among people who are willing to countenance the idea that something they’ve read could be wrong. If I’ve really got you deeply infected by my misinformation virus,

Eric (13:02.846)
Mm-hmm.

DRG (13:08.608)
I’m also going to make you believe that the traditional mainstream media is out to get you. And suddenly I’ve isolated you from any potential cure. So it can be a bit of a rabbit hole, with different levels of severity as well.

Eric (13:21.278)
Which raises an interesting question about trust in institutions. You made the point that the New York Times is probably more reliable than your crazy uncle’s Facebook page. And yet, for a lot of people, their identity is tied up with mistrust of institutions in general. That mistrust is seen as a marker of identity, of not being naive, of not being a sheep, and all those things. And your book draws a distinction between genuine, healthy skepticism and cheap cynicism: a reflexive rejection of expertise disguised as independent thinking. So where do you draw the line between those two, between being healthily skeptical and being too dismissive of expertise?

DRG (14:12.822)
See, that’s a really insightful and difficult-to-answer question. Thanks for that, Eric. But I’ll try. I think it comes down to the individuality problem. So for example, when I look at why doctors are experts in health, it’s really important to remember: not all doctors.

By that I mean there are individual doctors who will tell you vaccines are dangerous. They are fringe in their profession. There are doctors who will tell you that you should be shooting yourself up with vitamin C injections. They will be fringe in their profession. What gives doctors authority as a profession is not the individual opinions of its members. It is their subscription to evidence.

So professional bodies will reflect that evidence. On average, they will; there are always exceptions to all this. And it means the fringe elements, no matter what their personal beliefs are, don’t speak for the profession. A doctor or a scientist is only speaking with authority when they are accurately reflecting the evidence base back to you. If they go off-piste and start making stuff up, the fact that they have a qualification becomes irrelevant.

Trust in authority is based on the idea that we synthesize evidence, and those authorities are supposed to accurately reflect that evidence base. Now, they can be undermined. We’re seeing that in the States at the moment, where Robert F. Kennedy Jr. has very much undermined health departments that were for a long time internationally renowned

for being considerate of the evidence base and putting the best evidence out there. That’s a good example of political undermining of an authority, in which case you might validly go, I no longer trust reports from the CDC for now, because of what’s happened. I totally understand that. And that would be a reflective case of saying: at the moment, that authority is not reputable,

DRG (16:06.44)
which is a sad state of affairs that I won’t dwell on too much. But what I will say in general is that this is how we do it. If you go individualistic, saying, I don’t trust them because they’re out to get me, that’s a narrative, and you need evidence for a narrative as well. A knee-jerk rejection of anything reputable goes back to the ideological drivers of things. If you have a tendency to conspiratorial ideation, that is,

Eric (16:08.28)
Thanks.

DRG (16:32.82)
to a kind of cynical distrust of any form of authority, you’re often not even consistent about it. For example, if I distrust medical authorities but believe anything I see on TikTok, I’m not being consistent with my own standards. It’s about developing a form of consistency: to at least be able to say, I believe this for this reason, and I don’t believe that, with that reason applied consistently across the board. But that gets back to the problem that humans aren’t consistent.

Eric (16:47.588)
Right.

DRG (17:01.492)
Humans have blind spots and ideological drivers, and trying to train ourselves to spot when we’re doing it is tricky.

Eric (17:09.878)
Speaking of conspiracy theories and conspiracy thinking, you are well known for having tried to answer mathematically the question of why large conspiracies don’t hold together, actually trying to put a timeframe on how long a large-scale conspiracy could survive before someone who was one of the

DRG (17:22.252)
Yeah.

Eric (17:37.494)
alleged conspirators would have to come clean. Could you walk us through that thinking?

DRG (17:42.613)
I sure can, and I can even tell you when the thinking started. I was at a conference about 11 years ago with some of my colleagues in mathematical oncology, somewhere in Germany, and they were looking at my emails, because I was going, check out the emails I’m getting. And they were all accusing me of being part of a conspiracy, but not just one.

I mean, there were about 15 overlapping conspiracies I was apparently part of. And I was like, if these people knew how incompetent I was, they would certainly not be putting me in the Illuminati. I struggle to tie my shoelaces accurately some days. So as we were doing this, I said, look, what if we were all bastards? Excuse my language. But what if we were all terrible people, and what if we really wanted to keep a secret? Now, this is a bunch of scientists who know that

you know, getting scientists to agree is like herding cats. It’s a very opinionated group of people who are all very professional but have their strong biases as well. And that’s why scientific consensus is really hard to arrive at, by the way. People assume consensus is easy. It is not. Like any consensus, it is difficult to arrive at, and it is driven by evidence rather than personality, ultimately. But I said, well, look, what would happen? And then I started using the mathematics you would use for this.

It’s a pretty simple way of looking at it. I was borrowing from radiotherapy, a totally different field, where we look at how often particles interact. I said, okay, let’s say everyone’s in a conspiracy, and we have all these people doing it for this long. How long before someone screws up? And by that I mean either, you know, lets the cat out of the bag deliberately, or…

I don’t know, left a laptop on a train with the secret Illuminati world plans on it. And this wasn’t that long after the Snowden revelations, where the secret PRISM surveillance project was being run; about 30,000-ish people were involved in that.

Eric (19:23.362)
Mm. Yeah.

DRG (19:26.656)
It had been running for only a year or two, and then Edward Snowden came out with the receipts, as the kids say now. And there’s a reason no one disputed it: once Snowden exposed the project, no one ever said, no, it isn’t real. It was very clear-cut that he had exposed it, and that was it. The ethics of that, people can discuss separately. But I was looking at it as a great example: that’s just one person who got uncomfortable and blew the whistle, right?

Eric (19:33.046)
huh.

DRG (19:50.669)
So what if we wanted to keep a secret? What if climate change was a hoax, and we were all in on it, and all the scientists working on climate change were in on it too? In science, everyone would have to be complicit, because otherwise, if I could prove it wasn’t true, I’d get a great paper in Nature out of that, wouldn’t I? And it turns out, even assuming everyone is a really good secret keeper, these things fall apart. I was very generous: let’s assume everyone is better than the NSA at keeping secrets.

But we’d still need, for, say, a vaccines-cause-autism cover-up, about 4 million people involved. And the problem is the numbers get so big that these things tend to fail rapidly. Even single events: I looked at the moon landings, where NASA had about 468,000 people on the Apollo program. All of them would have to be complicit. Even for that one-off, you’re not going to keep the secret. And one of your

Eric (20:28.43)
Mm-hmm.

DRG (20:40.46)
American founding fathers, Benjamin Franklin, famously said that three men can keep a secret if two of them are dead. All I did was put some mathematics on that. I did come up with the advice that if you want to do something dodgy, tell as few people and involve as few people as possible, which is already what intelligence agencies do: there’s need-to-know clearance, there’s compartmentalization. But I was saying that the idea of an overarching conspiracy in science about vaccines or climate change or moon landings

Eric (20:46.967)
Yeah.

Eric (21:01.986)
Yes.

DRG (21:08.652)
or a secret cure for cancer: they’re just not realistic, because they involve too many people. They would collapse very quickly, and you can put timeframes on that. Even at best, if everyone was a sinister but brilliantly effective secret keeper, you still have a non-zero failure rate, even if it’s tiny. I think I took one in a million as one of the failure rates.

It’s gonna happen, you know; it’s a surety, and there’s statistics and modeling you can do for that. And as you can imagine, it annoyed a lot of conspiracy theorists, because the paper went viral at the time, about 10 years ago when it came out, and it was picked up by newspapers around the world. My good Lord, my inbox was toxic for about a year after that one, so that was fun.
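[The arithmetic Grimes describes can be sketched in a few lines. This is a deliberately minimal illustration, not his published model, which fits leak rates to real exposed conspiracies; the one-in-a-million per-person annual failure rate and the Apollo headcount are the figures mentioned in the conversation, and the function name is ours:]

```python
def leak_probability(n_people, years, p_per_person_year=1e-6):
    """Chance that at least one of n_people slips up, deliberately or
    accidentally, within the given number of years, assuming independent
    leak opportunities at a tiny per-person annual rate."""
    trials = n_people * years
    return 1 - (1 - p_per_person_year) ** trials

# Apollo-scale example from the conversation: ~468,000 people on the
# program. Even at a one-in-a-million annual slip rate, five years of
# silence is already very unlikely to hold.
print(round(leak_probability(468_000, 5), 3))  # ≈ 0.904
```

[The exposure probability is driven by the product of headcount and time, which is why a conspiracy of millions fails fast while Franklin's three-person secret can plausibly hold.]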

Eric (21:30.882)
Yes.

Eric (21:44.023)
Uh-huh.

Eric (21:50.03)
Wow, well, clearly you were part of some conspiracy to debunk conspiracies.

DRG (21:55.499)
I actually put it in the conflict-of-interest statement. I said that I was not part of a sinister cabal, despite assumptions to the contrary. I put it in hoping the humor would help defuse things. It did not, Eric. It did not.

Eric (22:11.15)
I can’t believe it. Good Thinking devotes a section of the book to media, and we talk a lot about media on In Reality. I spent decades of my career in that profession. To what extent does professional journalism contribute to the problem? Not just tabloids, and not just conspiracy theorists or motivated liars online, but professional journalism.

DRG (22:48.748)
So they contribute to the problem both inadvertently and sometimes deliberately, and I’ll break that up. First, I need to caveat all this by saying that I think we need traditional media now more than ever. I am a strong believer in the virtues of traditional media, so we’re specifically addressing faults right now, with those caveats. One of the faults is a problem of resource management, and every industry has it. There is a tendency in all forms of media, both social and traditional, to go for the most novel, the most exciting, the most headline-capturing thing, or what in the modern era you’d call clickbait. And there is always a battle between velocity and veracity. You want to get this out fast, fast, fast. I want to be the first to publish this. Don’t ever get scooped. Or, I want the quickest headline on this.

Eric (23:30.446)
Mm-hmm.

DRG (23:44.491)
The problem with that is that news is always evolving. And oftentimes, to be really impactful, to do stuff that really matters, is time-consuming and expensive. You’ve got to put a journalist on the ground, maybe for months. But if you’re in a race to the bottom,

media outlets kind of fell into that, and I don’t blame them entirely, because the arrival of the internet really sped this up to a huge degree. Over-reliance on wire reporting and things like that was definitely problematic. When I started doing my early public talks, about 15 years ago, I used to really bash journalists, because I was a young scientist annoyed at science coverage.

And within a year or two of doing that, I had totally changed my opinion, realizing: yes, there are bad examples of journalism, but overall, journalists are doing extraordinary work under incredibly hard time constraints. And often, because people aren’t buying newspapers or paying for media as much as they used to, the funding isn’t there.

You have a weird thing where it becomes an inadvertent race to the bottom. So I no longer put the blame on media outlets for that; I’m saying they were victims of circumstance. And now I actually encourage my students to go and subscribe to a newspaper or a journalistic outlet they like. You like that journalism? Pay for it. I tell them, go with a cheap one, a student deal, because that way the outlet can afford to investigate. That’s one of the problems. The other problem, and I think it’s really related, is again velocity-related:

sometimes taking a step back, not looking for the sharp angle, and saying, I’m going to put some time into digging deeper into this, would be more rewarding for everyone. But I totally understand the pressures journalists are under. I speak to a lot of them these days, and I’m often a source or quoted on stories, and I have so much time for journalists now. I’m like, wow, you’re doing something really important in an era where it’s not respected enough, and for some of you,

Eric (25:27.95)
Mm-hmm.

DRG (25:47.085)
you know, the jobs that used to be there aren’t there. And we have this weird idea that AI is going to do everything. I’m like, oh my God, no, that’s disastrous. It’ll just make nonsense come out faster. We do need people, and the investigation, the time: that’s what really matters. The biggest failures of journalism in the past have come from going online and saying, let’s take speed over everything else. I don’t really care if you report first, because I know the initial reporting will evolve and therefore be wrong.

Eric (25:54.936)
Yeah.

Yes.

DRG (26:16.394)
I’d much prefer something that’s really reflective and I think people will pay for that. And that is one virtue I would say, Eric, that I think we need to remember. People listen to long form podcasts now. So while it’s really easy to get really negative and say everything’s terrible, there’s obviously a need for this, that people want this and people see a value in it. So it’s just recapturing that trust in media and realizing this is what traditional media can do that your social media is never ever going to be able to do for you.

Eric (26:31.382)
Is it?

DRG (26:44.81)
that veracity, that fact-checking. And again, the other failure of journalism historically has occasionally been a reluctance to do proper diligence and fact-checking. But these are individual failures; I wouldn’t say they’re systemic. I mean, most newspapers get it right. Most journalistic outlets get it right, and we remember their failures. So I don’t want to entirely flog them with that, even though in the book I use several examples of reporting failures.

But a lot of those failures happened because outlets were looking for reactions, for a quick story, rather than asking what’s actually happening here. I know the old stereotype of the news reporter with the press card in his hatband going, hey, what’s the story here? What’s going on? What’s happening, quick? It’s a stereotype, but we do need people asking questions, muckraking, but ideally asking them without an agenda. It doesn’t happen that often, but…

We have enough self-described citizen journalists on social media doing that. We actually need our media to be dispassionate and objective, to give us the facts, and to give us the analysis that the lay citizen can’t do. To go, I’m actually really going to dig into that. That is where the value is always going to be.

Eric (27:46.05)
Yes.

Eric (27:50.702)
I would certainly defend my profession, but I’m also aware of the incentives we have to spin things. We are human beings too, with our own political beliefs and our identities tied up in them. And for understandable reasons, there is a temptation to…

write stories for other journalists. You want the respect of your peers, and it’s great for your career to win awards for investigative journalism, the kind of thing that prioritizes the novel, the shocking, and so forth. But I’m glad to hear your support for the idea that there is, at least in the journalistic profession, a

a grounding in factuality. There are incentives to get things right. There are penalties to pay if you make stuff up, which is not something that you would say about social media, for example. Speaking of social media, there was a landmark case in California, decided just yesterday as we record this, in which a jury found that Meta, the owner of Facebook and Instagram, was liable for negligent product design.

DRG (29:05.814)
for child protection, wasn’t it? Yeah. Good, frankly. So, in one of the things I wrote for Scientific American earlier this year, I made the same statement I’ll make now. I know people say it’s inflammatory, but I think it’s true: if we had to look for a historical analog of the tech giants, the social media companies, the closest we could find are the tobacco companies and the fossil fuel industry.

Eric (29:09.28)
What’s your reaction to that?

Eric (29:33.23)
Mmm.

DRG (29:33.581)
And if you have a product that you know from your own internal testing is harmful, that causes division, that causes the propagation of potentially toxic and damaging falsehoods, and you go, well, actually, we just care about engagement. We don’t care about that.

You’ve made a choice, right? You’ve made a choice, and you know your product’s harmful, but you’re trying to cover up the harms of it. So yes, they did design it that way. The classic example was the 2020 election. Facebook, and Meta generally, on the back of the pandemic and everything else, did try to pay lip service to being more socially responsible after the debacle of the 2016 election misinformation.

And they put in a very simple measure, basically what I suggested earlier on: if you were putting a news story up, they had a series of listed reputable sources, and they put up a simple warning saying, this story you’re posting is from some weird Substack, it’s not really verifiable. And that simple measure alone was enough to markedly reduce the spread of partisan disinformation. But do you know what they also noticed in their internal testing? It reduced engagement,

because the people that spread this stuff most virulently are the most engaged users, the ones you can sell advertising revenue to. So they deliberately removed it, right?

Eric (30:46.926)
Hmm.

DRG (30:56.904)
Even though these are companies that have the turnover of small nations when it comes to revenue, they don’t care. And you can see that after Trump’s inauguration, they all went from leaning slightly left, or at least performatively left, to immediately kissing the ring. And to me, that’s telling. I’d also mention another thing that came up recently, and you might have seen it last year: Anthropic were sued for plagiarizing books and content,

a class action. My book was one of the books that they scraped for their AI models. The thing is, even though there’s a class action settlement and there’ll be a small payment to every author, half a million authors in the English language affected by this, it’s weird, because you can never take your content back. I can never say, take my book out of your AI model. It’s stuck there forever. And it’s a weird thing. Tech giants are incredibly irresponsible on average.

Eric (31:30.114)
Hmm. Mm-hmm.

DRG (31:54.453)
And for years we went along with the idea that this is a net good. When I was younger, I mean, I’m 40 now, but when I was in my twenties, I was like, tech is a net good. It’s uniting us all. It’s brilliant. And you probably saw in the book, I’m definitely more dubious about it now. But again, social media companies, unlike newspapers.

They don’t have responsibility, or they’ve gotten away with not having responsibility. It’s like they bemoan, dog fighting is terrible, isn’t it terrible, while they’re selling you tickets to a dog fight. Oh, we’re just selling the tickets, though. No, right? What I think will happen eventually, probably on a European level first, is that you just make these social media companies responsible as publishers. You hold them to the same standards as any newspaper: if you put that in your paper, it’s on you. And they will

Eric (32:33.739)
Yes.

Eric (32:44.685)
Yes.

DRG (32:45.836)
complain and moan, but they’ve been a net deficit for democracy, they’ve been a net deficit for social cohesion, and the more we look into this and the more the research comes out, the more terrible they actually turn out to be. And it may take decades to recover from the harms they’ve done, even if they stopped now. But they’re still going.

Eric (33:05.196)
Yes, yes. One of the remarkable things, I think, about this California suit was that the platform was sued over sort of basic product safety concerns, rather than the First Amendment or the harms of the misinformation. But the fact that it was designed to be… yeah, you’ve got to hand it to them. And perhaps this will trigger many

DRG (33:25.034)
Yeah, clever way of doing it. It was so clever.

Eric (33:33.526)
suits. You know, a $6 million decision, including both the damages and the punitive charges, is, you know, a rounding error for Meta. But if there are many such suits, they will have to pay attention.

DRG (33:49.421)
Absolutely. And if it starts getting transnational support, that’s a big thing too. Because again, these companies are bigger than countries. You probably read Careless People recently, the book by one of the Meta whistleblowers. Very interesting book. I read it on a flight home a while ago, and it’s very interesting, but there’s really nothing in it that surprised me. I just went, yeah. And unfortunately, that’s just the way these things are.

Eric (34:02.54)
Yes, yes.

Eric (34:12.899)
Yes.

DRG (34:18.73)
Future generations will look back at this time in history with a curious wonder: why did no one suspect this was going to be a problem in the first place? Except some did, I will say.

There were researchers as early as 1996, before social media was even a thing, who said that there might be some dangers to the internet. I wrote about them in the book. They were a group at MIT, and they have Scandinavian names I’ll get horribly wrong, so I won’t say them. What they said was: everyone was very excited about the nascent internet and how it was going to bring us all together and unite us, and there’s an element of truth to that. But they also asked, what if we all get into communities that reinforce our negative prejudices,

Eric (34:38.318)
Mmm.

DRG (34:59.949)
that, you know, turn us into echo chambers? They didn’t use that term, but they had a term that’s even cooler: they called it cyber-balkanization, after the fractured states of the Balkans. And people said, oh, you’re being so pessimistic. I think they may have been vindicated in the intervening two decades. Three decades, really. I think they may have got that right. So, good on you, MIT guys.

Eric (35:09.837)
Hmm.

Eric (35:18.562)
Yes. Well…

Eric (35:25.336)
Yeah. Yeah. Well, that’s interesting. So let me challenge you to make a similar prognostication about the newest technology. Social media has proven to be a net harm. How about artificial intelligence?

DRG (35:43.661)
I loathe the term artificial intelligence because it’s not intelligent in any meaningful way. Now, I’m also a chartered statistician, and maths is a passion of mine. Large language models, for the most part, use a ton of linear algebra and statistical methods to find associations in huge data sets between ideas. They are essentially a glorified version of predictive text. Now, that can be immensely useful. If I say,

Eric (36:07.895)
Mm-hmm.

DRG (36:11.916)
you know, ChatGPT, Claude, whatever, write me a cover letter for this job with these points, right? It has millions of cover letters in its training set, usually pilfered, as we found out, but enough that it can make you a pretty convincing pastiche. But it’s not doing any thinking. It’s doing searching, right? It’s bringing these things together. It doesn’t know the idea of truth, because these aren’t concepts to it. It just knows that the amount of data on this side is this. But

we’ve already seen, just this week, I don’t know when this episode will come out, but this week, for example, Google pulled their health advice feature because it was giving horrifically misleading advice. There’s so much misinformation about health online, and it was mining that stuff, and once the guardrails came off, it was telling you terrible things. We’ve seen AI companies being sued by the families of patients with psychosis who died by suicide.

because AI chatbots, once you take the guardrails off, are designed to be sycophantic search engines. That’s essentially what they are. And the excitement over them confuses me, because they’re really energy inefficient, incredibly wasteful, and what they’re doing is creating more nonsense faster. I use AI models as a form of search engine.

But I also know I have to check the hell out of it as a subject area expert. I do a lot of work on bad science, and what we’re seeing increasingly is scientific publications referencing things that don’t exist, because they got an AI model to do it and it hallucinated them. And I’ve seen this happening in legal judgments in the UK, where judges are throwing things out because solicitors put together a legal brief using AI, and it makes up stuff. And it doesn’t know it’s making it up. That’s the thing.

because it’s using associations to say, that looks okay, I’m putting that in there. And because we don’t recognize that, we get obsessed with it. And we’re also seeing at the moment, as this is being recorded, that ChatGPT has been used to target locations in Iran,

Eric (37:59.703)
Yes.

DRG (38:13.298)
including, potentially, a school that it had the wrong information on, with no human in the chain to check this. So these are black box models. There’s so much going on, they’re pulling in so many billions of data points and putting them into a model, that we can’t actually follow all the chains of reasoning there. Neither can they. If you ask them to explain it, the models can’t do it, because they’re using probabilistic models and linear algebra and throwing it all together.

So there’s no responsibility that we take on that, even if it’s creating immense harm. And there’s no real sanity checking of it. And I guarantee, once you start asking AI really hard questions about creating knowledge, it gets really bad really quickly. I’ve even told my students this. They’re comfortable with these complicated statistical models; they think they’ve got ChatGPT to help them with it. And I will immediately find something in it and go, that makes no sense. And if you do that, you’re going to mislead yourself.

Eric (38:44.856)
Yeah.

Eric (39:05.976)
Hmm.

DRG (39:06.09)
And I already know that their professors are doing that too. And I’m like, God damn it. You will never get rid of the need for human expertise. You can just make garbage faster.

So I mean, if I need to write a stupid cover letter for something, yeah, you can use ChatGPT, no problem, and it’ll do it passably. It’ll still get bits wrong, but never use it for more than that would be my advice. And I think we are going to see massive layoffs because of this, and a realization, coming belatedly down the road, that those layoffs probably weren’t sensible, but maybe not before it’s caused a global recession and massive amounts of discomfort, all to appease shareholders. I mean, I’m going to sound full

communist now. I don’t mean to, but it’ll go that way. Not because we needed to lay people off, but because some clown thought that would maximize shareholder value. You know, the entire point of these AI models is to maximize shareholder value, not to make the world better. And that’s something we have to keep in mind.

Eric (40:03.63)
Yes. Maximizing engagement, maximizing shareholder value: those aren’t necessarily harmful or evil goals, but the unintended consequences can be severe. So anyway, you’ve painted a not exactly rosy picture of our future under our AI overlords. Those of us who are now listening to this

DRG (40:17.228)
Absolutely.

DRG (40:25.164)
Sorry.

Eric (40:31.712)
In Reality episode have to go out into the world, a world of chaotic information, and make sense of it. If you had to distill your advice, are there just a few practices that people should use when they move out into the information world and encounter something that they’re being asked to believe?

DRG (40:55.028)
So there are practices, and there’s also cause for optimism, which I’ll come back to at the end, so remind me if I forget, because I’m scatty at the best of times, Eric. So what would I say? The very first thing we can do is to inure ourselves a little bit, to give ourselves mental immunity. And the first thing we have to recognize is that the biggest single predictor of whether we engage with content in its myriad forms, whether it’s a newspaper headline or something you see on social media,

is whether it provokes a visceral emotional response, a negative one particularly. Things that induce fear, outrage, disgust, anger: they are what gets engagement. And bad actors, from Russian state disinformation to political partisans to anti-vaccine activists, have realized that is the chink in our mental armor, and they take advantage of it. So when we come across content that makes us angry, disgusted,

Eric (41:26.317)
Uh-huh.

DRG (41:47.671)
the very first question we should ask, before we let ourselves embrace that feeling, is: hang on, is this designed to make me feel this way? Is this a legitimate framing of something? This is the information hygiene idea. I’m going to reserve a few seconds before I make a judgment; I’m just going to check that independently. It’s not popular, because social media will always tell you to have an opinion immediately, but it’s absolutely fine to go, I don’t know enough about that.

and I haven’t verified that, but when I come back to it, then I’ll have an informed opinion. So we put things on the back burner before we let ourselves get infected. Treat it like PPE. Remember back in the pandemic: I don’t know if that’s infectious, I’ll put that over there for a sec. So take a breather. Take a sec. Check the source.

Is it reputable, and is it in context? We didn’t get into this, but malinformation is stripping real information of its context to mislead people. So is this being stripped of important context to mislead, or is it just outright false? Is it disinformation or misinformation? Is there an agenda behind the spreading? You ask all this, and then the next thing you do is take a breather. Take a few minutes and go, yeah, I’ll check that. I would also say, and I’m preaching to the choir here:

if you can subscribe to a news organization that you admire and respect, a journalistic organization, and you have a few pennies in your pocket every week, please give it to them, because they are going to be part of the reason we get out of this mess. We’re going to get back to that respect. And we’ve seen this; I want to give you a bit of an optimistic note to show you where this is happening: Finland. Finland has started teaching media literacy to children from the age of five up.

They do exercises where they learn about misinformation, disinformation and malinformation, and they learn about editorial biases and things like that. And do you know what it’s really done? Firstly, it’s made them far more media savvy. They’re far less likely to be infected by falsehoods or angered by partisan rhetoric or whatever else. And do you know what else they’ve come away with? A respect for journalism. They look at it and go: actually,

DRG (43:46.861)
people doing this fact checking, people doing this contextualization, people doing these investigations to an objective standard, which they’re held accountable to, they look at them as complementary to scientists in that regard, as in, they’re doing something important.

So let’s look at Finland. You know, Finland is doing good stuff. And one of these days we will be over this. We will. The other optimistic thing I’d say, and we were chatting about this before we started recording: in the book I wrote a lot about the psychology of why we fall for this. These have been problems in the human psyche for millennia. Since we first learned to reason, we’ve made these mistakes. Social media has massively exacerbated those existing flaws. But the good news is,

this is our rock bottom. This is a rock bottom from which we can actually go, all right, now we’re going to be stronger in future. We’re going to really address these flaws that we could maybe skirt over when they were only minor issues. Now they are dividing societies and electing political parties,

so we have to go, let’s look at this again. And even though I do sound very negative and pessimistic, there’s at least part of me that’s really optimistic. This is finally recognition of a problem that has actually been causing us issues for generations. Let’s hope this is our rock bottom, our nadir, that we can rise up from and go, you know what,

in 400 years’ time, when they look back at this point in history and they’re very well informed and media literate and really savvy, they’ll be like, what were those 21st-century jackasses doing? The way we look at the pre-Enlightenment era and go, what were they thinking? Let’s hope our future generations can think that about us. And that no person in the Oval Office hits a big red button and we’re all, you know, removed from existence in the immediate future.

Eric (45:17.272)
Yeah.

Eric (45:25.816)
grateful.

Eric (45:32.109)
That sounds like something devoutly to be hoped for, and a great place to leave the conversation, on that note of hope. Let’s fast forward 400 years and look back on this era as the point where we bounced back and loosened the grip of misinformation. David, thank you so much for the conversation. I’ve really enjoyed it. Thank you for the work you’re doing, and I would encourage everyone

DRG (45:36.012)
Thanks

Eric (45:59.096)
to get the book. It’s called Good Thinking.

DRG (46:01.355)
Eric, thank you so much. It’s been an absolute pleasure speaking to you.


Created & produced by: Podcast Partners / Published: Mar 28 2026

