My guest today is Dr. Andy Norman, who is the award-winning author of ‘Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think’.
His research illuminates the evolutionary origins of human reasoning, the norms that make dialogue fruitful, and the workings of the mind’s immune system. He champions the emerging science of mental immunity as the antidote to disinformation, propaganda, hate, and division.
Currently, Andy directs the Humanism Initiative at Carnegie Mellon University and is the founder of CIRCE, the Cognitive Immunology Research Collaborative.
Some of the topics we covered include:
- Andy’s entry into philosophy
- The mind’s ‘immune system’
- Definition of ‘mind parasites’
- Manipulation gone to scale
- Simple conspiracy vs complex reality
- The challenge of debating against conspiratorial thinking
- The incentives that fuel misinformation
- How to prevent mind infections
- Determining whether a belief is ‘reasonable’
- Evolutionary origins of our capacity to ‘reason’
- Dangers of confirmation bias
- Factors that make us vulnerable to ‘mind parasites’
- The dangers of hitching belief to identity
- Developing a ‘mind vaccine’
- The ‘New Socratic Method’
During the show, Andy and I discussed a paper he wrote in response to Hugo Mercier and Dan Sperber’s book ‘The Enigma of Reason’. You can download that paper here.
—
If you like what you hear, please consider liking and reviewing the show wherever you get your pods. You can also support the show on our Patreon and Buy Me A Coffee page on the links below:
Patreon: https://www.patreon.com/thevoicesofwar
Buy Me A Coffee: https://www.buymeacoffee.com/thevoicesofwar
—
Listen to the podcast here
Andy Norman – Mental Immunity: Inoculating Against Conspiracies And Disinformation
Welcome to The Voices of War, a show with a simple vision: to bring to life the true costs of war through the voices of those who’ve lived it. I’m Maz, and I speak to soldiers, academics, refugees, peacemakers, and anyone else who’s been touched by war in the hope of demystifying and, most importantly, de-glorifying it. If you like what you hear, please consider showing your support by reviewing the show wherever you get your podcasts. You can also support us on our Patreon or Buy Me a Coffee. Thank you. I hope you enjoy this episode.
—
It’s Maz here. A short note before we get to the chat with Andy Norman: due to the invasion of Ukraine and my pivot to cover this extraordinary and grossly unjust act, I’ve held off releasing this conversation. In the meantime, I’ve found Andy’s book and insights to be even more relevant. As I’ve discussed in my previous conversations with the likes of Peter W. Singer and Carl Miller, we’re seeing the effects of misinformation play out both in the West as well as in non-Western countries, such as the BRICS nations and Africa.
Given the growing global tensions, arming ourselves against mind parasites seems more important now than ever. I hope this episode gives you some ideas about how to do this for yourself and how to help those around you. I want to thank the two most recent Patreon supporters of the show, Andrew and Shane. Your support will go a long way towards sustaining and growing the show. Let’s get on with the episode with Andy Norman.
—
My guest is Dr. Andy Norman, who is the award-winning author of Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think. Andy has also published in outlets such as Scientific American, Psychology Today, Skeptic, Free Inquiry, and The Humanist. He has also appeared on The Joe Rogan Experience, the BBC’s Naked Scientists, and The Young Turks. His research illuminates the evolutionary origins of human reasoning, the norms that make dialogue fruitful, and the workings of the mind’s immune system.

He champions the emerging science of mental immunity as the antidote to disinformation, propaganda, hate, and division. Andy directs the Humanism Initiative at Carnegie Mellon University and is the Founder of CIRCE, The Cognitive Immunology Research Collaborative. Andy, it’s a pleasure to host you on The Voices of War. Thank you so much for joining me.
It’s my pleasure, Maz. Thanks for having me on.
I’ve finished your book, and to say that it’s timely would be a gross understatement. It’s vital in our interconnected world that we start exploring how our minds can be infected by bad ideas. Thank you for publishing it. It’s a timely book.
Thanks. One doesn’t go into the philosophy business because one expects to get a lot of attention or a lot of media coverage, but it turns out the stuff I’ve been thinking about for a long time is very much on people’s minds. I’ve lucked into this.
The fact that you were on Joe Rogan is an indicator of that, and of how broad your audience would’ve become after speaking to him.
That was a fun way to start. It was my very first podcast after the book launched. I met with Joe on the day my book came out in stores. That was a nice way to start this run.
I’ve listened to it. I thought it was great. Before we delve into the book, maybe we can find out a little bit about Andy The Man. How did you even get into Philosophy, and this emerging new way of communicating philosophical ideas?
I went to college firmly expecting to study Physics or some very science-y thing because I wasn’t very good at words and emotions. I took a course where it became clear to me that our species is remarkably clever in the things we can invent and do. This particular course left me with the feeling that our cleverness has far outstripped our wisdom. We’ve developed amazing powers through our technologies, and we don’t seem to have the wisdom to wield them with our long-term best interests in mind.
I was bellyaching about our species’ lack of wisdom. My friend turned to me and said, “Andy, quit complaining and do something about it.” I declared my Philosophy major the next day. Philosophy, as you may know, derives from the Greek for ‘lover of wisdom’. I descend from a long line of people who think that it’s important that humanity find ways to become a little bit wiser each generation. We’re behind the eight-ball right now and have some catching up to do. Hopefully, my work can nudge us a little bit in that direction.
At the start of that reply, you stressed technology: we are exceptionally clever and smart, but the wisdom is lacking. Our modern relationship with technology is probably to blame for our predicament in more ways than one, I suppose.
Yuval Harari, whose book Sapiens you and your audience may know, is worried about our ability to edit the human genome and about artificial intelligence. These are enormously powerful technologies, and they’re getting better so fast. Meanwhile, we seem to be bumbling along in terms of our ability to handle the ethical questions that would allow us to use those technologies wisely.

This is definitely something I hope to explore, but as we dive into the topic, we can start with the main thesis of your book, Mental Immunity, because that will be a nice launching pad for us to delve into some deeper questions.
It’s quite simple. It turns out that the mind has an immune system. If you look back at the history of psychology, you see that scientists began to discover that the mind’s way of filtering information behaves very much like the body’s immune system. It generates antibodies. It can be compromised in the same way the body’s immune system can be compromised. The body’s immune system fights off infectious microbes. The mind’s immune system fights off infectious ideas when it’s working properly, but it doesn’t always.
It turns out that there are some powerful ways to enhance mental immune function, to become better at spotting bad ideas, filtering them, and weeding them out. When we do that, we become wiser. I believe that this emerging science of mental immunity, which I call Cognitive Immunology, is going to show us how we can make rapid, concrete steps towards becoming wiser.
I like the analogy of a viral infection, and probably in today’s environment, it’s a very timely analogy as well for us to realise how it works.
You’re actually touching on a piece of my big idea that I didn’t get to specifically, but part and parcel with this idea of mental immunity is ideas can spread in a viral fashion and unhinge our minds. That’s becoming obvious in this day and age. We need to take this biological metaphor seriously, if we’re going to get out ahead of this, and begin to control the mind infections that are destabilising a lot of people.

Why do our minds get infected? Before we get into the technical aspects, what makes our mental immune system malfunction or become vulnerable to a virus?
The whole idea that mind parasites are infecting our minds is creepy. It sounds like a B-grade horror movie, a premise for a cheap horror flick. It’s a more mundane thought than that. False ideas invade our minds and sometimes stick as beliefs. Sometimes, beliefs or ideas that don’t serve our interests well get taken up and applied, or we make false assumptions and do things that harm other people.
All of these are familiar kinds of errors, but if you reinterpret those as mind infections and the content of the ideas as mind parasites, it turns out you get some very powerful tools for strengthening your resilience in the face of them. In the same way that we had to understand the body’s immune system to develop vaccines, we have to understand the mind’s immune system to develop countermeasures to the virulent ideologies that spread.
You refer to it as a mind vaccine, and I guess that’s what we’re ultimately looking for. I must emphasise, the timing of your book is incredible, both in the metaphor of vaccines and the virus, and in the ubiquity of information and the openness of our connectivity to just about any idea in the world. This is certainly something that many in my audience, and in my profession in the military, are exploring.
Numerous PhDs have been written on how people are shaped and influenced by bad information, leaders, and poor incentives that ultimately lead to genocide and war crimes. Even the wars that we have fought over the years have, in many ways, been driven by poor information and, therefore, mind parasites. How does this all relate to your view?
I do think that some of our worst decisions are driven by bad information. Sometimes we take on bad information almost willingly. We’re wilfully self-deceiving. It raises a whole set of problems on its own. Many people take on bad information unwillingly, and then make mistakes that harm themselves or other people they love. I argued that this idea that everyone is entitled to their opinion is commonly spouted by people on the left and on the right. It’s almost an orthodoxy.
I argue that it has reached the limits of its usefulness as an idea because it now excuses irresponsible thinking and believing. If I can believe self-serving delusions and defend myself by saying, “I’m entitled to my opinion,” I could end up harming others. As soon as we start doing things that harm others, we have responsibilities that come into play that limit our rights. It’s time we stop aping the idea that everyone’s entitled to their opinion, and start looking more carefully at our cognitive responsibility.
I couldn’t agree more with you. To take the COVID crisis as a very timely example, how do I confirm what a delusion is, though? I’m vaccinated, boosted, etc., but even in Australia, let’s not even talk about the US, we have a significant part of the population that is deeply hesitant about the information being served by those we deem to be figures of authority. They are rejecting the official dominant narrative on the strength of some, dare I say it, credible counter-information spouted by quite prominent experts, which raises the question, “Who’s actually delusional here? Is it us, the majority, or them, the few who have seen the light?”
Some of these experts are, in a sense, fake experts. It turns out that many people are unaware of how many others are actively trying to manipulate our minds. There are lots of people who run slick websites or sport slick credentials, who can make a fantastic living for themselves by adopting counter-narratives to the official or institutional narratives and gaining a following as a result.
I’ll be the first one to admit that a lot of the major institutions in America are failing your average American. I’ve felt that way for a long time. The unhappiness is understandable, but when you lurch to the conclusion that science is a hoax, or that the Earth is flat, or climate change is a hoax, when you come to distrust the institutions that work so diligently to arrive at the truth, your mind literally becomes unhinged or untethered from reality. You no longer have the cognitive antibodies that a healthy mind does. In a world where information is so plentiful, it’s very easy to become disoriented if you don’t have boosted mental immunity.
Good play on words there. I wholeheartedly agree. I’m certainly in that same camp, but what I’m also seeing is that it’s not necessarily the wacky, out-there Flat Earther or QAnon types. It’s like the bell curve: there’s the far end of the extreme, and there’s probably no saving those people. Who I’m looking at is people who are otherwise intelligent. I’ve got some in my immediate networks, and even in my family, people who are not anti-vaxxers as we know them, and this is the point that you’ve raised, but who have fundamentally lost trust in institutions.
Oftentimes, they’re justified in having lost that trust, because institutions have disappointed the everyday American and Australian; across the West, this is nothing new. You call these, in the book, the basic thoughts or beliefs, our underlying assumptions about the world. For example, the government has the citizen’s best interests in mind. This is a foundational principle of what a citizen of any nation ought to believe. When that’s been shaken, and it seems to me like it has been, certainly in the US, sufficiently to be a destabilising factor for the society, everything that’s built upon it is then questioned.
It’s questionable as well.
It’s thrown out because, “If I don’t believe that the government any longer has my best interest in mind because it’s profit-driven, it’s Big Pharma, and corporations, etc.” In all of those, there are kernels of truth, because we are driven by incentives. What are your thoughts on that? That’s where I feel the root of the problem is: in these basic underlying assumptions. The US is far more challenged in this than probably much of the rest of the West.
Doubts are healthy. I argue in the book that doubts are literally the antibodies of the mind. It’s important to be able to question and test ideas with your doubts. That’s how we keep a lookout for ideas that have problematic features. But our tendency to doubt can be hijacked by exaggerated or hyperbolic grievances. When it hardens into something called cynicism or distrust, it becomes sweeping and indiscriminate. It’s much easier to say, “It’s all a big conspiracy,” than to say, “The media is a complex animal, and there are conflicts of interest all over the place.” It’s much easier to accept a simple conspiracy theory than the complex reality.
It’s a convenient way to dismiss a lot of stuff that you don’t like. The problem is, that’s not responsible thinking. Philosophers have been playing with exaggerated doubts for hundreds and hundreds of years. You may remember René Descartes. He did a thought experiment back in the 1600s where he basically said, “What if I’m in the Matrix?” “What if I’m dreaming it all up?” was Descartes’s version; it was the 17th-century version of the Matrix. What he found is that if you doubt everything all at once, you have a hell of a time building a real bridge back to reality. Your doubts can become so extensive, sweeping, and corrosive that your mind literally becomes unmoored.
Lots of people are being swept up by exaggerated doubts. Some people are manufacturing cynicism as a technique to hijack the mind. Think about this too: it’s possible to weaponise doubt. It’s not only that there are information warriors out there weaponising falsehood; some weaponise doubt itself. I would nominate Tucker Carlson, the Fox News anchor here in the US, who says, “I’m raising questions about things.” He uses innuendo to suggest that the Biden administration is doing something slightly underhanded. “I’m not saying anything here. I’m raising questions,” is what he says.
It’s a patently manipulative way of inducing cynicism in people. When people become cynical, they become easily manipulated. We need to protect ourselves. We need to protect each other by learning how to boost our minds’ immune systems. By doing this, we can become next-level critical thinkers. Critical thinking is great, but it only gives us so much protection. Real cognitive immunity gives us much more. That’s the world I’m trying to build, one where we have 4 to 5 times the level of critical thinking, it’s utterly normalised, and cognitive contagion doesn’t happen.
Those simple narratives are rarely the solution, and that’s ultimately the reason for this show. My focus is on war and conflict, but this is absolutely an intimate part of it. The mind is the driver behind it. I’ve had debates. I consider myself to be somebody who’s broad-thinking. I read and watch a lot. I’ve studied a lot. I consider myself an eternal learner, and I’m open to new ideas. I try to be.
I’m biased in my own views, undoubtedly, as we all are. When I speak to people, particularly on the COVID crisis and the pandemic’s challenges, oftentimes they are far better informed than I ever could be. I draw on some thinking by Sam Harris, who I know you’ve acknowledged in your book as one of the thinkers who influenced you.
Even Joe Rogan, whose show you’ve been on, has had questionable experts who have on-paper credentials, can talk the talk, and sound profoundly knowledgeable about the topic. Bret Weinstein is one of these people who’s incredibly intelligent and articulate but is sowing the seeds of doubt, or providing that confirmation bias, for anybody who might even loosely be doubting the main narrative. Sam Harris makes a good point.
He doesn’t want to go on Joe Rogan’s show, or have Bret Weinstein on his own, because Bret Weinstein can pull out 50 counter-facts that one would never have the time to dedicate to researching. I’ve seen that happen to me. I’ve spoken to people who said, “You didn’t see what happened. Did you read about the paper that the Italian Foreign Ministry put out about the impact of, or did you read about the Israeli study that said X, Y, Z?” “No.” “You haven’t? You are not informed. You’re calling me brainwashed? You are the one that’s brainwashed. You’re swallowing the narrative. You need to read more.” I’m left thinking, “Where do we go from here?” There is no way out.
One of your countrymen has done some interesting thinking about this. He taught me about Brandolini’s Law. I hope I’m getting this right, but it’s something like: the amount of work it takes to fight back against BS is an order of magnitude greater than the effort needed to generate the BS.
That’s part of the problem: it validates the bullshitter, because the one countering the bullshit doesn’t have the time, the scope or the energy to go and follow it down every rabbit hole that is presented.
A very good friend of mine is a philosopher of Science who studies Science Denial. He actually went undercover at a Flat Earth Convention. He cornered an ardent flat earther and took him out to dinner. He tried to get through to this guy, and it became clear that his identity was wrapped up with this whole gig that he was milking. He couldn’t admit that his reasons were self-serving. It was ultimately self-serving, wishful thinking, masquerading as responsible scientific thinking.
The guy wasn’t dumb, right? He was clever. You can be enormously intelligent but not use your intelligence in the right way to become wiser. This guy was a perfect example of that. I don’t have any trouble believing that there are extremely bright people in your family who’ve been knocked off-kilter by some of the BS that’s out there. You can find bullshit to support almost any view on the internet.
One of the recommendations in my book is that the true test of a reasonable belief isn’t whether you can find some evidence for it. It’s whether you can beat back all of the good questions or challenges that might be posed to it. As you point out, it can be hard to beat back all of the questions and challenges that arise when bullshit is swirling around us, when fake facts have gained currency.
If the information environment becomes full of toxic sludge, it becomes harder and harder to think well. It’s harder and harder to spot the bad ideas and weed them out. Almost everyone thought that the internet was going to usher in an age of openness and enlightenment. Instead, it turns out to be rapidly descending into a dystopian nightmare in some ways. We’ve got to turn it around fast. The key to doing that is building our immunity to nonsense.
How do we do that? Particularly at scale, because in the information operations or cyber domain, we are in many ways countering bots that are infinitely more powerful and faster than any human ever could be. They are pushing out misinformation for the sole purpose of destabilising. We know this of countries like China and Russia, certainly in the case of the US, but of course also Western Europe. We see this in Ukraine, where misinformation is used purely to lift levels of anxiety. When there’s anxiety and anger, you then look for simple solutions.
You also look for enemies. You start pointing fingers at your own countrymen. The Putin regime in Russia is quite deliberately sowing discontent in many Western countries; I don’t know if Australia counts as a fellow Western country here. I’ve been watching with increasing horror as the level of decision-making in my country has declined over the decades. It’s alarming, but it also lends support to this thesis I’m developing, that the mind’s immune system can gradually become unhinged.
In the same way that a mind has systems to protect itself against mis- and disinformation, cultures do as well. We can actually talk about cultures as having immune systems. It’s quite clear that in certain subcultures, think of the subculture of QAnon, the cultural immune system has gone completely haywire. In the QAnon community, it has broken down entirely.
What do you mean by that? How can a culture have an immune system? Can you explore that a little bit more?
Cultures have mechanisms for limiting the transmission of bad information. Think fact-checkers and journalistic integrity. Think about what would happen if social media companies actually implemented algorithms that reduced exposure to misinformation. That might be another element of a cultural immune system. Think about a culture’s media literacy. If a culture institutes media literacy training for all young people, it’s strengthening the culture’s immune system. Does that make sense?
Wouldn’t that be wonderful? QAnon is the extreme, but is it not fair to say that even our culture, even our consumption of media more broadly, has also been corrupted by poor incentives? That’s perhaps part of the problem. Look at our dominant narrative and at QAnon; there are of course more, but take these two cultures. What’s indisputable from where I stand is that QAnon has gone off the deep end into a whole new world.
Our dominant culture also has a cold, undoubtedly, and it is fuelling the fever that has gripped QAnon. So it’s not only about dealing with QAnon; oftentimes, you’re going to do more harm than good trying to bring those people across. It seems to me, then, that to combat our illness and mental disease, we also need to cure the cold of our dominant culture, which undoubtedly lies in the poor incentives in the media. The algorithms in social media are well established.
You’re right. The incentives have become extremely perverse. There was a study that found that something like 60% of all COVID misinformation stems from twelve purveyors. I don’t know if this is in our country or the world. A very small number of people who spread COVID misinformation are getting rich off it because their YouTube videos are viewed so often, their websites get so much traffic, and they’re invited onto so many podcasts and radio shows that the media ecosystem echoes them. Some of them are self-deluding, but many others are getting rich and laughing their way to the bank while they hoodwink people.
That’s a societal-level issue that we somehow need to address.
There was a QAnon follower who, in the wake of the January 6th insurrection here in the US, announced that the scales had fallen from their eyes. They said, “Q played us.” There were also Trump supporters who listened to Trump say, “Let’s head on down to the Capitol, and I’ll be there with you.” Trump didn’t follow them down there, and many of those insurrectionists were hung out to dry. One of them concluded, “Trump played us.”
I argue that there are a lot of actors out there who are deliberately manipulating our minds because they understand how to hijack mental immune systems better than we know how to protect them. Those of us who care about civility, kindness and a better future for our children and grandchildren, if we get together and actually fight back by boosting our own and each other’s mental immune systems, then these bad actors are not going to be able to manipulate us and laugh all the way to the bank.
What role do you think institutions have in this? That’s an interesting example: twelve sources account for 60% of the misinformation consumed. That’s incredible and dangerous. One would then have a reasonable argument to say, “We should cut off those sources and stop them from having the ability to share.” But now we’re coming into the dangerous territory of censorship, cancel culture, all of these things that are pouring fuel on the fire: “They’re going to censor him or her because they’re speaking the truth,” or whatever. It’s almost comical because it’s so useful to the narratives. I forget the name, but the CEO of Reuters is on the board of Pfizer, for example, and Reuters is fact-checking Pfizer. These are facts; they’re there in black and white. But it’s taken out of proportion into “There’s corruption at the top.”
You can create the appearance.
You can create a narrative such that the mind boggles. What are your thoughts on all of this? Where I was going with it is that we’re faced with the reality that we need to prevent bad information from going out there. How do we do it without censorship, cancel culture, these types of things?
There are fundamentally two ways to halt and prevent bad information from spreading. You can try to halt it at the source, which means trying to silence the person who is trying to spread it, or you can improve the critical thinking skills, the mental immunity, of the recipient of the bad information. For the most part, we should be trying to beef up our mental immune systems and our resistance to bad information. That’s far preferable to trying to take away the speech rights of disinformation spreaders.
First and foremost, my commitment is to maintaining freedom of speech and limiting the damage by boosting mental immunity. That ought to be the focus of our efforts. At the same time, it’s not clear that that will be enough. I’m as big an advocate of The First Amendment and freedom of speech as anyone, but having studied the question philosophically, I know that you can’t be an absolutist about free speech any more than you can be an absolutist about anything else.
If you’re an absolutist about free speech, you have to say that it’s okay to yell fire in a crowded theatre, but it’s not. If you’re an absolutist about free speech, you have to say it’s okay to incite people to violence, but it’s not. The fact is, there are limits. The worship of free speech can become pathological. As a culture, we’ve emphasised speech rights to the exclusion of speech responsibilities. If we don’t correct for that imbalance, we’re in trouble. “Dangerous information” is not a contradiction in terms.

Dangerous information exists. It could very well send us back into another Dark Age. There’s a very strong argument for de-platforming people who abuse their platforms. There’s no reason why peddlers of disinformation should get huge platforms to spread their bullshit. We certainly don’t have an obligation to give them huge platforms.
This is the idea that with great power comes great responsibility, and it applies to a lot of these social media companies. You talked about reasonable ideas and how we need to hone our skills in figuring out what is reasonable. Can you define reason, or our ability to reason? As I mentioned in our initial exchanges, there are a couple of things I’d like to unpack about this idea of reason.
Our concept of ‘reasonable’ fills an important role in the ecology of our thinking. We basically need a way to sort out responsible ideas from irresponsible ideas: the responsible thoughts from the irresponsible thoughts, the responsible beliefs from the irresponsible beliefs. The concept of ‘reasonable’ helps us imagine what that’s like. It suggests that we have to test our ideas with reasons in some fashion. If an idea passes the test, then you attach the label ‘reasonable’ to it, and then you’re allowed to use it in your thinking, in your justifying, in the construction of public policy, etc. If you test an idea and it fails the test, then we slap the word ‘unreasonable’ on it. That means you’re not entitled to sling it around or share it as though it were a responsible conclusion. Does that make sense?
What makes something then reasonable? What are the necessary conditions for it to be reasonable?
Philosophers have been wrestling with this question for 2,400 years. The wrestling goes, “What are the signs? What are the tell-tale features of a genuinely reasonable belief?” For about 2,000 years, philosophers have been fascinated by the idea that the true test of a reasonable belief is whether there’s good proof for it: if there are solid reasons, and they can be traced all the way down to foundations, that’s the true test of a reasonable belief. That’s a very common view that goes back to the philosopher Plato. I argue in the book that that view is actually mistaken. In fact, that view exacerbates our confirmation bias. It makes us actively look for proof to support the beliefs we want to believe.
What we actually need is a more rigorous way to check for wishful thinking. The way to do that is to listen to the questions and challenges of others, who can spot our blind spots better than we can, and only hang onto a belief if we can fend off all of the good questions. I call this the Socratic test of reasonable belief. Can you answer all the savvy questions that might be posed to your belief? If yes, it’s okay to treat it as reasonable and responsible. If not, you probably ought to take it with a grain of salt and be at least a little bit less certain in your views. Does that make sense?
I don’t want to say push back a little bit, but I want to bring in some other thinking. I’ve mentioned it in our exchanges: The Enigma of Reason by Mercier and Sperber. I’m sure you’re familiar with their work and their definition of reason as basically a social-interaction tool that has evolved. It’s an evolutionary by-product where reason is used for two principal functions. One is to justify our own actions. We all do this in everyday life; we find excuses. I’m a recovering smoker, as in a long time ago, I used to smoke. Back then, I could always find reasons why I was smoking and justify them to myself perfectly easily.

The other function is to convince others. This is part of our social and group identity and our need to be in a tribe. Their argument is quite convincing: we use reasons to convince others that our worldview is correct and accurate. In their book, they’ve moved reason to a social function, as opposed to logic, the if-then, if-then. It seems to me like they’ve pulled that apart, in my view at least quite convincingly, because if I view everything we’ve discussed, for example COVID, through their lens, it becomes quite easy to see what they mean. Those who are anti-vaccine find it so easy to convince each other or justify their own hesitancy.
Hugo Mercier and Dan Sperber have been spreading a very subtle mind virus here.
I’m glad to hear it because I have been, in some sense, infected, and I know others who’ve read their book because it’s a rather convincing argument.
Dan is a Sociologist, and Hugo might be a Cognitive Scientist of some kind, but in any case, they do some good work. They rightly point to the fact that our reasoning is often skewed by social pressures. We tend to reason in ways that keep us in good standing in the groups we want to belong to. It turns out our brains actually evolved to keep our thinking aligned with those in our group.
Groupthink is a real phenomenon. Using reasons to rationalise what you want is a common thing, but it’s not the only way to use reasons. It’s actually possible to distinguish reasoning from rationalising. Genuine truth-seekers, as opposed to those who are trying to curry favour with the in-group, are sensitive to the difference between reasoning and rationalising, and they reject rationalising as irrelevant.
Mercier and Sperber had an article that came out a few years before their book. It got me thinking about this issue. I spent a few years actually researching the evolutionary origins of the human capacity to reason. I ended up concluding that reasoning is fundamentally about trying to align our mental states with one another. If you and...
Do you mean like internal cognitive dissonance, or between people?
It’s more like this. Suppose you and I are driving together to a destination we both want to get to. You’re at the wheel; I’m in the passenger seat. You want to go across the 30th Street bridge, and you’re heading that way. I say, “Maz, slow down. That bridge is under construction. You want to head down to the 40th Street bridge.” What happens there? We actually have a difference of opinion about which way to go. We’re at odds. I present information, a reason, that changes your mind. You turn the car around. We go across the 40th Street bridge and get where we want to go. That’s the use of a reason to change your mind. I’m actually using that reason as a lever to pry loose one belief and shoehorn in a new piece of information.
I agree. But what if that information proved to be wrong? You’ve convinced me that you have the reason, and because of my natural and human need for cooperation and recognition, I will say, “Of course, Andy, you’ve probably been there a number of times this week.” I’ll believe you because I want to keep the cohesion. That’s my natural inclination. But what if that information is wrong? I’m bringing this back to our day-to-day lives. I might even say you’re a figure of authority because you’ve been there a number of times. You have the knowledge, say somebody like Malone on Rogan, who is “the inventor” of the mRNA vaccine.
Here’s somebody whose reasons one could arguably believe. Sorry, to finish my thought: I can’t say that I’m an expert in anything, let alone everything. For me to go and confirm somebody like Malone’s reasons would take PhD after PhD, dedicating my entire life to confirming that his reasons are right, especially when he’s somebody with such credentials, an authority figure. He comes onto somebody like Rogan’s show, and Rogan is a popular cultural figure. These are reasons for me to believe that he’s right. Down the line, they might prove to be wrong, and they probably will. Do you see the predicament?
There’s no question that there are some difficult and complex cases. If I sat down with Malone and tried to counter his every argument, I’m sure I would struggle. In part because there’s so much bullshit out there that it’s hard to combat it all.
Doesn’t that then complicate our understanding of reason, or of what’s reasonable?
Let’s bring it back to the simpler, more everyday case of which way we should go to cross the river. This is a perfectly ordinary, everyday case of reasoning, of reason-giving, and it works to change your mind so that our thinking and behaviour become aligned again. We can achieve a shared task better because I’ve used a reason to bring our minds back into alignment. I argue that that’s the fundamental function of reason: reason evolved to help human beings do that thing. Notice I am changing your mind, deliberately changing your mind, so that we don’t waste time in a traffic jam. It’s not manipulative. I don’t have a Machiavellian motive to change your mind so that I can take advantage of you.
How do I know that? I have to trust it.
Let’s assume we have a long friendship, and you can trust me. The point is that Mercier and Sperber develop a view of reason where reason is fundamentally a manipulative thing. The cases where the use of reasons is more cooperative are the outliers for them. What I argue in the paper I wrote responding to Mercier and Sperber is that they can’t be right about that. Manipulative reasoning has to be the exception, not the rule, or the practice of reason-giving will die out. We’d never trust each other, and we’d stop listening to each other’s reasons.
It’s about the underlying motivation.
For the most part, reasoning works when we live in a group of people we trust. We tend to trust people who are in a better position to know than we are, and it works out pretty well. I defer to you when you probably know more about the best way to get there. You defer to me when I probably know the best way to get there, and it works out well for both of us. That’s the way a complex society shares knowledge. It’s a way of dividing up epistemic responsibility so that collectively we’re smarter than any one of us is alone.
Reasoning has to work first and foremost as a collaborative way to sync us up, and only secondarily as a way to manipulate and hijack the mind. If we keep that in mind, it turns out that the Mercier and Sperber thesis is a little bit off-kilter. Steven Pinker basically said in the preface to my book that I got it right, and that Mercier and Sperber need to adjust their view because of my article.
Revisit their thesis.
I appreciate the validation. I’ll send you a link to my paper, and maybe it will cure this mind infection.
I’ll share it, because I can’t say I looked at their work through that lens. I didn’t pick up on the sinister notion that it’s about manipulation. When I read their book, I always looked at it as a product of evolution in the sense that it is about cooperation, because we need to reason with each other in order to achieve social goals.
I’m presenting the simple version of the conflict. There’s a great deal of insight in their work, and they’re quite right that social tensions are part of what makes reasoning what it is. In my original paper on this, I said Mercier and Sperber are mostly on the right track, but they need to tweak their view in this way, and the emphasis becomes quite different. I notice that the view that reasoning is fundamentally a way we use to manipulate each other’s minds produces cynicism: the very cynicism that undermines trust, the very cynicism that makes thinking go haywire. My view, which is that reasoning is fundamentally more cooperative, doesn’t have that defect.
Now we apply that on the macro scale. We were talking about how our institutions have failed us, and about wrong information and actions being taken by institutions. If we take away the manipulative component and look at it with the most gracious set of eyes, we could possibly say, “Everybody makes mistakes,” about anybody that’s in power leading a country. At least in some of the more well-established countries and democracies, we have to hold the underlying assumption, and we come back to underlying assumptions here, that they’re acting for the right reasons, despite the fact that they might make wrong decisions.
Philosophers call this the Principle of Charity. If you want to learn from people, you go in assuming that their hearts are in the right place and that they’re being honest and fair-minded with you. We try to assume that as much as possible. Sometimes the trust we extend to people that way can leave us ripe for manipulation, but it turns out that our ability to think together degrades rapidly when we lose trust. When large communities of mutually trusting thinkers, like scientists, come together, they can do so much more to illuminate the way the world works than when you have a lot of petty tribes squabbling with each other, back-biting and mutually suspicious. There’s a lot of trust built into the way science works, and it’s part of what makes science work so well.
What do you mean? I know the answer, but I’ll give you the space.
Every scientist who earns their stripes has to work hard to develop a specialty where they can see further and more clearly than anybody else. That’s the test. That’s how you get a doctorate. In the process, you learn that there are a hell of a lot of other areas where you’re not an expert. When it comes to Quantum Physics, I defer to the Quantum Physicists. When it comes to Chemistry, I defer to the Chemists. I’ve developed a very narrow expertise in Epistemology, and I can apply it.
I have some things to say there, but you don’t gain that expertise without learning a lot of humility. A well-functioning community has people who are confident where their expertise is high and who are humble where it’s not. There’s a whole lot of junk on the internet that gets people to be passionately confident in stuff they have no business being confident about.
That’s one of the ways I’ve tried to reason with people. When they throw 1,001 facts at me, say about COVID, my principal defence mechanism is that I’m making a conscious choice to trust those who I conceive of as experts, who’ve dedicated their entire lives, building on countless lives before them, to answering this question. I’m not an expert in anything, let alone everything. I have to trust; we always have. We have always trusted those who we deem to be experts.
We are going haywire because there are, as you said before, questionable experts who have the titles and can talk the talk, but who are certainly going against the scientific consensus. We’re in a world where “Follow the science” has become a dirty phrase. When you say it, you are cast out as brainwashed, which is crazy.
Let me give you an example. I have felt the pull of some of these internet rabbit holes. Let me give you one of them. I was diagnosed recently with kidney stones, which means that crystals are forming in my kidneys that could cause some harm. The last time I had them removed, it was excruciating. This time, when I was diagnosed, I went online and looked for food supplements that would dissolve kidney stones. Sure enough, there are some slick websites out there that say, “This stuff works,” and they have hundreds of testimonials from people who are passionately devoted to it.
For all I know, they’re onto something. I go and ask my doctor, “What about this supplement?” He says, “I’ve never heard of it. Chances are there’s no good clinical study yet. I would toss it in the trash and forget about it.” What am I supposed to do? I want to believe in this stuff. It’s quite possible that a vast extended community of people experimenting on themselves can learn things before the lab researchers do. The biohackers who are out there trying different herbal supplements are probably going to stumble upon something before the experimentalists.
That’s how it started in the first place, before we had science: “Let’s try this mushroom. This one might take you to a whole new world. This one will kill you. This one is safe.” It’s experimentation.
I can see why people are saying, “I can do my own research online,” and sometimes learn things that will take their understanding beyond where some of the supposed experts, my doctor or my kidney specialist, can take it. There are times when internet research can actually take you one step beyond what the specialists say, but there are many cases where the internet will take you ten steps in the wrong direction instead of one step in the right direction.
That’s the dangerous part, because once the specialist has been invalidated, it’s far easier to start doubting the specialists time and time again.
When those doubts grow out of all proportion and become sweeping ways to feel superior to the entire establishment, that’s a seductive trap.
I like the story of Fred the Flat Earther, because it speaks to this point that the rabbit hole is so deep. Maybe you can tell the story.
I’m glad you like the story. Fred, the Flat Earther, dies and goes to heaven. He gets a chance to chat with God. He says, “God, I’ve been a Flat Earther my whole life. I have to know. Can you tell me now? Is the world flat, or is it round?” God says, “I’m sorry, Fred, but the world is very round.” Fred looks at him and says, “This conspiracy goes higher than I thought.”
Isn’t that the truth of it? Some of us are more vulnerable than others. There are correlations, and there’s a lot of research as to what makes people more vulnerable to conspiracy theories or conspiratorial thinking. What are some of those factors that you’re aware of?
Being disaffected and hopeless, feeling like you don’t matter, lonely and disillusioned, all of those things are contributors. Those are some of the larger sociological or psychological factors. I argue that if we want to examine the matter closely, we’ll actually look at the way a person’s mind’s immune system works.
There’s an interesting study out of Canada. It suggests that if you’re tempted to discard the idea that beliefs should change in response to evidence, suppose your faith tradition demands that you say, “Evidence isn’t all that; it’s certainly not the hallmark of responsible belief. My faith tells me that it’s more important to believe this,” then once you accept that, your mind’s immune system gets weaker, and you become more prone to conspiracy thinking. You become a mark for propagandists, and manipulative advertisers have an easier time hoodwinking you.
You’re saying religious thinking, and you talk about this in the book, makes you more prone.
I don’t offer that with any pride. I don’t want to say it in a way that disrespects my many religious friends. I know that a message like that feels harmful to them. I’m not trying to be mean-spirited here, but I am trying to illuminate a phenomenon that is quite real. There’s actually a lot of evidence from history that passionate religious devotion can unhinge your thinking. We think about religious extremism, but that’s a small fraction. It’s quite an open question whether more moderate religious faith can weaken your mind’s immune system to a more modest degree. The evidence is starting to accumulate that it can. If we care about mental immune health and wisdom, we need to rethink what faith means.
I distinguish in the book between good faith and bad faith. When you use the concept of faith to excuse dogmatic, inflexible belief, I call that bad faith and argue that it’s bad for your own mind and probably bad for humanity as a whole. If you’re resolutely hopeful and trusting because that’s the person you want to be, and you think you can help make the world a better place by being resolutely hopeful and trusting, that’s a beautiful thing. That’s the faith I can get behind.
Those that are stuck in the dogmatic sense of belief and faith oftentimes don’t see it, because their own confirmation bias tells them, “I’m acting in good faith. I’m fighting for humanity against the horde that is trying to manipulate, buy and own you.” How do we inject them with the mind vaccine? How do we reach these people?
We should start raising every generation to realise that hitching your identity to a set of beliefs sets you up to be close-minded about certain things. We shouldn’t be teaching people to identify with any ideology.
That strikes me as profoundly important, almost the crux of it all. What do you mean by that?
How about this as an alternative? Instead of saying, “These are my beliefs, and I’m going to fight for them, come what may, because my whole life is about these beliefs,” say, “I have a need to belong like everyone else. I’m going to find a community of fellow inquirers and seekers. Instead of holding any particular belief sacred, we’re going to be passionately devoted to learning and finding out, and trying to find the truth.” If you trade in a belief-based identity for an inquiry-based identity, your mind begins to open up. The opportunity to learn at a much more rapid pace becomes a beautiful side effect of that.
This is also how you finished the book. You talk about a new or upgraded Socratic Method. What do you mean, and how can we apply it?
Thank you. Socrates was the Greek sage who developed a method that still bears his name. The Socratic Method is a process of testing ideas with questions and going in with an open mind to see what will happen. Socrates would wander around and say, “Do you think you know that? Let me ask you a few questions.” He’d start asking brilliant questions that would soon reduce the person to admitting, “I guess I didn’t know that after all.”
Socrates would do that, and he would humble people. He made enemies that way, which is why they made him drink poison. That’s how Socrates’ life came to an end. But Socrates’ basic approach, asking clarifying questions with a genuinely open mind and open heart, then seeing what happens and being responsive to it, being ready to learn from whatever transpires in the ensuing discussion, observes some very simple rules about how to reason fairly. If we follow them, our minds become remarkably more resilient to misinformation and disinformation.
I take this famous Socratic Method, one of the most powerful mind inoculants of all time, and I soup it up in the same way that immunologists soup up inoculants to get vaccines. I can soup up the Socratic Method and make it something even more powerful for protecting our minds. The heart of it is learning to ask good, clarifying questions, and to be patient while you explore the many different features of an idea, both good and bad. Many ideas have good and bad features, and real, honest inquiry tries to look at them all and give each one its due. If the idea still looks good after you’ve carefully and fairly examined all of its pros and cons, then maybe it’s worth relying on; otherwise, set it aside and look for an alternative.
What a wonderful view of the world. If more people thought like that, what a wonderful world we’d live in, undoubtedly. I particularly like the separation of identity because it strikes me that we need to separate our thinking and the reasons that we embrace from the identity that we choose or are brought into. I like that.
That’s a powerful one.
Andy, thank you so much. We’ve gone a little bit beyond our agreed time, and I thank you very much for giving me so much of yours. It’s a wonderful book. I’ll be promoting it and sharing it with my peers, certainly in the Defence Force, because this is an important aspect.
Thank you, Maz. I’ll send you that article, and if any of your listeners want to support my work, bringing about The Cognitive Immunology Revolution, please check out our CIRCE website, www.CognitiveImmunology.net. We need lots of people to help us if we’re going to save the world from infodemics.
I’ll certainly be sharing that site, and I’ll be supporting it myself. On that note, Andy, we’ve discussed a whole bunch of different topics, perhaps not always as structured as I might have wanted. I want to make sure that I haven’t taken your space to say anything else. Have we covered the main points?
This has been lovely. I’m perfectly happy with this conversation. I’ve enjoyed it immensely. Thank you for the chance to share my thoughts.
Important Links
- Patreon – The Voices of War
- Buy Me a Coffee – The Voices of War
- Peter W. Singer – Previous Episode
- Carl Miller – Previous episode
- Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think
- The Joe Rogan Experience – Episode with Andy Norman
- The Cognitive Immunology Research Collaborative
- Sapiens
- The Enigma of Reason
- Article – Why We Reason: Intention-Alignment and the Genesis of Human Rationality