This essay starts with a simple question: how do we know what is true? In answering this I examine the collapse of trust in society alongside the rise of ideology as a substitute for rationalism. I argue that rationalism only works if we have the time and resources to enable it, and that these criteria are no longer being met. I posit that much as the Enlightenment killed God, we are watching it happen again. Rationalism is dying - the framework of truth we had is now breaking under the weight of volume, complexity, mistrust and ideology.
What happens when rationalism fails? What happens when we kill God a second time?
Quick note: throughout this essay, I use the term “rationalism” in a broad sense. Academically it refers to the philosophical tradition that privileges reason over experience (usually in contrast with empiricism). I mean something different by it here.
I use it as a catch-all term for an approach to truth-seeking that relies on both critical reasoning and empirical evidence - a sort of evidence-based rationalism, if you like. Popper called this “critical rationalism”, but applying that term to thinkers who lived long before Popper feels a bit ahistorical, and invites the kind of philosophical debate over labels that I generally find dull.
All in all, by “rationalism” I mean the use of evidence and reason to arrive at truth.
1. How do you know what is true?
The vast majority of things we hold to be true, we take on trust.
Sure, each of us holds some knowledge that we’ve verified through our direct experience. And equally there are statements which are true by definition (analytic truths - bachelors are unmarried men and the like). But if we were to take an inventory of all the things we hold to be true about the outside world (synthetic truths), most are handed to us. We defer to authority. They are taken - in large part - on trust.*
*Yes, there are those that doubt the existence of the outside world - the “you create your own reality through your thoughts, there is no objective reality” manifestation types. I’ll talk about this perspective in another piece.
I feel like a heretic when I say the following:
While I wholeheartedly believe in the reality of things like the planet Neptune, the continent of Antarctica and the long term dangers of antibiotic resistance, I have not personally verified any of these truths to any great extent. I believe in these things in large part because I trust the myriad of sources - people, papers, books and so on - that have told me that they are true.
Of course, my trust isn’t blind. Anyone even slightly discerning will do some evaluation (brief though it may be) to see if a new piece of information fits in with their pre-existing understanding about reality. I do my best to make a habit of this.
In the first year of my physics degree I studied a module in Earth sciences. Several lectures were dedicated to the evidence that had led the majority of scientists to conclude that the Earth was warming disproportionately due to human activity. We were shown graphs, statistics and data. Like most who have actually sat down and looked at the evidence, I found the argument presented to me extremely persuasive. Given my understanding of physics - energy, heat, and so on - it all made sense to me. Though I had believed in man-made climate change before, I was now far more certain, because I understood it - and the arguments behind it - in far greater depth.
But there were gaps in my knowledge. I was no expert in atmospheric science (or even physics for that matter - as mentioned, I was still in my first year). Nor was I an expert in economics or geopolitics. I hadn’t personally sifted through mountains of data to see if there were any contradictions or tell-tale signs of political spin. I hadn’t double-checked the numbers. I hadn’t collected the data myself. I was still, in a very real sense, relying on trust. I was trusting that all these scientists hadn’t fabricated their findings for the sake of publication. I was trusting that I wasn’t being presented with extremely selective evidence. I was trusting that geopolitical forces hadn’t compromised the entire field.
There are of course other reasons for my certainty. Climate-change deniers (or sceptics depending on your affiliation) generally have terrible arguments that betray a lack of understanding of basic physics. I have previously engaged with them in good faith and have always found them to be disappointing.
And yet, there is inevitably a certain point at which my personal climate change knowledge ends, and I am forced to take the rest of the claims made by the scientific community on trust. I could construct arguments as to why my trust is well-founded, and why scepticism would be absurd, but they would be just that - arguments, not absolute truths.
And these arguments would - if I’m being entirely honest with myself - partly be motivated by emotion, not just reason. I see myself as a part of the scientific community - the kind of person who has the right answers.
I believe in man-made climate change for a long list of reasons - it aligns with my understanding of physics, I trust in the credibility of climate scientists and so on. But in part I believe in it simply because I like the people who told me it was real.
This is worth examining.
2. Trilemmas
There’s an epistemological thought experiment known as the Münchhausen trilemma - coined by the German sociologist and philosopher Hans Albert - which articulates the problem of justifying any truth.
Albert says when justifying knowledge, we essentially have three avenues available to us:
Infinite regress - “It’s true because the claims that support it are true”
Dogmatic justification - “I take this to be true”. Truths by assumption, axioms, (e.g. the axioms of mathematics)
Circular justification - “It’s true because it’s true”. Truths by definition (e.g. bachelors are unmarried men)
Albert’s point was to show that most claims we make about the world rely on other supporting claims being true. That is, we use option 1. The problem with option 1, however, is that we can simply keep asking “but how do you know the evidence is true?” ad infinitum.
Try this for a moment - take something you believe to be true about the outside world. Why are you confident in this? And for that answer, ask again, why are you confident in that? Eventually the regression ends with either a dogma (an assumption that you simply have to make) or a circular definition.
To put it another way, the foundation of all knowledge is made up of either assumptions or circular reasoning. This isn’t great.
To my mind there’s a slightly more useful version of this thought experiment, which comes from another German philosopher: Jakob Friedrich Fries, who offers us Fries’s trilemma. It differs on one point. Fries says we can justify knowledge using:
Infinite regress
Dogmatic justification
Reliance on perception or experience (e.g. the self-evident truths of I feel pain, I see a colour)
Option 3 covers much of the truth we collect in person, and it is important to note because it ultimately accounts for our data input - how we interface with reality.
But while this is useful for personal justification, things change when we use it to justify knowledge to others (which is what the Münchhausen trilemma is about). Suddenly, when we attempt to communicate our subjective truth, our experience becomes testimony. It becomes part of option 1 - infinite regress: “but how do I know you saw a colour? How do I know you feel pain?”
Both of these trilemmas raise some questions. Firstly, given the fragile foundations of truth, how do we avoid succumbing to tin-foil-hat uncertainty? And secondly, given conflicting claims about the world, how do we decide what’s true?
Philosopher Karl Popper offered an answer: conjecture and criticism. Or in one word: argument.
Much like my conviction in the reality of man-made climate change is bolstered by arguments, so too do we all use arguments to justify our beliefs. Arguments are the scaffolding and cement that tie together our facts and observations - and that justify why we inevitably outsource, via trust, the gaps in our personal investigation and knowledge.
Here’s another example: our conviction that the stars in the night sky are like our sun - monstrously large continuous nuclear explosions. No human - nor any man-made device - has ever been up close to another star. Our assertion that they are all like our sun is based on a mixture of evidence and explanatory argument. Most people have not studied the evidence, nor are they privy to the technical details of the arguments involved. Back a layman into a corner, and their arguments will shift from the scientific to the personal. Their arguments will be about why we can trust scientists, as opposed to the intricacies of telescopes.
To be clear, Popper’s work was mainly focused on scientific knowledge. His insight was that knowledge grows not from certainty, but from criticism and iteration. It’s the clash of ideas - the process of refining arguments, coupled with experimentation, explanation and falsification - that builds understanding and moves us towards truth. This process does not eliminate uncertainty entirely (no system does), but it offers us a way to navigate it intelligently.
My contention is that Popper’s approach extends beyond scientific knowledge. I’d wager that the fundamental unit of all knowledge creation about the outside world (and all of science) consists of arguments of this form: ones which leverage criticism, conjecture and experiment. These, coupled with explanation, reason and trust allow us to build up a framework of truth and shared knowledge.
This is a great system. Look around you. It’s evident that we can overcome these trilemmas and generate knowledge. David Deutsch once described knowledge as information with causal potential. Technology, medicine, and the very device you’re reading this on are all information with the power to enact nuanced causes and effects in the world - and their existence is best explained by asserting that we’ve successfully gained some knowledge.
The problem is, this system requires a few things to work - one of them being a shared commitment to truth, and another being a shared commitment to valid data. Currently, people trust vastly different sources, arriving at vastly different worldviews, and we’ve lost the ability to self-correct with rationality.
Why?
2.5 A quick aside
As with all philosophical conundrums, the questions around how science works, and how we make scientific knowledge, are still hotly contested.
Popper’s methods are by no means universally accepted as a satisfactory answer to either of these questions (either prescriptively or descriptively). Plus, alternative answers as to how and why science works - and what its methods consist of - vary wildly. Feyerabend and his adherents assert that there is in fact no scientific method at all, and there is even a school of thought that questions whether science is actually a truth seeking enterprise to begin with (scientific anti-realism). It would be deeply ironic if - in an essay that purports to be about truth, trust and the pitfalls of dogma - I failed to mention this.
I’m also not claiming that science and truth seeking are a purely rational enterprise (there are excellent arguments for the importance of irrationality - be that intuition, accidents, random creative hunches and so on). Michael Strevens has an excellent book (which I would thoroughly recommend) called The Knowledge Machine that makes the point that if anything, the strict focus on empirical evidence and testing in science could even be considered irrational.
Still, many would agree that science is at the very least iterative, requiring some degree of rationality, and that truth-seeking as a whole has a trust-dependent component. These elements are under threat. Popper’s approach is a useful lens for highlighting the decay of these vital ingredients. Feel free to quibble with me in the comments if you so wish.
3. Don’t trust what you see online
I remember being 12 years old in a school assembly, where our head teacher passionately proclaimed “don’t trust everything you read online, anyone can put stuff on there”. My friends and I sort of knew to be wary of the internet, but it didn’t exactly seem like a threat. It felt more akin to an unreliable book that you had to go out of your way to read. At that time - back in 2006 - smartphones as we know them didn’t even exist. The main way we could get on the internet was through a home computer, and that was mostly a Microsoft Word and RuneScape machine rather than a source of misinformation and propaganda.
My experience of the internet today is in some ways very different, but in some ways very much the same. Most of what I learn about the world I learn through the internet. And yet, I find myself increasingly paranoid about being misled or lied to - a fear that feels justified.
Firstly, on a daily basis I will read a piece on some topic that leverages some school of thought, historical figure or thinker I have never even heard of. Often it will be brilliant, revealing some deep insight about a topic I do not know much about. Stumbling across insights like these is in one sense wonderful - learning new perspectives is enriching in itself - but it is also deeply troubling. With every new insight, my known unknowns grow that little bit more, and my confidence in my understanding of the world diminishes further.
There’s just so much I don’t know, and I can only quietly imagine the size of the dark continent of unknown unknowns - what I don’t even know I don’t know.
But secondly and more importantly: it is usually impossible for me to rationally assess on the spot - using the knowledge and understanding that I personally possess - whether or not what I’m being told is true, or even slightly accurate. In order to check, I need to research. But that means going to other sources.
In this way, the internet has brought the Münchhausen trilemma into sharp focus. Every claim online leads to another link, another source, another rabbit hole of evidence to evaluate and check. It’s a technological infinite regress. Option 1 of each trilemma. Sometimes I do spend hours going down rabbit holes, but of course I never reach the bottom.
In principle, I’d like to apply Popper’s methods - to use reason and criticism to parse out what is true and what isn’t. To arrive at some balanced hypothesis given the information I’m presented with - one which takes into account the biases of the various news outlets or pundits, cross-referencing with what I already know, and so on.
However, two things prevent me from doing so:
I’m woefully underqualified
There simply isn’t enough time
As an example, suppose I read some articles which talk about the economic impacts of a new piece of European legislation on AI safety. One article claims that the new legislation will hamstring businesses and do nothing to improve AI safety globally, while another says that leveraging open-source AI, coupled with the fact that most research will be spearheaded not by SMEs but by large businesses, means that the legislation won’t have much of a negative impact on the growth of AI in Europe.
In order to wield reason to rigorously assess the claims in these articles, I’d need a background in:
Computer science and AI
Economics
European Law
American Law
Business
….and possibly more
And this is just one news story. To consistently find the signal amongst the noise in the increasing chaos and complexity of the world we live in today - to truly and deeply understand it - it strikes me I would also need a background in:
World history
Journalism
Epistemology
Psychology
Detailed understanding of the inner workings of world governments and policy
Microbiology
Geography
Mathematics and Statistics
And so on.
Who possesses a CV like this? Certainly not me.
Secondly - as mentioned - even if I had the polymath background to broadly apply the Popper approach, I would not have the time. This is because (a) the amount of information is overwhelming, and (b) as the world becomes more interconnected, more populated and more complex, the amount of information that could be relevant to any particular claim also increases. Who has the cognitive resources to even determine what all the relevant information could be, let alone process it all?
I’m not advocating being a tin-foil-hat sceptic here, but I am being serious. Not knowing - and not being able to know - what’s true on the internet is a well-known problem that is only going to worsen. AI-generated video and deepfakes on their own represent an existential risk to information credibility, never mind all the other threats that continue to emerge and grow more potent.
When we observe conflicting claims about reality online, most of us not only lack the time to fact check, but we also lack the expertise necessary to critically evaluate. And when we do fact check, the seemingly infinite regress doesn’t just act to overwhelm us, it actively erodes trust in all sources. Every claim leads to another source, and often these sources lead to more contradictions and questions - it becomes impossible to determine what’s credible without prior knowledge.
And we’ve seen this happen repeatedly, one of the most damning examples being the anti-vax movement during the Covid-19 pandemic. Theories about microchips and population control were peddled on one extreme, while on the other guilt-tripping ideologues would proclaim that there were no valid questions to be asked. For the average person, evaluating the claims and data emerging about the virus and vaccines would have meant diving into scientific studies and navigating torrents of misinformation. Most of those who even attempted to do this lacked the expertise - so people picked a side.
A more Popper-inspired approach would have been to concede that there were things we didn’t know about the virus and vaccines - that science never deals in certainties. And yet, it would have stressed that the overwhelming evidence and data suggested that having a vaccine would leave one better off than not. Some rational voices held this position at the time. However, having these nuanced conversations is extremely difficult in today’s media landscape, even if you are equipped with a rigorous epistemology.
Until recently one could have the illusion of being well informed. There were (purportedly) trusted authorities, slower news cycles and fewer moving parts. One could read the morning newspaper before heading off to work and feel as if they knew what was going on - not all that was going on, but the things that ought to be known.
But now that’s not the case. Every news outlet and pundit has an explicit agenda - even if that agenda is merely to get your attention by any means necessary: sensational headlines, clickbait articles, negativity bias and more. Accusations of fake and misleading news are made every second. There are shockingly convincing deepfakes. There’s a replication crisis in science, which is only worsened by the staggeringly bad standard of scientific journalism in the media. Old favourites like propaganda, corporate and government agendas, and censorship continue to be problematic.
And now we have new threats: the sheer volume of information, the complexity of information, the cultural loss of nuance and so on.
A half-decent solution to all this noise and madness? Outsource your trust to a handful of carefully selected sources with integrity, and modulate that trust by rationally assessing the information they disseminate in the areas where you do have expertise. Further, understand that this is an iterative and ongoing process.
We are not doing this.
4. The Ideological Shelter
If there are two things that are almost universally unfashionable, they are appearing ignorant and being wrong. We live in a world where we are defined by our opinions. We do not have a culture of admitting to mistakes or focusing our attention towards corrections. Thinking is taxing, and it often leads us to realisations that our initial conception of the world was misguided - that the world is not black and white, but bursting with all kinds of colour, including shades of grey.
In the absence of an absolute source of rational truth, we long for a new place to anchor ourselves - a centre from which we can build up the rest of our worldview. A place with easy answers, a nice story and narrative. A way to reduce cognitive load. A map of the territory.
Ideological alignments are these maps. They solve the epistemological and cognitive problems for you.
Too much information? No problem. Ideology dictates which sources to ignore, and which to listen to. It tells you which information you can immediately write off as fake, without spending cognitive resources rationally examining it.
How do I know it’s true if I can’t check myself? No problem either. Ideology dictates where you place your trust. You don’t need to spend time or resources regressively fact-checking.
You and I have sources and authorities online (and in person) that we trust. Articles and media shared from these sources we’re more likely to believe at face value. Here’s a question: how often have these sources been wrong? What mistakes are you aware of that they’ve made in the past few years? Surely, they must have been wrong at some point - can you name any examples? If not, isn’t that concerning?
Ideologies are far more than just cognitive shortcuts. They act as paradigms in themselves. They provide not only a worldview, but also meaning and emotional reassurance.
In a world which feels progressively more chaotic and confusing, ideology is particularly seductive. We’re often attracted to people who radiate certainty - the types of people who are confident in their proclamations about the world. The ones who get it. An ideology gives a sense that you’re on the right side of history, that you’re part of the inner circle, that you’re one of the chosen ones - that you belong.
But this seduction comes at an enormous cost. By outsourcing your critical thinking to an ideology, you risk becoming blind to its flaws. Confirmation bias creeps in. Dissenting voices are not only dismissed but often demonised - they’re considered wrong, bad, evil. Of course, sometimes this might be the case, but given that opposing ideologies often participate in mutual demonisation, it clearly cannot always be the case.
We’ve heard the phrase “facts don’t care about your feelings” ad nauseam in online discussion. But more often than not it’s your feelings that don’t care about the facts. We don’t want the truth, we want to be right.
Your feelings only care about certain facts, from certain people. Ones that affirm who you are, and ones that affirm who you are decidedly not.
How often do we parrot the lines of others we identify as belonging to our own communities, with no real process of evaluation? Or quote and laugh at the absurd utterances of the opposing side, having done no assessment ourselves? How often do we indulge in these behaviours simply because it feels good?
Much like I could argue at length about why trusting climate scientists is justified, so too could a regular watcher of Fox News argue why I shouldn’t. For every argument I can make in favour of the heliocentric theory that kick-started the Enlightenment, a flat-earther could take the same evidence and twist it, reframe it, or outright deny it. Exposing weak links in their web of reasoning is generally less effective than understanding the emotional reasons these ideologies appeal in the first place.
Without conceding that we have emotional ties to our views, and without the ability to say from the outset that some parts of our worldview may be incorrect, most rhetoric will fall on deaf ears. Popper’s framework works only when we have the time, trust, and shared commitment to engage in genuine argument to get to the truth.
This applies even to well-meaning ideologies. Let’s take the case of extreme climate activists. It would not be wrong to say that these groups have sometimes amplified doomsday warnings to drive urgency. You could argue that this is morally justified - perhaps even necessary. However, this approach risks distorting a complex issue and contributing to mistrust when reality doesn’t immediately align with the most catastrophic predictions. Again, I must stress that this isn’t to deny the overwhelming evidence of climate change or its dangers - in my view it’s an undeniable existential threat. Nor am I pinning the blame for the existence of climate-change scepticism on extreme activism. I’m merely pointing out that ideology can simplify truths in ways that backfire, even when that ideology is well-meaning and ostensibly grounded in truth. It’s difficult.
Uncertainty is scary. Scepticism is often read as ignorance. Failing to have an opinion on something is not only unfashionable - silence is often equated with violence. There’s huge pressure online to have an opinion on the latest crisis, to talk about it, to educate yourself on it. “Educate yourself” is great advice; however, what is often meant by the phrase is “educate yourself on my views” - and sometimes “to question my views is an act of intolerance”. That is: align yourself with the right ideology, or else. You could argue that this social pressure is often morally justified, but there’s no doubt that when it comes to the question of truth, the incentive structure leads many to performatively parrot whichever view they think will get them the least flak. Over time this breeds mistrust - not just of others, but of our own convictions.
Ultimately, almost everyone would nod along and agree with the advice my head teacher disseminated nearly two decades ago: don’t trust everything you read online. However, what I suspect we quietly mean by this is “don’t trust those sources, trust my sources - because they affirm my story about the world and who I am.”
Ideology simplifies the world, but it also necessarily distorts it. Even if the intention is to do good and honestly represent the world - even if the map is accurate - a map is useful fundamentally because of the information it leaves out. We could not navigate the roads if our maps illustrated every tree by the roadside. By forgetting that we are using maps in the first place (and that even the best ones are prone to error), we risk becoming blind adherents to worldviews. We dull the blade of critique and conjecture that is so vital to knowledge generation, and we accept or dismiss evidence because it feels right to do so, not because we’ve reasoned it.
This is how we killed God a second time.
5. How to kill God
Before the Enlightenment, truth came from God, not reason.
The infinite regress of justification ended with religious dogma (or, depending on your beliefs, religious decree). I.e. it’s true because God made it so.
When the Enlightenment technically began is a matter of fun historical dispute. The Enlightenment followed and overlapped with the Scientific Revolution, which was a time of incredible scientific discovery and technological advancement.
Some historians argue that the Scientific Revolution began with Copernicus, specifically in 1543 when he published De revolutionibus orbium coelestium aka On the Revolutions of the Heavenly Spheres. This was the work that claimed that the Sun - not the Earth - was at the centre of the solar system: the heliocentric theory. A staggering blow. From there a torrent of rationality emerged in the following centuries. Descartes, Spinoza, Leibniz, Newton, Kepler, Galileo, Francis Bacon and so many more showed an alternative approach to truth:
In place of “God commands it” they uttered “I’ve reasoned it.”
Understanding all the moving parts that made up the Scientific Revolution and the Age of Enlightenment is a task that can only be meaningfully tackled by volumes of books. That being said, I’d like to point to two parts of these movements in particular:
The first is Francis Bacon’s New Atlantis. Bacon was a polymath who lived an extraordinary life, and was determined to essentially put reason and philosophy to use. His work New Atlantis imagined a society where scientists collaborated, shared knowledge, and through building collective understanding of the world ushered in a utopia. It was a manifesto for a society built around the pursuit of truth.
To call this work a hit with the would-be scientists of the time (and of the future) would be an understatement. Bacon’s ideas were as radical and as inspiring to future natural philosophers as the Beatles’ albums were to all the musicians who followed Ringo, Paul, John and George. His vision fuelled centuries of progress.
The second came as part of that progress: Bacon’s vision inspired a group of men to set up one of the oldest scientific journals - Philosophical Transactions. First published in March 1665, this journal did something radical: it published knowledge established not by religious revelation but by testimony. Affluent, experimentally-minded men would perform experiments in front of a crowd of witnesses and record what was seen. These accounts were written up and published in Transactions to be read, critiqued, examined and repeated. This set the stage for scientific collaboration and peer review. It meant that there was accountability for claims, and that results could be verified. Much of our progress in science we probably owe to this idea alone.
These two vitally important parts of the Enlightenment point to the following: while argument-fuelled rationalism was the new machine by which knowledge was generated, what held that machine together was trust. The entire enterprise relied upon it.
It worked. Western science was born. There was an explosion in new political and social ideals - including the separation of religion and state. Divine authority was dethroned. The whole enterprise is generally seen as a positive arc in history - an acceleration of human flourishing.
But it wasn’t all sunshine and roses.
6. The assumptions of Rationalism
Nietzsche chillingly described the emergence of the Enlightenment with the line: “God is dead.”
This macabre (and, let’s face it, kind of metal) statement was Nietzsche’s way of pointing out that the influx of scepticism towards religious claims had not only far-reaching societal impact, but also deep psychological significance.
For in the pre-Enlightenment days, God wasn’t merely the source of truth - God was the source of goodness and meaning. Nietzsche claimed that as faith in God was undermined, people would lose a shared moral foundation and epistemology. No longer would they share a hymn-sheet of what the concepts of goodness and truth fundamentally meant. He worried that without a God, humanity would run the risk of falling into nihilism - a state of decay where life has no meaning. No concept of goodness or truth means one has no north star. Life loses its lustre.
How to combat nihilism? Nietzsche’s prescription was that people would need to create their own values: to become an Übermensch. To an extent his prescription worked. Shared sources of meaning and truth did emerge - the world didn’t fall into nihilism.
In fact the Enlightenment did one better: it didn’t just kill God - arguably it replaced him.
Reason, science, and humanism became the new pillars of truth and meaning. The shared framework of rationalism allowed western societies to cooperate, innovate, and progress.
But Nietzsche saw the writing on the wall. Without a divine anchor, without some shared sacred doctrine, truth and meaning became human constructions - fragile, subjective, and easily challenged. Blind nationalism, ideological extremism, or worse could fill the void left by God’s absence. Sound familiar?
We live in a world where increasingly nothing feels true, nothing feels good, and nothing feels worth striving for. Between algorithmic distortion, fractured realities, the breakdown of trust, the signal-to-noise ratio and the sheer monstrous complexity of the world, rationalism no longer works for most people. It’s not simply too much work, it’s a fundamentally impossible task.
And lord above, we cannot seem to admit this.
There were three hidden assumptions to rationalism:
That we could trust testimony
That it was (in theory) possible for one to have the time and resources to personally assess claims made about reality.
And ultimately, that being wrong was OK. That is to say: the truth was something to figure out. To err was human. Iteration and discovery were the process. We didn’t want to be right, we wanted to get to the truth.
These assumptions no longer hold.
When we can no longer trust testimony, when the complexity of the world outpaces our ability to assess it, and when admitting we’re wrong feels like failure, rationalism begins to crack. And that crack is opening up a god-shaped chasm.
7. The Four Horsemen, and the new Pantheon
Nietzsche asked, after declaring God dead: “What sacred games shall we have to invent? Is not the greatness of this deed too great for us? Must we ourselves not become gods simply to appear worthy of it?”
In God’s place we put the rational human. A kind of Übermensch - the scientist, the journalist, the well-informed citizen. Our new god. We heralded mathematics and the sciences as the most respected subjects. Technological advancement meant supremacy. Einstein became the mascot for genius.
But chaotic forces have come to bear, and they are killing the usurper. The divine is dead, and soon the rationalist will be too. What are we to do with two corpses?
Some wish to resurrect God. In some sense this is literal: we are seeing the re-emergence of movements like Christian Nationalism and a defection back to religious doctrine. However, conspiracy theories, extremism and ideological purism are doing much the same - they offer a pre-baked answer to the question of the state of the world, its enemies, and where one sits within it. They are attempts to reanimate old gods - offering certainty and belonging in a fractured world.
Meanwhile, technocrats and futurists dream of new gods: AI systems that can transcend human fallibility and arbitrate truth. What is the race for superintelligence other than this? How long is it until people unironically accept statements because the latest super-version of ChatGPT said it was so? This is divine decree all over again.
And some are feebly trying to keep the rational human alive. Trying to consume enough, write enough, think enough to understand what the hell is going on, and what we should do about it.
As we stand over the corpses of God and the Rationalist, we must reckon with the forces that brought us here. The Enlightenment gave us the tools to build a world of progress and reason, but those tools are now straining under the weight of what they built. Four forces in particular have emerged as the horsemen of the rationalist apocalypse:
Volume: The scale of information we encounter is only increasing. There is too much noise, too much to know and too much to process.
Complexity: We lack the expertise to parse and process the tidal wave of information we’re presented with. The nuance and specialisation required to critically engage with modern problems comprehensively are beyond the reach of most individuals. Most of us are guessing most of the time.
Mistrust: Most sources have an agenda - if not political, then monetary. By paying with our attention, we compromise the incentive structures of most sources. There are few universally trusted sources on which people can base their discussions.
Ideology: Ideology seduces us with easy answers, tribal belonging, and acts as an anaesthetic against the anxiety of an uncertain, often incomprehensible world. Being right is more important than seeking the truth. This dulls the blade of reason.
These horsemen don’t work in isolation, they trample over rationalism as a group. Volume feeds complexity, complexity enables mistrust, and mistrust sends us into the arms of ideology. Ideology reduces our capacity for nuance, further strengthening mistrust and so on and so forth.
The horsemen are loose. The corpses of God and rationalism lie at our feet. As old gods are resurrected and new ones are built, we face some questions: How do we challenge these horsemen? How will we find a new foundation for truth? Whoever becomes the king of the gods in this new pantheon will shape not just what we believe, but who we become. Who should win? Who do we want to win?
Part 2 coming soon. Subscribe to hear about it first.
I always liked how in David Deutsch’s books he has a brief summary after each chapter, so below is my attempt at a TL;DR of the above.
Summary
What we believe to be true is, in large part, based on trust. We all have explanatory frameworks we use to critically examine information, but at a certain point we take things on authority. There is often an emotional component to who we decide to trust.
The Münchhausen and Fries trilemmas illustrate the fundamental difficulty of establishing complex truths. This situation can be navigated - and knowledge can be created - using Popper’s approach: argument (conjecture, criticism, explanation and so on) alongside iterative experiment. I assert that these methods apply beyond merely scientific knowledge - they apply to broader knowledge about the world as a whole.
I also note that the debate around which approaches work - either descriptively or prescriptively - to reliably generate scientific knowledge is still very much unresolved. However, I advocate for Popper in the context of this essay, as his approach is a useful way to illustrate and highlight the problems that are occurring.
The internet is where most people get their information; however, sources are very difficult to critically analyse and verify. Part of this is down to technology improving (deepfakes, bots, more people creating more information), and part of this is down to the complexity of information increasing (the world is becoming more interconnected, and harder to understand). This means we have to rely on trust more.
However, ideology puts people into echo chambers. Trust is given from an emotional place more often than a rational one when we are dealing with subject matter we cannot critically assess. Culturally, we care more about being right than about finding the truth. Mistrust is at an all-time high. These forces uniquely undermine the Popperian approach to truth-seeking.
The Enlightenment killed God by replacing divine revelation with rationality. Nietzsche warned that this shift would have long-term consequences - psychologically and societally. By removing God, we destroyed not just a source of truth, but a foundation for meaning.
The four horsemen of the rationalist apocalypse - Volume, Complexity, Mistrust, and Ideology - cripple our ability to establish truth. Rationalism was the new God we worshipped, and now that God is dying. Different groups are responding in different ways: some resurrect old gods, while others are attempting to create new ones. Nietzsche warned us as to what happens when societies lose a shared foundation for meaning and truth. The nihilism he feared no longer seems hypothetical.
If truth is important to a functional society, how we combat the horsemen, and how we want the new pantheon to play out, is something worth critically addressing.