Why Belief Feels So Real—Even When It's Bullshit
Let’s get real: most people think they’re open-minded. They think they’re always challenging their beliefs and that they’re immune to echo chambers. But if you actually start poking at your own convictions (really poking, not just swapping one set of talking points for another), you’ll find the process is a hell of a lot messier, scarier, and more revealing than you’d like. This piece is about a recent rabbit hole I went down, the science and philosophy that kept me tumbling, and why almost every subject is worth diving into if you want to peel back the layer of bullshit your monkey ego puts in front of you.
The Rabbit Hole Spark
This journey is not always sparked by academia or formal research. Sometimes it starts from an almost obsessive need to understand the beliefs I hold, why I have them, and which ones are even credible. Unlike many who find their comfort zone following specific thought leaders or staying in the same myopic intellectual corner, I find myself constantly pulled toward new ideas, new perspectives, new rabbit holes. It’s a pattern I’ve noticed: my deepest insights usually start with personal questions, with wanting to understand something about myself. But beyond just following curiosity, I’ve learned that seeking truth requires understanding what constitutes good evidence. With over 300 documented cognitive biases affecting our thinking, we need a rigorous, falsifiable process to cut through our natural tendencies toward self-deception. After all, we evolved to survive, not to figure out what’s true.
A recent realization that a friend was stuck in an echo chamber, which led to a fear that I might be in one myself, triggered this dive. I soon started weaving together threads of neuroscience and philosophy, especially around the posterior medial frontal cortex (pMFC), which can basically be summed up as the “threat detection center” that fires up whenever your worldview is challenged (some actual neuroscientist out there is screaming at me right now; oh wait, there are only like 3 people who read this, including myself). The idea that you could literally dial someone’s tribalism, religiosity, or openness to outsiders up or down just by zapping a specific part of their brain (they can seriously do this with magnets) was the moment the floor dropped out from under my assumptions about my beliefs.
From there, it was a short jump to reading neuroscience summaries and research notes on the pMFC, cognitive dissonance, and the neural basis of tribalism and guilt. The more I dug, the more I realized how little most people (myself included) really understood about why they believe what they believe. The rabbit hole kept going: articles on Humean skepticism, podcasts on confirmation bias and echo chambers, debates, Sapolsky’s “Behave,” Gibson’s “The Ecological Approach to Visual Perception.” Every source chipped away at the illusion of rational certainty.
The Science: Your Brain on Belief (and Why You’re Not as Rational as You Think)
Let’s talk about the pMFC for a second. This region—tucked in the medial wall of your frontal lobe—is basically your brain’s internal alarm system. It scans for threats, not just physical but ideological. When you feel that jolt of discomfort in a heated debate, or when someone calls out a core belief of yours, that’s the pMFC lighting up. It flags cognitive dissonance (the “wait, am I wrong?” feeling), and it’s the same area that gets triggered in studies where people are forced to argue against their own positions. The more that region fires, the more likely you are to double down, rationalize, and stop listening.
But here’s where it gets wild: studies using transcranial magnetic stimulation (TMS) to temporarily suppress the pMFC show that people become less religious, less tribal, and more open to outsiders. When the “threat detection” is dialed down, existential threats don’t feel so threatening. People report lower belief in God and less hostility toward people who criticize their group. Once the suppression wears off, the ideological armor comes right back.
The implication? Our beliefs are not the product of calm, detached reasoning. They’re the result of emotional, evolutionary processes designed to protect us from uncertainty and danger. The more central a belief is to your identity, the more likely it is to be defended by these primal systems, not by evidence.
The Philosophy: Hume, Nietzsche, and the Limits of Knowing
Long before we had fMRI machines, David Hume was already onto this. Hume argued that belief isn’t a logical deduction; it’s a vivid feeling that something is true. He would say we don’t perceive causality; we infer it from habit. We don’t reason our way to our beliefs; we rationalize them after the fact. Hume’s skepticism is a necessary antidote to dogmatism: all knowledge comes from experience, and even then, induction is just a habit of mind, not a guarantee. (There are great arguments against Hume’s “all knowledge comes from experience,” but we don’t have time for those right now.)
Nietzsche, meanwhile, saw conscience not as a divine whisper but as the internalization of tribal fear. You don’t feel guilty because God is watching—you feel guilty because your group might punish you. Modern neuroscience backs this up: the pMFC flags moral violations as social threats, and the pain of guilt is as real (and as motivating) as physical pain. Suppress the pMFC, and people feel less guilt and show less ideological defensiveness. Turn it on, and you get the opposite.
Why Challenging Your Beliefs Is So Damn Hard (and So Necessary)
Here’s the uncomfortable truth: most people think they’re challenging their beliefs, but they’re really just rearranging the furniture in their echo chamber. They’ll read a new book, listen to a contrarian podcast, or argue on Twitter, but if they’re not actively seeking out disconfirming evidence, playing devil’s advocate, and exposing themselves to perspectives that make them squirm, they’re not actually growing. It’s just self-soothing in a different key.
If you find yourself listening to the same people, reading the same kinds of books, and holding the same core beliefs year after year, chances are you’re not changing. You’re not learning. You’re not growing. You’re just getting better at rationalizing your bubble.
The only way out is to make skepticism a habit: constantly ask, “What would falsify this belief? What evidence would change my mind? Am I actually looking for information that contradicts me, or just more of what I already think?” If you’re not always finding ways to immerse yourself in introspection, you’re not doing enough. Ignorance is a vast ocean, and every single one of us is drowning. There are no exceptions.
Every Niche Has a Niche: Why Diving Into Any Subject Teaches You About Yourself
This is why almost every topic (philosophy, neuroscience, psychology, logic, even the weird subfields) is exciting to learn about. When you start digging into any subject, you realize just how little you know. Every niche has a niche. The more you learn, the dumber you feel. It’s like chasing your own tail: every answer opens up five new questions. That’s not a bug; it’s the feature.
You start to see that “knowing” isn’t the same as “understanding.” Following a recipe isn’t the same as knowing the science behind it. Understanding bread means knowing yeast, ovens, flour, eggs, microbiology—the whole damn ecosystem. The more perspectives you gather, the more you realize how much you’re missing.
And every subject you study becomes a mirror. Philosophy, psychology, neuroscience, biology—they’re not just about the external world. They’re about how you think, how you react, how you deceive yourself, and how you can (maybe) get a little bit closer to the truth. Every time you challenge a belief, you learn something about your own mind—its strengths, its blind spots, its capacity for self-deception.
What Else Can We Do in a World Like This?
So what’s the point? If every subject just reveals more ignorance, if we are just meaningless and purposeless monkeys walking around on a ball in an infinite universe, what else is there to do but work on yourself?
Almost everything can teach you something. Philosophy teaches you how to think (and how to spot your own bullshit). Logic helps you avoid dumb-ass beliefs and know when to say “I don’t know.” Neuroscience and psychology show you why you react the way you do. The scientific process gives you a framework for countering your biases and seeing the world as it is, not as you wish it to be.
If knowledge implies rational thinking, and rational thinking implies skepticism, then you have to be okay with saying “I don’t know.” You have to have a process for finding truth: a way to falsify your own beliefs, to actively seek out what’s wrong, not just what feels right.
That’s why I find almost every subject interesting (except, honestly, politics). Because every topic is a tool for self-exploration. Every niche has a niche, and every rabbit hole has another tunnel. The more you learn, the more you realize you don’t know shit. And that’s the most honest place to start. If you have a visceral reaction when someone offers a counterargument to a person you’re always listening to, follow that shit and you’ll find out what I’m talking about.
Final Thoughts: Your Beliefs Aren’t You
Here’s the kicker: your beliefs aren’t you. They’re habits of thought shaped by biology, trauma, culture, and circumstance. They can evolve. They should evolve. Your brain isn’t built for truth; it’s built for survival. But once you realize that, you can start hacking the system. You can build habits of thought that resist the pull of certainty. You can cultivate the kind of mind that doesn’t just believe, but questions. And maybe, just maybe, that’s where real freedom begins. That’s the playground of the free-thinkers.