YOU Are Not a Good Person: A Rational Confrontation
You are not a good person. You are self-deceived. This is a confrontation, not a comfort. Deep down, an inconvenient truth lurks in your mind – an elephant in the brain that you refuse to see. Like the proverbial elephant in the room, it’s large and obvious once pointed out, yet we studiously ignore it. What is this elephant? It is the collection of hidden motives, secret self-interests, and unflattering truths about your behavior and mind that you prefer not to acknowledge. It’s the subtle but pervasive evidence that much of what you believe about your own goodness is a strategically constructed deception – a lie you tell yourself so that you can better lie to everyone else.
This analysis will be unapologetically blunt. It will drag your most cherished self-perceptions into the harsh light of rational scrutiny. It will force you to confront the evidence from evolutionary psychology, cognitive science, and philosophy that your morality, altruism, and virtue are often shams. We will follow the lead of The Elephant in the Brain by Kevin Simler and Robin Hanson, who document how humans systematically hide their true motives from themselves. Using their insights and a wealth of empirical studies, we will dissect the myriad ways you are not who you pretend to be – not to others, and not even to yourself.
Why such a harsh indictment? Because only through uncompromising honesty can we begin to see the “important but unacknowledged features” of our minds. Human beings, including you, have evolved to be master hypocrites. We wear a “wise veneer” of virtue, while underneath churn selfish drives, status obsessions, and survival impulses. We construct lofty explanations for our actions – “I gave to charity to help the needy”, “I spoke up because it was right”, “I deserve this because I worked hard” – when often the real reasons are more self-serving: we gave to look generous, we spoke up to signal loyalty, we claimed rewards as entitlement rather than luck. Our brains are expert lawyers and publicists for our selfish genes, spinning stories that cast us as noble, kind, and justified, even when the facts say otherwise.
In the pages to come, we will mercilessly strip away these stories. We will examine the evolutionary logic that built our capacity for self-deception – how deceiving ourselves conferred an advantage in deceiving others. We will see how your conscious mind often plays the role of a naïve spokesperson, blissfully unaware of the dark machinations occurring behind the scenes in your own brain. We will challenge the social norms that encourage polite façades and taboos against speaking of ugly motives. And we will dive into hard-hitting thought experiments and data – from Peter Singer’s famous drowning child scenario to psychological studies of altruism, honesty, and cruelty – all to demonstrate the yawning chasm between the person you think you are and the person your actions reveal you to be.
Brace yourself. This will not be gentle. As Arthur Schopenhauer – a philosopher renowned for his pessimistic view of human nature – might say, truth often wears a stern face. If you flinch or feel defensive, remember: that is just the elephant in your brain trying to stay hidden. Our task here is to drag that elephant into view, no matter how much “you” (your conscious self) want to look away. In doing so, we follow Oscar Wilde’s wry advice: “If you want to tell people the truth, make them laugh; otherwise they’ll kill you.” There may be moments of dark humor or irony in what follows, but make no mistake – the intent is deadly serious.
By the end of this analysis, one conclusion will stand clear: you are not the paragon of virtue you imagine. You are a human animal with hidden motives in everything you do. Your brain routinely lies to you about why you behave as you do, preserving a self-image of goodness while excusing all manner of selfishness and moral failure. This is not an insult; it is a biological and psychological fact, backed by copious evidence. It is time to face it with eyes wide open.
So let’s begin by exploring why we even have an elephant in the brain – why evolution would mold creatures who are strangers to their own true selves.
The Evolution of Self-Deception: Why We Hide the Truth from Ourselves
It might seem paradoxical: why would evolution produce minds that deceive themselves? Wouldn’t it be better to have a perfectly rational, self-aware brain that knows exactly why it does things? In theory, yes – if the goal were simply to understand the world. But the goal of evolution is survival and reproduction, not truth. And in the relentless social competition of human life, lying is a powerful tool. Those who lie effectively often gain advantages – they can cheat, steal, or free-ride while avoiding punishment. However, lying brings a risk: others are adept at detecting lies. Over millennia of tribal living, our ancestors evolved keen senses for spotting cheaters and dissemblers. A telltale flicker of the eyes, a hesitation in the voice, an inconsistent story – such signs could expose a liar and lead to shame or exile.
Thus arose an arms race between deception and detection. According to biologist Robert Trivers, one of the pioneers of evolutionary psychology, this arms race shaped our very minds. His bold hypothesis: self-deception evolved as a strategy to fool others more effectively. In Trivers’ words, “We deceive ourselves the better to deceive others.” By hiding the truth from our own conscious mind, we hide it more deeply from potential observers. You cannot leak signals of guilt or doubt that you do not feel. If you genuinely believe your own bullshit, you will present a more confident, sincere front, and others will be more likely to believe you.
Consider what this means: natural selection may favor ignorance in certain parts of our mind. An overly self-aware person, who always knew their every selfish impulse, might inadvertently reveal those impulses through nervous behavior or subtle cues. But a person who has compartmentalized their mind – who pushes the ugly motive out of awareness and convinces themselves their motive is pure – will lie more fluidly. Their body language will be relaxed, their tone earnest. The deception will be convincing because it isn’t entirely a deception at the conscious level. As Simler and Hanson put it, our brains are like skilled lawyers or spin-doctors: “experts at flattery and excuses,” constantly editing the story to make “ourselves look as good as possible”. Meanwhile, the “self-conscious parts” of you keep your thoughts pure and untainted by the scheming going on backstage.
For self-deception to be useful, two conditions must hold. First, others must have partial visibility into your mind. If your thoughts and intentions were completely private, there’d be no need to lie to yourself; you could scheme openly in your head and simply lie with words. But human communication is richer and more involuntary than words alone. Our faces betray emotions, our voices tremble under stress, our eyes dilate when we’re excited or afraid. Psychologists have shown that people unconsciously leak signals – microexpressions, tone, posture – that can tip off an observer to our true state. Thus, to be an effective liar, it helps if even you don’t know you’re lying. By hiding reality from your conscious mind, you hide it from onlookers as well.
Second, there must be some benefit to misleading others. Indeed, the social world of humans is rife with competition where deception pays. From mating to status battles to politics, those who can manipulate others’ perceptions have the upper hand. As The Elephant in the Brain notes, we are biologically primed to fight for power, status, and sex, even as our conscious narratives deny such base drives. Daily life is essentially an ongoing competition – for attention, affection, resources, prestige. Within that competition, a bit of subterfuge can go a long way. We praise cooperation and honesty publicly, but in private, each individual (and each gene) strives to come out on top. Trivers even speculates that this liar-versus-lie-detector arms race is one reason humans evolved such high intelligence – our big brains might be, in part, the result of needing to outsmart each other in social games.
This perspective sounds cynical, but it is increasingly supported by research. Neuroscientists and psychologists have identified many ways the mind keeps truths out of view. We have motivated reasoning – we readily believe what benefits us, and scrutinize or forget what doesn’t. We have confirmation bias, cherry-picking information that validates our self-image. We even have the ability to forget or distort memories that threaten our ego. As Trivers succinctly summarized: “At every stage of information processing – from its biased arrival, to its biased encoding, to its selective retrieval – our minds tilt towards self-serving outcomes.” In short, your brain is not a truth-seeker; it is a survival instrument. Modeling reality accurately is useful only up to a point; beyond that, if a slightly warped view of reality helps you thrive, your brain will gladly adopt the distortion.
This is why an introspective journey into your own motives can be so fraught – you are up against your brain’s active attempts to suppress and mask certain truths. There is an “introspective taboo” at work. We’re quite willing to analyze trivial preferences or technical problems, but when it comes to asking “Why did I really do that generous act?” or “Was I truly being honest just now, or did I shade the truth for advantage?”, we shy away. It’s deeply uncomfortable to question our own virtue. There’s an implicit social taboo as well: questioning others’ motives is considered rude or insulting in most contexts. Pointing out that a colleague’s charity gala might be more about publicity than philanthropy, for example, will win you no friends. Society collectively agrees to look the other way; we collude in a shared polite fiction that everyone’s stated motives are the real ones, in order to maintain social harmony. (We will revisit this collusion later.)
But here, we will violate that taboo. We will turn the lens of suspicion directly onto you, the reader, and dissect your moral self-image. Remember: it’s not personal in the sense of singling you out – everyone exhibits these self-deceptive traits, as overwhelming evidence shows. It is personal in the sense that you are not exempt. Don’t comfort yourself thinking “Yes, people are self-deceptive, but I’m relatively honest with myself; I really am a good person.” That thought itself is likely a self-deception. As we shall see next, almost everyone believes they are morally above average, and by the laws of math and logic, most of those people are wrong.
You Are Not a Good Person: The Illusion of Moral Superiority
Take a moment to reflect on your own character. Do you consider yourself more ethical than the average person? More compassionate, more fair, more “on the side of good” than most? If you are like the vast majority of people, the honest answer (if honesty could be assured) is yes. Research consistently finds that most individuals believe they are morally superior to others. In one study, 100% of participants rated themselves above average on at least one moral trait such as honesty or kindness. Obviously, we cannot all be above average. This is the illusion of moral superiority – a pervasive self-deception in social life.
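The arithmetic behind that impossibility is worth making explicit. “Better than the average person” is usually understood as “better than most people” – that is, above the median – and by definition no more than half of any group can sit strictly above its median, whatever the trait or however it is distributed. A minimal simulation (illustrative numbers only, not data from the study) makes the point:

```python
import random

# Simulate "true" moral-trait scores for a large population.
random.seed(42)
scores = [random.gauss(0, 1) for _ in range(100_000)]

# The median splits the group in half by definition.
median = sorted(scores)[len(scores) // 2]
above = sum(s > median for s in scores) / len(scores)

print(f"Fraction genuinely 'better than most': {above:.1%}")  # ~50.0%
# Surveys, by contrast, find the share *claiming* moral superiority near 100%.
```

However the scores happen to be distributed, roughly half of those self-assessments must be inflated.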
Psychologists Ben Tappin and Ryan McKay quantified this phenomenon and found it to be uniquely strong. People’s “irrationally inflated” self-assessments of moral traits were greater in magnitude than self-delusions in any other domain, including intelligence or competence. In other words, lying to ourselves about being good is more common and more exaggerated than lying to ourselves about being smart or attractive. We desperately want to view ourselves as virtuous. This positive illusion serves to uphold our self-image and justify our actions. It also, as Tappin and McKay note, contributes to conflict: when both sides of a dispute are convinced of their righteousness, reconciliation becomes near-impossible. Each side sees the other as immoral and themselves as noble, a recipe for stalemate or worse.
Ask yourself sincerely: on what basis do you consider yourself a good person? Perhaps you recall acts of kindness you’ve done, or the fact that you generally don’t break the law, or that people have told you you’re nice. Perhaps you think, “I have good intentions. I mean well. I’m not a murderer or a thief. I care about others.” The elephant in your brain eagerly nods along – “Yes, yes, that’s right, you are good. You help people. Any mistakes you made were minor or justified.” Meanwhile, your mind conveniently forgets or rationalizes the countless moments that contradict this rosy image: the times you ignored someone in need, the selfish choices, the petty lies, the harsh words to loved ones, the moral compromises when it was inconvenient to be ethical.
We maintain our moral self-image by filtering our perception. Philosopher Daniel Batson, who has studied moral behavior for decades, describes “moral hypocrisy” – the tendency to appear moral (to oneself and others) without actually being moral. In one of Batson’s experiments, subjects faced a choice: assign themselves to a fun, rewarding task and another person to a dull task, or vice versa. Nearly all agreed the moral thing to do was to be fair, e.g. flip a coin to decide. But when given the chance, a majority cheated – either by not flipping the coin at all and just taking the better task, or by flipping it in a rigged way – yet still reported feeling they had acted fairly. They took steps to appear fair (some even performed an ostentatious coin flip) but manipulated the outcome in their favor. In Batson’s setup, among those who ostensibly flipped the coin, far more than 50% (often around 80-90%) somehow “won” the toss and got the good task. Statistically, this is almost impossible without cheating. These people then often convinced themselves that fate had just happened to favor them, or they simply did not dwell on the contradiction. This is moral hypocrisy in action: wanting to be seen as moral (and to see oneself as moral), while avoiding the cost of actually being moral.
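Just how implausible is that outcome under honest flipping? A quick binomial calculation shows it – here as a sketch with a hypothetical group size (Batson’s actual samples varied):

```python
from math import comb

def p_at_least(k: int, n: int, p: float = 0.5) -> float:
    """Probability of at least k successes in n independent trials."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# If, say, 20 subjects each flipped a fair coin, the chance that 16 or more
# of them (80%) would "win" the pleasant task by luck alone:
print(f"{p_at_least(16, 20):.4f}")  # ~0.0059, well under 1%
```

With larger samples the odds collapse even faster, which is why the cheating can be inferred statistically without watching a single flip.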
Does this scenario sound familiar? Perhaps you’ve never been in a psychology experiment like that, but have you ever taken more than your fair share while telling yourself you were fair? Ever “bent” a rule for your own benefit and immediately given yourself an excuse? Consider everyday situations: splitting a bill and undercalculating your portion (unconsciously, of course); taking credit for a success that was largely due to others; breaking a promise and finding a slew of justifications ready at hand. Your memory may not readily supply these little sins – because self-serving memory bias is part of self-deception – but they are almost certainly there.
We are quick to spot such hypocrisy in others (we’re keen to catch rivals cheating or lying), but painfully slow to see it in ourselves. In fact, one of the reasons we judge others so harshly and gossip about their failings is that it makes us feel better by comparison. Psychologically, we bolster our own moral superiority by diminishing someone else’s. “I may not be perfect, but at least I’m not that guy.” Yet, as Trivers wryly observed, we are all playing the same game: “We deceive ourselves to fool others, and we do it so well that we hardly recognize the deceit.”
The falsehoods in human social norms amplify this. Social norms – the rules of behavior we all publicly endorse – often serve as moral window dressing. We collectively say, “Cheating is wrong,” while quietly acknowledging (in private, or via subtext) that a little cheating is okay if you don’t get caught. Companies declare “Integrity!” in their values statements while engaging in shady practices under pressure to hit targets. Politicians extol honesty and service while currying favor and padding their own influence. And you, individually, profess values that you frequently fail to live up to. This isn’t a personal failing so much as a human universal. Our norms pretend to an ideal of behavior that we flout whenever temptation and lack of oversight combine. We have unspoken “wiggle room” in nearly every norm: lying is bad, except when telling the truth would hurt someone’s feelings (or our own interests); stealing is bad, except that taking office supplies or pirating movies somehow doesn’t count; treating people equally is right, except that favoring a beloved friend or family member feels justifiable; and so on. We maintain the official story that we follow the norms, while quietly making exceptions whenever it’s advantageous and we think we can get away with it.
Consider honesty. We like to think of ourselves as honest people. Yet studies of everyday behavior find that lying is astonishingly common. In one study that had people keep daily diaries of their interactions, college students admitted to telling about two lies per day on average, and community adults about one lie per day. Most of these lies were minor – excuses, false praise, small deceptions to ease social relations – but they are lies nonetheless. If you protest, “Well I don’t lie every day,” be cautious: that itself might be a lie to yourself. Many of these falsehoods are so routine we hardly register them: “Sorry I’m late, traffic was crazy” (when really you lost track of time), “You look great in that dress” (which you privately think is unflattering, but you don’t want to offend), “I’ll definitely call you tomorrow” (with no intention to, but it ends the conversation politely). We justify these as “white lies”, distinguishing them from serious “real” lies. Perhaps that distinction matters for harm caused, but from the standpoint of self-perception, it means we habitually lie while maintaining an internal narrative that “I am an honest person who only lies when I have a good reason.” In other words, we excuse our own dishonesty by diminishing its significance. We do not extend the same generosity to others’ lies – their lies (especially if against us) are evidence that they are fundamentally untrustworthy. This double-standard is another facet of self-deception: the actor-observer bias or self-serving bias, by which our transgressions are excused due to circumstance, while others’ transgressions are due to their character.
To truly confront your inescapable self-deception, you must internalize this sobering fact: If placed in the same situations that have tempted others into wrongdoing, you likely would do no better. Perhaps you have not been tested in extreme ways, so you comfortably assume your moral fiber is strong. But history and experiments suggest otherwise. Ordinary people can rapidly become monsters or cowards under the right (or wrong) conditions – all while preserving a sense that they are justified.
Hypocrisy Revealed by Thought Experiments and Real Ones
Philosopher Peter Singer offered a famous thought experiment that brutally exposes our moral hypocrisy. Imagine you’re walking past a shallow pond and see a small child drowning. You can save the child easily, but you’ll ruin your expensive shoes and be late to work. Would you save the child? Of course – virtually everyone says it would be not only morally right but morally obligatory to wade in and rescue the child at such a minor cost to yourself. Failing to help would make you, in your own eyes and others’, a heartless monster.
Now Singer extends the scenario: what if the child in peril is far away – say, a starving child in a distant country, whom you could save by donating the cost of those expensive shoes to a reputable charity? If you believe all human lives have equal value, the situation is morally analogous: you can sacrifice a luxury (nice shoes or a bit of money) to save a child’s life. Yet how many of us actually behave as if the distant child’s life is as valuable as the one before our eyes? Almost no one does. Singer points out the uncomfortable truth: Every time we splurge on a fancy dinner, a new gadget, or a luxury vacation instead of donating to effective charities that save lives, we are effectively letting children die for the sake of our enjoyment.
This conclusion makes people squirm, because it implies that by normal Western standards of spending, almost all of us are doing something morally equivalent to walking past the drowning child. We don’t feel as awful about it, because the dying distant children are abstract, not vividly in front of us. But logically, Singer argues, distance is not a moral differentiator. If you agree the drowning child must be saved, you should also agree that you ought to sacrifice similarly to save a child from diarrhea in Bangladesh or malnutrition in Sudan. Yet most of us do not live up to that standard. We might give a little to charity, sure – but nowhere near the level of sacrifice we’d make to save an immediate child, and nowhere near what the moral calculus would demand if we were consistent.
What is Singer doing here? He is calling out our elephant in the brain, naming the “everyday human hypocrisy” between our stated ideals and our revealed preferences. Our stated ideal: “I care about children dying; every life is precious.” Our actual behavior: we spare a pittance for far-away lives and spend lavishly on ourselves. We might donate, but as The Elephant in the Brain notes, only about 13% of American charitable giving goes to helping the global poor – those who need it most. The bulk of charity goes to things like churches, alma maters, arts organizations, or local causes that often directly or indirectly benefit the giver (or their community). Even when we give to help others, we do it inefficiently. One analysis found that 85% of Americans claim to care about charity effectiveness, but only 3% actually compare charities to give where it helps most. We often give based on emotional impulse – donating to a disaster relief fund after seeing a sad story on TV, or to a hospital charity because a friend asked – rather than based on where our money could save the most lives. In fact, we often don’t even care to know how our donation will be used. When Princess Diana died in 1997, British people spontaneously donated over £1 billion to a hastily created charity in her name, before that charity had any plan whatsoever for the funds. The donors got the warm glow of “doing something” without any clue of actual impact. Studies show we’ll pay the same amount to save 2,000 birds or 200,000 birds from oil spills – an example of “scope neglect”, indicating that we respond to the idea of doing good but tune out the quantitative reality of how much good is done. If we truly cared only about results, our willingness to pay would scale with the number of lives saved (or animals rescued, etc.), but it generally doesn’t.
All these data paint a clear picture: we want to feel like good, caring people, but we do not actually want to incur too much cost to be good. We balance our altruism carefully against our comfort. We find a level of giving or helping that maintains our self-image and social image, but preserves the lion’s share of resources for ourselves. And crucially, we do this balancing act without admitting to ourselves that this is what we’re doing. Very few would outright say, “Yes, I value my new iPhone more than a child’s life in a poor country.” Instead we concoct rationalizations: “It wouldn’t really make a difference”, “I work hard, I deserve nice things”, “I give some, I can’t give to everyone”, “It’s the government’s job to help those kids”, and so on. Some of these excuses have a kernel of truth or practical consideration, which makes them easier to accept – but they also conveniently let us off the hook.
The point here is not that you must now donate everything and live like a monk (Singer’s argument notwithstanding), but to recognize the yawning gap between the morality we espouse and the morality we live. That gap is where self-deception thrives. Your brain knows how to paper over the discrepancy. It forgets the inconvenient comparisons (drowning child vs. daily luxuries); it emphasizes the good you have done (“I donated clothes last Christmas!”); it compartmentalizes – you create a mental wall such that when you enjoy a $5 latte, you don’t actively think, “This could have been malaria medication for a child”. If such thoughts intrude, you quickly shoo them away as “guilt trips” or unrealistic standards. Thus, you continue feeling like a decent person.
Now, confront a harder truth: It’s not just in charitable giving that you fall short. This hypocrisy extends to all aspects of social and moral life. You probably believe you are reasonably fair and just. Yet experiments show that given the slightest leeway, most people will favor themselves unfairly. For example, in workplace settings, people subconsciously favor colleagues who flatter them or share similarities, even if others are more qualified – while insisting they are objective. You believe you’re compassionate, but consider: have you ever looked away from someone suffering right in front of you because stopping to help was inconvenient?
A classic real-life experiment in the 1970s put seminary students – people training for the ministry, of all things – to such a test. Researchers John Darley and Daniel Batson asked these theology students to rush across campus to give a talk about the Parable of the Good Samaritan (a Bible story about helping a stranger in need). On the way, each student encountered a man slumped in a doorway, appearing destitute and in distress – an obvious setup echoing the parable they were about to preach. You’d think almost everyone would stop. But what determined whether they helped was simply how rushed they felt. When the students believed they had plenty of time, some did offer help (though still only 63%). Under moderate hurry, help rates dropped to 45%. And when they thought they were late and the experimenters were expecting them urgently, only 10% stopped to help the suffering stranger. The rest literally stepped over the person or ignored him, even when they were on their way to talk about being a Good Samaritan! In interviews afterward, many hadn’t even fully consciously registered the man or had immediately justified why they couldn’t stop. These seminarians undoubtedly viewed themselves as compassionate individuals (and in calmer circumstances, they might be), but in that moment, their immediate self-interest (get to my obligation on time) overrode their moral ideals. And notably, they likely rationalized it: “I’m sure someone else will help him,” or “I can’t miss this talk; it’s important, I’ll help later if he’s still there.” The mind finds a way to maintain the narrative: “I’m still a good person; I just had to prioritize this other good (the talk) right now.”
Shocking as it is, this kind of situational moral failure is the norm, not the exception. The vast majority of people can be induced to abandon their purported principles if the context encourages it – and they’ll usually justify it after the fact. We see this in the infamous Milgram experiment as well. Psychologist Stanley Milgram recruited average citizens and told them to administer what they believed were painful electric shocks to a participant (actually an actor) as part of a “learning experiment.” Despite hearing the victim scream and beg for mercy (and eventually go silent as if unconscious or worse), 65% of people continued to administer shocks up to the maximum 450-volt level, simply because an authority figure calmly said, “The experiment requires that you continue.” These were not sadists or psychopaths. They were ordinary folks who, once caught in a situation with authoritative pressure and a diffused sense of responsibility, performed acts of potential cruelty. And you can be sure that if you debriefed them afterward (Milgram did), they had ready rationalizations: “I was just following orders,” “I thought I had no choice,” “I was told it was necessary for science, and besides, they said the responsibility was on them.” Very few said, “I continued because I’m an immoral person who doesn’t care about others’ pain.” No – they each constructed a story in which what they did was acceptable or not reflective of their true character. We conveniently label the participants who went all the way as uniquely obedient or susceptible, implicitly telling ourselves we would never do such a thing. But decades of replication and analysis of Milgram’s work suggest that, under those conditions, a majority of us would do the same – and our brains would find a way to excuse us to ourselves.
These examples underscore a key insight: Your moral behavior is highly context-dependent, but your self-concept pretends it’s stable. You think, “I’m a good person, so I would do good even in hard situations.” In reality, you do good when it’s easy, when it aligns with your interests, or when social expectations and surveillance are high. When circumstances make bad behavior easy and consequence-free, chances are you (like most) will at least nibble at the forbidden fruit. And if caught, you’ll earnestly explain how you had no choice or didn’t realize what you were doing.
The Strategies of Self-Deception: How We Lie to Ourselves and Keep Believing It
By now it should be clear that there is a pattern: a gap between our ideals and our actions, papered over by excuses and selective attention. To truly appreciate the inescapable nature of this self-deception, we must examine how your mind pulls it off. It’s one thing to claim “people deceive themselves,” but how do you fool your own brain? Understanding the mechanisms can help you notice them in real time – though beware, knowledge alone might not fully save you, since these processes are often unconscious and habitual. Nonetheless, let’s shine a light on a few of self-deception’s favorite tricks:
Hiding the Truth in the Unconscious
One potent mechanism is simply to keep uncomfortable truths out of your conscious awareness. Earlier we discussed the evolutionary rationale for this: if you don’t consciously know your selfish motive, you won’t leak it. Your mind is composed of many parts (modules or subsystems, as psychologists say), and these parts can have differing information. It is entirely possible for one part of your brain to initiate an action for a self-serving reason, and another part (the conscious narrative-making part) to be oblivious to that reason. The conscious part will then happily fabricate a post hoc explanation that paints you in a good light. This process is so seamless you usually don’t notice the fabrication.
Neuroscientist Michael Gazzaniga’s split-brain patient studies are a dramatic illustration: with the communication between hemispheres severed, one side of the brain could initiate an action (like standing up), and the verbal left hemisphere, which didn’t know the real reason, would instantly cook up a plausible explanation (“I felt like getting a soda”) to explain the behavior. While that scenario is extreme (due to a cut corpus callosum), the principle extends to normal brains: we all have an inner confabulator. We act for reasons we don’t acknowledge, and then our conscious self makes up respectable reasons, genuinely believing them.
Simler and Hanson liken the conscious mind to a press secretary – a spokesperson whose job is to present the best version of you to the world (and to yourself). The press secretary doesn’t need full access to the government’s internal decisions; he just needs a plausible story to tell. Likewise, “we – the self-conscious parts of the brain – manage to keep our thoughts pure,” while other parts (the intuitive, emotional, and strategic circuits) handle the dirty motives. When you do something ostensibly altruistic, like donating to charity or helping a colleague, the press secretary announces, “I did it because it’s good to help!” Meanwhile, if the real motive was partially to impress someone or to feel superior, that knowledge is suppressed. You might get a vague hint of it (perhaps a slight thrill when others notice your good deed), but you won’t dwell on it. If someone accuses you of ulterior motives, you’ll likely bristle with indignation – your press secretary has convinced you of your officially pure motives.
This is why it’s often easier to spot others’ hidden motives than our own: we aren’t their press secretary. We observe their behavior and can cynically speculate what they’re really after, because we’re not invested in their virtuous narrative. But you are deeply invested in your narrative. So your mind practices a kind of information hygiene: unpleasant facts about yourself are sanitized or kept in the dark. As Trivers put it, “We hide reality from our conscious minds the better to hide it from others.” This hiding can involve active forgetting (motivated forgetting of things we’ve done that don’t square with our self-image), or distortion (remembering events in a way favorable to us). It also involves attentional biases – we simply don’t think about the contradictions too much. You might fleetingly notice, “Hmm, I’m making this donation partly because my boss is watching,” but then you quickly shift attention to how much the charity needs it or how good it is to give. The uncomfortable motive slips into the mental background.
Rationalization: The Art of Believable Excuses
Even when confronted with evidence of our less-than-noble behavior, we have a trump card: rationalization. A rationalization is essentially strategic reasoning after the fact – you’ve done something for sketchy reasons, but now you concoct a logical, ethical, or acceptable reason for it. The key is that you often come to believe the rationalization. This isn’t a conscious cynic saying to themselves, “Ha, I’ll fool everyone with this excuse.” No, in many cases you sincerely adopt the excuse as the truth.
For example, suppose you skipped volunteering at a shelter because you wanted to stay home and watch movies. If someone asks, you might immediately say, “I wasn’t feeling well.” But it’s not just a lie to them; you might begin to recall that you did feel a bit tired, and convince yourself, “Yes, I wasn’t 100%, it was probably better that I didn’t go; I wouldn’t have been much help.” Now you feel justified. Or consider you lash out at a friend in an argument, saying something quite mean. Later, you tell yourself, “I had to say it, they left me no choice, they were being so unreasonable.” You emphasize how awful their behavior was until your harsh reaction seems, in context, understandable or even necessary.
Rationalizations can be extremely elaborate (entire philosophies can be built to justify societal selfishness, for instance), or mundane (“I deserve this extra slice of cake because I’ve been so good lately”). They often contain partial truths, which make them convincing. Maybe you were a little under the weather; maybe your friend was being obnoxious. But the rationalization inflates extenuating circumstances and deflates your agency or selfish impulse.
Notice the structure: rationalizations frequently portray us as victims of circumstance (“I had no choice…”), or as having noble intentions despite bad outcomes (“I was just trying to X…”), or they minimize the harm (“It’s not a big deal that I did Y…”). We spin a story where, given the situation, any reasonable person (especially a good person like us) would do the same. If needed, we’ll even claim the moral high ground in our wrongdoing: “I lied to protect your feelings – that’s why I wasn’t truthful.” Sometimes that’s valid – but often it’s convenient. The lie also protected us from discomfort, but we highlight the altruistic angle.
In essence, our reasoning faculties are often lawyers for our emotions and desires rather than impartial judges. We have an impulse (say, greed, lust, anger) – then reason comes in not to veto it, but to justify it. Studies show that people’s moral reasoning is largely post hoc: we make an intuitive decision about what’s right for us, then we rationalize why that was the right thing morally. The elephant (emotion/instinct) moves, and the rider (intellect) follows, telling a story of why the elephant’s direction was sensible all along. You might think, “No, I carefully deliberate on moral choices.” Sometimes, perhaps – but often those deliberations are biased without you realizing. You might systematically avoid information that would make your favored option look bad, or set the criteria of judgment in such a way that your preferred outcome wins.
For example, a person deciding whether to cheat on their taxes might start by thinking, “I value honesty.” But then they recall stories of government waste, and think, “My money is better in my hands; I use it for my family. Also, everyone cheats a little. And I did give to charity, which offsets this.” They construct a mental ledger where somehow cheating becomes almost virtuous or at least excusable. By the end, they may truly believe that not only is cheating okay in this case, it might actually be the morally justifiable thing given those considerations. This flexibility of moral reasoning is scary when you realize it operates in you as well. With enough motivation, you can justify nearly anything to yourself.
Biased Memory and Selective Narratives
Memory is not a perfect recorder of reality; it is an active, reconstructive process prone to bias. One potent self-deception tool is biased memory. We tend to remember our successes and kindnesses more vividly (or interpret past events to cast us in a good light), and forget or diminish our failures and cruelties. This is evident in how people recount their personal histories: listen to someone’s life story and you’ll often hear a tale that subtly (or not so subtly) emphasizes how they made mostly good choices, were justified in the bad ones, and how any hardships were due to others or fate. We each author an internal autobiography that is part truth, part fiction, engineered to keep us feeling morally and rationally competent.
This is not to say we never feel guilt or shame – we do, when reality’s slap is too hard to ignore. But even then, notice what you do with guilt over time. You either make amends (restoring your self-image by “fixing” the bad deed), or if you cannot, you eventually reframe the event to lessen the guilt. Maybe you drift into thinking the person you hurt wasn’t that hurt, or they deserved it, or it was a learning experience, etc. Given enough years, people can even recall severe wrongs they committed and narrate them as “crucial growth experiences” or justify them with a philosophy that has since changed. Seldom do people continuously recall, “I did something horrible with no real excuse,” unless they are exceptionally honest or suffer psychological trauma from it. The mind either heals that wound with rationalization scar-tissue or pushes it away into the attic of memory.
Another facet is selective moral comparison. We tend to compare ourselves to those slightly worse than us in whatever domain is at stake. Feeling a tad guilty about your relative greed? You can always think of someone far more greedy and feel better. Didn’t donate to charity this year? Console yourself that at least you’re not the billionaire who gives nothing. Snapped at your kids? Remember that some parents outright abuse their children, so in comparison you’re still doing okay. These comparisons are not usually conscious and explicit (“let me find a worse person to compare to”), but rather an automatic solace the brain offers. Rarely do we spontaneously compare upwards (“I could have been kinder, like that saintly neighbor of mine”); that tends to only happen when we’re feeling particularly secure or aspirational. To maintain self-esteem, downward comparison is the norm.
Importantly, emotion plays a guiding role in what we remember and think about. We experience a pang of guilt or shame, and our mind reflexively tries to eliminate that negative emotion – either by fixing the situation or, more commonly, by thinking about it differently. Conversely, when we do something good, we feel a warm glow, and our mind savors it, amplifying the memory of our benevolence. This creates a skewed recollection: our personal highlight reel is full of our kindnesses, sacrifices, and bravery, whereas our lowlights have been edited or left on the cutting-room floor.
The Social Face and Internal Lies
We cannot discuss self-deception without noting how social feedback loops encourage it. You project an image to others; they reflect it back to you; that in turn reinforces how you see yourself. If you’ve cultivated a reputation as a “good person” among your friends, you are incentivized to believe it strongly – it’s painful to think your friends admire someone who isn’t real. So you double down on that persona, even to yourself.
Social media in modern life intensifies this: people curate their virtuous, happy selves online, receiving likes and positive comments. The external image grows glossy, and the internal self tries to keep up or at least indulge in the praise. It’s not hard to start believing your own curated image. If dozens of people comment that you’re such a wonderful, caring person (based on, say, a post about volunteer work you did), you feel pressure to align your self-concept with that. You might bury the knowledge that you actually volunteered only to fulfill a requirement or to take photos for your profile. After all, everyone saw you doing something noble; who are you to disagree with them? This is how moral posturing in public can turn into genuine (but delusional) self-confidence in one’s morality.
There is also the phenomenon of virtue signaling – publicly expressing the “right” moral sentiments – which often has less to do with actual virtue and more to do with social reward. People signal virtue to boost status among their peers. The danger is that signals can substitute for substance: if you tweet all the right opinions about justice and charity, you might feel as if you’ve contributed, absolving yourself from concrete action. The brain cashes in the social credit from moral talk, reducing the cognitive dissonance of not matching words with deeds. Meanwhile, you keep believing you’re firmly on the side of the angels because your social circle echoes the same sentiments and praises each other for them. It becomes very easy to confuse talking about doing good with actually doing good.
In sum, the strategies of self-deception are numerous and deeply ingrained. Your mind, without any conscious effort on “your” part, will hide inconvenient motives, generate noble-sounding reasons for ignoble acts, highlight evidence of your goodness, and forget evidence of your badness. It will utilize social feedback to bolster a flattering self-image and steer comparisons such that you come out looking fine. All of this happens so smoothly that you simply ride along, generally convinced that, yes, you are a pretty good person – not perfect, of course (you’ll humbly admit to minor foibles, as that even boosts credibility), but fundamentally decent and certainly more moral than the average jerk out there.
By now you might be thinking: “Alright, suppose I accept that I’m not as good as I think. What now? Do I just live with this uncomfortable knowledge?” We shall address that soon, but first, let’s consider the social dimension more explicitly – how society as a whole is built on layers of comfortable delusions, and why confronting them is so difficult.
The Falsehoods of Polite Society: Collusion and Moral Posturing
Humans are not isolated deceivers; we operate in a social matrix of deception and delusion. There is an implicit agreement among people to uphold certain collective lies. This isn’t a grand conspiracy with conscious planning; rather, it’s an emergent property of social interaction. We all benefit, to some extent, from maintaining positive illusions – both about ourselves and about the systems we participate in. Thus, there is a kind of tacit collusion: “I won’t call out your self-serving lies if you don’t call out mine.”
Consider how we handle praise and criticism. When someone does something clearly for self-gain but wraps it in virtue, observers often play along. If a wealthy executive donates a sizable sum to a prestigious university to get a building named after them, the public response is typically to applaud their generosity. Only cynics mutter about ego or tax write-offs, and those cynics are often dismissed as bitter. The norm is to publicly credit people for stated motives: praise the philanthropist’s altruism, the politician’s public service rhetoric, the influencer’s advocacy for a cause (never mind they also gained followers from it). In private, people might acknowledge the hidden motive (“Sure, he donated to improve his image, but hey, it still did some good”), but even that is often said with a shrug. We prefer not to dwell on the cynicism because it feels impolite or corrosive to social trust.
This social etiquette starts in childhood: we’re taught that it’s rude to question someone’s motives. If a classmate says they missed class because they were sick, you don’t respond, “Are you sure you weren’t faking to skip the test?” Even if you suspect it, calling them out would be seen as aggressive. Similarly, pointing out someone’s humblebrag (“I’m so exhausted from all the volunteering I did this weekend”) by saying “You just want us to applaud you” is a quick way to be labeled rude or jealous. So, we mostly keep our suspicions to ourselves or share them only with close confidants.
Public discourse, therefore, has a sanitized quality. We stick to the “official reasons” for things. As The Elephant in the Brain observes, topics like status are taboo to discuss openly. Everyone knows status and power dynamics permeate workplaces, but to speak of it bluntly – “Bob sought that project so he’d look better to the boss” – is off-limits. Instead, we cloak status moves in innocuous terms: Bob “wants a new challenge” or “is really passionate about improving our processes.” We “swaddle” raw motives in euphemisms. This polite veneer keeps society running smoothly on the surface. It would be destabilizing if we constantly said what we really thought: trust and cooperation might erode if everyone’s selfishness was laid bare. In a sense, hypocrisy is the lubricant of social interactions – by hypocritically pretending everyone is earnest and public-spirited, we avoid open conflict and can coordinate better.
La Rochefoucauld famously wrote, “Hypocrisy is the homage that vice pays to virtue.” It means that even people who act in vice find it advisable to pay lip service to virtue – precisely because virtue is esteemed. That homage, insincere as it is, actually reinforces the importance of virtue in society while allowing individuals to get away with vice so long as they bow to the virtue norm publicly. In other words, hypocrisy, while seemingly odious, is a sign that people recognize virtue’s value (if only for appearances). Societies tolerate a degree of hypocrisy because the alternative – everyone openly saying “to hell with virtue, I’ll do what I want” – is far worse. At least hypocrites pretend to be good, which sets a standard that sometimes pressures them or others into actually being a bit good.
So society has a weird self-stabilizing function: It punishes overt selfishness but rewards those who successfully mask selfishness with righteous talk. If you’re too transparently selfish or dishonest, you face social sanctions (reputation loss, etc.). But if you cloak selfish aims in acceptable justifications, you often get a pass or even praise. This social reward for good appearances in turn feeds our self-deception – we have every incentive to believe our own virtuous story, because we’ll present it more convincingly and be less likely to slip up. Indeed, Simler and Hanson note that many of our big social institutions – charity, politics, religion, education – have official narratives that differ wildly from their hidden functions, yet questioning those official narratives is frowned upon. For instance, politics is officially about wise public service and policy, but practically it’s often tribal team sports and zero-sum power struggles; however, a politician can never say “I just want power and my side to win” – they must say “I’m humbly serving the public interest.” We all kind of know the reality, but we insist on the charade.
Given this backdrop, it’s no wonder that you participate in the charade. You probably think of yourself as an honest person, yet you navigate daily social life telling people what they want to hear. We call it tact, manners, or diplomacy. In many cases it’s harmless or even kind (no need to tell Aunt Mildred her gift was ugly; a polite “Thank you, it’s lovely” spares feelings). But it habituates us to two-facedness – outwardly espousing one thing, inwardly holding another. Eventually, one forgets there is a difference. You come to believe your polite fictions because outright cynicism is too uncomfortable.
Society’s emphasis on image over substance also leads to phenomena like virtue signaling, as mentioned, and moral grandstanding. People may take public stances more to signal loyalty to their group or to appear morally righteous than to actually solve issues. Online outrage is often less about the victim or issue at hand and more about displaying one’s moral sensibilities to peers. It’s a social competition: who can appear the most outraged at injustice (thereby implying they are the most morally ardent)? Meanwhile, in private, that same person might not lift a finger to help someone in need or to change their own behavior. This dichotomy between public righteousness and private apathy is another form of widely practiced self-deception. You convince yourself that because you said the right things, you’ve done your part. It’s eerily similar to medieval indulgences: buy a letter absolving your sins and continue sinning – now, tweet the correct slogan and consider your duty done.
We also collectively maintain certain social myths because they are motivating or comforting. For example, consider the statement “Hard work and talent are what lead to success.” This is a cornerstone belief in many cultures. The reality is more complicated: luck, background, connections, systemic biases all play huge roles. Yet we collectively emphasize the virtuous causes of success and downplay the arbitrary ones. Why? It’s encouraging – it reinforces a sense of justice (people get what they deserve) and it motivates effort. People who succeed like to believe it’s because of their virtue (it flatters them and justifies their wealth), and people who haven’t succeeded like to believe they might if they try hard (it’s more hopeful than thinking the game is rigged). So we all kind of lie a bit. The wealthy philanthropist might publicly insist “Anyone can do what I did,” knowing privately that not everyone starts with their advantages – but it would be in bad taste to say so. The struggling worker believes the myth enough to keep striving, perhaps blaming themselves too harshly when they don’t rise as expected (because the myth says that if you don’t succeed, you must not have worked hard enough or been talented enough). There is a quiet conspiracy to keep this moral narrative of success going, even though deep down many know it’s not fully true.
Such narratives interface with morality: they allow the fortunate to feel morally superior (I succeeded because I am industrious and virtuous) and the unfortunate to avoid resentment (the rich must have earned it). It staves off social conflict at the cost of truth. And you, absorbing these norms, learn to deceive yourself both when you’re on top (“I earned it fair and square”) and when you’re on the bottom (“I guess I’m just not as deserving yet”), instead of seeing the big structural picture that might be morally unsettling (like how many systemic injustices or luck factors are involved).
Why belabor all this societal stuff? Because it shows that your self-deception is not just a personal quirk – it is a feature of being a social human. You were trained into it, rewarded for it, and it likely feels second nature. Breaking out of it even mentally is difficult, because you’re pushing against both internal habit and external consensus. When everyone around is politely pretending, it takes a certain radical mindset to say, “No, let’s be brutally honest.” Most people will not thank you for it. As Oscar Wilde noted, tell people the truth without a spoonful of sugar (or humor) and “they’ll kill you” – if not literally, then socially. You risk ostracism or at least being labeled abrasive or cynical. Therefore, even if some part of you could see through all the BS, you have a powerful disincentive to fully acknowledge it, and even less incentive to speak it. Thus you continue to play the game, and the easiest way to play it is to actually believe in it.
In a sense, there is a false moral theater constantly playing, and we’re all actors who have partially become our roles. We wear the white hats when in public, then do some black hat deeds backstage, but we never completely remove the costume – we keep it on in our mind’s eye.
Inescapable Human Nature: You Cannot Simply Opt Out
At this point, an optimistic reader might think: “Okay, I see the problem. But I will strive to be different. I will consciously avoid self-deception and be truly moral.” It’s a noble thought – and it’s likely another self-flattering narrative. The brutal truth is that you cannot fully escape this trap. Why? Because these tendencies are baked into your nature. Knowing about them is helpful (you might catch yourself a bit more often), but it doesn’t pluck out the drives that give rise to them.
Your brain will still have selfish impulses, status instincts, and emotional biases. You will still feel the flush of pride when praised and the sting of shame when criticized. Unless you’re an enlightened sage or unusually self-disciplined, you will still find it easier to avoid hard moral actions and to slide into convenient ones. And your clever mind will still be standing by with paint and brush to whitewash those fence-sitting decisions.
Even the very act of trying to be exceptionally honest can become a point of pride – a subtle ego trap. You might start to think, “I see through the lies that others buy into; I’m more rational and virtuous because I confront the ugly truth.” Be careful: that itself can be a self-deception of superiority. It’s very easy to turn the mirror outward again and say “others are self-deceived, but I, I am enlightened.” In fact, reading this analysis and agreeing with it intellectually could ironically feed a sense of smug moral/rational superiority – which is exactly the kind of self-flattering narrative the elephant in your brain loves. It will happily swap one story for another: from “I’m a good person because I believe my own good motives” to “I’m a good person because I admit I have bad motives (unlike those other deluded people).”
The pessimistic, Schopenhauerian perspective is that humans are driven by a “will” (read: a bundle of primal desires and instincts) that will always dominate, and reason mostly serves this will. Schopenhauer viewed compassion – true empathy – as the only real moral motive, but he believed it to be rare and often overwhelmed by self-interest. Modern evolutionary thinkers would add: even our compassion has genetic underpinnings (we’re more compassionate to kin, to attractive potential mates, to allies – which all serve reproduction or reciprocal benefits). So even when you feel genuinely compassionate, there may be biological strings attached. This isn’t to say true altruism is impossible – humans do have a capacity for it – but it is fragile and highly contingent. More often our “good” behaviors are admixtures of altruism and self-interest (like wanting to help and wanting recognition, both at once).
In practical terms, it means you will continue to wrestle with conflicting motives. Sometimes you’ll overcome the selfish and do the right thing – then your inner elephant might pat itself on the back a little too hard (“Look how noble I was!”). Other times you’ll cave to temptation and then rationalize it – perhaps a bit less convincingly now that you know these tricks, but you’ll be surprised how easily you still fall for them.
So, inescapable self-deception doesn’t mean you can never catch yourself or improve, but it does mean you can never fully trust yourself to be objective about your own motives. There will always be some bias. Understanding this should instill humility. A truly rational person, knowing all this, does not go around vehemently asserting their moral superiority or their certainty of righteousness. Instead, they become suspicious of their own narratives and cautious in passing moral judgment on themselves and others.
This is one silver lining: recognizing your own self-deception can make you less judgmental of others, since you see they are subject to the same inherent flaws. The moralistic person who says “I would never do such a thing” about someone else’s misdeed is naive; a more honest response is, “I hope I wouldn’t do that, but under those circumstances, who knows?” This echoes the old saying “There but for the grace of God go I.” It acknowledges that perhaps the main difference between you and a person who did wrong is not that you’re inherently better, but maybe you haven’t been tested in the same way or had the same incentives.
Evolutionary psychology gives another sobering insight: many of the virtues we think we have are not robust. They can be switched on or off by subtle situational cues. For example, studies find that people are more generous when observed by others (which suggests a reputation motive) and significantly less generous when they believe their actions are truly anonymous. As The Elephant in the Brain reports, fully anonymous donations are exceedingly rare – under 1% – and even “anonymous” donors often make sure their close circle knows of their gift. We crave the social credit for good deeds. Knowing this, can you really claim your generosity is purely selfless? Likely not. Similarly, people’s sense of fairness can evaporate when they can gain without being caught. Most will cheat “a little” in games or on taxes if they think it’s undetectable, but few want to think of themselves as cheaters, so they cheat just enough to still feel okay – a phenomenon the behavioral economist Dan Ariely calls the “fudge factor.” The constraint on our dishonesty is often not absolute morality, but what we can get away with and still live with ourselves. That is self-deception at work: our internal moral monitor allows just enough cheating that our self-concept doesn’t shatter.
It might sound depressing that we’re wired this way. But acknowledging it is the first step to any authentic self-improvement. If you want to be a truly “good” person in a meaningful sense, you must first accept that you are not naturally one. You have to fight your nature at times, and that fight is lifelong. You have to, for instance, deliberately choose to help even when no one will know you helped – and not broadcast it afterward – to cultivate real altruism. You have to call yourself out internally: “I’m just making excuses” – and then do the right thing anyway, painfully, without the comforting excuse. You have to swallow pride and admit fault without adding a thousand justifications. None of that is pleasant or easy, which is why few saintly individuals exist.
And even if you manage those feats occasionally, you won’t consistently. Some days you’ll slip – and perhaps not even notice. You might spend years thinking you’re serving a noble cause, only to later realize you were largely driven by ego or fear. Life has a way of humbling us with such revelations.
Conclusion: Embracing the Stark Truth
We have cut through layer after layer of self-flattery and comfortable lies. What remains is a picture of human nature that is difficult to look at directly: a creature of cunning self-interest, wearing a mask of virtue so well that it believes its own mask. You are that creature. I am too. We all are. This is the elephant in the brain – that much of what we do is motivated by selfish drives we’d rather not admit, and our own mind colludes to keep us in the dark.
“You are not a good person” – that was the stark claim. Now, at the end, do you see what it means? It does not mean you are a mustache-twirling villain or that you never do anything kind. It means that the story you tell about being “good” is unreliable. It means that when you weigh the evidence of your actions impartially, you find a lot of inconsistency and self-serving bias. It means that your moral compass is easily swayed by convenience, peer pressure, and unseen desires. It means that even your better angels often have ulterior motives riding on their backs.
Is there any redemption in this view? Perhaps a bittersweet one: you can become slightly less self-deceived by acknowledging these truths. Instead of complacently thinking “I’m a good person”, you shift to “I have good moments and bad moments; I must remain vigilant about my motives.” You learn to say “I don’t fully trust myself – I need systems (like accountability and transparency) to keep me honest.” For example, if you truly want to be fair, you might literally flip a coin for an allocation decision and commit in advance to honoring the result – forcing fairness on yourself. If you want to give altruistically, you might donate anonymously and resist telling anyone – getting zero credit, to ensure the gift was for the cause alone. These kinds of acts can help develop real virtue by denying the self its usual payoffs (status, admiration, self-congratulation). But they are hard, and you won’t always do them.
Moreover, even as you try to improve, humility requires recognizing you’ll never be pure. There will never be a day you can proclaim, “At last, I am a truly good person!” If you do, that very pride is a failing. The best you can say is, “I’m aware of my flaws and trying to counteract them; I do good where I can, but I know I have selfish motivations too.” And frankly, there’s nothing wrong with having some selfish motivations – survival and self-care demand it. The problem is the dishonesty with self and others about those motivations.
Perhaps, then, the most “moral” thing you can do, paradoxically, is to be brutally honest about your own immoral tendencies. It won’t make them disappear, but it can keep you from climbing too high on a pedestal. It can make you more empathetic, knowing everyone struggles with this. It might make you less quick to judge and more inclined to design social systems that don’t rely on angelic behavior to work (because you acknowledge people aren’t angels).
Yet, even this enlightenment has limits. Recall the warning: do not start feeling superior for having faced this truth. That is ego creeping back in. Recognizing one’s self-deception is itself a continual effort, full of missteps. As soon as you think you’ve got it, you might be deceiving yourself about that! This is why ancient philosophies (like Stoicism or Buddhism) emphasize lifelong practice, introspection, and humility. Complete self-honesty is perhaps more aspiration than achievable end state.
The modern, rationalist Schopenhauerian take is sobering but ultimately empowering in a cold way. It says: See the world and yourself as they are – will-driven, hypocritical, yet also capable of knowledge. This knowledge is painful, but it frees you from illusion. Schopenhauer thought that seeing through the will’s illusions could lead one to a kind of resigned compassion for all beings trapped in the same plight. In a similar vein, seeing through your own pretenses can lead to a quieter ego and more genuine actions.
At the very least, after this deep dive, you can no longer claim ignorance. You have been confronted with evidence of your inescapable self-deception: the everyday hypocrisies, the hidden motives in charity and kindness, the lies you tell and believe, the malleability of your moral behavior under situational forces, and the myriad mental tricks that keep you oblivious or justified. You have seen that society as a whole reinforces these patterns, rewarding appearances over reality.
What you do with this insight is up to you. You could dismiss it – and likely slide back into comfortable self-deception (your elephant would prefer that!). Or you could embrace it as a challenge: to live a life of conscious integrity, knowing you’ll falter often. There is a certain honor in the attempt to be honest with oneself, even if perfect success is unattainable. At least you won’t be living entirely in fantasy.
In closing, remember the central thesis: You are self-deceived about your goodness. This doesn’t mean you’re evil; it means you’re human. To be human is to have an elephant in the brain – a host of hidden motives – and to usually ignore it. Perhaps we cannot remove the elephant, but we can acknowledge its presence. We can occasionally say to ourselves, “There goes my sneaky brain again, trying to make me look good. Nice try, but I see you.” In those moments, we gain a sliver of genuine self-understanding and maybe, just maybe, the power to choose a slightly better path than we otherwise would have.
Facing this truth is, as Simler and Hanson warned, “uncomfortable and unconventional.” It might feel as if the ground beneath your moral identity has shaken. Good. That means you are now standing on a more realistic foundation. From here, you can build a more honest self – not a “good person” in the simple, self-satisfied sense, but a flawed person who at least doesn’t lie to themselves about it. As the saying goes, the first step to wisdom is to call things by their proper names. So call your self-deception by its proper name. Drag it into the light. And then, day by day, try to act not in ways that make you look good, but in ways that actually do good – especially when no one’s watching.
You will fail at times; the elephant is strong. But every time you succeed in resisting a comforting lie and accepting an uncomfortable truth, you inch closer to something like real integrity. And even if full integrity forever lies at the horizon, the pursuit of it – fueled by clear-eyed knowledge of our nature – is a far more admirable project than resting on the false laurels of “being a good person.”
In the end, maybe that is the paradoxical path to becoming somewhat good: start by admitting, without flinching, that you are not a good person. Everything worthwhile grows from that uncompromising seed of truth.
Sources:
- Simler, Kevin, and Robin Hanson. The Elephant in the Brain: Hidden Motives in Everyday Life. Oxford University Press, 2018. (Key ideas and examples of hidden motives in charity, conversation, etc., and the concept of the introspective taboo.)
- Trivers, Robert. The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life. Basic Books, 2011. (Theory of self-deception evolving to better deceive others.)
- Tappin, Ben M., and Ryan T. McKay. “The Illusion of Moral Superiority.” Social Psychological and Personality Science, vol. 8, no. 6, 2017, pp. 623–631. (Study showing almost everyone irrationally believes they are more moral than others.)
- Batson, C. Daniel, et al. “Moral Hypocrisy: Appearing Moral to Oneself Without Being So.” Journal of Personality and Social Psychology, vol. 77, no. 3, 1999, pp. 525–537. (Experiments, including the coin-flip task, demonstrating moral hypocrisy.)
- Singer, Peter. “Famine, Affluence, and Morality.” Philosophy & Public Affairs, vol. 1, no. 3, 1972, pp. 229–243. (The drowning-child thought experiment and the argument about obligations to the distant needy.)
- Baron, Jonathan, and Ewa Szymanska. “Parochialism.” Journal of Risk and Uncertainty, vol. 38, 2009, pp. 119–139. (Study on people preferring to help locals over foreigners, as cited by Simler & Hanson.)
- Darley, John M., and C. Daniel Batson. “‘From Jerusalem to Jericho’: A Study of Situational and Dispositional Variables in Helping Behavior.” Journal of Personality and Social Psychology, vol. 27, no. 1, 1973, pp. 100–108. (The Good Samaritan experiment with seminarians.)
- Milgram, Stanley. Obedience to Authority: An Experimental View. Harper & Row, 1974. (Milgram’s obedience experiments; the finding of 65% compliance with apparently lethal shocks.)
- DePaulo, Bella M., et al. “Lying in Everyday Life.” Journal of Personality and Social Psychology, vol. 70, no. 5, 1996, pp. 979–995. (Diary studies of the frequency of lying – roughly one to two lies per day.)
- Wilde, Oscar. The Wit and Wisdom of Oscar Wilde. (Source of the quote: “If you want to tell people the truth, make them laugh; otherwise they’ll kill you.”)