This is the Friday Let Your Life Speak newsletter, bringing you a myriad of resources to help you show up for your life with integrity. Subscriptions help me have the time to keep the resources flowing. If you appreciate that labor, please consider becoming a paid subscriber.
“Ignorance more frequently begets confidence than does knowledge” — Charles Darwin
Do you remember the Darwin Awards? It was this idea that arose in popular culture in the mid-80s about overconfident, stupid people removing themselves from the gene pool, culminating in a 2006 movie about a host of characters who render themselves sterile or dead because they are so convinced they know what they’re doing when they embark on truly dangerous and ludicrous adventures. I will confess I watched that movie around the time it came out, and enjoyed considerable schadenfreude watching terrible things happen to fictionally stupid people at their own hands. What can I say? Winona Ryder was in it.
Though the Darwin Awards have faded from the zeitgeist as a phenomenon, the idea persists in cultural trends like “Florida Man”: the idea that you can Google the term “Florida Man” and, say, your birthday, and come up with a horrific headline about some guy in Florida doing something incredibly stupid and reaping the consequences.
I stay off of the internet enough that I haven’t seen much commentary reviving this concept, if not the “Florida Man” himself, in the wake of Hurricane Ian, though I’m sure it’s out there. We love to point to those people over there experiencing calamity, reassuring ourselves that whatever horror they are living through (or have died from) is their own fault for being so incredibly lacking in common sense or basic understanding of the way the world works.
The trouble, of course, with this posture is not only that it’s cruel, but that it also displays a shocking lack of humility about our own tendencies toward cognitive bias: systematic errors in thinking as we process and interpret the world around us.
“The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club.” — David Dunning
Enter: the Dunning-Kruger Effect
The Dunning-Kruger Effect was first measured in a study by researchers David Dunning and Justin Kruger in 1999. The study was inspired by reports of a Darwin Award-winning bank robbery by a guy who assumed he would be invisible to surveillance cameras by covering his face with lemon juice. Because lemon juice can be used to make invisible ink. Seriously, that happened.
The research showed that there is a significant divergence between self-assessment of competence and actual performance in those with the least amount of a given skill. In other words, folks who know nothing about something think they’re going to be really great at it, and in fact, they’re really, really not.
Interestingly, according to multiple follow-up studies, those who are most competent tend to underestimate their performance because 1) they know how much they don’t know, and 2) their competence has become so innate that they begin to believe it must be more commonplace than it is, and thus less impressive. Their underestimations are less divergent from their actual performance, however. It’s those folks who really don’t even know enough of what they’re doing to know what they don’t know who tend to grossly overestimate their likelihood of success.
As is the nature of science, any number of researchers in the ensuing 23 years have set out to disprove Dunning and Kruger’s theory. Some have argued that the accuracy of self-assessments is too closely tied to incentives, and so self-assessments are an inherently unreliable data point in and of themselves. For instance, in the case of integrity, accurate self-assessment can run up against our very natural desire to see ourselves as “good” people. If accurately assessing our failures of integrity threatens our identity as “good,” we can very effectively (and often unconsciously) screen those failures out or rationalize them, excluding or downplaying them into non-consideration. The more attached we are to our self-conception as “good,” the more incentivized we are to overlook contrary evidence and overestimate our integrity. Without accounting for that powerful incentive, any measured divergence between our self-assessment and our performance fails to capture just how unreliable our self-reports really are.
Another argument against the Dunning-Kruger Effect is the idea of regression to the mean. In other words, our self-assessments of our skill and our performance at that skill will vary over time due to factors both within and outside of our control. On any given day those factors may conspire to produce huge divergence. Other days, less so. Over time, on average, our self-assessment and actual performance will come closer to each other.
This argument seems, in part, to be based on a different premise altogether, which is that repetitive practice will, over time, increase both our capacity and the accuracy of our self-assessment, whereas the Dunning-Kruger effect is really a snapshot of a given moment in time. The former actually supports the latter, rather than contradicting it. Presumably, most people who actually do something repeatedly do get better at it over time, or at least gain some understanding of how bad at it they are, which would move them to the middle of the Dunning-Kruger graph.
There’s also, some have pointed out, a huge question of culture when it comes to cognitive bias. In the West, and particularly in the United States, we abhor making mistakes, and gear our entire educational system around avoiding them. This method is based on the idea that intelligence is fixed and, therefore, mistakes are a sign of an inherent lack of capacity instead of room for growth. Conversely, many Asian cultures base their pedagogy on a growth mindset, meaning intellectual capacity is developed over time through making mistakes and unpacking them. In the classroom, mistakes are held up as necessary and inevitable, and students work together to understand them.
In a culture where mistakes are anathema, however, the less you know, the more incentivized you are to overestimate your capacity. I mean, I’m not the sort of person who makes mistakes, for goodness’ sake! I’m great at this!
Yeah, no. Probably not.
In a 2019 interview with Vox, David Dunning talked at length about how their research has been misinterpreted, and what the true takeaways are from it. First, he insisted that the point was never to other people (i.e. “look at those stupid people over there”), but to implicate all of us in the ways that we misread our own capacity and understanding. All of us, Dunning insists, are capable of what is called naive realism:
[W]e can take some idea and spin a complete and compelling story around it that is coherent, is plausible, makes a lot of sense, is interesting — and it doesn’t necessarily mean that it’s right.
The only remedy is intellectual humility, to always function from the presumption that there are things you don’t know. “I don’t know”, in fact, is a complete and reasonable answer as often as not to any number of questions. And in the end, Dunning insists, seeking outside counsel when making consequential decisions is always the best way to go:
On a more general level, a lot of the issues or problems we get into, we get into because we’re doing it all by ourselves. We’re relying on ourselves. We’re making decisions as our own island, if you will. And if we consult, chat, schmooze with other people, often we learn things or get different perspectives that can be quite helpful.
An active social life, active social bonds, in many different ways tends to be something that’s healthy for people. Social bonds can also be informationally healthy as well. So that’s more on a top, more abstract level, if you will. That is, don’t try to do it yourself. Doing it yourself is when you get into trouble.
So, what does this have to do with integrity? A lot, actually. Because integrity is a skill, not a state of being, we are prone to cognitive biases such as the Dunning-Kruger Effect when considering it. If we have never really thought about or practiced integrity at all, and we are prone to a fixed mindset or are deeply attached to a virtuous self-concept, then we may very well grossly overestimate our own integrity performance.
The more we actually practice our integrity, however, including the necessary self-reflection involved in discernment, the more confronted with our own imperfection we become. This, ironically, brings our self-assessment and our actual performance closer together, and also improves our integrity. We make mistakes, we own them, and we reflect on why and how we made them so we can do better next time. Or, more likely, make new and better mistakes, which is just as good in my book.
We also naturally develop greater intellectual humility through a long-term integrity practice. Because sometimes our grand ideas of how the world works, what is clearly “right” and “wrong”, just fall apart. Sometimes, even if they don’t fall apart completely, they prove to be too simplistic. If we are honest with ourselves, which is also practicing integrity, then what we believe evolves over time. We also become used to perpetually questioning ourselves. We profess a belief and that profession is always followed up with the query, “But what if that’s not true?”
Even if we come back around to the veracity of a given belief, the ability to consider its validity and what actions might follow if we believed differently keeps us intellectually and emotionally honest and open to growth. This gives our integrity practice itself integrity.
Finally, as Dunning insists, discernment isn’t ever successfully a solitary endeavor. What he refers to as seeking outside counsel, Quakers refer to as “testing”. We have to be philosophical and spiritual scientists, if you will: transparent with our community about who we are and why we do the things we do while remaining open to new information that disproves or complicates our beliefs and self-concept. If we’re not seeking outside input, particularly from those who believe differently or have different life experiences, then our hypotheses about the world and ourselves are incomplete at best, and prone to serious error at worst.
You can’t practice integrity successfully with uninterrogated or factually false beliefs. Yes, there are facts, which are separate from our opinions or beliefs about them, despite what algorithms and our opinion siloes would have us believe. Integrity demands we respect them.
If you’re reading this newsletter then you’ve already likely conquered the Dunning-Kruger Effect, at least when it comes to integrity. You know, as I do, how much you don’t know! You know, as I do, how beautifully, imperfectly human you are! This is actually great news. It paves the way for continuous growth and heightened capacity as you practice your integrity for a lifetime.
Hopefully, it also gives you a healthy sense of humor about yourself, rather than a cruel sense of humor about others. I don’t know about you, but I’m ridiculous. And hilarious. I wouldn’t want to be any other way.
"In the West, and particularly in the United States, we abhor making mistakes, and gear our entire educational system around avoiding them."
Yep. And it's so horrifying. It's anti-wisdom, and denial of the huge importance of setbacks and null results in making us think creatively...
And I think you nailed it by tying fallibility to integrity. It's also credibility, in scientific circles - no credible experimenter gets everything right or formulates hypotheses with 100% accuracy. But: integrity, that thing we all aspire to. To be seen as "admirably competent." And yet somehow that gets equated with "always right" and "always speaking with total confidence of being right". That's the aspirational model that is being baked deep into the brains of kids. It's really appalling.
What I think the world needs - and on social media in particular - is a new wave of really influential people saying two things a lot more than everyone else:
1) "Whoops - I got that wrong, and here's why. [explanation] Anyway. My bad!"
2) "Actually, hey, I don't know! I don't understand enough about this thing to know, so I don't have an opinion on it. And hey - if YOU don't understand a thing very well, it's okay for YOU to not have an opinion on it, okay?"
So good. I have so many thoughts. (I apologize ahead of time for the too many gardening references to follow)
1) I've always found I learn more from my mistakes than from my successes. I don't really learn anything if I plant something, get lucky with a thriving and beautiful plant, and think I'm awesome. Only when things go *wrong* do I actually increase what I know.
2) I also suffered from the western 'mistakes are bad' mindset. And also because I constantly compared myself to my very very smart older brother. It was incredibly freeing to get to where I could say "I don't know!" without a sense of internal shame. As ridiculous as it is to expect to magically know everything. It took me a freaking long time to get there.
3) I've adopted "Fallor, ergo sum" as my motto (roughly, I'm mistaken, therefore I am). Not knowing what it is you don't know is a hard thing to actively hold onto. I remind myself that being wrong feels EXACTLY like being right...if you don't know you're wrong.
...the example I always bring to mind is when I planted two blueberry bushes. They were small, only about a foot tall. Late November, I found one sheared down to two inches tall. Pruned pretty cleanly. My husband must have accidentally mowed it down. I asked him about it, and he angrily denied being the culprit. Said he knew where they were. I dropped it. I knew he believed it. He clearly hadn't even realized it at the time. There was other brushy stuff back there. I could totally see it having happened by accident. I knew what happened, but it wasn't important that he realize it. Fast forward to late January. I found the second one ALSO cut down cleanly to two inches. It *looked* like a lawn mower accident. But it wasn't. No one had mowed in the winter. Best guess is that rabbits had nibbled down the tender shoots. But point is I was 100% sure I knew what happened. Not a glimmer of doubt. But I was wrong. There's always something you don't know.
...but I blather. That was a thoughtful and densely packed newsletter. I need to go back and read again, because I wasn't absorbing it all the first time.
Thank you again for your thorough, interesting and thought-provoking writing!