This week's rebroadcast takes us back to 2007, when I was searching for a good term to describe conclusions drawn from insufficient evidence. In it, I discuss The Black Swan by Nassim Nicholas Taleb, who has become a more polarizing figure in the years since its publication.
Previously, while answering the Grey's Anatomy question which generated so much talkback, I found myself searching for a specific term I knew had to exist: the human tendency to consider only the samples presented, ignoring other relevant items.
It felt like a fallacy, but it didn't quite match up to any of the contenders I found online. If you squint really hard, you can make it look like a special case of the Fallacy of (Hasty) Generalization, but that seems a stretch for something which feels fairly commonplace. I ended up coining "Fallacy of Limited Sampling," with a mental sticky note to replace it once I found a better term.
To my surprise, I found it over the middle of the Atlantic, during the 20+ hour flight to Africa: "silent evidence."
That's the term Nassim Nicholas Taleb uses to describe this phenomenon in The Black Swan. He introduces it with a story from Cicero:
Diagoras, a nonbeliever in the gods, was shown painted tablets bearing the portraits of some worshippers who prayed, then survived a subsequent shipwreck. The implication was that praying protects you from drowning.
Diagoras asked, "Where are the pictures of those who prayed, then drowned?"
Those "drowned believers" are silent evidence. You don't take them into account because they can't speak up for themselves. The cliché is that "history is written by the winners." In fact, it's written by whoever happens to survive.
Following a discussion of the Phoenicians, and how their lack of literature is more likely due to the fragility of their paper than to a failure of their culture, Taleb urges us to cast our nets widely:
Consider the thousands of writers now completely vanished from consciousness: their record did not enter analyses. We do not see the tons of rejected manuscripts because these have never been published, or the profile of actors who never won an audition; therefore we cannot analyze their attributes. To understand successes, the study of traits in failure needs to be present. For instance, some traits that seem to explain millionaires, like appetite for risk, only appear because one does not study bankruptcies. If one includes bankrupt people in the sample, then risk-taking would not appear to be a valid factor explaining success.
Taleb calls this overlooked bulk of information "silent evidence." I assumed that was a term of art, but when I Google it now, most of the references point back to Taleb's book. It's possible that he is its primary champion. Regardless, I like it, and intend to use it liberally.1
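Taleb's millionaire example can be made concrete with a toy simulation. This is my own sketch, not anything from the book: suppose appetite for risk has no effect at all on getting rich, but failed risk-takers go bankrupt and drop out of view, becoming silent evidence. The odds of success are identical for everyone; only the visibility differs.

```python
import random

random.seed(0)

# Toy model (my own illustration, not Taleb's): risk appetite has NO
# effect on getting rich, but bankrupt risk-takers vanish from the sample.
N = 100_000
visible_rich = {True: 0, False: 0}    # keyed by risk_taker
visible_total = {True: 0, False: 0}

for _ in range(N):
    risk_taker = random.random() < 0.5
    rich = random.random() < 0.01     # same 1% odds for everyone
    # Failed risk-takers are rarely studied: only 20% stay visible.
    visible = rich or not risk_taker or random.random() < 0.2
    if visible:
        visible_total[risk_taker] += 1
        visible_rich[risk_taker] += rich

for rt in (True, False):
    rate = 100 * visible_rich[rt] / visible_total[rt]
    print(f"risk_taker={rt}: {rate:.1f}% of visible people are rich")
```

Among the people you can still see, risk-takers appear several times more likely to be rich than non-risk-takers, even though everyone had the same 1% odds. The illusion comes entirely from who disappeared.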
I didn't mean for this to become a book review, but since I started…
There are many things I liked about The Black Swan. In addition to silent evidence, I found myself nodding along to his discussions of confirmation bias (we tend to notice things that fit our theories), Platonicity (confusing the model with what it's modeling), and the narrative fallacy: our need to create a story that explains events after they happen, even if the causality is questionable (or impossible). Thus we write history books explaining how World War I started, when if you were reading the newspapers of the time, these "causes" wouldn't have shown up.
Taleb's central thesis is that there are unexpected events (Black Swans) which have enormous, disproportionate impact on our world: terrorist attacks, bank failures, iPods. By definition, we can't predict them, which means any prediction about the future at all is extremely dubious. The best we can do is constantly remind ourselves of the limits of our knowledge, and make some contingency for the completely unexpected.2
I've always been leery of statements like, "By 2075, the U.S. population will total 1 billion." Taleb's book helps justify my frustration with these seemingly scientific projections, which discount what we inherently know about the future: that we know much less than we think.
1. 2024 update: I do in fact use "silent evidence" a lot. But that's just me. It hasn't become a popular term in the last 15 years, as evidenced by the fact that this blog post shows up on the first page of Google search results.
2. Donald Rumsfeld took a lot of flak for his Yogi Berra-like koan about "unknown unknowns" at a Defense Department briefing in 2002, which Slate put in verse form. I'm scared to say it: he's actually kind of right. Acknowledging that there are "unknown unknowns" is important.