The Fragility of Knowledge

In his excellent book The Black Swan: The Impact of the Highly Improbable, Nassim Nicholas Taleb explains:

“What you don’t know is far more relevant than what you do know.”

Our tendency is to believe the opposite. After we have accumulated the information required to be considered knowledgeable in our field, we believe that what we have learned and experienced (i.e., what we know) is far more relevant than what we don’t know. We are all proud of our experience, which we believe is what separates knowledge from wisdom.

“We tend to treat our knowledge as personal property to be protected and defended,” explains Taleb. “It is an ornament that allows us to rise in the pecking order. We take what we know a little too seriously.”

However, our complacency is all too often upset by the unexpected. Some new evidence is discovered that disproves our working theory of how things work. Or something that we have repeatedly verified in the laboratory of our extensive experience suddenly doesn’t produce the usual results.

Taleb cautions that this “illustrates a severe limitation to our learning from experience and the fragility of our knowledge.”

I have personally encountered this many times throughout my career in data quality. At first, it seemed like a cruel joke or some bizarre hazing ritual. Every time I thought that I had figured it all out, that I had learned all the rules, something I didn’t expect would come along and smack me upside the head.

“We do not spontaneously learn,” explains Taleb, “that we don’t learn that we don’t learn. The problem lies in the structure of our minds: we don’t learn rules, just facts, and only facts.”

Facts are important. Facts are useful. However, sometimes our facts are really only theories. Mistaking a theory for a fact can be very dangerous. What you don’t know can hurt you. 

However, as Taleb explains, “what you know cannot really hurt you.” Therefore, we tend to only “look at what confirms our knowledge, not our ignorance.” This is unfortunate, because “there are so many things we can do if we focus on antiknowledge, or what we do not know.”

This is why, as a data quality consultant, when I begin an engagement with a new client, I usually open with the statement (completely without sarcasm):

“Tell me something that I don’t know.” 
