How often do you ride in a car? Even if you don't own one, you've certainly seen plenty. I want to start this piece with a small challenge: using only your memory, picture a car you often see.
Okay, you probably see the wheels, the windows, and the overall frame of the car. Is that about all?
Oh, but wait: what about the headlights and tail lights? Where's the handle for opening the doors? And where are the mirrors?
Why do we miss so many of these details? Don't we all have a clear idea of what a car looks like?
We believe that we know way more than we actually do.
Yes, we do. In a study conducted at Yale,[1] graduate students were asked about their understanding of everyday devices like toilets. Most thought they were familiar with these devices; only when asked to explain step by step how they work did they discover how ignorant they actually were. Toilets are more complicated than they look.
We believe that we know way more than we do because, most of the time, we can simply rely on others' expertise. Take bicycles and toilets as examples: we don't need to figure out how the whole thing works in order to use it. As the authors of The Knowledge Illusion: Why We Never Think Alone write,[2]
"One implication of the naturalness with which we divide cognitive labor is that there's no sharp boundary between one person's ideas and knowledge, and those of other members of the group."
Very often, our knowledge and beliefs are actually someone else's without us even realizing it. Perhaps you've already become more aware of this, especially now that social media has such a great impact on our daily lives.
When deep understanding isn't required, biases arise.
The tendency to embrace only information that supports our existing beliefs is commonly known as "confirmation bias", and it is dangerous. When we assume that what we think is always right, our faulty thinking obscures the truth and disrupts our growth.
Did everyone really understand the political situation in the US before voicing their opinions? And it's fairly obvious that not everyone in the UK understood the whole Brexit question before voting on it, right? These are just two of many examples of how other people's beliefs and knowledge spread easily over the internet and get picked up without any deeper grasp of the truth.
Business journalists often suffer from confirmation bias. In the book The Art of Thinking Clearly,[3] there's an example built around the statement "Google is so successful because the company nurtures a culture of creativity." Once this idea is on paper, journalists need only support it by citing other successful companies with similar cultures, never seeking disconfirming evidence. With no alternative perspectives on offer, readers see only one tip of the iceberg.
When winning becomes more important than reasoning, chaos comes.
On the other hand, when presented with someone else's argument, we tend to be far more skeptical than we are of our own; this tendency is known as "myside bias".
In an experiment by cognitive scientist Hugo Mercier,[4] participants answered a set of questions and were later shown their own answers while being led to believe they were someone else's. They became far more critical of those answers than when they were simply asked to improve their own.
In situations where winning an argument seems more beneficial than getting things right, sound reasoning becomes unimportant to most of us. And that blinds us more than ever to our own weaknesses.
To think more clearly, “murder your darlings”.
"Murder your darlings" is the literary critic Sir Arthur Quiller-Couch's advice[5] to writers who are reluctant to cut cherished but redundant sentences from their work. We can apply the same principle to how we think.
To fight against biases, let go of the cherished thought that you have to be right, and set out to find disconfirming evidence for all your beliefs, whether they concern relationships, political views, or career objectives. The stronger you believe in something, the harder you should look for alternative views of it.
The rule of three
An even more effective way to overcome bias is the rule of three:[6] identify three potential causes of an outcome. In fact, the more possibilities you can come up with, the less biased you'll be toward any single explanation.
Say the next time an outcome at work isn't what you expected, instead of assuming that the irresponsible, careless guy messed things up, try to think of three potential causes. Were instructions missing at the beginning? Did he already do his job, but something went wrong afterwards? Did something external affect the outcome?
Thinking through alternative possibilities helps unravel the unnecessary attachment we have to our "cherished" thoughts, so we can form a more complete picture of how things are. When you learn to "murder your darlings" and embrace different views, your horizons will widen and you'll see a limitless world.
Reference
[1] Steven Sloman (professor at Brown University) and Philip Fernbach (professor at the University of Colorado), The Knowledge Illusion: Why We Never Think Alone
[2] Steven Sloman and Philip Fernbach, The Knowledge Illusion: Why We Never Think Alone
[3] Rolf Dobelli, The Art of Thinking Clearly
[4] Hugo Mercier and Dan Sperber, The Enigma of Reason (Harvard University Press)
[5] Sir Arthur Quiller-Couch, On the Art of Writing
[6] Benjamin L. Luippold, Ph.D., Stephen Perreault, CPA, Ph.D., and James Wainberg, Ph.D., Overcome Confirmation Bias