Critical Thinking & Heuristics
To be a good scientist, it’s healthy to have a touch of amiable skepticism. What is amiable skepticism? It’s a trait that combines openness to new ideas and scientific findings with wariness when good evidence and reasoning don’t seem to support those ideas.
To practice amiable skepticism, you have to be a critical thinker. Critical thinking involves looking for holes in an argument, using logic and reasoning to see whether information makes sense, and considering alternative explanations.
Otherwise, behaviors like confirmation bias pop up, as they do in today’s world of “fake” news. Confirmation bias is the tendency to latch onto information that corroborates your existing beliefs while dismissing information that doesn’t.
To avoid this bias and keep an open mind, you can check yourself with these questions:
If your answer to the last question is yes, then science has helped you through psychological reasoning.
Psychological reasoning examines how people think and aims to understand when, and explain why, people are likely to draw erroneous conclusions. People want to make sense of the world and the events they’re involved in, so the brain works efficiently to find patterns and make connections between ideas. Sometimes, though, we see patterns where none really exist. For example, when looking at clouds, you may see faces, images, or animals. When playing music backwards, you may hear satanic or hidden messages, or maybe you think big events happen in threes. (“I am not superstitious, but I am a little ‘stitious.” – Michael Scott)
None of this is to say that these reasoning skills make us immune to mistakes; people still make errors and draw biased conclusions. In fact, research has shown that when we are wrong, we tend to be wrong in predictable ways. Even so, those errors don’t stop us from making new discoveries and helping to advance society. Besides confirmation bias (that is, ignoring evidence unless it fits with your current beliefs), what are the other major biases, the “predictable ways we are wrong,” discussed in psychology?
This bias deals with appeals to authority: when sources point to their expertise rather than to the evidence or the facts. Take the two viral videos that have run amok on social media since April (example 1 and example 2). Yes, they are doctors. Should you believe them just because of who they say they are and how many years they have spent in the profession? A scientist with amiable skepticism would question their arguments, especially as evidence accumulates against them. Posed a different way: should you believe everything I have to say here just because I am a PhD candidate studying Cognitive Science? Since I have more than seven years of experience in behavioral science, my arguments may carry more weight than those of your friend who took a high school psych class, but whether they do is for your appeal-to-authority bias to decide.
Just because two facts or ideas correlate does not mean one caused the other. For example, global temperatures have been rising while the number of pirates on the high seas has been falling. The two trends are correlated, but do you really think the decline in piracy is warming the planet?
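As a toy sketch with made-up numbers (my own illustration, not from the post), here is how two series that merely trend over time can show a strong correlation even though neither causes the other:

```python
import numpy as np

# Hypothetical yearly figures, 1980-2020 (made-up numbers): temperatures drift
# upward over time while pirate counts drift downward, with a little noise.
rng = np.random.default_rng(0)
years = np.arange(1980, 2021)
temp_anomaly = 0.02 * (years - 1980) + rng.normal(0, 0.05, years.size)
pirate_count = 5000 - 100 * (years - 1980) + rng.normal(0, 200, years.size)

# Pearson correlation between the two series.
r = np.corrcoef(temp_anomaly, pirate_count)[0, 1]
print(f"correlation r = {r:.2f}")  # strongly negative, yet neither causes the other
```

The specific numbers don’t matter; the point is that a shared trend (here, simply the passage of time) is enough to produce a large correlation coefficient without any causal link.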
Relative comparison is when people judge the inherent value of something by comparing it to something else. For example, you probably feel a little better about getting an 85 on an exam after learning the class average was 75 rather than 95.
Feeling like you can explain a situation after the fact is called hindsight bias. In other words, it’s when people come up with an explanation for why events happened, even when their information is incomplete.
(“I knew you were trouble when you walked in,” sings Taylor Swift, in typical hindsight fashion.)
I’m one of the few natives from Vegas, so sometimes gambling feels like it’s in my blood. From experience, I know that the machines and tables have the odds stacked against you, but people still take the risk for the chance at a bigger payout. People take similar risks with facts and ideas; the question remains whether you end up with the big payout or not.
Every decision is made under some degree of risk, and quite often, decision-making involves heuristics. Heuristics are fast and efficient strategies that people use to make these decisions: mental shortcuts, rules of thumb, or informal guidelines. The thing to keep in mind about heuristics is that they often operate unconsciously, meaning we’re often not aware we are taking these mental shortcuts. Here are a few examples of heuristics:
The answer is the third letter, btw.
As humans, we are positive-thinking creatures and are motivated to feel good about ourselves; however, this motivation can affect how you think, and it highlights why people have difficulty seeing their own weaknesses. For example, about 90% of drivers think they are better than average. Reality check: only 50% of drivers can be above average on any given dimension.
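To put that arithmetic in concrete terms, here is a small sketch with simulated (entirely made-up) numbers showing why the two percentages can’t both reflect reality:

```python
import numpy as np

# Made-up illustration: simulate a roughly symmetric "driving skill" score
# for 100,000 drivers.
rng = np.random.default_rng(42)
skill = rng.normal(loc=50, scale=10, size=100_000)

# With a symmetric distribution, about half the drivers sit above the mean.
share_above_average = np.mean(skill > skill.mean())
print(f"share actually above average: {share_above_average:.0%}")  # about 50%

# If ~90% of drivers believe they are above average, then at least
# 0.90 - 0.50 = 0.40 of them must be overestimating their own skill.
print(f"minimum share overestimating: {0.90 - share_above_average:.0%}")
```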
I know I have fallen into this trap before, which is why I have worked hard to maintain an open mindset rather than a closed one. If you close yourself off, you fall directly into the self-serving bias.
I hope this gives you some insight into how and what others are thinking when discussing ideas about complex human behavior.
Resource: Gazzaniga, M., Heatherton, T., & Halpern, D. (2016). Psychological Science (5th ed.). New York: Norton.