The Science Behind Why Your Facebook Friends Ignore Facts


I’ve long believed that humans are rational beings. That is to say, we use logic and evidence to make decisions and determine what’s true. As it turns out, a wealth of cognitive research proves that I was decidedly wrong.

We live in a world where more information is flooding our brains than ever before. Advertisers have long battled for our attention. But now, software developers battle for it too, and then sell it back to the advertisers. The more effectively Google, Twitter, and Medium can capture our attention, the more money they make.

If we didn’t filter almost all of the information that we receive, we’d be completely overwhelmed. That’s why our brains use “shortcuts” to pick out the bits of information that are most likely to be useful. And by useful, I don’t necessarily mean true. By useful, I simply mean that the information will help you stay alive.

You may find yourself wondering: Why is the world so divided on religion and politics? Why do people support Donald Trump? Or Hillary Clinton? Why can’t I convince my friend to change his mind?

Below, I’ll share how our brains deal with information overload, and the associated cognitive biases that prevent us from correctly understanding the facts.
The Availability Heuristic: Believing What’s Top of Mind
The availability heuristic is a mental shortcut that relies on the immediate examples that come to mind when judging whether something is true. When it comes time to make a decision, we leverage what is already top of mind: we give greater credence to this information and tend to overestimate the likelihood of similar things happening in the future.

This shortcut is helpful in decision-making because we often lack the time or energy to investigate complex issues in greater depth. The availability heuristic allows us to arrive at a conclusion more quickly. However, like other shortcuts, it can lead us astray. Of course, just because something is in our mind doesn’t mean it’s true.
For example, after Donald Trump described Hillary Clinton as “crooked,” we were primed to interpret her behavior as such. That interpretation doesn’t necessarily mean she’s not crooked; it just means that our brains are more likely to come to that conclusion because it’s easier than evaluating the situation from scratch.

Attentional Bias: Believing What We Pay Attention To
Attentional bias is the tendency for our conclusions to be shaped by our recurring thoughts. It also predicts that attention will be preferentially allocated to threatening, rather than neutral or positive, stimuli.

If you think what you see is the whole story, you’re displaying attentional bias. To arrive at a more accurate conclusion, you also need to consider the things you don’t see. For example, when someone looks at only one, or a few, economic data points and then concludes that the economy is strong or that the government is doing a great job, he is forgoing the time and energy necessary to gain a more complete picture.
The Illusory Truth Effect: Believing What’s Repeated
Repetition is another way that misconceptions enter our knowledge base. Per the illusory truth effect, repeated statements are easier to process and are subsequently perceived as more truthful than new statements. Our brain spends less time and effort processing information that’s been repeated, and takes it as truth simply because it’s familiar. The reverse is also true: people interpret new information with skepticism and distrust.

Take the topic of nutrition. For decades, we’ve been told that eating fat is unhealthy. Despite recent studies suggesting the contrary, our diets continue to be high in sugar and processed carbohydrates.

It doesn’t always matter whether we’re told a truth or a lie; we’re more likely to believe it if it’s repeated. It’s the frequency, not just the veracity, that matters.

The Mere Exposure Effect: Believing What’s Familiar
Not only does repeated exposure make us more likely to believe something, it makes us more likely to form a favorable opinion of it. Per the mere-exposure effect, also known as the familiarity principle, we tend to like things that are familiar to us.

Repeated exposure to a stimulus increases perceptual fluency, which is the ease with which information can be processed. Perceptual fluency, in turn, increases positive sentiment. Familiar things require less effort to process, and that feeling of ease signals truth. We are attracted to familiar people because we consider them safe and unlikely to cause harm. We can even adapt to like things that are objectively unpleasant, such as when former prisoners miss prison.

When trying to make an important decision, have you ever fallen short of considering all the information and possibilities? While we might like to think that we take all the facts into consideration, the reality is that we often overlook some information.

If we fact-checked anything and everything that crossed our minds, we’d be paralyzed. By using the “shortcuts” above, rather than coming to our own conclusions through reason and evidence, we may be wrong some of the time. However, determining what’s true is not always necessary to survive. Sometimes it’s more effective to make faster decisions and err on the side of safety.

Ease Trumps Truth
Imagine if a CEO couldn’t trust his marketing team to analyze data. He wouldn’t be able to focus on keeping the company alive and growing it; he’d be stuck in the minutiae of marketing analytics. Similarly, our brains sacrifice accuracy to increase the chances of surviving and reproducing.

Humans across the world have come to different conclusions about important issues like religion and politics. Logically, then, most of them must be wrong. Yet most people are not dying as a result of believing in the “wrong” religion.

The fact that we are so divided on the election is further evidence that we are not completely rational. Even if one side were “right,” that would mean that about 50% of the population was wrong. If we were even predominantly rational, wouldn’t a more significant portion of the population be on one side?

You might say, “Ah, but the two-party system doesn’t make any sense.” Exactly: we have a two-party system that doesn’t make sense. That’s further evidence for my point! If people were rational, there would be no need for emotional advertising campaigns or political speeches; we would simply educate people about the facts.

When I first realized that people are irrational, I was confused and frustrated. I felt hopeless. I wished things were different. But that only caused anxiety.

Accepting that most people, myself included, are irrational most of the time actually eased the stress. Now I don’t have to wonder why my Facebook friends believe half of the crap that they share and like, why they endorse a political candidate I don’t agree with, or why they believe in their respective religions.

We live in an incredibly complex world. Familiar information is easier to understand and repeat. The cognitive biases above can help us make decisions faster, and therefore stay alive in uncertain environments, but they don’t necessarily help us find truth. In a weird way, being irrational actually makes sense.
