The human brain evolved to find patterns. We find them everywhere, from nature to nurture, from math to languages, from individual human behavior to group social dynamics. It is the reason why we say that history repeats itself and why we enjoy music so much. If you look for patterns, you will undoubtedly find them; the human brain has evolved to do so.
Our species developed this pattern-detection software as a means of survival: recognizing patterns in the jungle alerted early humans to things like predators, prey, and weather changes. Recognizing the herding patterns of buffalo or the seasonal patterns of rain was paramount to surviving in a dangerous world. “Because we must make associations in order to survive and reproduce, natural selection favored all association-making strategies,” Shermer writes, “even those that resulted in false positives.” This is important, because while failing to recognize a rustle in the leaves as a poisonous snake can get us bitten and killed, recognizing a rustle in the leaves as a potential danger when there is no actual danger does us no harm. Our pattern-detection software is always running, regardless of whether the patterns it detects are helpful or erroneous.
In his book, Shermer defines two new vocabulary words. The first is patternicity, which he defines as “the tendency to find meaningful patterns in both meaningful and meaningless data.” Our brains have become exceedingly good at finding patterns everywhere, even when no real pattern actually exists, and our modern world gives ample opportunity for this pattern-detection system to operate. The second word Shermer offers his readers is agenticity, which he defines as “the tendency to infuse patterns with meaning, intention, and agency.” Again, this comes from our long evolution as a species. It matters when the sky changes color: incoming storm clouds mean something. The problem we have today is that we place too much importance on our false-positive beliefs. When the leaves are rustling and we believe there is a snake, we might take the longer path home, when in reality there is no snake and we are only making our own lives more difficult. “With this evolutionary perspective we can now understand that people believe weird things because of our evolved need to believe nonweird things.”
Sadly, our human brains have only one neural network in charge of our beliefs, rather than two separate areas for belief and disbelief. This matters, because if you believe that ‘torture is wrong’ and also that ‘2 plus 2 equals 4,’ both beliefs occur in the same place in the brain. This can cross our wires between ethics and science, because as far as the brain is concerned, the two beliefs are alike. Couple this phenomenon with the philosopher Baruch Spinoza’s conjecture—that beliefs come quickly and naturally while skepticism is slow and unnatural—and we have a recipe for trouble. Our most strongly held beliefs are often jumps to conclusions. “The scientific principle that a claim is untrue unless proven otherwise runs counter to our natural tendency to accept as true that which we comprehend quickly.” Science is the best tool we have devised for determining the validity of claims of belief, but it is one we do not use widely enough. The only way we know to approach real truth is the scientific method: testing a hypothesis through experiment and examining the results. It does not matter how strong your belief is: if it disagrees with experiment and replicable outcomes, it is lacking. Still, even if you come to a complete scientific agreement about a phenomenon (that the sun is the center of the solar system, for example), the same part of your brain that believes this to be true might also believe that we should jail people for wearing open-toed shoes in public. One person can hold both of these ideas, because they are reinforced by the same neural network.
This brings us to the ultimate thesis of this book: human beings form beliefs first and rationally back them up second. We are the opposite of Spock from Star Trek, a character famous for his cold, hard, logical thinking. We react to situations emotionally. We use our pattern-detection software to understand the world and to make guesses about how the future will play out, and then we rationalize these thoughts post hoc. Obviously, this is not a great system for optimizing our potential. It also explains why so many people believe such wildly different (and often completely ridiculous) things.
This further complicates society and our social interactions with one another because “our perceptions about reality are dependent on the beliefs that we hold about it.” Reality exists independent of human minds, but our understanding of it depends upon the beliefs we hold at any given time. If the wind is rustling the leaves a lot, we might believe the area is full of snakes, which could cause us to move our camp to another territory. The reality may be that our first spot was safer than our new one. If we believe that gun ownership is the reason why so many mass shootings occur, that will lead us to take different actions to curb the violence than if we believe that mental illness is the culprit. Each individual’s subjective (emotional and psychological) reasons for believing what causes mass shootings are influenced by their family, friends, and culture, and our pattern-finding brains reinforce these beliefs and give them meaning. This is helpful when our patternicity and agenticity align with empirical reality (when there really is a snake in the grass), and harmful when they do not. It is also what makes democracy so difficult—everybody has their own solution to the problems we face, and we all believe ourselves to be right and everyone else to be wrong. Coming to a collective agreement can be exceedingly difficult.
So, how do we change our beliefs? The first step is understanding how we come to them and that they are often faulty from the start. We have to recognize the role that our rationality plays in sustaining them, especially in the face of contradictory evidence. Oftentimes this is not enough, with true change only coming from deeper social and cultural shifts in the underlying zeitgeist. These changes are a product of “larger and harder-to-define political, economic, religious, and societal changes.” Nonetheless, a little more skepticism is a healthy thing—especially skepticism of ourselves and our own beliefs. It is important to occasionally hold your beliefs up in the mirror and ask yourself: are these actually, empirically, true?