Sunday, October 27, 2013

Is There Such a Thing as "Human Nature"?

Ethan Watters unravels the Western Mind.

In the summer of 1995, a young graduate student in anthropology at UCLA named Joe Henrich traveled to Peru to carry out some fieldwork among the Machiguenga, an indigenous people who live north of Machu Picchu in the Amazon basin. The Machiguenga had traditionally been horticulturalists who lived in single-family, thatch-roofed houses in small hamlets composed of clusters of extended families. For sustenance, they relied on local game and produce from small-scale farming. They shared with their kin but rarely traded with outside groups.

While the setting was fairly typical for an anthropologist, Henrich’s research was not. Rather than practice traditional ethnography, he decided to run a behavioral experiment that had been developed not by psychologists or anthropologists but by economists. Henrich used a “game” – along the lines of the famous prisoner’s dilemma – to see whether relatively isolated cultures shared with the West the same basic instinct for fairness. In doing so, Henrich expected to confirm one of the foundational assumptions underlying such experiments, and indeed underpinning the entire fields of economics and psychology: that humans all share the same cognitive machinery – the same evolved rational and psychological hardwiring.

The test that Henrich introduced to the Machiguenga was called the ultimatum game. The rules are simple: in each game there are two players who remain anonymous to each other. The first player is given an amount of money, say $100, and told that he has to offer some of the cash, in an amount of his choosing, to the other subject. The second player can accept or refuse the split. But there’s a hitch: players know that if the recipient refuses the offer, both leave empty-handed. North Americans, who are the most common subjects for such experiments, usually offer a 50-50 split when on the giving end. When on the receiving end, they show an eagerness to punish the other player for uneven splits at their own expense. In short, Americans show the tendency to be equitable with strangers – and to punish those who are not.

Among the Machiguenga, word spread quickly of the young, square-jawed visitor from America giving away money. The stakes Henrich used in the game with the Machiguenga were not insubstantial –  roughly equivalent to a few days’ wages they sometimes earned from episodic work with logging or oil companies. So Henrich had no problem finding volunteers. What he had great difficulty with, however, was explaining the rules, as the game struck the Machiguenga as profoundly odd.

When he began to run the game it became immediately clear that Machiguengan behavior was dramatically different from that of the average North American. To begin with, the offers from the first player were much lower. In addition, when on the receiving end of the game, the Machiguenga rarely refused even the lowest possible amount. “It just seemed ridiculous to the Machiguenga that you would reject an offer of free money,” says Henrich. “They just didn’t understand why anyone would sacrifice money to punish someone who had the good luck of getting to play the other role in the game.”
The implications of these unexpected results were quickly apparent to Henrich. He knew that a vast amount of scholarly literature in the social sciences – particularly in economics and psychology – relied on the ultimatum game and similar experiments. At the heart of most of that research was the implicit assumption that the results revealed evolved psychological traits common to all humans, never mind that the test subjects were nearly always from the industrialized West. Henrich realized that if the Machiguenga results stood out, and if similar differences could be measured across other populations, this assumption of universality would have to be challenged.

Initially, Henrich thought he would be merely adding a small branch to an established tree of knowledge. It turned out he was sawing at the very trunk. He began to wonder: What other certainties about “human nature” in social science research would need to be reconsidered when tested across diverse populations?

With the help of a dozen other colleagues he led a study of 14 other small-scale societies, in locales from Tanzania to Indonesia. Differences abounded in the behavior of both players in the ultimatum game. In no society did he find people who were purely selfish (that is, who always offered the lowest amount, and never refused a split). Average offers varied widely from place to place and, in some societies – ones where gift-giving is heavily used to curry favor or gain allegiance – the first player would often make overly generous offers in excess of 60 percent, and the second player would often reject them, behaviors almost never observed among Americans.

Henrich’s work also made him a controversial figure. When he presented his research to the anthropology department at the University of British Columbia during a job interview a year later, he recalls a hostile reception. Anthropology is the social science most interested in cultural differences, but the young scholar’s methods of using games and statistics to test and compare cultures with the West seemed heavy-handed and invasive to some. “Professors from the anthropology department suggested it was a bad thing that I was doing,” Henrich remembers. “The word ‘unethical’ came up.”

So instead of toeing the line, he switched teams. A few well-placed people at the University of British Columbia saw great promise in Henrich’s work and created a position for him, split between the economics department and the psychology department. It was in the psychology department that he found two kindred spirits in Steven Heine and Ara Norenzayan. Together the three set about writing a paper that they hoped would fundamentally challenge the way social scientists thought about human behavior, cognition, and culture.

A modern liberal arts education pays a great deal of lip service to the idea of cultural diversity. It’s generally agreed that all of us see the world in ways that are sometimes socially and culturally constructed, that pluralism is good, and that ethnocentrism is bad. But beyond that the ideas get muddy. That we should welcome and celebrate people of all backgrounds seems obvious, but the implied corollary – that people from different ethno-cultural origins have particular attributes that add spice to the body politic – becomes more problematic. Because liberal arts students tend to be hyper-vigilant about stereotyping, it is rarely stated bluntly just what those culturally derived qualities might be. Challenge liberal arts graduates on their appreciation of cultural diversity and you’ll often find them retreating to the anodyne notion that under the skin everyone is really alike.

If you take a broad look at social science curricula of the last few decades, it becomes clear why modern graduates are so unmoored. The last generation or two of undergraduates have largely been taught by a cohort of social scientists busily doing penance for the racism and Eurocentrism of their predecessors, albeit in different ways. Many anthropologists took to the navel gazing of postmodernism and swore off attempts at rationality and science, which were disparaged as weapons of cultural imperialism.

Economists and psychologists skirted the issue with the convenient assumption that their job was to study the human mind stripped of culture. The human brain is genetically comparable around the globe, it was agreed, so human hardwiring for much behavior, perception, and cognition should be similarly universal. No need, in that case, to look beyond the convenient population of undergraduates for test subjects. A 2008 survey of the top six psychology journals showed dramatically that more than 96 percent of the subjects tested in psychological studies from 2003 to 2007 were Westerners – with nearly 70 percent from the United States alone. Put another way: 96 percent of human subjects in these studies came from countries that represent only 12 percent of the world’s population.

Henrich’s work with the ultimatum game emerged from a small but growing counter trend in the social sciences, one in which researchers look straight at the question of how deeply culture shapes human cognition. His new colleagues in the psychology department, Heine and Norenzayan, were also part of this trend. Heine focused on the different ways people in Western and Eastern cultures perceived the world, reasoned, and understood themselves in relationship to others. Norenzayan’s research focused on the ways religious belief influenced bonding and behavior. The three began to compile examples of cross-cultural research that challenged long-held assumptions of human psychological universality.

Some of this research went back a generation. It was in the 1960s that researchers discovered that aspects of visual perception varied from place to place. One of the classics of the literature, the Müller-Lyer illusion, showed that where you grew up determined to what degree you would fall prey to the illusion that its two lines of equal length are different.

Researchers found that Americans perceive the line with the ends feathered outward (B) as being longer than the line with the arrow tips (A). San foragers of the Kalahari, on the other hand, were more likely to see the lines as they are: equal in length. Subjects from more than a dozen cultures were tested, and Americans were at the far end of the distribution – seeing the illusion more dramatically than all others.

Recently, psychologists began to challenge the universality of research done in the 1950s by the pioneering social psychologist Solomon Asch. Asch had discovered that test subjects were often willing to make incorrect judgments on simple perception tests to conform with group pressure. When the test was performed across 17 societies, however, it turned out that group pressure had a range of influence. Americans were again at the far end of the scale, showing the least tendency to conform to group belief.

Heine, Norenzayan, and Henrich’s research continued to find wide cultural differences almost everywhere they looked: in spatial reasoning, the way we infer the motivations of others, categorization, moral reasoning, the boundaries between the self and others, and other arenas. These differences, they believed, were not genetic. The distinct ways Americans and Machiguengans played the ultimatum game weren’t the result of differently evolved brains. Rather, Americans were manifesting psychological tendencies shared with people in other industrialized countries – tendencies that had been refined and passed down through thousands of generations in increasingly complex and competitive market economies.

When people are constantly doing business with strangers, it helps when they have the desire to go out of their way (with a lawsuit or a bad Yelp review) when they feel cheated. Because Machiguengan culture had a different history, their gut feeling about what was fair was distinctly their own. In small-scale societies with a strong culture of gift-giving, yet another conception of fairness prevailed. There, generous financial offers were turned down because people’s minds had been shaped by a cultural norm that taught them that the acceptance of generous gifts brought burdensome obligations. Our economies hadn’t been shaped by our idea of fairness; it was the other way around.

The growing body of cross-cultural research that the three researchers were compiling suggested that the mind’s capacity to mold itself to cultural and environmental settings was far greater than had been assumed. The most interesting thing about cultures may not be in the observable things they do – the rituals, eating preferences, codes of behavior, and the like – but in the way they mold our most fundamental conscious and unconscious thinking and perception. The different ways people perceive the Müller-Lyer illusion reflect lifetimes spent in different physical environments. American children, for the most part, grow up in box-shaped rooms of varying dimensions. Surrounded by carpentered corners, their visual perception adapts to this strange new environment (strange and new in terms of human history, that is) by learning to perceive converging lines in three dimensions. When unconsciously translated into three dimensions, the line with the outward-feathered ends (B) appears farther away, and the brain therefore judges it to be longer. The more time one spends in natural environments, where there are no carpentered corners, the less one sees the illusion.

But the three researchers began to notice something else remarkable: again and again there was one group of people that appeared to be remarkably unusual when compared to other populations –  with perceptions, behaviors, and motivations that almost always slid down the far end of the human bell curve.

In the end they titled their paper “The Weirdest People in the World?” By “weird” they meant both unusual and Western, Educated, Industrialized, Rich, and Democratic.
It is not just our Western habits and cultural preferences that differ from the rest of the world. The very way we think about ourselves and others – even the way we perceive reality – makes us radically distinct from other humans on the planet today, not to mention from the vast majority of our ancestors. Among Westerners, the data showed that Americans were typically the most unusual, leading the researchers to conclude that “American participants are exceptional even within the unusual population of Westerners – outliers among outliers.”
Given this discovery, they concluded that social scientists could not possibly have picked a worse population from which to draw broad generalizations about universal human traits. Generations of researchers had been doing the equivalent of studying penguins while believing that they were learning insights applicable to all birds.

Not long ago I met Henrich, Heine, and Norenzayan for dinner at a small French restaurant in Vancouver, British Columbia, to hear about the reception of their “weird” paper, which was published in the prestigious journal Behavioral and Brain Sciences in 2010. The trio of researchers are young – as professors go –  good-humored family men. They recalled their nervousness as the publication time approached. They faced the possibility of becoming outcasts in their own fields as their paper boldly suggested that much of what social scientists thought they knew about fundamental aspects of human cognition was likely only true of one small slice of humanity.
“We were scared,” admitted Henrich. “We were warned that a lot of people were going to be upset.” “We were told we were going to get spit on,” interjected Norenzayan.
“Yes,” Henrich said. “That we’d go to conferences and no one was going to sit next to us at lunchtime.”
They seemed much less concerned that they had used the pejorative acronym WEIRD to describe a significant slice of humanity, although they did admit that they could only have done so to describe their own group. “Really,” said Henrich, “the only people we could have called weird are represented right here at this table.”

Still, I felt that describing the Western mind, and the American mind in particular, as “weird” suggested that our cognition is not just different but somehow malformed or twisted.

In their paper the trio pointed to cross-cultural studies suggesting that the “weird” (Western, Educated, Industrialized, Rich, Democratic) mind is the most self-aggrandizing and egotistical on the planet: we are more likely to promote ourselves as individuals than to advance as a group.
 
WEIRD minds are also more analytic, possessing the tendency to telescope in on an object of interest rather than understanding that object in the context of what is around it. The WEIRD mind also appears to be unique in terms of how it comes to understand and interact with the natural world. Studies show that Western urban children grow up so closed off in man-made environments that their brains never form a deep or complex connection to the natural world. Studying children from the U.S., researchers have suggested a developmental timeline for what is called “folkbiological reasoning.” These studies posit that it is not until children are around seven years old that they stop projecting human qualities onto animals and begin to understand that humans are one animal among many. Compared to children in Yucatec Maya communities in Mexico, Western urban children appear to be developmentally delayed in this regard. Children who grow up constantly interacting with the natural world are much less likely to anthropomorphize other living things into late childhood.
Given that people living in WEIRD societies don’t routinely encounter or interact with animals other than humans or pets, it’s not surprising that they end up with a rather cartoonish understanding of the natural world. “Indeed,” the report concluded, “studying the cognitive development of folkbiology in urban children would seem the equivalent of studying ‘normal’ physical growth in malnourished children.”

During our dinner, I admitted to Heine, Henrich, and Norenzayan that the idea that I can only perceive reality through a distorted cultural lens was unnerving. For me, the notion raised all sorts of metaphysical questions: Is my thinking so strange that I have little hope of understanding people from other cultures? Can I mold my own psyche or the psyches of my children to be less WEIRD and more able to think like the rest of the world? If I did, would I be happier?
Henrich said I was taking this research too personally. He had not intended for his work to be read as postmodern self-help advice.

The three insisted that their goal was not to say that one culturally shaped psychology was better or worse than another –  but that we’ll never begin to understand human behavior and cognition until we expand the sample pool beyond its current small slice of humanity. Despite these assurances, I found it hard not to read a message between the lines of their research. When they write, for example, that WEIRD children develop their understanding of the natural world in a “culturally and experientially impoverished environment” and that they are in this way analogous to “malnourished children,” it’s difficult to see this as a good thing.

The turn that Henrich, Heine, and Norenzayan are urging social scientists to make is not an easy one; accounting for the influence of culture on cognition will be a herculean task. Cultures are not monolithic; they can be endlessly parsed. Ethnic backgrounds, religious beliefs, economic status, parenting styles, rural upbringing versus urban or suburban – there are hundreds of cultural differences that individually and in endless combinations influence our conceptions of fairness, how we categorize things, our method of judging and decision making, and our deeply held beliefs about the nature of the self, among other aspects of our psychological makeup.

We are just at the beginning of learning how these fine-grained cultural differences affect our thinking. Recent research has shown that people in “tight” cultures, those with strong norms and a low tolerance for deviant behavior, develop higher impulse control and more self-monitoring abilities than those from other places. Men raised in the honor culture of the American South have been shown to experience much larger surges of testosterone after insults than do Northerners. Research published late last year suggested psychological differences at the city level, too. Compared to San Franciscans, Bostonians’ internal sense of self-worth is more dependent on community status and financial and educational achievement. “A cultural difference doesn’t have to be big to be important,” Norenzayan said. “We’re not just talking about comparing New York yuppies to the Dani tribesmen of Papua New Guinea.”

As Norenzayan sees it, the last few generations of psychologists have suffered from “physics envy,” and they need to get over it. Their job, experimental psychologists often assumed, was to push past the content of people’s thoughts and see the underlying universal hardware at work. “This is a deeply flawed way of studying human nature,” Norenzayan told me, “because the content of our thoughts and their process are intertwined.” In other words, if human cognition is shaped by cultural ideas and behavior, it can’t be studied without taking into account what those ideas and behaviors are and how they are different from place to place.

Henrich, Heine and Norenzayan’s fear of being ostracized after the publication of the WEIRD paper turned out to be misplaced. Response to the paper, both published and otherwise, has been nearly universally positive, with more than a few of their colleagues suggesting that the work will spark fundamental changes.

After reading the paper, academics from other disciplines began to come forward with their own mea culpas. In response to the WEIRD paper, two brain researchers from Northwestern University argued that the nascent field of neuroimaging had made the same mistake as psychologists, noting that 90 percent of neuroimaging studies were performed in Western countries. Researchers in motor development similarly suggested that their discipline’s body of research ignored how different child-rearing practices around the world dramatically influence development. Two psycholinguistics professors suggested that their colleagues also made the same mistake: blithely assuming human homogeneity while focusing their research primarily on one rather small slice of humanity.

The challenge of the WEIRD paper is not simply to the field of experimental human research (do more cross-cultural studies!); it is a challenge to our Western conception of human nature. 

For some time now, the most widely accepted answer to the question of why humans, among all animals, have so successfully adapted to environments across the globe is that we have big brains with the ability to learn, improvise, and problem-solve. Henrich has replaced this “cognitive niche” hypothesis with the “cultural niche” hypothesis. He notes that the amount of knowledge in any culture is far greater than the capacity of individuals to learn or figure it all out on their own. He suggests that individuals tap that cultural storehouse of knowledge simply by mimicking (often unconsciously) the behavior and ways of thinking of those around them. We shape a tool in a certain manner, adhere to a food taboo, or think about fairness in a particular way, not because we individually have figured out that behavior’s adaptive value, but because we instinctively trust our culture to show us the way.

When Henrich asked Fijian women why they avoided certain potentially toxic fish during pregnancy and breastfeeding, he found that many didn’t know. Regardless of their personal understanding, by mimicking this culturally adaptive behavior they were protecting their offspring. The unique trick of human psychology, these researchers suggest, might be this: our big brains are evolved to let local culture lead us in life’s dance.

The applications of this new way of looking at the human mind are still budding. Henrich suggests that his research about fairness might first be applied to anyone working in international relations or development. People are not “plug and play,” as he puts it, and you cannot expect to drop a Western court system or form of government into another culture and expect it to work as it does back home. Those trying to use economic incentives to encourage sustainable land use will similarly need to understand local notions of fairness to have any chance of influencing behavior in predictable ways.

Because of our peculiarly Western way of thinking of ourselves as independent of others, this idea of the culturally shaped mind doesn’t go down very easily.
Perhaps the richest and most established vein of cultural psychology – that which compares Western and Eastern concepts of the self –  goes to the heart of this problem. Heine has spent much of his career following the lead of a seminal paper published in 1991 by Hazel Rose Markus, of Stanford University, and Shinobu Kitayama, who is now at the University of Michigan. Markus and Kitayama suggested that different cultures foster strikingly different views of the self, particularly along one axis: some cultures regard the self as independent from others; others see the self as interdependent. The interdependent self –  which is more the norm in East Asian countries, including Japan and China –  connects itself with others in a social group and favors social harmony over self-expression. The independent self –  which is most prominent in America –  focuses on individual attributes and preferences and thinks of the self as existing apart from the group.

That we in the West develop brains wired to see ourselves as separate from others may also be connected to differences in how we reason. Unlike the vast majority of the world, Westerners (Americans in particular) tend to reason analytically as opposed to holistically. The American mind strives to figure out the world by taking it apart and examining its pieces.

Show a Japanese person and an American the same cartoon of an aquarium, and the American will remember details mostly about the moving fish, while the Japanese observer will likely later be able to describe the seaweed, the bubbles, and other objects in the background. In the “rod and frame” task, where one has to judge whether a line is vertical even though the frame around it is skewed, Americans easily see the line apart from the frame, just as they see themselves as apart from the group.

Heine and others suggest that such differences may be the echoes of cultural activities and trends going back thousands of years. Whether you think of yourself as interdependent or independent may depend on whether your distant ancestors farmed rice (which required a great deal of shared labor and group cooperation) or herded animals (which rewarded individualism and aggression). Heine points to Nisbett at Michigan, who has argued that the analytic/holistic dichotomy in reasoning styles can be clearly seen, respectively, in Greek and Chinese philosophical writing dating back 2,500 years. These psychological trends and tendencies may echo down generations, hundreds of years after the activity or situation that brought them into existence has disappeared or fundamentally changed.

And here is the rub: the culturally shaped analytic/individualistic mind-sets may partly explain why Western researchers have so dramatically failed to take into account the interplay between culture and cognition. In the end, the goal of boiling down human psychology to hardwiring is not surprising given the type of mind that has been designing the studies. Taking an object (in this case the human mind) out of its context is, after all, what distinguishes the analytic reasoning style prevalent in the West. Similarly, we may have underestimated the impact of culture because the very ideas of being subject to the will of larger historical currents and of unconsciously mimicking the cognition of those around us challenges our Western conception of the self as independent and self-determined. The historical missteps of Western researchers are the predictable consequences of the WEIRD mind doing the thinking.

Ethan Watters, a contributor to This American Life, Mother Jones, and Wired, is the author of Crazy Like Us: The Globalization of the American Psyche. This article originally appeared in Pacific Standard.

Friday, October 4, 2013

The Human Condition: Struggles of Cosmic Insignificance: The Polarized Mind

The polarized mind — a reaction to the human condition and feelings of cosmic insignificance — fixates on one point of view to the utter exclusion of all others.

By Kirk J. Schneider, PhD
October 2013

Polarizing Events Within the Mind
"The Polarized Mind" by Kirk J. Schneider, PhD, draws from the standpoint of existential psychology, and details how the polarized mind has ravaged leaders and cultures throughout history.





In The Polarized Mind (University Professors Press, 2013), Kirk J. Schneider, PhD, argues that when an individual is stricken with one absolute belief, to the exclusion or even demonization of all others, the result is bigotry, tyranny, and vengefulness. Dr. Schneider draws on his work in the field of humanistic depth psychology to posit that polarization is caused by a sense of cosmic insignificance, heightened by the trials of personal trauma. In this selection from "The Bases of Polarization," he shows how the nature of the human condition plays a fundamental role in the formation of polarization in the human mind.


The Bases of Polarization


Throughout human history, people have repeatedly swung between extremes. Arthur Schlesinger Jr. (1986) noted these swings in his classic book The Cycles of American History. In this book, Schlesinger articulated the continuous political swings in U.S. history, particularly those between conservatism and liberalism, rigidity and permissiveness. However, there are many other forms of such swings in many other times and places.

The usual explanations for the swings of history, as well as of individuals, are cultural, political, and biological. The founders of the United States swung away from the British motherland because of political and religious oppression. Certain nineteenth-century abolitionists resorted to armed struggle because of unrelenting federal support of slavery. Post-World War I Germany amassed a titanic arsenal, in part to avenge the humiliation it perceived in the Treaty of Versailles. McCarthyite anti-communists swelled in number following the advent of Soviet expansionism in Eastern Europe. And so on. Schlesinger provides a cogent summation of these various dynamics:
"The roots of…cyclical self-sufficiency doubtless lie deep in the natural life of humanity. There is a cyclical pattern in organic nature — in the tides, in the seasons, in the night and day, in the systole and diastole of the human heart….People can never be fulfilled for long either in the public or in the private sector. We try one, then the other, and frustration compels a change in course. Moreover, however effective a particular course may be in meeting one set of troubles, it generally falters and fails when new troubles arise."

At the individual level, too, the conventional wisdom embraces both cultural and biological explanations. Depression is now frequently considered a biologically based disorder, rooted in an imbalance of serotonin in the brain. Anorexia, too, is often considered a biologically and culturally based condition, stemming from an overemphasis on thinness in Western fashion. Obsessive-compulsiveness, mania, criminality, and many other forms of suffering are also considered combinations of biologically or genetically based chemical imbalances and familial or cultural influences.

However, thanks to the expanding insights of psychological depth research, we now have a clearer picture that what we once took to be biologically or culturally based appears to be rooted in a much thornier problem — the condition of being human. What I mean by this is that polarization in all forms appears to be based not just on a reaction against a particular family, or society, or physiology but on the shocking nature of the human condition itself, which, at its extremes, is the most daunting condition of all.

And what is this human condition (or “la condition humaine,” as André Malraux put it)? It is the relationship of the human being to the groundlessness of space and time, to death, and to the most radical mystery of all, existence itself.

Just consider, for example, what happens in a classic pattern of polarization. A person or group becomes injured, and the injury leads to a reaction. This reaction might take the form of a religious decree to rid the world of infidels (e.g., those who had formerly attempted to undermine established religious precepts). It might take the form of a political mobilization (e.g., fanatical nationalism) in the wake of a threatened state (e.g., post-World War I Germany). Or it might take the form of a humiliated individual vowing to avenge his abusers. In each of these cases there is a time-tested dynamic at play: initially there is a sense of helplessness (despair), then a reaction against that helplessness (polarization, fanaticism), followed by a destructive outbreak as a result of that reaction.

Hence, if we peel back the layers of this scenario, what do we find? We certainly find physiological reactions (e.g., fight-or-flight, as Schlesinger and others have noted), classical psychodynamic issues stemming from childhood (as Freud and others have contended), and behavioral dimensions, such as conditioned reactions to aversive stimuli (as Skinner and others have pointed out). But are these really the essential building blocks of polarized experiences, of stuck and life-encompassing fixations, or of the extremes and fanaticisms of theocracies, military-industrial complexes, and hate-driven assassins? The emerging consensus is quite probably "no."

Although the traditional explanations work to a point, depth research reveals that they are but harbingers or glimpses of a much more encompassing problem. For example, the latest research on the roots of extremism centers on what is aptly termed "terror management theory." In this theory, built on a growing base of cross-cultural research, the deeper we probe the layers of psychosocial extremes, the closer we come to anxieties about existence itself, anxieties that coincide with and extend beyond the physiological, the familial, and the cultural. In short, according to terror management theory, polarization arises from culturally diversified experiences of death anxiety, and death anxiety is aroused by an extraordinary range of secondary fears. Among these fears is humiliation, or the terror of feeling insignificant. Witness the following conclusion by Arie Kruglanski, one of the leading investigators of global terrorism. Drawing in part on terror management research, and in part on the data that he and his colleagues at the National Consortium for the Study of Terrorism and Responses to Terrorism have amassed, he asserts:

"Personal significance is a motivation that has been recognized by psychological theorists as a major driving force of human behavior. Terrorists feel that through suicide, their lives will achieve tremendous significance. They will become heroes, martyrs. In many cases, their decision is a response to great loss of significance, which can occur through humiliation, discrimination or personal problems…Interesting[ly], research shows that poverty is not the root cause of terrorism. Many terrorists come out of the middle class, and some [like Osama Bin Laden] are quite well-to-do."
As Kruglanski intimates, therefore, terrorism in particular and polarization in general are rooted in a very profound problem of the human situation. It is not a problem that can simply be eliminated through material comforts, physical well-being, or even, in some cases, loving and well-adjusted families. It is a problem that each individual must confront, in varying degrees, over the course of a lifetime, because it is not a problem that will go away. Although physical and psychological vulnerability, and ultimately annihilation, appear to be at the crux of this problem, I suggest that there is something even more subtle at play, something even more harrowing. This "something" is what I and others call "existential anxiety." Existential anxiety is not just the fear of physical death but the fear of the implications of physical death. To exist in a universe that has no calculable end and no calculable beginning: that is a radical mystery. It is our terror of our bewildering fragility, our nothingness before the vastness of space and time, and our steady transformation from matter to inexplicable dust.

Trauma, shock, and disruption all alert us to this incomprehensible state of affairs. They jar us out of our comfort zones and peel back the profundities lying just beneath our routines. Virtually everyone who is polarized, I contend, has been a victim of existential panic; and virtually all of us, in varying degrees, have experienced this polarization. The question is: How do we prevent, or at least manage, the most destructive polarizations — the polarizations that wage egregious wars, that initiate relentless hatred, that concentrate obscene accumulations of wealth, and that deplete, every day, the vital resources of nations?

Before we can address this question, we need to look more thoroughly at the psychological bases of polarization throughout history, the damage that polarization has wreaked, and the fates that have awaited those who fought valiantly to oppose it.


Reprinted with permission from The Polarized Mind: Why It's Killing Us and What We Can Do About It by Kirk J. Schneider, PhD, published by University Professors Press, 2013.