Tuesday, May 28, 2013

An Appetite for Aggression: The peculiar psychology of war likely holds answers for avoiding future atrocities


Summarized Reading




 

Scientific American Mind

May/June 2013
p. 46
 
 
Fear is a pure form of stress.
 
We all try as best we can to avoid fear, but the emotion is the key to survival.
When humans recognize they are in acute danger, the brain triggers a cascade of physical alarms.
 
   Senses sharpen to take in information about prospective threats.
 
   Blood rushes to the muscles.
 
   The body releases chemicals to suppress pain.
 
Then the individual is ready either to run for cover or to stand and fight.
An aggressive move often is enough to make a foe back down.
 
Lashing out in self-defense is a common response.
 
Researchers call this facilitative aggression or reactive aggression.
 
There is another form of violence--an evil form--called appetitive aggression.
Appetitive aggression arises from the thrill of the hunt.
 
The act of planning an attack can arouse intense excitement.

Appetitive aggression is a common phenomenon--which makes it truly frightening.
 
The more that young men perceive violence as giving them feelings of superiority and pleasure otherwise lacking in their lives--the more frequently they engage in aggressive acts and the more they seek out the stimulation that hunting and killing provides.
 
Inflicting pain on others is part of a human's basic stock of behaviors--every bit as much as caring for the sick and the injured.
 
The question is why it is part of the basic stock of behaviors.

A few million years ago, our ancestors began to hunt and to eat meat.

This provided a concentrated source of energy lacking in an earlier all-vegetarian diet.
 
According to one theory, eating meat allowed humans' energy-intensive brains to grow bigger and more complex.
 
This evolutionary development provided a cognitive advantage that allowed humans to dominate the planet.
 
The more successful hunters could feed more offspring, attract more sexual partners, and attain higher status in a group.

Humans not only hunt other animals; when conflict arises, they also hunt members of their own species.
 
Culture is what restrains individuals and society from wanton (malicious and unrestrained) violence by outlining who is friend and who is foe.
Humans are socialized into a code of conduct that disapproves of antagonizing those belonging to our own group.

Little attention has been devoted to understanding the consequences when this civil code of conduct is violated on a grand scale.
 
Insights into the psychology of war are all-important to help societies rebuild after periods of conflict.
 
Trauma research documents that the greater an individual's exposure to life-threatening events, the greater the likelihood the person will develop post-traumatic stress disorder (PTSD).
 
This connection has been observed in soldiers and civilians.
 
This raises the question of whether, in humans' evolutionary past, the killing of our own kind evolved as an innate strategy for securing greater reproductive success.
 
Perhaps violence does not always lead to trauma.

An opportunity to examine the idea arose with 269 Rwandan prisoners who were accused or convicted of crimes related to the genocide that occurred in 1994.
 
The prisoners were asked about the types of trauma they encountered and were asked about the crimes they committed.
 
The severity of their PTSD symptoms was assessed, and the prisoners were probed to determine whether they had acquired a taste for violence.

Two questions asked of the prisoners were:
 
   Once fighting has started, do you get carried away by the violence?
 
   Once you get used to being cruel, do you want to be crueler and crueler?
 
One-third of the men answered that they did; few women said the same.
 
In two other questions, more than one-half of the men agreed that they tended to get carried away by the violence of a fight, and that defeating an opponent was more fun when they saw blood.
 
Thirty percent of the women agreed that they got carried away, and 40 percent admitted that blood made defeating an opponent more fun.
 
Although women were not as likely to develop a zeal for aggression, they were not immune to it.

Data collected from Rwanda, Ugandan child soldiers, and South African criminal offenders showed a trend.
 
The more violent the events these people witnessed or committed, the higher their rating on the questionnaire concerning appetitive aggression.
 
These higher scores also predicted fewer symptoms of PTSD; the correlation suggests that relishing violence may have benefited their mental health.

Both observing atrocities and engaging in them predict the enjoyment of cruelty.
Although these findings conflict with society's deeply held sense that violence is morally repulsive, they help the public comprehend how conflicts perpetuate themselves and how so many people can die so quickly in genocidal sweeps.

In 2011, military historian Sönke Neitzel and social psychologist Harald Welzer published excerpts from transcripts made by the World War II Allies, who had eavesdropped on captured soldiers of the German Wehrmacht (the combined German armed forces).
 
The conversations revealed that among some of the fighters, there was a fascination with violence and the hunting down of humans.
 
The revelations upset the former war participants, who feared being labeled as monsters.
 
But the transcripts were consistent with studies.

Then there is the question of what happens when combatants return to society.
 
The hypothesis was that their lust for violence would subside.
 
Evidence showed that the more time former combatants spent in society after committing atrocities, the lower their aggression scores but the higher their symptoms of trauma.
 
This may be because violence is perceived as more acceptable when a person is operating in perpetrator mode, but it becomes less acceptable once that person returns to the tempering influence of social culture.

All the data suggest that the thrill of the kill is not a sign of mental illness, and it is not uncommon.
 
Human ancestors on the hunt were not so different from contemporary combatants.

Both groups experience great hardships: sometimes they must track their prey for days, and they must entirely suppress their fear of being injured or killed.
To endure these conditions, behaving brutally must become rewarding and potentially pleasurable.

This is not to say that humans enjoy murder.
 
Only child soldiers--who have been recruited by force before reaching puberty--sometimes describe their first kill in glowing terms.
 
Nearly everyone else experiences extreme stress.
 
But relentless battle can break down moral inhibitions and can alter perceptions of actions.
 
War changes people.

Former soldiers talk of how their previous life provided them the opportunity to fight, rape, and kill.
 
They discuss their overwhelming sense of isolation now that they are no longer involved in heroic deeds that other fighters understand but that civilians summarily reject.
 
Many express how they miss the power and how the sight of blood gets them going.

Because the public struggles to comprehend what happens in war, soldiers often fail to find adequate social support when they return to civilian life.
 
They may struggle with emotions that conflict with the values of their surrounding culture.
 
They may fail to adjust to new opportunities and new ways of life.
 
To counsel combatants effectively, it is important to understand their experiences in the heat of battle, both to reintegrate them into society and to help curb the calamity brought on by violence.
 

Saturday, May 11, 2013

What do conspiracy theories, religious beliefs and detoxifying proteins have in common?

Science News




The Curious Wavefunction
Musings on chemistry and the history and philosophy of science




People who believe in conspiracy theories display the classic symptoms of patternicity and agenticity (Image: Caffeinated Thoughts)


Why do people believe in God, ghosts, goblins, spirits, the afterlife and conspiracy theories? Two common threads running through these belief systems are what skeptic Michael Shermer in his insightful book “The Believing Brain” calls “patternicity” and “agenticity”. As the names indicate, patternicity refers to seeing meaningful patterns in meaningless noise. Agenticity refers to seeing mysterious but palpable causal ‘agents’, puppet masters who pull the strings and bring about unexplained phenomena. God is probably the perfect example of an agent.

Patternicity and agenticity can both be seen as primitive evolutionary features of our brain that have been molded into instinctive behaviors. They were important in a paleolithic environment where decisions often had to be made quickly and based on instinct. In a simple example cited by Shermer, consider an early hominid sauntering along somewhere in the African Savannah. He hears a rustle in the grass. Is it a predator or is it just the wind? If he assumes the former and it turns out to be the latter, no harm is done. But if he assumes it’s just the wind and lets down his guard and it turns out to be a predator, that’s it; he’s lunch and just got weeded out of the gene pool. The first mistake is what’s called a ‘Type 1’ or false-positive error; the second one is a ‘Type 2’ or a false-negative error. Humans seem more prone to committing false positive errors because the cost of (literally) living with those errors is often less than the cost of (literally) dying from the false negatives. Agenticity is in some sense subsumed by patternicity; in the case of the hominid, he might end up ascribing the noise in the grass to a predator (an ‘agent’) even if none exists. The important thing to realize is that we are largely the descendants of humans who made false-positive errors; natural selection ensured this perpetuation.
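The asymmetry between the two error types can be made concrete with a toy expected-cost calculation. This is a hypothetical sketch, not from Shermer's book; the probability and cost numbers are illustrative assumptions, chosen only to show why selection favors the jumpy, false-positive-prone strategy:

```python
# Toy model of the rustle-in-the-grass decision.
# A hominid hears a rustle; it is a predator with some small probability.
# Always fleeing wastes a little energy when wrong (false positive);
# always ignoring the rustle is fatal when wrong (false negative).
# All numbers below are illustrative assumptions, not data.

P_PREDATOR = 0.05           # fraction of rustles that really are predators
COST_FALSE_POSITIVE = 1     # energy wasted fleeing from the wind
COST_FALSE_NEGATIVE = 1000  # getting eaten: removal from the gene pool

def expected_cost(assume_predator: bool) -> float:
    """Expected cost of always making the same assumption about a rustle."""
    if assume_predator:
        # Wrong only in the cases where it was just the wind.
        return (1 - P_PREDATOR) * COST_FALSE_POSITIVE
    # Wrong only in the cases where it really was a predator.
    return P_PREDATOR * COST_FALSE_NEGATIVE

flee = expected_cost(assume_predator=True)
ignore = expected_cost(assume_predator=False)
print(f"always flee:   {flee:.2f}")    # 0.95
print(f"always ignore: {ignore:.2f}")  # 50.00
```

Even though predators account for only 5% of rustles in this sketch, the "always flee" strategy is dozens of times cheaper on average, which is the sense in which natural selection perpetuates the descendants of false-positive-makers.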

Before we move on, it's worth noting that assuring yourself a place in the gene pool by committing a false positive error is not as failsafe as it sounds.

Sometimes people can actually cause harm by erring on the side of caution; this is the kind of behavior that is enshrined in the Law of Unintended Consequences. For instance after 9/11, about a thousand people died because they thought it safer to drive across the country rather than fly. 9/11 did almost nothing to tarnish the safety record of flying, but those who feared airplane terrorism (the ‘pattern’) reacted with their gut and ended up doing their competitors’ gene pools a favor.

Yet for all this criticism of pattern detection, it goes without saying that patternicity and agenticity have been immensely useful in human development. In fact the hallmark of science is pattern detection in noise. Patternicity is also key for things like solving crimes and predicting where the economy is going. However scientists, detectives and economists are all well aware of how many times the pattern detection machine in their heads misfires or backfires. When it comes to non-scientific predictions the machine’s even worse. The ugly side of patternicity and agenticity is revealed in people’s belief in conspiracy theories.
Those who think there was a giant conspiracy between the CIA, the FBI, the Mob, Castro and the executive branch of the government to assassinate President Kennedy are confronted with the same facts that others are. Yet they connect the dots differently and elevate certain individuals and groups ('agents') to great significance. Patternicity connects the dots, agenticity sows belief. The tendency to connect dots and put certain agents on a pedestal is seen everywhere, from believing that vaccines cause autism to being convinced that climate change is a giant hoax orchestrated by thousands of scientists around the world.

Notwithstanding these all too common pathologies of the pattern detection machine, it’s satisfying to find a common, elegant evolutionary mechanism in our primitive brain that would be consistent with generally favoring false positives over false negatives. What I find interesting is that this behavior even seems to exist at the level of molecules.

I realized this when I was recently studying some proteins whose exclusive job is to metabolize and detoxify foreign molecules. These proteins can be seen as the gatekeepers of the cell. Throughout evolution we have been bathed in a sea of useful, useless and toxic chemicals. Our bodies need some mechanism for distinguishing the good molecules from the bad. To enable this, living organisms have evolved several proteins which bind to these molecules and in most cases change their structure or simply eject them from the cell. The most important of these are called cytochrome P450 and P-Glycoprotein. Cytochrome P450 metabolizes drugs, nutrients, hormones, poisons--basically any molecular entities that living organisms encounter in a changing environment. P-Glycoprotein is a kind of vacuum cleaner that first sucks up molecules and then throws them out.


A molecular model of cytochrome P450, a protein that metabolizes and detoxifies foreign molecules such as toxins (Image: ESRF)

Cytochrome P450 and P-Glycoprotein are crucial for detoxifying our body and letting only ‘good’ molecules pass through. But like our early hominid they are imperfect and seem to often err on the side of caution, making false positive errors. This problem is routinely confronted by drug developers who are consternated to find molecules that may perform perfectly in killing cancer cells in test tubes, but that are immediately modified or ejected out of the cells by cytochrome P450 and P-Glycoprotein when administered to test subjects like rats or human beings. Finding a putative drug compound that will not be modified or rejected by cytochrome P450, P-Glycoprotein or any number of other gatekeeper proteins is one of the biggest challenges in early stage drug development.

And yet if we think about it, both cytochrome P450 and humans are doing the bidding of patternicity and agenticity. For a human as well as for a protein, generally speaking it's much safer to make a false positive error than a false negative one. In the case of cytochrome P450, it might be ok if it discards a useful nutrient or two along with dozens of toxic chemicals. But if it lets even two or three deadly compounds from, say, snail toxin or snake venom in, those might be the last compounds it encounters during the painfully short lifetime of its human owner. Now of course, at the beginning when cytochrome P450 was in the process of evolving, it probably existed in many more forms than it does today. Some of these forms committed false positive mistakes and others committed false negatives. But it's clear from the foregoing discussion that, just like the human who heard the rustle in the grass and mistook it for the wind, proteins which committed false negative errors were declared persona non grata by natural selection and weeded out. Those making false positive mistakes lived another day to see another molecule ejected.

To me the observation of patternicity and agenticity at the level of human brains as well as individual proteins is a testament to the enormous power and elegance of evolution in molding living organisms across an incredible hierarchy of molecules, cells, organs, individuals and societies through common mechanisms. It occurs to me that if evolution had to pick favorite lines from poems, one of them would probably be “Two roads diverged on the way to life, and I took the one which made me commit a false positive error”.

Ashutosh JogalekarAbout the Author: Ashutosh (Ash) Jogalekar is a chemist interested in the history and philosophy of science. He considers science to be a seamless and all-encompassing part of the human experience. Follow on Twitter @curiouswavefn.
The views expressed are those of the author and are not necessarily those of Scientific American.
