Saturday, August 31, 2013

Loneliness Is Deadly

 

Slate

 


Social isolation kills more people than obesity does—and it’s just as stigmatized.






Over the winter I moved from New York City to Portland, Ore. The reasons for my move were purely logical. New York was expensive and stressful. Portland, I reasoned, would offer me the space and time to do my work.
 
Upon arriving, I rented a house and happily went out in search of "my people." I went to parks, bookstores, bars, on dates. I even tried golfing. It wasn't that I didn't meet people. I did. I just felt no connection to any of them.
 
Once social and upbeat, I became morose and mildly paranoid. I knew I needed to connect to people to feel better, but I felt as though I physically could not handle any more empty interactions. I woke up in the night panicked. In the afternoon, loneliness came in waves like a fever. I had no idea how to fix it.
 
Feeling uncertain, I began to research loneliness and came across several alarming recent studies. Loneliness is not just making us sick; it is killing us. It is a serious health risk: studies of elderly people and social isolation have concluded that those without adequate social interaction were twice as likely to die prematurely.
 
The increased mortality risk is comparable to that from smoking, and loneliness is about twice as dangerous as obesity. Social isolation impairs immune function and boosts inflammation, which can lead to arthritis, type 2 diabetes, and heart disease. Loneliness is breaking our hearts, but as a culture we rarely talk about it.
 
Loneliness has doubled: 40 percent of adults in two recent surveys said they were lonely, up from 20 percent in the 1980s.
 
All of our Internet interactions aren’t helping and may be making loneliness worse. A recent study of Facebook users found that the amount of time you spend on the social network is inversely related to how happy you feel throughout the day.
 
In a society that judges you based on how expansive your social networks appear, loneliness is difficult to fess up to. It feels shameful.
 
About a decade ago, my mom was going through a divorce from my stepfather. Lonely and desperate for connection, she called a cousin she hadn’t talked to in several years. On the phone, her cousin was derisive: “Don’t you have any friends?”
 
While dealing with my own loneliness in Portland I often found myself thinking, "If I were a better person I wouldn't be lonely."
 
“Admitting you are lonely is like holding a big L up on your forehead,” says John T. Cacioppo of the University of Chicago, who studies how loneliness and social isolation affect people’s health.
 
He admitted that on an airplane he once became acutely embarrassed while holding a copy of his own book, which had the word Loneliness emblazoned on the front cover. He had the impulse to turn the cover inside-out so that people couldn’t see it. “For the first time I actually experienced the feeling of being lonely and everyone knowing it,” he says.
 
After the public learned of Stephen Fry’s suicide attempt last year, the beloved British actor wrote a blog post about his fight with depression. He cited loneliness as the worst part of his affliction.
 
“Lonely? I get invitation cards through the post almost every day. I shall be in the Royal Box at Wimbledon and I have serious and generous offers from friends asking me to join them in the South of France, Italy, Sicily, South Africa, British Columbia, and America this summer. I have two months to start a book before I go off to Broadway for a run of Twelfth Night there.
I can read back that last sentence and see that, bipolar or not, if I’m under treatment and not actually depressed, what the fuck right do I have to be lonely, unhappy, or forlorn? I don’t have the right. But there again I don’t have the right not to have those feelings. Feelings are not something to which one does or does not have rights.
In the end loneliness is the most terrible and contradictory of my problems.”
Most of us know what it is like to be lonely in a room full of people, which is why even a celebrity can be deeply lonely. You could be surrounded by hundreds of adoring fans, but if there is no one you can rely on, no one who knows you, you will feel isolated.
 
In terms of human interactions, the number of people we know is not the best measure. In order to be socially satisfied, we don’t need all that many people. According to Cacioppo, the key is the quality, not the quantity, of those relationships. We just need several people on whom we can depend and who depend on us in return.
 
As a culture we obsess over strategies to prevent obesity. We provide resources to help people quit smoking. But I have never had a doctor ask me how much meaningful social interaction I am getting. Even if a doctor did ask, it is not as though there is a prescription for meaningful social interaction.
 
Both Denmark and Great Britain are devoting more time and energy to finding solutions and staging interventions for lonely people, particularly the elderly.
When we are lonely, we lose impulse control and engage in what scientists call “social evasion.” We become less concerned with interactions and more concerned with self-preservation, as I was when I couldn’t even imagine trying to talk to another human. Evolutionary psychologists speculate that loneliness triggers our basic fight-or-flight survival mechanisms, and we stick to the periphery, away from people we are not sure we can trust.
 
In one study, Cacioppo measured brain activity during the sleep of lonely and nonlonely people. Those who were lonely were far more prone to micro-awakenings, which suggests the brain is on alert for threats throughout the night, perhaps just as earlier humans would have needed to be when separated from their tribe.
One of the reasons we avoid discussing loneliness is that fixing it obviously isn’t a simple endeavor.
 
Even though the Internet has possibly contributed to our isolation, it might hold a key to fixing it. Cacioppo is excited by online dating statistics showing that couples who found each other online and stayed together shared more of a connection and were less likely to divorce than couples who met offline. If these statistics hold up, it would stand to reason friendships could also be found in this way, easing those whose instincts tell them to stay on the periphery back into the world with common bonds forged over the Internet.
 
Me? I moved back to New York.

Wednesday, August 28, 2013

Are allergies trying to protect us from ourselves?



PLOS Blogs

 






I have a love/hate relationship with spring, thanks to the aggravating bouts of hay fever that transform me into a faucet for pretty much the entire season. So I’ll admit I was a little skeptical when my editor at Scientific American asked me last week if I wanted to write about a new paper coming out in Nature suggesting that allergies may actually be a good thing. But always curious, I said sure.

Turns out it’s a fascinating—and pretty convincing—read. It’s dense, but the lead author, Yale immunobiologist Ruslan Medzhitov, was kind enough to take a good two hours out of his day on Monday to explain some of the gnarlier concepts to me. (Medzhitov is fascinating—you can read more about him in this profile published in Disease Models & Mechanisms.)
Medzhitov’s basic argument is that there is a convincing body of research suggesting that allergies have beneficial effects. They break down the toxic components of bee, snake, scorpion, and Gila monster venom, for instance, and our allergic reactions to tick saliva prevent the parasites from feeding.

Ultimately, all allergic responses work towards a common goal: avoidance and expulsion, Medzhitov argues. As I explain in my piece,
More generally, hated allergic symptoms keep unhealthy environmental irritants out of the body, Medzhitov posits. “How do you defend against something you inhale that you don’t want? You make mucus. You make a runny nose, you sneeze, you cough, and so forth. Or if it’s on your skin, by inducing itching, you avoid it or you try to remove it by scratching it,” he explains. Likewise, if you’ve ingested something allergenic, your body might react with vomiting. Finally, if a particular place or circumstance ramps up your allergies, you’re likely to avoid it in the future. “The thing about allergies is that as soon as you stop exposure to an allergen, all the symptoms are gone,” he says.
Obviously, Medzhitov’s theory is just a theory, and it involves a lot of speculation (albeit informed speculation by a really smart guy). But some research suggests an association between allergy severity and cancer risk, in that people with more allergy symptoms are less likely to develop certain cancers. (One shouldn’t read too much into this though; some other factor may drive the association. Perhaps people who eat lots of eggs are more likely to have allergies but less likely to have cancer.) But all in all, I think Medzhitov’s idea does make sense and is well-supported, and most of the outside experts I spoke with agreed, though they did raise questions about some of the specifics.

One aspect of the theory that I didn’t mention in my piece is that it could explain a medical mystery: penicillin allergies. Medzhitov argues that in addition to protecting against venoms, vector-borne diseases, and environmental irritants, allergies also evolved to protect against a class of toxins called haptens: reactive molecules that bind to extracellular or membrane-bound proteins in the body, rendering them useless and ultimately causing all sorts of problems. As it turns out, in some people, the penicillin molecule undergoes transformation into a hapten. This transformation is very slow and inefficient—very few penicillin molecules turn into haptens, which is a good thing because haptenated penicillin could be dangerous—but nevertheless, some people may develop allergic responses to these few haptenated penicillin molecules, and this can result in an allergic hypersensitivity to the drug, Medzhitov posits.

In the case of something like a penicillin allergy, management is fairly simple (though medically inconvenient): avoid penicillin. The problem today is that there may be millions of allergens in the form of environmental pollutants and irritants, and they may simply be unavoidable. This idea could help explain why allergic diseases have become more common in recent decades: We’re exposed to many more pollutants now than we were 50 years ago, and this chemical flurry could be dialing up our innate defense systems to a constant level of 11. An allergy may be protective, but “if it’s taken to an extreme, it is pathological,” Medzhitov says. I wonder, then, if we may have built ourselves a world that will forever make us sick.

Citations:

Palm, N., Rosenstein, R., & Medzhitov, R. (2012). Allergic host defences. Nature, 484(7395), 465–472. DOI: 10.1038/nature11047
Medzhitov, R. (2011). Innovating immunology: an interview with Ruslan Medzhitov. Disease Models & Mechanisms, 4(4), 430–432. DOI: 10.1242/dmm.008151
Akahoshi, M., Song, C. H., Piliponsky, A. M., Metz, M., Guzzetta, A., Abrink, M., Schlenner, S. M., Feyerabend, T. B., Rodewald, H. R., Pejler, G., Tsai, M., & Galli, S. J. (2011). Mast cell chymase reduces the toxicity of Gila monster venom, scorpion venom, and vasoactive intestinal polypeptide in mice. The Journal of Clinical Investigation, 121(10), 4180–4191. PMID: 21926462
Wada, T., Ishiwata, K., Koseki, H., Ishikura, T., Ugajin, T., Ohnuma, N., Obata, K., Ishikawa, R., Yoshikawa, S., Mukai, K., Kawano, Y., Minegishi, Y., Yokozeki, H., Watanabe, N., & Karasuyama, H. (2010). Selective ablation of basophils in mice reveals their nonredundant role in acquired immunity against ticks. The Journal of Clinical Investigation, 120(8), 2867–2875. PMID: 20664169
Sherman, P., Holland, E., & Sherman, J. (2008). Allergies: their role in cancer prevention. The Quarterly Review of Biology, 83(4), 339–362. DOI: 10.1086/592850

“Are allergies trying to protect us from ourselves?” by Body Politic, unless otherwise expressly stated, is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License.

 

Saturday, August 17, 2013

A History of Violence




 

A HISTORY OF VIOLENCE
by Steven Pinker

Introduction

Once again, Steven Pinker returns to debunking the doctrine of the noble savage in the following piece based on his lecture at the recent TED Conference in Monterey, California.

This doctrine, “the idea that humans are peaceable by nature and corrupted by modern institutions,” he writes, “pops up frequently in the writing of public intellectuals like José Ortega y Gasset (‘War is not an instinct but an invention’), Stephen Jay Gould (‘Homo sapiens is not an evil or destructive species’), and Ashley Montagu (‘Biological studies lend support to the ethic of universal brotherhood’). But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler.”

Pinker's notable talk, along with his essay, is one more example of how ideas emerging from the empirical and biological study of human beings are gaining sway over those of scientists and others in disciplines that study social actions and human cultures independently of their biological foundations.

JB
STEVEN PINKER is the Johnstone Family Professor in the Department of Psychology at Harvard University. His most recent book is The Blank Slate.

A HISTORY OF VIOLENCE

In sixteenth-century Paris, a popular form of entertainment was cat-burning, in which a cat was hoisted in a sling on a stage and slowly lowered into a fire. According to historian Norman Davies, "[T]he spectators, including kings and queens, shrieked with laughter as the animals, howling with pain, were singed, roasted, and finally carbonized." Today, such sadism would be unthinkable in most of the world. This change in sensibilities is just one example of perhaps the most important and most underappreciated trend in the human saga: Violence has been in decline over long stretches of history, and today we are probably living in the most peaceful moment of our species' time on earth.
 
In the decade of Darfur and Iraq, and shortly after the century of Stalin, Hitler, and Mao, the claim that violence has been diminishing may seem somewhere between hallucinatory and obscene. Yet recent studies that seek to quantify the historical ebb and flow of violence point to exactly that conclusion.

Some of the evidence has been under our nose all along. Conventional history has long shown that, in many ways, we have been getting kinder and gentler. Cruelty as entertainment, human sacrifice to indulge superstition, slavery as a labor-saving device, conquest as the mission statement of government, genocide as a means of acquiring real estate, torture and mutilation as routine punishment, the death penalty for misdemeanors and differences of opinion, assassination as the mechanism of political succession, rape as the spoils of war, pogroms as outlets for frustration, homicide as the major form of conflict resolution—all were unexceptionable features of life for most of human history. But, today, they are rare to nonexistent in the West, far less common elsewhere than they used to be, concealed when they do occur, and widely condemned when they are brought to light.

At one time, these facts were widely appreciated. They were the source of notions like progress, civilization, and man's rise from savagery and barbarism. Recently, however, those ideas have come to sound corny, even dangerous. They seem to demonize people in other times and places, license colonial conquest and other foreign adventures, and conceal the crimes of our own societies. The doctrine of the noble savage—the idea that humans are peaceable by nature and corrupted by modern institutions—pops up frequently in the writing of public intellectuals like José Ortega y Gasset ("War is not an instinct but an invention"), Stephen Jay Gould ("Homo sapiens is not an evil or destructive species"), and Ashley Montagu ("Biological studies lend support to the ethic of universal brotherhood"). But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler.
To be sure, any attempt to document changes in violence must be soaked in uncertainty. In much of the world, the distant past was a tree falling in the forest with no one to hear it, and, even for events in the historical record, statistics are spotty until recent periods. Long-term trends can be discerned only by smoothing out zigzags and spikes of horrific bloodletting. And the choice to focus on relative rather than absolute numbers brings up the moral imponderable of whether it is worse for 50 percent of a population of 100 to be killed or 1 percent in a population of one billion.
Yet, despite these caveats, a picture is taking shape. The decline of violence is a fractal phenomenon, visible at the scale of millennia, centuries, decades, and years. It applies over several orders of magnitude of violence, from genocide to war to rioting to homicide to the treatment of children and animals. And it appears to be a worldwide trend, though not a homogeneous one. The leading edge has been in Western societies, especially England and Holland, and there seems to have been a tipping point at the onset of the Age of Reason in the early seventeenth century.
At the widest-angle view, one can see a whopping difference across the millennia that separate us from our pre-state ancestors. Contra leftist anthropologists who celebrate the noble savage, quantitative body-counts—such as the proportion of prehistoric skeletons with axemarks and embedded arrowheads or the proportion of men in a contemporary foraging tribe who die at the hands of other men—suggest that pre-state societies were far more violent than our own. It is true that raids and battles killed a tiny percentage of the numbers that die in modern warfare. But, in tribal violence, the clashes are more frequent, the percentage of men in the population who fight is greater, and the rates of death per battle are higher. According to anthropologists like Lawrence Keeley, Stephen LeBlanc, Phillip Walker, and Bruce Knauft, these factors combine to yield population-wide rates of death in tribal warfare that dwarf those of modern times. If the wars of the twentieth century had killed the same proportion of the population that die in the wars of a typical tribal society, there would have been two billion deaths, not 100 million.
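To make the arithmetic behind that last comparison concrete, here is a rough back-of-the-envelope sketch. The total number of people who lived through the twentieth century and the tribal rate of violent death used below are illustrative assumptions on my part, chosen only to show how a proportional comparison of this kind works; the ~100 million war-deaths figure is the one cited in the essay.

    # Back-of-the-envelope sketch of the proportional comparison above.
    # The population total and tribal death rate are illustrative assumptions,
    # not figures taken from the essay.
    war_deaths_20th_century = 100e6     # ~100 million war deaths (cited in the essay)
    people_in_20th_century = 10e9       # assumed number of people alive during the century
    tribal_violent_death_rate = 0.20    # assumed share of a typical tribal population
                                        # dying at the hands of other men

    actual_rate = war_deaths_20th_century / people_in_20th_century
    deaths_at_tribal_rate = tribal_violent_death_rate * people_in_20th_century

    print(f"Actual 20th-century rate of death in war: {actual_rate:.1%}")
    print(f"Hypothetical deaths at tribal rates: {deaths_at_tribal_rate / 1e9:.1f} billion")

With those inputs the sketch yields a rate of about 1 percent versus a hypothetical two billion deaths, which is how a comparison of proportions rather than absolute numbers arrives at a figure like the one quoted.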

Political correctness from the other end of the ideological spectrum has also distorted many people's conception of violence in early civilizations—namely, those featured in the Bible. This supposed source of moral values contains many celebrations of genocide, in which the Hebrews, egged on by God, slaughter every last resident of an invaded city. The Bible also prescribes death by stoning as the penalty for a long list of nonviolent infractions, including idolatry, blasphemy, homosexuality, adultery, disrespecting one's parents, and picking up sticks on the Sabbath. The Hebrews, of course, were no more murderous than other tribes; one also finds frequent boasts of torture and genocide in the early histories of the Hindus, Christians, Muslims, and Chinese.

At the century scale, it is hard to find quantitative studies of deaths in warfare spanning medieval and modern times. Several historians have suggested that there has been an increase in the number of recorded wars across the centuries to the present, but, as political scientist James Payne has noted, this may show only that "the Associated Press is a more comprehensive source of information about battles around the world than were sixteenth-century monks." Social histories of the West provide evidence of numerous barbaric practices that became obsolete in the last five centuries, such as slavery, amputation, blinding, branding, flaying, disembowelment, burning at the stake, breaking on the wheel, and so on. Meanwhile, for another kind of violence—homicide—the data are abundant and striking. The criminologist Manuel Eisner has assembled hundreds of homicide estimates from Western European localities that kept records at some point between 1200 and the mid-1990s. In every country he analyzed, murder rates declined steeply—for example, from 24 homicides per 100,000 Englishmen in the fourteenth century to 0.6 per 100,000 by the early 1960s.

On the scale of decades, comprehensive data again paint a shockingly happy picture: Global violence has fallen steadily since the middle of the twentieth century. According to the Human Security Brief 2006, the number of battle deaths in interstate wars has declined from more than 65,000 per year in the 1950s to less than 2,000 per year in this decade. In Western Europe and the Americas, the second half of the century saw a steep decline in the number of wars, military coups, and deadly ethnic riots.

Zooming in by a further power of ten exposes yet another reduction. After the cold war, every part of the world saw a steep drop-off in state-based conflicts, and those that do occur are more likely to end in negotiated settlements rather than being fought to the bitter end. Meanwhile, according to political scientist Barbara Harff, between 1989 and 2005 the number of campaigns of mass killing of civilians decreased by 90 percent.
The decline of killing and cruelty poses several challenges to our ability to make sense of the world. To begin with, how could so many people be so wrong about something so important? Partly, it's because of a cognitive illusion: We estimate the probability of an event from how easy it is to recall examples. Scenes of carnage are more likely to be relayed to our living rooms and burned into our memories than footage of people dying of old age. Partly, it's an intellectual culture that is loath to admit that there could be anything good about the institutions of civilization and Western society. Partly, it's the incentive structure of the activism and opinion markets: No one ever attracted followers and donations by announcing that things keep getting better. And part of the explanation lies in the phenomenon itself. The decline of violent behavior has been paralleled by a decline in attitudes that tolerate or glorify violence, and often the attitudes are in the lead. As deplorable as they are, the abuses at Abu Ghraib and the lethal injections of a few murderers in Texas are mild by the standards of atrocities in human history. But, from a contemporary vantage point, we see them as signs of how low our behavior can sink, not of how high our standards have risen.

The other major challenge posed by the decline of violence is how to explain it. A force that pushes in the same direction across many epochs, continents, and scales of social organization mocks our standard tools of causal explanation. The usual suspects—guns, drugs, the press, American culture—aren't nearly up to the job. Nor could it possibly be explained by evolution in the biologist's sense: Even if the meek could inherit the earth, natural selection could not favor the genes for meekness quickly enough. In any case, human nature has not changed so much as to have lost its taste for violence. Social psychologists find that at least 80 percent of people have fantasized about killing someone they don't like. And modern humans still take pleasure in viewing violence, if we are to judge by the popularity of murder mysteries, Shakespearean dramas, Mel Gibson movies, video games, and hockey.

What has changed, of course, is people's willingness to act on these fantasies. The sociologist Norbert Elias suggested that European modernity accelerated a "civilizing process" marked by increases in self-control, long-term planning, and sensitivity to the thoughts and feelings of others. These are precisely the functions that today's cognitive neuroscientists attribute to the prefrontal cortex. But this only raises the question of why humans have increasingly exercised that part of their brains. No one knows why our behavior has come under the control of the better angels of our nature, but there are four plausible suggestions.

The first is that Hobbes got it right. Life in a state of nature is nasty, brutish, and short, not because of a primal thirst for blood but because of the inescapable logic of anarchy. Any beings with a modicum of self-interest may be tempted to invade their neighbors to steal their resources. The resulting fear of attack will tempt the neighbors to strike first in preemptive self-defense, which will in turn tempt the first group to strike against them preemptively, and so on. This danger can be defused by a policy of deterrence—don't strike first, retaliate if struck—but, to guarantee its credibility, parties must avenge all insults and settle all scores, leading to cycles of bloody vendetta. These tragedies can be averted by a state with a monopoly on violence, because it can inflict disinterested penalties that eliminate the incentives for aggression, thereby defusing anxieties about preemptive attack and obviating the need to maintain a hair-trigger propensity for retaliation. Indeed, Eisner and Elias attribute the decline in European homicide to the transition from knightly warrior societies to the centralized governments of early modernity. And, today, violence continues to fester in zones of anarchy, such as frontier regions, failed states, collapsed empires, and territories contested by mafias, gangs, and other dealers of contraband.

Payne suggests another possibility: that the critical variable in the indulgence of violence is an overarching sense that life is cheap. When pain and early death are everyday features of one's own life, one feels fewer compunctions about inflicting them on others. As technology and economic efficiency lengthen and improve our lives, we place a higher value on life in general.

A third theory, championed by Robert Wright, invokes the logic of non-zero-sum games: scenarios in which two agents can each come out ahead if they cooperate, such as trading goods, dividing up labor, or sharing the peace dividend that comes from laying down their arms. As people acquire know-how that they can share cheaply with others and develop technologies that allow them to spread their goods and ideas over larger territories at lower cost, their incentive to cooperate steadily increases, because other people become more valuable alive than dead.

Then there is the scenario sketched by philosopher Peter Singer. Evolution, he suggests, bequeathed people a small kernel of empathy, which by default they apply only within a narrow circle of friends and relations. Over the millennia, people's moral circles have expanded to encompass larger and larger polities: the clan, the tribe, the nation, both sexes, other races, and even animals. The circle may have been pushed outward by expanding networks of reciprocity, à la Wright, but it might also be inflated by the inexorable logic of the golden rule: The more one knows and thinks about other living things, the harder it is to privilege one's own interests over theirs. The empathy escalator may also be powered by cosmopolitanism, in which journalism, memoir, and realistic fiction make the inner lives of other people, and the contingent nature of one's own station, more palpable—the feeling that "there but for fortune go I".
Whatever its causes, the decline of violence has profound implications. It is not a license for complacency: We enjoy the peace we find today because people in past generations were appalled by the violence in their time and worked to end it, and so we should work to end the appalling violence in our time. Nor is it necessarily grounds for optimism about the immediate future, since the world has never before had national leaders who combine pre-modern sensibilities with modern weapons.

But the phenomenon does force us to rethink our understanding of violence. Man's inhumanity to man has long been a subject for moralization. With the knowledge that something has driven it dramatically down, we can also treat it as a matter of cause and effect. Instead of asking, "Why is there war?" we might ask, "Why is there peace?" From the likelihood that states will commit genocide to the way that people treat cats, we must have been doing something right. And it would be nice to know what, exactly, it is.
[First published in The New Republic, 3.19.07.]
