The Self-Compassion Solution – Love Yourself, Too

Posted in Jayne's blog

DOWNLOAD THE ENTIRE MAY 2017 NEWSLETTER including this month’s Freebie.

Self-compassion, at its most basic level, means treating yourself with the same kindness and understanding that you would a friend. People who struggle with this concept, research shows, do not necessarily lack compassion toward others. Rather they hold themselves to higher standards than they would expect of anyone else. Developing self-compassion allows them to recognise and accept their own feelings rather than constantly challenging themselves to “do better.”

A growing number of people are discovering that practicing self-compassion can be a surprisingly effective alternative to the crippling yet common habit of shame-laden self-criticism. Since the birth of self-compassion as a scientific construct—with the publication of a seminal paper by psychologist Kristin Neff of the University of Texas at Austin in 2003—the volume of academic publications investigating self-compassion has snowballed.

In the past few years self-compassion has gone mainstream, as some of its researchers and practitioners—including Neff—have written books and created workshops to popularise the concept. Untold numbers of life coaches, mindfulness teachers and psychotherapists now tout the benefits of self-compassion. Psychotherapists see it as a natural component of well-studied therapies that focus on accepting and gradually changing unhelpful thoughts or behaviour patterns, such as cognitive-behavioural therapy and acceptance and commitment therapy.

Yet many people resist self-compassion, fretting that being compassionate toward ourselves will make us egocentric, self-indulgent or weak. If we are easy on ourselves after a setback, we wonder, will we turn soft and complacent? This question is one of many that self-compassion research has tried to answer. The conclusion: a resounding “no.” As mounting evidence shows, self-compassion is typically a source of both personal and interpersonal strength, making self-compassionate individuals more emotionally stable, more motivated to improve themselves and generally better company.

Buddhist Roots

Neff, the pioneer in the scientific study of self-compassion, became interested in the topic in the 1990s. As a Ph.D. candidate struggling with the breakup of her first marriage, she was full of shame and self-loathing. She began attending meditation classes and exploring Buddhist thought.

Neff knew that compassion entails concern with another’s pain and a desire to alleviate that person’s suffering, but she had never thought about directing that energy toward herself until she read Buddhist teacher Sharon Salzberg’s book Lovingkindness. She felt transformed by its message that showing kindness to oneself is essential for showing genuine love toward others. She soon began to lay the groundwork to study self-compassion scientifically.

Through her reading, Neff discerned three indispensable elements of self-compassion: kindness toward yourself in difficult times; paying attention to your suffering in a mindful, non-obsessive way; and common humanity, or the recognition that your suffering is part of the human experience rather than unique to you. These three components (along with their opposites) became the basis of the questions Neff used to develop a self-compassion scale, an instrument she published in 2003 in the journal Self and Identity that is now widely used by other researchers to assess a person’s level of this trait.
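Instruments of this kind typically ask people to rate Likert-style statements, then reverse-score the negatively worded items so that a higher average always means more self-compassion. The sketch below is a hypothetical miniature for illustration only; the item count, groupings and ratings are invented, not Neff's published 26-item scale.

```python
# Hypothetical sketch of scoring a Likert-style self-compassion questionnaire.
# Items and ratings are illustrative, not the actual published scale.

def score_scale(responses, negative_items, max_rating=5):
    """Average 1-to-max_rating responses, reverse-coding negatively
    worded items so a higher score always means more self-compassion."""
    adjusted = [
        (max_rating + 1 - r) if i in negative_items else r
        for i, r in enumerate(responses)
    ]
    return sum(adjusted) / len(adjusted)

# Six illustrative items: even indices positively worded, odd ones negatively
# worded (so a "2" on a self-judgment item reverse-codes to a 4).
answers = [4, 2, 5, 1, 4, 2]  # 1 = almost never ... 5 = almost always
print(round(score_scale(answers, negative_items={1, 3, 5}), 2))  # prints 4.33
```

The reverse-coding step is what lets statements phrased in opposite directions (self-kindness versus self-judgment) contribute to a single trait score.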

Using this scale, Neff has shown that self-compassion correlates with important real-world outcomes. In particular, she found that people who score high in self-compassion are less prone to anxiety and depression.

Psychologist Juliana Breines first encountered Neff’s work while she was an undergraduate at the University of Michigan. Breines suspected self-compassion could help people get off the roller coaster of “contingent self-esteem”—that is, the problem of tying your evaluation of yourself to fluctuating factors such as academic achievement and others’ approval. Many studies have demonstrated that this kind of thinking is not conducive to mental health or learning. But Breines worried self-compassion might also undermine motivation. As she puts it, “Self-compassion might be comforting, but does it let you off the hook too easily?”

Breines tested this question a few years later, as a Ph.D. student at the University of California, Berkeley. In one of a series of experiments, she and her colleagues had 86 undergraduates take a tough vocabulary quiz. To see the effect of self-compassion on study behaviour, they told one group that it was common to find the test difficult and urged subjects not to be too hard on themselves. A second group got a self-esteem message instead: “Try not to feel bad about yourself—you must be intelligent if you got into Berkeley.” A third group received no additional statements.

Then the researchers measured how long the undergrads would study for a second, similar test. As they reported in 2012 in Personality and Social Psychology Bulletin, the self-compassion group went on to spend 33 percent more time studying for the subsequent quiz than the self-esteem group and 51 percent longer than the neutral control group—a sign that self-compassion bolsters motivation. Being kind to yourself can make it safe to fail, which encourages you to try again.

In a pair of 2012 studies led by social psychologist Ashley Batts Allen, researchers investigating self-compassion in older adults found both psychological and practical benefits. In the first study, with 132 participants ranging from 67 to 90 years old, they found that people who were strongly self-compassionate reported a greater sense of well-being even when they were in poor health. In the second study, involving 71 seniors, self-compassion predicted how willing they were to use a walker if necessary. The self-compassionate people were simply less bothered by the fact that they needed help. If you are low in self-compassion, you spend too much emotional energy dwelling on the bad feelings and not enough addressing the real issues. For example, denying one problem—insisting on not using a walker—can create further difficulties, such as a hip fracture. The mindfulness component of high self-compassion, in contrast, leads people to acknowledge and accept reality without emotional judgment. The common-humanity component helps, too, by allowing one to recognise, for example, that everyone develops physical limitations with age.

In 2014 psychologist Mark Leary and his colleagues studied 187 mainly African-American people living with HIV. Patients who were higher in self-compassion showed healthier reactions to life with the potentially deadly virus: they experienced less stress, felt less shame about their condition, and were more likely to express a willingness to disclose their HIV status and to adhere to medical treatment. And a 2015 meta-analysis of 15 studies with a total of 3,252 participants, published in Health Psychology, found links between self-compassion and health-promoting behaviours related to eating, exercise, sleep and stress management.

Bouncing Back to Normal

Research indicates that the self-compassionate are more psychologically resilient and better able to regain emotional well-being after adversity. People who used self-compassionate language after their divorce, for example, recovered more quickly than those who had a more self-critical or self-pitying (“Why me?”) outlook on the relationship’s failure, according to a 2012 study of 109 adults.

Caregivers, too, can benefit. Raising an autistic child, for instance, is more emotionally difficult than other forms of parenting, with levels of stress and hopelessness that tend to correspond to the severity of the child’s symptoms. Yet a 2015 study of 51 parents of autistic children found that the parents’ self-compassion was more important than the severity of the child’s symptoms in predicting a caregiver’s well-being.

Yet another example comes from 115 combat veterans of the wars in Iraq and Afghanistan. In a 2015 study in the Journal of Traumatic Stress, self-compassionate war veterans experienced much less severe post-traumatic stress disorder (PTSD) symptoms than those lower in self-compassion, even after accounting for the level of combat exposure. It is a powerful testament to the idea that it is not what you face in life but how you relate to yourself when facing very hard times.

Recent studies of people with other psychiatric disorders, including binge eating and borderline personality disorder, suggest that self-compassion helps recovery. Allison Kelly, a psychologist at the University of Waterloo in Ontario who has studied the effect of a self-compassion intervention on binge-eating disorder, points out that recovery requires not only learning to tolerate urges to binge but also figuring out how to bounce back after giving in to those urges. If, like a drill-sergeant coach or critical teacher, you threaten yourself into change and beat yourself up whenever you slip, it is hard to feel calm and confident, and it often takes away the ability to reflect on and learn from what you are going through.

Self-compassion might seem to go hand in hand with self-esteem. In fact, self-compassion can coexist with low self-esteem and can buffer against it. In a 2015 longitudinal study led by Sarah Marshall, a psychologist at Australian Catholic University, researchers tracked a group of 2,448 students as they moved from ninth to 10th grade. Marshall found that high self-esteem was a precursor to good mental health, regardless of the students’ level of self-compassion. But self-compassionate kids who had low self-esteem also showed good mental health.

That news is good because it is usually easier to raise someone’s self-compassion than his or her self-esteem. It is hard to get people with low self-esteem to like themselves until they develop more social skills, get a better job or otherwise change their circumstances. By comparison, the bad habits of low self-compassion, such as denying a problem exists or beating yourself up, are easier to break.

Stronger Relationships

Recent research suggests that self-compassion is also good for relationships. Neff led a 2013 study of 104 couples that looked at how self-compassionate people treat their romantic partner—as rated by that partner. In general, men and women who scored high in self-compassion were seen as more caring and supportive (and less controlling and verbally aggressive) than individuals low in self-compassion.

Yet Neff has also found that most people have an easier time being compassionate to others than to themselves. A striking illustration is another 2013 study in which she measured both self-compassion and self-reported compassion for others among 384 college students. Neff found absolutely no correlation between the two forms of compassion; similar studies of practicing meditators and of ordinary adults showed only weak correlations. She has also noticed that practitioners of Buddhist metta, or loving-kindness, meditation—in which you start by wishing yourself well and go on to extend your goodwill toward an increasingly widening circle of empathy—give short shrift to the beginning section. Instead they focus on kindness to others.

But if people find it easier to show compassion to others than to themselves, how can we understand the results from the couples study? Neff believes that being kinder to others than to yourself, though possible, will not carry people through long-term relationships.

This interpretation dovetails with findings, published in 2013 in Self and Identity, that revealed how self-compassionate people handle interpersonal conflicts. The study, led by applied statistician Lisa Yarnell, involved 506 undergraduates. Yarnell, now at the American Institutes for Research, found that students high in self-compassion were better at balancing the needs of themselves and of others and felt better about a conflict’s resolution than those low in self-compassion. The self-compassionate individuals reported lower levels of emotional turmoil and greater relational well-being.

These findings have implications for full-time caregivers, who have long been known to be at risk for burnout and “compassion fatigue,” a deadening of compassion through overuse. In fact, a 2016 cross-sectional survey study of 280 registered nurses in Portugal suggested that although nurses with higher levels of empathy were at greater risk of compassion fatigue, empathy was not a risk factor if it was accompanied by self-compassion.

Teaching Self-Compassion

If being self-compassionate has so many positive outcomes, can people learn to treat themselves more kindly?

One promising intervention is mindful self-compassion, or MSC, an eight-week workshop that Neff developed with Christopher Germer, a clinical psychologist who teaches part-time at Harvard Medical School. The MSC program, designed for the general public, explains the research on self-compassion and introduces a variety of exercises, such as savouring pleasant experiences, touching yourself soothingly, using a warm and gentle voice, and writing a letter to yourself from a loving imaginary friend.

In a small study published in 2013, Neff and Germer reported that 25 people (mainly middle-aged women) who completed an MSC workshop showed greater gains in self-compassion and well-being than a similar group randomly assigned to the wait list for the workshop. Furthermore, the workshop participants maintained their gains a year later. Interestingly, people in the control group also showed some gains in self-compassion—the control group’s scores rose 6.5 percent between the pre-test and post-test phases, whereas the experimental group’s rose 42.6 percent. This result initially puzzled the researchers—until they discovered that the wait-listed group had used the time to learn about self-compassion independently through books and websites.

It remains unclear how much the MSC participants’ success is related to the training itself as opposed to, say, being in a group or having caring teachers, notes Julieta Galante, a research associate in psychiatry at the University of Cambridge. Last year Galante and her colleagues published the results of an online, four-week randomised controlled study of only the loving-kindness meditation—an exercise often used to cultivate compassion for yourself and others but not targeted specifically to relieve suffering. The team found no difference between the meditation group and a control group doing light physical exercise.

Furthermore, many people dropped out of the intervention, some describing intense, troubling emotions—crying uncontrollably or realising they had no uncomplicated relationships in their lives. Germer and Neff brace their workshop participants for this possibility, using the firefighting metaphor of “backdraft” to explain the phenomenon: just as flames rush out of a room as oxygen returns, old pain can surface amid an influx of compassion in people starved of love. It is possible that some individuals may need to ease into self-compassion practice slowly before taking a course, perhaps with the aid of a therapist.

Paul Gilbert, a professor of clinical psychology at the University of Derby in England, agrees. In his years of treating victims of childhood abuse or neglect, he has observed that kindness can backfire. Anything that stimulates fragile attachment systems can trigger memories of past trauma, particularly in cases of childhood abuse. There are so many fears and resistances to compassion that it “would just blow fuses to start with exercises for the general public,” Gilbert says.

The compassion-focused therapy (CFT) that he developed for such patients and tested through small-scale studies starts with psycho-education and proceeds gradually. Gilbert explains to patients, for example, that self-criticism is not their fault and shows how it may have developed as a way to protect themselves from threatening parents. Once patients understand that neither their genes nor their early environment are their fault, they can begin to let go of shame—and start taking responsibility for their future.

REFERENCES

The Compassionate Mind: A New Approach to Life’s Challenges. Paul Gilbert. New Harbinger Publications, 2009.

Self-Compassion: The Proven Power of Being Kind to Yourself. Kristin Neff. William Morrow, 2011.

Self-Criticism and Self-Compassion: Risk and Resilience. Ricks Warren, Elke Smeets and Kristin Neff in Current Psychiatry, Vol. 15, No. 12, pages 18–21, 24–28 and 32; December 2016.

Metta Institute describes “Metta Meditation”:

www.mettainstitute.org/mettameditation.html

Let Food Be Thy Medicine – In Search of the Optimal Brain Diet

Posted in Jayne's blog

DOWNLOAD THE ENTIRE APRIL 2017 NEWSLETTER including this month’s Freebie.

Carolyn feels great these days. She exercises. She’s socially active. She spends as much time with her four grandchildren as possible. But it wasn’t always that way. A retired radiology film librarian from Pittsburgh, she began feeling apathetic and isolated seven years ago. She’d just lost her mother and her two sons had moved away. She had also struggled with excess weight, diabetes and chronic lung disease. She was grieving, eating a worrisome amount of junk food and slipping into what looked a lot like depression.

A few years later a friend told Carolyn about a depression-prevention study at the University of Pittsburgh. She signed up immediately. All 247 participants were, like her, older adults with mild depressive symptoms—people who without treatment face a 20 to 25 percent chance of succumbing to major depression. Half received about five hours of problem-solving therapy, a cognitive-behavioural approach designed to help patients cope with stressful life experiences. The rest, including Carolyn, received dietary counselling. Guided by a social worker, she discovered that she liked salmon, tuna and a number of other “brain-healthy” foods—which quickly replaced the crisps, cake and sweets she had been eating.

When the trial concluded in 2014, the results came as a surprise—to the researchers at least. The dietary counselling was not meant to have any substantial effect; Carolyn’s group was the experiment’s control. And yet psychiatrist Charles Reynolds and his colleagues discovered that both interventions had significantly reduced the risk of depression—by approximately the same amount. When they reviewed the data, they found that patients scored on average 40 to 50 percent lower on the Beck Depression Inventory, a common measure of depressive symptoms, 15 months after their sessions ended. What is more, only about 8 percent, regardless of the therapy they received, had fallen into major depression.

It cannot be ruled out that a placebo effect contributed to the improvements seen in both groups. Meeting with a health care professional and being proactive about getting better may in and of itself have helped participants feel more upbeat. In Carolyn’s view, however, she had reversed her downward spiral largely by changing how she ate.

She is not alone in making that connection. Among scientists and clinicians there is a growing appreciation of the critical interplay between diet and brain health. The evidence is preliminary, and it is hard to tease out cause and effect. Perhaps people who eat well are also apt to have other healthy brain habits, such as regular exercise and good sleep routines. Or maybe depressed people tend to self-medicate with a large tub of ice cream. But the data continue to accumulate. Every year the list of correlations between certain foods and mental well-being grows: fish and other sources of omega-3 fatty acids might help fend off psychosis and depression; fermented foods such as yogurt, pickles and sauerkraut seem to ease anxiety; green tea and antioxidant-rich fruits may help keep dementia at bay. And so on.

There is probably no single ingredient, no happy seed from the jungles of beyond, that is sure to secure a better mood or mental acuity into old age. But there do appear to be specific dietary patterns—calibrated by millions of years of human evolution—that boost our cognitive and psychological fitness. Within the emerging field of nutritional psychiatry, consensus is building about just what types of diets are best. And perhaps most exciting is the prospect that dietary intervention could serve as a valuable adjunct to medication and other therapies for mental disorders—just as it does in so many other areas of medicine.

Good Diet, Bad Diet

When it comes to promoting brain health, the diet supported by the strongest data draws on traditional eating patterns from Italy, Greece and Spain. The so-called Mediterranean diet consists primarily of fruits, vegetables, nuts, whole grains, fish, lean meats in moderation, olive oil and maybe a little red wine. In 2011 public health expert Almudena Sánchez-Villegas of the University of Las Palmas de Gran Canaria and her colleagues assessed the relation between this diet and depression in more than 12,000 healthy Spaniards over the course of six years. They found that compared with people who did not eat a Mediterranean diet, those who did were significantly less likely to succumb to depression. For the subjects who followed the diet most closely, the risk dropped by a substantial 30 percent.

Sánchez-Villegas later confirmed the association in another large trial. The PREDIMED (Prevention with Mediterranean Diet) study—a multicentre research project evaluating nearly 7,500 men and women across Spain—initially looked at whether a Mediterranean diet, supplemented with extra nuts, protects against cardiovascular disease. It does. But in 2013 Sánchez-Villegas and other investigators also analysed depression data among PREDIMED’s participants. Again, compared with subjects who ate a generic low-fat diet, those who adhered to the nut-enriched Mediterranean diet had a lower risk for depression. This was especially true among people with diabetes, who saw a 40 percent drop in risk. Perhaps these patients, who cannot adequately process glucose, benefited the most because the Mediterranean diet minimised their sugar intake.

Indeed, a central feature of the diet is that it is low in sugar, as well as processed foods and fatty meats, which are commonplace on most Western menus. Leading nutritional psychiatry researcher Felice Jacka of Deakin University and the University of Melbourne in Australia was one of the first to demonstrate an association between stereotypical Western diets and depression and anxiety. Most recently, she has drawn another link between poor diet and, quite literally, a shrinking brain. In September 2015 she and her colleagues discovered that older adults who consumed a Western diet for four years not only suffered higher rates of mood disorders but also had a significantly smaller left hippocampus on MRI scans. The hippocampus, composed of two seahorse-shaped arcs of brain tissue deep underneath our temples, is critical to memory formation. Jacka focused on the hippocampus because animal studies have also noted diet-related changes there.

Scientists have proposed a number of possible mechanisms to explain this damage. Jacka’s findings parallel other research revealing that high-sugar diets can prompt runaway inflammation and trigger a cascade of other metabolic changes that ultimately impair brain function. Ordinarily inflammation is part of our immune system’s arsenal to fight infection and encourage healing, but when it is misdirected or overly aggressive, it can destroy healthy tissues as well. According to numerous studies, inflammation plays a role in a range of brain disorders—from depression and bipolar disorder to possibly autism, schizophrenia and Alzheimer’s disease.

Two meta-analyses from 2010 and 2012 collectively reviewed data from 53 studies and reported significantly elevated levels of several blood markers of inflammation in depressed patients. And numerous studies have reported increased or altered activity of immune cells called microglia—which play a key role in the brain’s inflammatory response—in patients with psychiatric disorders, including depression and schizophrenia. It is not clear whether inflammation causes mental illness in some cases, or vice versa. But the evidence suggests that many if not most known risk factors for psychiatric disorders, especially depression, promote inflammation; these include abuse, stress, grief and certain genetic predilections.

Jacka’s work repeatedly points to traditional diets such as Mediterranean, Japanese and Scandinavian ones—all of which tend to be non-inflammatory—as being best for our neurological and mental health. There is no doubt that stress and uncomfortable emotions can cause us to reach for the biscuit tin, but consistently the data show that the main constituents of a healthy brain diet include fruits, vegetables, legumes, nuts, fish, lean meats and healthy fats such as olive oil.

Brain-Building Fatty Acids

Increasingly, researchers are finding that the power of these more traditional diets extends beyond just supplanting bad food with good. Last summer neuroscientists Amandine Pelletier, Christine Barul, Catherine Féart and their colleagues at the University of Bordeaux discovered that a Mediterranean diet may actually help physically preserve neuronal connections in the brain. They used a highly sensitive neuroimaging analysis technique called voxel-based morphometry to identify subtle changes in brain anatomy over time. And last September nutritional epidemiologist Martha Morris of Rush University and her co-workers reported that the MIND diet—a hybrid of the Mediterranean and the high-nutrient, low-salt DASH diet (Dietary Approaches to Stop Hypertension)—may help slow cognitive decline and possibly even help prevent Alzheimer’s. When they tested cognitive ability in 960 older adults, those who had followed the MIND diet for roughly five years achieved scores matching those of people 7.5 years younger.

Our evolutionary backstory could explain these neuro-protective effects. Sometime between 195,000 and 125,000 years ago, humans may have nearly gone extinct. A glacial period had set in that probably left much of the earth icy and barren for 70,000 years. The population of our hominin ancestors plummeted to possibly only a few hundred in number, and most experts agree that everyone alive today is descended from this group. Exactly how they—or early modern humans, for that matter—managed to stay alive during recurring glacial periods is less clear. But as terrestrial resources dried up, foraging for marine life in reliable shellfish beds surrounding Africa most likely became essential for survival. Graduate student Jan De Vynck of Nelson Mandela Metropolitan University in South Africa has shown that one person working those shellfish beds can harvest a staggering 4,500 calories an hour.

The archaeological record corroborates the idea and indicates that our ancestors depended on a diet heavy in shellfish and cold-water fish—both rich sources of omega-3 fatty acids. These fats may have driven the evolution of our uniquely complex brains, which are 60 percent fat in composition. One omega-3 in particular, docosahexaenoic acid, or DHA, is arguably the single nutrient most strongly associated with brain health.

In 1972 psychiatrist Michael Crawford, now at Imperial College London, co-published a paper concluding that the brain is dependent on DHA and that DHA sourced from the sea was critical to mammalian brain evolution, especially human brain evolution. For more than 40 years he has argued that the rising rates of brain disorders are a result of post–World War II dietary changes—especially a move toward land-sourced food and, subsequently, the embrace of low-fat diets. He feels that omega-3s from seafood were critical to the human species’ rapid neural march toward higher cognition.

Many studies have confirmed DHA’s importance to the development, structure and function of the human brain: it is a component of neuronal cell membranes, facilitates neuron-to-neuron communication, and is also thought to boost levels of brain-derived neurotrophic factor, a protein that supports the growth and survival of brain cells. Given the starring role this and other omega-3 fatty acids play in shaping and maintaining our most complex organ, it makes intuitive sense that incorporating more of them into our diet—by emphasising seafood—might, as the nutritional data suggest, protect the brain from going haywire. Also of note, DHA appears to decrease chronic brain-harming inflammation.

Fatty acids aside, there is another important link between our ancestors’ diets, inflammation and mental health. As we evolved, the 100 trillion bacteria, fungi and other microorganisms that colonise our bodies and constitute 90 percent of our cells came along for the ride. This so-called microbiota—and its collective genes, the microbiome—makes a critical contribution to the formation and function of our digestive and immune systems. A growing number of findings now suggest that disrupting it through poor eating habits comes at a cost to the brain.

A Blow to the Gut

In one striking (if slightly nauseating) experiment in 2014, then 23-year-old student Tom Spector wiped out about a third of the bacterial species in his gut by limiting his diet to McDonald’s fast food. It took only 10 days. Spector played the guinea pig for two reasons: as a project to complete his genetics degree and to provide data for his father, Tim, a genetic epidemiology professor at King’s College London, who studies how processed diets affect gastrointestinal bacteria. The Spector family’s research did not assess specific health consequences—they were measuring only the drop in floral diversity in Tom’s gut—but Tom did report feeling lethargic and down after days of burgers, fries and sugary soda. The decline in species was so drastic that Tim sent the results to three laboratories for confirmation.

Diet-induced shifts in the microbiota of the kind Spector brought on himself can rapidly ratchet up inflammation in the gut. On top of the ill effects just described, gastrointestinal inflammation can deplete our supply of serotonin, a neurotransmitter long tied to depression and other psychiatric disorders. About 90 percent of our serotonin is produced in the gut when certain microbes interact with cells lining the gastrointestinal tract (some microbes even produce a portion of our serotonin themselves). But by-products of inflammation convert serotonin’s metabolic precursor, tryptophan, to a compound that generates neurotoxic metabolites linked with depression, schizophrenia and Alzheimer’s.

The good news is that just as dietary changes can wreck our microbial diversity, they can also boost it, reducing gastrointestinal inflammation in the process. In 2015 a group at the University of Pittsburgh conducted a study in which 20 African-Americans from Pennsylvania swapped diets with 20 rural black South Africans. In place of their usual low-animal-fat, high-fibre diet, the South Africans consumed burgers, fries, hash browns and the like. The Americans eschewed their normal fatty foods and refined carbohydrates for beans, vegetables and fish. After just two weeks the Americans’ colons were less inflamed, and fecal samples showed a 250 percent spike in butyrate-producing bacterial species. Butyrate is thought to reduce the risk of cancer. The South Africans, on the other hand, underwent microbial changes associated with increased cancer risk.

“Dietary changes are the easiest way to alter your microbiome and help to control inflammation,” says psychiatrist Emily Deans of Harvard Medical School. She believes diet is every bit as important as pills and psychotherapy in managing mental illness—a view informed by her own clinical practice. She discusses nutrition with most of her patients, believing it can make a real difference in managing conditions such as depression, at least for some of them. Deans also feels that the timing of meals can influence mood, and research suggests that eating on a regular schedule can improve mental health.

Deans acknowledges that science has a long way to go before we fully understand the relationship between diet and the brain. She is also wary of the massive probiotic industry that has, like the supplement industry in general, barrelled ahead of the minimal but growing scientific evidence suggesting that probiotics might be effective in preventing or treating mental illness. “You can do studies with, for example, certain vitamins, and some might turn out positive and others negative,” she explains. “But the truth is vitamins exist in all sorts of different chemical states in food and in just one state in supplements.” This difference in form between nutrients in food and nutrients in pills may explain why the data tend to favour nutrition through diet rather than supplementation. “I think we can safely say that certain dietary patterns seem to promote a healthy microbiome,” Deans says, “like the Mediterranean diet and diets that include lots of fibre, fermented foods and fish.” And a healthy microbiome may be essential for a healthy brain.

Food for Thought

For seven years now Carolyn has been eating better—focussing on seafood and cutting back on sugar. She has lost weight, and her diabetes is under control. “It’s part of a whole new way of life,” she glows, “knowing that what I eat can affect how I feel.” That awareness is building momentum among patients and practitioners alike. In March 2015 a large team of clinicians and researchers published a report in the Lancet Psychiatry on behalf of the International Society for Nutritional Psychiatry Research—an organisation Jacka co-founded in 2013. Citing modest therapeutic gains yielded by many psychiatric drugs, the authors called for the integration of nutrition-based approaches into mental health care. “The emerging and compelling evidence for nutrition as a crucial factor in the high prevalence and incidence of mental disorders,” they wrote, “suggests that diet is as important to psychiatry as it is to cardiology, endocrinology, and gastroenterology.”

Thanks to our evolutionary lineage (and plenty of fish), attention to our diets may prove critical to reversing the rising rates of mental illness around the world; lowering the proportion of people struggling with various forms of dementia; and staving off milder psychiatric symptoms and disorders. There is little doubt that eating right can help shuttle us through tough times— just as it may have done 160,000 years ago for a small group of humans huddled in coastal African caves.

One of the leading proponents of leveraging diet to better brain health, Jacka is encouraged that interventional studies— in which patients are actually “prescribed” a particular diet and tracked over time—are finally getting under way. Such research will be able to offer more definitive proof of the connection between diet and mental and cognitive well-being. Jacka’s own group is now conducting a randomised controlled investigation to assess the effectiveness of dietary changes in adults with major depression. The current trial is the first to attempt to directly address the question: ‘If I improve my diet, will my depression improve?’ Her team hopes to have answers later this year.

In the meantime, many doctors and patients are beginning to see dietary interventions as a beacon of hope after several decades of disappointing psychiatric drug development. Too many patients suffering from mental illness or dementia do not respond adequately to existing medications, if at all. For example, selective serotonin reuptake inhibitors such as Prozac—one of the most commonly prescribed drug classes for treating depression—appear effective only in severe cases; they are often no better than placebo for mild to moderate disease. As scientists learn more about the pathologies behind mental and cognitive disorders, new and promising therapeutic targets will surely emerge. But it is clear that nutrition-based treatment plans—free from side effects and low in cost—will also figure prominently in the future of dementia and psychiatric care.

REFERENCES

■ Mediterranean Dietary Pattern and Depression: The PREDIMED Randomized Trial. Almudena Sánchez-Villegas et al. in BMC Medicine, Vol. 11, Article No. 208; September 20, 2013.

■ Early Intervention to Pre-empt Major Depression among Older Black and White Adults. Charles F. Reynolds et al. in Psychiatric Services, Vol. 65, No. 6, pages 765–773; June 2014.

■ The Origins and Significance of Coastal Resource Use in Africa and Western Eurasia. Curtis W. Marean in Journal of Human Evolution, Vol. 77, pages 17–40; December 2014.

■ In Search of the Optimal Brain Diet. Brett Stetka in Scientific American Mind, Vol. 27, No. 2, pages 26–33; March/April 2016.

■ Nutritional Medicine as Mainstream in Psychiatry. Jerome Sarris et al. in Lancet Psychiatry, Vol. 2, No. 3, pages 271–274; March 2015.

■ Western Diet Is Associated with a Smaller Hippocampus: A Longitudinal Investigation. Felice N. Jacka et al. in BMC Medicine, Vol. 13, Article No. 215; September 8, 2015.

Molecules of Desire

Posted in Jayne's blog

DOWNLOAD THE ENTIRE FEBRUARY 2017 NEWSLETTER including this month’s Freebie.

Although few of us spend time contemplating the molecular messengers at work in our brain, we owe a tremendous amount to them—and to dopamine in particular. It plays a part in movement, motivation, mood and memory. But it also has a dark side. The neurotransmitter is implicated in addiction, schizophrenia, hallucinations and paranoia. Yet dopamine is best known for its role in pleasure. In the popular press, dopamine is delight; the brain’s code word for bliss; the stuff that makes psychoactive drugs dope. Articles and documentaries describe dopamine as what makes life worth living, the chemical that permits every enjoyable moment to be savoured, the “hit” everyone is chasing whether through social media, psychoactive substances, sports, food, sex or status.

But it may be time to rethink these ideas. Nora Volkow, director of the National Institute on Drug Abuse and a longtime researcher of this neurotransmitter, says that dopamine is not the pleasure molecule in the simple, direct way it is typically portrayed in the media. Its function is apparently much more nuanced.

Today the precise nature of dopamine is a matter of much controversy. Some researchers argue that dopamine, when acting within what has become known as the brain’s reward system, signals desire. Others claim that it helps the brain predict rewards and direct behaviour accordingly. A third group splits the difference, saying both explanations can be valid. Ironically, if there is anything scientists now agree on about this neurotransmitter it is that dopamine does not neurologically define joy. Instead this little molecule may unlock the intricate mystery of what drives us.

Missing Motivation

In 1978 Roy Wise, then at Concordia University in Quebec, published a seminal paper on dopamine. He depleted levels of the neurotransmitter in rats with antipsychotic medications and found that the rats would stop working to receive yummy foods or desirable drugs such as amphetamine. The animals could still make the movements needed to obtain what they should have craved, which suggested that their behaviour changed because the experience was no longer rewarding. Dopamine, at least when acting in a circuit located near the middle of the brain, seemed to be necessary for anything to feel good.

Over the next decade, data in support of Wise’s idea only grew. So when neuroscientist Kent Berridge began researching dopamine around that time, he believed, like most of his colleagues, that it was a “pleasure” signal. Berridge’s own work was focused on facial expressions of pleasure, which are surprisingly congruent among mammals. Even rats will avidly lick their lips when they receive sweet food and open their mouth in disgust after encountering a bitter taste—as will human babies. Typically mammalian expressions of satisfaction intensify when, for example, a hungry rat receives an especially tasty treat or a thirsty rat finally drinks water. Berridge thought that studying and measuring these responses could further confirm the idea that dopamine means pleasure to the brain.

His colleague at the University of Michigan, Terry Robinson, had been using a neurotoxin to destroy dopamine neurons and create rats that modelled severe symptoms of Parkinson’s. Berridge decided to give sweet foods to these rodents and see if they appeared pleased. He expected that their lack of dopamine would deny them this response. Because they were so dopamine-depleted, Robinson’s rats rarely moved if left alone. They did not seek food and had to be fed artificially. Unexpectedly, however, their facial reactions were completely normal—they continued to lick their lips in response to something sweet and grimace at a bitter meal.

Berridge and his colleagues tried again and again but got the same results. When they conducted an experiment that essentially created the opposite conditions—by ramping up dopamine levels in rats using electrodes implanted in appropriate regions—the rats did not lick their chops more eagerly when eating, as the “dopamine is pleasure” theory predicted. Indeed, sometimes the animals actually seemed less pleased as they scoffed down their sweets. Nevertheless, they kept eating far more voraciously than normal.

The researchers were puzzled. Instead of producing pleasure, dopamine seemed to drive desire. Desire itself can be enjoyable in small doses—but in the long run, if it is not satisfied, it is just the opposite. Eventually Berridge and Robinson realised that the pleasure involved in seeking a reward and that of actually obtaining it must be distinct. They labelled the drive that dopamine seemed to induce as “wanting” and called the joy of being satiated, which did not seem to be connected with dopamine, “liking.”

This dissociation fit with studies of Parkinson’s patients. They are still, after all, able to enjoy life’s ups but often have problems with motivation. Perhaps the most vivid example of this occurred in the early 20th century, when an epidemic of encephalitis lethargica left thousands of people with an especially severe parkinsonian condition. Their brains were so depleted of dopamine that they were unable to initiate movement and were essentially “frozen in place” like living statues. (The film Awakenings, which starred Robin Williams as a character based on neurologist Oliver Sacks, drew on the doctor’s 1973 memoir of treating such patients.) But a sufficiently strong external stimulus could spark action for people with this condition. In one case cited by Sacks, a man who typically sat motionless in his wheelchair on the beach saw someone drowning. He jumped up, rescued the swimmer and then returned to his prior rigidly fixed position. One of Sacks’s own patients would sit silent and still unless thrown several oranges, which she would then catch and juggle.

A class of medications known as dopamine agonists is used to re-enable movement and motivation. Dopamine receptors perceive these drugs as the real thing and react accordingly. Consequently, the medications can offer excellent relief from tremors, rigidity and other movement problems. But the drugs can also have some destructive and distressing side effects. Some patients go from not having enough motivation to having too much—or, at least, to motivation that the drug ignites but misdirects. In addition to overeating, problems on dopamine agonists can include gambling, obsessions (e.g., with an iPhone), compulsive online shopping and intrusive sexual desires. Subjectively, the experiences these patients describe are nearly identical to those reported by people with more typically caused addictions.

Many people with addictions experience an escalation in desire that is not accompanied by a corresponding increase in enjoyment. This is what Berridge and Robinson call the “incentive sensitisation” theory of dopamine action, which they introduced in 1993 and which has been bolstered by more recent studies. In 2005, for example, a research team tracked the brain activity of eight people with an addiction to cocaine as they pushed a button to self-administer the substance. In line with Berridge’s “wanting” theory, activity along dopamine pathways peaked just before button pushing.

What dopamine does is take the things you encounter, such as the little cues you see, smell and hear, and, if they carry motivational significance, magnify that significance. This raises the incentive to pursue them. Berridge found that placing dopamine directly into the nucleus accumbens of rats will make them work two to three times harder to get what they crave, but it will not amplify the pleasurable experience of rewards once they are obtained.

Prediction Engines

More recently, other researchers have focused on a different function for dopamine in the brain’s motivational systems. They say that the brain uses dopamine in these regions not so much as a way to spur behaviour through wanting but as a signal that predicts which actions or objects will reliably provide a reward. It encodes the difference between what you are getting and what you expected. This is known as the “reward prediction error” theory of dopamine.

In a series of experiments begun in the 1980s, Wolfram Schultz and his colleagues showed that when monkeys first get something pleasant—in this case, fruit juice—their dopamine neurons fire most intensely when they drink the liquid. But once they learn that a cue like a light or a sound predicts the delivery of delicious stuff, the neurons fire when the cue is perceived, not when the reward is received. This response changes when the value of the reward shifts. If a reward is bigger or better than expected, the dopamine neurons fire more in response to this happy surprise; if it is nonexistent or smaller than anticipated, dopamine levels crash.
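The pattern Schultz observed is often formalised as temporal-difference learning, in which a dopamine-like prediction error is simply the reward received minus the reward expected. Here is a minimal, purely illustrative sketch of that idea (the trial structure, learning rate and reward value are invented for the example, not taken from the monkey experiments):

```python
def run_trials(n_trials, reward=1.0, alpha=0.3):
    """Each trial: a cue appears, then a juice reward arrives.

    The cue itself arrives unpredictably, so the prediction error at the cue
    equals the cue's learned value; the error at reward delivery is the
    reward minus that learned value.
    """
    V = 0.0  # learned value of the cue
    history = []
    for _ in range(n_trials):
        delta_cue = V - 0.0         # "dopamine burst" at the cue
        delta_reward = reward - V   # "dopamine burst" at the juice
        V += alpha * delta_reward   # learn from the surprise
        history.append((delta_cue, delta_reward))
    return history

history = run_trials(50)
# Early on, the surprise (and the signal) sits at the reward itself;
# after learning, it has migrated to the cue, as in Schultz's recordings.
print(history[0])    # (0.0, 1.0): naive animal, reward is pure surprise
print(history[-1])   # roughly (1.0, 0.0): reward fully predicted by the cue
```

If the reward were made bigger than expected on some trial, `delta_reward` would spike positive; if the reward were omitted, it would go negative, matching the dip Schultz saw when an expected reward failed to arrive.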

In a 2016 study, Schultz and his colleagues asked 27 participants undergoing magnetic resonance imaging to look at a computer screen with a series of rectangles, each representing a “range” of money (for example, £0 to £100), without specific values indicated. A crosshair landed somewhere along a rectangle to indicate a cash prize. In several trials, people would guess at and then (virtually) receive the corresponding amount. Meanwhile the researchers tracked activity in select dopamine hotspots. They found that activity in the substantia nigra and ventral tegmental area, neighbouring regions in the midbrain, was linked to people’s prediction errors—whether they were pleasantly surprised or disappointed by the prize. In addition, the activity in this area over the course of the experiment related to how well participants adapted their estimates as they gained insight from past mistakes. Schultz therefore sees dopamine as playing a role in how we learn what to seek and what to avoid.

In this view, dopamine does not signify how pleasant an experience will be but how much value it has to the organism at that particular moment. Schultz notes that dopamine neurons do not distinguish among different types of reward. They are only interested in the value and don’t care whether it is a food reward, liquid reward or money. They are specific about the prediction error, but they don’t care what the reward is.

Schultz suggests dopamine serves as a common currency system for desire. For example, when the brain receives a signal that the body needs water, the value of water for that individual at that time should rise. Because this makes a cold drink more attractive, quenching thirst will be prioritised, avoiding dehydration. Yet, Schultz explains, “if I fall in love, then all my other rewards become relatively less valuable.” A glass of water will pale in comparison to a chance to be with the beloved.

From this perspective, it is easy to see why dopamine would be critical to addiction. If drugs or other compelling pleasures alter the way the reward system determines what is valuable, the addictive behaviour will be given top priority and motivation will shift accordingly. Seeing dopamine this way can also explain a number of psychological phenomena. Consider how people typically prefer a smaller reward now to a bigger one later, what economists term “delay discounting.” This shift occurs because as rewards recede into the distant future, they are far less powerful than those that are just about to be received—and are represented by progressively lower amounts of dopamine.
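Delay discounting is usually modelled with a hyperbolic curve: a reward's present value falls the further away it sits in time. A toy sketch, assuming a made-up "impatience" parameter k and invented amounts:

```python
def discounted_value(amount, delay_days, k=0.05):
    # Hyperbolic discounting: present value = amount / (1 + k * delay).
    # k is an illustrative per-person "impatience" parameter, not a real estimate.
    return amount / (1 + k * delay_days)

sooner = discounted_value(50, 0)    # 50.0: an immediate reward keeps full value
later = discounted_value(100, 30)   # 40.0: a month's delay cuts 100 to well under half
print(sooner > later)               # True: the smaller-sooner reward wins
```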

Moreover, if dopamine codes reward prediction error, it could also account for the so-called hedonic treadmill, that sadly universal experience in which what initially makes us ache with desire becomes, over time, less alluring, requiring a greater intensity of experience, a new degree of novelty or a higher dose to achieve the same joy. (You buy a new car, but driving it soon becomes routine and you start to crave a fancier one.) According to reward prediction error theory, when there is no prediction error—when something is just as pleasant as expected, no more and no less—dopamine levels do not budge. But your current pleasure may raise expectations for the next experience, so the prediction error shrinks and your reaction weakens. (This logic would also confirm the 1965 hypothesis by Mick Jagger et al. regarding the low probability of getting long-term satisfaction!)

Other researchers have begun putting Schultz’s ideas to the test. In 2016, neuroscientist Read Montague and his colleagues published findings involving 17 people with Parkinson’s who had brain implants that could measure changes in dopamine in the striatum, a brain region closely tied to rewarding experiences. They found that dopamine signalling might be even more nuanced than making a simple calculation that compares experience with expectations.

In the experiment, the patients played a game that involved betting on a simulated market. While playing, they considered the possible outcomes of various choices and later evaluated their decisions based on what had actually occurred. Here the dopamine signals that were recorded did not track a simple reward prediction error. Instead they varied by how the bets came in compared with how the investment would have fared if they had chosen differently. In other words, if someone won more than she expected but could have won even more if she had made a different choice, she had less dopamine release than if she had not known there was a way she could do even better.

In addition, if someone lost a few dollars but could have lost a lot more if he had made a different choice, dopamine would rise somewhat. This finding explains why knowing that “it could have been worse” can make what would otherwise feel awful into a positive—or at least less dire—experience.

Putting It All Together

Although some scientists view the reward prediction error theory and Robinson and Berridge’s incentive sensitisation theory as incompatible, they do not directly falsify each other. Indeed, many experts think that each captures some element of the truth. Dopamine might signal wanting in some neurons or circuits and could signify reward prediction error in others.

Alternatively, these functions may operate on different timescales—as suggested by a 2016 study in rats conducted by colleagues of Berridge and Robinson. The study, published in Nature Neuroscience, found that changes in dopamine levels from second to second were congruent with dopamine as an indicator of value, which supports the reward prediction error hypothesis. Longer-term changes, over the course of minutes, however, were linked with changes in motivation, which bolsters the incentive sensitisation theory.

Our feelings, in a sense, are decision-making algorithms that evolved to guide behaviour toward what was historically most likely to promote survival and reproduction. Pleasure can cue us to repeat activities such as eating and sex; fear drives us away from potential harm. But if the brain regions that determine what you value go askew, it can be extremely difficult to change your behaviour, because these areas will make you “want” to continue and will also make the addictive behaviour “feel” right.

Daniel Weintraub, a psychiatrist at the University of Pennsylvania, notes that Parkinson’s patients often develop impulse-control disorders while taking dopamine agonists: roughly 8 to 17 percent of people on these drugs are affected. The fact that stopping the drugs can end addictive behaviour so abruptly and decisively shows how critical dopamine is in driving it.

Although dopamine can modulate our drives, it is not the only determinant of what we do and what matters to us. Ultimately what we humans seek and value is a little more complicated than our fleeting desires.

REFERENCES

■ Awakenings. Oliver Sacks. Duckworth, 1973.

■ Dopamine Reward Prediction Error Coding. Wolfram Schultz in Dialogues in Clinical Neuroscience, Vol. 18, No. 1, pages 23–32; March 2016.

■ Parkinson’s Disease Foundation: www.pdf.org

■ The Currency of Desire. Maia Szalavitz in Scientific American Mind, Vol. 28, No. 1, pages 48–53; January/February 2017.

Can Exercise Help Depression Better Than Medication?

Posted in Jayne's blog

DOWNLOAD THE ENTIRE JANUARY 2017 NEWSLETTER including this month’s Freebie.

The fact that exercise improves physical health is so well known as to be a platitude. Decades of research demonstrate that regular exercise lowers the risk of many illnesses—heart disease, obesity, diabetes, cancer—and extends the average life span. In contrast, the benefits of exercise for mental health are not quite so obvious or well publicised. We work out to “get in shape,” and some of us depend on bike rides, neighbourhood jogs or yoga to help clear our mind and relieve stress. But how often do we seriously consider exercise as a viable treatment for mental illness, one just as effective as medication or counselling? Can a steady routine of physical workouts really help to keep psychological disorders in check?

In the case of depression, the collective evidence to date suggests that the answer is an emphatic yes. Exercise is by no means a panacea, and in severe cases of depression, it may be futile on its own. But scores of experiments now show that exercise is much more than a temporary distraction from mental woes or some ultimately inconsequential palliative. It appears to combat depression in a number of ways: by strengthening our biochemical resilience to stress, encouraging the growth of new brain cells, bolstering self-esteem and possibly even counterbalancing an underlying genetic risk for mental illness. For most people with mild to moderate depression, exercise is one of the strongest, safest, most practical, most affordable and even enjoyable treatments available.

On the Strength of the Evidence

Major depression—an illness characterised by a persistent low mood or loss of interest in typically pleasurable activities, often accompanied by insomnia, fatigue, poor concentration or feelings of worthlessness—is one of the leading causes of disability and death around the globe, according to the World Health Organization. At any given time, it afflicts around 350 million people worldwide. Only a fraction of sufferers seek help, and of those, only a third respond to standard treatment, which is usually counselling and medication. Antidepressant drugs are often costly and can have serious side effects, driving many patients to search for less expensive, safer, more natural solutions. In a survey of more than 2,000 U.S. adults published in 2001, more than half of the respondents with depression said that they had turned to some kind of alternative treatment, such as yoga, herbal medicines or acupuncture.

Psychologists and clinicians have studied exercise as an alternative treatment for depression for at least 30 years. James Blumenthal at Duke University in the U.S. was one of the pioneers. In the 1980s, while researching how exercise helps patients with cardiovascular disease, he and his colleagues noticed an inadvertent secondary benefit: working out seemed to improve people’s moods and reduce symptoms of depression. They decided to investigate. One of their early studies, published in 1999, tracked the health of 156 elderly men and women diagnosed with depression as they exercised regularly or took antidepressants, or both. After 16 weeks, all three groups had improved equally, but relapse rates were lowest among patients who exercised.

In a follow-up study, published a decade later, they divided more than 200 adults with depression into four groups, each receiving a different intervention: supervised exercise classes, exercise at home, medication or placebo. They found that patients engaging in supervised exercise fared better than those working out at home and achieved remission rates nearly equivalent to those of patients taking antidepressants: 45 versus 47 percent, respectively. By comparison, the home exercise group reached a 40 percent remission rate and the placebo group, 31 percent.

More recently, in a similar study in 2015, Swedish scientists assigned 946 patients with mild to moderate depression to one of three 12-week treatments: three-times-a-week sessions of yoga, aerobics or strength training; Internet-based cognitive-behavioural therapy; or standard counselling plus medication. Patients in all groups improved, but those engaging in exercise experienced the greatest benefits. Internet-based therapy came in as a close runner-up, but the typical standard treatment plan lagged behind both alternatives.

To date, numerous meta-analyses have kept score on the accumulating data. They do not all agree: a few have found no indication that exercise is helpful or have found that it offers only very small effects or greatly diminished benefits in the long term. But most have reached similarly optimistic conclusions. A 2013 review by the nonprofit organisation Cochrane, regarded as a leader in evidence-based medicine, concluded that exercise is just as effective a treatment for depression as medication and counselling.

A recent meta-analysis, published in 2016, echoes Cochrane’s finding. A team of international researchers examined 25 of the most rigorous experiments and determined that exercise, especially moderate to vigorous aerobic exercise under professional supervision, is indeed a potent treatment for depression. When they adjusted their analysis to account for ‘weak studies’—those most prone to some kind of experimental bias—they found an even stronger effect, suggesting that some previous meta-analyses may have underestimated exercise’s benefits for mental health. The researchers further calculated that it would take at least 1,000 contradictory studies to negate the affirming evidence that has piled up so far. Yet another review computed that when exercise is used to treat depression, success rates increase by as much as 67 to 74 percent.

How Much Is Enough?

Some researchers have attempted to figure out what types of exercise and intensity levels are most effective against depression. In a frequently cited study from 2005, for example, psychiatrist Madhukar Trivedi of the University of Texas Southwestern Medical Center and his colleagues tracked the health of 80 adults with mild to moderate depression for three months as they exercised three to five times a week on a treadmill or stationary bicycle at low intensity (seven kilocalories per kilogram per week) or at a higher intensity, as recommended by public health authorities (17.5 kilocalories per kilogram per week). At the end of the three months, the adults who exercised at the higher intensity had lessened the severity of their depression by 47 percent, compared with only 30 percent for the low-intensity group and 29 percent for a group who engaged in stretching rather than aerobic exercise.

On the basis of studies such as this one, some psychologists, clinicians and health authorities have gone as far as publishing specific recommendations. Trivedi prescribes three to five 45- to 60-minute sessions of aerobic exercise (walking, running, cycling, or using a treadmill, stationary bike or elliptical trainer) each week at an intensity of 50 to 85 percent maximum heart rate. “The ideal is probably at least 16 kilocalories per kilogram of body weight, which works out to 1,200 to 1,500 kilocalories each week for average body weight,” Trivedi says. “If you can talk to your spouse on the phone, you’re not working out at the right intensity.”
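Trivedi's dose is easy to compute for any body weight: at 16 kilocalories per kilogram per week, a 75-kilogram person lands at the bottom of the 1,200-to-1,500-kilocalorie range he quotes. A quick sketch of the arithmetic (the example weights are arbitrary):

```python
def weekly_kcal_target(weight_kg, kcal_per_kg=16):
    # Trivedi's suggested weekly aerobic-exercise dose, in kilocalories.
    return weight_kg * kcal_per_kg

print(weekly_kcal_target(75))  # 1200, the bottom of the quoted range
print(weekly_kcal_target(94))  # 1504, near the top of the quoted range
```

Spread over three to five sessions a week, that works out to roughly 300 to 400 kilocalories per session for an average adult.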

Likewise Central Queensland University exercise psychologist Robert Stanton advises 30- to 40-minute sessions of aerobic exercise—walking, cross training or stationary cycling—three to four times a week at low to moderate intensity for at least nine weeks. And the National Institute for Health and Care Excellence advocates group-based physical activity programs for patients with mild to moderate depression, consisting of at least three 45-minute sessions a week for at least 10 weeks.

Other experts, however, think it may be too soon to get so specific. A 2013 review paper, for instance, concluded that both cardiovascular and resistance exercise, either alone or in combination, are effective at treating depression but that there are not yet enough data to definitively favour one form of physical activity over another.

Why Exercise Works

In the past decade scientists have uncovered numerous details about how exercise alters the brain, and the body as a whole, in ways that alleviate and protect against depression. The second you start running, pedalling or lifting a dumbbell, your body’s chemistry begins to change. Exercise boosts your heart rate, sending blood, oxygen, hormones and neurochemicals surging through the body. In the moment, the body responds to exercise as a kind of stress—but one that is ultimately beneficial. Some evidence suggests that habitual moderate exercise rewires the brain and immune system to better cope with physical and mental strain. The better the body becomes at dealing with stressors of all kinds, the lower the risk of a depressive episode. In fact, many researchers think of depression as a disorder of managing stress.

Exercise also seems to mimic some of the chemical effects of antidepressant medication. Based on increasing evidence, some scientists argue that certain cases of depression result from the impaired growth of both brain cells and the connections between them. Studies have documented the atrophy and loss of neurons in brain regions such as the amygdala, hippocampus and prefrontal cortex in patients with major depression. Antidepressants that increase levels of serotonin and other neurotransmitters might work by reinvigorating neural proliferation, a process that depends in part on a molecule called brain-derived neurotrophic factor (BDNF). In studies with both animals and people, exercise enhances the production of BDNF.

In one 2001 study, for example, rats given an antidepressant and the opportunity to run produced higher levels of BDNF compared with animals that only ran or only received medication. Moreover, they were better at enduring a stressful experience, swimming for longer in an inescapable water tank before giving up—a test designed to approximate the onset of depression. In an analogous human study in 2016, Brazilian researchers divided 57 adults taking the antidepressant sertraline for moderate to severe depression into two groups: one attended four weekly sessions of aerobic activity for 28 days, and the other did not exercise. Symptoms abated similarly in both groups, but the exercise group improved on lower doses of antidepressants. The authors suspect that exercise enhanced the biochemical effects of the drugs. Similar studies have shown that simply recommending healthy lifestyle changes, such as establishing better sleep routines and getting more exercise, can dramatically boost the efficacy of antidepressants from a mere 10 percent remission rate with the drugs alone to a 60 percent remission rate.

And in a small but intriguing 2015 study, physician Helmuth Haslacher and his colleagues at the Medical University of Vienna compared the mental health and genomes of 55 elderly marathon runners and endurance bicyclists with those of 58 nonathletes. Among the nonathletes, they found a statistically significant correlation between the number of depressive symptoms these individuals experienced and a particular gene variant that interferes with normal BDNF production. Among the athletes, however, there was no such correlation. The researchers concluded that by stimulating BDNF production, long-term, vigorous aerobic exercise might actually counteract a genetic susceptibility to depression.

Neurobiology may also explain why, in addition to exercise countering depression, the inverse seems to be true: correlations in epidemiological surveys suggest that physical inactivity, while sometimes the result of depression, may also be a major risk factor for subsequently developing it. In a 2014 study of more than 6,000 elderly U.K. citizens, those who spent more time watching television were more likely to report symptoms of depression (although this was not true for other sedentary activities such as reading), whereas those who participated in some form of vigorous physical activity at least once a week experienced less depression. Likewise, a 2015 survey of nearly 5,000 Chinese college students found that the more time a student spent in front of a TV or computer screen, the more likely he or she was to have depressive symptoms. In contrast, the risk for depression dropped the more physically active a student was, regardless of age, gender or residential background. A meta-analysis of 24 studies, involving nearly 200,000 participants, reached the same conclusion: sedentary behaviour was associated with an increased risk of depression. On average, active people are 45 percent less likely to be depressed than inactive people, according to the U.S. Office of Disease Prevention and Health Promotion.

The Feel-Good Factor

Beyond these physiological reasons, many social and psychological factors help to explain why working out can alleviate symptoms of depression. In comprehensive interviews, people who have struggled with the disorder say that exercise energises them, gives them a sense of purpose and achievement, elevates their self-esteem and mood, regulates appetite and sleep cycles, and distracts them from negative thoughts. For those who exercise in a group, it can also provide a welcome opportunity for social interaction.

First, though, many people with depression must overcome a severe lack of motivation. Jennifer Carter, director of sport psychology at the Ohio State University Wexner Medical Center, has come up with a bevy of practical tips, for example: “There are 1,440 minutes in a day. Perhaps you can find 30 of those to exercise.” Getting over that initial hurdle of low motivation seems to depend in particular on how much satisfaction and self-agency people experience while working out. Enjoyment is fundamental to adherence: if you do what is most fun and entertaining, whatever that might be, you are more likely to stick with it. Research suggests that exercise as therapy succeeds when people choose the type and intensity themselves. Most people prefer a moderate intensity, around or just below the ventilatory threshold—the point at which breathing becomes noticeably laboured. In 2011 Patrick Callaghan, head of health sciences at the University of Nottingham in England, and his colleagues asked 38 women with depression to exercise on treadmills in small groups three times a week, either at a prescribed intensity or at one they personally selected. After a month, the women who chose how much to exert themselves had lower levels of depression and higher self-esteem compared with the other group.

Despite the mounting evidence that exercise can remedy some forms of depression, skepticism persists in academia and health care. Trivedi has found that there is a general bias that exercise is not a bona fide treatment—it’s just something you should do in addition to treatment, like trying to sleep and eat well. Even though recognition of exercise as a treatment is increasing, only some health insurance companies pay for gym time, and when they do, they often offer small temporary discounts.

Patients will need to change their thinking as well. It can be hard to see exercise as a form of treatment when we usually exercise to look good or to lose weight, and most people do not appreciate the degree to which it can reshape mood. Depression makes everything you are about to do feel useless and pointless – and that is exactly what exercise fights: even when you feel like falling apart and doing nothing, it gets you up, out and moving!

REFERENCES

Exercise for Depression. G. M. Cooney et al. in Cochrane Database of Systematic Reviews, No. 9, Article No. CD004366; 2013.

Physical Exercise Counteracts Genetic Susceptibility to Depression. H. Haslacher et al. in Neuropsychobiology, Vol. 71, No. 3, pages 168–175; May 2015.

Exercise as a Treatment for Depression: A Meta-analysis Adjusting for Publication Bias. Felipe B. Schuch et al. in Journal of Psychiatric Research, Vol. 77, pages 42–51; June 2016.

From Our Archives

The New Group Therapy. Tegan Cruwys, S. Alexander Haslam and Genevieve A. Dingle; September/October 2014.

Head Strong. Ferris Jabr in Scientific American Mind, Vol. 28, No. 1, pages 26–31; January/February 2017.

Christmas Cheer – or Christmas Crisis?

Posted in Jayne's blog

DOWNLOAD THE ENTIRE DECEMBER 2016 NEWSLETTER including this month’s Freebie.

This season brings varied and complex emotions: joy and nostalgia, love and loneliness. New research helps to explain how these feelings affect us—with some practical implications for making the season brighter and more meaningful.

Embrace the Nostalgia: The bittersweet emotion increases feelings of vitality

On holidays, it’s natural to feel a longing for times gone by—a childhood spent singing carols or meals spent with now departed loved ones. Recently scientists have explored the bittersweet feeling of nostalgia, finding that it serves a positive function, improving mood and possibly mental health.

A new paper illuminates why it works, finding that this longingly sentimental feeling does not cement us in the past but actually raises our spirits and vitality.

In several experiments conducted online and in the laboratory, when subjects were induced to experience wistful reverie via sentimental song lyrics or memories, they reported greater self-continuity, as measured by a validated index that asks participants how much they agree with statements such as “I feel connected with my past” and “important aspects of my personality remain the same over time.” Constantine Sedikides, a psychologist at the University of Southampton (England) and the main author of the paper, which was recently published in Emotion, had shown this effect in a 2015 paper. But here they found that nostalgia boosted self-continuity by increasing a sense of social connectedness. Sentimental recollections often include loved ones, which can remind us of a social web that extends across people—and across time.

The researchers found this pattern in American, British and Chinese participants. They also went a step further and observed, via questionnaires about other concurrent feelings, that self-continuity brings a feeling of vitality—of “energy and spirit.”

Tim Wildschut, one of Sedikides’s Southampton collaborators on the paper, notes there are many ways people elicit nostalgia—looking at photographs, cooking certain meals, sharing stories or playing music. He calls the feeling, which we naturally experience several times a week, “a psychological immune response that is triggered when you experience little bumps in the road.” So if you are feeling a bit undone over the holidays, pull out a photo album and spend some time revisiting your past.

The Christmastime Suicide Myth: Rates are low before, on and after Christmas, but New Year’s Day sees a spike

It seems logical that for a depressed person, the holidays might be especially tough—extra stress, loneliness and sad reminders of lost loved ones—so perhaps the popular belief that suicides spike around Christmastime is no surprise. Yet the data tell a different story. Recent studies from several countries show that rates in December and on Christmas in particular tend to be the lowest of the year, but other major holidays do see spikes, especially New Year’s Day.

In one such study, suicide rates in England consistently dipped on Christmas and spiked on New Year’s Day during the 15-year study period, according to the paper published in June in the Journal of Affective Disorders. The researchers reported an overall peak in springtime, and the highest daily rates were observed on Mondays. An older study reported a similar Christmas Day decline in the U.S., with rates up to 15 percent lower than average on that day.

Findings reported in 2015 in the European Journal of Public Health concur, showing about 25 percent fewer suicides around Christmastime in Austria. Rates there were particularly low on Christmas Eve and remained so until January 1, when the most suicides of any day of the year occurred. The authors also observed higher rates on Mondays and Tuesdays, as well as during the week after Easter.

There are exceptions to this trend. Australia and Mexico do see slightly elevated rates on both Christmas Day and New Year’s Day (as well as on Mother’s Day and Mexican Independence Day). But most nations follow the myth-busting pattern: suicide rates are lower than average around Christmas.

According to clinical psychologist Martin Ploderl, a co-author of the Austrian study, many people experience more social connection around Christmas, which is an established protective factor against suicide. Psychiatric hospital admissions also decrease during this period.

So why the spike on New Year’s? Researchers suggest “the broken promise effect” may explain it—along with the increases after Easter and weekends. Many of us are familiar with the feeling after holidays: ‘Was that it? I expected more fun, more relaxation, and tomorrow I have to go back to everyday life.’ For depressed people, the broken promise of Christmas and the blank year lying ahead may increase hopelessness and thus suicide risk. The greater alcohol consumption that takes place on New Year’s Eve and Day may also play a role in lowering inhibitions, and some people may postpone their planned suicide so that their families and friends can enjoy Christmas (yes, honestly!).

The More Rituals the Merrier: Family traditions of any type boost enjoyment of gatherings

Some people go home for the holidays hoping just to survive, burying their attention in their phones or football to avoid conflict with relatives. Yet research now suggests that is the wrong idea. Family rituals— of any form— can save a holiday, making it well worth the effort of getting everyone in the same room.

In a series of studies published in the Journal of the Association for Consumer Research, hundreds of online subjects described rituals they performed with their families during Christmas, New Year’s Day and Easter, from tree decoration to egg hunts. Those who said they performed collective rituals, compared with those who said they did not, felt closer to their families, which made the holidays more interesting and, in turn, more enjoyable. Most surprising, the types of rituals they described—family dinners with special foods, religious ceremonies, watching the ball drop in Times Square—did not have a direct bearing on enjoyment. But the number of rituals did. Apparently, having family rituals makes the holidays better, and the more, the merrier.

The study could measure only correlations between subjects’ responses, leaving causality uncertain—Do rituals increase holiday pleasure, or do people who already enjoy the holidays choose to perform more rituals? Yet enjoyment ratings were higher when given after, versus before, describing rituals, suggesting that simply thinking about rituals can put a warm and fluffy filter on one’s experience.

Whatever the ritual is, and however small it may seem, it does appear to help people get closer to one another. Some participants reported that they didn’t even know why they performed certain rituals, yet the rituals still seemed to work.

It could be that rituals offer “small, nonobvious ways” to get people to share an experience without feeling awkward or forced. Compare that with “obvious ploys” such as saying, “Hey, everyone, it’s time to watch The Queen’s Speech”, which might be more likely to produce a whole lot of resistance.

So wherever you are at Christmas, and whoever you are spending it with, I hope that you have a wonderful and memorable time. May the coming year bring you health, fulfilment and happiness, and may your dreams keep on coming true.

See you back here again in January 2017!

Take a Break! November 2016

Posted in Jayne's blog

DOWNLOAD THE ENTIRE NOVEMBER 2016 NEWSLETTER including this month’s Freebie.

Around the world, especially in industrial nations, over-worked employees and the scientists who study them are reaching similar conclusions. Overwhelming evidence now confirms that downtime of all kinds—whether it be a meditation session, lunchtime stroll through the park or a weeklong (or more) holiday—is crucial for productivity and overall health. When we are relaxing or daydreaming, the brain does not slow or stop. Rather—much as an array of crucial molecular, genetic and physiological processes transpire during sleep—many mental processes require periods of waking rest during the day. Downtime restores attention and motivation, fosters creativity, improves work efficiency, and is essential to both achieve our highest levels of performance and simply make it through the day.

Under Pressure

Psychologists began formally studying the health and habits of workers in the first decades of the 20th century. Pioneering workplace psychologist Walter Dill Scott, elected president of the American Psychological Association in 1919, focused on how best to choose employees with the most appropriate skills. In the early 1900s Hugo Münsterberg published the first textbooks explicitly focused on human behaviour in the workplace, a field that is now variously known as industrial/organisational, occupational or, simply, work psychology.

Although the field has long been interested in the relations among stress, rest and productivity, it was not until the 1980s and 1990s that topics such as work-related fatigue, mental breaks and work-life balance received widespread attention. In the past decade the number of studies on such issues has increased dramatically.

Researchers identify several reasons for this new emphasis, such as the growing number of couples and families in which both partners are managing full-time careers, as well as the rising prevalence of white-collar desk jobs in which the psychological toll of work takes precedence over the kind of physical repercussions associated with hard labour. But the biggest impetus is probably the advent of technology that makes it possible to keep working 24/7 and remain in touch with colleagues even when far away from the office. We’ve created a culture of immediate responsiveness: thanks to mobile devices, we can work from anywhere, and we can interrupt one another anytime.

Studies confirm that many modern employees are perpetually preoccupied with work: even when they get a break, they feel obligated to keep working. The European Union mandates 20 days of paid holiday, but the U.S. has no federal laws guaranteeing paid time off, sick leave or breaks for national holidays. Canada, Japan and Hong Kong mandate just 10 or fewer days of annual holiday; in the U.S., workers receive an average of just eight days after one year on the job. Yet a 2014 survey by Harris Interactive found that Americans use only half of their eligible holiday days and paid time off. A 2015 report by Expedia showed that Americans collectively neglect 1.3 million years of vacation annually. And in several surveys, U.S. workers have confessed that they do not fully unplug from phone or email even when they are on holiday or ill. Americans are not alone in this.

Larissa Barber, a workplace psychologist at Northern Illinois University, and her colleagues recently coined a new term for such feelings: workplace telepressure, a nagging preoccupation with work-related emails and related communications, combined with a compulsion to respond immediately. It appears to be tied to our (increasing) culture of busyness. Being busy means status and prestige, and if you are not busy and overwhelmed, it might mean you are not important or not working hard enough (ouch!).

In a survey of more than 300 part- or full-time workers published last year in the Journal of Occupational Health Psychology, Barber and her colleagues found that employees who reported greater workplace telepressure missed more days of work, experienced more physical and mental burnout, and did not sleep as well as their less email-obsessed peers. Barber also found that telepressure can lower the quality of an employee’s work: responsivity doesn’t always mean productivity – all it shows is that someone is responding and available, but that is different from doing good work.

The increasingly intrusive nature of work-related communication is especially troubling in light of one of the strongest conclusions from the past decade of occupational psychology research: to maximise the benefits of breaks, we need to fully disengage from our jobs—physically and mentally. Charlotte Fritz, an organisational psychologist at Portland State University who published a review paper on ‘disengagement’ last year, found that no matter how we look at it, detachment is good for well-being: the benefits include lower exhaustion, higher positive mood, better sleep and better quality of life.

The (Qualified) Case for More Vacation

Some of the most rigorous research on how uninterrupted downtime improves health and productivity has been carried out by Leslie Perlow. In one four-year study, she and her team monitored the work habits of employees at the Boston Consulting Group, who were used to working nearly nonstop. Every year the researchers insisted that employees take regular time off, even when they felt they should be in the office. In one experiment, published in 2009 in the Harvard Business Review, each of four consultants on a team took a break from work one day a week. In a second experiment, every member of a team scheduled one weekly night of uninterrupted, non-negotiable personal time.

Everyone resisted at first, fearing that work would pile up. But the consultants gradually came to love their compulsory time off because it restored their willingness and ability to work, making them more productive overall. After five months the study subjects were more satisfied with their jobs, more likely to see a long-term future at the company, more satisfied with their work-life balance and prouder of their accomplishments. These initial experiments were so successful that within four years, the Boston Consulting Group had implemented the same practices in more than 2,000 teams in 66 offices in 35 countries.

Collectively, studies by Perlow and other researchers suggest that the current model of consecutive 40-hour workweeks, punctuated by two-day weekends and one or two holidays a year, is not ideal for mental health or productivity. Psychologists have established that, like weekends and evenings, holidays have genuine physical and psychological benefits: they reduce stress, promote creativity and revitalise attention.

Yet a comprehensive meta-analysis, published in 2011 by Jessica de Bloom, a psychologist now at the University of Tampere in Finland, demonstrates that these benefits generally fade within two to four weeks. In one of her own studies, for example, 96 Dutch workers reported that compared with their typical daily experience they felt greater energy and happiness, less tension and more satisfaction with life during a winter sports holiday between seven and nine days long. Within just one week of returning to work, however, all sense of renewal had vanished. A second experiment, examining respites of four and five days, came to essentially the same conclusion.

A holiday is like applying a single ice cube to a burn: it will help for a little while, but soon enough the discomfort returns. Many people save up their holiday time to use all at once. De Bloom’s findings show that it is not necessarily true that longer breaks or holidays have better results. It seems that regularity is much more important.

Given the current work climate, the prospect of frequent breaks during which employees disconnect completely from their jobs may seem unlikely, but it is a far more pragmatic and affordable strategy than lengthy holidays. A good first baby step is to curtail job-related communications in the evenings and at weekends.

Some companies do set boundaries on work email: in 2011, for example, Volkswagen prevented employees from accessing work-related emails on company-issued phones during nonwork hours. France and Germany have restricted after-hours work communication in certain sectors or situations. But such practices are the exception. In one 2012 survey, only 21 percent of organisations had a formal policy limiting use of work-issued mobile devices during off-hours.

On an individual level, Barber recommends strictly managing expectations. Replying too quickly too often sets up unrealistic standards. On her class syllabus, she explicitly states when she is available to reply by email and when she is not. Meanwhile de Bloom spreads her holidays as much as possible throughout the year. And her polite but firm out-of-office email response cites studies on the benefits of mentally detaching from work during vacation!

Hitting Refresh

Weekends and holidays aside, simple daily practices can allow workers to mentally detach from their desk work. Tony Schwartz, a journalist and CEO of the Energy Project, has made it his mission to advise people on implementing these practices. Building on the available science, his company provides coaching and consultations for organisations that want to help employees avoid burnout and dissatisfaction.

The Energy Project instructs workers to get seven to eight hours of sleep each night, use every holiday day, take naps and other small breaks throughout the day, learn to meditate and take on the most challenging projects first to give them maximum attention. Although their approach counters the reigning cultural conviction that busier is better, the organisation has partnered with Google, Apple, Facebook, Coca-Cola and a wide range of Fortune 500 companies. According to Schwartz, their strategies have pushed workers’ overall engagement well above average levels (as measured by self-reports of how much people enjoy their job and are willing to take on extra duties). Google has maintained the partnership for more than five years.

More than a decade of research has uncovered the fact that although our mental resources gradually ebb from dawn to dusk, breaks can restore at least some of these cognitive faculties. Naps, for instance, can sharpen concentration and improve the performance of both the sleep-deprived and fully rested on all kinds of tasks. In a 2002 study, 26 physicians and nurses working three consecutive 12-hour night shifts dozed for 40 minutes at 3 a.m. while 23 of their colleagues worked continuously without sleeping. Although doctors and nurses who had taken a siesta scored lower than their peers on a memory test at 4 a.m., at 7:30 a.m. they actually outperformed their counterparts on a test of attention, more efficiently inserted a catheter in a virtual simulation and appeared more alert during an interactive simulation of driving a car home.

Some start-ups and progressive companies provide employees with spaces to nap at the office, but most workers do not have that option. Another restorative solution is spending more time outdoors, away from man-made spaces. Marc Berman, a psychologist at the University of South Carolina, studies the hypothesis that natural environments restore our attention. Built-up environments, such as busy city streets, he argues, may overwhelm the brain with noisy, glaring stimuli, whereas the calm and quiet of green spaces, such as parks and forests, allow the mind to relax and recuperate.

In one of the few controlled experiments in this area, published in 2008, Berman asked 38 University of Michigan students to complete two attention-draining tasks: first studying lists of numbers and reciting them from memory in reverse order, then memorising the locations of words in a grid. Half the students subsequently ambled along an arboretum path for about an hour, and the other half walked the same distance through busy downtown Ann Arbor. Back at the laboratory the students once again memorised and recited strings of numerals. On average, the volunteers who had spent their time amid trees rather than city traffic recalled 1.5 more digits than the first time they took the test; those who had walked through the city improved by only 0.5 digit—a small but statistically significant difference between the two groups.

Clearing the Mind

Beyond restoring one’s powers of concentration in the moment, downtime can strengthen attention over the long term—something that scientists have gleaned through studies of meditation. In the past decade mindfulness training has become incredibly popular as a strategy to relieve stress, exhaustion and anxiety—especially for over-worked nine-to-fivers (or nine-to-niners, as is often the case).

Critics of mindfulness research observe, correctly, that studies on the benefits of this practice are typically small and that they lean on subjective reports; the science of mindfulness is still not a rigorous one. Nevertheless, at this point researchers examining the benefits of mindfulness have gathered enough evidence to conclude that meditation can improve mental health, hone concentration and strengthen memory. Experiments that contrast longtime expert meditators with novices or people who do not meditate often find that the former outperform the latter on tests of mental acuity.

In a 2009 study, for instance, neuroscientist Sara van Leeuwen, then at Goethe University Frankfurt in Germany, and her colleagues tested the visual attention of three groups of volunteers: 17 adults around 50 years old with up to 29 years of meditation practice; 17 people of the same age and gender who were not longtime meditators; and another 17 young adults who had never meditated before. These participants viewed a series of letters flashed on a computer screen, concealing two digits in their midst. Volunteers had to identify or guess both numerals; recognising the second number was often difficult because earlier images masked it. Performance on such tests usually declines with age, but the expert meditators outscored both their peers and the younger participants.

Changes to the brain’s structure and to behaviour most likely explain these improvements. Over time expert meditators may develop a more intricately wrinkled cortex—the brain’s outer layer, which is critical for many sophisticated mental abilities, such as abstract thought. These practitioners may also have increased volume and density in the hippocampus, an area that is absolutely crucial for memory. Finally, meditation appears to thicken regions of the frontal cortex that we rely on to regulate our emotions and prevent the typical wilting of brain areas responsible for sustaining attention as we age.

At this point, scientists are still unsure of how quickly these changes occur, although some studies suggest that a few weeks of meditation, or a mere 10 to 20 minutes daily, can sharpen the mind. But there is likely a catch: as with holidays, a few studies indicate that regularity is ultimately more important than the length of any one session. Just 12 minutes of daily mindfulness meditation helped to prevent the stress of military service from corroding the working memory of 34 U.S. marines in a 2011 study conducted by Amishi Jha, now at the University of Miami, and her colleagues. Jha likens mindfulness training to push-ups for the mind: it is low-tech and easy to implement. In her own life, she looks for any opportunity to practice, such as her 15-minute daily commute.

One of the most impressive ‘workplace revolutions’ is the story of Mark Bertolini.

It occurred during a family holiday. But it was not a happy moment; in fact, it nearly killed him.

In February 2004 Bertolini, then 47 years old, was on a skiing trip with his family in Vermont. While speeding downhill, he collided with a tree and fell down a ravine. The accident fractured bones in his neck and back and severely damaged nerves in his arm. Yet he lived, gradually regaining mobility despite chronic pain. Not wanting to remain on pain medications for the rest of his life, he turned to yoga and mindfulness meditation. He was so impressed by these pain- and stress-reducing therapies that he started to wonder whether his 50,000 employees might benefit from them, too. Bertolini is chief executive officer of the health insurance giant Aetna.

By 2010 Bertolini had enlisted the help of the American Viniyoga Institute and the meditation instruction company eMindful to customise free yoga and meditation classes for Aetna employees, even providing spaces at the office to practice. And he did not stop there. He also teamed up with health psychologist Ruth Wolever, then at Duke University and now at Vanderbilt University, to formally investigate the outcomes of these innovations. In a three-month study of more than 200 Aetna employees, individuals who engaged in meditation and yoga slept better, felt less stressed overall and had more efficient heartbeat recovery rates after stress than those who abstained. In a follow-up study involving more than 1,000 employees, presented this past May at the International Congress of Integrative Medicine and Health, meditation and yoga were correlated not only with less stress but also with 47 to 62 minutes of increased productivity per week. The practices even seemed to reduce employees’ spending on health care. (The studies were funded in part by Aetna and eMindful, but all were reviewed by independent committees at Duke.)

Since Bertolini introduced mindfulness and yoga courses to Aetna, more than 13,000 employees have participated. Now the company is deciding how best to extend these benefits beyond their offices to their 22.9 million health insurance members.

There has been a formidable increase in recent years in peer-reviewed work on mindfulness and related relaxation techniques in the workplace. Maybe we need to start working against the culture of always being busy and develop more realistic expectations of what our brains and bodies can handle. As for me… I am already starting to think about when we can go to Japan again… maybe for a short break within the next few months ;-)

REFERENCES

■ The “Busy” Trap. Tim Kreider in Opinionator Blog, New York Times. Published online June 30, 2012. http://opinionator.blogs.nytimes.com/2012/06/30/the-busy-trap

■ The Upside of Downtime. Jackie Coleman and John Coleman in Harvard Business Review; December 6, 2012. https://hbr.org/2012/12/the-upside-of-downtime

■ Toward a Model of Work Redesign for Better Work and Better Life. Leslie A. Perlow and Erin L. Kelly in Work and Occupations, Vol. 41, No. 1, pages 111–134; February 2014.

■ Recovery Processes During and After Work: Associations with Health, Work Engagement, and Job Performance. Jessica de Bloom, Ulla Kinnunen and Kalevi Korpela in Journal of Occupational & Environmental Medicine, Vol. 57, No. 7, pages 732–742.

■ Give Me a Break. Ferris Jabr. in Scientific American Mind., Vol. 27 (5), page 44-49; May/June 2016.

You Smell Sick!

Posted in Jayne's blog

DOWNLOAD THE ENTIRE SEPTEMBER 2016 NEWSLETTER including this month’s Freebie.

Being alive is a smelly business.

Our bodies constantly release by-products of the processes that go on inside us—and it is more than just a curiosity or a cause for dismay. A growing body of research suggests that it might one day be possible to sniff someone’s breath, skin or bodily fluids to help diagnose a disease.

For years researchers have investigated the idea that animals, especially dogs, might be able to tell sick people from healthy individuals by smell. For some diseases (for example, epilepsy), trained animals can be surprisingly good at it. Devices that detect volatile compounds can also pick up subtle differences between diseased and healthy tissue samples, breath or other substances. The list of illnesses studied this way is long—cancers of the stomach, lung, breast and pancreas, cirrhosis, tuberculosis, and many more. Researchers have even reported on a Scottish woman who is capable of identifying people who have Parkinson’s disease by their scent—in one case, months before the diagnosis was made.

One recent paper in Chemical Senses suggests that traumatic brain injury causes a change in the urine of mice that other mice can be trained to sniff out. Not only was there an odour change, but it lasted for quite a while, which suggests the smell may be the result of a process involved in the brain’s response to injury. Researchers are interested in developing a quick, noninvasive test to aid in detecting whether children playing contact sports have received brain injuries. Nothing like that currently exists, and a smell test, should the finding transfer to humans, could be very useful.

But just recognising a change in smell is one thing. Figuring out exactly what molecules are different and why that is the case is another. There are good arguments for taking that second step before a test ever leaves the laboratory. If you do not know exactly what has been altered, you cannot be sure how good a marker of disease an odour really is.

For example, Raed Dweik, a physician and professor at the Cleveland Clinic, had what looked like a thrilling find several years ago: a signal that showed up in the breath of every hospitalised patient with a certain disease and in none of the healthy control subjects. The team thought it might have discovered something really important. But on further analyses the scientists found that it was actually just one of the volatile compounds in the cleaning solution in the hospital. Every patient had indeed exhaled it—but it had nothing directly to do with the illness.

Car exhaust fumes also show up in people’s breath, it turns out. And other factors can muddy the waters: microbes that live in the mouth and gut, oral hygiene, and whether the sample is from the beginning of a breath or its tail end. Also, a breath or other odour-based test has to meet the same rigorous criteria as a blood test while being better at its job in some way than existing tests.

Given these challenges, the most appealing targets for sniff tests are covert diseases for which there is no existing blood test or method of early detection. George Preti, a researcher at the Monell Chemical Senses Center, is investigating using a sniffer tool to detect ovarian cancer, which tends to be diagnosed when the disease has already advanced.

Although many potential odour tests are still primarily the domain of research scientists, there is one that has made it into the clinic: a measure of nitric oxide, which is released by inflamed airways, in the breath. Exhaled nitric oxide levels are much higher in people with asthma, and after two decades of careful development, a handheld detection device was approved by the FDA some years ago. It is now widely used by doctors to help make a diagnosis. Dweik says that a similar technology for personal use might eventually be available to enable patients to monitor the effects of their medication and give advance warning of attacks. The new sensor he is developing with collaborators would plug into a cell phone and use an app to report on nitric oxide levels. Your phone would become the device. That’s the future!

REFERENCES

■ Brain Injury Alters Volatile Metabolome. B. A. Kimball, A. S. Cohen, A. R. Gordon et al. in Chemical Senses, Vol. 41, No. 5, pages 407–414; June 2016.

■ Volatile Biomarkers from Human Melanoma Cells. J. Kwak, M. Gallagher, C. J. Wysocki et al. in Journal of Chromatography B, Vol. 93, pages 90–96; July 2013.

■ Biomarkers and Asthma Management: An Update. H. K. Bayes and D. C. Cowan in Current Opinion in Allergy and Clinical Immunology, Vol. 16, No. 3, pages 210–217; June 2016.

■ You Smell Sick. V. Greenwood in Scientific American Mind, Vol. 27, No. 5, page 15; September/October 2016.

Why is Friendship So Important?

Posted in Jayne's blog

DOWNLOAD THE ENTIRE AUGUST 2016 NEWSLETTER.

Picture two female chimpanzees hanging out under a tree. One grooms the other, systematically working long fingers through fur, picking out bugs and bits of leaves. The recipient sprawls sleepily on the ground, looking as relaxed as someone enjoying a spa day. A subsequent surreptitious measurement of her levels of oxytocin, a hormone associated with bonding and pleasure, would confirm that she is pretty happy.


And why not? Grooming appears to be a pleasurable way to spend time. Many species of apes and monkeys devote long chunks of the day to it. Among other things, grooming can curry favour and strengthen alliances, so it is likely that of these two chimps, the female being primped is of equal or greater rank in the troop than the one doing the work.


There is another level of social complexity to this scene that researchers have only recently discovered. If any old troop mate is doing the grooming, hormone levels do not change much. But if it is an individual with whom the recipient has a close bond—including but not limited to family—oxytocin levels will rise considerably. What matters most, in other words, is whether the chimpanzees are friends.


To most of us, the pleasures of friendship are familiar. Like this pair of chimps, we are more likely to relax and enjoy ourselves at dinner with people we know well than with people we have just met. Philosophers have celebrated the joys of social connection since the time of Plato, who wrote a dialogue on the subject, and there has been evidence for decades that social relationships are good for us. But it is only now that friendship is getting serious scientific respect. Researchers from disciplines as diverse as neurobiology, economics and animal behaviour are recognising parallels between the interactions of animals and the habits of people at dinner parties and are asking far more rigorous questions about the motivations behind social behaviour.


The early answers, though preliminary, are spurring a reappraisal of the importance of friendship as a biological and societal force. First, there is the apparent universality of friendship: it isn’t limited to us humans. ‘Sociality’ is also found in rhesus macaques and killer whales.


There also appears to be a genetic basis, going beyond family, to both our instincts toward sociability and our actual relationships. And there is strong evidence that the absence of friendship can be toxic for our health, whereas those with tighter social bonds live longer and enjoy more reproductive success. All of which suggests that friendship has evolutionary origins: mammals appear to have a basic propensity for sociality. Friendship, then, is not a luxury; it is an infrastructural necessity.


Mapping Connections

Just what constitutes friendship? If you think we are friends, but I think we are acquaintances, which are we? The variety of possible answers is one reason friendship went unexamined for so long. Scientists gravitated to the study of individuals because it meant fewer statistical headaches and more available data. Furthermore, if one is interested in evolution, it is also true that natural selection occurs when conditions favour individuals carrying particular traits. The beak size of Charles Darwin’s finches, to take a famous example, changed bird by bird. It is harder to develop an evolutionary argument about connections between people, which are so much less tangible.


When researchers did look at bonds between pairs, or dyads in scientific terms, they studied mates or relatives such as mothers and infants. To consider relationships between individuals who are not related and do not have sex requires agreement on how to measure the properties of those bonds. The current working definition of friendship—a persistent positive relationship that involves cooperation over time—developed only recently and is based on the quality and patterning of interactions.


Most critically, friendship is sustained. You might have a pleasant interaction with someone on the subway but would not call that person your friend. But the neighbour with whom you regularly exercise and occasionally dine? That is a friend.


Although researchers cannot ask a monkey to name his or her closest friends, they can observe, in natural environments, how and with whom the animal spends time. By following individual animals closely over years and painstakingly recording every instance of vocalising, grooming, cooperative foraging, and so on, behavioural ecologists have amassed volumes of data on social activity in certain populations.


In people, researchers prod subjects to list names of friends and identify social relationships. The top two “name-generating” questions concern free time and discussing important matters: Whom do you invite to the movies? And whom do you call when you are sick, breaking up or changing jobs? There may well be more than one person on the list, and the names may change over time, but a 2014 study of phone calls made by college students over a year and a half showed that the number of close friends you have remains surprisingly constant. Based at the Aalto University School of Science in Finland, the researchers monitored 24 students as they transitioned from high school to college, a period when these young men and women met many new people. They found that specific friendships changed during this period, but at any given time most individuals still leaned on roughly the same number of core companions—and the specific number was unique to each person.


Your entire social circle is relevant to the new friendship research. In his early career as a medical doctor, Nicholas Christakis, now a sociologist at Yale University, became interested in the way one person’s illness might take a toll on another, especially a spouse. That led to the realisation that pairs of people connect to other pairs to form huge webs of ties stretching far into the distance.


Christakis joined forces with James Fowler, a political scientist now at the University of California, San Diego (both were then at Harvard University), to study social networks of 3,000 or 30,000 or more people. Using computational techniques, they and others have established measures of connectedness that allow sophisticated mapping of these bonds. For example, they count how many friends I would name (“out-degree”) and how many friends name me (“in-degree”) separately—thereby dealing with any mismatch in our perceptions of how close we really are. Their 2009 book, Connected: The Surprising Power of Our Social Networks and How They Shape Our Lives, made the case that social connections of up to three degrees of separation have a significant influence on such things as weight as well as on smoking habits, altruism and voting behaviours.
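The out-degree/in-degree split is easy to picture in code. Here is a minimal sketch with invented names and friendships (not data from the studies above): out-degree counts the friends you name, in-degree counts the people who name you, and the two need not agree.

```python
from collections import Counter

# Each person lists the friends they would name ("out-degree");
# "in-degree" is how many people name you back. All names are made up.
named = {
    "Ana":  ["Ben", "Cara"],
    "Ben":  ["Ana"],
    "Cara": ["Ana", "Ben"],
}

out_degree = {person: len(friends) for person, friends in named.items()}
in_degree = Counter(f for friends in named.values() for f in friends)

# Cara names two friends, but only one person (Ana) names her back:
# exactly the kind of mismatch in perceived closeness the two measures expose.
print(out_degree["Cara"], in_degree["Cara"])  # 2 1
```

Counting the two directions separately is what lets network researchers handle asymmetric friendships, where I consider you a friend but you consider me an acquaintance.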


The new way of thinking about friendship also blurs the long-standing distinction between friends and family by theorising that the quality of a bond might be more significant than its origin. The relationship with your spouse can be positive and supportive, or it can be the most toxic one in your life. By the logic of this approach, relatives and sexual mates can be considered friends but only if the bond is rewarding. According to this view, family might often come first in part out of convenience. Friendship is just a word for a persistent, long-term social bond, and kinship and family ties simply provide an easy start to such bonds.


Animal Nature

Research in animals has been important in establishing the idea that a strong social bond—all by itself—may have evolutionary significance. Evolutionary theories are hard to prove. Many experiments designed to test these ideas require studying not just a single group or population but their descendants. Most animal species have shorter life spans than humans, however, making measuring generational change a simpler proposition. That can make it easier to tease out cause from correlation. In addition, findings that echo across species suggest biological rather than cultural origins. To date, horses, elephants, hyenas, monkeys, chimpanzees, whales and dolphins have all been shown to form social bonds that can last for years. Studies of our closest living relatives—monkeys and apes—have been especially groundbreaking. Robert Seyfarth and Dorothy Cheney have studied the same troop of baboons in Kenya for more than 30 years. When they began, primatologist Robert Hinde had already established that nonhuman primates had notable social relationships. One of the first things Seyfarth and Cheney did was use audio-playback experiments to show that baboons were aware of the relationships of others. When a group of female monkeys heard an offspring’s distress vocalisation, they often looked at the infant’s mother. That suggests that the social relationships were not just a figment of our human imagination.


Eventually it became evident in two separate long-term studies of baboons—one led by Seyfarth and Cheney, the other by primatologist Jeanne Altmann—that those social relationships, carefully recorded over time, made a big difference in lifetime reproductive success. In 2003 Altmann and coworkers published a seminal paper in Science that was the first to explicitly link adult females’ friendships with the proportion of their infants that survive the first year of life. In 2009 and 2010 Seyfarth, Cheney, Silk and their colleagues presented similar data. They also showed that baboons with stable friendships have lower stress and that female baboons work to form new friendships when a close friend is killed by predators—an important piece of evidence in favour of the social bond’s overarching importance. The striking and convergent results from the two studies surprised the researchers, who had expected dominance rank to confer the most advantage. It was not that rank was unimportant, but the critical factor was a close set of social bonds. Primates have long-term relationships. They are aware of the relationships of others, and these relationships have a direct impact on reproductive success.


The Social Genome

A related evolutionary idea about humans has also generated interest. The social brain hypothesis, put forward by evolutionary psychologist Robin Dunbar, argues that the need for early humans to live in ever bigger social groups led to the enlargement of the human brain. Navigating the complexities of social life after all requires social attention and the ability to take others’ perspectives, to communicate and, ultimately, to cooperate.


The idea is rooted in the earlier observation that monkeys and apes have a much larger brain relative to body size than other animals and that this is probably the result of their social lives. Archaeological and fossil evidence to bolster the theory includes changes, though slight, in brain size between Neandertals and modern humans at the same time that social groups expanded both in size and, especially, in complexity. A corollary known as Dunbar’s number holds that no matter what your Facebook page says, each of us can maintain a wider social circle of only about 150 people. It turns out that many forms of social organisation, from military companies to average holiday card lists, hover around that number.


If evolution is steering various species, including our own, toward prosocial behaviour, it makes sense to seek evidence in the genome. Already genetic variation has been identified in people with disorders that affect social function, such as autism and schizophrenia. And some genes in the dopamine and serotonin pathways have been consistently linked with social traits. Genetics started with an understanding of how genes affect the structure and function of our bodies and then our minds. And now scientists are beginning to ask how genes affect the structure and function of our societies.


Over the past five years Christakis, Fowler and their collaborators have published a series of papers on both cooperation and the possible genetics of friendship. The first examined data on 1,110 twins included in the National Longitudinal Study of Adolescent to Adult Health, in which participants were periodically asked to name friends. Christakis and Fowler’s team found that genetic factors account for nearly half of the variation both in how connected an individual is to a larger friend group (based on the number of in-degree and out-degree associations linked to that person) and, more surprisingly, in the probability that a person’s friends are friends with one another, a property known as transitivity. “That’s a bizarre result,” Christakis says. “If you have Tom, Dick and Harry in a room, whether Dick is friends with Harry depends not only on Dick’s genes or on Harry’s genes but on Tom’s genes. How can that be? We think the reason is that people vary in their tendency to introduce their friends to one another. Some knit the networks around them together, and some people keep their friends apart.”
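Transitivity, the property measured in that twin study, can be made concrete with a toy example. This sketch uses hypothetical friendships, not the study's data: for one person, it asks what fraction of the pairs among his or her friends are friends with each other.

```python
from itertools import combinations

# Hypothetical, symmetric friendships (each link is listed on both sides).
friends = {
    "Tom":   {"Dick", "Harry", "Sally"},
    "Dick":  {"Tom", "Harry"},
    "Harry": {"Tom", "Dick"},
    "Sally": {"Tom"},
}

def transitivity(person):
    """Fraction of pairs among a person's friends who are friends themselves."""
    pairs = list(combinations(friends[person], 2))
    if not pairs:  # fewer than two friends: no pairs to check
        return 0.0
    linked = sum(1 for a, b in pairs if b in friends[a])
    return linked / len(pairs)

# Of the three pairs among Tom's friends, only Dick and Harry know each
# other, so Tom's transitivity is 1/3.
print(transitivity("Tom"))
```

High transitivity describes someone who knits the network together by introducing friends to one another; low transitivity describes someone who keeps friends apart, which is exactly the individual variation the study attributes partly to genes.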


A person’s social position, in terms of how central that individual is in his or her network, was also heritable. According to Christakis and Fowler’s analysis, 29 percent of the differences in a person’s likelihood to have a particular social role could be explained by genetics, as opposed to environment.


In 2011 Christakis and Fowler used six available genotypes from the same database (excluding relatives this time) to test for genetic similarity among friends. They found that the old adage about “birds of a feather” was genetically based. Friends did not just have similar traits; they resembled one another on a genotypic level beyond what one would expect from systematic genetic differences that might occur because of shared ancestry, such as being European or Asian. They expanded on this work in a 2014 paper on friendship and natural selection and showed that a degree of correlation in genotypes made friends the equivalent of fourth cousins. And they replicated the results with a second large database, the Framingham Heart Study. The conclusion was that friends may be a kind of ‘functional family’.


As part of this work, in a 2012 paper in Nature, they even mapped the social network of the Hadza hunter-gatherers of Tanzania, who live essentially as humans did 10,000 years ago. Christakis and Fowler showed that the Hadza form networks with a mathematical structure just like humans living in modernised settings, suggesting something very fundamental about the structure of friendship.


Brent was the first to apply Christakis and Fowler’s social-network analysis to monkeys. Together with neurobiologist Michael Platt, she works with a colony of rhesus macaques on Cayo Santiago, an island off the coast of Puerto Rico, for which extended genetic records exist. Their 2013 study found that the most sociable monkeys, those with the largest, strongest networks, tended to be descendants of similarly social macaques. More social monkeys also had greater reproductive success, meaning their babies were more likely to survive their first year. In a 2015 paper they showed that social vigilance, the ability to observe and gather social information, had a heritability of 12 percent.


In her newest work, Brent is exploring whether indirect connections—friends of friends—are as significant for animals as they are for humans. All these studies are based on relatively small samples (dictated by the number of available animals), so they lack the power of Christakis and Fowler’s work, which used extremely large databases. It remains to be seen how pervasive these effects are, but the results are striking.


Friends for Life

If friendship is so important, the next aim is to understand why by teasing out what exactly social bonds do for us. Our pair of grooming chimpanzees was very much like real duos studied by primatologists. Crockford and Wittig found that the closeness of a pair’s bond determined the amount of oxytocin circulating in a primate’s blood. This finding suggests that interacting with individuals you perceive as close friends is physiologically very rewarding.


There are also clues in human physiological responses to social interaction. Several large longitudinal studies have shown that the strength of our social network can predict mortality to such a degree that strong ties may be as beneficial to our health as quitting smoking and more impactful than well-known risk factors such as obesity and physical inactivity. Studies of loneliness have shown that a weak social network can be detrimental to well-being.


If the new science of friendship can paint a clearer picture of how and why we make friends, researchers hope to use that information in a variety of ways. In an ambitious randomised trial involving 30,000 people in 160 villages in Honduras, Christakis and Fowler are exploring whether targeting influential individuals, identified through social-network analysis, can be used to change health habits and reduce childhood mortality. On Cayo Santiago, Platt and Brent hope to be able to establish normal variation in social behaviour among macaques as a way of then studying behaviour that falls outside that range. (One of the first things that seems to fall apart in autism is attention to others.)


But of course, the most straightforward result of this work would be to spark a deeper appreciation of just how important our friends are in our lives. Other individuals are in fact the source of some of our greatest joys. And now we know that they do not just make us happy—they help keep us alive. So it’s time to celebrate the power and importance of your friendships for the rest of this year, and well on into the future!


REFERENCES

■ Connected: The Surprising Power of Our Social Networks and How They Shape Our Lives. Nicholas A. Christakis and James H. Fowler. Little, Brown, 2009.

■ The Evolutionary Origins of Friendship. Robert M. Seyfarth and Dorothy L. Cheney in Annual Review of Psychology, Vol. 63, pages 153–177; January 2012.

■ Genetic Origins of Social Networks in Rhesus Macaques. Lauren J. N. Brent et al. in Scientific Reports, Vol. 3, Article No. 1042; January 9, 2013.

■ Friendship and Natural Selection. Nicholas A. Christakis and James H. Fowler in Proceedings of the National Academy of Sciences USA, Vol. 111, Supplement 3, pages 10,796–10,801; July 22, 2014.

Finding Love Online

Posted in Jayne's blog

DOWNLOAD THE ENTIRE JULY 2016 NEWSLETTER.

Everybody seems to be doing it!

What?

Looking for love online.


According to recent statistics, when it comes to looking for love, everybody really does seem to be doing it online. So if you are looking for a bit of summer romance, boost your odds of making a match with these new research-based insights.


A Video is Worth a Thousand Words

If you have ever chosen a profile picture for an online dating site, you have probably tried to pick a shot that gets across some of your key traits—energetic, friendly, silly, warm. Yet recent research suggests that the people who see your photograph are probably not accurately gauging your personality. A new study finds that a short video can leave a much more accurate first impression.


Researchers at the University of Texas at Austin put together a speed-dating pool of about 200 men and women. They also took photos of the participants, mimicking those found on online dating sites, and recorded short videos of the same individuals to see what kinds of first impressions people would form in each context. For each scenario, participants rated those they “met” on traits such as attractiveness, humour, intelligence and other qualities that we usually judge within seconds. The researchers presented their findings in January at the Society for Personality and Social Psychology meeting in San Diego.


Ratings from the three groups showed that individuals were more likely to agree on what another person was like if they met face-to-face or saw a video of that person. But when they had only a picture to go by, the raters relied more on their own beliefs and schemas to make judgments. Describing a static image usually tells more about the viewer than about the person in the photograph.


The reason we misjudge photos, the researchers say, is that the limited information contained in a photo puts us in an abstract mindset. We then draw on our past experience and expectations to fill in the blanks. A video, on the other hand, contains dynamic details that capture our attention and quickly reveal volumes about a person’s personality—even if the clip is just a few seconds long. Someone’s smile, voice and gestures, for example, provide instant clues about his or her agreeableness, trustworthiness and self-confidence.


Live impressions, of course, are the most powerful. So when you start warming up to a potential date online, it is important to get to that meet-up at a coffee shop or bar so you can get a more authentic sense of the person. Meanwhile clever entrepreneurs are already creating dating apps based on videos, not photos…


The Problem With Speed Swiping

Ideally, any potential date deserves a fresh look, unaffected by what you thought of the last person you saw. But new research suggests that we may not be giving prospects a fair chance when we switch or swipe from one profile to another on dating apps and Web sites.


In a study described in March in Scientific Reports, female subjects saw men’s faces on a screen for 300 milliseconds—about the length of a very short view on a dating app such as Tinder. After each face, they judged it attractive or not. The researchers found that faces were more likely to be judged attractive when they followed other attractive faces. Two factors caused this pattern: a response bias, in which one presses the same key as last time, and a perceptual effect most likely caused by the short interval allowed for processing the faces.


Previous studies have shown contrast effects, in which people in photographs look uglier when viewed next to portraits of attractive strangers. But in the new study, the exposure was so brief that an individual face was not fully processed, and thus it took on qualities of the previous face. Jessica Taubert, one of the lead authors of the paper and a researcher at the University of Sydney, advises online daters: “Be mindful that your brain has limited cortical resources.” In other words, slow down!


In another new paper, in the Journal of Experimental Social Psychology, researchers asked whether contrast effects occur when judging personality. Participants viewed two dating profiles. When the first person came across as uncaring (“I get bored talking about feelings and stuff”), the second person, who was nice but unattractive, seemed much more appealing. In real profiles, people might not appear as blatantly callous as in this study, but other personality traits could be turnoffs that bias viewers’ later decisions.


So whatever speed swipe app you’re using, it pays to clear your head and try to view each profile as a unique individual—before rushing on to the next one.


To Hide Or Not To Hide

Online dating provides opportunities we do not have in the real world, like scanning 100 potential sweethearts in an hour. But some of these advantages may actually be drawbacks. Anonymous browsing, for instance, allows users to look at people’s profiles without the targets knowing they got checked out—which can mean freedom from attracting unwanted messages. Yet it also erases any breadcrumbs that might lead to love. A paper published online in February in Management Science finds that on the whole, this feature backfires.


The researchers selected 100,000 users of a large online dating site and gave half of them the ability to browse anonymously, a feature that usually costs extra. These users became less inhibited and more likely to look at people of the same sex or of a different race. The researchers thought the disinhibition would translate into more matches, but women with this ability actually made fewer matches because they did not leave the so-called weak signals of interest that might lead the other party to follow up. The simple notification that a particular person perused your profile is often enough to get a conversation started. Anonymous browsing did not affect men’s matches as much, because the men were already uninhibited—they messaged individuals who interested them. Women, however, are less likely in general to make the first move and therefore depend more on sending weak signals to invite flirtation.


Further, what secret scanners lost in quantity they did not gain in quality. The average romantic appeal of their matches, as rated by other users, was no different from those of nonanonymous users.


So wherever you are this summer, and whoever you are with, have a wonderful time!

Scratch & Sniff Tests for Dementia?

Posted in Jayne's blog

DOWNLOAD THE ENTIRE JUNE 2016 NEWSLETTER including this month’s Freebie.

Name that smell—if you can’t, it could be an indicator of a problem somewhere in your brain. New research suggests that scratch-and-sniff smell tests could become an easy and cheap way to detect signs of traumatic brain injury and neurodegenerative ailments.


Recent research found that a diminished sense of smell predicted frontal lobe damage in 231 soldiers who had suffered blast-related injuries on the battlefield. In the Department of Defense study led by Michael Xydakis of the Uniformed Services University of the Health Sciences, subjects with low scores on a smell test were three times as likely to show evidence of frontal lobe damage during brain imaging as those whose sense of smell was normal.


When the sense of smell is working properly, it acts as a matchmaker between odourant molecules in the air and memories stored in the brain. Those memories are not housed in a single place but extend across many regions. Because different smell signals have to take a variety of paths to reach their destinations, arranging their travel requires a lot of coordination. This arrangement makes describing and verbally naming an odour extremely challenging and cognitively demanding.

A damaged sense of smell, therefore, can indicate that the ability to make those connections has been hampered by disease, a lack of sleep or, as shown in Xydakis’s study, injury to the brain. The new results add to a growing understanding of the link between brain damage and an impaired sense of smell. Researchers have been working for years to use olfaction tests to track damage to the brain caused by neurodegenerative ailments such as Parkinson’s and Alzheimer’s diseases.

Kim Good, an associate professor in the psychiatry department at Dalhousie University in Nova Scotia, is currently recruiting subjects for a cohort study that aims to better understand the link between olfaction and Parkinson’s, which could improve early identification and intervention. Olfactory deficits are as common as tremor in Parkinson’s, and they can help rule out competing diagnoses.

Smell is also the first sense to be affected by Alzheimer’s, with the hallmark protein tangles of the disease appearing early in the olfactory bulb. Last January psychiatrist Davangere Devanand of Columbia University and his colleagues reported the results of a four-year-long cohort study in Manhattan, which found that scores on a multiple-choice scratch-and-sniff test in which participants had to identify 40 scents were good predictors of cognitive decline.

It’s not hard to imagine such exams becoming a routine part of primary care for older patients. The beauty of olfaction tests is that they are easy to administer and can be done in the family GP’s office.

Why Smell is Special

The unique characteristics of our sense of smell make sniff tests ideal for diagnosing brain injury. Here are some of the most interesting scientific findings about this unusual sense:

■ The adult brain can generate new neurons in the olfactory bulb, the brain region that processes smells. This area is one of just a few regions that continue to grow new neurons during adulthood.

■ Individuals vary in how they perceive odours and whether or not they can detect certain scents, and yet humans seem to universally enjoy the smell of vanilla.

■ Anosmia, a condition in which people completely lose their sense of smell, can be debilitating. Sufferers often report feeling disconnected from their surroundings, and many become severely depressed.

■ Romantic couples can unconsciously sense their partner’s emotional state from their sweat—and the longer they have lived together, the better they are at it.

■ Babies locate their mother’s nipples in part by learning a smell map of the breasts.

Have an enjoyable June, and take time to smell the flowers (unless you have hay fever…)!

Dress for Success: How Clothes Influence our Performance

Posted in Jayne's blog

DOWNLOAD THE ENTIRE MAY 2016 NEWSLETTER including this month's Freebie.

As I am writing this – on an unusually warm (for Holland) Sunday afternoon under the shade of our plum tree – I am wearing old jeans and a favourite faded cotton shirt. They feel familiar and comfortable. This attire is also ‘air conditioned’ (perfect for the soaring temperatures) since my jeans have a hole in the inside thigh seam, and my shirt has torn & threadbare parts. I love them both dearly. Yet before I go out for a walk, after putting this On the Border together, you can be sure that I shall change into something more decent. I don’t want the neighbours to see me in these Sunday sloppy clothes….

It’s not news to anyone that we judge others based on their clothes. In general, studies that investigate these judgments find that people prefer clothing that matches expectations—surgeons in scrubs, little boys in blue—with one notable exception. A series of studies published in an article in June 2014 in the Journal of Consumer Research explored observers’ reactions to people who broke established norms only slightly. In one scenario, a man at a black-tie affair was viewed as having higher status and competence when wearing a red bow tie. The researchers also found that audience members who valued uniqueness gave higher ratings of status and competence to a professor who wore red Converse sneakers (trainers) while giving a lecture. The results suggest that people judge these slight deviations from the norm as positive because they suggest that the individual is powerful enough to risk the social costs of such behaviours.

The old advice to dress for the job you want, not the job you have, may have roots in more than simply how others perceive you—many studies show that the clothes you wear can affect your mental and physical performance. Although such findings about so-called ‘enclothed cognition’ (yes indeedy, scientists invent words for everything!) are mostly from small studies in the laboratory that have not yet been replicated or investigated in the real world, a growing body of research suggests that there is something biological happening when we put on a snazzy outfit and feel like a new person.

If you want to be a big-ideas person at work, suit up. A paper in August 2015 in Social Psychological and Personality Science asked subjects to change into formal or casual clothing before cognitive tests. Wearing formal business attire increased abstract thinking—an important aspect of creativity and long-term strategising. The experiments suggest the effect is related to feelings of power.

Informal clothing may hurt in negotiations. In a study reported in December 2014 in the Journal of Experimental Psychology: General, male subjects wore their normal clothes or were placed in a suit or in sweats. Then they engaged in a game that involved negotiating with a partner. Those who dressed up obtained more profitable deals than the other two groups, and those who dressed down had lower testosterone levels.

For better focus, get decked out like a doctor. In research published in July 2012 in the Journal of Experimental Social Psychology, subjects made half as many mistakes on an attention-demanding task when wearing a white lab coat. On another attention task, those told their lab coat was a doctor’s coat performed better than either those who were told it was a painter’s smock or those who merely saw a doctor’s coat on display.

Inspired by findings that winning combat fighters in the 2004 Olympics had worn red more often than blue, researchers investigated the physiological effects of wearing these colours. As reported in February 2013 in the Journal of Sport and Exercise Psychology, they paired 28 male athletes of similar age and size, who competed against one another once while wearing a red jersey and again while wearing blue. Compared with when they wore blue, those wearing red were able to lift a heavier weight before the match and had higher heart rates during the match—but they were not more likely to be victorious.

Trying too hard to look sharp can backfire. When women donned expensive sunglasses and were told the specs were counterfeit, as opposed to when they thought they were real, they cheated more often on lab experiments with cash payouts. Fake sunglasses also seemed to make women see others’ behaviour as suspect. Authors of the study, published in May 2010 in Psychological Science, theorise that counterfeit glasses increase unethical behaviour by making their wearers feel less authentic.

So, after all that, I am going to change into something less shabby and go out for a walk.

However, there is a good chance that when I come back I’ll change back into my comfortable favourites….

Have an enjoyable May!

Immune to Addiction?

Posted in Jayne's blog

DOWNLOAD THE ENTIRE APRIL 2016 NEWSLETTER including this month's Freebie.

When neuroscientist George Koob proposed creating a vaccine for addiction 25 years ago, his colleagues thought he was wasting his time. The immune system evolved to prevent infections, not highs from illegal drugs. Prevailing wisdom holds that treating addiction requires months or years of psychotherapy to help addicts change their thought patterns, a difficult process that does not consistently work. But Koob, then at the Scripps Research Institute, wanted addicts to be able to see their doctor for a shot that could keep them from getting high when their motivation to stay clean waned.

His premise was simple. Vaccines against infectious diseases work by priming the body to produce antibodies that stick to the invading pathogen, preventing it from causing illness. Koob, who now directs the National Institute on Alcohol Abuse and Alcoholism, believed that the body could be duped into producing antibodies to drugs of abuse. The antibodies would biochemically block these drugs from creating a high, thereby eliminating the incentive to use them. Unlike traditional vaccines, however, this approach would aim to treat, rather than prevent, drug abuse.

More than two decades after Koob proposed his idea, scientists are finally making headway on affordable vaccines against addictive drugs. A vaccine for cocaine has seen success in early human trials, and one against heroin is making its way toward the clinic. A potential vaccine to combat methamphetamine addiction has shown promise in rodents.

Yet the approach is not without its critics. No addiction vaccine has proved effective in a large-scale investigation in people, and the first such vaccine (for nicotine addiction) to be put to that test did not fare well. Because environmental factors are instrumental in perpetuating addiction, many experts argue that the problem is unlikely to succumb to a strictly biochemical attack. Still, for a disease that often stubbornly persists despite available treatments, vaccines could be an important addition to the toolbox.

Scourge of Their Lives

Nearly one in 12 Americans is addicted to illegal drugs, according to the latest data from the Substance Abuse and Mental Health Services Administration. The National Institute on Drug Abuse estimates that abuse of alcohol, tobacco and illicit drugs together cost the economy more than $600 billion a year.

Addictions remain difficult to treat. Treatment generally begins with detox, during which addicts abstain from using a drug so that it is eliminated from the body. Patients in detox centers receive around-the-clock support to manage the often intense physical and psychiatric symptoms that accompany this process. After detox, some people spend weeks or months in rehabilitation at a live-in facility; others simply attend weekly outpatient psychotherapy either individually or in groups. They often have to rely on willpower and motivation to try to stay clean. The limits of this do-it-yourself approach to addiction treatment are reflected in abysmally high relapse rates, which range from 40 to 60 percent for cocaine, heroin and methamphetamines.

Most psychotherapies used to treat addiction help an addict reduce and resist his or her cravings by avoiding places and people linked to drug taking—their triggers—and developing support networks to help them kick the habit. In addition, doctors may prescribe medications such as methadone and buprenorphine for addiction to heroin and other opiates; these drugs reduce withdrawal symptoms and cravings and temper the high. But these medications do not completely eliminate cravings, and users may not remember to take them every day. Drugs that combat nicotine addiction are partially effective at best, and no medications exist for dependence on cocaine, methamphetamines or alcohol.

Thus, for many addicts, getting and staying clean seems like an impossible dream….

Sleeping Rats

Koob wanted to help people overcome this frustration by interfering with the biochemistry of drug taking. After a user injects, inhales or ingests a drug, it travels through the bloodstream and crosses the blood-brain barrier, a sheath of cells that lines brain capillaries and protects the brain from many toxic substances and other molecules in the bloodstream. Once inside the brain, molecules of the drug (or its metabolised products) bind to specific targets, setting off a series of chemical events that produce feelings of euphoria. Methadone treats heroin withdrawal and cravings—and can block its high—by acting at opiate receptors much more slowly and mildly than heroin. Koob wanted to intervene sooner, before a drug crossed the blood-brain barrier. So he decided to push the idea of the addiction vaccine.

Like an infectious disease vaccine, an addiction vaccine mobilises the immune system to fight a foreign substance. The vaccine trains the immune system to make antibodies that specifically target the “invader.” These antibodies will then rapidly kill the pathogen or deactivate the drug whenever they encounter it in the bloodstream. Because they act by sticking to a drug molecule, antidrug antibodies have the added benefit of creating a compound that is too big to cross the blood-brain barrier.

For scientists, the goal of the vaccine was to coax the immune system into responding to something that does not ordinarily provoke a reaction. Koob and Scripps medicinal chemist Kim Janda decided to attach the drug molecule—cocaine in this case—to a protein from a virus that does incite an immune reaction. This technique causes the immune system to react to the combination molecule, creating antibodies that will bind to various parts of it. Many of these antibodies will then also attach to a cocaine molecule when it enters the body alone. The vaccine thus prompts a subset of immune cells to build an arsenal against cocaine.

Next Koob, Janda and their colleagues injected their vaccine into rats immediately after the animals were exposed to cocaine. Ordinarily, when rats get high for the first time on a stimulant such as cocaine, they become hyperactive and restless; they fail to eat and stay awake for extended periods. In contrast, after these rats took a huge hit of cocaine they were able to fall asleep. The rats had become immune to the effects of cocaine.

Several other laboratories, including that of neuroscientist Thomas Kosten of the Baylor College of Medicine, also developed cocaine vaccines that proved effective in animals. Instead of using a viral protein, Kosten and his colleagues attached cocaine to a toxin produced by the bacterium that causes cholera. In 2002 the researchers gave 24 former cocaine users their vaccine to test its safety and to see whether it would trigger the hoped-for antibody production in people similar to those who might eventually receive the vaccine therapy. Although the treatment proved benign, it failed to produce high levels of antibodies in 25 to 30 percent of patients.

Booster Shots

In a larger follow-up study published in 2009, Kosten’s team injected 109 drug users either with the vaccine or with saline—and gave the participants four booster shots over the next 12 weeks to try to raise the percentage of those able to make adequate antibodies. The researchers also tested the addicts’ urine three times a week for 24 weeks for cocaine and other drugs and monitored the level of anticocaine antibodies in their bloodstream. Only 38 percent of the vaccine recipients had high levels of those antibodies, yet nearly all produced some antibodies, and as a group they were 22 percent less likely to have a cocaine-positive urine test than were those who received saline injections. In addition, those who produced large amounts of antibodies were significantly more likely to have cut their cocaine usage by half.

Still, Kosten’s vaccine left a considerable number of cocaine addicts without adequate antibody protection. It also did not affect their desire to use, which means that they might not come back for booster shots. So Koob and his colleagues kept pursuing their virus-based vaccine and came up with a combination of chemicals that included a new viral protein. In a study published in 2013, Crystal, Koob, Janda and their colleagues injected their latest manipulation of the cocaine molecule into four female rhesus macaque monkeys that had become dependent on cocaine; a fifth received a saline injection.

The macaques that received the vaccine produced very high levels of antibodies to cocaine. When the monkeys were then injected with cocaine, positron-emission tomography (PET) brain scans showed that very little of the drug bound to its molecular target, the dopamine transporter in the brain [see illustration below]. What is more, the animals showed no behavioural signs of a drug high, such as restlessness or insomnia. Koob and Janda are currently planning a preliminary safety trial of their vaccine in humans.

The pair is also now putting the finishing touches to a heroin vaccine. A vaccine for heroin is trickier to make because heroin is rapidly metabolised into morphine and 6-monoacetylmorphine, both of which act on the brain’s opioid receptors. An effective vaccine therefore has to spur the production of antibodies against heroin’s breakdown products as well as the drug itself. So Koob and Janda made three vaccines in one: they separately attached the virus protein to heroin and its two major metabolites.

In 2013 Koob, Janda and their colleagues tested their compound vaccine in rats addicted to heroin. These animals spent many of their waking hours either searching for or taking heroin, delivered by intravenous infusion whenever the rats pressed a lever. The researchers then removed the heroin and injected half the rats with three doses of vaccine. After 30 days, the vaccinated rats were once again offered heroin. Although the animals tried to get high, they stopped pushing the lever after several minutes, presumably because they were not getting any reward. The rats that had not been vaccinated, in contrast, kept obsessively pressing the lever for heroin.

Whether the vaccine will work in humans is still an open question, however. Human addicts might be more determined than rats to get high, so if a vaccine thwarts that high, instead of giving up, people might wind up taking more of a substance, leading to a massive overdose. In addition, humans have access to other addictive substances. If you have an addict who seriously wants to use drugs and is vaccinated, then the next option could be to use a different drug that the vaccine doesn’t act against….

The Other Half of Addiction

Even if vaccines do not produce such rebound effects, many addiction specialists believe the approach is too narrowly focused on biochemistry to be of much benefit in the real world. A complex interplay between individual psychology and environment is at least half the equation of addiction. Maybe the vaccine would help with the part of addiction that is biological, but what can be done about the other half?

The nicotine vaccine NicVAX, produced by Nabi Biopharmaceuticals, provides a cautionary tale. In large-scale clinical trials conducted from 2009 to 2011, the vaccine (which is nicotine attached to a bacterial antigen) performed no better than placebo in getting people to quit smoking. Koob and other researchers believe, however, that the devil is in the details. They expect other combinations of pathogen proteins and drug molecules—whether nicotine, heroin or cocaine—to fare better.

Koob concedes that vaccines are only part of the solution to the addiction puzzle. “Vaccines aren’t going to cure addiction by any stretch,” he says. “But they will put up an enormous barrier.” If vaccines can help even a fraction of addicts get off drugs, Koob and Janda believe their work will have been worth the effort.

REFERENCES

  • Cocaine Vaccine for the Treatment of Cocaine Dependence in Methadone-Maintained Patients: A Randomized, Double-Blind, Placebo-Controlled Efficacy Trial. Bridget A. Martell et al. in Archives of General Psychiatry, Vol. 66, No. 10, pages 1116–1123; October 2009.
  • A Vaccine Strategy That Induces Protective Immunity against Heroin. Neil Stowe et al. in Journal of Medicinal Chemistry, Vol. 54, No. 14, pages 5195–5204; July 28, 2011.
  • Dynamic Vaccine Blocks Relapse to Compulsive Intake of Heroin. Joel E. Schlosburg et al. in Proceedings of the National Academy of Sciences USA, Vol. 110, No. 22, pages 9036–9041; May 28, 2013.
  • Adenovirus Capsid-Based Anti-Cocaine Vaccine Prevents Cocaine from Binding to the Nonhuman Primate CNS Dopamine Transporter. Anat Maoz et al. in Neuropsychopharmacology, Vol. 38, No. 11, pages 2170–2178; October 2013.
  • A Shot at Quitting. Carrie Arnold in Scientific American Mind, Vol. 26, No. 1, pages 43–47; January/February 2015.