Social Media and the Mind

Posted in Jayne's blog

How do hours of Facebook and constant streams of WhatsApp, tweets and text messages affect our cognition and mental health? Scientists are beginning to find out….

Green with Facebook Envy

Spending a lot of time on Facebook is linked to diminished well-being, according to many studies. Yet questions linger about cause and effect—perhaps people who are already lonely simply spend more time on social media. New studies reveal that Facebook can indeed affect mood and mental state, and whether the effect is positive or negative depends heavily on how a person interacts with his or her contacts. Several of the new findings reveal that when Facebook hurts, the underlying culprit is—you guessed it—envy.

A study published in February 2015 in Computers in Human Behavior surveyed 736 college students and found that when Facebook evoked envy, it increased symptoms of depression. But a March 2015 study from the same journal found that Facebook use can actually decrease depression if users sign on seeking social connection and support and then feel they have received it.

Those studies did not attempt to figure out why some people experienced envy and others did not, but other studies have found that the way a user interacts with Facebook may be crucial. For example, researchers at the University of Michigan and KU Leuven in Belgium tracked 173 students’ habits over time and found that passive use—browsing news feeds, for example—led to reduced well-being by increasing feelings of envy. Active use, such as posting and commenting, had no such effect, according to the two studies, published in April 2015 in the Journal of Experimental Psychology: General.

Another important factor seems to be how close you are to the people with whom you interact. Two related experiments published in November 2015 in Computers in Human Behavior were the first to explore the role of relationship strength in users’ emotional responses to posts on the site. Among a sample of 207 American adults and 194 German college students, the researchers found that people more often felt positive emotions than negative ones when browsing the site, and their emotions were amplified when reading posts from someone they knew well.

Empathy is apparently more pronounced when the relationship is closer, so one is more likely to ‘catch’ the happiness of a close friend than a casual acquaintance. Close friends can inspire envy, too, but the researchers found that this type of envy tended to be benign—the overall reaction to a friend’s good news was usually positive.

The takeaway, the experts say, is that you can control how Facebook makes you feel. If you tend to compare yourself with others or get envious easily, you might consider limiting your time spent on social-networking sites or make a conscious effort to use them in active rather than passive ways. It is not technology such as Facebook that affects our feelings per se but rather how we use it.

Waiting for that Email

There’s nothing like firing off a carefully crafted email and then waiting for what seems like an eternity for a reply. When you finally do get an answer, you might still be frustrated. What do you make of the fact that it is only 10 words long?

We now have some clues about typical email response patterns, thanks to a recent study drawing on 16 billion emails sent by more than two million people. The participants were Yahoo Mail users who allowed their anonymised data to be used in what appears to be the largest-ever analysis of email behaviour. The researchers, based at the University of Southern California and Yahoo Labs, used algorithms to mine data about the times messages were sent and the number of words they contained, among other factors. Here are some of the surprising revelations:

■ The most likely length of a reply is just five words.

■ More than 90 percent of replies are sent within a day.

■ The younger you are, the faster and shorter your reply.

■ Messages sent on weekday mornings get the fastest responses.

■ Emails with attachments take twice as long to get a reply as those without.

The researchers included only users who wrote to one another at least five times in the months covered by the study period. After mining the data, the researchers found they could use their algorithm to predict when an email conversation was nearing its end. For the first half of a dialogue, correspondents usually developed similar reply times and email lengths, lobbing messages back and forth at a regular clip. Yet that similarity decreased as the conversation trailed off. Many conversations ended with a long lag before one correspondent sent a final brief reply.

Of note to the anxious emailer: the more words in a reply, the longer it tended to take the writer to send it—but only up to 100 words. Beyond that, the time for a reply actually dropped slightly, except in the oldest age group. So if you’re expecting a hefty reply to your mission-critical missive, it won’t necessarily take any longer than a 100-word message. That may be some comfort while you wait on the edge of your seat.

How the younger sexes text

Texting has become the most popular form of communication among people under 30. One recent study found that students spend less than six minutes, on average, on schoolwork before being distracted by social media and texting. For a small percentage of teens, texting becomes compulsive—they may try to text less and fail, or feel anxiety and frustration if they are kept away from texting. A new study from the American Psychological Association evaluated how 211 girls and 192 boys communicated via text and found notable gender differences in overall behaviour and compulsive use:

■ Teenage girls use texting for social connection, whereas boys mostly use it to convey information.

■ Boys and girls send about the same number of texts every day, but girls are more likely to become compulsive texters.

■ Teenage girls who compulsively text see a steeper decline in their grades than their compulsive male counterparts. The researchers suggest the social content of girls’ messages may be more likely to distract them from their academic tasks.

■ Compulsive texting also appears to affect girls’ mental health more than boys’, perhaps because girls are more prone to text about negative feelings and to ruminate on those feelings.

Does Size Matter?

Posted in Jayne's blog


Now that I have your attention, I may be about to disappoint you.

We are going to consider whether the size of Woody Allen's second favourite organ—the brain—really matters.

Does a bigger brain necessarily make you smarter or wiser?


Adjectives such as “highbrow” and “lowbrow” have their origin in the belief, much expounded by 19th-century brain researchers, of a close correspondence between a high forehead—that is, a big brain—and intelligence. Is this true?


Bigger is slightly better

The human brain continues to grow until it reaches its peak size in the third to fourth decade of life. An MRI study of 46 adults of mainly European descent found that the average male brain had a volume of 1,274 cm3 and the average female brain measured 1,131 cm3. There is, of course, considerable variability: brain volume ranged from 1,053 to 1,499 cm3 in men and from 975 to 1,398 cm3 in women. Because brain matter is only slightly denser than water plus some salts, the average male brain weighs about 1,325 grams, close to the proverbial three pounds often cited in American texts.
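As a quick sanity check on those figures, mass is just volume times density; a tissue density of about 1.04 g/cm3 (an assumed approximation, slightly above that of water) reproduces the quoted weight:

```python
# Sanity check: brain mass = volume x density.
# The density of 1.04 g/cm3 is an assumed approximation for brain tissue,
# slightly above that of pure water (1.00 g/cm3).
TISSUE_DENSITY = 1.04  # grams per cm3

male_volume_cm3 = 1274
female_volume_cm3 = 1131

male_mass_g = male_volume_cm3 * TISSUE_DENSITY
female_mass_g = female_volume_cm3 * TISSUE_DENSITY

print(round(male_mass_g))    # 1325 g, close to the "three pounds" (~1,361 g)
print(round(female_mass_g))  # 1176 g
```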


Removing brains after their owners died revealed that Russian novelist Ivan Turgenev’s brain broke the two-kilogram barrier at 2,021 grams, whereas writer Anatole France’s brain barely registered half that weight, at 1,017 grams. (Note that postmortem measures are not directly comparable to data obtained from living brains.) In other words, gross brain size varies considerably across healthy adults.


What about cleverness? We all know from our day-to-day interactions that some people just don’t get it and take a long time to understand a new concept; others have great mental powers. Individuals differ in their ability to understand new ideas, to adapt to new environments, to learn from experience, to think abstractly, to plan and to reason. Psychologists have sought to capture these differences in mental capacity via a number of closely related concepts, such as general intelligence (g, or general cognitive ability) and fluid and crystallized intelligence. These differences in people’s ability to figure things out on the spot and to retain and apply past insights to current circumstances are assessed by psychometric intelligence tests. The measurements are reliable, in that different tests strongly correlate with one another, and they are stable across decades: scores such as the intelligence quotient (IQ) obtained from the same subjects nearly 70 years apart remain strongly correlated.


Differences in general intelligence, assessed in this way, correlate with success in life, with social mobility and job performance, with health and with life span. In a study of one million Swedish men, an increase in IQ by one standard deviation, a measure of variability, was associated with an amazing 32 percent reduction in mortality. Smarter people do better in life. Whereas a high IQ may not predispose people to be happy or to understand the finer points of personal relationships, the highly intelligent are more likely to be found among hedge-fund managers than among supermarket checkout clerks.


What about a numerical relation between brain size and intelligence? Such correlations were difficult to establish in the past, when only pathologists had access to skulls and their contents. With structural MRI of brain anatomy, such measurements are now routine. In healthy volunteers, total brain volume correlates weakly with intelligence, with a correlation value between 0.3 and 0.4 out of a possible 1.0. In other words, brain size accounts for between 9 and 16 percent of the overall variability in general intelligence. Functional scans, used to look for brain areas linked to particular mental activities, reveal that the parietal, temporal and frontal regions of the cortex, along with the thickness of these regions, correlate with intelligence—but, again, only modestly so. Thus, on average, a bigger brain is associated with somewhat higher intelligence. Whether a big brain causes high intelligence or, more likely, both are caused by other factors remains unknown.
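The jump from a correlation of 0.3 to 0.4 to "9 to 16 percent of variability" is simply the square of the correlation coefficient (r squared); a minimal sketch:

```python
# Variance explained by a predictor is the square of its correlation (r^2).
for r in (0.3, 0.4):
    print(f"r = {r}: variance explained = {r ** 2:.0%}")
# r = 0.3: variance explained = 9%
# r = 0.4: variance explained = 16%
```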


Recent experiments take into account the particular connections among neurons in certain regions of an individual’s brain, much like a neural fingerprint. They do better at predicting fluid intelligence (the capacity to solve problems in novel situations, to find and match patterns, to reason independently of specific domains of knowledge), explaining about 25 percent of the variance in this measure from one person to the next.


Our ignorance when it comes to how intelligence arises from the brain is accentuated by several further observations. As alluded to earlier, the adult male’s brain is 150 grams heavier than the female’s organ. In the neocortex, the part of the forebrain responsible for perception, memory, language and reasoning, this disparity translates to 23 billion neurons for men versus 19 billion for women. As no difference exists in the average IQ between the two genders, why is there a difference in the basic number of switching elements?


It is also well established that the cranial capacity of Homo neanderthalensis, the proverbial caveman, was 150 to 200 cm3 bigger than that of modern humans. Yet despite their larger brains, Neanderthals became extinct between 35,000 and 40,000 years ago, when Homo sapiens shared their European environment. What’s the point of having a big brain if your small-brained cousins outcompete you?


Brain size across species

Our lack of understanding of the multiplicity of causes that contribute to intelligence becomes even more apparent when looking outside the genus Homo. Many animals are capable of sophisticated behaviours, including sensory discrimination, learning, decision making, planning and highly adaptive social behaviours.


Consider honeybees. They can recognise faces, communicate the location and quality of food sources to their sisters via the waggle dance, and navigate complex mazes with the help of cues they store in short-term memory. And a scent blown into a hive can trigger a return to the site where the bees previously encountered this odour, a type of associative memory that guides them back and that was made famous by Marcel Proust in his Remembrance of Things Past (À la Recherche du Temps Perdu). The insect does all of this with fewer than one million neurons that weigh around one thousandth of a gram, less than one millionth the size of the human brain. Yet are we really a million times smarter? Certainly not if I look at how well we govern ourselves.


The prevailing rule of thumb holds that the bigger the animal, the bigger its brain. After all, a bigger creature has more skin to innervate and more muscles to control, and so requires a larger brain to service its body. Thus, it makes sense to control for overall size when studying brain magnitude. By this measure, humans have a brain-to-body mass ratio of about 2 percent. What about the big mammals—elephants, dolphins and whales? Their brains far outweigh those of puny humans, reaching up to 10 kilograms in some whales. But given their body mass, ranging from 7,000 kg (for male African elephants) up to 180,000 kg (for the blue whale), their brain-to-body ratio is under a tenth of a percent. Our brains are far bigger relative to our size than those of these creatures. There is no room for smugness, though: we are outclassed by shrews, molelike mammals whose brains take up about 10 percent of their entire body mass. Even some birds beat us on this measure. Hmm.
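The ratios above follow from simple division. In this sketch, the whale masses come from the text, while the human figures (roughly a 1.3 kg brain in a 65 kg body) and the shrew figures are illustrative assumptions:

```python
def brain_to_body_percent(brain_kg: float, body_kg: float) -> float:
    """Brain mass as a percentage of total body mass."""
    return 100 * brain_kg / body_kg

# Human values (~1.3 kg brain, ~65 kg body) are assumed round numbers.
print(f"{brain_to_body_percent(1.3, 65):.1f}%")        # 2.0%
# Blue whale: 10 kg brain, 180,000 kg body (figures from the text).
print(f"{brain_to_body_percent(10, 180_000):.4f}%")    # 0.0056%  (under 0.1%)
# Shrew-like mammal: ~1 g brain in a ~10 g body (illustrative).
print(f"{brain_to_body_percent(0.001, 0.010):.0f}%")   # 10%
```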


One small consolation is an invention of neuroanatomists called the encephalization quotient (EQ). It is the ratio of the mass of the brain of the species under investigation relative to a standard brain belonging to the same taxonomic group. Thus, if we consider all mammals and compare them against the cat as a reference animal (which therefore has an EQ of 1), people come out on top with an EQ of 7.5. Stated differently, the human brain is 7.5 times bigger than the brain of a typical mammal weighing as much as we do. Apes and monkeys come in at or below five, as do dolphins and other cetaceans. We finally made it to the top, validating our ineradicable belief in humanity’s exceptionalism.
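The text does not say which reference formula produced these EQ values, but a common operationalisation is Jerison's allometric expectation for mammals, expected brain mass ≈ 0.12 × (body mass)^(2/3) with masses in grams. Taking that formula and a round 65 kg human body mass as assumptions, the human number comes out in the right ballpark:

```python
# Jerison's allometric formula for expected mammalian brain mass (grams).
# Using it here is an assumption; the article does not name the reference
# formula behind its EQ figures, so treat this as a rough illustration.
def expected_brain_mass_g(body_mass_g: float) -> float:
    return 0.12 * body_mass_g ** (2 / 3)

def eq(brain_g: float, body_g: float) -> float:
    """Encephalization quotient: actual over expected brain mass."""
    return brain_g / expected_brain_mass_g(body_g)

# Human: ~1,325 g brain, ~65 kg body (body mass is an assumed round number).
print(round(eq(1325, 65_000), 1))  # about 6.8, same ballpark as the 7.5 above
```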


Yet it is not quite clear what all this means in terms of the cellular constituents of brains. Neuroscientists always assumed that humans have more nerve cells where it counts, in the neocortex, than any other species on the planet, no matter the size of their brain.


A 2014 study of 10 long-finned pilot whales from the Faroe Islands plays havoc with this hypothesis. Caught as part of a local hunt in the cold waters of the North Atlantic between Scotland and Iceland, these graceful mammals—also known as blackfish—are actually dolphins. The number of nerve cells making up their highly convoluted neocortex was estimated from a few sample slices and then extrapolated to the entire structure. The total came to an astonishing 37.2 billion neurons—astonishing because it implies that the long-finned pilot whale has about twice as many neocortical neurons as humans do!


If what matters for cognitive performance is the number of neocortical neurons, these dolphins should be smarter than all other extant creatures, including us. Whereas the highly playful and social dolphins exhibit a variety of skills, including the ability to recognize themselves in a mirror, they do not possess language or any readily discernible powers of abstraction that stand out from those of other nonhuman animals. So what gives? Is the complexity of the nerve cells themselves substantially less than cells found in people, or is the way these neurons communicate or learn less sophisticated? We don’t know.


People forever ask for the single thing that distinguishes humans from all other animals, on the supposition that this one magical property would explain our evolutionary success—the reason we can build vast cities, put people on the moon, write Anna Karenina and compose the Eroica. For a while it was assumed that the secret ingredient in the human brain could be a particular type of neuron: so-called spindle, or von Economo, neurons, named after Baron Constantin von Economo (1876–1931).


But we now know that not only great apes but also whales, dolphins and elephants have these neurons in their frontal cortex. So it’s not brain size, relative brain size or absolute number of neurons that distinguishes us. Perhaps our wiring has become more streamlined, our metabolism more efficient, our synapses more sophisticated.


As Charles Darwin surmised, it is very likely a combination of a great many different factors that jointly, over the gradual course of evolution, made us distinct from other species. We are unique, but so is every other species, each in its own way.


References:

The Evolution of the Brain, the Human Nature of Cortical Circuits, and Intellectual Creativity. Javier DeFelipe in Frontiers in Neuroanatomy, Vol. 5, Article No. 29. Published online May 16, 2011.

Quantitative Relationships in Delphinid Neocortex. Heidi S. Mortensen et al. in Frontiers in Neuroanatomy, Vol. 8, Article No. 132. Published online November 26, 2014.

Do NOT diet!

Posted in Jayne's blog


At any given time, at least one in five adults reports being on a diet, but the majority don’t keep the weight off. A huge amount of scientific evidence tells us that dieting does not promote lasting weight loss. In fact, many dieters end up gaining back more weight after they quit.

When I say “diet,” I am referring to eating regimens that require cutting portions, severely restricting calories, or eliminating entire food groups: carbs, fats, sweets, whatever. Despite such deprivations, diets remain alluring because they offer a clear and quick prescription dictating what you should and should not eat. These tactics are meant to tame erratic eating behaviours and revise poor food choices. But the truth is that such strategies hardly ever work because they are too extreme and thus almost impossible to maintain over the long term.

Trawling through many papers and articles about losing weight, it seems that the best advice from psychologists and researchers is: Do not diet. Do not cut out groups of foods or count calories. Do not try to eat very little or deprive yourself. Such strategies backfire because of psychological effects every dieter is all too familiar with: intense cravings for foods you have eliminated, bingeing on junk food after falling off the wagon, an intense preoccupation with food. A growing body of research shows why these tendencies undermine most people’s diet efforts and confirms that the way around these pitfalls is moderation. Making small changes to your eating patterns, ones you can build on slowly over time, is truly the best pathway to lasting weight loss. Although you may have heard this message of moderation before, the evidence is finally too overwhelming to ignore.

Effective weight management is particularly important when we consider (most of the studies have been conducted in the States) that two thirds of Americans older than 20 are overweight or obese. With the rise in obesity rates and related health problems, such as diabetes and heart disease—both of which are leading causes of death in the U.S.—it has become even more critical for us all to approach weight loss armed with a keen understanding of what really works and what doesn’t. Let’s start with what doesn’t.

Why Typical Diets Fail

  1. The “what the hell effect”

Studies have consistently revealed that dieting usually leads to weight gain, not weight loss. In a 2013 review published online in Frontiers in Psychology, investigators reported that 15 of 20 studies showed that dieting predicted weight gain in adolescents and adults of normal weight.

One problem with diets is that once you give in to temptation after restricting yourself, you are more likely to binge. This tendency, which psychologists dub the “what the hell effect,” undermines attempts to lose weight. A 2010 study by psychologists at the University of Toronto demonstrated this effect in people who believed they had broken their diet. In the study, 106 female students—some of whom were dieting and some of whom were not—all received identical slices of pizza. Some of the students saw a person carrying another slice that was either bigger or smaller than the one they got; others did not see another slice. After they ate the pizza, the participants were asked to taste-test a range of cookies. Women who weren’t dieting, dieters who thought they had eaten a smaller than usual slice, and dieters who didn’t see a comparison slice all ate only a few cookies. But dieters who thought they had violated their diet by eating a bigger slice ate more cookies than everyone else.

The researchers suggest that these women believed they had already blown their diet—so, what the hell, they might as well pig out on cookies. This study and many others like it confirm that violating your diet, or even just thinking you have, is enough to make you abandon self-control.


  2. Ironic processing

Some diets promise you’ll avoid feelings of deprivation by letting you eat as much as you want of certain food groups while totally eliminating others. The trouble is that when you eliminate your favourite foods—a requirement of most weight-loss regimens—you develop a deeper longing for them. Vow to avoid pasta, and you will soon find yourself dreaming about the plate of spaghetti bolognese or a vegetarian lasagne.

Food preoccupation is an inevitable result of dieting. Psychologists call this phenomenon “ironic processing”—suppressing a thought makes it more salient. It became famous when the late social psychologist Daniel Wegner did a series of experiments—the white bear studies—in which he asked subjects to avoid all thoughts of a white bear. Guess what creature relentlessly prowled through their minds…

Many studies over the years have shown that people who try to eliminate food groups end up craving those foods intensely. One published last year confirms that finding and adds to mounting evidence that not only do people crave the forbidden food, they eat more of it when they get a chance. The study compared eating patterns in 23 normal-weight nondieters who restricted their intake of palatable foods, such as doughnuts and ice cream (again, American studies….), and 23 similar people who merely recorded their snack intake. The researchers found that participants who restricted themselves reported craving and eating more treats, whereas those who simply monitored their snacks did not. This growing line of research suggests that for most, eliminating foods entirely will backfire badly.

In fact, treating yourself to indulgences may help you avoid the pitfalls of craving and overeating forbidden foods. In a 2012 study, 144 obese men and women were put on a strict, low-calorie diet for 16 weeks. About half ate a regular breakfast—300 calories—and the rest consumed a larger breakfast—600 calories—which included something sweet, such as a doughnut or chocolate (and ate less at dinner to make up for it). In the second half of the study, participants tried to maintain their meal plans on their own for 16 more weeks. The participants kept food diaries and continued to receive counselling from a dietician.

After the initial 16 weeks of close monitoring, the small breakfast group had lost a few more pounds than the large breakfast group (33 versus 30 pounds). But in the self-maintenance 16-week period, the small breakfast group regained 25 pounds, whereas the large breakfast group continued to shrink, dropping 15 additional pounds. Notably the small breakfast group reported increased cravings for sweets, fats and fast foods at the end of the study, whereas the large breakfast group reported reduced cravings in each category. Although eating dessert for breakfast is not necessarily the fastest or healthiest route to weight loss, these findings demonstrate that it is possible to have your cake and lose weight, too.

  3. Mental fatigue

Although efforts to change your eating behaviours require attention and record keeping, especially at the beginning, focusing too much energy on what you eat reduces your ability to do other, potentially more important things. Studies that examine the mental energy available to dieters versus nondieters consistently reveal that dieters have more difficulty learning new information, solving problems and exerting self-control.

Overthinking your food choices may also have negative consequences for your mental health. A 2010 study in Appetite looked at the mental toll of eating chocolate among dieters and nondieters. The nondieters were not particularly distracted by this indulgence, but the dieters could no longer think clearly, becoming consumed with thoughts such as “Why did I eat that?” and “What should I eat later today to make up for eating that?”

Another experiment published in 2010 found that women who restricted their caloric intake and recorded what they ate exhibited elevated cortisol levels, a marker of biological stress. Even women who simply monitored their meals without trying to restrict calories reported feeling more stressed, and they ended up gaining weight. The bottom line is, for most people, that diets not only backfire, they also take a heavy toll on our physical and mental well-being.

What You Should Do

  1. Start with your head

If you want to improve your body, you must also improve your mind-set. Decades of research show that individuals who are dissatisfied with their bodies are less successful at losing weight. Studies also show that it is possible for anyone to learn to feel good about his or her body.

In a 2014 study, women with eating disorders, including some who binged or who were overweight, received compassion-focused therapy—an approach aimed at reducing feelings of shame and improving self-esteem. Over the 12-week treatment, women who exhibited greater improvements in self-compassion and reductions in body shame were also more likely to develop better eating habits.

One simple way to improve your self-esteem, according to many findings, is to write positive affirmations on a regular basis. Happiness research has consistently shown that focusing on what you do like—“I have nice eyes”—and on health rather than appearance-related goals—“I want to run a 5K this year”—can help you develop a healthier mind-set and self-image.

  2. Simple, slow and steady

When setting a weight-loss goal, it is natural to want to accomplish it immediately. Yesterday, please, rather than tomorrow! But to maintain a more svelte figure, you need to make gradual, sustainable changes to your diet: for example, drinking less alcohol and juice, swapping diet drinks for (sparkling) water, and eating dessert on four nights a week instead of seven. Making even small changes such as these may sound like a “diet,” which I have just told you to avoid, but it is not, for one important reason: this slow, steady approach allows you to adjust to a new routine at your own pace, without the intense effort and denial that typical diet plans require.

Apparently most people trying to lose between 2 and 20 kg will benefit from this slow-to-moderate approach to weight loss, but it is important to note that individuals whose health is at serious risk because of obesity will likely need more drastic measures and should consult a doctor.


A large body of research supports the idea that making simple, gradual changes to your eating patterns is the best way to promote lasting weight loss. Robust evidence for this approach comes from a 2008 study, which demonstrated that overweight and obese adults who made very modest changes to their daily calorie intake and physical activity levels lost four times more weight than those following regimens that involved more extreme calorie restriction. The moderate group lost 10 pounds in one month, and they sustained the weight loss over the next three months.

In support of this approach, a 2015 study published in PLOS ONE found that women who successfully modified their diet and exercise habits over time set small, achievable behaviour change goals, had realistic expectations about their weight loss and were internally motivated to lose weight. The women who relapsed or failed to change their habits tended to have unrealistic expectations, lower motivation and self-confidence, and less satisfaction with their progress.

Some of the most compelling data on effective weight-loss strategies comes from the National Weight Control Registry (NWCR), which surveyed more than 4,000 people who lost at least 14 kg and kept it off for at least a year. The best tactics, according to the seminal 2006 report, included self-monitoring, such as limiting certain foods, keeping track of portion sizes and calories, planning meals and incorporating exercise into the daily routine.

Such advice may appear to conflict with the research I described earlier on the pitfalls of restriction and mental fatigue, but the truth seems to be (based on the research) that, to lose weight, it is important to find the right balance. For instance, before making changes to your diet, you need to understand your current eating patterns, which may require considerable thought and attention. Most overweight individuals, when they are not trying to diet, tend to eat erratically— consuming junk food, snacking a lot and indulging cravings on a whim. Becoming aware of these habits, the good and bad, will allow you to tailor them.

As you begin to make small tweaks to your daily eating, start to plan a few meals you like that you can cycle through on a regular basis, so you don’t have to think too hard about what you’re going to eat every day. According to the NWCR data, people who plan their meals are 1.5 times more likely to maintain their weight loss. The NWCR data also show that limiting the variety of foods you eat can help you sustain your weight. You don’t have to eat the same foods every day, but generally reducing the array of options makes grocery shopping less stressful.

  3. Work it out

We all know by now that exercise is essential for all-round health. Yet study after study shows that working out is not terribly effective for weight loss on its own. When combined with better eating habits, however, exercise appears to help people slim down. A 2012 study looked at the effects of diet, exercise, both or neither in a group of overweight or obese post-menopausal women. Dieters could consume between 1,200 and 2,000 calories a day, depending on their initial weight, and exercisers ramped up to 45 minutes or more of cardio five days a week. After 12 months, those in the combined diet-and-exercise group had lost the most weight—about 19.5 pounds (8.9 kg)—although the diet-only group was not far behind, losing 15.8 pounds (7.2 kg). Those who only exercised lost 4.4 pounds (2 kg), and the control group, who didn’t exercise or eat differently, lost 1.5 pounds (0.7 kg) over the year.

Once your goal weight is achieved, exercise may be crucial for keeping the scale steady. Most people who have slimmed down report that routine physical activity is an important part of their maintenance regimen. Exercise has many physiological benefits; it even appears to moderate the brain’s reaction to pleasurable foods. In a small 2012 study, overweight or obese participants underwent an initial brain scan while looking at images of food. Then they were put on a six-month exercise regimen. At six months, the exercisers showed decreased activity in the insula, which regulates emotions, in response to images of palatable treats. They did not, however, report changes in dietary restraint, food cravings or hunger, suggesting that the neural effects are subtle—perhaps helpful during weight maintenance but not strong enough to induce weight loss.

Incorporating exercise into your life should be a gradual process. You don’t have to run marathons to reap psychological and physical rewards. Going for a lunchtime walk or biking to work is a way to integrate activity into your daily routine. You can also increase your movement in small ways by taking the stairs instead of the lift or washing your car instead of driving through the car wash. Being disciplined is important, but making exercise fun and sustainable is also essential.

  1. Don’t do it alone

Receiving social support is key to losing weight. Consulting a doctor or nutritionist is one way to elicit support and provide greater accountability.

Research also demonstrates the role romantic partners play in encouraging weight loss. Men, for example, are better able to adopt and stick with healthier eating habits when they receive support and encouragement from their spouse. Similarly, friends, co-workers and online weight-loss buddies can keep you on track by offering inspiration, praise and companionship. More structured help has been shown to be useful, too, such as joining Weight Watchers or another support group, or participating in the community of users of smartphone apps such as MyFitnessPal and Lose It!

After decades of diet studies, we can no longer ignore the fact that the majority of evidence points toward these small, sustainable steps as the best way to lose weight. That message may not be as sexy or exciting as the latest fad diet, but the science is clear: moderation leads to changes that will last for the rest of your life. Creating good habits takes time, patience and resolve, and you will inevitably encounter setbacks along the way. But the key is to never give up—and in a few short months, you may find yourself on the road to the body and active way of life you’ve always dreamed about.

References:

Dietary and Physical Activity Behaviors among Adults Successful at Weight Loss Maintenance. Judy Kruger, Heidi Michels Blanck and Cathleen Gillespie in International Journal of Behavioral Nutrition and Physical Activity, Vol. 3, Article No. 17. Published online July 19, 2006.

Dieting and Restrained Eating as Prospective Predictors of Weight Gain. Michael R. Lowe, Sapna D. Doshi, Shawn N. Katterman and Emily H. Feig in Frontiers in Psychology, Vol. 4, Article No. 577. Published online September 2, 2013.

Smart People Don’t Diet: How the Latest Science Can Help You Lose Weight Permanently. Charlotte N. Markey. Da Capo/Lifelong Books, 2014.

From Our Archives

Out of Synch – How Our Digital Lifestyles Are Upsetting Our Body’s Natural Rhythms

Posted in Jayne's blog

DOWNLOAD THE ENTIRE DECEMBER 2015 NEWSLETTER including this month’s Freebie.

Are you one of those people who fall asleep minutes after their head hits the pillow and awaken cheery and refreshed when the sunlight filters through the window?

If you are, then count your blessings! Your reliable inner clock may also deserve some credit for other aspects of your health: good blood pressure, metabolism, digestion and more.

Millions of people across the world—including nurses, firemen, airline crews, truck drivers and factory workers—have irregular work schedules that may cause a disconnect from the basic time-based patterns of daily life. Our internal organs operate in patterns called circadian rhythms that repeat over the course of each 24-hour day. And research is revealing that when these physiological rhythms are out of sync—a state known as circadian misalignment—the health impacts can be vast, from diabetes and obesity to cancer, heart problems, infertility, mood disorders and mental decline. “Your body is optimised to work with a certain relationship to the natural world. Good health follows from that,” explains Martha Gillette, a neuroscientist and circadian expert at the University of Illinois at Urbana-Champaign. “In modern life, we’ve taken the world and done with it what we wish.”

Because modern routines clash with natural rhythms, scientists are beginning to suspect that virtually everyone is affected to some degree. Staying up late to work or have fun, using laptops, mobiles and other screens before bed or to quell insomnia in the middle of the night, indulging in midnight snacks—all these apparently innocuous activities can subtly throw the body off-kilter. The body clock is an ancient system, common to all life on earth, that relies on sunlight and darkness, periods of activity and periods of rest to calibrate itself. Today’s society, with its electric lights, 24-hour convenience stores, proliferating digital devices, global economy and “always on” mentality, has scrambled our inner timing systems.

In short, we are living in an age of circadian dysfunction.

Anyone who has flown across time zones knows what it feels like to have a body clock that is out of whack—fatigue, insomnia, digestive problems, headache, dizziness, nausea, among other symptoms. Jet lag is a classic example of circadian misalignment. The body typically adjusts within a week or so. But we are increasingly subjecting ourselves to the equivalent of permanent jet lag.

The science is so new that no one knows how many of us are affected, but people may experience mild circadian misalignment in a variety of ways without realising the root cause. It could present as stomach upset, unexplained insomnia or, more ominously, the shifts in blood pressure, inflammatory markers, insulin resistance and other metrics that signal the implacable onset of heart disease, diabetes or cancer. Happily, research reveals inexpensive and straightforward solutions that will allow most people to reset their inner clock.

Timing Is Everything

Almost every living thing, from cyanobacteria to lemurs, is attuned to the earth’s daily rotation. Evolution has smiled on creatures that capitalise on the planet’s day-night schedule, matching their internal workings to the shifting conditions of the outside world.

These are the fluctuations known as circadian rhythms (the word “circadian” comes from the Latin for “about a day”). In many animals they dictate the timing of hibernation, courtship and reproduction. Even in plants, circadian rhythms are crucial to survival. In June scientists at the University of Washington found that it is thanks to a circadian gene that the common garden petunia waits until night to release its fragrance, which attracts nocturnal pollinators.

Circadian rhythms also create the ebb and flow of human physiology. They explain why fevers run highest at night, why a late meal can make it hard to sleep, why teenagers are late risers and many other familiar aspects of daily life. And they are grounded in the daily planetary shift between light and darkness.

To align the body with what’s going on in the outside world, the suprachiasmatic nucleus, which serves as the brain’s master clock and is located deep within the hypothalamus, constantly monitors the intensity of ambient light. Bright light in the morning sets the body clock for the day, and evening darkness nudges organs into their nighttime mode. For example, the drowsiness-inducing hormone melatonin flows, preparing the body for rest. The bladder expands to hold more urine, making it possible to sleep through the night. And the liver makes extra glucose to keep the brain nourished throughout the overnight fast.

But if the master clock encounters bright light at night, it sends “start the day” messages at the time when organs are settling down for the evening. Circadian rhythms get scrambled. This can happen when flying across time zones (and explains why jet lag is worse when traveling east); when people use an iPad, cell phone or laptop at night (because digital screens emit the same blue wavelengths found in morning sunlight); and when people work the wee hours in a brightly lit space or fall asleep with the television on.

Scientists have been investigating circadian rhythms for decades, but until very recently they did not appreciate how critical these rhythms are to the regulation of nearly every bodily system. In the past decade or so, work on circadian rhythms and human health has exploded.

One of the discoveries: by banishing darkness, modern society has ushered in a host of potential health problems. We are so used to nighttime light exposure that when told it is unnatural, people often reply, “What? Light?” We don’t think of light exposure the way we think of a drug or a dietary intervention, but it really does have profound effects on our physiology.

An even newer revelation: mealtimes may also be critically important to keeping circadian rhythms in balance. Mounting evidence suggests that the body relies not only on light exposure but also on behavioural cues to orient itself in time— sleep, exercise, social interactions and, perhaps most significant, eating.

The latest research suggests that the body is designed to take in food during the day and fast at night. Breakfast, like sunlight, seems to serve as a timing cue, alerting the body clock that it is morning. So snacking long after dark may be as disruptive to natural rhythms as staying up late bathed in the illumination of a digital screen.

Off the Clock

Scientists are learning that there is a genetic basis to people’s natural sleep inclinations. About half the population is predisposed to be either early birds or night owls, and the other half fall somewhere in between. These inherited patterns are known as chronotypes. Extreme chronotypes are rare: delayed sleep phase syndrome, for example, affects three in 2,000 people.

Misalignment Made Flesh

Disconnecting from daily rhythms strikes the body at the most basic level: the cell. In 2014 a team led by geneticist John Hogenesch of the University of Pennsylvania made an astounding discovery: nearly half of all gene activity in mammals is timing-related. Previous estimates had been closer to 15 percent. This means the circadian clock could be influencing most, if not all, of our physiology and many of our behaviours.

Over the course of two days Hogenesch’s team removed 12 organs, including the heart, lungs and liver, from a different group of mice every two hours, then analysed the RNA from those tissue samples to figure out which genes were active in which organs at every hour of day and night. The team learned that organs do not chug along at a steady pace. Instead they are alternately active and quiescent, attending to certain tasks during the day and others at night, with “rush hours” of activity at dawn and dusk.

Another groundbreaking study, published a year earlier, detected the same telltale signs of rhythmic gene activity—in the brain. The work, conducted by the Pritzker Neuropsychiatric Disorders Research Consortium, involved 89 brains taken from people who had donated their bodies to science. Some of the donors had suffered from major depression, others had not. In the healthy brains, as in Hogenesch’s mice, hundreds of genes ramped up and slowed down at specific times of day, forming daily patterns so clear and predictable that they could be used to pinpoint time of death for an unmarked sample of brain tissue.

But the brains of depressed people were different. Their gene activity was haphazard and disorganised, lacking these daily patterns. Psychiatrists have long noticed that people with mood disorders tend to have sleep problems and other signs of circadian misalignment. Now here was physical proof that the circadian rhythms of depressed people are weak or nonexistent—circadian misalignment made flesh.

Flipping a Biological Switch

In some people the master clock gets broken. Their bodies adopt a ‘non-24’ sleep pattern with, for example, bedtime shifting an hour later each day.

Non-24 is a common side effect of blindness because damaged eyes do not transmit the necessary light signals to the master clock. But in the rare instances when non-24 affects sighted people, no one knows the cause.

The suprachiasmatic nucleus (the master clock) functions like an orchestra conductor, keeping time so that the individual rhythms of the heart, liver and other organs can coordinate—a bodily state known as entrainment. When the master clock stops working properly— whether because of a biological defect or because of frequent eating, working or socializing late into the night or at odd hours—internal organs begin operating at different tempos, like instrumentalists in a cacophonous orchestra with no maestro. Illness ensues.

Organs That Cannot Keep Time

Diabetes affects more than 29 million Americans, three times as many as a quarter of a century ago. Experts cite factors ranging from the ubiquity of cheap sugary drinks and snack foods to sedentary habits. But some scientists are starting to suspect that disrupted circadian rhythms may also underlie Americans’ mass metabolic dysfunction.

For years, observational studies have shown that people who work nighttime or rotating shifts are susceptible to much higher rates of obesity and diabetes. More recently, scientists have begun to artificially induce circadian misalignment, and here, too, one of the most dramatic changes they see is an increased predisposition to weight gain and metabolic problems. In 2009 Harvard scientists kept 10 healthy people in a lab, scrambling their mealtimes and sleep schedules while subjecting them to constant low light. As the participants’ inner timekeepers lost track of day and night, their blood pressure, body temperature and hormone production stopped following regular patterns. Most strikingly, levels of leptin, the hormone that alerts people that they have eaten their fill, decreased. People with low leptin levels tend to overeat. In addition, three participants became prediabetic, all in just 10 days’ time.

Experiments in animals are yielding equally dramatic results. Multiple labs are finding that when mice are kept in constant light or are forced to eat during their normal resting time, they gain weight—even when they consume the same number of calories. We are apparently not as good at metabolising our food when it’s not eaten at appropriate times of day.

Circadian disruption leads to cognitive as well as metabolic problems. Alertness and motor coordination decline markedly. Industrial accidents, for example, peak between two and four in the morning, precisely when people should not be doing anything that requires vigilance.

People whose jobs require them to work odd hours also have trouble making agile mental calculations. Emergency room doctors working the night shift showed short-term memory impairments in a 2012 study.

Animal experiments are confirming that the hippocampus, the part of the brain central to learning and memory, is highly sensitive to circadian disruption. For example, studies published in 2013 found that rats with the equivalent of jet lag have trouble remembering what they have learned. Rats with longer-term circadian disruption, the kind that afflicts shift workers, have difficulty learning new tasks as well as recalling them.

Practically every month a new study spotlights circadian misalignment in some other ill. In a study published in April scientists at the University of Warwick examined uterine lining cells from 70 women and found a higher frequency of circadian disruption in women who suffer multiple miscarriages—suggesting that misalignment of daily rhythms in the womb hampers the ability of the fertilised egg to implant. Pregnancy is all about timing—an able sperm meets a fertile egg just as it is making its way through the fallopian tube—but it turns out that timing also matters at the cellular level.

For unknown reasons, rhythms shift later during adolescence, then return to normal in young adulthood. Several recent studies suggest that the disconnect between high school start times and teens’ natural sleep needs compromises brain areas related to reward and self-control, making them more susceptible to getting hooked on drugs and alcohol. New studies also link circadian misalignment to greater risk of post-traumatic stress disorder, breast cancer and inflammatory bowel disease.

The Value of Repetition

Circadian rhythms are old-fashioned. They are conservative. They are your grandmother’s medicine. Go to bed at a reasonable hour. Eat a good breakfast. Do not push yourself too hard. Something in our modern spirit rebels against these strictures. We will stay up until 3 a.m. binge-watching films or a favourite series if we feel like it. We will fall in love with people in faraway places and use Skype and cell-phone apps to erase the time differences.

But the need for structure and daily repetition is woven into our DNA. Sunrise and sunset bookended our ancestors’ days. We evolved on a planet that has a roughly 24-hour day, and we are biologically prepared to function better if we are in a regular rhythm.

Circadian “Hygiene”

Melatonin supplements improve mood and memory in people with dementia, who suffer from disturbed sleep and other hallmarks of circadian dysfunction. Sitting near a device called a light box to get bright light in the morning is a boon for people with seasonal depression. And forward-thinking nursing home administrators are finding that when they provide varied illumination instead of keeping the lights on 24/7, elderly residents are less disoriented.

People with bipolar disorder are especially vulnerable to circadian disruption: pulling an all-nighter or travelling overseas can trigger an episode of mania or depression. Conversely, regularising routines can stabilise their moods. A recently developed therapy, interpersonal and social rhythm therapy, asks patients to record daily when they get out of bed, when they first interact with other people, when they begin their daily routine, and when they have dinner and go to bed—and then to tweak those times over a period of weeks to establish a schedule they can stick to. Keeping routines very regular, seven days a week with no shifts at weekends, has proved effective in two large trials.

Circadian rhythms naturally deteriorate with age—which may account for some of the sleep and memory problems of the elderly. But strengthening circadian rhythms may be a hedge against cognitive decline. In research, old hamsters with strong circadian systems outperformed misaligned younger animals on memory tasks.

Changing habits is not easy. But if more people understood the potential long-term benefits to their mood, sleep quality, cardiovascular health, weight-loss goals and mental sharpness, they might make the effort. Perhaps we should consider sleep and circadian hygiene just as important as washing our hands; it seems to be critical for good health and well-being.

There is a lesson here for us, with our overextended, brightly lit, Starbucks-fuelled lives. Modernity has made it possible to stretch beyond the confines of the 24-hour day, but in the process we have become untethered from the fundamental pulse of our planet. Science is revealing that we do so at our own risk.

REFERENCES:

Internal Time: Chronotypes, Social Jet Lag, and Why You’re So Tired. Till Roenneberg. Harvard University Press, 2012.

The Rhythms of Life: What Your Body Clock Means to You from Eye Disease to Jet Lag. Talk by Russell Foster. Physiological Society’s Annual Public Lecture, Birmingham, England, July 22, 2013. www.physoc.org/russell-foster-public-lecture

How to Fix a Broken Clock. Analyne M. Schroeder and Christopher S. Colwell in Trends in Pharmacological Sciences, Vol. 34, No. 11, pages 605–619; November 2013.

No, You Don’t: Essays from an Unstrange Mind. Sparrow Rose Jones. CreateSpace Independent Publishing Platform, 2013.

Missing Link Found Between the Brain and Immune System

Posted in Jayne's blog

DOWNLOAD THE ENTIRE NOVEMBER 2015 NEWSLETTER including this month’s freebie.

Textbooks have traditionally taught that when it comes to the immune system, the brain and body are separate entities. When exposed to foreign objects such as bacteria or transplant tissue, the body stirs up a torrent of immune activity: white blood cells devour invading pathogens and burst compromised cells; antibodies tag outsiders for destruction. Except, that is, in the brain, where the blood-brain barrier bars both foreign bodies and immune cells from entry. New research, however, has uncovered a previously unknown line of communication between our brain and immune system. The report, published in July in Nature, adds to a fast-growing body of research linking the brain and bodily defences.

As early as 1921, scientists recognised that the brain is different, immunologically speaking. Tissue grafted into the central nervous system sparks a far less hostile response than tissue grafted to other parts of the body, prompting scientists to consider the brain “immunologically privileged.” Experts have long pointed to the brain’s apparent lack of lymphatic drainage as one reason for this privilege. The lymphatic system is our body’s third set of vessels, along with arteries and veins. Lymph nodes—stationed periodically along the vessel network—serve as storehouses for immune cells. In most parts of the body, foreign invaders trigger the release of these cells through the vessels into the bloodstream.

The new study discovered that the brain is connected to the lymphatic system after all. Working primarily with mice, neuroscientist Jonathan Kipnis and his group identified a hitherto undetected network of lymphatic vessels in the meninges—the membranes that surround the brain and spinal cord—that shuttle fluid and immune cells from the cerebrospinal fluid to the deep cervical lymph nodes in the neck. Kipnis and his colleagues had previously shown that a type of white blood cell called a T cell in the meninges significantly influences cognition, and they were therefore curious about the role of meningeal immunity in brain function. Using neuroimaging of mouse meninges, the team noticed that T cells were present in vessels separate from arteries and veins.

The newly discovered vessels, which were also identified in human samples, could explain the long-standing conundrum of how the immune system manages to contribute to neurological and psychiatric disease. For example, some cases of multiple sclerosis are thought to result from autoimmune activity in response to an infection in the central nervous system and cerebrospinal fluid. Although Kipnis says that it is too early to speculate, he does think that alterations in these vessels may affect disease progression in neurological disorders with a prominent immune component, such as multiple sclerosis, autism and Alzheimer’s disease.

Some mental illnesses, including depression and schizophrenia, have also been linked with abnormal immune activity and inflammation. Yet scientists have not been able to uncover the underlying mechanism. The new finding suggests a tantalising target for research and, perhaps one day, medicines.

In light of the news, the textbooks might need some revising, since it is becoming increasingly clear that the central nervous system is immune-different rather than immune-privileged.

How Awe Makes Us Healthier and Less Selfish

Posted in Jayne's blog

DOWNLOAD THE ENTIRE OCTOBER 2015 NEWSLETTER including this month’s freebie.

“Awesome” has become a common descriptor, yet genuine awe is a profound emotion: the intake of breath at a starry night sky, a shiver down your spine during live music or a lump in your throat at the sight of a silent vast crowd holding candles aloft. Can this feeling make us better people? A recent paper in the Journal of Personality and Social Psychology suggests that it does.

Philosophers long ago suggested that awe binds people together. New research carried out in Professor Dacher Keltner’s lab at Berkeley shows that awe can make people less self-involved and more attuned to the needs of the larger group.

In the first of five studies, the researchers ascertained, through a representative national survey, that people who report feeling awe more often are, in fact, more generous. When given raffle tickets and offered the chance to donate some, those who frequently felt awe gave away more tickets.

Then the researchers conducted four other experiments in which they induced awe in some participants and other emotions such as pride or amusement in others. They evoked awe through videos of breathtaking natural scenes and by taking subjects outside to gaze upward at towering eucalyptus trees.

In every case, those who experienced awe behaved in what psychologists call a more “prosocial” way, being more helpful or making more ethical decisions. The participants who had gazed up at the trees, for example, picked up more pens that were “accidentally” dropped by an undercover researcher than other subjects outside who had gazed at a building.

By making us feel like a small part of something grander, awe shifts our attention from our own needs to those of the greater good. Some researchers have speculated that awe might have evolved as the response to a powerful leader. Maintaining social hierarchies and ensuring membership in a group can boost odds of survival.

One of the researchers suggests keeping an “awe diary” for two weeks, and every day soaking up whatever evokes the feeling—a sunset, a bird’s feathers. Shifting your focus toward something vast can put your problems in perspective and open you to the greater world. It turns out that awe might make us healthier too.

Negative emotions have been linked to poor health outcomes, such as heart disease and even a shorter life span. Research suggests inflammation may be responsible for this link, at least in part. The molecules involved in inflammation are essential for our body’s response to infection and injury, but high levels over the long term have been linked to everything from diabetes to depression.

Few studies have assessed the health effect of positive emotions, so a team led by Jennifer Stellar of the University of Toronto (who also began studying awe in Keltner’s lab at Berkeley) conducted two studies to investigate the link. In the first, 94 students completed a questionnaire to determine how often they had experienced various emotions during the past month. The scientists then took a saliva sample to assess levels of a molecule that promotes inflammation called interleukin-6 (IL-6). They found more positive emotion was associated with lower levels of IL-6.

In the second experiment, 105 students completed online questionnaires designed to assess their tendency to experience several specific positive emotions. They later visited the lab to provide saliva samples. Joy, contentment, pride and awe were all associated with lower levels of IL-6, but awe was the only emotion that significantly predicted levels using a strict statistical test.

These results do not establish whether awe actually causes changes in IL-6 levels. In fact, the authors caution that the relation probably operates in both directions: having a healthier, less stressful life may allow a person to experience more awe. They point out that awe is associated with curiosity and desire to explore, which they contrast with the social withdrawal that often accompanies illness or injury. We know positive emotions are important for well-being, but these initial findings suggest they’re also good for our body.

How Being a New Parent Changes Your Brain

Posted in Jayne's blog

DOWNLOAD THE ENTIRE SEPTEMBER 2015 NEWSLETTER including this month’s freebie.

The arrival of a child brings big changes in the brains of new mothers and fathers. Mothers experience a near-immediate shift, thanks in part to the hormones involved in giving birth and nursing. Fathers’ brains tend to change in different and more subtle ways.

Is ‘Baby Brain’ a Myth?

As many as four out of every five pregnant women say that they suffer from “pregnancy brain”—deficits in memory and cognitive ability that arise during pregnancy, making women more forgetful and slow-witted. Yet studies on the phenomenon have generally not supported these claims: although some have found evidence of problems on certain types of tasks, others, including a recent paper published by researchers in Utah, have found no signs of cognitive problems at all. Some experts believe that pregnancy brain and its postnatal cousin, “baby brain,” could largely be a product of confirmation bias: pregnant women and new mums expect to experience brain fog and therefore believe they are actually affected. Others argue that the mental symptoms might simply be too difficult to confirm in a laboratory setting.

In the most recent study, researchers at Brigham Young University gave cognitive and neuropsychological tests to 21 women in their third trimester of pregnancy and then tested them again six months after they gave birth. They administered the same tests at similar intervals to 21 women who had never been pregnant. They found no differences between the groups no matter when they were tested, including before and after giving birth. These findings mesh with those from a 2003 study, which found that pregnant women did not score differently from nonpregnant women on tests of verbal memory, divided attention and focused attention.

There is variety in the results, but overall most studies suggest there are few to no memory impairments associated with pregnancy. Researchers think the reason the myth persists may be that women selectively look for evidence that supports the cultural expectation. For example, when a pregnant woman loses her car keys, she might blame pregnancy brain—without recalling the times she lost her car keys before she was pregnant.

There is another possibility, too. In a 2011 study, a team at the University of British Columbia found that although pregnant women did not display any problems on cognitive tests given in a lab, they were less likely than nonpregnant women to remember to call the lab when asked and to return a questionnaire on time. It is possible that lab-based measures do not reveal differences because labs are typically quiet environments with minimal distractions, in contrast with everyday life.

Motherhood Can Be a Lonely Place

Entering motherhood is a rite of passage for most women. For many new mums, however, the first months and years can be a lonely place. A new study finds that several types of social support are crucial for staving off negative feelings.

Although only 10 to 15 percent of mothers in Western nations will develop a full-blown case of postpartum depression (PPD), many more will experience serious symptoms of depression. Depressive feelings lie on a continuum, with PPD at the far end. Yet although PPD can be diagnosed clinically, there is no standard for measuring where the remaining 85 to 90 percent of mothers land on the scale. Researchers estimate that most first-time mothers feel overwhelmed.

Becoming a mother is a major transition. New mothers give up autonomy, sleep and relationships to tend to the relentless needs of a baby. On top of that, they are also expected to be in a constant state of bliss and fulfillment with their new role. There can be a lot of pressure to be the perfect mother, and women can be afraid to say they are not coping.

Making matters worse, research that demonstrates the importance of early childhood experiences in determining future success and happiness puts additional pressure on mums to get it right. Also, for working mothers, who are used to a productive mindset and established social routines, it can be difficult to adapt to the repetitive life of meeting the basic daily needs of a baby. Many women appear to go back to work because of the loneliness.

According to a recent study from Patricia Leahy-Warren at University College Cork in Ireland published in the Journal of Clinical Nursing, mothers with strong social support who have confidence in their ability to parent were 75 percent less likely to be depressed than mothers who had neither advantage. There are four parts to social support: hands-on, emotional, informational and appraisal, meaning affirmation that a mother is doing a good job.

Mums require a network of people to meet these four types of social needs. Generally they lean most on their partner, then their own mother, then sisters. Health professionals, other family and friends can be an important part of a mother’s community. Good social support will also boost a mother’s confidence and ability to parent, Leahy-Warren says, which has a significant positive influence on her mental well-being.

Fathers’ Brain Shifts Are Different

Most investigations of brain changes have focused on mothers, but scientists have recently begun looking more closely at fathers. Neural circuits that support parental behaviours appear more robust in mums a few weeks after the baby is born, whereas in dads the growth can take several months.

A study in Social Neuroscience analysed 16 dads several weeks after their baby’s birth and again a few months later. At each check, the researchers administered a multiple-choice test to check for signs of depression and used MRI to image the brain. Compared with the earlier scans, MRI at three to four months postpartum showed growth in the hypothalamus, amygdala and other regions that regulate emotion, motivation and decision making. Furthermore, dads with more growth in these brain areas were less likely to show depressive symptoms.

Although some physiological brain changes are similar in new mums and dads, other changes seem different and could relate to the roles of each parent, as shown in the brain diagrams below.

A 2014 behavioral study of expectant fathers showed that midpregnancy ultrasound imaging was a “magic moment” in the dads’ emerging connection with their baby. Yet the emotional bond was different than it is in expectant moms. Instead of thinking about cuddling or feeding the baby, dads-to-be focused on the future: they imagined saving money for higher education or walking down the aisle at their daughter’s wedding.

It is interesting how little the dads’ mental images centred on the infant itself, rather than on the tasks we might assume they would focus on, such as putting the baby down for a sleep or changing nappies.

The Brain’s Homing Signal

Posted in Jayne's blog

DOWNLOAD THE ENTIRE AUGUST 2015 NEWSLETTER including this month’s freebie.

After wandering round an unfamiliar part of town, can you sense which direction to travel to get back to your car or the station? If so, you can thank your entorhinal cortex, a brain area recently identified as being responsible for our sense of direction. Variation in the signal in this area might even explain why some people are better navigators than others.

The new work adds to a growing understanding of how our brain knows where we are. Groundbreaking discoveries in this field won last year’s Nobel Prize in Physiology or Medicine for John O’Keefe, a neuroscientist at University College London, who discovered ‘place cells’ in the hippocampus, a brain region most associated with memory. These cells activate when we move into a specific location, so that groups of them form a map of the environment.

O’Keefe shared the prize with his former students Edvard Moser and May-Britt Moser, both now at the Kavli Institute for Systems Neuroscience in Norway, who discovered ‘grid cells’ in the entorhinal cortex, a region adjacent to the hippocampus. Grid cells have been called the brain’s GPS system. They are thought to tell us where we are relative to where we started.

A third type – head-direction cells, also found in the entorhinal region – fires when we face a certain direction. Together these specialised neurons appear to enable navigation, but precisely how is still unclear. For instance, in addition to knowing which direction we are facing, we need to know which direction to travel. Little was known about how or where such a goal-direction signal might be generated in the brain until the new study.

A team of researchers asked 16 volunteers to familiarise themselves with a virtual environment consisting of a square courtyard with a landscape (such as forest or a mountain) on each wall and a unique object in each corner. They then scanned the participants’ brains while showing them views from the environment and asking them to indicate in which direction different objects lay.

The entorhinal region displayed a distinct pattern of activity when volunteers faced each direction – consistent with how head-direction cells should behave. The researchers discovered, however, that the same pattern appeared whether the volunteers were facing a specific direction or just thinking about it. The finding suggests that the same mechanism that signals head direction also simulates goal direction. How, exactly, the brain switches back and forth is unclear, but the researchers think the brain probably signals which direction you are facing until you consciously decide to think about where you want to go, at which point the same cells run the simulation.

Interestingly, the more consistent the participants’ goal-direction signals were, the better they were able to recall in which direction the target objects lay, potentially offering a brain-based explanation for differences in navigational ability. Such results should be interpreted carefully, however. There are many ways worse performance can lead to weaker effects; for instance, if a participant’s attention lapses, he or she will not only perform worse but also produce less relevant brain activity.

This work may have clinical relevance. The ability to navigate is often an early casualty of dementias such as Alzheimer’s disease because the entorhinal region is one of the first areas to be affected. One group of neuroscientists is now working with doctors to develop tests to help identify deficits and potentially measure disease progression.

Advances in the Science of Intuition

Posted in Jayne's blog

DOWNLOAD ENTIRE JULY 2015 NEWSLETTER including this month’s freebie.

Sometimes a solution just appears out of nowhere. Intuition!

This is the name we give to the uncanny ability to know an answer quickly, effortlessly and unconsciously, either without knowing why or well before knowing why. The conscious explanation comes later, if at all, and involves a much more deliberate process.

Understanding computer code, deciphering a differential equation, diagnosing a tumour from the shadowy patterns on an x-ray image, telling a fake from an authentic painting, knowing when to hold and when to fold in poker: in all these domains, experts decide in a flash, without thought.

Intuition arises within a well-defined cognitive domain. It may take years of training to develop, and it does not easily transfer from one domain of expertise to another. Chess mastery is useless when playing bridge. Professionals, who may spend a lifetime honing their skills, are much in demand for their proficiency.

Let us consider a series of elegant experiments in functional brain imaging that finger one brain structure as being centrally involved in intuition. Shogi is a Japanese strategy game played on a nine-by-nine board, with two sets of 20 distinct pieces facing each other. It is much more complex than chess, given that captured pieces can be dropped into an empty position anywhere on the board at the discretion of the capturer. This rule multiplies the number of possible moves available at any point in the game and prevents the steady attrition of the two opposing armies that face off in a chess match.

Keiji Tanaka of the RIKEN Brain Science Institute outside Tokyo led a group of cognitive neuroscientists who studied the brains of shogi players, using functional MRI to search for the neural signatures of proficiency. First, subjects inside the scanner looked at drawings of shogi boards taken either from tournament games or from randomly shuffled board positions. They also looked at sketches that had nothing to do with shogi: games of chess and Chinese chess, as well as pictures of faces and houses.

In professional players, pictures of board positions taken from real shogi games activated a piece of cortex, the precuneus in the parietal lobe (located at the top of the brain toward the back), much more strongly than any of the other categories of pictures. That is, a region of their parietal cortex read out certain perceptual features associated with shogi games and distinguished them from random board positions. Experts see configurations of pieces, lines of control, a weakened defense or an imminent attack—patterns that amateurs do not notice.

In a second experiment, Tanaka and his group presented players with checkmate-like shogi puzzles while they lay in the scanner. Subjects had to find the next move that would lead, inexorably, to the capture of the king. They had to do this within one second, pushing them to rely on their intuition because there was no time to analyse the various moves, countermoves, counter-countermoves, and so on. When they controlled for confounding cognitive factors, the scientists found nothing activated in the cortex. They did, however, isolate a small region in the front of the caudate nucleus, under the cortex, that reliably and very distinctly turned on in professional shogi players. The caudate was less reliably and less prominently activated when amateur players tried to find the correct move. And when subjects had up to eight seconds to more deliberately search for the best solution, this subcortical region remained silent.

 

Special-Purpose Hardware

This elegant finding links intuition with the caudate nucleus, which is part of the basal ganglia—a set of interlinked brain areas responsible for learning, executing habits and automatic behaviours. The basal ganglia receive massive input from the cortex, the outer, rindlike surface of the brain. Ultimately these structures project back to the cortex, creating a series of cortical–basal ganglia loops. In one interpretation, the cortex is associated with conscious perception and the deliberate and conscious analysis of any given situation, novel or familiar, whereas the caudate nucleus is the site where highly specialised expertise resides that allows you to come up with an appropriate answer without conscious thought. In computer engineering parlance, a constantly used class of computations (namely those associated with playing a strategy game) is downloaded into special-purpose hardware, the caudate, to lighten the burden of the main processor, the cortex.

So far these experiments relate the task of generating shogi moves to brain activity. Of course, we are not allowed to infer causation from correlation. Just because two things are associated does not imply that one causes the other. As research progresses, the causal structure of intuition and brain activity could be probed by inhibiting or blocking the caudate nucleus to see whether doing so prevents the rapid generation of correct shogi moves. Regrettably there are no reliable technologies to turn bits of brain deep inside the skull on and off in a way conducive to the long-term health of the subject! ;=)

Instead Tanaka and his collaborators wondered whether novices who learn to play shogi wire up their caudate nucleus in a similar manner to that of experts. They recruited naive volunteers and subjected them to an intensive 15-week regime of daily play on a simplified computer version of the game. Motivated by prize money, the subjects improved over the approximately 100 days of training, during which they accumulated total practice time ranging from 37 to 107 hours.

Asking subjects in these experiments to quickly come up with the best next move led to increased cortical activity, but that activity did not change over the training period, nor did it correlate with the fraction of correct responses. In contrast, changes in blood flow in the front of the caudate nucleus evolved over the course of training in parallel with better performance. Furthermore, the strength of the caudate signal at the end of the training correlated with how much subjects improved over time. The more the subject learned, the larger the caudate signal.

It appears that the site of fast, automatic, unconscious cognitive operations—from where a solution materialises all of a sudden—lies in the basal ganglia, linked to, but apart from, the cortex. These studies begin to provide a telling hint of what happens when the brain brings the output of unconscious processing into awareness. My intuition tells me that the last word on this subject has not been written by a long way!

Can Infection Make You Depressed?

Posted in Jayne's blog

DOWNLOAD ENTIRE MAY 2015 NEWSLETTER including this month’s freebie.

Did you know that depression could be a symptom of something as simple as infection? Doctors have long viewed depression as a complex disorder. Stress, neurochemical imbalances, physical pain and ill health can all precipitate an emotional collapse. Yet a flurry of research during the past 25 years has linked many of depression’s contributing factors to a single root cause: inflammation.

In the short term, inflammation is an important part of our immune system’s built-in ability to thwart disease and injury. But when prolonged, it can prompt fatigue and melancholy. In addition, investigations hint that an upset immune system might underlie a host of other psychiatric illnesses, including obsessive-compulsive disorder, panic disorder and post-traumatic stress disorder.

With these growing insights, scientists are testing new treatments to heal the psyche by tempering chronic inflammation. Even if the approach may help only some people with depression, the benefit could be enormous. About one in 10 individuals suffers from a serious spell of despondency at least once during their lives, and depression is the most prevalent mental illness among women.

 

Self-Defense in Overdrive

Our immune system deploys an arsenal of diverse cells to keep us healthy, choreographing their actions by way of messenger molecules called cytokines. Cells attacked by harmful bacteria, viruses, parasites or cancer secrete interferons, a class of cytokines that warn neighboring tissues to bolster their defenses and rally killer cells to engage. Cytokines called interleukins help to coordinate fever (which limits the spread of an infection) and inflammation (which rushes specialised immune cells to the scene). Tumour necrosis factors, yet another large family of cytokines, destroy abnormal cells and activate other cytokines. They also promote swelling, reddening and pain, which have both positive and negative effects.

Working together, these various proteins modulate the type, intensity and duration of an immune response. Across the blood-brain barrier, they also hold considerable sway over our emotional state. When we are unwell, interferons and interleukins announcing the start of an attack flood the brain. Numerous studies have revealed that these proinflammatory cytokines can disrupt the normal functioning of multiple neurotransmitters and dampen the production of serotonin, often called a happiness hormone for its ability to boost mood. As a result, even people with minor colds often lose their appetite, feel tired, seek warmth, avoid others and struggle to concentrate.

Biologically, this sickness behaviour, as it is called, makes sense. Our immune system works more efficiently and we can recuperate faster if we stay in bed. We are also less likely to infect our co-workers and friends. Once an illness has cleared, anti-inflammatory cytokines see to it that our bodies and our brain chemistry reset. But what if an illness drags on and the immune system continues to pump out proinflammatory signals? In that case, sickness behaviour begins to look a lot like depression. Tooth decay, urinary tract infections and sinusitis are all examples of infections that do not always produce obvious symptoms or pain but might perpetuate sickness behaviour for a long period.

To test the idea that depression can sometimes be a kind of extended sickness behaviour, psychology researcher Yekta Dowlati of the University of Toronto and her colleagues evaluated 24 studies in a paper published in 2010, looking at measurements of proinflammatory cytokines in about 400 depressed individuals. Overall, these subjects showed significantly heightened blood levels of tumour necrosis factor alpha (TNF-alpha) and interleukin-6, hallmarks of an ongoing immune reaction. Two years later researcher Simon Gray and psychiatrist Michael Bloch of Yale University conducted another meta-analysis of 12 studies of obsessive-compulsive disorder, often diagnosed alongside depression. They, too, reported elevated blood levels of TNF-alpha in individuals suffering from both depression and obsessive-compulsive disorder.

An overactive immune response may also trigger anxiety disorders. In 2009 psychiatrist Elizabeth Hoge of Harvard Medical School and her co-workers compared the immunological status of 48 patients suffering from panic disorder or post-traumatic stress disorder with that of 48 age- and gender-matched healthy control subjects. The team tested blood serum levels of 20 different inflammation markers and found 18 of them to be higher in the psychiatric patients. To look for a generalised inflammatory state – indicative of an underlying injury or infection – they measured how many participants had detectable levels of at least six out of nine common proinflammatory cytokines. Some 87 percent of the anxiety sufferers met the criterion, compared with only 25 percent of the controls.

 

The Role of Stress

There seems to be little doubt that protracted low-grade inflammation can engender depression and other emotional disorders in some people. Whether a person succumbs may depend in part on how aggressive his or her immune system is to begin with. In a 2013 meta-analysis, Charles Raison and Andrew Miller examined data on genes that predispose humans to depression and found that many of these genes are present in individuals with an especially vigorous immune response, which might explain why the genes have persisted in the human population even though they can have a detrimental effect.

Until the advent of good hygiene and antibiotics, infection was arguably the greatest threat to staying alive, so having an overactive immune system conferred a real advantage. Raison and Miller speculate that today, when most of us are not routinely exposed to hazardous microbes, some people’s immune systems habitually overreact to harmless stimuli, leading to a persistent increase in proinflammatory cytokines. This may account for an increased prevalence of not only depression and other emotional disorders but also allergies, autoimmune diseases, cardiovascular disease, stroke, cancer, diabetes and dementia.

Stress probably plays a crucial role in this nexus of cause and effect. In the short term, the mere anticipation of injury can amplify inflammation. Several experiments have confirmed that animals experiencing moments of acute stress crank out higher levels of proinflammatory cytokines, and prolonged stress can eventually elicit depression-like behaviour in these same creatures. In 2009 psychiatrist Lisa Christian and her co-workers demonstrated a similar phenomenon in humans. They studied 60 women during pregnancy and found an association between depression and blood levels of TNF-alpha and interleukin-6. Proinflammatory cytokines rise during any pregnancy, but the researchers found elevated levels of both cytokines and depression in women under stress and the highest levels of depression in women under presumably the greatest stress—those with unwelcome pregnancies and those who received the least support.

Chronic stress is even more deleterious. Faced with some threat, the body prepares for fight or flight. A hormonal cascade along the so-called stress axis— from the hypothalamus to the pituitary gland to the adrenal gland—jump-starts the production of cortisol. Among other functions, this hormone temporarily suppresses the immune system to guarantee that we focus all of our energy on external dangers. If the stress endures, though, cortisol keeps the immune system offline, and we are more susceptible to illness. Over the course of several years, stress can permanently damage signaling along the axis, and we begin to release too little cortisol—in which case, the immune system reawakens and kicks into overdrive, with proinflammatory cytokines flowing freely.

 

Stacking the Deck

The sum of this research is a rough mechanism by which inflammation can seed depression: take an immune system prone to overreact and add stress. The chance of physical illness skyrockets, proinflammatory cytokines surge and sickness behaviour becomes the new normal. Further investigations reveal that trauma in any form primes this pathway. In 2007 a team led by psychiatrist Andrea Danese conducted a longitudinal study of people who were rejected or mistreated by their parents during childhood. They found that even 20 years after the abuse, many study participants had greatly elevated blood levels of inflammation biomarkers.

Several other studies suggest that disturbing childhood experiences may permanently unbalance the stress axis and alter the sensitivity of cortisol receptors in the brain. Miller and his colleagues have discovered that, compared with emotionally healthy control subjects, depressed men who were distressed as children mount much stronger immune responses when they take a test designed to cause psychosocial stress.

Physical pain can also overload the immune system. Scientists have documented especially high levels of TNF-alpha in depressed women who also exhibit an increased sensitivity to pain, itself a symptom of sickness behaviour. Constant pain is a stress factor, creating a vicious circle: pain begets inflammation and stress, leading to more inflammation and depression.

Indeed, so complex is the interplay of contributing factors that it may be impossible to determine the degree to which immune reactivity, personality, general physical health and life experiences contribute to depression in any one person. Chronic inflammation in and of itself almost certainly accounts for only a subset of patients with emotional disorders. Yet several trials have shown that patients who do not respond to traditional antidepressants frequently begin to improve when they take anti-inflammatory medications, from everyday ibuprofen to cytokine inhibitors, on top of their other prescriptions. Interestingly, the reverse is also true: patients suffering from skin cancer or hepatitis C frequently take the cytokine drug interferon-alpha, which causes inflammation, and many develop symptoms of depression as a side effect.

Practitioners are thus pursuing a variety of approaches, from medications to dietary interventions, to tackle unwanted inflammation in psychiatric patients. For instance, in 2011 psychiatrist Janice Kiecolt-Glaser and her colleagues at Ohio State University reported that omega-3 fatty acids, which curb inflammation, alleviated anxiety in medical students before an exam.
Scientists are hopeful that drugs aimed at blocking cytokine receptors in the brain might also help quell emotional disorders. Several initiatives are under way to develop effective cytokine antagonists. Meanwhile more studies on the role of stress and trauma may reveal better ways to keep inflammation in check, lessening the risk that a chance infection will lead to an intractable disease.

 

References:

Inflammation and Its Discontents: The Role of Cytokines in the Pathophysiology of Major Depression. Andrew H. Miller, Vladimir Maletic and Charles L. Raison in Biological Psychiatry, Vol. 65, No. 9, pages 732–741; May 1, 2009.

A Meta-Analysis of Cytokines in Major Depression. Yekta Dowlati, Nathan Herrmann, Walter Swardfager, Helena Liu, Lauren Sham, Elyse K. Reim and Krista L. Lanctôt in Biological Psychiatry, Vol. 67, No. 5, pages 446–457; March 1, 2010.

Can Infection Give You the Blues? Erich Kasten in Scientific American Mind, Vol. 26, No. 3; May/June, 2015.

The Evolutionary Significance of Depression in Pathogen Host Defense (PATHOS-D). C. L. Raison and A. H. Miller in Molecular Psychiatry, Vol. 18, No. 1, pages 15–37; January 2013.

Can Music Really Heal?

Posted in Jayne's blog

DOWNLOAD ENTIRE APRIL 2015 NEWSLETTER including this month’s freebie.

Across cultures and throughout history, music listening and music making have played a role in treating disorders of the mind and body. Much of the power of music-based treatment lies in its ability to meld numerous subtle benefits in a single, engaging package. Music is perhaps unrivaled by any other form of human expression in the range of its defining characteristics, from its melody and rhythm to its emotional and social nature.

The treatments that take advantage of these attributes are rewarding, motivating, accessible, inexpensive and essentially free of side effects. The attractive quality of music also encourages patients to continue therapy over many weeks and months, improving the chance of lasting gains. These treatments aim to restore functions lost to injury or neurological disorders by enlisting healthy areas of the brain and sometimes even by reviving dysfunctional circuitry. As evidence accumulates about the effectiveness of these techniques, clinicians and therapists from a variety of fields have begun to incorporate them into their practices. Music therapists, who work at the intersection of music and health, are important mediators of these interventions, as are speech therapists and physical therapists. Among the beneficiaries are people diagnosed with stroke, autism, tinnitus, Parkinson’s disease and dementia. As scientists learn more about the effect of music on cognitive and motor functions and mental states, they can tailor these therapies for each disorder, targeting specific brain injuries or dysfunctions.

Music as Medicine

The view that music can be useful in treating neurological impairment gained some scientific heft in a landmark study published in 2008. Psychologist Teppo Särkämö of the University of Helsinki and his team recruited 60 patients who had suffered a stroke in the middle cerebral artery of one hemisphere. They split the patients into three groups: the first participated in daily sessions of music listening, the second listened to audiobooks every day and the third received no auditory treatment. Researchers observed the patients over two months. Those in the group that listened to music exhibited the greatest recovery in verbal memory and attention. And because listening to music appears to improve memory, the hope now is that active music making—singing, moving and synchronizing to a beat—might help restore additional skills, including speech and motor functions in stroke patients.

The Singing Cure

When a stroke affects areas of the brain that control speech, it can leave patients with a condition known as nonfluent aphasia, or an inability to speak fluently. And yet, as therapists over the years have noted, people with nonfluent aphasia can sometimes sing words they cannot otherwise say.

In the 1970s neurologist Martin Albert and speech pathologists Robert Sparks and Nancy Helm recognized the therapeutic implications of this ability and developed a treatment called melodic intonation therapy in which singing is a central element. During a typical session, patients will sing words and short phrases set to a simple melody while tapping out each syllable with their left hand. The melody usually involves two notes, perhaps separated by a minor third (such as the first two notes of “Greensleeves”). For example, patients might sing the phrase “How are you?” in a simple up-and-down pattern, with the stressed syllable (“are”) assigned a higher pitch than the others. As the treatment progresses, the phrases get longer and the frequency of the vocalizations increases, perhaps from one syllable per second to two.

Each element of the treatment contributes to fluency by recruiting undamaged areas of the brain. The slow changes in the pitch of the voice engage areas associated with perception in the right hemisphere, which integrates sensory information over a longer interval than the left hemisphere does; as a consequence, it is particularly sensitive to slowly modulated sounds. The rhythmic tapping with the left hand, in turn, invokes a network in the right hemisphere that controls movements associated with the vocal apparatus. Benefits are often evident after even a single treatment session. But when performed intensively over months, melodic intonation therapy also produces long-term gains that appear to arise from changes in neural circuitry—the creation of alternative pathways or the strengthening of rudimentary ones in the brain. In effect, for patients with severe aphasia, singing trains structures and connections in the brain’s right hemisphere to assume permanent responsibility for a task usually handled mostly by the left.

This theory has gained support in the past two decades from studies of stroke patients with nonfluent aphasia conducted by researchers around the world. In a study published in September 2014 by neurologist Gottfried Schlaug and his group at Harvard Medical School, 11 patients received melodic intonation therapy; nine received no treatment. The patients who received therapy were able to string together more than twice as many appropriate words per minute in response to a question. That same group also showed structural changes, assessed through MRI, in a right-hemisphere network associated with vocalization. The laboratory is now conducting studies to compare the benefits of melodic intonation therapy with other forms of therapy for patients with aphasia.

Because melodic intonation therapy seemed to work by engaging the right hemisphere, researchers then surmised that electrical or magnetic stimulation of the region might boost the therapy’s power. In two recent studies researchers stimulated an area in the right hemisphere called the inferior frontal gyrus, which helps to connect sounds with the oral, facial and vocal movements that produce them. For many participants, combining melodic intonation therapy with noninvasive brain stimulation yielded improvements in speech fluency after only a few sessions.

Music and Motion

Music making can also help stroke survivors living with impaired motor skills. In a study published in 2007 scientists asked patients to use their movement-impaired hand to play melodies on the piano or tap out a rhythm on pitch-producing drum pads. Patients who engaged in this intervention, called music-supported training, showed greater improvement in the timing, precision and smoothness of fine motor skills than did patients who relied on conventional therapy. The researchers postulated that the gains resulted from an increase in connections between neurons of the sensorimotor and auditory regions.

Rhythm is the key to treating people with Parkinson’s, which affects roughly one in 100 people older than 60. Parkinson’s arises from degeneration of cells in the midbrain that feed dopamine to the basal ganglia, an area involved in the initiation and smoothness of movements. The dopamine shortage in the region results in motor problems ranging from tremors and stiffness to difficulties in timing the movements associated with walking, facial expressions and speech.

Music with a strong beat can allay some of these symptoms by providing an audible rhythmic sequence that people can use to initiate and time their movements. Treatments include so-called rhythmic entrainment, which involves playing a stimulus such as a metronome. As neurologist Oliver Sacks recounted in his 1973 book Awakenings, musical rhythm sometimes released individuals from their immobility, letting them dance or sing out unexpectedly.

The use of rhythm in motor therapy gained momentum in the 1990s, when researchers around the world demonstrated a technique called rhythmic auditory stimulation, or RAS, for people who had trouble walking, such as stroke and Parkinson’s patients. A therapist first asks patients to walk at a comfortable speed and then to walk in time to an audible rhythm. In these studies, tempos that pushed patients slightly past their comfort zone yielded the greatest improvements in velocity, cadence and stride length.

Despite these encouraging outcomes, the neural mechanisms that trigger improvements have been difficult to pin down. Imaging work suggests that during rhythmic auditory stimulation, neural control of motor behaviour is rerouted around the basal ganglia; instead the brain stem serves as a relay station that sends auditory input to motor networks in the cerebellum, which governs coordination, and to other cortical regions that could help synchronize sound and motion.

Recovered Memory

Few neurological disorders inspire greater fear than dementia, one of the most common diseases of the elderly. According to some estimates, 44 million people worldwide are living with dementia, a number expected to reach 135 million by 2050. Alzheimer’s disease, a neurodegenerative condition, accounts for more than 60 percent of the cases; multiple strokes can also cause so-called vascular dementia.

Music may be ideally suited to stimulating memory in people with dementia, helping them maintain a sense of self. Because music activates neural areas and pathways in several parts of the brain, the odds are greater that memories associated with music will survive disease. Music also stimulates normal emotional responses even in the face of general cognitive decline. In a 2009 study 12 individuals with Alzheimer’s and 12 without it were asked to judge the emotional connotations of various pieces of music. The Alzheimer’s participants were just as accurate as the others despite significant impairments in different areas of judgment. Other research suggests that taking part in musical activities throughout life keeps the mind young and may even decrease the risk of developing dementia; the continuous engagement of the parts of the brain that integrate senses and motion with the systems for emotions and rewards might prevent loss of neurons and synapses.

The type of therapy that individual dementia patients receive will vary, from receptive (listening) to active (dancing, singing, clapping). Music that the patient selects is most effective because the choice represents a connection to memory and self. The benefits vary, too, and tend to be short-term. But when the treatment does work, it reduces the feelings of agitation that lead to wandering and vocal outbursts and encourages cooperation and interaction with others. Music therapy can also help patients with dementia sleep better and can enhance their emotional well-being.

Music on the Spectrum

Perhaps the most fascinating interplay between music and the brain lies in the case files of people with autism spectrum disorder, a neurodevelopmental syndrome that occurs in 1 to 2 percent of children, most of whom are boys. Hallmarks of autism include impaired social interactions, repetitive behaviours and difficulties in communication. Indeed, up to 30 percent of people with autism cannot make the sounds of speech at all; many have limited vocabulary of any kind, including gesture.

One of the peculiarities of the neurobiology of autism is the overdevelopment of short-range brain connections. As an apparent consequence, children with autism tend to focus intensely on the fine details of sensory experience, such as the varying textures of different fabrics or the precise sound qualities emitted by appliances such as a refrigerator or an air conditioner. And this fascination with sound may account for the many anecdotal reports of children with autism who thoroughly enjoy making and learning music. A disproportionate number of children with autism spectrum disorder are musical savants, with extraordinary abilities in specialized areas, such as absolute pitch.

The positive response to music opens the way to treatments that can help children with autism engage in activities with other people, acquiring social, language and motor skills as they do. Music also activates areas of the brain that relate to social ways of thinking. When we listen to music, we often get a sense of the emotional states of the people who created it and those who are playing it. By encouraging children with autism to imagine these emotions, therapists can help them learn to think about other people and what they might be feeling.

Recently the Music and Neuroimaging Laboratory at Harvard developed a new technique called auditory-motor mapping training, or AMMT, for children whose autism has left them unable to speak. The treatments have two main components: intonation of words and phrases (changing the melodic pitch of one’s voice) and tapping alternately with each hand on pitch-producing drums while singing or speaking words and phrases. In a proof-of-principle study, six completely nonverbal children took part in 40 sessions of this training over eight weeks. By the end, all were able to produce some speech sounds, and some were even able to voice meaningful and appropriate words during tasks that the therapy sessions had not covered. Most important, the children were still able to demonstrate their new skills eight weeks after the training sessions ended.

Quiet, Please

Music-based treatments can also train the brain to tune out the phantom strains of tinnitus—the perception of noise or ringing in the ears in the absence of external sound—which affects roughly 20 percent of adults. Age-related hearing loss, exposure to loud sounds and circulatory system disorders can all bring on the condition, with symptoms ranging from buzzing or hissing in the ears to a continuous tone with a definable pitch. The sensation can cause serious distress and interfere with the ability to concentrate on other sounds and activities. There is no cure.

The past decade has seen a surge in understanding of the neurological basis of the disorder. In one view, cochlear damage (most likely caused by exposure to loud sounds) reduces the transmission of particular sound frequencies to the brain. To compensate for the loss, neuronal activity in the central auditory system changes, creating neural “noise,” perhaps by throwing off the balance between inhibition and excitation in the auditory cortex, leading to the perception of sounds that are not there. Also at play might be dysfunctional feedback to auditory brain regions from the limbic system, which is thought to serve as a noise-cancellation apparatus that identifies and inhibits irrelevant signals.

Music treatment seeks to counteract this dysfunction by inducing changes in the neural circuitry. For those with tonal tinnitus, one treatment involves listening to “notched music,” generated by digitally removing the frequency band that matches the tinnitus frequency. The notching—pioneered and proved effective by neurophysiologist Christo Pantev and his group at the University of Münster in Germany—might help reverse the imbalance in the auditory cortex, strengthening the inhibition of the frequency band that might be the source of the phantom sound in the first place. Another approach involves playing a series of pitches to patients and then asking them to imitate the sequence vocally. As patients refine their accuracy, they learn to disregard irrelevant auditory signals and focus on what they want to hear. In time, this effortful attention might help the auditory cortex return to its normal physiological state.
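For readers curious about what “digitally removing a frequency band” means in practice, the operation is essentially a narrow band-stop (notch) filter centred on the listener’s tinnitus frequency. The following Python sketch illustrates the general idea only; it is not Pantev’s clinical protocol, and the function name, parameters and test tones are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

def notch_music(samples, sample_rate, tinnitus_hz, quality=30.0):
    """Illustrative sketch: suppress a narrow band centred on the
    tinnitus frequency in a mono audio signal."""
    # Design a second-order IIR notch (band-stop) filter; a higher
    # `quality` factor gives a narrower notch.
    b, a = iirnotch(tinnitus_hz, quality, fs=sample_rate)
    # Zero-phase filtering avoids shifting the audio in time.
    return filtfilt(b, a, samples)

# Stand-in for "music": one second of a 1 kHz tone plus a 4 kHz tone
# matching a hypothetical tinnitus frequency.
fs = 44100
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 1000 * t) + np.sin(2 * np.pi * 4000 * t)
filtered = notch_music(audio, fs, tinnitus_hz=4000)
```

After filtering, the 4 kHz component is strongly attenuated while the 1 kHz component passes through essentially untouched, which is the spectral shape the therapy relies on.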

For any novel therapy, enthusiasm can sometimes outpace the evidence, and researchers have rightly pointed out that the new music-based treatments must prove their efficacy against the more established therapies. But of all the techniques for addressing neurological disorders, music-based therapies seem unique in their capacity to tap into emotions, to help the brain find lost memories, to let patients resume their place in the world. We are only now beginning to understand the science behind the belief in the power of music to heal.

References:

  • From Singing to Speaking: Why Singing May Lead to Recovery of Expressive Language Function in Patients with Broca’s Aphasia. Gottfried Schlaug, Sarah Marchina and Andrea Norton in Music Perception, 25, No. 4, pages 315–323; April 1, 2008.
  • Listening to Tailor-Made Notched Music Reduces Tinnitus Loudness and Tinnitus-Related Auditory Cortex Activity. Hidehiko Okamoto et al. in Proceedings of the National Academy of Sciences USA, 107, No. 3, pages 1207–1210; January 19, 2010.
  • Auditory-Motor Mapping Training as an Intervention to Facilitate Speech Output in Non-Verbal Children with Autism: A Proof of Concept Study. Catherine Y. Wan et al. in PLOS ONE, 6, No. 9, Article No. e25505; September 29, 2011.
  • The Healing Power of Music. William F. Thompson and Gottfried Schlaug in Scientific American Mind, Vol. 26, No. 2, pages 32–41; March/April 2015.
  • Music, Health, and Wellbeing. Edited by Raymond MacDonald, Gunter Kreutz and Laura Mitchell. Oxford University Press, 2012.
  • Music, Thought, and Feeling: Understanding the Psychology of Music. Second edition. William Forde Thompson. Oxford University Press, 2014.

Does the Midlife Crisis Exist?

Posted in Jayne's blog

Examples of desperate, midlife-crisis-stricken characters abound in popular culture (think Kevin Spacey in American Beauty, Billy Crystal in City Slickers or Meryl Streep in The Bridges of Madison County), and the concept seems entrenched in our collective psyche. But are people of a certain age really more likely to launch a total life reboot?

It would seem so, according to scientists. Some hallmarks of midlife—increased self-reflection, aging, career and family changes—can seed deep dissatisfaction. And studies indicate that our sense of well-being naturally wanes during this stretch of adulthood. But research also shows that many common beliefs about the quintessential fortysomething meltdown are untrue. In fact, malaise can commence at almost any age, men are not more susceptible to dissatisfaction than women, and midlife crises are far from inevitable.

At some point between age 40 and 60, most people will face significant stress in one form or another, but not everyone reacts by chasing after fading youth or, worse, succumbing to depression or substance abuse. Genuine midlife breakdowns appear to be less common than many think, affecting only 10 percent of the population, according to a review in the early 1990s, or 26 percent, according to a 2000 study. For some of that group, these upheavals lead not to ruin but to a welcome second act. And the reasons certain people do fall victim may have more to do with personality and cultural expectations than age.

A Life Half-Empty

If the midlife crisis seems to defy precise definition, consider that midlife itself is a nebulous concept.

In 2011 researchers at Florida State University analysed a questionnaire given to several thousand Americans in the 1990s and found that most participants defined midlife as running from age 44 to 59. Ten years later the same participants described midlife as age 46 to 62. The older the respondents—women in particular—the later they envisioned the debut of their dotage.

Our notion of midlife may be a moving target, but once it starts, our perspective on time shifts. Instead of counting the years from our birth, we begin guessing at how many years we have left. Psychologist Laura Carstensen, founding director of the Center on Longevity at Stanford University, has shown that this subjective sense of a life half-empty influences our goals. The fewer years we think we have, the less expansive our plans become. Instead, some individuals focus on family and friends—just as young and old alike seek familiar comforts when epidemics or terrorist attacks remind us of how finite life is.

Other people, though, begin to take stock of their lives and revisit unrealised dreams. Alexandra Freund, a professor of applied psychology at the University of Zurich, has found that we seem to retrieve our youthful goals from 20 years before and check off, one by one, what we have achieved. She has noticed that older people typically try to maintain what they have now to avoid future losses and that they focus less on their careers. During this transition, our sense that the demands of work exceed our ability to cope can increase. Occupational psychologist Amanda Griffiths and her colleagues at the University of Nottingham in England have reported that this perceived job stress peaks between the ages of 50 and 55.

Such changes in focus and aspiration may render us less satisfied with life. In 2008 labour economists David Blanchflower of Dartmouth College and Andrew Oswald of the University of Warwick in England reviewed data about well-being collected from half a million people of various ages in 72 countries. They concluded that the happiness level throughout an individual’s life span frequently follows a U-shaped curve, bottoming out in the early to mid-40s. In some locations, this emotional nadir did not appear in the raw data but emerged once Blanchflower and Oswald reanalysed the numbers to consider the potentially confounding contributions of marital status, income, education and other factors on contentment.

Complicating the picture, other researchers have noted the presence of peaks and plateaus of happiness within the midlife period. In 2012 economist John Haisken-DeNew of the University of Melbourne reviewed life satisfaction among Germans using data from the German Socio-Economic Panel, a 30-year-old longitudinal study that surveys some 30,000 people annually. The length of the study allowed Haisken-DeNew to account for individual differences in optimism. For instance, a happiness score of 6 out of 10 might represent an all-time high for a confirmed misanthrope but a crash for a starry-eyed romantic. Haisken-DeNew observed that happiness levels drop continuously, if slightly, throughout adulthood until the early 60s, after which they increase until age 75 and then plummet.

Wrinkles in Time

Still, the subject of much debate is whether the cause of midlife strife lies in our creaturely selves or in our stars. Some researchers argue that life events are most important. In a study published in 2000 sociologist Elaine Wethington of Cornell University conducted a telephone survey of 724 American adults between the ages of 28 and 78 and found that more than a quarter of her respondents—men and women almost equally—admitted to having had a midlife crisis. The majority attributed these spells to upsetting life events such as losing a job or parent and not to age, leading Wethington to conclude that midlife crises are not a natural part of aging.

Yet research into the biology of middle age suggests that at least part of our vulnerability is built in. In 2012 psychologist Alexander Weiss and his colleagues at the University of Edinburgh reviewed accounts from zookeepers, volunteers, researchers and caretakers, who reported that middle-aged chimpanzees and orangutans showed definite signs of disgruntlement, compared with younger and older apes. These observations were subjective; the animal keepers knew the ages of the animals and may have been interpreting their behaviour based on expectations. Nevertheless, Weiss’s team concluded that the biology we share with our fellow great apes could underpin our midlife doldrums.

Gray hair and wrinkles aside, many adjustments associated with aging can cause psychological distress. The rates of cancer and cardiovascular disease, among other illnesses, increase, along with the risk of depression. A 2012 report from the Central Research Institute of Ambulatory Health Care in Berlin revealed that depression cases climb almost linearly until one’s late 50s and peak again at around 85 for women and 90 for men. In the U.S., statistics from the Centers for Disease Control and Prevention reveal that the highest rates of depression among men and women fall between the ages of 40 and 59.

Physically, too, the brain begins to degrade more quickly after age 40. In 2010 neuroscientist Antonio Giorgio of the University of Siena in Italy, then working with colleagues at the University of Oxford, tested 66 subjects between the ages of 23 and 81 using MRI and diffusion-tensor imaging. Their results were consistent with earlier studies: The brain’s white matter volume increased continuously up to the early 40s and then decreased rather rapidly. The volume of gray matter declined steadily over the entire period in most brain regions.

Older brains can usually compensate for these deficits, off-setting lost firepower with greater experience and knowledge. But the effects of hormonal attrition are more unsettling. When male andropause sets in after about age 40, testosterone levels start to decline, and both sexual interest and performance can suffer. For women, the decline of estrogen levels in the 40s leads to menopause, usually in the 50s, and a host of sometimes upsetting symptoms, including insomnia, memory problems and depression.

To Every Season

But whatever its source, midlife stress does not foredoom us to a life out of control, especially in our relationships. A 2011 Kinsey Institute study of more than 1,000 couples in Germany, Spain, the U.S., Japan and Brazil found that middle-aged men and women rate their relationships and sex lives higher the longer they have been married and that people entering middle age with a long-term partner have a good chance of staying together, citing earlier estimates that more than half of marriages in the U.S. and 92 percent in Spain will last more than 20 years. Of the marriages that do break down, the husband is not typically the one to walk out. According to the National Marriage Project at the University of Virginia, women instigate two thirds of all divorces—most likely not because they are having midlife crises but because their husbands are behaving badly.

The empty-nest syndrome appears to be a myth, too. Pasqualina Perrig-Chiello, a professor of developmental psychology at the University of Bern in Switzerland, found in a 2001 study of 260 middle-aged subjects that mothers frequently view their children’s departure optimistically. Fathers more often have mixed feelings, perhaps wishing that they had spent more time with their children. In a 2009 study, sociologist Barbara Mitchell of Simon Fraser University in British Columbia asked more than 300 parents from different cultural backgrounds about their children’s departure from home. Only a minority—younger parents with health problems and fewer children—reported emotional suffering. Overall, most parents reported positive feelings, such as pride at having been successful in raising their children so that they could move out.

Perhaps the biggest misconception of all is that the outlook at 40 is grim. The John D. and Catherine T. MacArthur Foundation Research Network on Successful Midlife Development, a Harvard University–based interdisciplinary project run by 13 scholars, surveyed more than 7,000 people in the U.S. between the ages of 25 and 74 on aspects of middle age. The results, which have spawned multiple books and scores of research papers, reveal midlife to be largely a period of calm and stability: most relationships hold together, most people stay healthy and many enjoy financial security.

And when Zurich’s Freund asked older people what age they would most like to be again, the majority chose their mid-40s. In some cultures, such as Japan, India, Kenya and Samoa, the concept of the midlife crisis is entirely imported. Maybe knowing that our misgivings about midlife are usually exaggerated—and temporary—can make the passage to late maturity just a bit more manageable.

References:

  • Expecting Stress: Americans and the “Midlife Crisis.” Elaine Wethington in Motivation and Emotion, 24, No. 2, pages 85–103; June 2000.
  • Is Well-Being U-Shaped over the Life Cycle? David G. Blanchflower and Andrew J. Oswald in Social Science & Medicine, 66, No. 8, pages 1733–1749; April 2008.
  • Evidence for a Midlife Crisis in Great Apes Consistent with the U-Shape in Human Well-Being. Alexander Weiss et al. in Proceedings of the National Academy of Sciences USA, 109, No. 49, pages 19,949–19,952; December 4, 2012.
  • Debunking Midlife Myths. Hanna Drimalla in Scientific American Mind, Vol. 26, No. 2, pages 58–61; 2015.
  • John D. and Catherine T. MacArthur Foundation Research Network on Successful Midlife Development: http://midmac.med.harvard.edu