Wednesday, January 23, 2013


When will we all live to 100? 40 percent of girls born now expected to reach this milestone

An article by John Appleby, Chief Economist at The King's Fund, published today on the BMJ website, draws attention to the rising number of people expected to live to 100 and asks where the trend will end.
23 jan 2013--According to the Office for National Statistics, there seems to be "no end in sight" as far as the number of UK citizens reaching 100 years old is concerned. Approximately 13% of girls born in 1951 are expected to reach this milestone, increasing to 40% for girls born this year and a predicted 60% of those born in 2060.
Appleby attributes similar worldwide trends to the fact that people are dying at older ages. Deaths among children under five have fallen by 60% since 1970, and surviving early childhood greatly improves the odds of living a much longer life.
Variations between men and women, social groups and countries have, however, remained significant with one UK study finding a difference of 11.4 years (80 years compared with 68.6 years) between women in the poorest and most affluent social classes.
And although "living longer is a good thing", research suggests that gains in life expectancy have more to do with reductions in deaths than reductions in years lived in disability. While life expectancy for women has risen 4.6% since 1990, healthy life expectancy has risen by only 3%.
As more and more people live longer, Appleby asks whether we may find ourselves "scrabbling for resources" but concludes that this is unlikely and there is "no need to panic".
Provided by British Medical Journal

Tuesday, January 22, 2013


Surprising connections between our well-being and giving, getting, and gratitude

We all know that getting a good night's sleep is good for our general health and well-being. But new research is highlighting a more surprising benefit of good sleep: more feelings of gratitude for relationships.
22 jan 2013--"A plethora of research highlights the importance of getting a good night's sleep for physical and psychological well-being, yet in our society, people still seem to take pride in needing, and getting, little sleep," says Amie Gordon of the University of California, Berkeley. "And in the past, research has shown that gratitude promotes good sleep, but our research looks at the link in the other direction and, to our knowledge, is the first to show that everyday experiences of poor sleep are negatively associated with gratitude toward others – an important emotion that helps form and maintain close social bonds."
Social psychologists are increasingly finding that "prosocial" behavior – including expressing gratitude and giving to others – is key to our psychological well-being. Even how we choose to spend our money on purchases affects our health and happiness. And children develop specific ways to help others from a very young age. Gordon and other researchers will be presenting some of these latest findings at the Society for Personality and Social Psychology (SPSP) annual meeting today in New Orleans.
Sleeping to feel grateful
A large body of research has documented that people who experience gratitude are happier and healthier. In three new studies, Gordon and Serena Chen, also of the University of California, Berkeley, explored how poor sleep affects people's feelings of gratitude.
In the first study, people who experienced a poor night's sleep were less grateful after listing five things in life for which they were appreciative than were people who had slept well the night before. The researchers adapted the Pittsburgh Sleep Quality Index, which measures sleep quality and number of hours slept, among other variables, to evaluate the previous night's sleep.
In the second study, participants kept a two-week diary in which they recorded the previous night's sleep and their feelings of gratitude each day. The researchers found that poorer sleep was associated with lower gratitude, and participants reported feeling more selfish on days following poor sleep.
The final study looked at heterosexual couples and found that people tend to feel less grateful toward their romantic partners if either they or their partners generally sleep poorly. "In line with this finding, people reported feeling less appreciated by their partners if they or their partner tends to sleep poorly, suggesting that the lack of gratitude is transmitted to the partner," Gordon says.
"Poor sleep is not just experienced in isolation," Gordon says. "Instead, it influences our interactions with others, such as our ability to be grateful, a vital social emotion."
Giving away money to feel wealthy
Just as expressing gratitude confers benefits, so too does giving to others. New research shows that people all around the world – from Canada to Uganda, from South Africa to India – derive more happiness from spending money on others than they do on themselves.
"For the first time, we show that giving away money or spending it on others confers the ironic psychological benefit of increasing the giver's sense of wealth," says Michael Norton of Harvard Business School and co-author with Elizabeth Dunn of the University of British Columbia of the upcoming book Happy Money: The Science of Smarter Spending. In a suite of new, not-yet-published studies, Norton and colleagues showed that charitable giving makes people feel wealthier.
This research follows on other recent work published in Psychological Science by Norton and colleagues that shows that giving time to others – from helping with homework to shoveling a neighbor's driveway – actually makes people feel that they have more time. "In fact, giving time away alleviates people's sense of time famine even more than receiving unexpected windfalls of free time."
That people feel wealthier from spending money on others may explain why poor individuals tend to give away a higher fraction of their income than members of the middle class do. In one study, researchers reported that Americans earning less than $20,000 a year give a higher percentage of their income to charity than others earning up to $300,000 a year.
"Our results suggest when the poor give money away, that very act might mitigate their feelings of poverty," Norton says. "More broadly than this specific benefit, our investigation contributes to the growing body of research documenting the benefits of prosocial behavior, which include greater happiness, reduced mortality, and better immune function."
Buying experiences to feel happy
In related research, psychologists are finding that spending money on experiential purchases, such as vacations, concerts, and meals out, tends to bring us more happiness than material purchases, such as clothing, jewelry, or electronic gadgets. Amit Kumar and Thomas Gilovich of Cornell University are investigating one potential explanation for this difference: that experiences prompt storytelling more than possessions do.
In new research, they asked participants to recall either a significant experiential purchase or a significant material purchase, then asked how much they had talked about that purchase and how much satisfaction they derived from it. Participants reported higher satisfaction with experiences than with possessions, a difference the researchers traced to participants being more likely to talk about experiences with other people.
In another experiment, the researchers measured what happens when people cannot talk about their purchases. They asked participants if they would be willing to pay a price to be able to talk about a beach vacation (experiential purchase) or an electronic good (material purchase). "Participants were more likely to switch from a better purchase that they could not talk about to a lesser purchase that they could talk about in the experiential condition than in the material one," Kumar says.
"Well-being is likely to be enhanced by shifting the balance of spending in our consumer society away from material goods and towards experiential ones," Kumar says. "This research also suggests that there are benefits to be had not only by nudging people to choose experiences over possessions, but also by encouraging people to share stories about their experiences."
Knowing what is best to help others
The roots for how we give to others form at a very young age. Children, it turns out, are very sophisticated givers – not only coming to someone's aid when needed but also coming up with the best strategy for doing so, often independent of an adult's instruction.
In new research, Kristina Olson of Yale University and Alia Martin have found that children will often act on the belief that they know better than the requester what is best. In a series of experiments, the researchers investigated whether 3-year-old children will help someone by ignoring a specific request and instead offering a better alternative.
In one study, for example, when an experimenter asks the child for a specific marker, but the child knows that marker does not work, the child will instead offer up a better marker. In another study, a pre-recorded child asks the child participant to give her a piece of chocolate via a tube that supposedly connects them. If the participant knows that chocolate makes the other child sick, the participant will decide to give her fruit snacks instead.
"Perhaps most provocatively, children will selectively decide not to help in this way if they don't like the person," Olson says. "For example, if an experimenter has previously been mean, children won't warn the adult of a potential harm – such as something sharp in the container they are reaching in – but will if the experimenter was not mean."
"These results suggest that children are able to help adults and peers already by the preschool years in rather complex ways, even when the beneficiary is misguided about what he or she wants," Olson says. "Children don't just blindly do as they are requested, but rather consider a person's goal and consider alternative possible ways to achieve that goal."
Provided by Society for Personality and Social Psychology

Monday, January 21, 2013


Loneliness, like chronic stress, taxes the immune system

New research links loneliness to a number of dysfunctional immune responses, suggesting that being lonely has the potential to harm overall health.
21 jan 2013--Researchers found that people who were more lonely showed signs of elevated latent herpes virus reactivation and produced more inflammation-related proteins in response to acute stress than did people who felt more socially connected.
These proteins signal the presence of inflammation, and chronic inflammation is linked to numerous conditions, including coronary heart disease, Type 2 diabetes, arthritis and Alzheimer's disease, as well as the frailty and functional decline that can accompany aging.
Reactivation of a latent herpes virus is known to be associated with stress, suggesting that loneliness functions as a chronic stressor that triggers a poorly controlled immune response.
"It is clear from previous research that poor-quality relationships are linked to a number of health problems, including premature mortality and all sorts of other very serious health conditions. And people who are lonely clearly feel like they are in poor-quality relationships," said Lisa Jaremka, a postdoctoral fellow at the Institute for Behavioral Medicine Research at Ohio State University and lead author of the research.
"One reason this type of research is important is to understand how loneliness and relationships broadly affect health. The more we understand about the process, the more potential there is to counter those negative effects – to perhaps intervene. If we don't know the physiological processes, what are we going to do to change them?"
The results are based on a series of studies conducted with two populations: a healthy group of overweight middle-aged adults and a group of breast cancer survivors. The researchers measured loneliness in all studies using the UCLA Loneliness Scale, a questionnaire that assesses perceptions of social isolation and loneliness.
Jaremka will present the research Saturday (1/19) at the Society for Personality and Social Psychology annual meeting in New Orleans.
The researchers first sought to obtain a snapshot of immune system behavior related to loneliness by gauging levels of antibodies in the blood that are produced when herpes viruses are reactivated.
Participants were 200 breast cancer survivors, with an average age of 51 years, who were between two months and three years past completion of cancer treatment. Their blood was analyzed for the presence of antibodies against Epstein-Barr virus and cytomegalovirus.
Both are herpes viruses that infect a majority of Americans. About half of infections do not produce illness, but once a person is infected, the viruses remain dormant in the body and can be reactivated, resulting in elevated antibody levels, or titers – again, often producing no symptoms but hinting at regulatory problems in the cellular immune system.
Lonelier participants had higher levels of antibodies against cytomegalovirus than did less lonely participants, and those higher antibody levels were related to more pain, depression and fatigue symptoms. No difference was seen in Epstein-Barr virus antibody levels, possibly because this reactivation is linked to age and many of these participants were somewhat older, meaning reactivation related to loneliness would be difficult to detect, Jaremka said.
Previous research has suggested that stress can promote reactivation of these viruses, also resulting in elevated antibody titers.
"The same processes involved in stress-related reactivation of these viruses are probably also relevant to the loneliness findings," Jaremka said. "Loneliness has been thought of in many ways as a chronic stressor – a socially painful situation that can last for quite a long time."
In an additional set of studies, the scientists sought to determine how loneliness affected the production of proinflammatory proteins, or cytokines, in response to stress. These studies were conducted with 144 women from the same group of breast cancer survivors and a group of 134 overweight middle-aged and older adults with no major health problems.
Baseline blood samples were taken from all participants, who were then subjected to stress – they were asked to deliver an impromptu five-minute speech and perform a mental arithmetic task in front of a video camera and three panelists. Researchers followed by stimulating the participants' immune systems with lipopolysaccharide, a compound found on bacterial cell walls that is known to trigger an immune response.
In both populations, those who were lonelier produced significantly higher levels of a cytokine called interleukin-6, or IL-6, in response to acute stress than did participants who were more socially connected. Levels of another cytokine, tumor necrosis factor-alpha, also rose more dramatically in lonelier participants than in less lonely participants, but the findings were significant by statistical standards in only one study group, the healthy adults.
In the study with breast cancer survivors, researchers also tested for levels of the cytokine interleukin 1-beta, which was produced at higher levels in lonelier participants.
When the scientists controlled for a number of factors, including sleep quality, age and general health measures, the results were the same.
"We saw consistency in the sense that more lonely people in both studies had more inflammation than less lonely people," Jaremka said.
"It's also important to remember the flip side, which is that people who feel very socially connected are experiencing more positive outcomes," she said.
Provided by The Ohio State University

Saturday, January 19, 2013


Which nutritional factors help preserve muscle mass, strength and performance in seniors?

Sarcopenia, the gradual loss of muscle mass, is a common consequence of ageing and a significant risk factor for disability in older adults. Because muscle strength plays an important role in preventing falls, sarcopenia also leads to an increased risk of fractures and other injuries.
19 Jan 2013--The International Osteoporosis Foundation (IOF) Nutrition Working Group has published a new review which identifies nutritional factors that contribute to loss of muscle mass, or conversely, are beneficial to the maintenance of muscle mass. The Group reviewed evidence from worldwide studies on the role of nutrition in sarcopenia, specifically looking at protein, acid–base balance, vitamin D/calcium, and other minor nutrients like B vitamins.
"The most obvious intervention against sarcopenia is exercise in the form of resistance training," said Professor Jean-Philippe Bonjour, co-author and Professor of Medicine at the Service of Bone Diseases, University of Geneva. "However, adequate nutritional intake and an optimal dietary acid-base balance are also very important elements of any strategy to preserve muscle mass and strength during ageing."
The review discusses and identifies the following important nutritional factors that have been shown to be beneficial to the maintenance of muscle mass and the treatment and prevention of sarcopenia:
Protein:
Protein intake plays an integral part in muscle health. The authors propose an intake of 1.0–1.2 g/kg of body weight per day as optimal for skeletal muscle and bone health in elderly people without severely impaired renal function.
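For illustration, the proposed range scales linearly with body weight. A minimal sketch of the arithmetic (the function name and rounding are ours, not part of the review):

```python
def daily_protein_range(weight_kg, low=1.0, high=1.2):
    """Recommended daily protein intake in grams, using the review's
    proposed 1.0-1.2 g per kg of body weight per day."""
    return (round(weight_kg * low, 1), round(weight_kg * high, 1))

# Example: a 70 kg older adult
low_g, high_g = daily_protein_range(70)
print(f"{low_g}-{high_g} g of protein per day")  # 70.0-84.0 g of protein per day
```

Note that the review proposes this range only for elderly people without severely impaired renal function.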
Vitamin D:
As many studies indicate a role for vitamin D in the development and preservation of muscle mass and function, adequate vitamin D should be ensured through exposure to sunlight and/or supplementation if required. Vitamin D supplementation in seniors, and especially in institutionalized elderly, is recommended for optimal musculoskeletal health.
Avoiding dietary acid loads:
Excess intake of acid-producing nutrients (meat and cereal grains) in combination with low intake of alkalizing fruits and vegetables may have negative effects on musculoskeletal health. Modifying the diet to include more fruits and vegetables is likely to benefit both bones and muscles.
Emerging evidence also suggests that vitamin B12 and/or folic acid play a role in improving muscle function and strength. The review also discusses non-nutritional interventions such as hormones, and calls for more studies to identify the potential of antioxidants and anti-inflammatory compounds in the prevention of sarcopenia.
Dr. Ambrish Mithal, co-author and Chair and Head of the Endocrinology and Diabetes division at Medanta, New Delhi, underlined the need for further research in the field. "Strategies to reduce the numbers of falls and fractures within our ageing populations must include measures to prevent sarcopenia. At present, the available evidence suggests that combining resistance training with optimal nutritional status has a synergistic effect in preventing and treating sarcopenia," said Mithal.
"We hope that further studies will shed light on other effective ways of preventing and treating this condition."
More information: Mithal A, Bonjour J-P, Boonen S, Burckhardt P, Degens H, El Hajj Fuleihan G, Josse R, Lips P, Morales Torres J, Rizzoli R, Yoshimura N, Wahl DA, Cooper C, Dawson-Hughes B, for the IOF CSA Nutrition Working Group. Impact of nutrition on muscle mass, strength, and performance in older adults. Osteoporos Int. DOI: 10.1007/s00198-012-2236-y
Provided by International Osteoporosis Foundation

Friday, January 18, 2013


Guided care provides better quality of care for chronically ill older adults

Patients who received Guided Care, a comprehensive form of primary care for older adults with chronic health problems, rated the quality of their care much higher than patients in regular primary care and used less home care, according to a study by researchers at Johns Hopkins University. In an article published online by the Journal of General Internal Medicine, the researchers report that, in a 32-month randomized controlled trial, Guided Care patients were 66 percent more likely than those in usual care to rate their access to telephone advice as excellent or very good and had 29 percent fewer home health care visits.
18 jan 2013--"As more practices move to a comprehensive care model, Guided Care's team care approach can help ensure better quality care and more satisfied patients," said Bruce Leff, MD, co-investigator of the study and professor with the Johns Hopkins School of Medicine and Johns Hopkins Bloomberg School of Public Health. "In addition, the nearly one-third reduction in home care use highlights how providing comprehensive care for high-risk patients can reduce health service utilization."
According to the study, Guided Care patients also experienced, on average, 13 percent fewer hospital re-admissions and 26 percent fewer days in skilled nursing facilities, although only the difference in home health care episodes was statistically significant. In earlier reports, physician satisfaction was higher and family caregiver strain was lower with Guided Care.
Guided Care is a model of proactive, comprehensive health care that can help primary care practices transform into patient-centered medical homes. Guided Care focuses on improving care for patients with multiple chronic health conditions. Guided Care teams include a registered nurse, two to five physicians, and other members of the office staff who work together to perform home-based assessments, create an evidence-based care guide and action plan, monitor and coach the patient monthly, coordinate the efforts of all the patient's healthcare providers, smooth transitions between sites of care, promote patient self-management, educate and support family caregivers, and facilitate access to community resources.
More information: "A Matched-Pair Cluster-Randomized Trial of Guided Care for High-Risk Older Patients," Journal of General Internal Medicine, 2013.
Provided by Johns Hopkins University Bloomberg School of Public Health

Thursday, January 17, 2013


For under-75s, living alone tied to higher mortality risk

17 jan 2013—For adults younger than 75 years of age, living alone is a significant predictor of all-cause mortality, according to a study published online Jan. 14 in JAMA Internal Medicine.
To examine the correlation between living alone and the risk of mortality among community-dwelling adults, Bamini Gopinath, Ph.D., of the University of Sydney in Australia, and colleagues assessed self-reported health status, using the 36-Item Short-Form Survey, in 3,486 participants (aged 49 years and older) from the Blue Mountains Eye Study (BMES) population-based cohort. Participants who reported living with nobody, or only with pets, were classified as living alone. Australian National Death Index data were used to confirm deaths.
The researchers found that, over 10 years of follow-up, 21.2 percent of participants died. For the overall cohort, living alone was not associated with total mortality after multivariable adjustment. Living alone was associated with a significantly increased risk of all-cause mortality (15.0 versus 11.4 percent; multivariate-adjusted hazard ratio, 1.36) in participants younger than 75 years. Living alone was not associated with total mortality or cardiovascular disease mortality in those aged 75 years or older.
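As a rough sanity check on those figures (our own back-of-envelope arithmetic, not the study's survival analysis), the crude risk ratio implied by the raw mortality percentages lands close to the adjusted hazard ratio:

```python
# Crude risk ratio from the raw mortality percentages for under-75s:
# 15.0% among those living alone vs 11.4% among those not living alone.
# The study's reported figure (hazard ratio 1.36) comes from a
# multivariate-adjusted survival model, so the two need not match exactly.
mortality_alone = 0.150
mortality_not_alone = 0.114

crude_risk_ratio = mortality_alone / mortality_not_alone
print(round(crude_risk_ratio, 2))  # 1.32, close to the adjusted 1.36
```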
"In the BMES, living alone was a significant predictor of all-cause mortality among those younger than 75 years, independent of self-perceived health status and socioeconomic and medical covariates," the authors write.

Wednesday, January 16, 2013


Major step toward an Alzheimer's vaccine

A team of researchers from Université Laval, CHU de Québec, and pharmaceutical firm GlaxoSmithKline (GSK) has discovered a way to stimulate the brain's natural defense mechanisms in people with Alzheimer's disease. This major breakthrough, details of which are presented today in an early online edition of the Proceedings of the National Academy of Sciences (PNAS), opens the door to the development of a treatment for Alzheimer's disease and a vaccine to prevent the illness.
16 jan 2013--One of the main characteristics of Alzheimer's disease is the production in the brain of a toxic molecule known as amyloid beta. Microglial cells, the nervous system's defenders, are unable to eliminate this substance, which forms deposits called senile plaques.
The team led by Dr. Serge Rivest, professor at Université Laval's Faculty of Medicine and researcher at the CHU de Québec research center, identified a molecule that stimulates the activity of the brain's immune cells. The molecule, known as MPL (monophosphoryl lipid A), has been used extensively as a vaccine adjuvant by GSK for many years, and its safety is well established.
In mice with Alzheimer's symptoms, weekly injections of MPL over a twelve-week period eliminated up to 80% of senile plaques. In addition, tests measuring the mice's ability to learn new tasks showed significant improvement in cognitive function over the same period.
The researchers see two potential uses for MPL. It could be administered by intramuscular injection to people with Alzheimer's disease to slow the progression of the illness. It could also be incorporated into a vaccine designed to stimulate the production of antibodies against amyloid beta. "The vaccine could be given to people who already have the disease to stimulate their natural immunity," said Serge Rivest. "It could also be administered as a preventive measure to people with risk factors for Alzheimer's disease."
"When our team started working on Alzheimer's disease a decade ago, our goal was to develop better treatment for Alzheimer's patients," explained Professor Rivest. "With the discovery announced today, I think we're close to our objective."
Provided by Laval University

Tuesday, January 15, 2013


Coca-Cola to address obesity for first time in ads (Update)

This undated image shows a frame grab from a new Coca-Cola commercial addressing obesity. (AP Photo/Coca-Cola)
Coca-Cola became one of the world's most powerful brands by equating its soft drinks with happiness. Now, for the first time, it's addressing a growing cloud over the industry: obesity.
15 jan 2013--The U.S.-based company on Monday will begin airing a two-minute ad during the highest-rated shows on CNN, Fox News and MSNBC in hopes of becoming a stronger voice in the debate over sodas and their impact on public health. The ad lays out Coca-Cola's record of providing drinks with fewer calories over the years and notes that weight gain is the result of consuming too many calories of any kind—not just soda.
Coca-Cola says the campaign will kick off a variety of moves that help address obesity in the year ahead, such as providing more diet options at soda fountains.
For the world's No. 1 beverage company, the ads reflect the mounting pressures on the broader industry. Later this year, New York City is set to put into effect a first-in-the-nation cap on the size of soft drinks sold at restaurants, movie theaters, sports arenas and other venues.
And when PepsiCo Inc., the No. 2 soda maker, recently signed a wide-ranging endorsement deal with pop singer Beyonce, critics called for the singer to drop the contract or donate the funds to groups that fund health initiatives.
Diana Garza Ciarlante, a spokeswoman for Coca-Cola Co., said the new ads aren't a reaction to any negative public sentiment but that the company felt it needed to address "the issue of the times."
Recent studies have suggested that sugary drinks cause people to gain weight, independent of other unhealthy behavior. A decades-long study involving more than 33,000 Americans suggested that drinking sugary beverages interacts with genes that affect weight, amplifying a person's risk of obesity beyond what they would be from heredity alone.
Mike Jacobson, executive director for the Center for Science in the Public Interest, was skeptical about the intent behind Coca-Cola's ads and said that if the company was serious about helping reduce obesity, it would stop fighting soda taxes.
"It looks like a page out of damage control 101," he said. "They're trying to disarm the public."
In the Coca-Cola ad, a narrator notes that obesity is an issue that "concerns all of us" but that people can make a difference when they "come together."
Another ad, which will run later this week during "American Idol" and before the Super Bowl football championship, is much more reminiscent of catchy, upbeat advertising people have come to expect from Coca-Cola. It features a montage of activities that add up to burning off the "140 happy calories" in a can of Coke: walking a dog, dancing, laughing with friends and doing a victory dance after bowling a strike.
Garza Ciarlante said the 30-second ad, a version of which ran in Brazil last month, is intended to address confusion about the number of calories in soda. She said the company's consumer research showed people thought there were as many as 900 calories in one can.
Coca-Cola notes it has already made several moves to help customers make smarter choices, such as putting calorie counts on the front of its cans and bottles in the U.S. Last year, it also started posting calorie information on its vending machines ahead of a regulation that will require soda companies to do so by 2014.
Public concern over calorie counts is apparent in Coca-Cola's business. In North America, all the growth in its soda business over the past 15 years has come from low- and no-calorie drinks, such as Coke Zero. Diet sodas now account for nearly a third of its sales in the U.S. and Canada. Other beverages, such as sports drinks and bottled water, are also fueling growth.
Even with the growing popularity of diet sodas, however, overall soda consumption in the U.S. has declined steadily since 1998, according to the industry tracker Beverage Digest.

Monday, January 14, 2013


Post-op mortality up for elderly with pre-heart op anxiety


14 Jan 2013—Few elderly patients about to undergo cardiac surgery experience high levels of anxiety, but for those who do, there is a five-fold higher risk of postoperative major morbidity or mortality, according to research published in the Jan. 1 issue of The American Journal of Cardiology.
To examine the correlation between patient-reported anxiety and post-cardiac surgery mortality and major morbidity, Judson B. Williams, M.D., M.H.S., of the Duke University Medical Center in Durham, N.C., and colleagues conducted a prospective, multicenter cohort study—Frailty Assessment Before Cardiac Surgery—involving 148 elderly patients (mean age, 75.8 years) who were undergoing coronary artery bypass surgery and/or valve repair or replacement.
The researchers found that 7 percent of patients reported high levels of preoperative anxiety, with no difference based on type of surgery or Society of Thoracic Surgeons predicted risk. Preoperative anxiety independently predicted postoperative mortality or major morbidity, after adjustment for potential confounding variables (odds ratio, 5.1).
"Significant levels of patient-reported preoperative anxiety independently predicted a greater risk of in-hospital mortality or major morbidity in elderly patients undergoing cardiac surgery," the authors write. "Importantly, because high levels of anxiety are potentially modifiable, identifying these patients could provide an opportunity to increase psychological comfort and improve the clinical outcomes in this high-risk group."

Sunday, January 13, 2013


Study shows cognitive benefit of lifelong bilingualism

Seniors who have spoken two languages since childhood are faster than single-language speakers at switching from one task to another, according to a study published in the January 9 issue of The Journal of Neuroscience. Compared to their monolingual peers, lifelong bilinguals also show different patterns of brain activity when making the switch, the study found.
13 jan 2013--The findings suggest the value of regular stimulating mental activity across the lifetime. As people age, cognitive flexibility—the ability to adapt to unfamiliar or unexpected circumstances—and related "executive" functions decline. Recent studies suggest lifelong bilingualism may reduce this decline—a boost that may stem from the experience of constantly switching between languages. However, how brain activity differs between older bilinguals and monolinguals was previously unclear.
In the current study, Brian T. Gold, PhD, and colleagues at the University of Kentucky College of Medicine, used functional magnetic resonance imaging (fMRI) to compare the brain activity of healthy bilingual seniors (ages 60-68) with that of healthy monolingual seniors as they completed a task that tested their cognitive flexibility. The researchers found that both groups performed the task accurately. However, bilingual seniors were faster at completing the task than their monolingual peers despite expending less energy in the frontal cortex—an area known to be involved in task switching.
"This study provides some of the first evidence of an association between a particular cognitively stimulating activity—in this case, speaking multiple languages on a daily basis—and brain function," said John L. Woodard, PhD, an aging expert from Wayne State University, who was not involved with the study. "The authors provide clear evidence of a different pattern of neural functioning in bilingual versus monolingual individuals."
The researchers also measured the brain activity of younger bilingual and monolingual adults while they performed the cognitive flexibility task.
Overall, the young adults were faster than the seniors at performing the task. Being bilingual did not affect task performance or brain activity in the young participants. In contrast, older bilinguals performed the task faster than their monolingual peers and expended less energy in the frontal parts of their brain.
"This suggests that bilingual seniors use their brains more efficiently than monolingual seniors," Gold said. "Together, these results suggest that lifelong bilingualism may exert its strongest benefits on the functioning of frontal brain regions in aging."
Provided by Society for Neuroscience

Saturday, January 12, 2013


Predicting mortality amongst older care home residents

The number of medications prescribed to a care home resident and the frequency of their contact with their GP are strong predictors of mortality in care homes, shows research published today in Age & Ageing, the scientific journal of the British Geriatrics Society.
12 jan 2013--Researchers from St George's, University of London investigated predictors of mortality in older care home residents in England and Wales as compared to older people living in the community. They followed 9,772 care home and 354,306 community residents aged 65 to 104 years in 293 English and Welsh general practices in 2009. Approximately a quarter (26.2 per cent) of care home residents died within one year compared to just over 3 per cent of community residents.
The study found that in both care homes and the community, age and gender were important determinants of mortality, although the effect of increasing age was markedly attenuated in care homes. A diagnosis of dementia predicted mortality in both care homes and the community but its effect on mortality in care homes was modest compared to the community, due to the higher overall mortality in care homes. A previous cancer diagnosis was the strongest diagnostic predictor of mortality in both the community and care homes.
However, high numbers of GP consultations and prescribed medications were strong predictors of mortality in both settings. The researchers measured how many different classes of medication people had been prescribed in the last three months and the number of times they had been in contact with their GP over the same period. Contact included in-person or telephone consultations with any GP practice clinical staff, whether with the individual themselves or with a representative. These measures of general practice use are likely indicators of frailty and clinical instability. Mortality was much higher in people receiving 11 or more medications compared with those receiving two or fewer, and in people who had six or more clinical contacts compared with none.
Lead author Dr Sunil Shah said: "Care home residents are a vulnerable group. It is important to understand mortality in care homes in order to ensure that residents receive appropriate elective, preventive and palliative care. Our study shows that conventional indicators, such as diagnosis and age, are less predictive of mortality in care homes but frequency of contact with GPs and medication use are indicators of future mortality. These measures often indicate frailty and unstable health. Our findings should help GPs have timely discussions with patients and their relatives about preferences for end of life care."
Provided by St. George's University of London

Thursday, January 10, 2013


Saliva gland test for Parkinson's shows promise

Described as a "big step forward" for research and treatment of Parkinson's disease, new research from Mayo Clinic in Arizona and Banner Sun Health Research Institute suggests that testing a portion of a person's saliva gland may be a way to diagnose the disease. The study was released today and will be presented at the American Academy of Neurology's annual meeting in San Diego in March.
11 jan 2013--"There is currently no diagnostic test for Parkinson's disease," says study author Charles Adler, M.D., Ph.D., a neurologist with Mayo Clinic in Arizona. "We have previously shown in autopsies of Parkinson's patients that the abnormal proteins associated with Parkinson's are consistently found in the submandibular saliva glands, found under the lower jaw. This is the first study demonstrating the value of testing a portion of the saliva gland to diagnose a living person with Parkinson's disease. Making a diagnosis in living patients is a big step forward in our effort to understand and better treat patients."
The study involved 15 people with an average age of 68 who had Parkinson's disease for an average of 12 years, responded to Parkinson's medication and did not have known saliva gland disorders. Biopsies were taken of two different saliva glands: the submandibular gland and the minor saliva glands in the lower lip. The surgical team was led by Michael Hinni, M.D., and David Lott, M.D., at Mayo Clinic in Arizona, and the biopsied tissues were tested for evidence of the abnormal Parkinson's protein by study co-author Thomas Beach, M.D., with Banner Sun Health Research Institute.
"This procedure will provide a much more accurate diagnosis of Parkinson's disease than what is now available," Dr. Beach says. "One of the greatest potential impacts of this finding is on clinical trials, as at the present time some patients entered into Parkinson's clinical trials do not necessarily have Parkinson's disease and this is a big impediment to testing new therapies."
The abnormal Parkinson's protein was detected in nine of the 11 patients who had enough tissue to study. While the data are still being analyzed, the rate of positive findings in the biopsies of the lower lip glands appears much lower than for the lower jaw gland.
"This study provides the first direct evidence for the use of submandibular gland biopsies as a diagnostic test for living patients with Parkinson's disease," says Dr. Adler. "This finding may be of great use when needing definitive proof of Parkinson's disease, especially when considering performing invasive procedures such as deep brain stimulation surgery or gene therapy."
Parkinson's disease is a progressive disorder of the nervous system that affects movement. It develops gradually, sometimes starting with a barely noticeable tremor in just one hand. But while tremor may be the best-known sign of Parkinson's, the disorder also commonly causes stiffness or slowing of movement. Currently, diagnosis is made based on medical history, a review of signs and symptoms, a neurological and physical examination, and by ruling out other conditions. Up to 30 percent of patients may be misdiagnosed early in the disease.
Although Parkinson's disease can't be cured, medications may markedly improve symptoms.
Provided by Mayo Clinic

First Alzheimer's case has full diagnosis 106 years later

10 jan 2013—More than a hundred years after Alois Alzheimer identified Alzheimer's disease in a patient, an analysis of that original patient's brain has revealed the genetic origin of her condition.
The brain specimen tested was discovered in a university basement late last century after a search by rival teams of academics.
"It is extremely satisfying to place this last piece in the medical puzzle that Auguste Deter, the first ever Alzheimer patient, presented us with," said Professor Manuel Graeber, from the University of Sydney.
"It is not only of historical interest, however, as it ends any speculation about whether the disease is correctly named after Alois Alzheimer. Alzheimer's ability to recognise this dementia more than a century ago provides compelling support for specialisation in medicine. Alzheimer was a founding father of neuropathology, an important medical specialty that is still underrepresented."
Professor Graeber, from the University's Brain and Mind Research Institute, Sydney Medical School and the Faculty of Health Sciences, collaborated with Professor Ulrich Müller's team from the Institute of Human Genetics of the University of Giessen in Germany to produce the molecular diagnosis recently published in Lancet Neurology.
For years scientists have been wondering whether the first case of Alzheimer's disease had a genetic cause. In 1901 Auguste Deter, a middle-aged female patient at the Frankfurt Asylum with unusual symptoms, including short-term memory loss, came to the attention of Dr Alzheimer. When she died, Dr Alzheimer examined her brain and described the distinctive damage indicating a form of presenile dementia.
For decades the more than 200 slides that Alzheimer prepared from Deter's brain were lost. Then in 1992, after Professor Graeber uncovered new information pointing to their location, two teams of medical researchers began a dramatic race to find them.
One team searched in Frankfurt, but it was a team headed by Professor Graeber, then working at the Max Planck Institute for Neurobiology, that finally located the material at the University of Munich in 1997.
Examination of the slides confirmed beyond doubt that Deter had been suffering from Alzheimer's disease, with large numbers of the amyloid plaques and neurofibrillary tangles that are hallmarks of the disease. Until now, a more sophisticated DNA analysis of the small amount of fragile material in single slides had not been possible.
Since their rediscovery, a significant number of brain slides have been under the official custodianship of Professor Graeber who has been at the University of Sydney since 2010. He is preparing a book on the material.
"We found a mutation whose ultimate effect is the formation of amyloid plaques. These plaques, which form between nerve cells and seem to suffocate them, are the key diagnostic landmark of the disease."
Alzheimer's disease represents one of the greatest health problems in industrialised societies today. An estimated 100 million dementia sufferers are predicted worldwide by 2050, the vast majority of whom will have Alzheimer's disease. 
Ninety-five percent of Alzheimer's patients develop the illness after they turn 65 (late onset). Five percent fall ill before that age (early onset), and Auguste Deter belongs to this group.
"We have revealed that Auguste Deter is one of those in which early onset of the disease is caused by mutation in a single gene," said Professor Graeber.
Provided by University of Sydney

Wednesday, January 09, 2013


Cancer screening unlikely to benefit patients with a short life expectancy

Breast and colorectal cancer screening should be targeted towards patients with a life expectancy greater than 10 years: for any shorter life expectancy the harms are likely to outweigh the benefits, concludes a study published in BMJ today.
09 Jan 2013--The authors stress that their results "should not be used to deny screening for patients with limited life expectancy" but "should inform decision making which aims to account for patient preferences and values while maximising benefits and minimising risks."
Guidelines recommend screening healthy older patients because complications from screening can harm patients immediately while the benefits of screening are not seen for many years.
What still remains unclear, however, is how long a patient needs to live to benefit from cancer screening. Previous trials have focused on the size of benefit rather than when those benefits occur.
Researchers from the University of California, San Francisco therefore analysed the results of five breast and four colorectal cancer screening trials focusing on patients aged over 50.
Their goal was to estimate the time-lag to benefit (the time between screening and when the benefits of screening are seen) to determine whether an individual patient is likely to benefit from screening.
The trials were published between 1986 and 2008 and ranged in size from just under 40,000 people to just over 150,000 people. Follow-up ranged from 8-20 years.
Results showed that at five years, an average 2.8 colorectal cancer deaths were prevented for every 10,000 people screened. This benefit steadily increased with longer follow-up, reaching 23 colorectal cancer deaths prevented for every 10,000 people screened at 15 years.
In absolute terms, it took an average of 4.8 years to prevent one colorectal cancer death for 5,000 people screened and 10.3 years to prevent one death for 1,000 people screened.
For breast cancer, at five years an average of 5.1 deaths were prevented for every 10,000 women screened. By 15 years, this mortality benefit had increased to 19 breast cancer deaths prevented for every 10,000 women screened.
In absolute terms, it took an average of three years to prevent one breast cancer death for 5,000 women screened and 10.7 years to prevent one death for 1,000 women screened.
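The per-10,000 figures above can be restated as a "number needed to screen" by taking the reciprocal of the absolute risk reduction. A small sketch (the helper function is ours, not the study's):

```python
# Illustrative arithmetic only: converting "deaths prevented per 10,000
# screened" (figures reported in the study) into a "number needed to
# screen" to prevent one death, i.e. the reciprocal of the absolute
# risk reduction.

def number_needed_to_screen(deaths_prevented_per_10k: float) -> float:
    """Return how many people must be screened to prevent one death."""
    return 10_000 / deaths_prevented_per_10k

# Colorectal cancer: 2.8 deaths prevented per 10,000 at 5 years,
# 23 per 10,000 at 15 years.
print(round(number_needed_to_screen(2.8)))  # ~3,571 screened per death prevented
print(round(number_needed_to_screen(23)))   # ~435

# Breast cancer: 5.1 per 10,000 at 5 years, 19 per 10,000 at 15 years.
print(round(number_needed_to_screen(5.1)))  # ~1,961
print(round(number_needed_to_screen(19)))   # ~526
```

The sharp drop in the number needed to screen between 5 and 15 years is the "time lag to benefit" the authors describe: early on, very many people must be screened to prevent a single death.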
However, the researchers say that in both colorectal and breast cancer, approximately one in ten people screened will have a false positive result and many more will be subject to possibly unnecessary treatment.
Based on these results, they suggest that patients with a life expectancy greater than ten years "should be encouraged to undergo colorectal and breast cancer screening" but patients whose life expectancy is 3-5 years "probably should be discouraged from screening since the potential risks likely outweigh the very small probability of benefit."
They conclude that incorporating time lag estimates into screening guidelines "would encourage a more explicit consideration of the risks and benefits of breast and colorectal cancer screening, likely resulting in a more individualised decision making process for the heterogeneous population of older adults."
More information: Time-lag to benefit after screening for breast and colorectal cancer: meta-analysis of survival data from the United States, Sweden, United Kingdom and Denmark, BMJ, 2013.
Provided by British Medical Journal

Tuesday, January 08, 2013


Expert suggests top four reasons why diets fail

The battle of the bulge is on—any movement on the scale yet? "Losing weight is one of the top resolutions made every year, yet only 20 percent of people achieve successful weight loss and maintenance," says Jessica Bartfield, MD, an internal medicine physician who specializes in nutrition and weight management at the Loyola Center for Metabolic Surgery & Bariatric Care.
08 Jan 2013--Despite the fact that two-thirds of Americans say they are on a diet to improve their health, very few are actually decreasing in size. "Dieting is a skill, much like riding a bicycle, and requires practice and good instruction," says Dr. Bartfield. "You're going to fall over and feel frustrated, but eventually you will succeed and it will get easier."
According to Dr. Bartfield, here are the top four reasons why many dieters fail to lose weight:
1. Underestimating Calories Consumed
"Most people (even experts!) underestimate the number of calories they eat per day. Writing down everything that you eat – including drinks and "bites" or "tastes" of food – can help increase self-awareness. Pay attention to serving sizes and use measuring cups and spoons as serving utensils to keep portions reasonable. Food eaten outside the home tends to come in much larger portions and be much higher in calories. Try to look up the nutrition information of your favorite take-out meal or restaurant and select a healthy meal before picking up the phone or going out to eat."
2. Overestimating Activity and Calories Burned
"Typically you need to cut 500 calories per day to lose 1 lb per week. This is very difficult to achieve through exercise alone, and would require 60 minutes or more of vigorous activity every day. A more attainable goal is to increase activity throughout the day and get a total of 30 minutes of moderate to vigorous exercise most days of the week. Buy a pedometer and track your steps, working up to a goal of 10,000 steps per day. But be careful: exercise is not an excuse to eat more!"
3. Poor Timing of Meals
"You need a steady stream of glucose throughout the day to maintain optimal energy and to prevent your metabolism from slowing down. Eat breakfast every day within one hour of waking up, then eat a healthy snack or meal every three to four hours. Try not to go longer than five hours without eating, to keep your metabolism steady."
4. Inadequate Sleep
"Studies have shown that people who get fewer than six hours of sleep have higher levels of ghrelin, a hormone that stimulates appetite, particularly for high-carbohydrate, high-calorie foods. In addition, less sleep raises levels of cortisol, a stress hormone, which can lead to weight gain."
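The rule of thumb quoted in point 2, cutting 500 calories per day to lose 1 lb per week, rests on the common approximation that a pound of body fat stores about 3,500 calories (a figure assumed here, not stated in the article). A quick check:

```python
# Rough check of the "500 calories/day = 1 lb/week" rule of thumb,
# using the widely quoted (and approximate) figure of 3,500 calories
# per pound of body fat.

CALORIES_PER_POUND = 3_500  # common approximation, not from the article

def weekly_loss_lb(daily_deficit_calories: float) -> float:
    """Estimated pounds lost per week for a given daily calorie deficit."""
    return daily_deficit_calories * 7 / CALORIES_PER_POUND

print(weekly_loss_lb(500))  # 1.0 lb/week, matching the rule of thumb
print(weekly_loss_lb(250))  # 0.5 lb/week for a more modest deficit
```

In practice the relationship is not this linear, since metabolism adapts as weight falls, which is one reason the article stresses tracking intake rather than relying on the formula alone.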
Dr. Bartfield regularly counsels patients through the Loyola Center for Metabolic Surgery & Bariatric Care, which offers surgical as well as non-surgical weight loss programs. "A registered dietitian, a behavioral psychologist, an exercise physiologist and a physician, plus a surgeon if appropriate, all partner one-on-one with patients," said Bartfield. "Good health practices are more than just learned; they become a regular habit and a way of life."
Provided by Loyola University Health System

Monday, January 07, 2013


Dopamine-receptor gene variant linked to human longevity


07 Jan 2013—A variant of a gene associated with active personality traits in humans seems to also be involved with living a longer life, UC Irvine and other researchers have found.
This derivative of a dopamine-receptor gene – called the DRD4 7R allele – appears in significantly higher rates in people more than 90 years old and is linked to lifespan increases in mouse studies.
Robert Moyzis, professor of biological chemistry at UC Irvine, and Dr. Nora Volkow, a psychiatrist who conducts research at the Brookhaven National Laboratory and also directs the National Institute on Drug Abuse, led a research effort that included data from the UC Irvine-led 90+ Study in Laguna Woods, Calif. Results appear online in The Journal of Neuroscience.
The variant gene is part of the dopamine system, which facilitates the transmission of signals among neurons and plays a major role in the brain network responsible for attention and reward-driven learning. The DRD4 7R allele blunts dopamine signaling, which enhances individuals' reactivity to their environment.
People who carry this variant gene, Moyzis said, seem to be more motivated to pursue social, intellectual and physical activities. The variant is also linked to attention-deficit/hyperactivity disorder and addictive and risky behaviors.
"While the genetic variant may not directly influence longevity," Moyzis said, "it is associated with personality traits that have been shown to be important for living a longer, healthier life. It's been well documented that the more you're involved with social and physical activities, the more likely you'll live longer. It could be as simple as that."
Numerous studies – including a number from the 90+ Study – have confirmed that being active is important for successful aging, and it may deter the advancement of neurodegenerative diseases, such as Alzheimer's.
Prior molecular evolutionary research led by Moyzis and Chuansheng Chen, UC Irvine professor of psychology & social behavior, indicated that this "longevity allele" was selected for during the nomadic out-of-Africa human exodus more than 30,000 years ago.
In the new study, the UC Irvine team analyzed genetic samples from 310 participants in the 90+ Study. This "oldest-old" population had a 66 percent increase in individuals carrying the variant relative to a control group of 2,902 people between the ages of 7 and 45. The presence of the variant also was strongly correlated with higher levels of physical activity.
Next, Volkow, neuroscientist Panayotis Thanos and their colleagues at the Brookhaven National Laboratory found that mice without the variant had a 7 percent to 9.7 percent decrease in lifespan compared with those possessing the gene, even when raised in an enriched environment.
While it's evident that the variant can contribute to longevity, Moyzis said further studies must take place to identify any immediate clinical benefits from the research. "However, it is clear that individuals with this gene variant are already more likely to be responding to the well-known medical adage to get more physical activity," he added.
Provided by University of California, Irvine

Sunday, January 06, 2013


When will genomic research translate into clinical care—and at what cost?

Genomic research is widely expected to transform medicine, but progress has been slower than expected. While critics argue that the genomics "promise" has been broken – and that money might be better spent elsewhere—proponents say the deliberate pace underscores the complexity of the relationship between medicine and disease and, indeed, argues for more funding.
06 jan 2013--But thus far, these competing narratives have been based mostly on anecdotes. Ramy Arnaout, MD, DPhil, a founding member of the Genomic Medicine Initiative at Beth Israel Deaconess Medical Center (BIDMC), decided it was time to look at genomics from a new perspective. So he turned to quantitative modeling, a numerical forecasting approach used to predict everything from weather events to the outcomes of political elections, and an extremely useful way to both set expectations and assist in decision-making.
Arnaout and colleagues knew that drug-related adverse outcomes cost the health-care system upwards of $80 billion a year, and that many such cases should be avoidable by choosing and dosing drug prescriptions according to a person's genome. So they developed a quantitative model to estimate how much time and money would be required to use genomics, specifically pharmacogenomics, to cut these adverse outcomes in half. Their findings, currently published online in the journal Clinical Chemistry, provide one of the first examples of data-driven estimates being applied to genomic medicine and offer a template for the use of quantitative modeling in this field.
How do the numbers add up? After analyzing their model for a range of situations, the research team found that the cost can be expected to be less than $10 billion, spread out over approximately 20 years.
"If you look across medicine, you can see specific places here and there where genomics is really starting to change things, but it's been hard to know how it all adds up in the big picture," explains Arnaout, who is also an Assistant Professor of Pathology at Harvard Medical School (HMS) and Associate Director of Clinical Microbiology at BIDMC. "Quantitative modeling is a standard approach for forecasting and setting expectations in many fields as we all remember from the recent presidential election and from the hurricane season. Genomics is so important and is so often on the minds of our patients, students and staff, that it seemed like a good idea to use modeling to get some hard numbers on where we're headed."
The idea for the study originated nearly two years ago, while Arnaout (whose laboratory studies genomics) and Vikas Sukhatme, BIDMC's Chief Academic Officer, were attending a lecture shortly after the 10-year anniversary of the sequencing of the human genome. "Vikas asked me, 'So when is genomics really going to change medicine?'" remembers Arnaout. "I realized I didn't know. And that got me thinking."
Arnaout and Sukhatme, together with coauthors Thomas Buck, MD, and Paulvalery Roulette, MD, of BIDMC and HMS, decided to try and answer this question by applying forecasting methods to a big clinical problem – drug-related adverse outcomes. "We know that preventable causes of these adverse outcomes—patients' non-adherence, interactions between multiple drugs, and medical error, for example—account for only a fraction of the millions of adverse outcomes that patients experience each year," explains Arnaout. "This leaves a significant number that are currently considered non-preventable and are thought to be caused by genomic variation."
By way of example, Arnaout explains that 30 million Americans currently use the blood-thinning drug warfarin. But because, in some cases, patients' genomes contain variants that make the standard dose of warfarin too high for them, these individuals are likely to experience bleeding, an extremely dangerous side effect. In fact, researchers now estimate that three-quarters of the variability in warfarin dosing requirement is due to these genomic variants, and they have already identified a set of variants in six specific genes that explain two-thirds of the variability.
"This kind of progress suggested an interesting thought experiment," says Arnaout. "What if we took existing examples in which there appears to be a carefully vetted, clinically useful connection between a specific adverse outcome and a specific genetic variant, found out how much it cost and how long it took to discover, and applied that model to all drugs? How much would it cost and how long would it take to cut adverse outcomes by 25 percent? How about by half?"
As data for the model, the authors selected eight associations involving six prescription drugs (clopidogrel, warfarin, escitalopram, carbamazepine, the nicotine-replacement patch and abacavir) and one drug class, the statin class of anticholesterol drugs.
Using an approach called Monte Carlo modeling, the team ran simulations to forecast the research investment required to learn how to cut adverse outcomes by meaningful amounts, and how long that research work would be expected to take. For statistical confidence, they ran their simulations thousands of times and explored a wide range of assumptions. "The results were surprising," says Arnaout. "Before we did this work, I couldn't have told you whether it would take a million dollars or a trillion dollars or whether it would take five years or a hundred years. But now, we've got a basis for thinking that we're looking at single-digit billions of dollars and a couple of decades. That may sound like a lot or a little, depending on your point of view. But with these numbers, we can now have a more informed conversation about planning for the future of genomic medicine."
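The authors' actual model is not spelled out here, but the Monte Carlo approach they describe can be illustrated with a toy version: repeatedly sample an uncertain per-drug research cost from an assumed distribution, sum over a set of drugs, and read off the spread of the simulated totals. The number of drugs, the lognormal distribution and its parameters below are all invented for this sketch; they are not the figures used in the study.

```python
# Toy Monte Carlo cost forecast, in the spirit of the modeling described
# above. All parameters and distributions are invented for illustration;
# they are NOT the figures used by Arnaout and colleagues.
import random
import statistics

random.seed(0)  # make runs reproducible

N_DRUGS = 50          # hypothetical number of commonly prescribed drugs studied
N_SIMULATIONS = 10_000

def simulate_total_cost_millions() -> float:
    """One simulated total research cost, in $ millions, summed over drugs."""
    # Lognormal per-drug cost: most projects are modest, a few very expensive.
    return sum(random.lognormvariate(4.0, 0.8) for _ in range(N_DRUGS))

totals = sorted(simulate_total_cost_millions() for _ in range(N_SIMULATIONS))

median = statistics.median(totals)
low = totals[int(0.05 * N_SIMULATIONS)]   # 5th percentile
high = totals[int(0.95 * N_SIMULATIONS)]  # 95th percentile
print(f"median ~${median / 1000:.1f}B, 90% interval ${low / 1000:.1f}B to ${high / 1000:.1f}B")
```

Reporting percentiles of many simulated totals, rather than a single point estimate, is what lets this kind of model express statistical confidence across a wide range of assumptions.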
The most important determinant of the numbers is the extent to which the examples used in the model will turn out to be representative of drugs as a whole. "It's a broad set of drugs that were used, but we know how the genome can surprise us," says senior author Sukhatme. "For example, you won't be able to use genomics to cut adverse outcomes in half if genomics turns out to explain less than half of the adverse outcomes. But even in that case, we found that pharmacogenomics will be able to make a significant dent in adverse outcomes – cutting them by a quarter – for multi-billion-dollar investments."
Also surprising, say the authors, was the timing. "As a rule, the fruits of research come only after research dollars have already been spent," points out Arnaout. This means that, in this case, hundreds of millions of dollars will be spent for "pump-priming" long before the public can expect to see any meaningful clinical impact. "It's one thing to say, 'Be patient,' based on just faith," he adds. "It's another to be able to say so based on data and a model. We now have that. This enables the conversation to shift to which indicators of progress to look for, over the five or so years of pump-priming, to make sure we're on track."
Can we go faster? "If we could enroll an ethnically diverse set of patients who are taking each of the 40 or 50 most commonly prescribed drugs, get their blood samples, and keep track of the adverse outcomes that some of them are bound to experience, we should be able to move faster, for less money," adds Arnaout, who describes this idea as a "50,000 Pharmacogenomes Project," a pursuit along the lines of the 1,000 Genomes Project, the UK10K project or the Veterans Affairs Million Veteran Program.
"This model provides the start of a provocative conversation and illustrates the value of quantitative modeling in this very practical and publicly relevant aspect of genomics," adds BIDMC Chief of Pathology Jeffrey Saffitz, MD, PhD. "Such models should help both decision makers and the public set expectations and priorities for translating genomic research into better patient care."
Provided by Beth Israel Deaconess Medical Center

Saturday, January 05, 2013


Why good resolutions about taking up a physical activity can be hard to keep

Physical inactivity is a major public health problem that has both social and neurobiological causes. According to the results of an Ipsos survey published on Monday 31 Dec., the French have put "taking up a sport" at the top of their list of good resolutions for 2013. However, Francis Chaouloff, research director at Inserm's NeuroCentre Magendie (Inserm Joint Research Unit 862, Université Bordeaux Ségalen), Sarah Dubreucq, a PhD student and François Georges, a CNRS research leader at the Interdisciplinary Institute for Neuroscience (CNRS/Université Bordeaux Ségalen) have just discovered the key role played by a protein, the CB1 cannabinoid receptor, during physical exercise. In their mouse studies, the researchers demonstrated that the location of this receptor in a part of the brain associated with motivation and reward systems controls the time for which an individual will carry out voluntary physical exercise. These results were published in the journal Biological Psychiatry.
05 jan 2013--The collective appraisal conducted by Inserm in 2008 highlighted the many preventive health benefits of regular physical activity. Such activity is limited, however, by our lifestyle in today's industrial society. While varying degrees of physical inactivity may be partly explained by social causes, they are also rooted in biology.
"The inability to experience pleasure during physical activity, which is often quoted as one explanation why people partially or completely drop out of physical exercise programmes, is a clear sign that the biology of the nervous system is involved", explains Francis Chaouloff.
But how exactly? The neurobiological mechanisms underlying physical inactivity had yet to be identified.
Francis Chaouloff (Giovanni Marsicano's team at the NeuroCentre Magendie; Inserm joint research unit, Université Bordeaux Ségalen) and his team have now begun to decipher these mechanisms. Their work clearly identifies the endogenous cannabinoid (or endocannabinoid) system as playing a decisive role, in particular one of its brain receptors. This is by no means the first time that data has pointed to interactions between the endocannabinoid system, which is the target of delta9-tetrahydrocannabinol (the active ingredient of cannabis), and physical exercise. It was discovered ten years ago that physical exercise activated the endocannabinoid system in trained sportsmen, but its exact role remained a mystery for many years. Three years ago, the same research team in Bordeaux observed that when given the opportunity to use a running wheel, mutant mice lacking the CB1 cannabinoid receptor, which is the principal receptor of the endocannabinoid system in the brain, ran for a shorter time and over shorter distances than healthy mice. The research published in Biological Psychiatry this month seeks to understand how, where and why the lack of CB1 receptor reduces voluntary exercise performance (by 20 to 30%) in mice allowed access to a running wheel three hours per day.
The researchers used various lines of mutant mice for the CB1 receptor, together with pharmacological tools. They began by demonstrating that the CB1 receptor controlling running performance is located at the GABAergic nerve endings. They went on to show that the receptor is located in the ventral tegmental area of the brain (see diagram below), which is an area involved in motivational processes relating to reward, whether the reward is natural (food, sex) or associated with the consumption of psychoactive substances.
Based on the results of this study and earlier work, the Bordeaux team suggests the following neurobiological explanation: at the beginning and for the duration of physical exercise, the CB1 receptor is constantly stimulated by the endocannabinoids, lipid molecules that naturally activate this receptor in response to pleasant stimuli (rewards) and unpleasant stimuli (stress). Endocannabinoid stimulation of the CB1 receptor during physical exercise inhibits the release of GABA, an inhibitory neurotransmitter that controls the activity of the dopamine neurons associated with motivation and reward. This stimulation of the CB1 receptor "inhibits inhibition"; in other words, it activates the dopaminergic neurons in the ventral tegmental area. Stimulation of the CB1 receptor is therefore required for exercise to be sustained: it supplies the motivation the animal needs to keep going.
Longitudinal section of the mouse brain (top) and diagram of interactions between the endocannabinoid, GABAergic and dopaminergic systems during voluntary physical exercise (bottom) ©Inserm/F. Chaouloff
Conversely, without these CB1 receptors, the "GABAergic brake" continues to act on the dopaminergic neurons in the ventral tegmental area, leading to the reduced performance levels observed above.
It is already known that CB1 receptors play a regulatory role in the motivation to consume rewards, whether natural or not. What is original about this research is that it shows that physical exercise can be added to the array of natural rewards regulated by the endocannabinoid system. "If confirmed, this motivational hypothesis would imply that the role played by the CB1 receptor has more to do with 'staying power' in the exercise than with actual physical performance levels," explain the researchers.
This work reveals that the endocannabinoid system plays a major role in physical exercise performance through its impact on motivational processes. It thus opens up new avenues of research into the mediators of pleasure – and even addiction – associated with regular physical exercise. "After endorphins, we now need to consider endocannabinoids as another potential mediator of the positive effects that physical exercise has on our mood," the researchers conclude.
More information: "Ventral Tegmental Area Cannabinoid Type-1 Receptors Control Voluntary Exercise Performance" Biological Psychiatry, 12 December 2012
Provided by Institut National de la Sante et de la Recherche Medicale