NHS Choices

High levels of tooth decay found in three-year-olds

NHS Choices - Behind the Headlines - Tue, 30/09/2014 - 12:20

"Tooth decay affects 12% of three-year-olds, says survey," BBC News reports. The survey, carried out by Public Health England, found big variations in different parts of the country. Experts believe sugary drinks are to blame for this trend.

The survey looked at the prevalence and severity of tooth decay in three-year-old children in 2013. This is the first time the dental health of this age group has been surveyed nationally. It found 12% of children surveyed had tooth decay – around one in eight children.

Tooth decay (also known as dental decay or dental caries) occurs when a sticky film of bacteria called plaque builds up on the teeth and produces acids that begin to break down the tooth's surface. A diet high in sugar encourages the build-up of plaque.

As it progresses, tooth decay can cause an infection of underlying gum tissue. This type of infection is known as a dental abscess and can be extremely painful.

 

Who produced the children's dental health report?

The survey and subsequent report were produced by Public Health England (PHE), part of the Department of Health. PHE's role is to protect and improve the nation's health and wellbeing, and to reduce health inequalities.

This survey of the prevalence and severity of tooth decay in three-year-olds was carried out to help identify which age groups interventions to reduce tooth decay should be aimed at.

 

What data did the report look at?

The report looked at the prevalence and severity of dental decay in three-year-old children in 2013. At three years of age most children have all 20 milk teeth (also known as primary teeth).

PHE randomly sampled children attending private and state-funded nurseries, as well as nursery classes attached to schools and playgroups. The children's teeth were examined to see if they had missing teeth, filled teeth or obvious signs of tooth decay.

 

What were the main findings of the report?

Of the 53,814 children included in the survey, 12% had dental decay. On average, these children had at least three teeth that were decayed, missing or filled.

Across all the children included in the survey, the average number of decayed, missing or filled teeth was 0.36 per child.
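
As a rough cross-check of our own (not a calculation from the report itself): with around 12% of children affected, each having at least three affected teeth on average, you would expect roughly 0.12 × 3 ≈ 0.36 decayed, missing or filled teeth per child across the whole sample, which is in line with the reported average.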

The report found a wide variation in the levels of decay experienced by three-year-old children living in different parts of the country. The four regions with the most dental decay were:

  • the East Midlands
  • the North West
  • London
  • Yorkshire and the Humber

 

What are the implications of the report?

Where there are high levels of tooth decay among three-year-olds, Public Health England wants earlier interventions to target this younger age group, rather than waiting until the age of five (when these interventions usually take place).

Where there are high levels of tooth decay found in the primary incisors (a condition known as early childhood caries), PHE wants local organisations to tackle problems related to infant feeding practices.

Early childhood caries are associated with young children being given sugar-sweetened drinks in a bottle – especially when these are given overnight or for long periods of the day.

Where tooth decay levels increase sharply between the ages of three and five, PHE wants local organisations to tackle this by helping parents reduce the amount and frequency of sugary food and drinks their children have, as well as increasing the availability of fluoride.

 

Conclusion

There are two important steps you can take to protect your children's teeth against tooth decay:

  • limit their consumption of sugar, especially sugary drinks
  • make sure they brush their teeth at least twice a day with fluoridated toothpaste

Sugar

Sugar causes tooth decay. Children who eat sweets every day have nearly twice as much decay as children who eat sweets less often.

This is caused not only by the amount of sugar in sweet food and drinks, but by how often the teeth are in contact with the sugar. This means sweet drinks in a bottle or feeder cup and lollipops are particularly damaging because they bathe the teeth in sugar for long periods of time. Acidic drinks such as fruit juice and squash can harm teeth, too.

Don't fall into the trap of thinking that a fruit juice advertised as "organic", "natural" or with "no added sugar" is inherently healthy. A standard 330ml carton of orange juice can contain almost as much sugar (30.4g) as a can of coke (around 39g).
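
To put those figures on a like-for-like basis (a back-of-the-envelope calculation of our own, assuming a standard 330ml can of cola): 30.4g of sugar in 330ml of juice works out at roughly 9g per 100ml, while 39g in 330ml of cola is roughly 12g per 100ml – so, volume for volume, the juice contains around three-quarters to four-fifths as much sugar as the cola.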

As Dr Sandra White, director of dental public health at PHE, points out: "Posh sugar is no better than any other sugar … our key advice for [children] under three is to just have water and milk."

Tooth brushing

A regular teeth cleaning routine is essential for good dental health. Follow these tips and you can help keep your kids' teeth decay free:

  • Start brushing your baby's teeth with fluoride toothpaste as soon as the first milk tooth breaks through (usually at around six months, but it can be earlier or later). It's important to use a fluoride paste as this helps prevent and control tooth decay.
  • Children under the age of three can use a smear of family toothpaste containing at least 1,000ppm (parts per million) fluoride. Toothpaste with less fluoride is not as effective at preventing decay.
  • Children between the ages of three and six should use a pea-sized blob of toothpaste containing 1,350 to 1,500ppm fluoride. Check the toothpaste packet for this information or ask your dentist.
  • Make sure your child doesn't eat or lick the toothpaste from the tube.
  • Brush your child's teeth for at least two minutes twice a day, once just before bedtime and at least one other time during the day.
  • Encourage them to spit out excess toothpaste, but not to rinse with lots of water. Rinsing with water after tooth brushing will wash away the fluoride and reduce its benefits.
  • Supervise tooth brushing until your child is seven or eight years old, either by brushing their teeth yourself or, if they brush their own teeth, by watching how they do it. From the age of seven or eight they should be able to brush their own teeth, but it's still a good idea to watch them now and again to make sure they brush properly and for the whole two minutes. 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Tooth decay affects 12% of three-year-olds, says survey. BBC News, September 30 2014

One in 8 three-year-olds has rotting teeth... and fruit juice is to blame: Parents warned organic drinks and smoothies can contain as much sugar as a glass of coke. Daily Mail, September 30 2014

Sugary drinks in baby bottles triggering rise in tooth extractions. The Guardian, September 30 2014

Fruit drinks fuelling tooth decay among under 3s. The Daily Telegraph, September 30 2014

Three-Year-Old Children Suffering Tooth Decay. Sky News, September 30 2014

1 in 8 children suffering from tooth decay 'due to too much sugar'. ITV News, September 30 2014

Shock figures reveal 1 in 8 three-year-olds have tooth decay. Daily Mirror, September 30 2014

Rotten truth: Tooth decay in under-3s is on the rise. Daily Express, September 30 2014


Deep-fried Mars bars: unhealthy, but no killer

NHS Choices - Behind the Headlines - Tue, 30/09/2014 - 11:50

“Eating a deep-fried Mars bar could give you a stroke in minutes,” reports the Metro.

However, the study that prompted this headline found no evidence that the Scottish snack can trigger a stroke within minutes.

Fans of deep-fried Mars bars actually have little to worry about in this regard, aside from the obvious risks of regularly consuming a meal full of sugar and saturated fats.

The overly alarmist headlines are based on the results of a small study involving 24 healthy participants, which looked at whether eating a deep-fried Mars bar could affect the body’s ability to respond to breath holding by increasing blood flow to the brain (termed “cerebrovascular reactivity”). Impaired cerebrovascular reactivity has been associated with stroke, but this latest study didn’t look at stroke as an outcome.

Importantly, it found no significant differences in cerebrovascular reactivity after eating either a deep-fried Mars bar or porridge.

When the researchers analysed men and women separately, they also found no significant differences in cerebrovascular reactivity after eating a deep-fried Mars bar or porridge in either men or women.

However, when the researchers compared men with women, they found a significant difference.

Common sense suggests that eating deep-fried Mars bars regularly is not good for your health. However, this study didn't find any evidence that a deep-fried Mars bar alone can trigger a stroke within minutes.

 

Where did the story come from?

The study was carried out by researchers from the University of Glasgow and the British Heart Foundation Cardiovascular Research Centre in Scotland, and was funded departmentally.

The study was published in the peer-reviewed Scottish Medical Journal.

The media coverage of this story was poor. The oft-repeated claim that the snack can trigger a stroke within minutes is entirely baseless. Obviously, if you are recovering from a stroke or told that you have risk factors for a stroke, then a deep-fried Mars bar would probably be bottom of the list of recommended foods. However, it seems unlikely that a single sugary snack would trigger a stroke.

Causes of stroke are usually a combination of interrelated risk factors, such as smoking, high blood pressure, high cholesterol, obesity and excessive alcohol consumption.

 

What kind of research was this?

This was a randomised crossover trial that aimed to determine whether eating deep-fried Mars bars impaired cerebrovascular reactivity in comparison to eating porridge.

Cerebrovascular reactivity is the change of blood flow in the brain in response to a stimulus. In this trial, the researchers looked at the change in blood flow after participants were asked to hold their breath for 30 seconds. Holding your breath should increase blood flow to the brain.

The researchers say that impaired change in brain blood flow following a stimulus is associated with an increased risk of ischaemic stroke (stroke caused by a lack of blood flow to the brain).

In this randomised crossover trial, all participants ate both a deep-fried Mars bar and porridge. Half the participants ate the Mars bar first, and half the participants ate the porridge first. Whether they ate the Mars bar first or the porridge first was randomised.

A randomised crossover trial is an appropriate study design to answer this sort of question.

 

What did the research involve?

The researchers studied 24 people, with an average age of around 21. Their body mass index (BMI) was within the healthy range (an average of 23.7).

After fasting for at least four hours, people were randomised to receive a deep-fried Mars bar or porridge.

The researchers looked at changes in blood flow in the brain after participants held their breath for 30 seconds, both before and 90 minutes after they ate either a deep-fried Mars bar or porridge.

They looked at blood flow using ultrasound.

Participants returned to receive the other foodstuff on a second visit at least 24 hours after the first.

The researchers compared the changes in blood flow after participants ate the deep-fried Mars bar and porridge.

 

What were the basic results?

Eating a deep-fried Mars bar produced a reduction in cerebrovascular reactivity compared to eating porridge, but this reduction was not statistically significant.

The researchers then looked at men and women separately (14 of the 24 people in the study were male). Changes in cerebrovascular reactivity were not significant in either men or women after they ate a deep-fried Mars bar or porridge.

The researchers then compared men with women. They found there was a significant difference in cerebrovascular reactivity after eating a deep-fried Mars bar compared to eating porridge, with a modest decrease of cerebrovascular reactivity in males. 

 

How did the researchers interpret the results?

The researchers concluded that, “Ingestion of a bolus of sugar and fat caused no overall difference in cerebrovascular reactivity, but there was a modest decrease in males. Impaired cerebrovascular reactivity is associated with increased stroke risk, and therefore deep-fried Mars bar ingestion may acutely contribute to cerebral hypoperfusion [decreased blood flow] in men.”

 

Conclusion

This study found no significant differences in cerebrovascular reactivity (the body’s ability to respond to breath holding by increasing blood flow to the brain) after eating either a deep-fried Mars bar or porridge.

When the researchers analysed men and women separately, they found no significant differences in cerebrovascular reactivity after eating a deep-fried Mars bar or porridge. However, when the researchers compared men with women, they found a significant difference, although whether there is any clinical significance to this finding is unclear.

The researchers point out that there are limitations to their study, including the fact they studied young, healthy individuals. It may be that there are differences in cerebrovascular reactivity in older patients at risk of stroke.

Confirming whether the risk is significant in this sub-group would be challenging, not least due to ethical considerations. Assigning people a diet you suspect could harm them would be a serious breach of medical ethics.

Common sense suggests that eating deep-fried Mars bars regularly will not be good for your health, as a diet high in saturated fats and sugar can increase the risk of cardiovascular diseases (diseases that affect the heart and blood vessels).

However, the very occasional late night “guilty pleasure” is highly unlikely to trigger a stroke.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Deep-fried Mars bar raises risk of having a stroke WITHIN MINUTES, experts claim. Daily Mirror, September 29 2014

Deep-fried Mars bars 'could trigger a stroke': Blood flow to the brain slows within minutes of eating the snack. Mail Online, September 29 2014

Eating a deep fried Mars bar could give you a stroke in minutes. Metro, September 29 2014

Links To Science

Dunn WG, Walters MR. A randomised crossover trial of the acute effects of a deep-fried Mars bar or porridge on the cerebral vasculature. Scottish Medical Journal. Published online September 22 2014


Will a 'wonder drug' be available in 10 years?

NHS Choices - Behind the Headlines - Mon, 29/09/2014 - 12:50

"Wonder drug to fight cancer and Alzheimer's disease within 10 years," is the headline in The Daily Telegraph.

This headline is a textbook example of hope (and hype) triumphing over reality, as the new "wonder drug" is neither available today nor inevitable in the future.

The headline was based on a study that provides new information about the role of a protein modification called N-myristoylation (NMT) in human cells and a mechanism that inhibits it.

The study's authors suggest NMT could be involved in the development and progression of a range of diseases, including cancer, diabetes and Alzheimer's disease.

Inhibiting the actions of NMT could help combat these diseases. But this remains to be seen: if true, this greater understanding may open up new avenues for medical research, which could ultimately lead to new treatments in the future.

While the results are both intriguing and promising, it is very difficult to predict the precise route or timing of future medical developments (drugs, treatments or therapies) based on early laboratory investigations.

Even if treatments based on NMT inhibition were developed and found to be effective, there is no guarantee they would also be safe or free from serious side effects.

All in all, the 10-year timeframe suggested by The Daily Telegraph should be taken with a pinch of salt.

 

Where did the story come from?

The study was carried out by researchers from Imperial College London and was funded by Cancer Research UK, the Biotechnology and Biological Sciences Research Council, the Engineering and Physical Sciences Research Council, the European Union, and the Medical Research Council.

It was published in the peer-reviewed journal Nature Communications.

While The Daily Telegraph's hyped-up headline was a little over the top, the coverage was accurate and balanced.

Optimistic quotes from the study authors, such as, "Eventually we hope this would simply be a pill you could take. It will be perhaps 10 years or so to a drug 'on the market' but there are many hurdles to get over", were counterbalanced with a note of realism from Cancer Research UK's senior science officer: "The next steps will be to develop this idea and make a drug – but there's a way to go before we'll know if it's safe and effective in people".

 

What kind of research was this?

This was a laboratory-based study looking at the structure and function of proteins in human cells.

Proteins are very important in human biology as they are involved in, or carry out, a huge range of biological tasks and processes.

This study looked at a specific chemical modification called N-myristoylation (NMT), which happens to some proteins as they are being made and after they have been made. This is a very common chemical modification of proteins, which in turn affects their function – a form of regulation.

The researchers say NMT has been implicated in the development and progression of a range of human diseases, including cancer, epilepsy, Alzheimer's disease, Noonan syndrome (a genetic condition that can disrupt the normal development of the body), and viral and bacterial infections.

 

What did the research involve?

The study used laboratory-grown human cells to study all the characteristics of the NMT process. This was achieved by identifying all the proteins undergoing the NMT process and finding out what these chemically tagged proteins did inside the cells, what processes they were involved in, other chemicals they interacted with, and whether the protein NMT process could be stopped (inhibited).

The group studied cells in the laboratory during normal cell function and apoptosis – the natural process in which a cell self-destructs in an ordered way, also known as programmed cell death. Apoptosis is often inhibited in cancer cells, causing them to grow indefinitely and not die.

 

What were the basic results?

The researchers' findings include:

  • Identifying more than 100 NMT proteins present in the human cells studied.
  • Identifying more than 95 of these proteins for the first time.
  • Quantifying the effect of inhibiting the NMT process across more than 70 chemicals (substrates) simultaneously. This showed which chemicals the NMT proteins were interacting with inside the cells.
  • Finding a way to inhibit the NMT process by inhibiting the main enzyme responsible for the chemical modification, called N-myristoyltransferase.

 

How did the researchers interpret the results?

The research team said: "Numerous important pathways involve proteins that are shown here for the first time to be co- or post-translationally N-myristoylated [N-myristoylated during or after their formation]."

Commenting on the wider implications of their research, they said: "These data indicate many potential novel roles for myristoylation that merit future investigation in both basal cell function and apoptosis, with significant implications for basic biology, and for drug development targeting NMT [N-myristoyltransferase]."

 

Conclusion

This laboratory protein study has provided new information about the role of protein N-myristoylation in human cells and a mechanism to inhibit it. The findings suggest proteins undergoing N-myristoylation are involved in many key biological processes and tasks.

If the researchers' assumption that protein N-myristoylation is implicated in the development and progression of a range of diseases holds true, this greater understanding may open up new avenues for medical research, which could ultimately lead to new treatments in the future.

However, it is very difficult to predict the precise route or timing of future medical developments (drugs, treatments or therapies) based on early findings.

The study's authors struck a balance of optimism and realism when quoted in The Daily Telegraph.

They first said their findings could lead to new treatments in the future and that, "Eventually we hope this would simply be a pill you could take. It will be perhaps 10 years or so to a drug on the market". Balancing this out, they also said: "There are many hurdles to get over".

This study represents one of the first steps on the road to new drug discovery, so the exact path ahead is unclear.

But despite these promising early findings, there are no sure bets in drug development – the history of medicine is full of initially encouraging avenues of research that ended up leading to dead ends. 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Wonder drug to fight cancer and Alzheimer's disease within 10 years. The Daily Telegraph, September 27 2014

Links To Science

Thinon E, Serwa RA, Broncel M, et al. Global profiling of co- and post-translationally N-myristoylated proteomes in human cells. Nature Communications. Published online September 26 2014


Cherry juice touted as treatment for gout

NHS Choices - Behind the Headlines - Mon, 29/09/2014 - 11:40

“Daily drinks of cherry juice concentrate could help thousands of patients beat gout,” the Mail on Sunday reports.

This headline is based on a small study that found drinking tart cherry juice twice a day temporarily lowered the blood uric acid levels of 12 young healthy volunteers for up to eight hours after they consumed the drink. This is of potential interest, as high levels of uric acid can lead to crystals forming inside joints, which triggers the onset of the painful condition gout.

Somewhat puzzlingly, the study recruited healthy young volunteers who didn’t have gout. A more relevant study design would have included people with a history of gout, to see what effect, if any, cherry juice had on them.

So, based on this study alone, we cannot say that drinking cherry juice helps prevent the onset of gout, or the recurrence of gout in those who have had it before. It is not clear whether reductions in uric acid of the magnitude found in this study would be sufficient to prevent gout or relieve gout symptoms.

The Mail on Sunday’s assertion that “now doctors say drinking cherry juice daily could help beat the condition” is not backed up by this research alone, nor is health advice on gout from health professionals likely to change based on this small study.

 

Where did the story come from?

The study was carried out by researchers from the UK and South Africa, and was part funded by Northumbria University and the Cherry Marketing Institute. The latter is a non-profit organisation, funded by cherry growers, with a brief to promote the alleged health benefits of tart cherries.

This obviously represents a potential conflict of interest, although the research paper says, “The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.”

The study was published in the peer-reviewed Journal of Functional Foods.

The Mail on Sunday’s coverage over-extrapolates the findings of this small study involving healthy people, not gout sufferers. While it is plausible that cherry juice may be of some benefit to people affected by gout, this is currently unproven.

 

What kind of research was this?

This was a single-blind randomised crossover study testing the effects of two doses of cherry juice on levels of uric acid (urate) in the body.

The researchers say that nutritional research has increasingly focused on the use of foods for improving human health, and particular attention has been paid to foods containing high concentrations of anthocyanins – such as tart cherries.

Gout is a type of arthritis, where crystals of sodium urate form inside and around joints. The most common symptom is sudden and severe pain in a joint, along with swelling and redness. The joint of the big toe is commonly affected, but it can develop in any joint. Symptoms develop rapidly and are at their worst in just six to 24 hours. Symptoms usually last for three to 10 days (this is sometimes known as a gout attack). After this time, the joint will start to feel and look normal again, and the pain of the attack should disappear completely. Almost everyone with gout will have further attacks in the future.

People with gout usually have higher than normal urate levels in their blood, but the reasons for this may vary; for example, some people may produce too much urate, while in others the kidneys may not be so effective at filtering out urate from the bloodstream. The condition may run in families.

This study did not include people with gout; it only looked at the concentration of sodium urate (uric acid) in the blood of healthy young people who had neither gout nor levels of sodium urate high enough to suggest they would develop gout in the near future. Hence, it does not provide good evidence that the cherry juice would relieve gout symptoms or prevent their recurrence.

A randomised controlled trial including people with gout, or people more likely to develop gout (such as older men with a family history), would be required to give us better evidence on the issue.

 

What did the research involve?

The research took 12 healthy volunteers (average age of 26 years, 11 of whom were male) and gave them two different volumes (30ml and 60ml) of concentrated cherry juice mixed with water, to see what effect this had on measures of uric acid activity and inflammation up to 48 hours later – both of which are biological measures indirectly related to gout.

None of the volunteers actually had a history of gout.

In an effort to reduce other dietary sources of anthocyanins (outside of that gained from the cherry juice), participants were requested to follow a low-polyphenolic diet by avoiding fruits, vegetables, tea, coffee, alcohol, chocolate, cereals, wholemeal bread, grains and spices for 48 hours prior to, and throughout each arm of the trial. Food diaries were completed for 48 hours before, and throughout, the testing phase to assess the diet for compliance.

Participants were required to attend the start of each phase of the study at 9am, following a 10-hour overnight fast, to account for diurnal variation. Each phase comprised two days of supplementation with cherry concentrate. One supplement was taken immediately after a morning blood and urine sample, and a second was consumed before each evening meal.

Multiple supplements were administered to identify any cumulative effects. The length of the supplementation phase (48 hours) was chosen due to the short period of time in which anthocyanins are metabolised.

 

What were the basic results?

The main results were as follows:

  • Blood urate (uric acid) concentrations in the volunteers went down by about the same amount for both the low and high doses of cherry juice, from around 500 micromol per litre at the start to around 300 micromol per litre after eight hours. The concentrations at the 24-hour and 48-hour time points appeared to have increased back up to around 400 micromol per litre.
  • The amounts of urate (uric acid) removed from the body via urine increased, peaking at two to three hours. The amount excreted then dipped but remained broadly above the starting level up to 48 hours.
  • Levels of a general blood inflammatory marker (high sensitivity C-reactive protein; hsCRP) decreased. 
  • There was no clear dose effect between the cherry concentrate and the biological findings.

 

How did the researchers interpret the results?

The researchers said, “These data show that MC [Montmorency tart cherry concentrate] impacts upon the activity of uric acid and lowers hsCRP, previously proposed to be useful in managing conditions such as gouty arthritis; the findings suggest that changes in the observed variables are independent of the dose provided.”

They also said, “these results provide rationale for the use of Montmorency cherry concentrate as an adjuvant therapy to NSAIDs [non-steroidal anti-inflammatory drugs] in the treatment of gouty arthritis [gout].”

 

Conclusion

This small study found that drinking tart cherry juice twice a day temporarily lowered the blood uric acid levels of 12 young healthy volunteers without gout, up to eight hours after they consumed the drink. The levels began to increase back to the starting levels after 24-48 hours. The researchers and media extrapolated this finding to mean that the drink may be useful for gout, which is caused by an excess accumulation of uric acid crystals.

Based on this study alone, we cannot say that drinking cherry juice helps prevent the onset of gout, or the recurrence of gout in those who have had it before. The study did not test the effect of the juice on people with gout, or those likely to get gout in the future, so is only indirectly relevant to these groups. For example, it is not clear whether reductions in uric acid of the magnitude found in this study would be sufficient to prevent or treat gout in people with a propensity for high uric acid levels in the body (for whatever reason).

Furthermore, there may have been other dietary factors contributing or interacting with the cherry juice compounds that could account for the changes observed. Hence, cherry juice might not be the sole cause of the effects seen.

The Mail on Sunday carried a useful quote from a UK Gout Society spokesman, who said that while Montmorency cherries (the variety used in the study) could help reduce uric acid levels in the body, “people with gout should go to their GP because it can be linked to other conditions such as stroke and psoriasis”.

We find no evidence to support the Mail’s comments that “Now doctors say drinking cherry juice daily could help beat [the] condition”.

For the reasons above, this study alone provides weak evidence that concentrated cherry juice might help those with gout. The media have somewhat overhyped the significance of the findings, which are underdeveloped and tentative. The hype would be justified if a more robust study of people with gout had been undertaken. 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Fighting the agony of gout - with a daily glass of cherry juice. Mail on Sunday, September 28 2014

Links To Science

Bell PG, Gaze DC, Davison GW, et al. Montmorency tart cherry (Prunus cerasus L.) concentrate lowers uric acid, independent of plasma cyanidin-3-O-glucosiderutinoside. Journal of Functional Foods. Published online September 27 2014


Could curry spice boost brain cell repair?

NHS Choices - Behind the Headlines - Fri, 26/09/2014 - 12:30

“Spicy diet can beat dementia,” is the unsupported claim in the Daily Express. Researchers found that the spice turmeric stimulated the growth of neural stem cells in rats, though this is a long way from an effective dementia treatment for humans.

This was laboratory and animal research investigating the effect of a turmeric extract (aromatic turmerone) on neural stem cells (NSCs). NSCs have some ability to regenerate brain cells after damage, but usually not the damage caused by degenerative brain diseases such as Alzheimer’s disease.

The study found that when the turmeric extracts were either directly cultured with NSCs in the laboratory (in vitro) or when they were directly injected into the brains of live rats (in vivo), the extracts increased the growth and development of the stem cells.

However, this research is in the very early stages. We don’t know whether this apparent increase in stem cells would have any effect on repairing brain damage in rats with degenerative brain diseases, let alone humans with these conditions. We certainly don’t know that eating turmeric, or other spices, would have any effect on the brain’s powers of regeneration.

Though the researchers hope these findings may pave the way towards new treatments for degenerative brain conditions, this is likely to be a long way off.

 

Where did the story come from?

The study was carried out by researchers from the Institute of Neuroscience and Medicine at Research Centre Juelich and the University Hospital of Cologne, both in Germany. The study was supported by the Koeln Fortune Program/Faculty of Medicine, University of Cologne and the EU FP7 project “NeuroFGL”.

The study was published in the peer-reviewed journal Stem Cell Research and Therapy on an open access basis, so it is free to read online.

The quality of the Daily Express’s and the Mail Online’s reporting of the study is poor. Both sources claim that eating curries can "beat dementia". These claims are entirely unproven and are at best sensationalist, and at worst cruel for giving people false hope.

BBC News and ITV News’ coverage takes a more appropriate tone, pointing out that any potential human application at this stage is entirely hypothetical.

 

What kind of research was this?

This was an animal and laboratory study, which aimed to investigate the effect of aromatic (ar-) turmerone on brain stem cells.

Ar-turmerone and curcumin are active compounds of the herb Curcuma longa, or turmeric as it is more commonly known. Many studies (such as a study we covered in 2012) have suggested that curcumin has anti-inflammatory effects and may have a protective effect on brain cells, though the effects of ar-turmerone are yet to be examined.

Neural stem cells (NSCs) have some ability to regenerate brain cells that have been destroyed or damaged, but usually are insufficient to repair the damage caused by degenerative brain diseases (such as Alzheimer’s) or stroke.

This research aimed to investigate the effects of ar-turmerone on NSCs in brain cells in the laboratory and in live rats.

 

What did the research involve?

In the first part of the research, NSCs were obtained from the brains of rat foetuses and cultured in the laboratory. Ar-turmerone was added to the cultures at various concentrations and studied for a number of days to look at the rate of stem cell proliferation. 

In the second part of the research, a group of male rats were anaesthetised. Three then received an injection of ar-turmerone into the brain; six were injected with an equal volume of salt water. After recovery from the anaesthetic, the animals were put into cages and given free access to food and water as normal.

For five days following the surgical procedure, the animals were injected with a tracer (bromodeoxyuridine), which is taken up by replicating cells. Seven days after the surgery, the rats were scanned with a positron emission tomography (PET) scanner, which detects the tracer and produces 3-D images demonstrating active cell division in the tissues.

After death, the brains of the rats were examined in the laboratory to look at how ar-turmerone had affected brain structure. 

 

What were the basic results?

In the laboratory, the researchers found that ar-turmerone increased the number of neural stem cells. Higher concentrations of ar-turmerone caused greater increases in NSC proliferation.

In the rats, they also found that injection of ar-turmerone into the brain promoted the proliferation of NSCs and differentiation into different brain cell types. This was evident on both PET scanning and autopsy examination of the brain after death.

 

How did the researchers interpret the results?

The researchers conclude that both in the laboratory and in live animals, ar-turmerone causes the proliferation of nerve stem cells. They suggest that “ar-turmerone thus constitutes a promising candidate to support regeneration in neurologic disease”.

 

Conclusion

This laboratory and animal research has found that an extract from turmeric (aromatic turmerone) seems to increase the growth and differentiation of neural stem cells (NSCs).

However, this research is in the very early stages. So far, the extract has only been added to brain stem cells in the laboratory, or directly injected into the brains of only three rats. Though NSCs have some ability to regenerate brain cells after damage, this is usually not enough to have an effect in degenerative brain diseases such as Alzheimer’s.

The hope is that by boosting the number of NSCs, they could be more effective at repairing damage in these conditions. This study has not investigated whether the observed effects would make any meaningful functional differences in rats with degenerative brain diseases, never mind humans with these conditions.

As the researchers further caution, there are various issues to be considered when contemplating the possibility of any trials in humans. For example, it is recognised that causing the increased rate of growth and differentiation of NSCs carries some risk of cancerous change. Also, the route of administration used here in the rats – direct injection into the brain – would be likely to carry far too much risk and may not be possible in humans. We certainly don’t know whether taking turmeric extracts by mouth – or just by eating a spicy diet as the Express headline suggests – would have any effect on the brain’s powers of regeneration.

Though the researchers hope these findings may pave the way towards new treatments for degenerative brain conditions, this is likely to be a long way off.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Spicy diet can beat dementia: Breakthrough in fight to cure cruel disease. Daily Express, September 26 2014

Brain repair 'may be boosted by curry spice'. BBC News, September 26 2014

Eating a curry 'can help beat dementia': Ingredient found in turmeric may hold key to repairing brains of people with condition. Mail Online, September 26 2014

Tumeric 'link' to brain cell repair. ITV News, September 26 2014

 

Links To Science

Hucklenbroich K, Klein R, Neumaier B, et al. Aromatic-turmerone induces neural stem cell proliferation in vitro and in vivo. Stem Cell Research and Therapy. Published online September 26 2014


Antibiotic treatments 'fail' 15% of the time

NHS Choices - Behind the Headlines - Fri, 26/09/2014 - 12:10

“Antibiotic treatments from GPs 'fail 15% of the time’,'' BBC News reports. In one of the largest studies of its kind, researchers estimated that just under one in seven antibiotic prescriptions in 2011 "failed".

This study examined the failure rates of antibiotics prescribed by GPs in the UK for common infections over a 21-year period – from 1991 to 2012. Most of the failures (94%) were cases where a different antibiotic needed to be prescribed within 30 days, suggesting that the first antibiotic had not worked.

In general, the overall failure rate remained fairly static over the period studied, rising only from 13.9% in 1991 to 15.4% by 2012.

When considering specific types of infection in combination with specific classes of antibiotics, there were notable changes in failure rates. For example, when the antibiotic trimethoprim was prescribed for an upper respiratory tract infection, failure rates increased from 25% in 1991 to 56% in 2012. Reassuringly, failure rates with commonly prescribed antibiotics (such as amoxicillin) currently remain fairly low.

The study did not look at the reasons for antibiotic failure, but one reason could be antibiotic resistance – an increasing problem worldwide.

If you are prescribed an antibiotic, you can increase the chances of it working and decrease the risk of antibiotic resistance by ensuring that you take the full course as prescribed by your GP, even when you start to feel better.

 

Where did the story come from?

The study was carried out by researchers from Cardiff and Oxford universities, and Abbott Healthcare Products in the Netherlands, who also funded the study.

The study was published in the peer-reviewed British Medical Journal (BMJ) on an open access basis, so it is free to read online.

While the overall reporting by the UK media was broadly accurate, many of the headlines were not.

The Daily Telegraph claimed that “Up to half of antibiotics 'fail due to superbugs'”. 

We don't actually know the reason for needing another antibiotic prescription, as this was not examined in this study. Therefore, we don't know that any of these apparent antibiotic failures were due to “superbugs” as no laboratory data was available.

The Daily Mail claims that, “Now one in seven patients cannot be cured using antibiotics”, which is also not correct. It could well be the case that many patients were cured through the use of alternative antibiotics.

 

What kind of research was this?

This study examined the failure rates of antibiotics prescribed by general practices in the UK over a 21-year period – from 1991 to 2012. Antibiotic resistance is a problem that has been increasing over the past few decades. As the World Health Organization (WHO) has declared, this is becoming a worldwide public health crisis, as previously effective antibiotics become ineffective at treating certain infections. Though many people may think of antibiotic resistance as a problem predominantly found in hospital care (e.g. patients becoming ill with resistant “superbugs”), resistant bugs are just as much a problem in the community. As the researchers say, recent antibiotic treatment in primary care puts a person at risk of developing an infection that is resistant to antibiotics.

This study used a large general practice database to assess the failure of first-line (initial) antibiotic treatments prescribed in the UK over a 21-year period, alongside looking at general antibiotic prescription patterns.

 

What did the research involve?

This study used the UK Clinical Practice Research Datalink (CPRD) – an anonymised database collecting data from more than 14 million people attending almost 700 general practices in the UK. The database contains well-documented medical records and information on prescriptions, and these were examined between 1991 and 2012.

The researchers decided to look at antibiotics prescribed for four common classes of infection:

  • upper respiratory tract infections (e.g. sore throats, tonsillitis, sinusitis)
  • lower respiratory tract infections (e.g. pneumonia)
  • skin and soft tissue infections (e.g. cellulitis, impetigo)
  • acute ear infection (otitis media)

They looked at whether these infections had received treatment with a course of a single antibiotic (termed monotherapy, rather than two antibiotics in combination, for example). An antibiotic was considered as the first-line treatment if there had been no prescriptions for other antibiotics in the preceding 30 days.

They assessed the proportion of antibiotic courses resulting in treatment failure. As the researchers say, there is no specific definition of treatment failure, but based on previous research findings they considered treatment failure as:

  • prescription of a different antibiotic within 30 days of the first antibiotic prescription
  • GP record of admission to hospital with an infection-related diagnosis within 30 days of prescription
  • GP referral to an infection-related specialist service within 30 days of prescription
  • GP record of an emergency department visit within three days of prescription (the shorter time window being selected to increase the probability that the emergency was related to the infection, rather than another cause)
  • GP record of death with an infection-related diagnostic code within 30 days of prescription

For each year, from 1991 to 2012, the researchers determined antibiotic treatment failure rates for the four infection classes and overall.

 

What were the basic results?

The database contained records of almost 60 million antibiotic prescriptions prescribed to more than 8 million people.

Almost 11 million prescriptions were first-line single antibiotic treatments for the four groups of infection being studied: 39% for upper and 29% for lower respiratory tract infections, 23% for skin and soft tissue infections, and 9% for ear infections.

Overall, GP consultation rates for the four common infection groups decreased over time, but the number of consultations for which an antibiotic was prescribed marginally increased: 63.9% of consultations in 1991 and 65.6% in 2012. Across the whole 21 years, the proportion of consultations where an antibiotic was prescribed was 64.3%. However, within infection groups, there were more significant changes: prescriptions for lower respiratory tract infections decreased (59% in 1991 to 55% in 2012) while those for ear infection went up considerably (63% in 1991 to 83% in 2012).

The most commonly prescribed antibiotic was amoxicillin (42% of all prescriptions), and most upper respiratory tract infections were treated with it.

Most antibiotic treatment failures (94.4%) were cases where an alternative antibiotic had been prescribed within 30 days of treatment.

The overall antibiotic treatment failure rate for the four infection classes was 14.7%. The rate was 13.9% in 1991 and 15.4% in 2012, but there was not a clear linear increase in the rate over the time period. For each year, the highest failure rates were seen for lower respiratory tract infections (17% in 1991 and 21% in 2012).
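
For context, some arithmetic of our own (not a figure reported in the study): the rise from 13.9% to 15.4% is an increase of 1.5 percentage points, or a relative increase of roughly 11%, since (15.4 − 13.9) ÷ 13.9 ≈ 0.11.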

Within the infection classes, individual antibiotics were associated with different failure rates. There were some particularly high rates of failure. For example, when the antibiotic trimethoprim (most often prescribed for urine infections) was prescribed for an upper respiratory tract infection, it failed 37% of the time overall, increasing from 25% in 1991 to 56% in 2012. For lower respiratory tract infections, failure rates were highest for a group of broad-spectrum antibiotics called cephalosporins (including antibiotics like cefotaxime and cefuroxime), with failure rates increasing from 22% in 1991 to 31% in 2012. 

In 2012, despite its high prescription rate for upper respiratory tract infections, amoxicillin had quite a low failure rate (12.2%).

 

How did the researchers interpret the results?

The researchers conclude that, “From 1991 to 2012, more than one in 10 first-line antibiotic monotherapies for the selected infections were associated with treatment failure. Overall failure rates increased over this period, with most of the increase occurring in more recent years, when antibiotic prescribing in primary care plateaued and then increased”.

 

Conclusion

Overall, this is a highly informative study of GP antibiotic prescribing for common infections in the UK. The overall antibiotic treatment failure rate was 15% over the course of the study period; these were mainly cases where there was a need to prescribe a different antibiotic within 30 days. There was a slight increase in failure rate, from 13.9% in 1991 to 15.4% in 2012. Within the infection classes, particular antibiotics had notable changes in failure rates, while others remained fairly stable. Reassuringly, amoxicillin and other commonly prescribed antibiotics currently still have fairly low failure rates.

However, despite this study using a wealth of data from a reliable GP database, there are some limitations to bear in mind.

Importantly, as the researchers say, there was no specific definition of treatment failure for them to use, so they had to use various proxy measures. They had no laboratory data available on the resistance of organisms to different antibiotics, so the study is not able to say definitively that antibiotic resistance was the reason for treatment failure. The most common indication of “treatment failure” in this study was the need for a prescription of another antibiotic within 30 days, but this does not necessarily mean the organism was resistant to the first antibiotic – for example, the person may not have taken the full prescribed treatment course, or the antibiotic may not have been appropriate for the type of bacteria the person had.

There is also the possibility of incorrect coding within the database, or the antibiotic not being prescribed for the indication that it was assumed to be.

However, antibiotic resistance is an increasing global problem, and is likely to have contributed to the failure rates. As a patient, it is important to be aware that many common respiratory infections can be self-limiting viral infections that do not need an antibiotic. If you are prescribed an antibiotic, you can help decrease the risk of the bug developing resistance to it by ensuring that you take the full course as prescribed by your GP, even when you start to feel better.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Antibiotic treatments from GPs 'fail 15% of the time'. BBC News, September 26 2014

Up to half of antibiotics 'fail due to superbugs' study finds. The Daily Telegraph, September 26 2014

Now one in seven patients cannot be cured using antibiotics after they were handed out too freely by GPs. Mail Online, September 26 2014

Research shows 1 in 10 antibiotic prescriptions 'fail'. ITV News, September 26 2014

Links To Science

Currie CJ, Berni E, Jenkins-Jones S, et al. Antibiotic treatment failure in four common infections in UK primary care 1991-2012: longitudinal analysis. BMJ. Published online September 23 2014


Skirt size increase ups breast cancer risk

NHS Choices - Behind the Headlines - Thu, 25/09/2014 - 12:20

“Skirt size increase linked to breast cancer risk,” BBC News reports. The story comes from a UK study of nearly 93,000 postmenopausal women that looked at whether changes in skirt size since their twenties were associated with an increased risk of breast cancer.

It found that going up a skirt size every 10 years was associated with a 33% increased risk of developing breast cancer after the menopause. As an example, this could be going from a size 8 at 25 years old to a size 16 at 65 years old.

It's important to stress that the initial risk of developing breast cancer, the baseline risk, is small, with only 1.2% of women involved in the study going on to develop breast cancer.

This large study used skirt size as a proxy measure for “central obesity” – the accumulation of excess fat around the waist and stomach. While being overweight or obese is a known risk factor for several cancers, this study suggests that a thickening waist may be an independent measure of increased breast cancer risk.

The good news is that the “skirt size effect" appears to be reversible, as losing weight and trimming your waist size may help reduce your breast cancer risk.

 

Where did the story come from?

The study was carried out by researchers from the Universities of London and Manchester, and was funded by the Medical Research Council, Cancer Research UK and the National Institute of Health Research, as well as the Eve Appeal.

The study was published in the peer-reviewed medical journal BMJ Open. As the name suggests, this is an open-access journal, so the study can be read for free online.

The paper was widely covered in the UK media. Coverage was fair, if uncritical.

Several headlines gave the impression that going up a single skirt size would raise breast cancer risk by 33%. Such a rise in risk would only be expected if a person went up a dress size every decade from their mid-twenties to when they were over 50 years old – the youngest age of the women recruited to the study.

Several media sources included useful comments from independent experts.

 

What kind of research was this?

This was a cohort study that looked at whether changes in skirt size between a woman’s twenties and the menopause were associated with an increased risk of breast cancer. Skirt size was used as a proxy measure for central obesity (an excessive amount of fat around the stomach and abdomen – sometimes known as a "pot belly" or "beer belly").

The researchers say that both overall and central obesity are associated with an increased breast cancer risk in postmenopausal women, yet no studies have looked at the relationship between breast cancer risk and changes in central obesity alone.

Skirt and trouser size, they say, provide a reliable estimate of waist circumference, which may be predictive of risk, independent of body mass index (BMI), which is based on the individual’s height and weight.

 

What did the research involve?

The researchers recruited to their study women taking part in a large UK trial of ovarian cancer screening. The women were aged 50 or over and had no known history of breast cancer when they entered the study, between 2005 and 2010.

At enrolment, they answered a questionnaire providing detailed information on height and weight, reproductive health, number of pregnancies, fertility, family history of breast and ovarian cancer, use of hormonal contraceptives and hormone replacement therapy (HRT) – all of which influence (confound) breast cancer risk.

They were also asked about their current skirt size (SS) and what their SS had been in their twenties. Women could choose from 13 SS categories, ranging from size 6 to 30. These answers were used to calculate an increase in SS for each 10 years gone by. A “one unit” increase in SS would mean an increase from, say, 10 to 12 – as odd sizes do not exist in the UK.

The women were followed up three to four years after recruitment, when they completed a further questionnaire, providing information on education, skirt size, continuing use of HRT, smoking, alcohol use, health status and any cancer diagnosis.

The researchers used official health records to identify those women who had a diagnosis of breast cancer during the follow-up period.

They used standard statistical methods to analyse their results, adjusting these for confounders such as BMI, HRT use and family history.

 

What were the basic results?

The researchers report that 92,834 women completed the study and were included in their analysis. The average age of participants was 64. Participants were mainly white, educated to university degree level, and overweight at the point of entry to the study, with an average BMI of just over 25. 

At the age of 25, the average skirt size had been a UK 12, and at 64 it was 14. An increase in skirt size over their lifetime was reported in 76% of women.

During the monitoring period, 1,090 women developed breast cancer, giving an absolute risk of just over 1%.

Researchers found that for each unit increase in skirt size per 10 years, the risk of breast cancer after menopause increased by 33% (hazard ratio (HR) 1.330, 95% confidence interval (CI) 1.121 to 1.579).

For those with an increase of two SS units every 10 years, the risk was increased by 77% (HR 1.769, 95% CI 1.164 to 2.375).
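
As a rough check of our own (assuming the per-unit hazard ratio compounds multiplicatively with each additional skirt-size unit): 1.330 × 1.330 ≈ 1.77, which is consistent with the reported two-unit figure of 1.769.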

They also found that a reduction in skirt size since the twenties was associated with a decreased risk of breast cancer.

A change in skirt size, they say, was a better predictor of breast cancer risk than BMI or weight generally. It should also be noted that the association of skirt size with breast cancer risk was independent of BMI.

 

How did the researchers interpret the results?

The researchers conclude that a change in skirt size is associated with a risk of breast cancer independent of a woman’s height and weight. They estimate an increase in five-year absolute risk of postmenopausal breast cancer from one in 61 to one in 51 with each unit increase in skirt size every 10 years.
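
Put another way (our own conversion of the researchers' figures, not one they report): one in 61 is an absolute five-year risk of about 1.6%, and one in 51 about 2.0% – an absolute increase of roughly 0.3 percentage points.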

Their findings, they say, may provide women with a simple and easy to understand message, given that skirt size is a reliable measure of waist circumference, and women may relate to skirt size more easily than other measures of fat, such as BMI.

They theorise that fat around the waist may be more “metabolically active” than fat elsewhere and may increase levels of circulating oestrogen – an established risk factor for breast cancer.

 

Conclusion

This study suggests that while obesity generally is a risk factor for breast cancer, an increase in waist circumference, as shown in skirt size, between a woman’s twenties and after the menopause, may be an independent measure of increased risk.

Keeping to a healthy weight is important for overall health, and for reducing the risk of several cancers. However, few women in their 60s have the same waist size as they did in their twenties – in this study, for example, the average skirt size at 25 was a 12, but at 64 it was a size 14.

The 33% increased risk of breast cancer after menopause calculated by the researchers was based on a one-unit increase in skirt size every 10 years. As one unit equals two UK sizes, three decades of such increases would take a woman from a size 12 at age 25 to a size 18 by age 55.

The study had several limitations that may affect the reliability of its results. For example, it had a short follow-up period (three to four years) and it also required postmenopausal women in their 50s and 60s to recall their skirt size in their twenties.

In addition, while researchers adjusted their results for several factors that might influence the risk of breast cancer, it is always possible that both measured and unmeasured confounders affected the results.

Finally, most of the women were white, well-educated and also overweight when they were recruited. The results may not be generalisable to other groups of women.

It’s important to maintain a healthy weight, but it would be sad if women in their sixties started needlessly worrying that they should have the same waist size as when they were in their twenties. Surely all of us are entitled to some degree of middle-age spread?

Other ways you can reduce your breast cancer risk include taking regular exercise, choosing to breastfeed rather than bottle feed, and attending screening appointments if invited.

Read more about breast cancer prevention.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Skirt size increase linked to breast cancer risk, says study. BBC News, September 25 2014

Expanding waistline for women ‘is a predictor for breast cancer risk’. The Guardian, September 25 2014

Women who go up a skirt size 'raise breast cancer risks'. The Daily Telegraph, September 24 2014

Long-term weight-gain increases women's risk of breast cancer by a third, study suggests. The Independent, September 25 2014

Going up a dress size raises breast cancer risk 33%: Danger of gaining weight every decade. Mail Online, September 24 2014

Increased breast cancer risk for women who go up a skirt size as they get older. Daily Mirror, September 24 2014

Links To Science

Fourkala E, Burnell M, Cox C, et al. Association of skirt size and postmenopausal breast cancer risk in older women: a cohort study within the UK Collaborative Trial of Ovarian Cancer Screening (UKCTOCS). BMJ Open. Published online September 24 2014

Categories: NHS Choices

Media multitasking 'brain shrink' claims unproven

NHS Choices - Behind the Headlines - Thu, 25/09/2014 - 12:00

“Multitasking makes your brain smaller,” the Daily Mail reports. UK researchers found that people who regularly “media multitasked” had less grey matter in a region of the brain involved in emotion.

The researchers were specifically interested in what they term media multitasking; for example checking your Twitter feed on your smartphone while streaming a boxset to your tablet as you scan your emails on your laptop.

In the study, 75 university students and staff were asked to complete a questionnaire about their media multitasking habits. The researchers compared the results with MRI brain scans and found that people with the highest level of media multitasking had a smaller volume of grey matter in a region of the brain called the anterior cingulate cortex (ACC), which is believed to be involved in human motivation and emotions.

The clinical implications are not clear – motivation and emotions were not assessed and all of the participants were healthy and intelligent.

Importantly, this study was essentially a single snapshot in time, so it cannot prove cause and effect. The idea that this section of the brain has shrunk was not established by this study. It may be that people who used more media forms had a smaller ACC to start with, and that this influenced their media use.

  

Where did the story come from?

The study was carried out by researchers from Graduate Medical School in Singapore, the University of Sussex and University College London. It was funded by the Japan Science and Technology Agency.

The study was published in the peer-reviewed medical journal PLOS One. PLOS One is an open access journal so the study is free to read online.

The Daily Mail’s reporting of the study gives the impression that a direct cause and effect relationship between media multitasking and brain shrinkage has been proven. This is not the case.

The Daily Telegraph takes a more appropriate and circumspect approach, including a quote from one of the researchers who points out that further cohort-style studies are required to prove (or not) a definitive causal effect.

 

What kind of research was this?

The researchers say that existing literature on the topic has suggested that people who engage in heavier media multitasking have poorer cognitive control (ability to concentrate and focus on one task despite distractions, to flexibly switch between thoughts, and to control thinking and emotions).

They conducted this cross-sectional study to see if there was an association between increased media multitasking and differences in the volume of grey matter in the brain. As this was a cross-sectional study, it cannot prove causation – that is, that the level and combination of media use caused the brain to shrink.

The study can’t inform whether there has been any change in brain size at all or whether people with increased media use already had this brain structure.

A better study design would be a prospective cohort study that carried out regular brain scans of people over time from a young age to see whether their level of media use (for example through work or study) influenced their brain structure.

However, aside from any ethical considerations, it is likely there would be significant practical difficulties with such a study design; try telling a young person that they couldn’t text while watching TV for the next five years and see how far that gets you.

Also a cohort study would still be likely to be subject to potential confounders.

 

What did the research involve?

The researchers recruited 75 healthy university students and staff who were “well acquainted” with computers and media technologies. They asked them to fill out two questionnaires and to have an MRI brain scan.

A media multitasking index (MMI) score was calculated for each participant. This involved participants completing a media multitasking questionnaire, the results of which were converted into a score using a mathematical formula (a rough illustration of how such a score might be constructed follows the rating scale below).

The first section of the questionnaire asked people to estimate the number of hours per week they spent using different types of media:

  • print media
  • television
  • computer-based video or music streaming
  • voice calls using mobile or telephone
  • instant messaging
  • short messaging service (SMS) messaging
  • email
  • web surfing
  • other computer-based applications
  • video, computer or mobile phone games
  • social networking sites

The second section asked them to estimate how much of the time they used any of the other media types at the same time, using a scale of:

  • 1 – never
  • 2 – a little of the time
  • 3 – some of the time
  • 4 – all of the time
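
As a rough illustration of how answers to the two questionnaire sections above could be combined into a single index, here is a minimal sketch. The 0-to-1 weights given to the "never/a little/some/all of the time" answers, the function name and the example numbers are all assumptions made for illustration – the study's published formula may well differ.

```python
# Hypothetical sketch of a media multitasking index (MMI)-style score.
# The concurrency weights and the hours-weighted average are illustrative
# assumptions, not the formula used in the study.

CONCURRENCY_WEIGHT = {1: 0.0, 2: 0.33, 3: 0.67, 4: 1.0}  # never .. all of the time

def mmi_score(hours_per_week: dict, concurrency: dict) -> float:
    """Hours-weighted average of how much each medium is used alongside others."""
    total_hours = sum(hours_per_week.values())
    if total_hours == 0:
        return 0.0
    return sum(hours_per_week[m] * CONCURRENCY_WEIGHT[concurrency[m]]
               for m in hours_per_week) / total_hours

# Example participant (made-up numbers)
hours = {"television": 10, "instant messaging": 5, "web surfing": 7}
concurrent = {"television": 3, "instant messaging": 4, "web surfing": 2}
print(f"Illustrative MMI score: {mmi_score(hours, concurrent):.2f}")
```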

Participants were then asked to complete another questionnaire called the Big Five Inventory (BFI), which is a 44-item measure for the personality factors:

  • extroversion
  • agreeableness
  • conscientiousness 
  • neuroticism
  • openness to experience

 

What were the basic results?

Higher media multitasking (MMI) score was associated with smaller grey matter volumes in the anterior cingulate cortex (ACC) portion of the brain. No other brain regions showed significant correlations with MMI score. The precise function of the ACC is not known, but it is believed to be involved in motivation and emotions.

There was a significant association between extroversion and higher MMI score.

After controlling for extroversion and other personality traits, there was still a significant association between higher MMI and lower grey matter density in the ACC part of the brain.

 

How did the researchers interpret the results?

The researchers concluded that “individuals who engaged in more media multitasking activity had smaller grey matter volumes in the ACC”. They say that “this could possibly explain the poorer cognitive control performance and negative socioeconomic outcomes associated with increased media multitasking” seen in other studies.

 

Conclusion

This cross-sectional study finds an association between higher media multitasking and a smaller volume of grey matter in the ACC portion of the brain that is believed to be involved in human motivation and emotions.

Despite the apparent link, a key limitation of the study is that, being cross-sectional, its assessment of brain size and structure has only provided a single snapshot in time, at the same time as assessing media use. We do not know whether there has actually been any change in the person’s brain size at all. The study cannot tell us whether using multimedia has caused this area to reduce in size, or conversely whether having this reduced ACC size influenced people’s use of more media forms at the same time.

Furthermore, motivation, emotions and ability to concentrate were not assessed in any of the participants, so it is unclear whether the observed differences in volume had any clinical relevance. The media makes reference to previous studies that suggested an association with poor attention, depression and anxiety, but this was not assessed in this study. It should also be noted that all the participants were educated to at least undergraduate degree level, implying a high level of cognitive control.

The study population was also potentially biased: participants were only selected if they were familiar with computers and media technologies, so there was no control group of people who did not use as many forms of media.

Another limitation of the study is that the media multitasking score is unlikely to be precise, as it relied on participants accurately estimating how much time they spent using each media type per week, and how much their use of different media overlapped.

Overall, while of interest, this study does not prove that using multiple forms of media causes the brain to shrink.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Multitasking makes your brain smaller: Grey matter shrinks if we do too much at once. Daily Mail, September 25 2014

Second screening 'may alter the brain and trigger emotional problems'. The Daily Telegraph, September 24 2014

Links To Science

Loh KK, Kanai R. Higher Media Multi-Tasking Activity Is Associated with Smaller Gray-Matter Density in the Anterior Cingulate Cortex. PLOS One. Published online September 24 2014

Categories: NHS Choices

Benefits of statins 'outweigh diabetes risk'

NHS Choices - Behind the Headlines - Wed, 24/09/2014 - 12:45

“Statins increase risk of diabetes, but benefits are still worth it, say experts,” The Guardian reports.

A large study found the medication led to a modest increase in weight and subsequent diabetes risk. The authors report that these risks were more than offset by the reduction in cardiovascular disease, although those benefits were not quantified in this study.

The study, which involved nearly 130,000 people, found that statins (used to lower cholesterol levels) increase the risk of type 2 diabetes by 12% and are associated with weight gain of around a quarter of a kilo (half a pound) over four years.

It found indirect evidence that the protein statins target to reduce cholesterol could be at least partly responsible for the effect on type 2 diabetes as well. This evidence was based on looking at the effect of natural genetic variations that affect the protein, and not on a direct analysis of the effect of statins.

Importantly, the authors themselves note that this “should not alter present guidance on prescription of statins for prevention of [CVD]”. They do suggest that lifestyle changes, such as exercise, should be emphasised as still being an important part of heart disease prevention in people who are taking statins. This seems reasonable, and it is likely to be part of what doctors already recommend. 

 

Where did the story come from?

The study was carried out by researchers from University College London, Glasgow University, and a large number of international universities and institutes. It was funded by the Medical Research Council, the National Institutes of Health, the British Heart Foundation, the Wellcome Trust, the National Institute on Aging, Diabetes UK and several other European grants.

The study was published in the peer-reviewed medical journal The Lancet on an open access basis, so it is free to read online (PDF, 1.2Mb).

The media focused on the part of this study that looked at the effect of statins on weight change and risk of type 2 diabetes. However, it didn’t really focus on the main aim of this research, which was to look at how statins might have an effect on these outcomes, although this is understandable, as this information is not likely to be of interest to the average reader.

Refreshingly, all of the media sources that reported on the study resisted the temptation to engage in fear mongering, and were careful to stress that the benefits of statins outweighed any risks.

 

What kind of research was this?

The current study aimed to investigate how statins increase the risk of type 2 diabetes. The researchers had carried out a previous statistical pooling (meta-analysis) of data from randomised controlled trials (RCTs) and found that statins increased the risk of type 2 diabetes compared to placebo or no statins. One part of the current study added new studies to this meta-analysis, to get a more up-to-date estimate of the effect, and to look at statins’ effect on bodyweight as well.

Statins lower cholesterol by reducing the activity of a protein called 3-hydroxy-3-methylglutaryl-CoA reductase (HMGCR). The main part of this study carried out a new meta-analysis of genetic studies, to look at whether this protein might also be related to the effect of statins on diabetes risk.

Meta-analyses are a way to pool data from many different studies, helping researchers to identify small effects that individual studies may not be able to detect.

However, the benefits of statins in reducing cardiovascular disease such as heart attack and stroke are believed to outweigh this risk, even for people with type 2 diabetes.

 

What did the research involve?

The original meta-analysis looking at the effect of statins on type 2 diabetes had included RCTs of at least 1,000 people, followed up for one year or more. This meta-analysis had not looked at the effect of statins on weight change. The researchers contacted the investigators from 20 of the trials to provide data on changes in bodyweight during the follow-up. They then analysed the effect on weight gain of statins compared to placebo (“dummy” pills with no active ingredient) or just usual treatment (with no statins or placebo pills). They also analysed the results without the participants who had a heart attack or stroke.

They also analysed the effect of statins on change in LDL cholesterol (sometimes called “bad” cholesterol), blood sugar and insulin concentrations, BMI, waist circumference and waist:hip ratio.

The main part of the study looked at how statins might have an effect on type 2 diabetes risk. Doing this is difficult, so the genetic meta-analysis took a novel approach. Statins reduce levels of LDL cholesterol by reducing the activity of the HMGCR protein. Rather than look directly at the effect of statins, the meta-analysis looked at whether people who have genetic variations which naturally reduce the function of HMGCR also have an increased risk of type 2 diabetes. Their thinking was that if this was the case, then the effect of statins on type 2 diabetes might at least partly be explained by its effect on HMGCR.

Their meta-analysis pooled data from studies which looked at whether these variations were linked to type 2 diabetes, and other outcomes such as weight.

The meta-analysis pooled observational population studies that assessed two genetic variations lying in the gene that encodes the HMGCR protein. People who have these variations tend to have lower LDL cholesterol. For the main analysis, they compared people with these variations to those without in terms of their total cholesterol, LDL cholesterol, non-HDL cholesterol, bodyweight, body mass index (BMI), waist and hip circumferences, waist:hip ratio, height, plasma glucose and plasma insulin.

 

What were the basic results?

Information was obtained on change in LDL cholesterol in 20 statin trials and bodyweight change for 15 of the 20 statin trials.  There was no information available from these studies about the effect of statins on plasma glucose and insulin concentrations, BMI, waist circumference and waist:hip ratio.

Results for the 129,170 people from the randomised trials found that statins:

  • lowered LDL cholesterol after one year by 0.92 mmol/L (95% confidence interval (CI) 0.18–1.67)
  • increased bodyweight in all trials combined over a mean of 4.2 years (range 1.9–6.7) of follow-up by 0.24 kg (95% CI 0.10–0.38)
  • increased bodyweight compared to placebo or standard care by 0.33 kg (95% CI 0.24–0.42)
  • increased the risk of new-onset type 2 diabetes by 12% in all trials combined (Odds Ratio (OR) 1.12, 95% CI 1.06–1.18)
  • increased the risk of new-onset type 2 diabetes by 11% in placebo or standard care controlled trials (OR 1.11, 95% CI 1.03–1.20)

The researchers found that higher (intensive) doses of statins:

  • were associated with 0.15 kg lower bodyweight than moderate dose statins (95% CI –0.39 to 0.08 – a difference that was not statistically significant)
  • increased the risk of new-onset type 2 diabetes by 12% compared with moderate dose statins (OR 1.12, 95% CI 1.04–1.22)

Meta-analysis of up to 223,463 individuals from 43 studies with available genetic data found that each copy of the main genetic variation in the HMGCR gene that they looked at was associated with:

  • lower cholesterol: 0.06 to 0.07 mmol/L
  • lower LDL cholesterol, total cholesterol and non-HDL cholesterol
  • 1.62% higher plasma insulin
  • 0.23% higher blood sugar (glucose) concentration
  • a 300g increase in bodyweight and 0.11 point increase in BMI
  • a slightly greater waist circumference of 0.32cm and hip circumference of 0.21cm
  • a 2% higher risk of type 2 diabetes that was almost statistically significant (OR 1.02, 95% CI 1.00 to 1.05)

They found similar results for the second genetic variation they looked at.

 

How did the researchers interpret the results?

The researchers concluded that “the increased risk of type 2 diabetes noted with statins is at least partially explained by HMGCR inhibition”. Importantly, they say that this “should not alter present guidance on prescription of statins for prevention of CVD”. Despite this, they say that their findings “suggest lifestyle interventions such as bodyweight optimisation, healthy diet and adequate physical activity should be emphasised as important adjuncts to prevention of [heart disease] with statin treatment to attenuate risks of type 2 diabetes.”

 

Conclusion

The results of these updated meta-analyses indicate that statin use is associated with a 12% increase in risk of type 2 diabetes and also weight gain of half a pound over the course of four years. This confirms the findings of the previous meta-analysis of the effect on diabetes, and adds new findings for weight.
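
To give a feel for what a 12% relative increase could mean in absolute terms, here is a minimal sketch using a hypothetical baseline rate of new-onset type 2 diabetes. The 4% baseline over four years is an assumption chosen purely for illustration (no baseline rate is reported here), and the sketch relies on the fact that an odds ratio approximates a relative risk when the outcome is uncommon.

```python
# Hypothetical illustration of a 12% relative increase in type 2 diabetes risk.
# The 4% baseline over ~4 years is an assumed figure for illustration only;
# the study does not report a baseline rate in this summary.

baseline_risk = 0.04            # assumed: 4 in 100 develop diabetes over ~4 years
odds_ratio = 1.12               # reported OR; roughly a relative risk when events are uncommon

baseline_odds = baseline_risk / (1 - baseline_risk)
statin_odds = baseline_odds * odds_ratio
statin_risk = statin_odds / (1 + statin_odds)

print(f"Assumed baseline risk: {baseline_risk:.1%}")
print(f"Risk implied by an OR of 1.12: {statin_risk:.1%} "
      f"(about {statin_risk - baseline_risk:.2%} points higher)")
```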

The main meta-analyses in this study attempted to address how statins might have this effect. They found that people with genetic variations in the gene encoding HMGCR (the protein targeted by statins) have lower LDL ("bad") cholesterol, but also increased levels of insulin, blood sugar, bodyweight and BMI, and a slightly increased risk of diabetes. The researchers conclude that the effect of statins on HMGCR could therefore be at least part of the cause of the increased risk of type 2 diabetes seen with statins.

While the results support this theory, this study cannot directly prove it. The genetic variations were used as a “mimic” or “proxy” of the effect of statins, and the study populations in this analysis had not taken statins. Also, the exact effect of the genetic variations on the HMGCR protein needs to be looked into further, as they are not in the part of the gene that actually contains the instructions for making the protein.

Drugs can have an effect on the body in more than one way, and statins may also have other effects which could account for the weight gain or increased risk of type 2 diabetes. It is likely that further studies will be carried out to test the theory arising from this research.

If you are taking statins and are worried about your diabetes risk then taking steps to achieve or maintain a healthy weight, such as taking regular exercise and eating a healthy diet, should help reduce your diabetes risk. It will also have the added benefit of reducing your CVD risk as well; win-win! 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Statins increase risk of diabetes but benefits are still worth it, say experts. The Guardian, September 24 2014

Statins increase weight and blood sugar and raise diabetes risk, study finds. The Daily Telegraph, September 23 2014

Statin users more at risk of piling on the pounds: Scientists warn millions of users to do more exercise to counter side effect. Mail Online, September 24 2014

Benefits of Statins 'greatly outweigh' small risks say experts. Daily Express, September 24 2014

Benefits of taking statins outweigh the diabetes risks, major new study finds. The Independent, September 23 2014

Patients on statins are told to exercise more. The Times, September 24 2014

Links To Science

Swerdlow DI, Preiss D, Kuchenbaecker KB, et al. HMG-coenzyme A reductase inhibition, type 2 diabetes, and bodyweight: evidence from genetic analysis and randomised trials (PDF,1.2Mb). The Lancet. Published online September 24 2014

Categories: NHS Choices

Ebola outbreak to get worse, says WHO

NHS Choices - Behind the Headlines - Wed, 24/09/2014 - 11:40

“Ebola infections will treble to 20,000 by November,” BBC News reports, following the publication of an analysis of the current epidemic by the World Health Organization (WHO).

The report assesses what is known about the spread and devastating impact of the Ebola outbreak to date, while also predicting what may happen in the near future.

The study used data from five West African countries affected by the ongoing Ebola outbreak to estimate that around 70% of people infected (probable or confirmed cases) had died from it up to September 14 2014. It states that the disease is likely to continue to spread unless there are rapid improvements in disease control measures. Without these, it estimates that 20,000 people could be infected by the end of November – more than four times the number affected up to mid-September (around 4,500).

This report appears to be based on the pragmatic data available during the outbreak, meaning that it will be prone to some error. However, given the circumstances, it is unlikely that substantially better data will be available any time soon.

However, the analysis did offer a glimmer of hope. It discussed how new cases of the disease may be reduced within two to three weeks of introducing disease control measures, such as:

  • improvements in contact tracing
  • adequate case isolation
  • increased capacity for clinical management
  • safe burials
  • greater community engagement
  • support from international partners

 

Where did the story come from?

The study was carried out by members of the WHO Ebola Response Team and was funded by numerous sources, including: the Medical Research Council, the Bill and Melinda Gates Foundation, the Models of Infectious Disease Agent Study of the National Institute of General Medical Sciences (National Institutes of Health), the Health Protection Research Units of the National Institute for Health Research, European Union PREDEMICS consortium, Wellcome Trust and Fogarty International Center.

The study was published in The New England Journal of Medicine – a peer-reviewed medical journal – on an open access basis, so it is free to read online.

BBC News covered the research accurately.

The Mail Online and The Independent covered reports by both the WHO and CDC. Again, their reporting reflected the underlying research.

 

What kind of research was this?

This was a cross-sectional study assessing cases of Ebola virus disease (EVD, or Ebola for short) in five West African countries.

As of September 14 2014, a total of 4,507 confirmed and probable cases of Ebola, as well as 2,296 deaths from the virus, had been reported from five countries in West Africa: Guinea, Liberia, Nigeria, Senegal and Sierra Leone.

Smaller Ebola outbreaks have happened before, but the current outbreak is far larger than all previous epidemics combined. This latest study aimed to gather information from the five countries most affected, to gain an insight into the severity of the outbreak and predict the future course of the epidemic.

 

What did the research involve?

By September 14 2014, a total of 4,507 probable and confirmed cases, as well as 2,296 deaths, from Ebola (Zaire species) had been reported to the WHO from five West African countries – Guinea, Liberia, Nigeria, Senegal and Sierra Leone. The latest WHO report analysed a detailed subset of data on 3,343 confirmed and 667 probable Ebola cases from these countries.

Ebola outbreak data was collected during surveillance and response activities for Ebola in the respective countries during the outbreak.

Clinical and demographic data were collected from probable and confirmed cases using a standard Ebola case investigation form. Additional information on the outbreak was gathered from informal case reports, by data from diagnostic laboratories and from burial records. The data recorded for each Ebola case included the district of residence, the district in which the disease was reported, the patient’s age, sex, signs and symptoms, the date of symptom onset and of case detection, the name of the hospital, the date of hospitalisation, and the date of death or discharge.

The analysis focused on describing epidemiological characteristics of the outbreak using the individual confirmed and probable cases records for each country. Results related to suspected cases were demoted to an appendix, as they were less reliable.

 

What were the basic results?

The main characteristics of the Ebola outbreak are:

  • The majority of patients are 15 to 44 years of age (49.9% male).
  • The estimated chance of dying from Ebola is 70.8% (95% confidence interval [CI], 69 to 73) among persons with known infection. This was very similar across the different countries.
  • The average delay between being infected with the Ebola virus and displaying symptoms is 11.4 days. The course of infection, including signs and symptoms, is similar to that reported in previous Ebola outbreaks.
  • The estimated current reproduction numbers are: 1.81 (95% CI, 1.60 to 2.03) for Guinea, 1.51 (95% CI, 1.41 to 1.60) for Liberia and 1.38 (95% CI, 1.27 to 1.51) for Sierra Leone. The reproduction number is the number of new cases one existing case generates over the time they are infected with the virus. For example, the Guinea rate of 1.81 means that, on average, every person with Ebola infects just under 2 new people. Any reproduction number greater than 1 means the disease is spreading – and growing exponentially – in a population, with higher numbers indicating faster spread. A reproduction number of 2, for instance, would mean each case generates two more (1 person infects 2, those 2 infect 4, the 4 infect 8, and so on).
  • The corresponding doubling times – the time it takes for the disease incidence to double – were 15.7 days (95% CI, 12.9 to 20.3) for Guinea, 23.6 days (95% CI, 20.2 to 28.2) for Liberia and 30.2 days (95% CI, 23.6 to 42.3) for Sierra Leone (a simple projection using these doubling times is sketched after this list).
  • On the basis of the initial periods of exponential growth, the estimated basic reproduction numbers for the future are: 1.71 (95% CI, 1.44 to 2.01) for Guinea, 1.83 (95% CI, 1.72 to 1.94) for Liberia and 2.02 (95% CI, 1.79 to 2.26) for Sierra Leone.
  • Assuming no change in the control measures for this epidemic, by November 2 2014, the cumulative reported numbers of confirmed and probable cases are predicted to be 5,740 in Guinea, 9,890 in Liberia and 5,000 in Sierra Leone – exceeding 20,000 in total.
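
The link between doubling times and projected case numbers can be sketched with simple arithmetic. The code below assumes unchecked exponential growth and uses a hypothetical starting count of 1,000 cases per country; only the doubling times come from the report, so the outputs are illustrative and will not match the WHO's more detailed projections.

```python
# Illustrative projection: cases double every 'doubling_time' days if growth
# is unchecked. The 1,000-case starting count per country is a made-up figure;
# only the doubling times above come from the WHO report.

doubling_times = {"Guinea": 15.7, "Liberia": 23.6, "Sierra Leone": 30.2}  # days

def project(cases_now: float, doubling_time: float, days_ahead: float) -> float:
    """Cases after days_ahead, assuming unchecked exponential growth."""
    return cases_now * 2 ** (days_ahead / doubling_time)

days = 49  # roughly September 14 to November 2
for country, td in doubling_times.items():
    print(f"{country}: 1,000 cases now -> about {project(1_000, td, days):,.0f} in {days} days")
```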

 

How did the researchers interpret the results?

The Ebola Response Team were clear in their conclusions, saying their findings “indicate that without drastic improvements in control measures, the numbers of cases of and deaths from EVD [Ebola virus disease] are expected to continue increasing from hundreds to thousands per week in the coming months.”

 

Conclusion

This latest WHO study used data from five West African countries affected by the ongoing Ebola outbreak to estimate that around 70% of people infected (probable or confirmed cases) died from it up to September 14 2014. They found the disease is spreading, and is likely to continue spreading unless there are improvements in disease control measures. This means that if the status quo is maintained, they predict that the outbreak will get worse, rather than better.

This report appears to be based on the pragmatic data available during the outbreak. Such data is always prone to some error, as record keeping and case detection are not 100% accurate, particularly in resource-poor countries or districts. The WHO team believes its figures underestimate the true size of the outbreak, as not all cases will have been detected by their methods, and case records were often incomplete.

One way the WHO investigators got around this was to focus their analysis on the confirmed or probable cases of Ebola. They placed much less emphasis on the more uncertain “suspected cases”. Hence, the data can be viewed as a broadly useful estimate of the situation. It is not precise but, given the circumstances, it is unlikely that significantly better information will be available any time soon.

The team found that the infectiousness and fatality rate of this Ebola outbreak was similar to previous smaller outbreaks. They thought this outbreak was much larger and more serious because the populations affected were different – for example, the populations of Guinea, Liberia and Sierra Leone are highly interconnected. The report said there was “much cross-border traffic at the [Ebola outbreak] epicentre and relatively easy connections by road between rural towns and villages, and between densely populated national capitals. The large intermixing population has facilitated the spread of infection”.

However, they said the large epidemic in these countries was not inevitable. They explained how in Nigeria, including in densely populated large cities such as Lagos, the disease was contained, possibly due to the speed of implementing rigorous control measures.

There was, however, a glimmer of hope in the otherwise worrying report. It discussed how, based on previous outbreaks, new cases of the disease can be reduced within two to three weeks of introducing disease control measures.

The report called for a swift improvement in current control measures to address this problem, specifically:

  • improvements in contact tracing
  • adequate case isolation
  • increased capacity for clinical management
  • safe burials
  • greater community engagement
  • support from international partners

Want to help, but don’t know how? A donation to one of the medical charities that are helping to combat the spread of Ebola could help. A quick online search for "Ebola charities" will bring up a range of deserving causes for you to choose from.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Ebola death rates 70% - WHO study. BBC News, September 23 2014

Ebola outbreak: Experts warn cases could number one million by January as 'window closes' to stop disease becoming endemic. The Independent, September 23 2014

Experts warn Ebola could infect 1.4 million by January in just two African nations. Mail Online, September 23 2014

Links To Science

WHO Ebola Response Team. Ebola Virus Disease in West Africa — The First 9 Months of the Epidemic and Forward Projections. The New England Journal of Medicine. Published online September 23 2014

Meltzer MI, Atkins CY, Knust B, et al. Estimating the Future Number of Cases in the Ebola Epidemic — Liberia and Sierra Leone, 2014–2015. Morbidity and Mortality Weekly Report. Published online September 23 2014

Categories: NHS Choices

Job insecurity may increase adult asthma risk

NHS Choices - Behind the Headlines - Tue, 23/09/2014 - 12:30

“People fearful of losing their jobs are 60% more likely to develop asthma,” The Independent reports.

Researchers have looked at whether perceived job insecurity (specifically, the likelihood that they would lose their jobs) affected people’s risk of developing asthma in Germany during the “Great Recession” (the global economic downturn that lasted from 2008 to 2012).

They found that people who felt there was more than a 50:50 chance of them losing their jobs in the next two years were about 60% more likely to be diagnosed with asthma in this period.

Despite finding a link between job insecurity and asthma, there are a number of things to bear in mind. Associations between mental health, genetic and environmental factors, and physical health can be complex, so it is often difficult to tease out precise causal relationships.

For example, people who reported high levels of job insecurity in this study were also more likely to smoke and be in jobs that might increase their risk of asthma. The researchers tried to take this into account, but it is difficult to know whether factors such as these had an effect.

It does seem plausible that job insecurity – a potentially stressful situation – could cause adult asthma attacks, given that stress can be a trigger. However, we cannot be certain, based on this study alone, whether job insecurity directly increases the risk of developing adult asthma.

 

Where did the story come from?

The study was carried out by researchers from the University of Düsseldorf in Germany, and other universities in the Netherlands and New Zealand. No funding for the study was reported, and the authors declared that they had no competing interests.

The study was published in the peer-reviewed Journal of Epidemiology and Community Health.

The Independent’s headline wrongly suggests that the study looked at workplace stress, which it did not – rather, it assessed just job insecurity. For the purpose of this study, high job insecurity was defined as a person perceiving that there was more than a 50:50 chance of them losing their job in the next two years. While most of us would find such a prospect stressful, this may not be the case for everyone – for example, if you hate your job and have a good redundancy package, you may even welcome redundancy. A person may also have a very secure job, but still have high levels of work stress.

The Independent, however, does report both the actual risk of people developing asthma in the study as well as the relative increase in risk, which helps put the increase into a meaningful context.

 

What kind of research was this?

This was a cohort study looking at whether there was a link between job insecurity and new diagnoses of adult asthma. Studies have suggested that job insecurity may increase risk of poor health, and that job-related stress may be a risk factor for asthma, but have not looked at whether job insecurity might be linked with asthma. This latest study used data collected as part of the German Socio-Economic Panel (GSOEP) study, between 2009 and 2011 – during the European economic crisis, when job insecurity increased.

This study design is the best way to look at the link between an exposure and outcome when it is not feasible or ethical to randomly assign people to have the exposure or not (in this case job insecurity). It allows researchers to establish that the exposure did in fact occur before the outcome, and could therefore potentially be contributing to it.

The main limitation is that factors other than the exposure (called confounders) that differ between the exposed and unexposed groups could be causing any differences seen, rather than the exposure itself. Researchers can use statistical methods to try to remove their impact, but these methods are not 100% effective. They also can’t remove the effect of factors the researchers didn’t know about or measure.

 

What did the research involve?

The researchers analysed data on employed adults who did not have asthma when assessed in 2009. They measured how insecure the participants thought their jobs were at this point and then checked whether they had been diagnosed with asthma two years later, in 2011. They then assessed whether those who felt more job insecurity were more likely to develop asthma.

The data used in this study was collected in face-to-face interviews. Asthma was assessed in both 2009 and 2011 by asking participants if they had ever been diagnosed with this condition by a doctor.

Participants were asked in 2009 to rate on an 11-point scale, from 0% to 100%, how likely they thought they were to lose their job in the next two years. This allowed the researchers to classify and analyse their job insecurity as:

  • less than 50%, or 50% and higher
  • no insecurity (0%), low job insecurity (10% to just under 50%), or high job insecurity (50% and over)
  • a continuous measure based on how many standard deviations they were from the average

In their analyses, the researchers took into account confounders that could affect results, including:

  • demographic characteristics – such as age and gender
  • job factors – such as type of contract and working in a profession which could cause a high risk for asthma
  • health behaviours and conditions – such as smoking, being overweight or obese, and depression

Of the approximately 20,000 participants in GSOEP, this latest study analysed the 7,031 who were employed and did not have asthma in 2009, and had answered questions on all the factors included in the analyses.

 

What were the basic results?

The researchers found that just under a quarter of participants (23%) reported high levels of job insecurity in 2009. These people tended, on average, to be slightly younger, have less education, lower income, and were more likely to be unmarried, be smokers, do less exercise, work in a high-risk asthma occupation, have a non-permanent contract, and to have been diagnosed with depression.

In total, 105 people (1.5%) reported having been diagnosed with asthma during the study period. Among those reporting low or no job insecurity, 1.3% developed asthma, compared to 2.1% of those who reported high job insecurity.

After taking into account the potential confounders, this equated to those with high job insecurity being at 61% higher risk of developing asthma (relative risk 1.61, 95% confidence interval 1.08 to 2.40). The researchers also found similar results if they analysed the effect of job insecurity in different ways.
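
To show roughly where a "60% more likely" figure comes from, the sketch below computes the crude (unadjusted) relative risk from the percentages above. The published figure of 1.61 was adjusted for confounders, so the crude result is only coincidentally similar – this is an illustration, not the authors' calculation.

```python
# Crude relative risk from the raw percentages reported above.
# The study's 1.61 estimate is adjusted for confounders such as smoking and
# depression, so this unadjusted calculation is only a rough illustration.

risk_high_insecurity = 0.021   # 2.1% of those reporting high job insecurity
risk_low_insecurity = 0.013    # 1.3% of those reporting low or no insecurity

crude_rr = risk_high_insecurity / risk_low_insecurity
print(f"Crude relative risk: {crude_rr:.2f} "
      f"(about {(crude_rr - 1):.0%} higher risk before adjustment)")
```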

 

How did the researchers interpret the results?

The researchers concluded that “perceived job insecurity may increase the risk of new-onset adult asthma”.

 

Conclusion

This latest study has found that people who reported feeling less secure about their job during the economic crisis were more likely to develop asthma.

It collected data prospectively from a large sample representative of the German population, and excluded people who already reported having asthma at the start of the study. This meant the researchers could be sure that the job insecurity came before the asthma diagnosis.

However, there are also some limitations that mean we should interpret its findings cautiously. Firstly, the researchers tried to take into account some factors that differed between those experiencing high levels of job insecurity and those who did not and might affect results. However, this may not remove their effect entirely. For example, they only had data on smoking at one point in time (in 2008), and did not assess how much a person smoked, or whether this changed over time. People who felt more insecure in their job may have been more likely to start smoking or smoke more, and this could contribute to the link seen.

Secondly, the study only asked people whether they had been diagnosed with asthma by a doctor. It did not check their medical records to confirm this, or give all participants a medical to see if they did have asthma. Some people who already had the condition might not have been diagnosed at the start of the study.

Thirdly, a lot of the participants in the overall study (more than 4,000) could not be analysed as they had missing data. These people differed from the ones who could be analysed in terms of their age, smoking habits and incomes, but not in their reported job insecurity or reported levels of asthma. If these people had been followed up, this could have altered the results.

Finally, it is also worth noting that only a relatively small proportion of people in both groups developed asthma during the study – 2.1% of those reporting high job insecurity and 1.3% of those reporting low or no insecurity. So most people, regardless of job security, did not develop asthma.

The exact causes of asthma development are uncertain, though it is thought to be a combination of hereditary and environmental influences (such as being exposed to smoke as a child). In people who are susceptible to asthma, various things can then trigger an asthma attack – one of which is known to be emotions, which can include stress. For this reason, it is plausible that a stressful situation (job insecurity) could also be a trigger.

Overall, although this study has found a link, there is no certainty that perceived job insecurity was directly causing the development of asthma in people previously without the condition.

If you are worried that concerns about job insecurity are affecting your health, there are a number of steps you can take, such as:

  • not working longer hours than you need to just because you want to demonstrate your commitment; you have to have a good balance of work and leisure if you want to be resilient
  • being focused; it’s more effective to work in short, intense bursts and then take a break
  • if you’re feeling really insecure about your job, talk to your boss or to a trusted colleague and tell him or her how you’re feeling; rumours are often worse than the reality

If you're still feeling anxious or low after a few weeks, see your GP. You may find that talking to a professional therapist helps, and your GP can advise you on talking therapy services in your area.

Read more about coping better with job insecurity.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Stress at work increases risk of developing asthma, according to major new research. The Independent, September 23 2014

Links To Science

Loerbroks A, Bosch JA, Douwes J, et al. Job insecurity is associated with adult asthma in Germany during Europe's recent economic crisis: a prospective cohort study. Journal of Epidemiology and Community Health. Published online September 22 2014

Categories: NHS Choices

Watch less TV to prevent obesity, says NICE

NHS Choices - Behind the Headlines - Tue, 23/09/2014 - 12:30

“Take TV-free days to combat obesity, health experts urge,” The Guardian reports. This is one of a range of new recommendations from the National Institute for Health and Care Excellence (NICE) draft guidelines that are designed to help adults and children maintain a healthy weight.

Although the headlines have largely focused on TV (as well as other types of screen time, such as smartphones), the recommendations cover a range of health-related behaviours, such as walking to work and avoiding fizzy drinks.

This draft guidance is mainly aimed at people in organisations who set up, pay for, or put into practice programmes that aim to help people maintain a healthy weight and prevent excess weight gain. The guidance is designed to help them know what sorts of behaviours these programmes should target.

The draft NICE guidelines are now available for anyone to comment on. NICE will consider the comments and make revisions to the guidance as needed, before publishing the finalised guidance.

 

Why do we need this guidance?

The guidance aims to help reduce the risk of the many diseases associated with excess weight, especially obesity.

Like most of the developed world, the UK is in the grip of an obesity epidemic. The latest statistics suggest that one in four adults are obese, making the UK “The Fat Man of Europe”.

Aside from the impact on health, if current trends continue in the coming decades, then the costs of treating obesity-related complications will become untenable.

Helping people to not become overweight or obese in the first place, and even stopping people who are overweight or obese from getting heavier, should bring significant benefits.

 

What does the draft guidance recommend?

The draft recommendations urge people in charge of weight management services to:

  • support everyone to maintain a healthy weight or prevent excess weight gain
  • focus on both physical activity and dietary habits
  • encourage physical activity habits that increase energy expenditure
  • encourage dietary habits that reduce the risk of excess energy intake
  • encourage adults to limit the amount of alcohol they drink
  • address misconceptions about behaviours that may influence weight
  • encourage self-monitoring (such as regularly weighing yourself or using a pedometer to measure steps)
  • provide sources of information and support

Each recommendation then gives more detail about how to go about these things, and some of this detail is reported below.

 

What do they say about the misconceptions?

The draft guidance says that public health messages should address misconceptions people may have about how to influence weight. They say that this includes, for example, making it clear that:

  • healthy eating and being active are as important for people who are currently a healthy weight as for people who are already overweight
  • gaining weight as an adult is not inevitable; most people will gain weight as they get older if they are inactive and eat an energy-dense diet (one which includes a high number of calories per gram)
  • extreme behaviours (such as avoiding all carbohydrates) are difficult to keep up in the long run and may not be accompanied by improvements in health
  • no single behaviour (for example, consuming or not consuming a specific food or drink, or taking part in physical activity) will maintain a healthy weight by itself
  • all foods and drinks, even those sometimes seen as "healthy" (such as olive oil, fruit juice or milk), contain energy and can contribute to weight gain if consumed in large amounts
  • eating in the evening (for example, after 5pm, compared with earlier in the day) does not make it harder to maintain a healthy weight, unless total energy intake is increased

 

Does it say I can’t watch TV for more than two hours a day?

Watching less than two hours of TV a day is just a single example the draft guidance gives. It gives the example as one way to encourage habits and routines that will gradually increase the amount and intensity of physical activity people do. It says that any strategy that reduces TV viewing and other leisure screen time may be helpful, such as having TV-free days or aiming to watch TV for no more than two hours a day.

It’s an eye-catching recommendation, and one that has clearly struck a chord with the media. Arguably, it would have been more helpful for the media to focus on the misconceptions that they themselves may intentionally or unintentionally reinforce by reporting fad diets and focusing on single foods.

In addition, the recommendations don’t just focus on TV; they also aim to promote:

  • regular walking, particularly brisk walking, or cycling as a form of active travel
  • activities during leisure time and breaks at work or school
  • activity as part of daily routines (such as taking the stairs instead of the lift)
  • support and encouragement for children to be active at every opportunity, such as having active school breaks

 

What does the draft NICE guidance say about food and drink?

NICE’s draft guidance says people should try to:

  • reduce the energy density of their diet; energy-dense foods (such as fried foods, confectionery and full-fat cheese) pack a lot of calories into a small quantity of food, and the guidance recommends reducing how often and how much of these is eaten – it also recommends substituting them with less energy-dense foods, such as fruit and vegetables
  • follow the principles of a Mediterranean diet – which is mainly based on vegetables, fruits, beans and pulses, whole grains, fish and using olive oil, instead of other fats
  • eat breakfast, and to choose healthy breakfast foods, such as unsweetened wholegrain cereals or bread and lower-fat milk
  • aim for meals to be enjoyable and without distractions (for example, avoid eating while watching television)
  • reduce fast food and takeaways – for example, by limiting them to no more than once a week
  • avoid sugar-sweetened drinks
  • reduce total fat intake
  • increase proportion of high-fibre or wholegrain-rich foods
  • limit intake of meat and meat products

 

How have the news sources covered this guidance?

Most news sources covered this draft guidance relatively briefly and factually, with the focus in the headlines often on the recommendations around TV watching.

While most media sources are broadly supportive, the Mail Online’s headline refers to the draft guidance as “Health watchdog's 42 pages of health tips – for the perfectly healthy!” and says it is “aimed at those who are in good health and not overweight”. This seems to imply that the guidance is a waste of time. It also goes against one of the misconceptions NICE aims to tackle – that healthy eating and physical activity are not important in people who are a healthy weight.

The Mail has failed to grasp the concept that prevention is better than cure.

Also, the guidance aims to make recommendations that can be applied to the population as a whole – which includes individuals who are overweight or obese. It doesn’t cover specific recommendations about treating overweight or obesity (that is, about how to lose weight), as there is other NICE guidance covering this.

Note – Bazian Ltd. produced two evidence reviews to support the development of this NICE guidance. This Behind the Headlines analysis was produced under the standard process.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Take TV-free days to combat obesity, health experts urge. The Guardian, September 23 2014

Limit TV to help fight obesity, says NICE. BBC News, September 22 2014

Have TV-free days to keep the weight off: Health watchdog's 42 pages of health tips -for the perfectly healthy! Mail Online, September 23 2014

Health experts tell us: Turn off the telly to stop getting fat. Daily Mirror, September 22 2014

Stop obesity: Eat healthier and move more warn health experts. Daily Express, September 23 2014

Doctors to recommend 'TV-free days' to tackle obesity. ITV News, September 23 2014

Categories: NHS Choices

Raising low self-esteem

NHS Choices - Live Well - Tue, 23/09/2014 - 11:02
Raising low self-esteem

We all have times when we lack confidence and don’t feel good about ourselves.

But when low self-esteem becomes a long-term problem, it can have a harmful effect on our mental health and our lives.

Self-esteem is the opinion we have of ourselves. When we have healthy self-esteem, we tend to feel positive about ourselves and about life in general. It makes us able to deal with life’s ups and downs better.

When our self-esteem is low, we tend to see ourselves and our life in a more negative and critical light. We also feel less able to take on the challenges life throws at us.

What causes low self-esteem?

Low self-esteem often begins in childhood. Teachers, friends, siblings, parents, and even the media give us lots of messages – both positive and negative. But for some reason, the message that you are not good enough sticks.

You may have found it difficult to live up to other people’s expectations of you, or to your own expectations.

Stress and difficult life events, such as serious illness or a bereavement, can have a negative effect on self-esteem. Personality can also play a part. Some of us are simply more prone to negative thinking, while others set impossibly high standards for themselves.

How does low self-esteem affect us?

The problem with thinking we’re no good is that we start to behave as if it’s true. “Low self-esteem often changes people’s behaviour in ways that act to confirm the person isn’t able to do things or isn’t very good,” says Chris Williams, Professor of Psychosocial Psychiatry at the University of Glasgow. 

If you have low self-esteem or confidence, you may hide yourself away from social situations, stop trying new things and avoid things you find challenging.

“In the short term, avoiding challenging and difficult situations makes you feel a lot safer,” says Professor Williams. “In the longer term, this avoidance can actually backfire because it reinforces your underlying doubts and fears. It teaches you the unhelpful rule that the only way to cope is by avoiding things.”

Living with low self-esteem can harm your mental health, leading to problems such as depression and anxiety. You may also develop unhelpful habits, such as smoking and drinking too much, as a way of coping.

How to have healthy self-esteem

In order to boost self-esteem, you need to identify and challenge the negative beliefs you have about yourself.

“You need to look at your beliefs, how you learned them and why you believe them,” says Professor Williams. “Then actively begin to gather and write down evidence that disconfirms them.”

Learn to spot the negative thoughts you have about yourself. You may tell yourself you are "too stupid" to apply for a new job, for example, or that "nobody cares" about you. Start to note these negative thoughts and write them down on a piece of paper or in a diary, suggests Professor Williams. Ask yourself when you first started to think these thoughts.

Next, start to write down evidence that challenges these negative beliefs: "I am really good at cryptic crosswords" or "My sister calls for a chat every week". Write down other positive things you know to be true about yourself, such as "I am thoughtful" or "I am a great cook" or "I am someone that others trust". Also write down good things that other people say about you.

Aim to have at least five things on your list and add to it regularly. Then put your list somewhere you can see it. That way, you can keep reminding yourself that you are OK.

“It’s about helping people recognise they have strengths as well as weaknesses, like everyone else, and begin to recognise those strengths in themselves,” says Professor Williams.

“You might have low confidence now because of what happened when you were growing up,” he says. “But we can grow and develop new ways of seeing ourselves at any age.”

Other ways to improve low self-esteem

Here are some other simple techniques that may help you feel better about yourself.

Recognise what you are good at 

We are all good at something, whether it’s cooking, singing, doing puzzles or being a friend. We also tend to enjoy doing the things we are good at, which can help to boost our mood.

Build positive relationships

If you find certain people tend to bring you down, try to spend less time with them, or tell them how you feel about their words or actions. Seek out relationships with people who are positive and who appreciate you.

Be kind to yourself

Professor Williams advises: “Be compassionate to yourself. That means being gentle to yourself at times when you feel like being self-critical. Think what you’d say to encourage a friend in a similar situation. We often give far better advice to others than we do to ourselves.”

Learn to be assertive

Being assertive is about respecting other people’s opinions and needs, and expecting the same from them.

One trick is to look at other people who act assertively and copy what they do. “It’s not about pretending you’re someone you’re not,” says Professor Williams. “It’s picking up hints and tips from people you admire and letting the real you come out. There’s no point suddenly saying, ‘I’m going to be Chris Hoy’, but you might be able to get your bike out and do a bit of cycling for the first time in ages.”

Start saying 'no'

People with low self-esteem often feel they have to say yes to other people, even when they don’t really want to. The risk is that you become overburdened, resentful, angry and depressed.

“For the most part, saying no doesn’t upset relationships,” says Professor Williams. “It can be helpful to take a scratched-record approach. Keep saying no in different ways until they get the message.”

Give yourself a challenge

We all feel nervous or afraid to do things at times. People with healthy self-esteem don’t let these feelings stop them from trying new things or taking on challenges.

Set yourself a goal, such as joining an exercise class or going to a social occasion. Achieving your goals will help to increase your self-esteem. 

Where to find help for low self-esteem

You may feel you need some help to start seeing yourself in a more positive light. Talking therapies, such as counselling or cognitive behavioural therapy, can help. Your GP can explain the different types and tell you what’s available in your area.

Read more about the different types of therapy.

You can also refer yourself for counselling or therapy. Use the NHS Choices Services Directory or visit the British Association for Counselling & Psychotherapy website to find a registered counsellor or therapist near you.

Hear Professor Williams' podcast about tackling unhelpful thinking.

 

 

Categories: NHS Choices

Late cancer diagnosis 'costing lives and money'

NHS Choices - Behind the Headlines - Mon, 22/09/2014 - 12:30

"Almost half of cancer patients diagnosed too late," says The Guardian, citing a new report that explored both the financial and health impact of late cancer diagnosis.

The late diagnosis of almost all types of cancer usually means the disease has already spread within the body, making it less treatable, reducing a patient's chances of survival, and potentially increasing the cost of effective treatments.

This means an enduring aim of cancer treatment is to pick up the disease as soon as possible, so treatment is more likely to be effective.

The report predicted around 52,000 cases of four common cancers (colon, rectal, lung and ovarian) may be spotted too late every year, costing the NHS around an extra £150 million to treat.

Various theories have been put forward to explain why this is the case, including "patients put[ting] their heads in the sand when they feared cancer", and how "doctors are struggling to get patients seen quickly".

 

Who produced this report on late cancer diagnoses?

The report was produced by Incisive Health, a specialist health policy and communications consultancy, in collaboration with experts at Cancer Research UK, a leading cancer charity. It was funded by Cancer Research UK.

The report – titled "Saving lives, averting costs: an analysis of the financial implications of achieving earlier diagnosis of colorectal, lung and ovarian cancer" – took as its premise that early diagnosis is crucial, and aimed to uncover the financial implications of achieving earlier diagnosis for colon, rectal, non-small cell lung (the most common type of lung cancer) and ovarian cancers.

The report estimated the number of people currently diagnosed with cancer using national guidance and data sources. This included data on the stage of the cancer when it was diagnosed (where available), and the authors calculated the cost of treatment. They then modelled what would happen if the cancers had been diagnosed earlier.

 

Links To The Headlines

Almost half of cancer patients diagnosed too late. The Guardian, September 22 2014

50,000 lives cut short by cancer diagnosis failings. The Daily Telegraph, September 22 2014

52,000 cancer cases a year are spotted too late: Delays blamed on 'stiff upper lip' mentality and pressure on GPs not to refer patients for costly tests. Mail Online, September 22 2014

Almost half of cancers 'caught too late'. ITV News, September 22 2014

Links To Science

Cancer Research UK. Half of cancers diagnosed at late stage as report shows early diagnosis saves lives and could save the NHS money. September 22 2014

Categories: NHS Choices

Dry-roasted peanuts may be worst for nut allergies

NHS Choices - Behind the Headlines - Mon, 22/09/2014 - 01:00

“Dry-roasted peanuts 'worst for allergies',” the Mail Online reports. New research involving mice suggests that the roasting process increases the "allergic power" of peanuts.

Researchers exposed mice to small amounts of proteins derived from either "raw" peanuts or dry-roasted peanuts, to “prime” their immune systems for an allergic reaction. They later gave them larger doses of the proteins and found that the intensity of the allergic reaction was much larger after priming with the dry-roasted protein, compared with the raw.

The researchers speculated that the roasting process may change the chemical composition of nuts, making them more likely to provoke an allergic reaction.

The research team thought this might partially explain why there is a much higher prevalence of peanut allergies in Western countries – where dry roasting is more common – compared with Eastern countries.

Importantly, the findings were based on mice, so are not directly applicable to humans. Studies involving humans would be needed to better explore these issues. There may be ethical considerations, however, due to the possible risk of anaphylaxis – a severe allergic reaction.

This research alone does not warrant the avoidance of dry-roasted peanuts out of fear of developing a nut allergy. Similarly, if you have a history of nut allergies, you shouldn’t assume that raw, boiled or fried nuts will be safe to eat. Those with an existing allergy should continue to take their normal action to prevent triggering their own allergy, which will vary from person to person.

 

Where did the story come from?

The study was carried out by researchers from universities in Oxford (UK) and Philadelphia (US), and was funded by the National Institute for Health Research Oxford Biomedical Research Centre (UK), the National Institutes of Health (US) and the Swiss National Science Foundation Prospective and Advanced Research Fellowships.

The study was published in The Journal of Allergy and Clinical Immunology, a peer-reviewed science journal. 

The UK media's reporting was generally accurate, with some outlets warning against over-extrapolating the results to humans and noting that new treatments or allergy-prevention strategies may take a long time to develop, if they emerge at all.

 

What kind of research was this?

This was an animal study, using mice to research allergic reactions to peanuts.

Peanut allergies are relatively common and can be serious, sometimes fatal. The researchers highlight how, despite similar peanut consumption, the Western world has a much higher prevalence of peanut allergy than the Eastern world. The research team suggested that this might be due to the way nuts are prepared. Eastern countries tend to eat their nuts raw, boiled or fried, whereas Western countries consume more dry-roasted nuts.

Researchers often use mice for research purposes because, as mammals, they are biologically similar to humans. Hence, conducting research on mice can tell us what might happen to humans without directly experimenting on them. The caveat is that there is no guarantee the results seen in mice will be applicable to humans; while similar, the biology of the two organisms is not identical, and the differences can sometimes be crucial.

 

What did the research involve?

The researchers studied the immune response of mice to various peanut products: peanut protein extracted from raw nuts; peanut protein extracted from dry-roasted nuts; raw peanut kernels (grain or seed); and dry-roasted peanut kernels. 

The team studied how immune cells reacted to the peanut products and the biochemistry involved in the response.

They studied three main routes of exposure to the peanut products:

  • peanut protein extracts were injected into the mice under the skin (subcutaneous route)
  • peanut kernels were fed to the mice for them to eat as they normally would (gastrointestinal route)
  • extracts were applied to sores in the skin (epicutaneous route)

The main analysis looked at the immune reactions of the mice, comparing raw with dry-roasted peanuts and peanut proteins.

 

What were the basic results?

The main finding was that the dry-roasted peanut protein extracts and whole peanut kernel elicited a much stronger immune response in the mice than the equivalent raw peanuts and extracts. This occurred consistently across all three exposure routes – on skin, in the stomach and under the skin.

Interestingly, when the mice were “primed” with low levels of dry-roasted peanut proteins to give a low-level reaction, they gave a much larger subsequent reaction to both raw and dry-roasted products. This suggested that exposure to dry-roasted nuts influenced subsequent reaction to raw nuts, possibly sensitising an individual for a strong reaction in the future.

 

How did the researchers interpret the results?

The researchers indicated that this is the first experiment to show a larger immune response elicited by dry-roasted peanuts compared with raw peanuts in a living mammal.

They suggest that: “A better understanding of how high-temperature antigen modification, such as peanut dry roasting, leads to allergic sensitisation should inform future preventive strategies, including those concerning early-age exposure, and therapeutic measures, such as the choice and route of antigen delivery in desensitisation strategies.”

 

Conclusion

This small animal study indicates that dry-roasted nuts and nut proteins cause a larger immune reaction than raw nuts. The team hypothesise that this might explain the difference between the prevalence of nut allergies in Western countries – where dry roasting is more common – and Eastern countries – where raw nuts are more typically consumed. While this study lends some weight to this idea, it does not directly prove it.

The study was consistent in its findings, giving them some validity, but we should consider that this was a small study involving mice. The findings are not directly applicable to humans, so we cannot say for sure that dry-roasted peanuts cause more allergic reactions or are the cause of the higher prevalence in the West – studies involving people would be needed to better explore this.

As the researchers acknowledge, further research confirming the findings of this study is required. This could include exploring ways to prevent nut allergies through desensitisation (immunotherapy). After these methods are developed in mouse models, they might be investigated in humans. The path from this very early-stage research to a treatment or preventative strategy is likely to be long and complex, so readers should not expect any immediate or short-term impact.

This research alone does not warrant the avoidance of dry-roasted nuts out of fear of developing a nut allergy. Similarly, if you have a history of nut allergies, you shouldn’t assume that raw, boiled or fried nuts will be safe to eat.

Those with an existing allergy should continue to take their normal action to prevent triggering their own allergy. Allergies can be very different in different people, so this might vary between individuals.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Dry-roasted peanuts 'worst for allergies': Findings will help scientists develop nuts that prevent reactions. Mail Online, September 22 2014

Peanut allergies: 'Roasted worse than raw nuts'. BBC News, September 22 2014

Peanut allergy: Roasted worse than raw for sufferers. The Independent, September 22 2014

Links To Science

Moghaddam AE, Hilson WR, Noti M, et al. Dry roasting enhances peanut-induced allergic sensitization across mucosal and cutaneous routes in mice. The Journal of Allergy and Clinical Immunology. Published online September 22 2014

Categories: NHS Choices

Mums 'feel shame' about how they feed their babies

NHS Choices - Behind the Headlines - Fri, 19/09/2014 - 13:00

"Mothers are made to feel 'marginalised and ashamed' when they breastfeed in public, according to an international study," the Mail Online reports. But the same study found mothers who bottlefeed also feel subject to criticism.

The study used discussion groups and interviews to explore the thoughts, feelings and experiences – as well as perceived barriers and facilitators – of feeding infants among a small sample of mothers in north-west England.

A common theme was the shame felt by both mothers who breastfeed and bottlefeed their babies. For example, some breastfeeding mothers discussed concerns about how they are viewed by others when exposing their bodies in public, while conversely women who bottlefeed their baby often feel frowned upon for not breastfeeding.

This was a small study involving just 63 women in one region of England, so we cannot assume its findings are representative of other, larger populations. But it does provide a useful insight into how, for some women, breastfeeding has become an emotional minefield. It suggests there is an important psychological, not just physical, aspect to breastfeeding.

The researchers conclude that health professionals need to find effective methods of providing support to combat feelings of shame in mothers who either breast or bottlefeed.

 

Where did the story come from?

The study was carried out by researchers from the University of Central Lancashire in England, the Georg Eckert Institute for International Textbook Research in Germany, and Dalarna University in Sweden.

Funding was provided by the North Lancashire Primary Care Trust.

It was published in the peer-reviewed medical journal Maternal and Child Nutrition on an open-access basis, so it can be read online for free.

The Mail Online's reporting is generally representative of this research, making it clear that these findings come from only 63 women.

However, the headline and general tone of its article mainly focuses on the shame that may be felt on breastfeeding in public. The experience of women who do not breastfeed is largely ignored.

 

What kind of research was this?

This was a qualitative study that aimed to review women's experiences, thoughts and feelings related to feeding their baby. Qualitative research uses methods such as interviews, observations and discussion groups to understand people's views and feelings, and what motivates them.

The researchers state that emotions such as guilt and blame are often reported among mothers who do not breastfeed, while those who do breastfeed can sometimes feel fear and humiliation when feeding in public places.

In this study, a sample of breastfeeding women and those who did not breastfeed (taken from two primary care trusts in north-west England) took part in discussion groups and individual interviews to explore their experiences, opinions and perceptions of feeding their baby.

 

What did the research involve?

This study reflects information as part of a wider UNICEF UK Baby Friendly Initiative community project in two community health facilities in north-west England.

A total of 63 women were recruited from various mother and baby groups or clinics (such as baby massage, mother and baby groups, and breastfeeding groups). The researchers report they took care to include women representative of low to high socioeconomic status by recruiting them from a range of different backgrounds.

The mothers' average age was 30 years, most were white British, and most were married or cohabiting and had one or two children. Their infants were mostly aged between 4 and 24 weeks, though 11 infants were aged 6 to 12 months, and 10 were over the age of 1.

Of the women recruited, 28 were breastfeeding, 11 were formula feeding, 7 were mixed feeding through breast and formula, and the remainder were feeding a combination of complementary foods with either breast or formula.

Thirty-three of the women took part in 7 discussion groups (focus groups), and 30 women were interviewed in their own homes, although 2 sets of women were interviewed in pairs rather than individually.

In both settings, women were asked a range of questions designed to explore their current infant feeding status, the intentions and motivations behind their feeding pattern, and barriers and facilitators to support. For example, among other questions, the researchers asked:

  • Why did you choose to breastfeed or formula feed your baby?
  • What information did you receive in regard to infant feeding (antenatally and postnatally)?
  • Did any professionals discuss (or provide demonstrations of) infant feeding (breastfeeding or formula feeding)?

The interviews and focus groups took between 25 and 80 minutes to complete, and were digitally recorded and transcribed in full.

 

What were the basic results?

The researchers explain how many of the women's discussions about their infant feeding experience involved feelings of shame, frequently indicating a sense of feeling out of control and a dependence on others because of insufficient information and a lack of appropriate infant feeding support.

They also say that when a mother's infant feeding method was not as she (or others) intended, this could lead to further feelings of incompetence, inadequacy and inferiority.

The researchers discuss how bottlefeeding and breastfeeding can both be a source of "offence" to others, in different ways.

They also discuss how some of the discussions revealed how women sometimes held ideals and expectations of being a "good" mother. Some women felt anxious, fearful and dependent as a result of various influences: the experience of birth, being overwhelmed by new motherhood and not feeling prepared, cultural influences, and infant feeding.

These feelings were particularly common among first-time mothers, who often weren't aware of what support they would need until faced with the realities of motherhood. Many referred to how they felt expected or under pressure to breastfeed, a pressure transmitted by cultural messages as well as health professionals. Women often experienced this as an additional burden on top of the already bewildering state of new motherhood.

When exploring the social context of any feelings of shame that were experienced by breastfeeding women, a common theme that emerged was related to exposing their breasts in public and concerns about what people thought, or being stared at or frowned upon.

Similar feelings of shame and judgement were reported by women who don't breastfeed their babies, such as people judging them for not breastfeeding. Some women also reported feeling a lack of confidence or difficulties in asking professionals for support about feeding.   

 

How did the researchers interpret the results?

The researchers say how their paper "highlights how breastfeeding and non-breastfeeding women may experience judgement and condemnation in interactions with health professionals as well as within community contexts, leading to feelings of failure, inadequacy and isolation".

They say there is a "need for strategies and support that address personal, cultural, ideological and structural constraints of infant feeding".

 

Conclusion

This informative study explores the attitudes and experiences surrounding infant feeding, as well as the perceived barriers and ways this could be changed, looking at a sample of mothers in north-west England.

A common theme revealed by mothers related to public and professional perceptions and expectations around infant feeding practices. Both breastfeeding and non-breastfeeding women discussed a sense of shame around their feeding practice for different reasons.

For example, some breastfeeding women discussed concerns about how they are viewed by others when exposing their bodies in public, while conversely women who bottlefeed their baby can feel frowned upon for not breastfeeding. Another common theme discussed by mothers related to feelings of difficulties in accessing support.

This study has provided new insights into the factors that may induce shame in new mothers. Qualitative research of this nature aims to give a detailed exploration of people's views and experiences, and all data and quotes were carefully collected and analysed.

But because of the depth of the analysis, the sample size in these studies tends to be quite small. This study therefore reflects the experiences of only 63 women in one region of England. With only four mothers from minority ethnic groups, it is not known how representative these experiences are of other cultural groups.

Women should never feel ashamed of breastfeeding in public. If other people take issue with it, it is their problem, not yours.

On the other hand, women who find it just too difficult to breastfeed, or can't for other reasons, should also not feel ashamed or guilty.

While breastfeeding does bring proven health benefits to a baby, having a happy and confident mother is probably just as important, if not more so. 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Breastfeeding in public 'still frowned upon': Mothers made to feel 'marginalised and ashamed', study finds. Mail Online, September 18 2014

Links To Science

Thomson G, Ebisch-Burton K, Flacking R. Shame if you do – shame if you don't: women's experiences of infant feeding. Maternal and Child Nutrition. Published online August 19 2014

Categories: NHS Choices

'Angelina Jolie effect' doubled breast gene tests

NHS Choices - Behind the Headlines - Fri, 19/09/2014 - 12:30

“Referrals to breast cancer clinics more than doubled in the UK after Angelina Jolie announced she had had a double mastectomy,” BBC News reports. NHS services saw a sharp rise in referrals from women worried about their family history of breast cancer.

In May 2013, actress Angelina Jolie announced that she had decided to undergo a double mastectomy followed by breast reconstruction surgery, as gene testing estimated she had an 87% chance of developing breast cancer.

Examination of trends in genetic testing clinics in the UK showed that there was a peak in referral rates in June and July, with numbers around two and a half times higher than in the previous year. There was almost a doubling in requests for predictive genetic tests for cancer risk genes, and many more enquiries about preventative mastectomy. Researchers were also encouraged to find that all referrals to genetic or family history clinics were appropriate – the so-called "worried well" were not diverting resources from where they were needed.

This study can’t prove a direct cause and effect, but the evidence seems compelling.

The researchers also speculate that, as Angelina Jolie is seen as a glamorous icon, her decision may have reassured women who fear that preventative surgery would make a woman less attractive.

The actress would have been well within her rights to keep her health confidential, particularly knowing the media interest it would create. Her decision to speak out and help destigmatise mastectomies should be congratulated.

 

Where did the story come from?

The study was carried out by researchers from the University Hospital of South Manchester NHS Trust, and the Manchester Centre for Genomic Medicine at St. Mary’s Hospital. Financial support was provided by the Genesis Breast Cancer Prevention Appeal and Breast Cancer Campaign.

The study was published in the peer-reviewed medical journal Breast Cancer Research on an open-access basis, so it is free to read online.

The UK media’s reporting was generally accurate, though the Daily Mirror got a little confused with its headline "'Angelina Jolie effect' credited for huge rise in double mastectomies to reduce breast cancer risk”.

The effect did cause a rise in the number of women being tested to see if a double mastectomy was required. However, the research didn't look at the number of operations carried out. As most of the tests would have actually proved negative, the impact on the number of operations is unlikely to have been a “huge rise”.

 

What kind of research was this?

This was a review of breast cancer-related referrals to family history clinics and genetics services within the UK for 2012 and 2013, to see how the trends changed between the two years.

As the researchers discuss, it is common for news items related to a particular health service to lead to a short-term temporary increase in interest. There is rarely a long-lasting effect once the media attention has died down. For example, the 2009 death of reality TV star Jade Goody from cervical cancer led to a short-lived increase in the number of young women attending cervical cancer screening appointments.

In 2013, there was said to be “unprecedented publicity of hereditary breast cancer” in the UK. This was associated with two things. First came the release of draft guidance from the National Institute for Health and Care Excellence (NICE) on familial (hereditary) breast cancer in January, followed by the final publication in June 2013. Second, and seemingly more significant, were the high-profile news reports that broke in May 2013 of actress Angelina Jolie’s decision to undergo a double mastectomy on finding that she had inherited a faulty BRCA1 gene – putting her at high risk of developing breast cancer.

Studies suggested that the news stories were associated with increases in attendance at hereditary breast cancer clinics and genetics services in the US, Canada, Australia, New Zealand and the UK. This study assessed the potential “Angelina Jolie effect” by looking at UK referrals due to breast cancer family history in 2012 compared with 2013.

 

What did the research involve?

This research looked at referrals specific to breast cancer for 21 centres in the UK: 12 of the 34 family history clinics invited to participate and 9 of the 19 regional genetics centres. Centres that did not supply data were reported either not to have the data available or to have been unable to collate it. Monthly referrals to each centre for 2012 and 2013 were assessed, and the trends analysed.

 

What were the basic results?

The results show that overall referral rates were 17% higher in the period January to April 2013 than they had been in the previous year (the draft NICE guidance on familial breast cancer hit the media in January 2013, prior to final publication in June). However, there was nearly a 50% rise in May 2013, which was too early to have been associated with the final publication of NICE guidance, and coincided with the media reports about Angelina Jolie.

In June and July 2013, referrals to the clinics totalled 4,847 – two and a half times as many as in the same period the previous year (1,981 in 2012). From August to October, referrals were around twice as high as they had been in the same period the previous year. Referral rates then settled down to being 32% higher in November and December 2013 than in November and December 2012.

In total, referrals rose from 12,142 in 2012 to 19,751 in 2013. There was almost a doubling in requests for BRCA1/2 testing, and many more enquiries about preventative mastectomies.
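As a rough sanity check, the ratios quoted in this article can be reproduced from the figures given above. The short sketch below is illustrative only and uses nothing beyond the numbers already reported here:

```python
# Rough arithmetic check of the referral figures quoted in this article
# (illustrative only - the underlying monthly data come from the study itself).

jun_jul_2012 = 1981   # referrals in June-July 2012
jun_jul_2013 = 4847   # referrals in June-July 2013
total_2012 = 12142    # all breast cancer-related referrals in 2012
total_2013 = 19751    # all breast cancer-related referrals in 2013
centres_reporting = 12 + 9   # family history clinics plus regional genetics centres supplying data
centres_invited = 34 + 19    # all centres invited to take part

print(round(jun_jul_2013 / jun_jul_2012, 2))             # 2.45 -> roughly two and a half times as many
print(round((total_2013 - total_2012) / total_2012, 2))  # 0.63 -> about a 63% rise year on year
print(round(centres_reporting / centres_invited, 2))     # 0.4  -> the ~40% of eligible centres represented
```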

Encouragingly, internal reviews from specific centres show that there was no increase in inappropriate referrals.

 

How did the researchers interpret the results?

The researchers conclude that, “the Angelina Jolie effect has been long-lasting and global, and appears to have increased referrals to centres appropriately”.

 

Conclusion

This is an interesting study that reviewed how the trends in breast cancer-related referrals to breast cancer family history clinics and genetics centres in the UK changed between 2012 and 2013. The overall results show an increase in 2013, with particular peaks following high-profile media events – most notably, news of Angelina Jolie’s decision to have a double mastectomy in May of that year.

However, there are a couple of points to bear in mind when interpreting these results.

Firstly, the study did not have data available from all family history clinics and genetics centres in the UK: the results represent only around 40% of the centres that would have been eligible to participate (21 of the 53 invited). It is therefore not known whether the trends would be the same were data available from all services. However, this is still a sizeable sample, so it is likely to give a good indication of the national picture.

Secondly, studies such as this can assess trends, but they cannot establish the direct cause of any changes. As this study says, there were two related events that received media attention in 2013: the publication of NICE guidance on familial breast cancer (the draft in January and the final version in June); and the higher-profile news reports in May of Angelina Jolie’s decision to have a double mastectomy due to her high risk of developing familial breast cancer.

While it may be plausible that the rises in referral rates to family history and genetics clinics were associated with this increased media attention, particularly the “Angelina effect”, it still cannot be proven that this is the only cause. The rising trend could also be related to a gradual year-on-year increase in people’s health awareness.

It would be interesting to see how trends changed in years prior to 2012. It would also be interesting to know what has happened to the trend in referral rates through 2014. 

Overall, the particular peaks in referral rates in June and July 2013 suggest that the news related to Angelina Jolie, perhaps combined with the publication of NICE guidance on familial breast cancer testing around this time, is very likely to have been associated with the increased referral rates.

This is not surprising, given the influence the media is known to have on health awareness and behaviour.

It is also encouraging to know that all referrals to genetic or family history clinics were appropriate, suggesting that the media attention is likely to have had a positive effect in increasing health awareness.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Breast cancer test 'Angelina Jolie effect' found. BBC News, September 19 2014

The Angelina effect: Surge in women going for breast cancer checks after actress speaks out about her mastectomy. Daily Mail, September 19 2014

The 'Angelina Jolie effect': Her mastectomy revelation doubled NHS breast cancer testing referrals. The Independent, September 19 2014

Angelina Jolie's breast cancer announcement doubled number of women being tested: study. The Daily Telegraph, September 19 2014

'Angelina Jolie effect' credited for huge rise in double mastectomies to reduce breast cancer risk. Daily Mirror, September 19 2014

Angelina Jolie's op sparks huge surge in the number of cancer tests. Daily Express, September 19 2014

Links To Science

Evans DGR, Barwell J, Eccles DM, et al. The Angelina Jolie effect: how high celebrity profile can have a major impact on provision of cancer related services. Breast Cancer Research. Published online September 19 2014

Categories: NHS Choices

Chokeberry extract 'boosts pancreas cancer chemo'

NHS Choices - Behind the Headlines - Thu, 18/09/2014 - 12:30

“Wild berries native to North America may have a role in boosting cancer therapy,” BBC News reports.

A laboratory study using pancreatic cancer cells has found that chokeberry extract may boost the effectiveness of chemotherapy drugs used to treat pancreatic cancer.

Researchers tested an extract of chokeberry – a plant found in eastern North America – on pancreatic cancer cells. They examined what happened to these cells in the laboratory when they were treated with chemotherapy alone, chokeberry extract alone, or a combination of both.

Researchers found that adding the chokeberry extract to gemcitabine (a chemotherapy drug used in the treatment of pancreatic cancer) was more effective at halting the growth of cancer cells than the drug alone.

Pancreatic cancer is a condition with notoriously poor prognosis, and the possibility of any new treatment on the horizon is encouraging. However, it is uncertain whether these positive lab results would translate to a real-world setting. It is expected that, based on these promising results, further studies will look into the possibility of human trial(s).

For now, people with pancreatic cancer should not consider taking these chokeberry extracts or supplements, based on this very early-stage research. "Herbal remedies" should never be assumed to be safe, and some can react unpredictably with chemotherapy drugs.

 

Where did the story come from?

The study was carried out by researchers from Middlesex University, the University of Southampton, the University of Portsmouth and King's College Hospital. It was funded by the Malaysian Ministry of Higher Education and a US charitable organisation called Have a Chance Inc.

The study was published in the peer-reviewed Journal of Clinical Pathology.

The BBC’s coverage was fair, pointing out that research was at an early stage and including independent comments from cancer experts on the need for human trials. The Daily Telegraph’s coverage only included comments from the study’s authors.

 

What kind of research was this?

This was a laboratory study in which scientists conducted various experiments examining the effect of adding chokeberry extracts to pancreatic cancer cells.

The researchers point out that pancreatic cancer has a very poor outlook and a high mortality rate, with only 1-4% of those with the cancer surviving to five years. Only 10-20% of people with pancreatic cancer are suitable for surgery, and pancreatic cancer cells are resistant to both chemotherapy and radiotherapy.

Researchers say that many studies have explored the use of dietary agents, particularly antioxidant substances called polyphenols, found in fruits and vegetables. This is because of their ability to promote apoptosis – programmed cell death – in a variety of cancer cells. Previous studies have also shown that a number of polyphenols, including those from chokeberry extracts, have potential anticancer properties in malignant brain tumours.

Chokeberry (Aronia melanocarpa) is a shrub found in North American wet woods and swamps. Extracts and supplements are popular for their apparent health-giving qualities, including their high level of antioxidants.

 

What did the research involve?

Researchers used a line of pancreatic cancer cells called AsPC-1, which were cultured in the laboratory. In a number of experiments, they assessed how well the cells grew when treated with:

  • the chemotherapy drug gemcitabine alone at different doses (gemcitabine is one of the drugs sometimes given to people after they have had surgery to remove their pancreatic cancer, to try and prevent it returning)
  • differing levels of chokeberry extract
  • a combination of gemcitabine with chokeberry extract

They also carried out experiments to examine how chokeberry extract might cause the death of cancer cells, and at what concentration it caused cell death. As a control, they also tested chokeberry extract on the healthy cells that line blood vessels. These are taken from the veins of the umbilical cord and are often used in laboratory studies.

 

What were the basic results?

Researchers found that gemcitabine in combination with chokeberry extract was more effective at killing cancer cells than gemcitabine by itself. This difference in effect was also present when using lower doses of gemcitabine.

The analysis indicated that when incubated with gemcitabine for 48 hours, a concentration of one microgram per millilitre of chokeberry extract was required to induce cell death. Generally, the higher the concentration of chokeberry extract used in combination with gemcitabine, the more cancer cells were killed.

However, chokeberry extract alone without gemcitabine was not effective at killing the cancer cells at the concentrations tested.

Healthy cells were unaffected by chokeberry extract up to a concentration of 50 micrograms per millilitre.

 

How did the researchers interpret the results?

The researchers say that chokeberry extract and other micronutrients should be considered as part of cancer therapy. More specifically, they suggest that elements in chokeberry extract may have “supra-additive effects” when used in combination with at least one conventional anti-cancer drug.

In an accompanying press release, Bashir Lwaleed, of the University of Southampton, comments: "These are very exciting results. The low doses of the extract greatly boosted the effectiveness of gemcitabine when the two were combined. In addition, we found that lower doses of the conventional drug were needed, suggesting either that the compounds work together synergistically [where the whole is greater than the sum of its parts], or that the extract exerts a 'supra-additive' effect. This could change the way we deal with hard-to-treat cancers in the future."

 

Conclusion

It is now commonly thought that the antioxidants found in fruits and vegetables may have many health benefits, including reducing the risk of some cancers.

Pancreatic cancer is a condition with notoriously poor prognosis, and the possibility of any new treatment on the horizon is encouraging. This study found that when pancreatic cancer cells in the laboratory were directly treated with a combination of the chemotherapy drug gemcitabine and chokeberry extract, adding the extract enhanced the cancer-killing potential compared to the chemotherapy drug alone.

However, directly adding an extract to cells in the laboratory is a lot different from people actually taking chokeberry extracts themselves. Though these are promising findings, it is too early to say whether the micronutrients found in this extract could be effective in the treatment of pancreatic cancer. Further scientific study will be needed before initial developments could progress to the next stage of trials in people with pancreatic cancer, to see whether chokeberry extract might enhance the effects of chemotherapy. 

For now, as experts importantly highlight, people with pancreatic cancer should not consider taking these chokeberry extracts in the form of a herbal remedy or supplement, based on this very early-stage research.

Herbal remedies, just like pharmaceutical medicines, will have an effect on the body and can be potentially harmful.

They should therefore be used with the same care and respect as pharmaceutical medicines. Being "natural" doesn't necessarily mean they're safe to take.

Read more about herbal medicines and supplements.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Berries in cancer therapy trial. BBC News, September 18 2014

Study: Berries could boost standard cancer treatment. The Daily Telegraph, September 18 2014

Links To Science

Thani NAA, Keshavarz S, Lwaleed BA, et al. Cytotoxicity of gemcitabine enhanced by polyphenolics from Aronia melanocarpa in pancreatic cancer cell line AsPC-1. Journal of Clinical Pathology. Published online September 17 2014

Categories: NHS Choices

Do artificial sweeteners raise diabetes risk?

NHS Choices - Behind the Headlines - Thu, 18/09/2014 - 12:00

"Artificial sweeteners may promote diabetes, claim scientists," reports The Guardian. But before you go clearing your fridge of diet colas, the research in question – extensive as it was – was mainly in mice.

The researchers' experiments suggest artificial sweeteners, particularly saccharin, change the bacteria that normally live in the gut and help to digest nutrients.

These changes could reduce the body's ability to deal with sugar, leading to glucose intolerance, which can be an early warning sign of type 2 diabetes.

Assessments in human volunteers suggested the findings might also apply to people. But human studies so far are limited.

The researchers only directly tested the effect of saccharin in an uncontrolled study on just seven healthy adults over the course of a week. It is far too early to claim with any confidence that artificial sweeteners could be contributing to the diabetes "epidemic".

In the interim, if you are trying to reduce your sugar intake to control your weight or diabetes, you can always try to do so without using artificial sweeteners. For example, drinking tap water is a far cheaper alternative to diet drinks.

 

Where did the study come from?

This study was carried out by researchers at the Weizmann Institute of Science and other research centres in Israel.

It was funded by the Weizmann Institute and the Nancy and Stephen Grand Israel National Center for Personalized Medicine, as well as grants from various research funders globally.

The study was published in the peer-reviewed medical journal Nature.

The Guardian covered this study well, avoiding sensationalising the results. The paper and other media outlets, including the Daily Mail, included balanced quotes from various experts that highlight the study's limitations.

However, The Guardian reports the daily amount of saccharin used in the study in humans "was enough to sweeten around 40 cans of diet cola", but it is unclear where this estimate came from. Saccharin is not commonly used in diet drinks any longer, with aspartame being the preferred choice of most manufacturers. 

The Daily Express only included quotes from the study author (for) and a representative of the British Soft Drinks Association (against), which – as you would expect – polarised the debate.

 

What kind of research was this?

This was animal and human research looking at the effect of artificial sweeteners on bacteria in the gut and how this influences glucose metabolism.

Animal research is often one of the first steps in investigating theories about the biological effects of substances. It allows researchers to carry out studies that could not be done in humans.

Because of differences between species, results in animals may not always reflect what happens in humans, but they allow researchers to develop a better idea of how things might work.

They can then use this knowledge to develop ways to test their theories using information that can be obtained in humans. This study has carried out both the animal and early human tests of their theories. But the human part of this study was relatively limited, as the focus was on the animal research.

The researchers carried out a cross-sectional analysis of artificial sweetener exposure and indicators of metabolic problems and gut bacteria. This approach is not able to determine whether the sweetener could be contributing to the outcomes seen, or vice versa.

The researchers also tested the short-term effect of saccharin on people who never consumed the sweetener, but without a control group.

 

What did the research involve?

The researchers compared the effect of consuming the artificial sweeteners against water, glucose and sucrose on glucose tolerance in lean mice and obese mice (mice eating a high-fat diet). Glucose tolerance testing assesses how quickly the body can clear glucose from the blood after glucose is eaten.

The body normally responds by quickly taking glucose up into cells for use and storage. If the body is slow to do this, this is called glucose intolerance. Very high glucose intolerance in humans indicates diabetes.

The researchers carried out various experiments to test whether the changes seen might relate to the artificial sweeteners having an effect on the bacteria in the gut, and exactly what these effects were.

They then carried out tests to see whether artificial sweetener consumption could have similar effects in humans. They did this by cross-sectionally assessing long-term artificial sweetener consumption and various indicators of glucose metabolism problems in a sample of 381 people who were not diabetic.

They also tested the effects of commercial saccharin given to seven healthy adult volunteers who did not normally consume saccharin. This was given over the course of six days at the US Food and Drug Administration's (FDA) maximum acceptable level (5mg per kg of body weight), equivalent to 120mg a day.

 

What were the basic results?

The researchers found both lean and obese mice consuming the artificial sweeteners saccharin, sucralose or aspartame in their water over 11 weeks developed glucose intolerance, while those consuming just water, glucose or sucrose did not.

Saccharin had the greatest effect on glucose intolerance, and the researchers focused most of their experiments on this sweetener. It caused glucose intolerance within five weeks when given at a dose equivalent to the US Food and Drug Administration (FDA) maximum acceptable daily intake in humans.

The researchers found the mice consuming the artificial sweeteners did not differ in their liquid and food consumption or their walking and energy expenditure compared with the controls. These factors were therefore considered to not be causing the glucose intolerance.

However, treating mice with antibiotics stopped the artificial sweeteners from having this effect. Mice with no gut bacteria developed glucose intolerance when the researchers transplanted them with gut bacteria taken from mice consuming saccharin, or with gut bacteria that had been treated with saccharin in the lab. These results suggest the sweeteners were having some effect on the gut bacteria, which was causing the glucose intolerance.

The researchers also found drinking saccharin changed the types of bacteria in the mice's guts. Drinking water, glucose or sucrose did not have this effect.

The bacteria in the gut are involved in helping to digest nutrients. The specific changes seen in mice consuming saccharin suggest the sweeteners could be increasing the amount of energy that could be harvested from these nutrients.

In their human studies, the researchers found:

  • Long-term artificial sweetener consumption in 381 people who were not diabetic was associated with greater waist circumference, waist to hip ratio, levels of glucose in the blood after fasting, and worse glucose tolerance.
  • People who consumed artificial sweeteners had a different gut bacteria composition from people who did not consume artificial sweeteners.
  • Four out of seven healthy adult volunteers who did not normally consume artificial sweeteners developed worse glucose tolerance after consuming the maximum US FDA-recommended level of saccharin for six days. These four people showed gut bacteria differences compared with the three people who did not show an effect, both before and after consuming the saccharin.
  • Transfer of gut bacteria from the volunteers showing a response to bacteria-free mice caused the mice to develop glucose intolerance. This was not seen if they transferred gut bacteria from the non-responding human volunteers to mice.

 

How did the researchers interpret the results?

The researchers concluded that consuming artificial sweeteners increases the risk of glucose intolerance in mice and humans by changing the gut bacteria and therefore affecting their function.

They say their findings suggest artificial sweeteners "may have directly contributed to enhancing the exact epidemic [obesity and diabetes] that they themselves were intended to fight".

 

Conclusion

This fascinating and controversial study in mice and humans suggests artificial sweeteners, particularly saccharin, could lead to glucose intolerance by having an effect on gut bacteria. The fact that both the animal and human experiments seem to support this adds some weight to the findings.

However, the researchers' investigations in humans are currently limited. They assessed the link between long-term artificial sweetener consumption and various indicators of metabolic problems, such as fat around the waist, using a cross-sectional design. This cannot establish which came first and therefore which could be influencing the other. Also, the only confounder in humans that seemed to be considered was body mass index.

The researchers also only directly tested the effect of one artificial sweetener (saccharin) in an uncontrolled study on just seven healthy adults over the course of a week. Saccharin is less commonly used than other artificial sweeteners, and the participants also consumed it at the maximum US FDA-recommended level (equivalent to 120mg a day).

The findings suggest – at least in the short term – saccharin may only affect glucose response in some people, depending on their gut bacteria. Larger studies, which also incorporate a control group, are needed to see whether they support the results and whether other sweeteners have similar effects.

Some earlier human studies have found links between artificial sweeteners and weight gain and increased diabetes risk. However, it has generally been assumed that this reflects reverse causation: people who already have problems with their weight are more likely to consume artificial sweeteners (because they contain no calories), and it is their existing weight problems, rather than the sweeteners, that put them at greater risk.

This study raises the intriguing possibility that artificial sweeteners could also be directly affecting how our bodies respond to sugar. However, this research is only in its early stages, and we cannot say for certain whether artificial sweeteners are contributing to the diabetes epidemic.

In the interim, if you are trying to reduce your sugar intake, you can do so without replacing sugar with artificial sweeteners.

For people trying to lose weight and those with diabetes who are trying to control their blood sugar, it is important to do what works for them as this is more likely to be sustainable in the long term.

For some people, substituting food and drinks containing artificial sweeteners, rather than those containing sugar, may help with these goals.

At this stage, it is far too early to drop artificial sweeteners from the arsenal of sugar alternatives that could be used to fight the diabetes and obesity epidemic.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Artificial sweeteners may promote diabetes, claim scientists. The Guardian, September 17 2014

Sweeteners 'linked to rise in obesity and diabetes'. The Independent, September 17 2014

Low-calorie sweeteners found in diet drinks RAISE the risk of obesity and diabetes by affecting how the body processes sugar. Daily Mail, September 18 2014

Artificial food sweeteners linked to diabetes. Daily Express, September 17 2014

Sweeteners 'could cause obesity' scientists warn. The Daily Telegraph, September 17 2014

Artificial sweeteners linked to glucose intolerance. New Scientist. September 17 2014

Links To Science

Suez J, Korem T, Zeevi D, et al. Artificial sweeteners induce glucose intolerance by altering the gut microbiota. Nature. Published online September 17 2014

Categories: NHS Choices

Cosmetics blamed for raised child asthma risk

NHS Choices - Behind the Headlines - Wed, 17/09/2014 - 12:10

"Chemicals in make-up and perfumes fuelling rise in children with asthma," reports the Mail Online.

One scientist, the website claims, suggests that women should take measures such as checking the contents of their make-up and avoiding using plastic containers for food.

This story is based on research following 300 inner-city children in the US and their mothers from the time of their pregnancy to age 11. The women's urine was tested in the third trimester for a group of chemicals called phthalates as a measure of the child's potential exposure in the womb.

They found the children of mothers who had the highest levels of exposure to two phthalates (butylbenzyl phthalate [BBzP] and di-n-butyl phthalate [DnBP]) in pregnancy were more likely to report asthma-like symptoms such as wheezing between the ages of 5 and 11, and to have current asthma.

Crucially, BBzP and DnBP are among several phthalates that have been banned from children's toys and cosmetics in the EU. The Daily Telegraph reports that from 2015 BBzP will be routinely banned. Countries outside the EU may have different legislation on the use of these chemicals.

The study's relatively small size means the magnitude of any effect on risk is uncertain. Another limitation is that the study only looked at African American and Dominican inner-city women, so the results may not apply to wider groups of women.

It's also difficult to say for certain whether the phthalates are directly causing the increase in asthma cases. The authors themselves acknowledge that the findings need to be treated with caution until they are checked in other studies.

 

Where did the story come from?

The study was carried out by researchers from Columbia University and other research centres in the US. It was funded by the National Institute of Environmental Health Sciences.

The study was published in the peer-reviewed journal Environmental Health Perspectives.

The Daily Telegraph and The Guardian both crucially note the restrictions on the use of these phthalates in the EU. The Guardian states the US has fewer restrictions on phthalate use.

This difference may contribute to the Mail Online's reports that US scientists are "urging parents to reduce the risk by avoiding using plastic containers, perfume and heavily scented washing detergents".

The researchers do not make these recommendations in their research paper, which urges caution in interpreting the results, although one of the authors is quoted in the Mail Online as making some suggestions for reducing exposure.

This may cause unnecessary concern, given that the Mail Online do not report on the existing, and impending, restrictions on the use of these chemicals in the EU. It is worth bearing in mind that many of the Mail Online's readers are based in the US, so this content may have been aimed at them.

 

What kind of research was this?

This was a prospective cohort study looking at whether exposure to chemicals called phthalates while in the womb is linked to a child's risk of developing asthma.

Phthalates are found in many consumer products, such as food packaging materials and various household products, including some beauty products. As such, people may consume some phthalates in their food or through the wider environment.

Previous studies suggested phthalates in the environment and in the body may be associated with asthma, but no studies have looked at the impact of exposure to these chemicals in the womb.

This type of study is the best way to assess whether there is an association between an earlier exposure and a later outcome in humans. While such research can provide evidence of an association, it is not possible to say for certain whether the exposure directly causes the outcome.

To weigh up whether the exposure is causing the outcome, researchers need to draw on a wide range of evidence, including human and animal studies. All or most of the evidence needs to support the possibility that the exposure causes the outcome before researchers can be relatively confident this is the case.

 

What did the research involve?

The researchers collected urine from 300 pregnant women and measured the levels of various phthalates in these samples as an indication of the exposure of the foetus to these chemicals.

They then followed up the women's children when they were aged 5 to 11 to identify any who had developed asthma. They analysed whether higher levels of exposure to phthalates were linked with an increased risk of developing asthma.

Pregnant African American or Dominican women were enrolled to take part in the Columbia Center for Children's Environmental Health (CCCEH) longitudinal birth cohort study between 1998 and 2006. To be eligible, they had to have lived in Northern Manhattan or the South Bronx for at least one year before their pregnancy.

Women who smoked or took illegal drugs, who had not received prenatal care early in their pregnancy, or who had medical conditions such as diabetes or HIV were not eligible to participate. Of the 727 women taking part in the CCCEH study, 300 had provided all the samples and information needed for this analysis.

The women provided urine samples for testing in their third trimester of pregnancy, and the children provided samples at ages three, five and seven.

In these samples, the researchers measured four chemicals (called metabolites) that are formed when four different types of phthalate break down in the body. These phthalates have long chemical names, which are abbreviated to DEHP, BBzP, DnBP and DEP.

They also measured levels of another chemical called bisphenol A, which is also found in consumer plastics and has been linked to various illnesses in some studies.

The mothers were sent asthma questionnaires five times when the children were between the ages of five and 11. These asked about whether the children had asthma symptoms or took asthma medication over the previous year.

The first time the mother reported that their child had symptoms that could indicate asthma (such as wheeze or whistling in the chest, or a cough lasting more than a week) or took asthma medications, the child was referred for a standard assessment by a doctor, including lung function tests.

Based on this assessment, the children were classified as having current asthma, or as not having current asthma despite their history of symptoms.

The researchers also assessed various factors that could affect the results (confounders) because they were thought to be associated with phthalate exposure or asthma. These included:

  • exposure to household tobacco smoke prenatally or after birth
  • maternal asthma
  • financial hardship during pregnancy (lack of food, housing, gas, electricity, clothing, or medicine)
  • prenatal bisphenol A exposure
  • child exposure to phthalates after birth (as measured in the child's urine)

They took these factors into account in their analyses, which looked at whether the level of prenatal exposure to phthalates was related to a child's risk of developing asthma.
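
For illustration, the sketch below shows one common way this kind of confounder-adjusted analysis can be done: a Poisson regression with robust standard errors that estimates relative risks. This is a rough sketch only; the data file, column names and exact model are hypothetical and are not taken from the paper.

  # Rough illustration only: estimating confounder-adjusted relative risks
  # with a Poisson model and robust standard errors. The data file and
  # column names are hypothetical, not taken from the study.
  import numpy as np
  import pandas as pd
  import statsmodels.api as sm
  import statsmodels.formula.api as smf

  df = pd.read_csv("cohort.csv")  # hypothetical file: one row per mother-child pair

  # Hypothetical columns: current_asthma (0/1) and bbzp_tertile (1 = lowest
  # third of prenatal BBzP exposure, 3 = highest), plus the confounders listed above.
  model = smf.glm(
      "current_asthma ~ C(bbzp_tertile) + household_smoke + maternal_asthma"
      " + financial_hardship + prenatal_bpa + child_phthalate_level",
      data=df,
      family=sm.families.Poisson(),
  ).fit(cov_type="HC0")  # robust variance so the estimates can be read as relative risks

  print(np.exp(model.params))  # exponentiated coefficients ≈ adjusted relative risks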

 

What were the basic results?

Just over half of the children (51%) were assessed by a doctor because they had been reported to have wheezing or other asthma-related symptoms, or to have used asthma medications. After assessment, 31% were judged to have current asthma and 20% not to have current asthma.

The levels of prenatal exposure to two phthalates, BBzP and DnBP, showed a significant association with having a history of asthma-like symptoms and with having current asthma.

Compared with children whose mothers had the lowest levels of these phthalates prenatally (levels in the bottom third of measurements), children whose mothers had the highest levels (levels in the top third of measurements) were:

  • about 40% more likely to have a history of asthma symptoms (relative risk [RR] 1.39 and 1.44 for the two different phthalates; confidence intervals [CI] showed the links were statistically significant)
  • about 70% more likely to have current asthma (RR 1.72 and 1.78 for the two different phthalates; CI showed the links were statistically significant)
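
As a rough illustration of what these relative risks (RRs) mean, using made-up counts rather than the study's actual data, a relative risk is simply the risk in the most-exposed group divided by the risk in the least-exposed group:

  # Hypothetical counts, for illustration only; not taken from the study.
  symptoms_high, total_high = 43, 100   # children in the top exposure third
  symptoms_low, total_low = 30, 100     # children in the bottom exposure third

  risk_high = symptoms_high / total_high        # 0.43
  risk_low = symptoms_low / total_low           # 0.30

  relative_risk = risk_high / risk_low          # about 1.43
  percent_increase = (relative_risk - 1) * 100  # about 43% "more likely"
  print(relative_risk, percent_increase)

On this scale, a relative risk of 1.39 or 1.44 corresponds to a roughly 40% higher risk, and 1.72 or 1.78 to a roughly 70-80% higher risk, in the most-exposed group compared with the least-exposed group.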

Analyses suggested the levels of prenatal exposure to the other two phthalates, DEHP and DEP, were not associated with a history of asthma symptoms or current asthma. The children's own levels of exposure to the phthalates from ages three to seven were also not associated with childhood asthma.

 

How did the researchers interpret the results?

The researchers concluded that, "prenatal exposure to BBzP and DnBP may increase the risk of asthma among inner-city children". They note that, as this is the first study to find this, the results need to be interpreted cautiously until they are replicated in other studies.

 

Conclusion

This study, analysing 300 inner-city women and their children, suggests there may be a link between exposure to certain phthalate chemicals prenatally and a child's risk of asthma and asthma symptoms between the ages of 5 and 11.

The strength of this study is its design: it prospectively set out the data it wanted to collect, gathered that data in a standardised way, and followed up the participants over time.

Many studies looking at the links between chemical exposures and adverse outcomes measure both at the same time, meaning it is not clear whether one came before, and therefore might directly influence, the other.

This study also had children with reported asthma symptoms assessed by a doctor to confirm their diagnosis, which is likely to be more accurate than relying solely on parental reporting.

The study does have its limitations, however:

  • The study was relatively small and in a very select group of women (of African American and Dominican ethnicity, living in inner-city areas). Results may not be representative of what might be found in a larger, more diverse, sample.
  • The small sample size also means it's hard to be precise about the size of any increase in risk associated with the chemicals: the increase in the risk of asthma-like symptoms could be anywhere from 5% upwards, and the increase in current asthma anywhere from 15% upwards (see the sketch after this list).
  • Phthalate metabolites in the pregnant women's urine were only measured once, in the third trimester, and this may not be representative of exposure throughout the whole pregnancy. The researchers report that studies comparing levels of these chemicals in people's urine over time show only "moderate" consistency.
  • As with all studies of this type, other factors may have an effect on the results (confounders). The authors did take into account a range of potential confounders, but their effect may not be completely removed, and unmeasured factors may also be having an effect.
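
On the point about precision, the sketch below uses made-up counts to show the standard large-sample 95% confidence interval for a relative risk, and why a small study gives a wide (imprecise) interval. The numbers are purely illustrative and are not the study's data.

  # Illustration only, with hypothetical counts: the usual large-sample 95%
  # confidence interval for a relative risk, showing how sample size affects precision.
  import math

  def rr_with_ci(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
      rr = (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)
      # standard error of log(RR)
      se = math.sqrt(1 / cases_exposed - 1 / n_exposed
                     + 1 / cases_unexposed - 1 / n_unexposed)
      lower = math.exp(math.log(rr) - 1.96 * se)
      upper = math.exp(math.log(rr) + 1.96 * se)
      return rr, lower, upper

  print(rr_with_ci(35, 100, 25, 100))      # small groups: RR 1.4, interval roughly 0.9 to 2.2
  print(rr_with_ci(350, 1000, 250, 1000))  # ten times larger: RR 1.4, interval roughly 1.2 to 1.6

With around 300 children in total, and therefore roughly 100 in each exposure third, the first, wider kind of interval is what would be expected.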

These are early findings on this particular association, and it's not possible to say for certain whether these chemicals are definitely having an effect on the child's asthma risk. The authors of the study themselves are appropriately cautious, suggesting that their findings need to be confirmed in other studies before firm conclusions can be drawn.

The study also did not assess the sources of the women's exposure to phthalates. The researchers say that, based on previous studies, PVC products could be a likely "substantial source" of BBzP exposure in the home.

If evidence accumulates that chemicals used in consumer products may be associated with health risks, it's likely that government agencies will review this evidence and come to a decision about whether their use needs to be limited.

Phthalates are a group of chemicals that are being extensively studied, and there are already EU-wide regulatory controls on their use.

For example, there is a ban on using six phthalates, including BBzP and DnBP, in toys and products for children under the age of three. BBzP and DnBP are also banned in cosmetics in the EU.

The UK Food Standards Agency also says there has been a move away from using phthalates in some food packaging in Europe, and has assessed the levels of phthalates in food and the associated potential risks.

The Daily Telegraph reports BBzP "will be among three chemicals whose use is routinely banned by the EU" from 2015. 

Analysis by Bazian. Edited by NHS Choices.

Links To The Headlines

Some household plastics could increase risk of childhood asthma, study finds. The Guardian, September 17 2014

Chemicals in make-up and perfumes fuelling rise in children with asthma. Daily Mail, September 17 2014

Asthma risk from exposure to chemicals in the womb. The Daily Telegraph, September 17 2014

Links To Science

Whyatt RM, Perzanowski MS, Just AC, et al. Asthma in Inner-City Children at 5-11 Years of Age and Prenatal Exposure to Phthalates: The Columbia Center for Children’s Environmental Health Cohort (PDF, 643kb). Environmental Health Perspectives. Published online September 17 2014
