Food as medicine: why do we need to eat so many vegetables and what does a serve actually look like?

This is the first article in a three-part package “food as medicine”, exploring how food prevents and cures disease. The Conversation


Most Australian adults would know they’re meant to eat two or more serves of fruit and five or more serves of vegetables every day. Whether or not they get there is another question.

A recent national survey reported 45% of Australian women and 56% of Australian men didn’t eat enough fruit, and 90% of women and 96% of men didn’t eat enough vegetables. These figures are worse than those for the preceding ten years.

Men had on average 1.6 serves of fruit and 2.3 serves of vegetables per day, and women had 1.8 serves of fruit and 2.5 serves of vegetables. A serve of fresh fruit is a medium piece (about 150 grams) and a serve of vegetables is half a cup of cooked vegetables or about a cup of salad.
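As a rough back-of-envelope exercise, these serve sizes can be turned into daily gram totals. This is a sketch only: the 150-gram fruit serve comes from the article, while the 75-gram vegetable serve is an assumed weight for half a cup of cooked vegetables, and actual serve weights vary by food.

```python
# Rough serve-to-grams converter based on the serve sizes above.
# FRUIT_SERVE_G comes from the article (a medium piece, ~150 g);
# VEG_SERVE_G is an assumed ~75 g for half a cup of cooked vegetables.

FRUIT_SERVE_G = 150
VEG_SERVE_G = 75

def daily_grams(fruit_serves, veg_serves):
    """Return (fruit grams, vegetable grams) for a day's serves."""
    return fruit_serves * FRUIT_SERVE_G, veg_serves * VEG_SERVE_G

# The recommended two-and-five pattern:
print(daily_grams(2, 5))  # (300, 375)

# Average reported intake for men (1.6 fruit and 2.3 vegetable serves):
print(daily_grams(1.6, 2.3))
```

By this rough measure, the average man eats around 240 grams of fruit and a little over 170 grams of vegetables a day, less than half the recommended vegetable weight.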
Why do we need so many veggies?

A high intake of fruit and vegetables lowers the risk of type 2 diabetes, heart disease, stroke and some cancers. These chronic diseases are unfortunately common – it’s been estimated A$269 million could have been saved in 2008 if everyone in Australia met fruit and vegetable recommendations.

The recommendation to include plenty of vegetables and fruit in our diet is based on a large body of evidence showing the risk of a range of health conditions is reduced as we eat more fruit and vegetables. The specific targets of two serves for fruit and five to six serves for vegetables are largely based on nutrient requirements for healthy people and what diets usually look like for the average Australian.

So to set these guidelines, certain assumptions are made about dietary practices, such as breakfast being based around cereal/grain and dairy foods, and main meals comprising meat and vegetables, usually with a side of something starchy like rice, pasta or the humble potato – an Australian staple.

Does this mean it’s the only pattern to meet all the nutrient requirements? No. Could an adult be equally healthy if they ate three serves of fruit and four serves of vegetables? Yes, probably.

Some recent research even suggests our current targets don’t go far enough. It estimates an optimal intake for reducing our risk of heart disease and early death to be around ten serves of fruit and vegetables a day. Whether we are aiming for two and five, or ten serves, is somewhat academic – the clear message is most of us need to increase our fruit and vegetable intake.

Why is two and five such a hard ask?

The populations of most Western countries report eating far less fruit and vegetables than they’re supposed to. So what’s making it so hard for us to get to two and five?

Diets higher in fat, sugar and grains are generally more affordable than the recommended healthy diets high in fruit and veg. In fact, for Australians on low incomes, a healthy food basket for a fortnight would cost 28 to 34% of their income, up to twice the national average for food expenditure.

As a result, people with limited access to food for financial reasons often choose foods with high energy content (because they are filling) over those with high nutritional value but low energy content, like fruit and vegetables. These high-energy foods are also easy to over-consume, which may contribute to weight gain. People on lower incomes generally have a diet that is poorer in quality but not lower in energy content, which contributes to a higher rate of obesity, particularly in women.

Fresh fruit and vegetables cost more to purchase on a dollars per kilojoule basis, and also perish more quickly than processed foods. They take more time and skill to prepare and, after all of that effort, if they don’t get eaten for reasons of personal preference, they go to waste. For many it may not stack up financially to fill the fridge with fruit and vegetables. Under these circumstances, pre-prepared or fast food, which the family is sure to eat without complaint or waste, is all too convenient.

How we can increase veggie intake

The home and school environments are two key influences on children’s food preferences and intakes. Parents are the “food gatekeepers” and role models, particularly for younger children. Where there is parental encouragement, role modelling and family rules, fruit and vegetable intake increases.

Dietary behaviours and food choices often start in childhood and continue through adolescence to adulthood. So encouraging fruit and vegetable intake in schools by mechanisms such as “fruit snack times” may be a good investment.

Policy approaches include subsidising healthy foods, levying a tax on foods of low nutritional value, improving food labelling, and imposing stricter controls on the marketing of unhealthy foods. In Australia, debate continues around a tax on sugar-sweetened beverages, which could be used to subsidise healthy foods such as fruit and vegetables.

Research has found the more variety in fruit and vegetables available, the more we’ll consume. Those who meet the vegetable recommendation are more likely to report having at least three vegetable varieties at their evening meal. So increasing the number of different vegetables at the main meal is one simple strategy to increase intake.

This could be made a journey of discovery by adding one new vegetable to the household food supply each week. Buying “in season” fruit and vegetables and supplementing fresh varieties with frozen and canned options can bring down the total cost. Then it’s a matter of exploring simple, quick and tasty ways to prepare them so they become preferred foods for the family.

Genevieve James-Martin, Research Dietitian, CSIRO; Gemma Williams, Research Dietitian, CSIRO, and Malcolm Riley, Nutrition Epidemiologist, CSIRO

This article was originally published on The Conversation. Read the original article.

Can turmeric really shrink tumours, reduce pain and kill bacteria?

Turmeric is a yellow-coloured spice widely used in Indian and South East Asian cuisine. It’s prepared from the root of a plant called Curcuma longa and is also used as a natural pigment in the food industry.

In the literature, curcumin is reported to be an antioxidant that protects the body against damage from reactive molecules known as free radicals. These are generated in the body as a result of metabolism and cause cell damage.

It’s also reported to have anti-inflammatory, anti-bacterial and anti-cancer properties, as well as encouraging the death of cells that are dangerous or no longer needed by the body.

Curcumin has been widely studied in relation to numerous ailments, but what does the literature say? Is consuming turmeric beneficial?

For aches and pains

Chronic inflammation has been linked to the development of numerous diseases such as obesity, diabetes, heart disease and cancer. There is some evidence curcumin reduces the levels of certain substances (cytokines) that produce inflammation.

Systematic reviews and meta-analyses, which combine data from several randomised controlled trials (where an intervention is tested against a placebo, while the subjects and those conducting the study don’t know who has received which treatment) support this finding to a certain extent.

A meta-analysis of nine randomised controlled trials showed taking curcumin supplements led to a significant reduction in cytokines that produce inflammation. But the authors claimed these reductions were modest, and it’s unclear if they would actually have a benefit in real life.

These trials were conducted with small sample sizes ranging from 10 to 50 people, which reduces the strength of the evidence. It’s difficult to draw a conclusion on a beneficial dose and how long you should take curcumin, or the population group that can benefit the most from curcumin.

A meta-analysis investigated the effects of turmeric/curcumin on pain levels in joint arthritis patients. The group supplemented with 1000mg of curcumin per day said they had reduced pain compared with the placebo group.

In this study, curcumin was found to be as effective as ibuprofen in terms of reducing pain levels in these patients. But the authors of this meta-analysis themselves suggested that due to small sample size and other methodological issues there is not sufficient evidence to draw definitive conclusions.

Turmeric is often marketed as an anti-inflammatory.

For diabetes and heart disease

Curcumin is also thought to be beneficial in preventing insulin resistance (which leads to increased blood sugar), improving high blood sugar and reducing the toxic effects of high blood glucose levels.

But these studies have been conducted in animals, and very few human trials have been conducted in this area.

One study that reported a reduction in blood glucose levels in type 2 diabetes patients found a change in blood glucose from 8.58 to 7.28 millimoles per litre after curcumin supplementation. People with levels above seven are classified as having diabetes. So in clinical terms, the change is not that much.

Similarly in relation to heart disease, animal studies show benefits of curcumin supplementation in improving heart health, but there are very few clinical trials conducted in heart disease patients.

Smaller clinical trials, some with as few as ten patients, also show benefits of curcumin in reducing serum cholesterol, which is a risk factor for heart disease. But meta-analyses looking at the combined effects of different trials do not show these benefits.

For cancer

Curcumin has also been widely studied in relation to its anti-cancer properties. Laboratory and animal studies support this claim, but the evidence for cancer prevention in human trials is lacking.

Although there are some small studies (in 25 cancer patients) that showed reductions in precancerous lesions, and two patients showed shrunken tumours, these small numbers are not enough to conclude curcumin has anti-cancer effects.

There is some evidence curcumin lessens the severity of side-effects from radiation therapy such as radiation-induced dermatitis and pneumonitis (inflammation of lungs), but not the cancer itself.

Safety

Research shows not all curcumin we take orally is absorbed. This has led to the use of additives such as lipids (fats) and piperine (found in black pepper) to help it absorb into our system.

High intakes (up to 12 grams a day) of curcumin can cause diarrhoea, skin rash, headaches and yellow-coloured faeces. The Indian population consumes about 100mg of curcumin a day, which corresponds to 2 to 2.5 grams of turmeric per day.
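The dose figures above can be sanity-checked with some simple arithmetic. This is a sketch; the 4.5% midpoint used below is an assumption derived from the article’s own numbers, not a measured value.

```python
# If 100 mg of curcumin corresponds to 2 to 2.5 g of turmeric, then
# turmeric is roughly 4-5% curcumin by weight.
curcumin_mg = 100
for turmeric_g in (2.0, 2.5):
    pct = curcumin_mg / (turmeric_g * 1000) * 100
    print(f"{turmeric_g} g turmeric -> {pct:.1f}% curcumin by weight")

# At an assumed ~4.5% midpoint, the 12 g/day curcumin doses linked to
# side effects would require about 267 g of turmeric a day, far more
# than anyone eats as a spice.
print(round(12 / 0.045), "g of turmeric for 12 g of curcumin")
```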

Consuming turmeric as part of a balanced diet is probably good for you. Injecting it definitely isn’t.


But they also consume these amounts over relatively long periods of time (typically their lifespan). There are reports of lower cancer rates in the Indian population, and this has been linked to turmeric consumption, but there are no long-term trials proving this link.

It appears that in order to receive benefits from high doses over a short period of time, people are now resorting to injecting turmeric intravenously. There is no evidence to support the benefits of high doses of turmeric or IV injections of turmeric at all.

In fact, at very high doses, curcumin’s predominant activity switches from antioxidant to pro-oxidant, which means rather than preventing cells from damage, it promotes cell damage and has also been reported to cause tumours in rodents.

Although curcumin is showing some encouraging effects in reducing markers of inflammation in humans, the majority of the pharmacological effects of curcumin are in lab studies or animal experiments. Until there are more high quality randomised controlled trials conducted to confirm the benefits of curcumin or turmeric, it’s best to consume turmeric orally as a spice as part of a healthy, nutritious diet.

Gunveen Kaur, Lecturer In Nutritional Sciences, Deakin University

This article was originally published on The Conversation. Read the original article.


WA alcohol reduction ad ranked best in the world [VIDEO]

A graphic advertisement showing how alcohol is absorbed into the bloodstream, increasing the risk of cancerous cell mutations, is the most effective alcohol education advertisement in the world, according to a new study.

The study, published in BMJ Open, tested 83 alcohol education advertisements from around the world and found that the Western Australian advertisement ‘Spread’ (by The Brand Agency) was most likely to motivate drinkers to reduce their alcohol consumption.

The advertisement demonstrates that alcohol is carcinogenic, which Cancer Council Victoria CEO Todd Harper said is still widely unknown in the community.

“Our 2015 survey of Victorian men and women found that nearly half of the respondents either believed that alcohol made no difference or were not sure if it had any effect on a person’s risk of cancer,” Harper said.

“It’s worrying because alcohol is a Group 1 carcinogen – the highest classification available. It means that there is strong evidence that alcohol causes cancer at some body sites in humans.”

Harper said that every drink increases the risk of cancer of the mouth, throat, bowel, liver and female breast.

“More than 3200 cases of cancer each year in Australia could be prevented if people limited their alcohol consumption,” he said.

“We recommend that those who choose to drink alcohol consume no more than two standard drinks on any day.”

Harper said the research highlights how mass media campaigns can be used to help the public understand more about the consequences of long-term alcohol consumption.

“We’ve seen how effective campaigns around drink driving and short term harms such as injury or violence have been in terms of changing our drinking habits, but in Victoria and the majority of the rest of Australia, we rarely see the long-term health effects of alcohol portrayed on our screens,” he said.

Cancer Council Victoria is hoping to use the top advertisement in a campaign later in 2017.

Diageo rolls out new alcohol labelling

Global alcohol producer Diageo has begun rolling out new labelling intended to make it clear how much alcohol consumers are having in each drink.

The labelling introduces clearer ‘icon-led’ on-pack information panels including alcohol content and nutritional information per typical serve.

According to research released by the company, consumers want more information about what’s in their drink and want it presented in a clearer way.

The study, of more than 1,000 Australians aged 18 and over, found that nearly three-quarters (73 per cent) of consumers believe it is important to have clear information about the number of standard drinks, calorie content and alcohol strength of their beverages.

Research insights also show that women (52 per cent) are significantly more likely than men (38 per cent) to rate clearer labelling as very important.

“As part of our commitment to responsible drinking, we’re always looking for ways to help consumers make the most informed decisions around drinking, or choosing not to drink. This initiative helps consumers have clearer information about what’s in their glass, and in a way that they can easily understand it at a glance,” said Diageo spokesperson Kylie McPherson.

Updated labelling on Bundaberg Rum.

Research insights also highlight information gaps in existing alcohol labelling – given calories aren’t currently detailed on most labels. Over half (57 per cent) of Australians find it difficult to work out the calorie content within a serve of alcohol, and 87 per cent of people have no understanding of the calorie content in their favourite drink.

Additionally, only one in five (18 per cent) consumers claimed they found it very easy to know how many standard drinks there are per serve of alcohol.

Bundaberg Rum Original is the first brand to receive the updated information panels and will be followed by a roll out across the wider Diageo portfolio.

Marmite may be good for the brain – study

Iconic British yeast extract Marmite may affect brain function, according to researchers at the University of York.

The study, published in the Journal of Psychopharmacology, found that those who ate the product had an apparent increase in a chemical messenger associated with healthy brain function.

Participants consuming a teaspoon of Marmite every day for a month, compared to a control group who consumed peanut butter, showed a substantial reduction of around 30 per cent in their brain’s response to visual stimuli, measured by recording electrical activity using electroencephalography (EEG).

Researchers think this may be due to the prevalence of vitamin B12 in Marmite increasing levels of a specific neurotransmitter – known as GABA – in the brain.

Anika Smith, PhD student in York’s Department of Psychology and first author of the study, said: “These results suggest that dietary choices can affect the cortical processes of excitation and inhibition – consistent with increased levels of GABA – that are vital in maintaining a healthy brain.

“As the effects of Marmite consumption took around eight weeks to wear off after participants stopped the study, this suggests that dietary changes could potentially have long-term effects on brain function.

“This is a really promising first example of how dietary interventions can alter cortical processes, and a great starting point for exploring whether a more refined version of this technique could have some medical or therapeutic applications in the future. Of course, further research is needed to confirm and investigate this, but the study is an excellent basis for this.”


Aussies not eating enough fruit and veggies – report

Four out of five Australian adults are not eating enough fruit and vegetables to meet the Australian Dietary Guidelines, according to a report by the CSIRO.

The Fruit, Vegetables and Diet Score Report released today, found one in two (51 per cent) adults are not eating the recommended intake of fruit, while two out of three adults (66 per cent) are not eating enough vegetables.

The report, produced by the CSIRO and commissioned by Horticulture Innovation Australia, compiled the dietary habits of adults across Australia over an 18 month period.

With 145,975 participants nationwide, the survey was the largest of its kind ever conducted in Australia.

The overwhelming message is that most Australians are not as healthy as they think, and need to eat higher quantities and a greater variety of fruit and vegetables every day to meet the minimum Australian benchmark.

Percentage of men and women who meet the guidelines.

To help meet the benchmark, the CSIRO suggests adults eat at least three serves of different vegetables at dinner each night.

“Many Aussies believe themselves to be healthy, yet this report shows the majority of those surveyed are not getting all the beneficial nutrients from fruit and vegetables needed for a healthy, balanced diet,” Research Director and co-author of the CSIRO Total Wellbeing Diet Professor Manny Noakes said.

“One simple way to boost your intake is to eat three different types of vegetables with your main evening meal.”

One of the key findings in the research is that a focus on variety could be the solution to boosting consumption.

CSIRO researchers analysed this data to develop a comprehensive picture of the country’s fruit and vegetable consumption.

Women reported slightly better fruit and vegetable consumption with 24 per cent meeting both guidelines, compared with only 15 per cent of men surveyed.

When comparing the figures by occupation, construction workers and those in the science and programming sector recorded the poorest fruit and vegetable eating habits.

On the other hand, retirees and health industry workers were more likely to meet the recommended dietary guidelines.

The report also found that the CSIRO Healthy Diet Score (which measures overall diet quality on a scale of zero to 100) is positively correlated with fruit and vegetable intake.

To take the free CSIRO Healthy Diet Score visit www.csirodietscore.com


The online tool that can track, monitor and analyse nutritional intake

We all know that most people could improve the quality of their diet. Most of us do not eat the recommended five-a-day portions of fruit and vegetables – let alone seven or even ten, as some have suggested. Nor do we consume adequate amounts of oily fish.

Instead, intakes are often too high in saturated fats and sugars added to foods and fruit juice, and too low in fibre and some key vitamins and minerals, including vitamin A and iron. A significant proportion of adults in both the UK and the US are obese or overweight. Intake of red and processed meat is too high, and meat consumption continues to rise in the US, the European Union and the developed world. Despite a shift toward higher poultry consumption, the largest proportion of meat consumed in the US is still red meat (58%).

There are serious implications for long-term health as a result of this disordered way of eating. To improve the situation we need to know how much energy and how many nutrients are being provided by our food. To help do this, we developed myfood24, an online dietary assessment tool that can support accurate, detailed recording of food and nutrient intake by researchers, but which can also support patients with diet-related conditions, sports enthusiasts, families with “picky” eaters and others. With nutrient data on 40,000 foods, it includes the largest and most complete food composition table in the UK, and possibly the world.

Monitoring intake

The size of portions and packaging has increased over the past 50 years, as has the number of products on supermarket shelves. This variety of choice makes it hard for consumers to even start to estimate how many calories or nutrients they might be consuming.

A new generation of smartphone apps offers users a chance to monitor their intake. However, there isn’t strong evidence that most of these are effective. Twenty-eight of the top 200 rated health and fitness apps from Google Play and iTunes focused on both weight management and self-monitoring of diet. When these apps were compared with standard weighed records of the food people ate, the apps over- or underestimated energy intake by 10-14%.
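The over- or underestimation reported for these apps is simply a signed percentage error relative to the weighed record. As a toy illustration (the kilojoule values below are invented for the example):

```python
def percent_error(app_kj, weighed_kj):
    """Signed percentage error of an app's energy estimate
    against a weighed food record."""
    return (app_kj - weighed_kj) / weighed_kj * 100

# An app reporting 9000 kJ against a weighed 8000 kJ overestimates:
print(percent_error(9000, 8000))  # 12.5

# An app reporting 7000 kJ against the same record underestimates:
print(percent_error(7000, 8000))  # -12.5
```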

But it’s not just consumers who are affected by inaccurate monitoring. Researchers, who base their studies on this kind of data, also encounter problems.

A major limitation of nutrition research is getting an objective measure of dietary intake. Misreporting is a big problem when people self-report their diet and is particularly common in overweight or obese people. Misreporting generally tends towards under-reporting of unhealthy foods and over-reporting of fruits and vegetables.

Metabolic profiling, which involves testing urine for the hundreds of metabolites that provide chemical signatures of food and nutrient intakes, doesn’t require self-reporting and may be a useful addition to self-reporting. A highly controlled study of 19 people fed four different diets found differences in metabolite concentrations. While this approach cannot replace the need to determine what actual food and nutrients have been consumed, it could be used as an objective measure to validate self-reports.

Understanding what nutrients are in the food we eat also relies on having comprehensive and up-to-date food composition tables – standardised national databases with accurate measures of many nutrients in typical foods. Standard food composition tables in the UK list around 3,000 food items, the majority of which are generic rather than branded, even though branded foods are what more of us are likely to consume. While the tables include the full range of nutrients, they cover only a limited selection of the foods available for purchase.


Pre-packed foods are legally required to carry labels with nutritional values. These include values for energy (kJ and kcal) and amounts of fat, saturates, carbohydrate, sugars, protein and salt. Further information, such as mono- and polyunsaturated fats, starch, fibre, vitamins or minerals, can be included but is not compulsory. If a nutrition or health claim is made on the packaging, the amount of that nutrient must also be stated.

Real time feedback

Developed with funding from the Medical Research Council, myfood24 combines the convenience of new technologies with an enhanced food composition table. Covering a wide range of generic and branded foods, it’s a quick and easy tool to help researchers, and potentially also clinicians, to track, monitor and analyse nutritional intake. We mapped nutrient data for 40,000 foods from food label information and generic food data. To get an idea of the scale of this, the number of products on supermarket shelves is around 50,000 items.

The tool replaces the need for the time-consuming and costly coding of paper records that researchers and clinicians use. It means that people can record their dietary intake by selecting foods and portion sizes and adding them to their food diary. We hope this will support more accurate self-reporting, especially as users can be less self-conscious than when reporting to an interviewer. Researchers can then use the results to find out in detail what foods and nutrients are being eaten. This data can then be linked to health outcomes or matched against recommendations.
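The core calculation such a tool performs, turning a food diary of portions into nutrient totals via a composition table, can be sketched as follows. The foods, nutrient values and portion sizes here are invented for illustration and are not myfood24’s actual data.

```python
# Toy sketch of how a dietary assessment tool turns a food diary into
# nutrient totals: each diary entry (food, grams eaten) is scaled
# against a per-100 g food composition table. Table values below are
# illustrative, not real label data.

COMPOSITION_PER_100G = {
    # food: {nutrient: amount per 100 g}
    "banana":          {"energy_kj": 370,  "fibre_g": 2.6, "sugars_g": 12.0},
    "wholemeal bread": {"energy_kj": 1000, "fibre_g": 7.0, "sugars_g": 4.0},
}

def nutrient_totals(diary):
    """diary: list of (food, grams) pairs -> dict of summed nutrients."""
    totals = {}
    for food, grams in diary:
        for nutrient, per_100g in COMPOSITION_PER_100G[food].items():
            totals[nutrient] = totals.get(nutrient, 0.0) + per_100g * grams / 100
    return totals

day = [("banana", 150), ("wholemeal bread", 80)]
print(nutrient_totals(day))
```

The real system adds portion-size imagery, branded-food lookup and validation on top, but the scaling step above is the arithmetic that replaces hand-coding of paper records.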

Real time feedback of nutrients in foods could help us choose a more appropriately balanced diet over the week. Much as we have come to rely on regular visits to the dentist to ensure our teeth are healthy, the regular use of dietary monitoring could help us to ensure that our food and nutrient intakes are also healthy.

Janet Cade, Professor of Nutritional Epidemiology and Public Health, University of Leeds

This article was originally published on The Conversation. Read the original article.



Does gluten prevent type 2 diabetes? Probably not

A recent analysis of a massive study observing the effect of food on the health of nearly 200,000 American health professionals suggested eating more gluten was associated with a lower risk of type 2 diabetes.

But is it really this simple?

Can gluten be linked to diabetes?

A considerable amount of published research has looked at the potential links between coeliac disease and type 1 diabetes (a chronic condition where the pancreas produces little or no insulin). This has led to the discovery that they often share similar genetic markers linked to the immune system.

Another recent study found that, although coeliac disease was more common in people with type 1 diabetes, there were no more cases of coeliac disease in people with type 2 diabetes (which usually presents in adulthood, and is typically associated with lifestyle factors) than in the general population.

However, while studies in animals suggest gluten may increase risk of developing type 1 diabetes, human studies do not. A large review investigating when infants are first given gluten and their risk of developing type 1 diabetes found no link, unless infants were fed solids in their first three months, which is much younger than the six months recommended by the World Health Organisation.

And in animal studies of type 2 diabetes, it has been suggested gluten may increase the risk of developing diabetes.

How reliable are the study results?

Mice studies are interesting, but we need to look at data from people. This is typically done in either clinical trials, which can assess causality (that one thing caused the other), or by observing groups, which identify associations only (two things happened together, but one didn’t necessarily cause the other).

This new study fits into the latter. The study looked at data from three big studies that started 40 years ago with the Nurses’ Health Study, and continued with Nurses’ Health Study II (1989) and Health Professionals Follow Up Study (1986). These looked at the effect of nutrition on long-term health.

The latest news, suggesting gluten may lower risk of type 2 diabetes, was reported at an American Heart Association conference last week. The full research paper is not readily available, so we have to rely on a press release from the AHA.

This reported that the 20% of people with the highest intake of gluten had a 13% lower risk of developing type 2 diabetes compared to those eating less than 4g a day (which is equivalent to less than two slices of bread).

Foods that contain gluten often also contain other good things.
from www.shutterstock.com

So, it could seem that gluten intake is protective against developing type 2 diabetes.

However, a more likely explanation could be that this is an effect of other components of foods that also contain gluten. Perhaps eating wholegrains (including wheat, barley and rye) could be responsible for the reported results. They are key dietary sources of gluten and are rich in fibre and a number of vitamins (such as vitamin E) and minerals (such as magnesium).

Evidence of this can be seen in an earlier analysis of the same data, which found that those consuming the most wholegrain had a 27% reduced risk of developing type 2 diabetes.

It’s also plausible that the foods people were eating that didn’t contain gluten were more likely to be discretionary foods, such as French fries, and that could be a factor. This was also seen in another analysis of this data, which found the highest consumers of French fries had a 21% increased risk of developing type 2 diabetes.

Avoiding gluten can mean losing important nutrients

So, any conclusions regarding effects of gluten in prevention of type 2 diabetes cannot be drawn from this study. The authors acknowledge this in the conference media release. The observed effect is likely to be related to other factors in foods consumed or not consumed.

The study also suggests that for people who do not have a clinical reason to avoid gluten (such as coeliac disease, wheat allergy or other gluten sensitivities), restricting the intake of foods that could have other benefits can be harmful. They need to look for replacement sources of fibre and other nutrients.

Avoiding gluten is an increasing trend, possibly linked to media attention associated with popular alternative dietary messages such as “paleo”, or following the latest fad diets observed in celebrities and athletes. This may not be a problem if nutrients are replaced by other foods. But that can be challenging, particularly if there are diet or food restrictions in such plans.

To get the best out of this way of eating, it’s important to have a comprehensive understanding of diet and nutrition, which may require a visit to a dietitian or other healthcare professional.

Including foods containing gluten, unless you have a medical reason to exclude them, can be the simplest way to benefit from the fibre and other nutrients they contain. If you wish to remove gluten from your diet, you should look to include healthy, naturally gluten-free grains such as quinoa or buckwheat.

Although this study is interesting, it’s important to remember that without a medical reason, going gluten free is unlikely to result in any therapeutic benefits. But if you do, you need to ensure you don’t replace these foods with discretionary foods high in fat, salt and sugar.

Duane Mellor, Associate Professor in Nutrition and Dietetics, University of Canberra and Cathy Knight-Agarwal, Clinical Assistant Professor of Nutrition and Dietetics, University of Canberra

This article was originally published on The Conversation. Read the original article.

Why gluten-free food is not the healthy option

It’s hard not to notice that the range of gluten-free foods available in supermarkets has increased massively in recent years. This is partly because of the rise in the number of people diagnosed with coeliac disease and gluten sensitivity, and partly because celebrities such as Gwyneth Paltrow, Miley Cyrus and Victoria Beckham have praised gluten-free diets. What used to be prescription-only food is now a global health fad. But for how much longer? New research from Harvard University has found a link between gluten-free diets and an increased risk of developing type 2 diabetes.

Gluten is a protein found in cereals such as wheat, rye and barley. It is particularly useful in food production. For example, it gives elasticity to dough, helping it to rise and keep its shape, and providing a chewy texture. Many types of foods contain gluten, including less obvious ones such as salad dressing, soup and beer.


The same protein that is so useful in food production is a nightmare for people with coeliac disease. Coeliac disease is an autoimmune disorder in which the body mistakenly reacts to gluten as if it were a threat to the body. The condition is quite common, affecting one in 100 people, but only a quarter of those who have the disease have been diagnosed.

There is evidence that the popularity of gluten-free diets has surged, even though the incidence of coeliac disease has remained stable. This is potentially due to increasing numbers of people with non-coeliac gluten sensitivity. In these cases, people exhibit some of the symptoms of coeliac disease but without having an immune response. In either case, avoiding gluten in foods is the only reliable way to control symptoms, which may include diarrhoea, abdominal pain and bloating.

Without any evidence for beneficial effects, many people without coeliac disease or gluten sensitivity are now turning to gluten-free diets as a “healthy” alternative to a normal diet. Supermarkets have reacted to meet this need by stocking ever growing “free from” ranges. The findings of this recent study, however, suggest that there could be a significant drawback to adopting a gluten-free diet that was not previously known.

Inverse association

What the Harvard group behind this study have reported is that there is an inverse association between gluten intake and type 2 diabetes risk. This means that the less gluten found in a diet the higher the risk of developing type 2 diabetes.

The data for this exciting finding comes from three separate, large studies which collectively included almost 200,000 people. Of those 200,000 people, 15,947 cases of type 2 diabetes were confirmed during the follow-up period. Analysis showed that those who had the highest intake of gluten had an 80% lower chance of developing type 2 diabetes compared to those who had the lowest levels of gluten intake.
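A figure like “80% lower” is a relative risk: the incidence of type 2 diabetes in the highest-gluten group divided by the incidence in the lowest-gluten group. A minimal sketch of that arithmetic, using invented counts rather than the study’s actual data:

```python
# Toy illustration of how a relative risk like "80% lower" is derived.
# All numbers here are invented for illustration; they are NOT the study's data.

def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk ratio: incidence in the exposed group over incidence in the unexposed group."""
    risk_exposed = cases_exposed / n_exposed
    risk_unexposed = cases_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical follow-up counts: highest- vs lowest-gluten-intake groups.
rr = relative_risk(cases_exposed=200, n_exposed=40_000,      # high gluten intake
                   cases_unexposed=1_000, n_unexposed=40_000)  # low gluten intake
print(f"Relative risk: {rr:.2f}  ->  {(1 - rr):.0%} lower risk")
# Relative risk: 0.20  ->  80% lower risk
```

With these made-up numbers, the high-intake group’s risk (0.5%) is one-fifth of the low-intake group’s (2.5%), i.e. a relative risk of 0.2, or an 80% lower risk.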

This study has important implications for those who either have to avoid or choose to avoid gluten in their diet. Type 2 diabetes is a serious condition that affects more than 400 million people worldwide – a number that is certain to increase for many years to come.

Diabetes is collectively responsible for around 10% of the entire NHS budget, and drugs to treat diabetes alone cost almost £1 billion annually. There is no cure for type 2 diabetes and remission is extremely rare. This means that once someone is diagnosed with type 2 diabetes, it is almost impossible for them to revert to being healthy.

It is important to note that the data for this study was gathered retrospectively. This allows for very large numbers to be included, but relies on food-frequency questionnaires collected every two to four years and on the honesty of those recruited to the study. This type of study design is rarely as good as a randomised controlled trial, in which groups of people are randomly assigned to either low- or high-gluten diets and followed over many years. However, such trials are expensive to run and it’s difficult to find enough people willing to take part in them.

While there is some evidence for a link between coeliac disease and type 1 diabetes, this is the first study to show a link between gluten consumption and the risk of type 2 diabetes. This is an important finding. For those who choose a gluten-free diet because they believe it to be healthy, it may be time to reconsider your food choices.

James Brown, Lecturer in Biology and Biomedical Science, Aston University

This article was originally published on The Conversation. Read the original article.

 


 

Do smaller plates make you eat less?

Depending on how you spend your Monday evenings, you may have caught Channel 4’s Food Unwrapped on TV. The programme covered two topics of interest to me: portion sizes and plate sizes.

There is evidence that portion sizes of commercially provided foods have increased over time and the programme covered this story. One of the main reasons this is of relevance to public health is because there is also now compelling evidence that the amount of food you are served or provided with reliably affects how much you eat – and that larger portions appear to cause most people to eat more. Our modern day “obesity epidemic” is thought to have been caused primarily by an increase in how much we are eating. So this is important stuff.

The other topic covered by Food Unwrapped, however, is a pet hate of mine: plate size. There is a commonly held belief that using smaller plates reduces the amount of food that people eat. It sounds plausible; when you use a smaller plate, you serve yourself less and because of this you end up eating less. Right?

Wrong.


I became interested in the magic of smaller plates after reading an article that discussed some of the research on smaller plates but neglected to mention a number of studies that had found that smaller plates did not reduce how much people ate. Not long after that a team of us reviewed and analysed all available studies that addressed this question.

Our conclusion was that the evidence for the magic of smaller plates was very unconvincing. More studies found no effect of smaller plates on calorie consumption than supported the “smaller plates equals eating less” hypothesis. Also, the studies that did support the smaller-plate idea all came from the same research group, and we noted a number of important limitations in some of those studies’ methodologies. It just so happens that it was the same research group that has recently come under fire for questionable research practices.

We next conducted our own study to examine if giving participants smaller bowls to serve themselves with popcorn reduced the amount of popcorn that they ate. We did not find that using a smaller bowl reduced how much participants ate – if anything participants ate more when using a smaller bowl, as opposed to a larger bowl. Likewise, a further study in 2016 from another research group found no evidence that smaller plates promoted reduced food consumption.

Now back to Food Unwrapped. The programme tried a similar experiment to the one that we did and what did they find? Again, like us they found no evidence to suggest that giving people smaller plates reduced how much they ate – instead they appeared to find the opposite – participants ate about twice as much when dining with smaller as opposed to larger plates.

Why might smaller plates not reduce how much people eat? One good guess is because if you are using a smaller plate you may initially serve yourself a little less but then go back for second helpings – you do have a small plate after all.

Rather worryingly though, at the end of the episode we were reassured that there is still clear evidence that smaller plates do make people eat less and Food Unwrapped’s experiment must have been a fluke.

The idea that simply giving people smaller plates to eat from will magically reduce how much they eat is an idea that may never die (indeed the Food Unwrapped programme was a repeat of an episode first shown in 2016). But it should do. This is because we need to make sure that we are taking aim at the types of environmental factors that can reliably help people eat more healthily.

So what should we be sizing up? There is now accumulating evidence that if the food industry made substantial reductions to the number of calories in popular food and drink products then we would be eating less as a nation. Making this kind of change happen will of course be more difficult than simply telling the general public to eat from miniature plates, but if we are to tackle obesity effectively then it is a change that must happen.

Eric Robinson, Senior Lecturer, University of Liverpool

This article was originally published on The Conversation. Read the original article.

 


Soy foods may benefit breast cancer patients

Oestrogen-like compounds found mainly in soy foods may decrease mortality rates in women with some breast cancers, according to new research.

Researchers at the Friedman School of Nutrition Science and Policy at Tufts University, led by Dr Fang Fang Zhang, analysed data from more than 6,000 American and Canadian women with breast cancer.

They found that post-diagnosis consumption of foods containing the compounds called isoflavones was associated with a 21 percent decrease in all-cause mortality.

This decrease was seen only in women with hormone-receptor-negative tumors, and in women who were not treated with endocrine therapy such as tamoxifen.

“At the population level, we see an association between isoflavone consumption and reduced risk of death in certain groups of women with breast cancer. Our results suggest, in specific circumstances, there may be a potential benefit to eating more soy foods as part of an overall healthy diet and lifestyle,” said Zhang in a statement.

As News.com.au and AAP report, there have previously been fears that, because of its oestrogen-like properties, the consumption of soy may reduce the effectiveness of breast cancer treatment.

Kathy Chapman, chair of the nutrition and physical activity committee at Cancer Council Australia, said women should still be cautious about soy supplements.

“Soy foods are usually good for people to be consuming but the advice is not to take this study as meaning it’s OK for breast cancer survivors to take the large doses you would get in a soy supplement,” she said.

Fact or fiction – is sugar addictive?

Some of us can definitely say we have a sweet tooth. Whether it’s cakes, chocolates, cookies, lollies or soft drinks, our world is filled with intensely pleasurable sweet treats. Sometimes eating these foods is just too hard to resist.

As a nation, Australians consume, on average, 60 grams (14 teaspoons) of table sugar (sucrose) a day. Excessive consumption of sugar is a major contributor to the increasing rates of obesity in both Australia and globally.

Eating sugary foods can become ingrained into our lifestyles and routines. That spoonful of sugar makes your coffee taste better and dessert can feel like the best part of dinner. If you’ve ever tried to cut back on sugar, you may have realised how incredibly difficult it is. For some people it may seem downright impossible. This leads to the question: can you be addicted to sugar?

Sugar activates the brain’s reward system

Sweet foods are highly desirable due to the powerful impact sugar has on the brain’s reward system, called the mesolimbic dopamine system. The neurotransmitter dopamine is released by neurons in this system in response to a rewarding event.

Drugs such as cocaine, amphetamines and nicotine hijack this brain system. Activation of this system leads to intense feelings of reward that can result in cravings and addiction. So drugs and sugar both activate the same reward system in the brain, causing the release of dopamine.

This chemical circuit is activated by natural rewards and behaviours that are essential to continuing the species, such as eating tasty, high energy foods, having sex and interacting socially. Activating this system makes you want to carry out the behaviour again, as it feels good.


The Diagnostic and Statistical Manual of Mental Disorders (DSM-5) criteria for substance use disorders cite a variety of problems that arise when addicted to a substance. These include craving, continuing use despite negative consequences, trying to quit but not managing to, tolerance and withdrawal. Although sugary foods are easily available, excessive consumption can lead to a number of problems similar to those of addiction. So it appears sugar may have addictive qualities. There is currently no concrete evidence linking sugar with an addiction/withdrawal system in humans, but studies using rats suggest the possibility.

Sweet attractions

Dopamine has an important role in the brain, directing our attention towards things in the environment like tasty foods that are linked to feelings of reward. The dopamine system becomes activated at the anticipation of feelings of pleasure.

This means our attention can be drawn to cakes and chocolates when we’re not necessarily hungry, evoking cravings. Our routines can even cause sugar cravings. We can subconsciously want a bar of chocolate or a fizzy drink in the afternoon if this is a normal part of our daily habits.

Sugar tolerance

Repeated activation of the dopamine reward system, for example by eating lots of sugary foods, causes the brain to adapt to the frequent reward system stimulation. When we enjoy lots of these foods on a regular basis, the system starts to change to prevent it becoming overstimulated. In particular, dopamine receptors start to down-regulate.

Now there are fewer receptors for the dopamine to bind to, so the next time we eat these foods, their effect is blunted. More sugar is needed the next time we eat in order to get the same feeling of reward. This is similar to tolerance in drug addicts, and leads to escalating consumption. The negative consequences of unrestrained consumption of sugary foods include weight gain, dental cavities and developing metabolic disorders including type-2 diabetes.

Quitting sugar leads to withdrawal

Sugar can exert a powerful influence over behaviour, making cutting it out of our diets very difficult. And quitting eating a high sugar diet “cold turkey” leads to withdrawal effects.


The length of unpleasant withdrawal symptoms following a sugar “detox” varies. Some people quickly adjust to functioning without sugar, while others may experience severe cravings and find it very difficult to resist sugary foods.

The withdrawal symptoms are thought to reflect both individual sensitivity to sugar and the dopamine system readjusting to a sugar-free existence. The temporary drop in dopamine levels is thought to cause many of the psychological symptoms, including cravings, particularly as our environment is filled with sweet temptations that you now have to resist.

Why quit sugar?

Cutting sugar from your diet may not be easy, as so many processed or convenience foods have added sugars hidden in their ingredients. Switching from sugar to a sweetener (Stevia, aspartame, sucralose) can cut down on calories, but it is still feeding the sweet addiction. Similarly, sugar “replacements” like agave, rice syrup, honey and fructose are just sugar in disguise, and activate the brain’s reward system just as readily as sucrose.

Physically, quitting sugar in your diet can help with weight loss, may reduce acne, improve sleep and moods, and could stop those 3pm slumps at work and school. And if you do reduce sugar consumption, sugary foods that were previously eaten to excess can taste overpoweringly sweet due to a recalibration of your sweetness sensation, enough to discourage over-consumption!

Amy Reichelt, Lecturer, ARC DECRA, RMIT University

This article was originally published on The Conversation. Read the original article.

Energy drinks could be fatal for some – research

A world-first study has found that having just one to two energy drinks could be life-threatening for some young people with no known history of heart disease.

The study, published in the International Journal of Cardiology, found that people born with a genetic cardiac rhythm disorder called Long QT Syndrome are at higher risk of dangerous heart rhythms, or even death, after consuming energy drinks.

About one in 2000 people has Long QT Syndrome but many are unaware until they undergo an ECG or a relative dies suddenly at a young age. For some patients with Long QT Syndrome, the first symptom is sudden cardiac death.

The study was conducted at Royal Prince Alfred Hospital in Sydney over two years and involved 24 people with Long QT Syndrome aged between 16 and 50.

Patients were given energy drinks or control drinks over a 90-minute period, while undergoing continuous monitoring as well as regular ECGs and blood samples.

“We found patients had a significant increase in their blood pressure of more than 10 per cent after the energy drinks, which was not seen in the control group,” Royal Prince Alfred Hospital cardiologist and Centenary Institute researcher Dr Belinda Gray said.

“Additionally, while none of the patients in the study experienced dangerous arrhythmias, we did identify dangerous ECG changes in some patients; 12.5 per cent of patients showed a marked QT prolongation of 50 milliseconds or more.

“For ethical reasons, we could only give patients in this study low doses of energy drinks but, the reality is, many young people will consume four or more energy drinks with alcohol in one evening. These drinks are widely available to all young people.”

Professor of Medicine at the University of Sydney and RPA cardiologist Chris Semsarian said it was still unknown whether a specific ingredient in energy drinks was responsible for ECG changes, or a combination of ingredients.

“But, because many young people do not even know they have familial Long QT Syndrome, we have to caution against anyone consuming these drinks,” he said.

While small, the study was robust as the patients acted as their own controls with each consuming both energy and control drinks on separate occasions with at least one week washout, Professor Semsarian said.

Personalised nutrition emerging as next big trend

Personalised nutrition is rapidly emerging as a key issue for the long term future of the industry, new research has shown. The findings come from a survey by the organisers of the Vitafoods Europe event.

They asked Vitafoods Europe visitors what they saw as the three most important trends in the nutrition industry. For the short term (over the next 12 months) personalised nutrition was picked by one in five respondents (19 per cent). However, when they were asked to think about the long term (the next three years) over a third (35 per cent) identified it as an important trend.

The figures reflect the emergence of new possibilities such as individualised dietary guidelines, wearable technology, and personalised nutrition based on genetic testing. Accordingly, another hot topic for the future was nutrigenomics, which was seen as an important short-term trend for 8 per cent of respondents, but an important long-term trend for twice as many (17 per cent).

The survey also demonstrates the continuing importance of high quality and evidence-based claims in order to meet regulatory requirements and consumer demand. The issue most likely to be seen as important – both now and in the future – was scientifically supported health claims, which was identified as a key trend in the short term by 47 per cent of Vitafoods visitors, and in the long term by 50 per cent.

The health needs created by demographic trends such as population ageing and obesity continue to shape the agenda for much of the industry. Respondents were asked which three health benefit areas were most important to their companies. Healthy ageing, picked by one in four (23 per cent), ranked top, followed by bone and joint health (22 per cent), cardiovascular health (21 per cent), general wellbeing (21 per cent), and weight management (20 per cent).

 

Do vegetarians live longer? Probably, but not because they’re vegetarian

In the past few years, you may have noticed more and more people around you turning away from meat. At dinner parties or family barbecues, on your social media feed or in the news, vegetarianism and its more austere cousin, veganism, are becoming increasingly popular.

While the veggie patty and the superfood salad are not going to totally replace lamb, chicken or beef as Aussie staples any time soon, the number of Australians identifying as a vegetarian is rising steadily.

According to Roy Morgan Research, almost 2.1 million Australian adults now say their diet is all or almost all vegetarian. Ask someone why they are a vegetarian and you are likely to get many different answers. The reasons include environmental, animal welfare and ethical concerns, religious beliefs and, of course, health considerations.

It’s this last factor we set out to investigate. There are several existing studies on the impact of vegetarianism on health, but the results are mixed. A 2013 study, which followed more than 95,000 men and women in the United States from 2002 to 2009, found vegetarians had a 12% lower risk of death from all causes than non-vegetarians.

Given the contentious nature of discussions about vegetarianism and meat eating, these findings generated lots of coverage and vegetarianism advocates hailed the study.

We set out to test these findings, to see if being a vegetarian would translate into lower risk of early death in the Australian population. Australia is home to the largest ongoing study of healthy ageing in the southern hemisphere, the Sax Institute’s 45 and Up Study. This gives us a pool of more than 260,000 men and women aged 45 and over in New South Wales to work with.

We followed a total of 267,180 men and women over an average of six years. During the follow-up period, 16,836 participants died. When we compared the risk of early death for vegetarians and non-vegetarians, while controlling for a range of other factors, we did not find any statistical difference.

Put more simply, when we crunched the data we found vegetarians did not have a lower risk of early death compared with their meat-eating counterparts.


This lack of “survival advantage” among vegetarians, outlined in our paper in Preventive Medicine, does not come as a complete surprise. In 2015, a United Kingdom-based cohort study concluded vegetarians had a similar risk of death from all causes when compared with non-vegetarians. This is contrary to the US-based study findings.

Does that mean everyone should drop the asparagus, fire up the barbie and fill up on snags, steaks and cheeseburgers? Not necessarily.

Other ‘healthy’ factors

It’s standard practice in epidemiological studies to statistically control for various factors (we call them “confounders” as they may confound an association). We controlled for a number of factors to get a true sense of whether vegetarianism by itself reduces risk of death.

It’s important to acknowledge that in most studies vegetarians tend to be the “health-conscious” people, with overall healthier lifestyle patterns than the norm. For example, among the Sax Institute’s 45 and Up participants, vegetarians were less likely than non-vegetarians to report smoking, drinking excessively, insufficient physical activity and being overweight/obese. They were also less likely to report having heart or metabolic disease or cancer at the start of the study.

In most previous studies, vegetarians did have lower risk of early death from all causes in unadjusted analysis. However, after controlling for other lifestyle factors, such as the ones listed above, the risk reduction often decreased significantly (or even completely vanished).

This suggests other characteristics beyond abstinence from meat may contribute to better health among vegetarians. More simply, it’s the associated healthier behaviours that generally come with being a vegetarian – such as not smoking, maintaining a healthy weight, exercising regularly – that explain why vegetarians tend to have better health outcomes than non-vegetarians.
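The effect of a confounder can be sketched with invented numbers. Below, death rates depend only on smoking, and vegetarians simply smoke less, yet a crude (unadjusted) comparison makes the vegetarian diet look protective. None of these figures come from the 45 and Up Study; they are a toy example only.

```python
# Toy demonstration of confounding: all counts are invented for illustration.
# Death rates here depend only on smoking; vegetarians just smoke less.

# (deaths, total) for each diet group, split by the confounder (smoking status)
data = {
    "vegetarian":     {"smoker": (10, 100), "non-smoker": (18, 900)},
    "non-vegetarian": {"smoker": (60, 600), "non-smoker": (8, 400)},
}

def rate(deaths, total):
    return deaths / total

# Crude comparison: ignores smoking, so vegetarians appear protected.
for diet, strata in data.items():
    deaths = sum(d for d, _ in strata.values())
    total = sum(t for _, t in strata.values())
    print(f"{diet}: crude death rate {rate(deaths, total):.1%}")

# Stratified comparison: within each smoking stratum, the rates are identical,
# so the crude difference is driven entirely by smoking, not by diet.
for stratum in ("smoker", "non-smoker"):
    rates = {diet: rate(*strata[stratum]) for diet, strata in data.items()}
    print(stratum, rates)
```

Stratifying (or statistically adjusting) by smoking reveals identical rates within each stratum, so the apparent diet effect vanishes – the same pattern as when the risk reduction in those earlier studies shrank or disappeared after adjustment.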

In a separate study we conducted using data from the 45 and Up Study, we found people who ate more fruit and vegetables, particularly those who had seven or more serves per day, had a lower risk of death than those who consumed less, even when other factors were accounted for.

And although there is unclear evidence a vegetarian diet promotes longevity, studies have consistently shown other health benefits. For example, a vegetarian diet has been consistently associated with a reduced risk of high blood pressure, type 2 diabetes and obesity.

A meta-analysis (a statistical analysis that combines data from multiple studies) from 2012 concluded vegetarians had a 29% lower risk of early death from heart disease and an 18% lower risk for cancer.

It’s important to keep in mind that the International Agency for Research on Cancer, the cancer agency of the World Health Organisation, has classified the consumption of processed meat as carcinogenic and red meat as probably carcinogenic to humans.

So what does it all mean?

While we can’t say for certain if being a vegetarian helps you live longer, we do know having a well-planned, balanced diet with sufficient fruit and vegetables is certainly good for you.

We also know sufficient physical activity, moderating alcohol consumption and avoiding tobacco smoking are key factors in living longer. And the growing body of evidence shows vegetarians are more likely to have these healthy habits.

Melody Ding, Senior Research Fellow of Public Health, University of Sydney

This article was originally published on The Conversation. Read the original article.

Doctors call for end to alcohol sponsorship of cricket

The Royal Australasian College of Physicians (RACP) is calling for an end to alcohol sponsorship in cricket; there are currently more than 20 alcohol-related sponsorships in cricket across Australia.

The RACP is concerned about the impact alcohol promotion has on young cricket fans – a sentiment backed by the majority of Australians, with 61 per cent concerned about the exposure of children to alcohol promotions in sport.

RACP Paediatrics & Child Health Division President, Dr Sarah Dalton, says it’s unacceptable that young children are being bombarded with alcohol promotion both at the ground and at home watching on TV.

“It is time for a national conversation to discuss how big brewers are using sport as a channel to market their product, leaving our children as the collateral damage,” explained Dr Dalton. “It is happening in too many Australian sports and it needs to stop.

“These promotions normalise alcohol, with Australian kids getting the message that alcohol is an important part of socialising and sports,” said Dr Dalton.

“During one of the VB ODI games, I urge you to keep a tally of how many times you spot an alcohol ad or logo, either at the ground, on a player’s shirt, or in an advert on TV – I’m sure the number would surprise and shock you.

“Sadly, we know this type of marketing leads children and adolescents to start drinking earlier and makes young drinkers prone to binge drinking patterns.”

Dr Dalton also criticised the Australian Communications and Media Authority (ACMA), which she says needs to do more to ensure children are protected during sports broadcasts.

“Sports are the only programs allowed to broadcast alcohol advertisements before 8:30pm, on weekends and public holidays, at times when children are most likely to be watching television. Because of this it’s estimated that children under the age of 18 are exposed to 50 million alcohol advertisements each year.

“As a paediatrician, I am interested in finding out why this is allowed to happen. The ACMA needs to step up, remove this loophole, and help protect Australian children from alcohol promotion.”

Dr Dalton encouraged Cricket Australia and the ACMA to review the RACP’s Alcohol Policy, which calls for national, comprehensive, evidence-based strategies to combat the harms of alcohol.


 

Bribing kids to eat vegetables is not sustainable – here’s what to do instead

How can you get a fussy child to eat vegetables? It’s a question that plagues many frustrated parents at countless mealtimes. Some take to hiding morsels in more delicious parts of meals, while others adopt a stricter approach, refusing to let little ones leave the table until plates are clear.

One “alternative” idea touted recently is for parents to essentially bribe their children, depositing money into a child’s bank account as a reward when they eat vegetables – an idea actually backed up by research.

A US study in 2016 showed that the technique continued to encourage primary school age children to eat their greens for up to two months after these incentives were stopped. Children who were incentivised for a longer period of time were more likely to continue eating vegetables after the deposits ended too.

The core idea here is that, providing children have the cognitive ability to understand the exchange, they will learn to eat healthily as well as learn the value of money. After a while, they will continue eating the food, not because of the reward, but because they will get into the habit of eating healthy.

But one study is really not enough to draw conclusions and suggest action – especially as there was no control group comparing money with other types of incentives, or with no incentive at all.

And monetary incentives can actually decrease our motivation to perform the activity we are paid for, and eventually we lose interest. So, even if bribing kids with cash to eat their greens works at first, it is not sustainable in the long term.

Non-monetary rewards aren’t much better either. The phrase “You can have dessert as long as you eat your sprouts” will ring a bell for most people. This, though said with the best intentions, may increase the intake of the target food in the short term, but can convey the wrong message to its recipients: “This food must be really bad if I am getting something for eating it!” It not only places dessert as a food of high value – a trophy that is earned – but also teaches kids to dislike the target food.


Better methods

So what can you do instead? First and foremost, start early. The formation of food preferences starts in the womb, and the first months of life are crucial in developing eating habits. The older children get, the more exposures they need to a novel vegetable in order to consume it. Which brings us neatly to the next point.

Vegetables must be offered frequently, without pressure – and you mustn’t get discouraged by the inevitable “no”. Even if you have missed the first window of opportunity, all is not lost. Parents can lose hope after offering the same vegetables between three and five times, but, in reality, toddlers in particular might need up to 15 exposures.

You also need to let your children experience the food with all of their senses – so don’t “hide” vegetables. Yes, sneaking a nutritious veggie into a fussy eater’s food might be one way to get them to eat it, but if the child doesn’t know a cake has courgettes in it, they will never eat courgettes on their own. It can also backfire: children can lose their trust in food when they realise they have been deceived.

Likewise, don’t draw unnecessary attention to specific foods that you think your child is not going to like. Sometimes our own dislikes get in the way, creating the expectation that our child won’t like the food either. Our food preferences are formed through previous experiences, which children don’t have. Praising and bribing are commonly used, especially when we don’t expect children to like the food offered, but they can be counterproductive. Instead, serve food in a positive environment but keep your reactions neutral.

This isn’t just about what is on the plate, it’s about a relationship with food. So if your children are old enough, let them help in the kitchen. It can be very messy and time consuming, but it is an excellent way to create a positive atmosphere around food.

It is also important to have frequent family meals and consume vegetables yourself. It’s been shown that children who eat with family do eat more vegetables. Kids often copy adult behaviours, so set a good example by routinely serving and consuming vegetables.

There is sadly no single answer as to what will work for your children, and it might be a case of trial and error. But these actions can create positive associations with all kinds of foods, and you can help your kids lead healthier lives – saving yourself a bit of cash while you’re at it.

The Conversation

Sophia Komninou, Lecturer in Infant and Child Public Health, Swansea University

This article was originally published on The Conversation. Read the original article.

Breast milk banking continues an ancient human tradition and can save lives

Around 2000 BC, breastfeeding was considered a religious obligation.

Now we understand why breast milk is the ideal food for babies, with evidence showing it provides substantial benefits to health even beyond the period of breastfeeding.

The Australian Dietary Guidelines recommend exclusive breastfeeding for around six months, followed by introduction of solid foods and continued breastfeeding. In Australia, over 90% of infants start breastfeeding, and 39% are exclusively breastfed at four months of age.

Although the majority of mothers can breastfeed, some are unable or choose not to. Infant formula is a readily available option. But as the nutritional and developmental value of breast milk becomes better known, more people are trying to source breast milk from another mother.

Breast milk banking provides a safe source of human breast milk in some states in Australia, but greater accessibility is highly desirable.

What’s so good about human breast milk?

The nutritional value of human breast milk is uniquely matched to the needs of human babies. Nutrients such as iron and zinc are provided in a form that is easily absorbed by a baby’s immature digestive system and highly bioavailable; that is, they are in a form that is usable by the body. This ensures nothing goes to waste, and the demands on the mother’s body are minimal.

Specific kinds of fats known as long-chain fatty acids in breast milk are absorbed by the infant, and incorporated into brain and eye tissues.

Breast milk also contains many other valuable bioactive components such as oligosaccharides, immunoglobulins and a molecule called epidermal growth factor.

Oligosaccharides are carbohydrates that are resistant to digestion, remaining relatively intact as they transit through the gut. An example is “bifidus factor”, which acts as a prebiotic (food for bacteria) to promote the growth of the healthy gut bacterium Lactobacillus bifidus. Other oligosaccharides stop disease-causing bacteria from attaching to the surface of the gut and urinary tract.

Immunoglobulins are antibodies that help provide immunity, and coat the lining of the gut to prevent attachment of disease-causing bacteria.

Epidermal growth factor stimulates growth and maturity of the infant gut.

What methods have women used to share milk over history?

Wet nursing is a human tradition that has existed for at least 4,000 years. It was the only alternative way of feeding babies before the introduction of bottles and formula. Anecdotally, it is still practised in Australia, with sisters or friends with similar-aged babies sharing breastfeeding.

A wet nurse breast feeding the Duke of Burgundy, grandson of Louis XIV.
Wellcome Images via Wikimedia Commons, CC BY

An increasing trend is informal milk sharing, where mothers seeking breast milk post a request on a dedicated social media page. Examples include Human Milk 4 Human Babies (HM4HB, which has about 4000 Australian members) and Eats on Feets.

Health professionals warn of the small, but real, potential risks of transmission of diseases such as HIV and hepatitis C through unscreened breast milk. Some donors provide lifestyle information and antenatal blood screening results to recipients, but these are usually private arrangements between individuals.

Women need to be fully informed about the potential risks of using unscreened milk, and balance these with the decision to use formula. Risks associated with formula include a higher incidence of ear infection, gastroenteritis and respiratory infections in infants, and an increased incidence of diabetes, obesity, leukemia, allergy and asthma in later life.

In circumstances where people pay for breast milk, extra caution must be exercised, as there have been reports of dilution of breast milk with water or cow’s milk.

What is a human milk bank, and where are they in Australia?

A human milk bank collects, stores, processes and dispenses donated human milk.

Milk banking was quite common until the 1980s, when the AIDS epidemic sparked concerns about viral transmission through milk. There is now a re-emergence of milk banking in Australia, with five milk banks currently operating.

Who donates breast milk to banks, and how is it processed?

Donors of human milk are often mothers of a premature infant, women who have milk surplus to their needs, or mothers in the community.

Donors undergo rigorous screening, similar to blood donors. Their blood is collected and tested for diseases that could be transmitted through the milk, such as HIV, hepatitis B and C and syphilis. Lifestyle questions related to drug and alcohol use are also used to identify risk of diseases.

Milk is expressed in the donor’s home, or in the neonatal unit (in the case of mothers of a premature infant), under hygienic conditions. It is then frozen for transport to the milk bank. There it is thawed, tested for bacterial count and pasteurised, usually using the Holder method, in which the milk is heated to 62.5°C for 30 minutes and then rapidly cooled. The milk is then re-tested for bacterial count, and frozen for dispensing. The combination of freezing and pasteurisation kills harmful viruses and bacteria.

Sterile collection of human milk with a breast pump.
from www.shutterstock.com

Who can access banked milk in Australia?

Of the five milk banks currently operating in Australia, only the Mothers Milk Bank supplies milk to babies in the community. All other milk banks supply milk exclusively to premature and sick hospitalised infants.

Australian milk banks do not pay their donors, nor can the milk be bought. The cost of milk banks associated with neonatal units is absorbed within the health system. Community milk banks may ask for a donation.

The most recent statistics report there were 7,887 babies who required care in neonatal intensive care units in 2013. Representing 2.6% of all live births, these highly vulnerable infants stand to benefit the most from access to safe human breast milk.

Breast milk protects high-risk infants against life-threatening conditions such as neonatal sepsis (a dangerous, multi-system infection) and necrotising enterocolitis (a severe disease of the intestine). It has been estimated that using donor human milk would save the Australian health care system $13 million per year by reducing the number of necrotising enterocolitis cases alone.

What happens in states without milk banks?

In Australia, 18 out of 24 neonatal intensive care units do not currently have access to pasteurised donor human milk. As a result, approximately three out of four babies with a high risk of developing necrotising enterocolitis or other neonatal complications do not have access to pasteurised donor human milk.

Although there are significant cost savings linked with preventing necrotising enterocolitis in premature infants on a national basis, the expense of establishing and running a milk bank is often prohibitive for small neonatal intensive care units with relatively low numbers of high risk premature babies.

Recently the Australian Red Cross Blood Service has been considering establishing a national human milk bank, in line with its blood bank service. This would be a welcome adjunct to neonatal units and an important health initiative to improve health outcomes for these vulnerable infants. While this is still in the planning phase, we look forward to hearing more about this possibility soon.

The Conversation

Jacqueline Miller, Senior Lecturer and SAHMRI Fellow, Flinders University and Carmel Collins, Senior Research Fellow and Head of Neonatal Nutrition Research Unit SAHMRI, South Australian Health & Medical Research Institute

This article was originally published on The Conversation. Read the original article.

Eating standing up – is it really bad for you?

These days, many of us are flooded with advice on what to eat, when to eat and how much to eat. Alongside this calorie and nutrient-based advice you may even have heard that you should avoid eating while standing up or lying down, as was common in Ancient Greece or Rome. It may seem to make sense, but how much scientific evidence is there to back this advice?

If we consider these three eating positions: lying down, sitting and standing, what challenges do they present the body with and which should we choose as our standard eating position?

The first of these positions, eating lying down, was fashionable among the ancients. This may not have been solely laziness or a show of wealth and power – as some researchers have suggested, lying on your left side reduces the pressure on the antrum, the lower portion of the stomach, thus relieving discomfort during a feast. As few of us truly feast nowadays – at least in the Roman sense – this might not be so important.

There is some evidence that we absorb carbohydrates at a slower rate when eating lying down compared to sitting, likely because the stomach empties more slowly in this position. Slower absorption of carbohydrates is generally considered to be healthy as it avoids large spikes in insulin.

Eating lying down may, however, increase the risk of developing gastroesophageal reflux disease (GORD), a condition where the stomach’s contents flow back up into the oesophagus through the cardiac (or lower oesophageal) sphincter, a ring of muscle that controls the passage of food from the oesophagus into the stomach. The condition is increasingly prevalent worldwide, and can cause significant discomfort, often being mistaken for a heart attack.

Although there is almost no published research specifically investigating the effect of eating lying down on the symptoms of GORD, the American College of Gastroenterology advises avoiding lying down for two hours after eating, which would suggest that eating lying down itself is probably unwise. As GORD slightly increases the risk of developing more serious conditions including Barrett’s oesophagus and oesophageal cancer, this is probably bad news for those of us who want to adopt the Roman banqueting lifestyle.

Banquet scene from the Casa dei Casti Amanti, Pompeii.
WolfgangRieger/Wikimedia Commons

Sitting v standing – the pros and cons

Whether we sit down or stand up for a range of activities throughout the day is a topical issue. Sitting down, which alongside lying down makes up our sedentary behaviour, is increasingly linked to poor health, although there is some contention around this. But when it comes to eating our meals, it seems for once sitting down might be the preferable choice. People might be more likely to take their time over a meal if seated, although this has not been seriously studied. Eating more slowly is considered to be healthy as it more rapidly increases fullness and decreases appetite, leading to a potential reduction in calorie intake.

As for eating while standing up, there is no real evidence that it has any negative effects on digestion and it isn’t included on any lists of prohibited activity by healthcare professionals. Although gravity isn’t needed for most of the function of the gut, it does help with preventing GORD, which is why many sufferers raise the head of their bed at night.

Standing while eating does have the benefit of promoting greater energy expenditure, with estimates of around 50 extra calories an hour burned by standing compared with sitting down. Over an extended period this would add up.
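As a rough back-of-the-envelope sketch of how this "adds up" (assuming the ~50 extra calories an hour above, one hour of eating per day, and the commonly used approximation of about 7,700 kcal per kilogram of body fat – the latter two are assumptions for illustration, not figures from the research):

```python
# Back-of-the-envelope: yearly energy difference from standing rather
# than sitting for one hour of meals a day. All inputs are rough
# approximations, not study results.

EXTRA_KCAL_PER_HOUR = 50   # extra energy burned standing vs sitting (from the text)
MEAL_HOURS_PER_DAY = 1     # assumed total time spent eating each day
DAYS_PER_YEAR = 365
KCAL_PER_KG_FAT = 7700     # common approximation for body fat energy density

yearly_extra_kcal = EXTRA_KCAL_PER_HOUR * MEAL_HOURS_PER_DAY * DAYS_PER_YEAR
fat_equivalent_kg = yearly_extra_kcal / KCAL_PER_KG_FAT

print(yearly_extra_kcal)             # 18250 kcal per year
print(round(fat_equivalent_kg, 1))   # roughly 2.4 kg of body fat
```

On these assumptions the habit is worth roughly 18,000 kcal a year, though in practice differences in fidgeting, posture and diet would easily swamp such a small effect.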

So is it better to eat sitting, standing or lying down? While there is not enough scientific evidence to confidently state that eating in either position is more appropriate, it’s likely that as long as you take your time and eat mindfully, either standing or sitting to eat your meals should be absolutely fine and a healthier alternative to eating lying down.

The Conversation

James Brown, Lecturer in Biology and Biomedical Science, Aston University

This article was originally published on The Conversation. Read the original article.

Top image: Dmytro Zinkevych/Shutterstock.com

 

Spicy food may lead to a longer life – study

Like spicy food? If so, you might live longer, say researchers at the Larner College of Medicine at the University of Vermont, who found that consumption of hot red chili peppers is associated with a 13 percent reduction in total mortality — primarily in deaths due to heart disease or stroke — in a large prospective study.
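To make the headline figure concrete: a 13% relative reduction in total mortality means chili consumers died at about 0.87 times the rate of non-consumers over the study period. A minimal sketch, using a hypothetical baseline rate (not a number from the study):

```python
# Illustrating a 13% relative reduction in mortality.
# The baseline rate below is hypothetical, chosen only for illustration.

relative_reduction = 0.13           # 13% reduction reported in the study
deaths_per_1000_nonconsumers = 100  # hypothetical baseline rate

deaths_per_1000_consumers = deaths_per_1000_nonconsumers * (1 - relative_reduction)
print(round(deaths_per_1000_consumers))  # 87 deaths per 1,000 among consumers
```

Note this is a relative, not absolute, difference: the absolute benefit depends entirely on the underlying death rate of the group in question.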

The study was published recently in PLoS ONE.

For centuries, peppers and spices have been thought to be beneficial in the treatment of diseases, but only one other study — conducted in China and published in 2015 — has previously examined chili pepper consumption and its association with mortality. The new study corroborates the earlier study’s findings.

Using National Health and Nutritional Examination Survey (NHANES) III data collected from more than 16,000 Americans who were followed for up to 23 years, medical student Mustafa Chopan ’17 and Professor of Medicine Benjamin Littenberg, M.D., examined the baseline characteristics of the participants according to hot red chili pepper consumption.

They found that consumers of hot red chili peppers tended to be “younger, male, white, Mexican-American, married, and to smoke cigarettes, drink alcohol, and consume more vegetables and meats . . . had lower HDL-cholesterol, lower income, and less education,” in comparison to participants who did not consume red chili peppers. Over a median follow-up of 18.9 years, they observed the number of deaths and then analyzed specific causes of death.

“Although the mechanism by which peppers could delay mortality is far from certain, Transient Receptor Potential (TRP) channels, which are primary receptors for pungent agents such as capsaicin (the principal component in chili peppers), may in part be responsible for the observed relationship,” say the study authors.

There are some possible explanations for red chili peppers’ health benefits, state Chopan and Littenberg in the study. Among them are the fact that capsaicin is believed to play a role in cellular and molecular mechanisms that prevent obesity and modulate coronary blood flow, and also possesses antimicrobial properties that “may indirectly affect the host by altering the gut microbiota.”

“Because our study adds to the generalizability of previous findings, chili pepper — or even spicy food — consumption may become a dietary recommendation and/or fuel further research in the form of clinical trials,” says Chopan.