Newsletter 114

By David Bender

Most of the “easy” questions in nutrition research were answered in the last century: how many calories do we need to maintain body weight and carry out exercise and work; how much of each vitamin and mineral do we need to prevent deficiency disease; and how much protein do we need (although this last is still controversial)?

Over the last 50 years, the emphasis has been on achieving “optimum nutrition” to promote the best possible state of health, rather than preventing deficiency diseases. Optimum health is a more difficult goal. We cannot really define it; it is certainly more than just the absence of disease. If our aim is to devise diets to promote healthy longevity then we are probably looking at studies lasting 70 or more years to test whether they work or not.

There are in fact a number of such long-term cohort studies, in which individuals are followed for many years. The oldest of these is the 1946 birth cohort study in the UK, in which everyone born in the second week of March 1946 has been, and still is being, followed. The Framingham study in the USA has followed residents of the town of Framingham, Massachusetts, since 1948.

It is difficult to see how the diet in early life in the 1940s and 1950s can give us any information that is relevant to people born in the 21st century. Food was still rationed; the variety of fruits and vegetables available was minute compared with today; yogurt was known only to a few “food cranks”, while nowadays there is a five-metre-long aisle of fermented milk products in every supermarket.

A great deal of information has come from the Nurses’ Health Study in the USA, which has followed the diet and health records of some 85,000 nurses. This is an observational study, so all we can say is that people who eat more of this, or less of that, are more or less at risk of developing whatever disease we are interested in. It won’t tell us whether the food causes the disease, only that there’s an association.

For example, people who eat relatively large amounts of processed and preserved meat products are more likely to develop gastric and colorectal cancer. But these people may also eat less fruit and fewer vegetables, fewer wholegrain cereal products, less oily fish, and so on. They may also have other behaviours that are, or may be, conducive to ill health, such as smoking, taking little exercise and being obese.

At one time there was a list of some 600 factors associated with the development of atherosclerosis and coronary heart disease, one of which was religious observance! The presence of so many variables in normal life confounds our interpretation of the data. It muddies the water and makes it difficult to draw clear conclusions about what it is about a diet that makes it healthy, or not.

News editors love stories about the discovery of a single potentially harmful (or lifesaving or cancer-fighting) ingredient. (So, too, do the bodies that award grants for expensive long-term observational studies or intervention trials.) Unfortunately, “eat sensibly to live longer” or “a little of what you fancy does you good” will never make a news headline and will struggle for research funding.

Observational studies can lead to very misleading results. Here’s an example. The generally accepted guidelines for a healthy diet are that fat should provide no more than 30% of calories, with saturated fat no more than one-third of total fat, and sugar no more than 10% of calories. The SENECA study, published in 1991, was a Europe-wide study of healthy elderly people aged 70–75, living in their own homes. The data from the town of Roskilde in Denmark showed that 5% of the participants received more than 50% of calories from fat, 22% received more than half of their fat as saturated fat, 90% received more than 10% of calories from sugar, and 46% of the men were smokers. The problem here, of course, is that we know nothing at all about what was being eaten by those people who did not reach the age of 70.

In medical research, the randomised controlled trial (RCT) is the gold standard of proof. In nutrition research, RCTs are not feasible if we are interested in long-term health outcomes. No one can be expected to follow a dietary intervention for many years just to see the outcome, and we cannot assign a control group to receive a “placebo” diet. In any case, the pattern of foods available will change over the years.

So the focus of much nutrition research is on relatively short-term interventions, in which we measure the experimental diet’s effect on a biochemical or other biomarker that we believe reflects the likelihood of developing the disease of interest. If we are interested in cancer, we can measure how eating the food in question affects blood markers of damage to DNA; if we are interested in heart disease, we can measure effects on plasma cholesterol or arterial stiffness. These are indirect signs: not proofs, just indications of an effect on the likelihood of developing disease.

Furthermore, because these trials are short-term, we cannot know whether or not the changes we see will persist, or whether the body will simply adapt to the changed diet in the longer term. The problem of how much protein we need to eat is a good example here. Relatively short-term experiments suggest a higher protein requirement than studies that continue for several weeks, because the body adapts its protein metabolism according to changes in intake. Protein turnover slows if we eat less protein, so we can get by on less.

There is one nutritional intervention for which there is excellent evidence of efficacy: folic acid supplements taken before conception are effective in preventing neural tube defects in the baby. Of course, this is a short-term experiment (9–10 months), with a clear outcome at term.

David Bender, Emeritus Professor of Nutritional Biochemistry, University College London