Not long ago I was with my hairdresser, and as the rubber gloves came out and the bleach went on, she asked me in passing what I do for a living.
“Well, I’m a chef,” I said, “But I’m hoping to go back to university full-time this year, and when I’m done I want to be an epidemiologist.”
To my surprise – because mostly people glaze over at this point – my hairdresser’s face lit up. She pointed out some brightly coloured jars on the shelf and told me they’d just switched brand supplier, she was hoping it would go well. She pointed out that my skin was a bit dry in places but on the whole I was doing pretty well for my 40+ years. Yes, you guessed it. She thought epidemiology was the study of skincare, and that I wanted to be a beauty therapist.
To be fair to my hairdresser, it’s a very common misconception. Most people don’t even notice that epidemiologists exist. Which is odd, considering how pervasive their work is throughout our lives.
Epidemiology is the study of health in defined populations of people, and what might affect it one way or the other. It’s the base science of public health, and it looks to see what might harm or help the health of a large number of people in the same way that a doctor wants to know what might harm or help an individual.
To me, epidemiology is a wonderful thing precisely because it covers so many areas of life. It straddles both the social sciences and the very cutting edge of medicine. You have vaccinations as a young child thanks to public health interventions. That’s epidemiology, that is. You’re given milk at primary school because an epidemiologist somewhere figured out it would improve your calcium intake, and more calcium in young children is a good thing. If your school or workplace suffers an outbreak of norovirus, it’s an epidemiologist who has figured out how the damn thing spreads and will tell you to wash your hands and sanitise contact surfaces such as door handles. Epidemiologists figured out that SARS came from one bloke feeling a bit poorly who took a flight to Hong Kong, how he got it, and where it went from there in time to quell it. And I’m sorry to have to tell you, but epidemiologists are also the annoying people who tell you that smoking causes lung cancer, that you should only drink so many units of alcohol a week and that red meat isn’t that great for you. Epidemiologists, eh? Sucking all the fun out of those little pleasures in life.
A perfect example of this kind of headline came out with a fanfare last Thursday, and was widely reported. The news that processed meat seems to cause a significant increase in risk to our health was not a surprise to many in these days of horse meat scandals and intensive farming, but all the same journalists seemed caught in a panic as they urged us to drop the bacon butties and pick up the salad instead.
But I’m not writing this to tell you that bacon is bad. Frankly, I’m in no position to preach. What I want to celebrate here is a tremendous monument to international relations and human achievement on a grand scale…. the study that produced those results.
The study was called the European Prospective Investigation into Cancer and Nutrition, or EPIC for short. And it truly was. There are many kinds of study in epidemiology, and EPIC was what we call a Prospective Cohort Study. A cohort study is an observational study where you pick your population, figure out which of them have a particular characteristic and which don’t, then follow them over time – we’re talking years or even decades, here. At the end of it all, you might find that the people with one characteristic fare better or worse than those without it. Simple.
Cohort studies come in two types – a retrospective (historic) cohort, where the result you’re looking at has already happened and you’re looking at medical records to seek correlations, and a prospective cohort. This is where you choose your people first then follow them up over time to see what happens to them.
Recipe for an EPIC Cohort Study
1) Hello, Is It Meat You’re Looking For? Defining Your Objectives
As with any other form of scientific investigation, your hypothesis needs to be clear before you start. The EPIC study worked on the basis that while meat is high in iron and folate, which is good, it’s also high in cholesterol and saturated fat, which isn’t. The meat proportion of diets across the world has been rising steadily since the end of World War 2, and so has coronary heart disease and some forms of cancer. There was clearly a correlation between the rise in consumption of meat and poor health, but nobody knew whether one caused the other directly. We can’t just assume it does; that’s not terribly scientific.
The EPIC report cited a study that took place in Oxford to see if vegetarians were healthier than meat eaters, but it found that in a sample of both vegetarians and moderate meat eaters who led a healthy lifestyle, there wasn’t a great deal of difference. In other words, vegetarians were healthier than the general population, but that might be due to other factors in their lifestyle – jogging and doing yoga and all that lovely healthy stuff, which meat eaters could do as well. Whether you let meat pass your lips or not didn’t seem to be significant.
But EPIC also cited some large American cohorts where there was a clear risk for those who ate high levels of red meat and processed meat, compared to people who still ate meat but not much of it, or mainly poultry. So is it the amount and type of meat that’s making the difference? Or would the lovely people of Europe prove far less prone to this stuff than the USA, due to differences in their lifestyle?
2) Choose your Population
The EPIC study comprised almost 500,000 people, based in 23 different towns across 10 different European countries. Let me say that again: half a million people, twenty-three towns, ten nations. If that isn’t epic then I don’t know what is.
You can recruit your population, or “cohort”, from many different places depending on what you’re looking at. The main thing is that you make sure your cohort represents the people you want to study. It’s no good studying heart disease in, say, 7-year-old girls. You want to look at the people where the incidence of heart disease is highest already.
Sometimes your cohort will be quite specific. For instance, if your aim is to study the benefits/harms of radiotherapy as a course of treatment in people with specific cancers, then your sample must come from the total population of people who have that kind of cancer, and you’ll look to specialist clinics and the like to find them. EPIC had a much wider remit, as, let’s face it, most people in Europe eat meat.
People who took part in the EPIC study were mostly taken from the general population. People were recruited from the blood donor programme, from particular companies and health insurance schemes, from the civil service and even a mammogram screening scheme. In the UK, some of the participants were those “health conscious” people from the Oxford study mentioned earlier. It’s important to have people you can keep track of over time, so you can follow as many of them up as possible at a later date. This is why things like health insurance schemes are handy – they effectively keep track of your cohort for you.
All participants in EPIC were recruited between 1992 and 2000, and all were between 40 and 70 (for the chaps) or 35 and 70 (for the ladies). In total 511,781 people were recruited. But because cohort studies focus on the development of disease, EPIC needed to ensure that people started off without any of those diseases already – we want to start from a baseline healthy population. So the next job was to weed out anyone who reported they’d had cancer, had suffered a stroke or a myocardial infarction, and those who hadn’t reported whether they smoked or not. While they were at it, EPIC also cut out the health freaks and couch potatoes, or, as they put it, the top and bottom 1% of the energy intake-to-expenditure ratio (I prefer my way). This left them with 448,568 participants. Not much to be going on with, then.
3) Confound it all! Or not, preferably…
It would be a marvellous thing if you could put 448,568 people in a sealed room and stop them getting on with their everyday lives. You could force them to stop doing annoying things like drinking too much and going to parties, or having babies, or….
OK, no it wouldn’t. It would be unethical and horrific. But still, cohort studies need some way to cut out as many confounders as possible. Confounders are other things that people might do which could affect the results you’re looking at. People smoke, they drive to work when they could walk instead, they go on diets, they change their eating habits and activity levels when they have children, become unemployed…. the list is almost endless, and any one of these factors might scupper your ability to say your result is significant – that is, definitely down to the factor you’re actually studying.
In short, life goes on. This is especially a problem with cohorts as they study people for such a long time and, let’s face it, life throws up some pretty vexing confounders. What you can do, however, is think up as many of those confounders as you can in advance and allow for them in your study. You don’t treat all your 448,000 people as an amorphous mass, you divide them into sub-groups called strata and try to group similar people together as much as possible. EPIC stratified people according to their age, their weight and height, which town they were in, their smoking and drinking habits, their overall food intake, their exercise and education levels. These strata could then be analysed separately to try and cancel out the confounding effects. It’s by no means ideal and you’ll never cancel out confounders entirely, but it’s a start.
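To see why stratifying helps, here’s a toy simulation – entirely made-up data, nothing to do with EPIC’s real numbers. In it, smoking drives both processed meat consumption and the death rate, so a crude comparison makes meat look deadly even though meat does nothing in this pretend world. Comparing within each smoking stratum makes the illusion largely vanish:

```python
import random

random.seed(42)

# Hypothetical participants: in this toy world, smokers both eat more
# processed meat AND die more often, but meat itself does nothing.
participants = []
for _ in range(10_000):
    smoker = random.random() < 0.3
    high_meat = random.random() < (0.6 if smoker else 0.3)
    died = random.random() < (0.10 if smoker else 0.02)  # driven by smoking only
    participants.append((smoker, high_meat, died))

def death_rate(group):
    return sum(died for _, _, died in group) / len(group)

# Crude (unstratified) comparison: meat looks harmful...
high = [p for p in participants if p[1]]
low = [p for p in participants if not p[1]]
print(f"crude: high-meat {death_rate(high):.3f} vs low-meat {death_rate(low):.3f}")

# ...but within each smoking stratum the difference largely disappears,
# because smoking, not meat, was driving the crude gap.
for smoker in (True, False):
    stratum = [p for p in participants if p[0] == smoker]
    h = [p for p in stratum if p[1]]
    l = [p for p in stratum if not p[1]]
    print(f"smoker={smoker}: high-meat {death_rate(h):.3f} "
          f"vs low-meat {death_rate(l):.3f}")
```

The crude death rates differ markedly, while the within-stratum rates sit close together – which is exactly the effect a confounder has, and exactly what stratified analysis is designed to expose.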
4) Food, Glorious Food… Measuring the “Unmeasurable”
The methods varied a bit from country to country across the EPIC cohort, but there are various ways you can measure people’s food intake if you really want to. The majority of information was taken from Food Frequency Questionnaires, where people respond to specific questions then submit their answers for computerised scoring. For EPIC purposes they were particularly interested in three categories of food intake – red meat, processed meat, and white meat/poultry.
A significant chunk of the EPIC study also went through an occasional 24 hour Recall Interview – this is where an investigator contacts you and asks you to remember what you ate in the last day, and can provide a lot more detail and nuance than a questionnaire. Ideally this method would be used a lot more, but it’s pretty impractical to ask 20,000 people or more at your particular centre to tell you what they ate every single day. You could in theory do, say, a week’s recall interview, but it’s quite surprising how quickly our memory of what we ate fades, and most people vastly underestimate their true food intake.
In the UK and Sweden participants usually kept a Food Diary, where the food they ate each day was recorded and they were asked to estimate their portion size using a set of standard photos. It’s a reasonably good method, but under-reporting is still a problem.
The absolute gold standard of nutrition studies is called the Weighed Inventory, but it’s much more useful in smaller groups of people than in a massive cohort like EPIC. In this kind of study all food for a week is weighed and recorded, and the waste left on your plate at the end of a meal is also taken into consideration. Food intake is then calculated from Food Composition Tables. But even in this kind of study, there is a tendency to under-record those sneaky custard creams you had with your cuppa.
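The sums behind a weighed inventory are straightforward: composition tables list nutrients per 100 g of food, so you scale each entry by what was actually eaten. Here’s a minimal sketch – the foods and nutrient values are illustrative made-up numbers, not real reference data:

```python
# A toy food composition table: nutrients per 100 g of food.
# Values are illustrative only, not real reference data.
COMPOSITION = {                 # per 100 g: (kcal, protein g, fat g)
    "bacon":        (540, 37.0, 42.0),
    "bread":        (265,  9.0,  3.2),
    "baked beans":  ( 94,  4.7,  0.5),
}

def intake(diary):
    """diary: list of (food, grams eaten) from a weighed inventory."""
    totals = {"kcal": 0.0, "protein_g": 0.0, "fat_g": 0.0}
    for food, grams in diary:
        kcal, protein, fat = COMPOSITION[food]
        factor = grams / 100.0          # tables are per 100 g
        totals["kcal"] += kcal * factor
        totals["protein_g"] += protein * factor
        totals["fat_g"] += fat * factor
    return totals

# One day's weighed record: served weight minus the waste left on the plate.
day = [("bacon", 75 - 5), ("bread", 80), ("baked beans", 200)]
print(intake(day))
```

Note the `75 - 5` for the bacon: subtracting plate waste from the served weight is exactly the correction the weighed inventory method makes, and the sneaky unrecorded custard cream is the error it can’t catch.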
5) Add some HSS (Highly Scientific Stalking)…
When you have a study like EPIC, where almost half a million people are studied on average for 12 1/2 years, keeping track of them all is almost a bigger problem than recruiting them in the first place. People move house, change jobs, some will migrate and some will even die from causes that have nothing at all to do with what you’re investigating. You can sign up to a cancer study and still die in a car crash, there’s no guarantee.
Normally, investigators will phone people, write to them, check their medical records and regional health departments, find their change of address through the professional association or university they were recruited through – there’s a reason for choosing participants from professional bodies and the like.
It’s important that as high a proportion of participants as possible is followed up, and every effort should be made to trace the outcomes of all people. EPIC managed to trace the outcomes of 98.4% of their participants, which considering the size of the cohort is pretty darned impressive. In all, 26,344 EPIC participants died during the study – about 6%. 37% of the deaths were from cancer, 21% from heart disease, and 3% from diseases of the digestive tract, but that leaves a remaining 39% of the deaths not caused by anything that EPIC was investigating.
6) Here Comes The “Smug Boffins” Bit…
My dad, a plumber and dedicated tabloid reader, gave a little snort when I told him I was going back to university to study science. “A scientist? Why do you want to be a scientist? They’re always on the news being smug and everything, claiming some study of monkeys’ sex habits is significant. Significant is the economy! Significant is pensions!”
My dad, like many non-sciencey types, had a fundamental misunderstanding of what “significance” means when scientists refer to results. They’re not making a personal judgement about the value of their work, they’re talking statistics. And statistics take time: for about five years after the last outcome of the last participant had been sent in, the EPIC team were processing all that information.
Statistics is kind of complicated, but suffice it to say the EPIC investigators would have sorted out their strata, cancelled out confounders as best they could, divided meat consumption into categories according to grams consumed, applied the Cox regression model, then decided whether the correlation between processed meat consumption and death was significant enough to suggest that processed meat may be a cause.
Scientists are quite finicky about what they consider a probable cause, and what might be just chance. The widely accepted numerical indicator of this is the P-value. Strictly speaking, a P-value is the probability that you’d see a result at least as extreme as yours if chance alone were at work – if, say, processed meat actually made no difference at all. So a P-value of 0.01 means that if there were really no effect, a result like yours would turn up only one time in a hundred (1%) by chance. The threshold most scientists are desperate for when crunching their numbers is a P-value of less than 0.05 – fewer than 5 in 100. A P-value below that is conventionally taken to mean you’re really onto something, though it’s never a guarantee.
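You can watch a P-value emerge from pure chance with a quick simulation. The numbers here are made up for illustration (they are not EPIC’s data): imagine 54 deaths split 34 vs 20 between a high-meat and a low-meat group of equal size, and ask how often chance alone produces a split at least that lopsided:

```python
import random

random.seed(1)

# Imagined observation: 34 deaths in the high-meat group vs 20 in the
# low-meat group, out of 54 deaths total, with equal group sizes.
observed_diff = 34 - 20

# Null hypothesis: meat makes no difference, so each death was equally
# likely to land in either group. Simulate that pure-chance process.
trials = 100_000
at_least_as_extreme = 0
for _ in range(trials):
    high = sum(1 for _ in range(54) if random.random() < 0.5)
    low = 54 - high
    if abs(high - low) >= observed_diff:
        at_least_as_extreme += 1

p_value = at_least_as_extreme / trials
print(f"simulated p-value: {p_value:.3f}")
```

In this toy example the simulated P-value comes out a little above 0.05 – a 34 vs 20 split happens by chance more often than you might guess, so it wouldn’t clear the conventional bar. Real studies use exact or model-based tests (like the Cox regression EPIC used) rather than brute-force simulation, but the logic is the same.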
The EPIC cohort found an increased risk of death in people who ate more than 20g per day of processed meat, but concluded it was a moderate risk. Their findings on red meat were less conclusive. If you want to find out more about the results you can do so on Henry Scowcroft’s fantastic blog for CRUK here.
As is so often the case, a balanced and varied diet is the key and nobody’s telling you that you should give up your once-a-week-Sunday-morning bacon toastie. The true significance of EPIC to me is that cohort studies like this are going on over years and years all around us, and we have no idea unless either we’re taking part, or the results are published in the future. On my 50th birthday I might take a look at the newspaper projected onto my corneas as I jetpack into work, and see the result of a study that’s taking place right now and I just didn’t know it.
So next time you read a headline saying a new study finds something is bad for your health and that moderation is all-important, maybe go to the link (if there is one) and look at the (hopefully Open Access) paper behind it. Because much as some of us would like to think, this isn’t about busybodies and the Nanny State making sure we don’t enjoy stuff for their own fun. There is some truly amazing methodology behind these headlines, and it’s worth taking a moment to gaze in organisational wonder.