An independent charity for science and integrity in healthcare


Tim Harford on behalf of the BBC More or Less team for their clear, honest and entertaining way of educating the public about the meaning of numbers

Junk, jigsaws and zombies: misleading stats in the news

The 2012 HealthWatch Award went to Tim Harford and the team behind BBC Radio 4’s “More or Less” programme. Tim received his award at the October AGM and gave an entertaining presentation to HealthWatch members and patrons on the subject of misleading medical statistics. The article below is prepared from his presentation.

I’d like to begin by setting you a little test. Imagine you’re a doctor discussing a type of cancer screening with a patient. You see that test A increases the patient’s 5-year survival rate from 68% to 99%. Put your hands up if you think that would be a benefit. [audience hands were, hesitantly, raised] Test B, however, will reduce deaths from 2 per 1,000 to 1.6 per 1,000. Most people faced with comparing test A with test B would opt for test A.

In fact, only test B unambiguously saves lives: to be precise, it saves 0.4 lives per 1,000 people. But what about the huge survival-rate benefit of test A? To explain how this can be, imagine a cancer that always strikes at the age of 60, but that shows no symptoms until age 68 … then kills at age 70. A screening test that accurately diagnoses the cancer in 62-year-olds would give them a 5-year survival rate of 100%. Yet they would still die at 70 if there were no treatment.
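The arithmetic behind that trap, known as lead-time bias, can be sketched in a few lines (a toy illustration of the hypothetical cancer described above; the function name is mine):

```python
# Lead-time bias: earlier diagnosis inflates 5-year survival
# without changing when anyone actually dies.
# Hypothetical cancer from the talk: everyone dies at 70, untreated.
AGE_OF_DEATH = 70

def five_year_survival(age_at_diagnosis):
    """Fraction of patients still alive 5 years after diagnosis."""
    return 1.0 if age_at_diagnosis + 5 < AGE_OF_DEATH else 0.0

# Diagnosed at symptoms (age 68): dead by 70, so survival looks like 0%.
print(five_year_survival(68))  # 0.0
# Screened and diagnosed at 62: still alive at 67, so survival looks like 100%.
print(five_year_survival(62))  # 1.0
# Either way every patient dies at 70: screening moved the
# statistic, not the outcome.
```

The survival statistic jumps from 0% to 100% even though not a single death has been prevented.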

US psychologists recently put this test to 400 doctors. Eighty-two per cent thought that test A saved lives, which it didn’t. Only 60 per cent thought that test B saved lives, and fewer than one-third thought the benefit was large or very large. That is intriguing because, of the few people on course to die from the cancer, the test saves 20 per cent of them. In short, the doctors simply did not understand the statistics on cancer screening.

In the course of this talk I’d like to share with you some of the things I’ve learnt about statistics in the news.

Mistakes are not always difficult to spot

An advertisement for “U-switch” internet service claims that 49% of British broadband customers are getting below-average broadband speed. Think about it … it’s like saying, “49% of NHS patients are getting below-average treatment”. Of course they are, and the rest are getting average or above-average.

It’s saying nothing.
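It is easy to check that 49% is roughly what you would expect: in any reasonably symmetric distribution, about half of the values fall below the mean (a sketch using made-up, normally distributed speeds; the numbers are mine):

```python
import random

random.seed(1)
# Hypothetical broadband speeds in Mbps, symmetric around 50.
speeds = [random.gauss(50, 10) for _ in range(100_000)]

mean = sum(speeds) / len(speeds)
below = sum(s < mean for s in speeds) / len(speeds)
print(f"{below:.0%} below average")  # about 50%, by symmetry
```

Only a heavily skewed distribution would push the below-average share far from half, so the advert’s figure carries almost no information.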

It’s easy to be wowed by big numbers

In 1997 Gordon Brown pledged to spend £300 million on pre-school provision over the following five years. Now, you need to peer beneath the numbers here. The need for pre-school provision affects about 1 million children every year. That figure boils down to about £1.08 per child per week. What exactly is going to be provided for £1 a week? So, don’t be impressed by the big numbers that aren’t.
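That back-of-envelope division is easy to reproduce (a sketch; the talk’s exact £1.08 presumably reflects slightly different assumptions about the number of children, but either way it comes out at about a pound a week):

```python
# Is £300 million over five years a big number?
# Assumes roughly 1 million children needing provision each year.
pledge = 300_000_000   # pounds, spread over five years
years = 5
children = 1_000_000
weeks = 52

per_child_per_week = pledge / years / children / weeks
print(f"£{per_child_per_week:.2f} per child per week")  # about £1.15
```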

Be wary of averages

We have reports of economic upturns—talking about average inflation, average incomes. Remember, on average a rainbow is white, yet it’s the colours that are important. The average is not the only thing you want to know to get the true picture.

A famous example of the misuse or misunderstanding of averages came when the financial crisis broke in August 2007, and the chief financial officer of Goldman Sachs commented that 25-standard-deviation events had occurred several days in a row. We asked a professor of finance to calculate for us the likelihood of that actually occurring. He worked out that you might expect to see a 2-standard-deviation event once every 44 days or so. A 3-standard-deviation event happens only every 3-4 years; a 4-standard-deviation event, once every 126 years. There could only have been one 5-standard-deviation event since the last ice age. A 25-standard-deviation event would be expected to happen only once every x years, where x is a number with 67 zeros.
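Those waiting times follow directly from the normal distribution that the banks’ risk models assumed; a minimal sketch (my own, assuming one observation per trading day and roughly 252 trading days a year):

```python
import math

TRADING_DAYS_PER_YEAR = 252  # assumption

def expected_days_between(k):
    """Expected trading days between daily moves more than k
    standard deviations above the mean, if moves were truly normal."""
    p = 0.5 * math.erfc(k / math.sqrt(2))  # P(X > k) for a standard normal
    return 1 / p

print(f"{expected_days_between(2):.0f} days")                           # ~44 days
print(f"{expected_days_between(3) / TRADING_DAYS_PER_YEAR:.1f} years")  # ~3 years
print(f"{expected_days_between(4) / TRADING_DAYS_PER_YEAR:.0f} years")  # ~125 years
print(f"{expected_days_between(5) / TRADING_DAYS_PER_YEAR:.0f} years")  # ~14,000 years
```

The 5-sigma figure of roughly 14,000 years is indeed about one event since the last ice age; a 25-sigma event is off any humanly meaningful scale.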

Watch out for shifting definitions

A recent US report said that one in five students self-harm. It is alarming to consider the possibility that a fifth of young people might hurt themselves by burning, slashing their wrists, or attempting suicide. But is it true? Read the study behind the news and we find that 8,300 students were surveyed. Of those, 3,000 chose to respond. Could there be a bias here towards young people who already have an interest in self-harming? But it is the definitions that interested me. They included things like tugging at your own hair and scratching yourself. No doubt smashing your head onto your keyboard when reading an idiotic news story would have been included. According to the same study, only 49 of the respondents reported causing themselves serious harm. That is about 0.6% of the original sample. The story was not exactly false, but not really true either.

Sometimes you get statistics that have no merit whatsoever

We call them junk stats. We came across this one on the internet—four million US women are battered to death by their husbands or boyfriends. Four million? Common sense tells you that can’t possibly be true. We often find that the more serious the claim and the worthier the issue, the stupider the stats.

The dangers of junk stats

But it’s not only important issues that generate junk stats. We noticed an advertisement for a product that made lips 25% fuller. What does that mean? And a product that is 25% more “berrylicious” … we assume these have been cleared by the Advertising Standards Authority, but on what basis they proved the claims I can’t imagine.

Every year there is a Blue Monday. It’s a concept created to generate publicity for a charity that supports people with depression. A PR firm invented an equation that involves how many days you’ve just had off work and the length of time before your next holiday, and comes up with the most depressing day of the year. Is that OK? I’m inclined to think it makes a mockery of statistics, maths, and journalism. Someone with real evidence can be chucked in the same bucket as this kind of nonsense. Journalists using this kind of material don’t give their readers the tools they need to interpret stories properly.

There’s something we call “zombie stats”

These are the figures that can be shot to pieces over and over again and yet they still keep coming back. Here’s one: public sector spend could be cut by 20% by reducing waste. Well, the biggest factor in public sector spending is salaries, not waste, so how can cutting waste reduce it by 20%? The figure came from a procurement consultancy that claimed to be able to get better deals, such as cheaper mobile phone bills. Of course they wouldn’t be able to make savings like that in all areas of spending, yet the government seemed happy to repeat the claim without any evidence. Ben Goldacre shredded the claim [1]; we at More or Less shredded it; but like that zombie it just won’t lie down and die.

Recognise patterns that can fool us

In the cancer screening example that I began with, you could have known that I was going to try to fool you. But you don’t always see it coming. To take speed cameras, for instance. If the government try to put cameras at accident black spots, will they have an effect on the number of accidents that occur?

Some accident black spots are indeed dangerous places. But in some cases a cluster of accidents is just bad luck. Bad luck doesn’t last. Put a speed camera there and, chances are, the run of bad luck ends, the accidents are reduced, and it seems to be down to the camera. I’m not saying that speed cameras don’t help, but they might help less than we thought.
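This regression to the mean can be demonstrated with a toy simulation: give 1,000 hypothetical sites the very same underlying accident rate, label the unluckiest ones “black spots”, and watch their counts fall the following year with no cameras installed at all (an illustration under made-up numbers, not a model of real roads):

```python
import math
import random
import statistics

random.seed(42)

def poisson(lam):
    """Knuth's simple Poisson sampler."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

TRUE_RATE = 2.0  # every site has the same true accident rate per year
year1 = [poisson(TRUE_RATE) for _ in range(1000)]
year2 = [poisson(TRUE_RATE) for _ in range(1000)]

# "Black spots": sites with unusually many accidents in year 1.
black_spots = [i for i, n in enumerate(year1) if n >= 6]
before = statistics.mean(year1[i] for i in black_spots)
after = statistics.mean(year2[i] for i in black_spots)
print(f"black spots: {before:.1f} -> {after:.1f} accidents per site")
# Accidents at the "black spots" fall sharply even though nothing
# was done: the year-1 peaks were mostly bad luck.
```

Any intervention targeted at the worst performers will look effective for exactly this reason, which is why evaluations need control sites.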

What do we in the media need to do?

Sometimes we need expert help. We in the media can always find an expert who is generous enough to explain if we ask. And we need to know when we need to ask.

We need to get better at explaining risk. To take another example from recent news, we hear that if you eat a bacon sandwich every day, your risk of cancer increases by 20%. The questions we need to be answering here are: what kind of cancer are we talking about, and how likely are you to get it anyway? In this case, we’re talking about cancer of the bowel. Under normal circumstances, 4 people in 100 get bowel cancer. If you eat a bacon sandwich every day, the risk increases to 5 in 100. Explaining risk in this way is helpful.
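Put as absolute numbers, the conversion from a relative headline to the bowel-cancer figures above is one line of arithmetic (a sketch):

```python
# Relative vs absolute risk for the bacon-sandwich story.
baseline = 4 / 100          # bowel cancer risk under normal circumstances
relative_increase = 0.20    # the "20% increased risk" from the headline
new_risk = baseline * (1 + relative_increase)
print(f"{baseline:.0%} -> {new_risk:.1%}")  # 4% -> 4.8%, roughly 5 in 100
```

A 20% relative increase on a small baseline is a change of less than one person per hundred, which reads very differently from the headline.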

The BBC website’s top story at the moment is that old people can prolong their useful life not by going for more walks, but by doing jigsaw puzzles. Whether this results from a study or a systematic review, we don’t know. You need context in order to judge this kind of story. And there’s the recession. A deficit of £150 billion—it’s a meaningless big figure for most people. Until you calculate that it adds up to a bill of £2,500 per person per year. Is that higher or lower than the deficit for other European countries? Greece for example? A good party trick during party political conference season is to take all the numbers that appear in the media reports and divide them by the population of the country and see what you get.
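The party trick is just a division (illustrative; the population figure is my assumption, roughly the UK at the time):

```python
# Cutting a headline number down to human scale.
deficit = 150_000_000_000   # £150 billion annual deficit
population = 60_000_000     # assumed UK population
per_person = deficit / population
print(f"£{per_person:,.0f} per person per year")  # £2,500
```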

And are the sources sound? Is it true? What exactly is being said? What is being compared? If we’re talking of treatments, were they tested on animals or people or in a petri dish? Get a sense of scale and context. If we can do that, we can use maths to tell stories that people can understand.

Tim Harford

Broadcaster, author and journalist

Reference

[1] http://www.guardian.co.uk/commentisfree/2011/jun/24/bad-science-local-govermentsavings-ben-goldacre

More about Tim Harford and “More or Less”

The More or Less team

Tim Harford is an author, columnist for the Financial Times and presenter of BBC Radio 4’s “More or Less”. The Royal Statistical Society commended More or Less for excellence in journalism in 2010, 2011 and 2012, and the programme has won an award from Mensa. As a senior columnist for the Financial Times, Tim’s long-running “Undercover Economist” column reveals the economic ideas behind everyday experiences, while a new column, “Since You Asked”, offers a sceptical look at the news of the week. His first book, “The Undercover Economist”, has sold one million copies worldwide in almost 30 languages. His writing has been published in leading magazines and newspapers on both sides of the Atlantic. Tim won the Bastiat Prize for economic journalism in 2006 and has been named one of the UK’s top 20 tweeters by The Independent.

In BBC Radio 4’s “More or Less” programme, Tim Harford and his team investigate numbers in the news and try to make sense of the statistics which surround us. The half-hour programme is broadcast at 16:00 on Friday afternoons and repeated at 20:00 on Sundays on Radio 4.

For more about “More or Less” see the programme's page on the BBC website.

For Tim’s articles and blog, see his website.
