Friday, November 7, 2008

Life is Actually Quite Complicated

In this excellent post, new blogger Mike Eslea (the Punk Psychologist) takes British newspapers to task for their sensationalist coverage of some new statistics about knife crime. For non-British readers, I should explain that knife crime is a hot button issue in this country at the moment, with the narrative that there's a "knife crime epidemic" in progress being widely accepted.

As Eslea explains, when the new crime statistics were released, all of the headlines talked of a "22% rise" in knife incidents. This sounds pretty dramatic, and straightforward - 22% more stabbings, oh no! But in fact the picture is much less clear - most of this rise was probably due to changes in the way such crimes are reported, and even defining knife crime is not as easy as it seems. He also notes that last year the Times managed to extract the headline "Knife Crime Doubles In Two Years", based on a report which found nothing of the sort, through careful cherry-picking of the statistics. You should read the whole of the post - it's enlightening (and if you don't, me and Eslea will stab you up.)
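To see how changes in reporting alone can generate a headline "rise", here's a toy calculation. The numbers are invented purely for illustration - they are not the actual crime figures - but they show how the recorded count can jump 22% while the true number of incidents stays flat:

```python
# Toy illustration: the same underlying number of incidents can yield
# a big apparent "rise" if the fraction recorded by police changes.
true_incidents = 10_000            # hypothetical, held constant across both years

reporting_rate_last_year = 0.50    # hypothetical: half of incidents get recorded
reporting_rate_this_year = 0.61    # hypothetical: recording practices tighten up

recorded_last_year = true_incidents * reporting_rate_last_year   # 5,000
recorded_this_year = true_incidents * reporting_rate_this_year   # 6,100

apparent_rise = (recorded_this_year - recorded_last_year) / recorded_last_year
print(f"Apparent rise in recorded incidents: {apparent_rise:.0%}")
# A "22% rise" - with zero change in actual knife crime.
```

The point isn't that this is what happened, only that recorded-crime figures can't distinguish between more crime and better recording without a lot of extra work.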

Anyway, what's interesting is that this is just the kind of thing that we also see in much of science journalism. Ben Goldacre's excellent Bad Science is full of examples of the way in which the media mislead in their coverage of scientific and medical research. Reporting on violence, drugs, teenage pregnancy and other social sins often misleads for exactly the same reasons - i.e. statistics are cherry-picked to support the most dramatic conclusions, caveats and methodological weaknesses are ignored, and evidence which doesn't fit with the narrative is not reported on at all (in this case the narrative is "knife crime epidemic!" but we have also had "autism epidemic!", "diet determines health!" etc. etc.) Science, medicine, or crime, the numbers get spun in the same ways.

The basic problem, as I see it, is that people just don't like doubt. We want a clear story, even if the available evidence doesn't support any firm conclusions. Look at the Daily Mail's regular headlines about something causing or preventing cancer - any epidemiologist knows that establishing risk factors for cancer is a very difficult job, and there is a huge amount of uncertainty, and a lot of the research out there is crap. For the Mail, on the other hand, one small study constitutes proof. Until the next small study comes along and proves that what we thought cured cancer actually causes it, and vice versa. Experts despair at this, but they're in a minority.

This great BBC article accuses politicians of being unwilling to admit doubt about whether policies will work. To be fair to them, though, they're in an impossible position, because the public and the media demand certainty. We know that knife crime is skyrocketing and we want someone who knows how to stop it. By which I mean that most of us do. I don't - from what I've read about knife crime, nationwide it probably isn't rising, or maybe it is a bit, but in some parts of the country and among some communities it could be rising a lot, although even if it is, we have no idea why, and we don't have any proven ways of improving things... The point is that it's complicated. There's a lot of evidence to consider and most of it's flawed in one way or another. That's true for neuroscience, and it's also true for public policy.
