Karla Lant

Freelance Writer, Editor and Professor

Why Citizens of Murdertown, U.S.A. Need to Think Critically to Understand Statistics

People love to talk about how statistics are all a horrible, misleading lie. This is, of course, true when no one understands them well enough to use them properly.

Today the U.S. Conference of Mayors was once again forced to release a statement about the CQ Press “crime rankings by city” publication, which has become the bane of their existence; here is last year’s edition. The issue is that each year the guide comes out, spawns a spate of news items, and readers instantly believe they will be victimized by violent criminals in certain cities. It’s sort of like a top ten list no city wants to be on, especially since it is ostensibly based on FBI data, which people tend to think is reliable.

The FBI, however, apparently doesn’t cotton to this misuse of its research, and has even issued an official statement about all of the things you shouldn’t do with its statistics. Furthermore, the press release from the Conference of Mayors pointed out the now-infamous words of Scott Morgan, the editor of Morgan Quitno Press, which compiled the data: he would be “stunned if there is a criminologist out there who will support this.” So much for editorial judgment.

The report itself has been condemned routinely not only by the mayors and the FBI, but also by Criminal Justice Journalists and the American Society of Criminology, among others. These informed commentators point out that the rankings are arbitrary and skewed by factors unrelated to danger and crime, including geography, zoning, reporting rates and socioeconomic conditions.

The fact is, it’s your own unique risk factors as a person — your age, your lifestyle, where exactly in town you live — that are most predictive of risk.

None of this has done anything to stop simplistic reporting on the most dangerous cities in America. Read about “Murderapolis” here and the burned-out shell that is Detroit here. This is not to say that there aren’t real differences in crime rates from city to city in the United States. But it is to say that these lists are, at best, too simplistic to be relied upon.

At any rate, this is all interesting to me not only as a daily dose of humor but also because the constant misuse and misunderstanding of statistics strikes me as a real problem.

I must hear someone — usually someone who is very intelligent, actually — tell me once a week that statistics are meaningless, fraudulent or phony. The thing is, although rampant misuse can render the reports we read meaningless, fraudulent or phony, properly handled statistics are in fact meaningful and free of evil intent.

Still, we’re doing it wrong, and it seems like we’re going to go ahead and keep doing that.

The NIH has reported that even guidelines for reporting statistics can confuse more than they help. Why are these ideas so hard for us to understand? Well, it’s probably worth noting that even the people who produce statistics get confused a lot.

In 2011, psychologist Sander Nieuwenhuis reported that neuroscientists frequently mishandle statistical significance in their work. In fact, common errors like these plague all scientific disciplines.
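One classic version of this kind of slip, for instance, is treating “effect A is significant and effect B is not” as proof that A and B differ, instead of testing the difference between them directly. Here’s a minimal sketch in Python with made-up summary numbers (my own illustration, not figures from the Nieuwenhuis paper):

```python
import math

def two_sided_p(z):
    """Two-sided p-value for a z statistic, using the normal approximation."""
    return math.erfc(abs(z) / math.sqrt(2))

# Hypothetical summary results: two estimated effects and their standard errors.
effect_a, se_a = 0.25, 0.10   # z = 2.5, p ~ 0.012: "significant"
effect_b, se_b = 0.15, 0.10   # z = 1.5, p ~ 0.13: "not significant"

print("p for effect A:", round(two_sided_p(effect_a / se_a), 3))
print("p for effect B:", round(two_sided_p(effect_b / se_b), 3))

# The tempting (and wrong) conclusion: "A works, B doesn't." The correct move
# is to test the difference between the two effects directly.
diff = effect_a - effect_b
se_diff = math.sqrt(se_a**2 + se_b**2)   # assumes the two estimates are independent
print("p for A vs. B:", round(two_sided_p(diff / se_diff), 3))   # ~0.48
```

In this toy example, A clears the conventional 0.05 bar and B doesn’t, but the direct comparison between them is nowhere near significant, so concluding that A “works” and B doesn’t would be exactly the kind of error that gives statistics a bad name.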

Misunderstanding (and the resulting misuse) of confidence intervals is another good example. Everyone seems to agree that using them is important, yet examples of their misunderstanding and misuse are easy to find because they’re everywhere.

So what does “confidence interval” actually mean? Here’s what the U.S. Census Bureau says, and here’s how the Engineering Statistics Handbook of the National Institute of Standards and Technology defines it.

Confused? I hear you. It’s one thing to say that researchers reporting their statistical findings should endeavor to get it right: they should. But how can writers and readers avoid confusing the statistical information we come across?
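Before getting to that, it may help to see the bare-bones version of what those official definitions are describing. Here’s a minimal sketch with made-up numbers, using the standard normal-approximation interval for a mean (my own illustration, not either agency’s exact formulation):

```python
import math

# Hypothetical survey of annual iguana-care costs (made-up numbers).
n = 400              # sample size
sample_mean = 520.0  # dollars
sample_sd = 80.0     # dollars

# The usual 95% confidence interval for a mean: estimate +/- 1.96 standard errors.
se = sample_sd / math.sqrt(n)
low, high = sample_mean - 1.96 * se, sample_mean + 1.96 * se
print(f"95% confidence interval for the mean: ({low:.2f}, {high:.2f})")

# Interpretation: if we ran the survey over and over and built an interval this
# way each time, about 95% of those intervals would capture the true mean. It
# does NOT mean there is a 95% chance that the true mean sits inside this one
# particular interval.
```

The 95% describes the long-run reliability of the procedure, not the probability that any single interval contains the true value, and that distinction is exactly where writers and readers tend to slip.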

As journalists we need to ensure that we aren’t adding in our own (faulty) analysis. Part of the problem is that if we see that a number has risen, we may express that in many different ways. Say that I am reporting a 30% increase from 2003 to today in the cost of caring for iguanas, caused by rising food and care costs. As a writer I hope to create interesting copy, so I might say any of these things:

Costs of iguana care rise/surge/creep upward/explode/soar 30% from 2003 to 2013.

As a writer I might want to stick to more neutral language if I’m unsure of what the statistic really means, or I might want to assess it closely myself before choosing my words. Also, did the researchers themselves use this language? If not, I may be inadvertently injecting my own analysis into the piece.

As a reader, I need to consider all of the facts. Was that figure adjusted for inflation? If not, it has a different meaning. If there’s an annual inflation rate of 2.5% (I am making up these numbers for the hypothetical, by the way) and that rate is compounded over the ten years, the costs haven’t risen nearly as much as it might have seemed at first blush. So as a reader I need to be sure of all of the information, and I need to inform myself about what makes for good and bad statistics.
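To make that hypothetical concrete, here’s the quick back-of-the-envelope arithmetic, using the same made-up numbers:

```python
# Back-of-the-envelope check on the hypothetical iguana figures above.
nominal_increase = 0.30   # the reported 30% rise from 2003 to 2013
annual_inflation = 0.025  # assumed 2.5% inflation per year
years = 10

inflation_factor = (1 + annual_inflation) ** years          # about 1.28
real_change = (1 + nominal_increase) / inflation_factor - 1

print(f"Cumulative inflation over {years} years: {inflation_factor - 1:.1%}")
print(f"Inflation-adjusted (real) increase: {real_change:.1%}")
```

With those made-up numbers, cumulative inflation alone is about 28%, so nearly all of the “soaring” 30% rise is inflation and the real increase is only about 1.6%.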

Statistics can give you excellent insight if they’re reliable. Here is a nice set of techniques for assessing them. Do you need to be a careful, critical reader of statistics? Yes. In fact you need to be a careful, critical reader, period. More fundamentally, an everyday practice of critical thinking will do all of us a great service.
