This will be the first installment of many to come where I talk about bogus statistics and their numerical flaws. Today's post will be about crime rate statistics.
After vacationing in Myrtle Beach, SC, from 1979 to 2005 and living in the town full-time from 2005 to 2011, never once did I feel unsafe and at no time was I under the impression that I was living in one of the most dangerous cities in America ... until the following rankings appeared:
Are you kidding me? Myrtle Beach ranked the 21st most dangerous city in the country? More dangerous than Newark, NJ (51st) or Philadelphia, PA (50th)? Surely you can't be serious (I am serious and please don't call me Shirley, LOL).
I grew up in NJ, just minutes away from Newark. No way (and I really mean no way) is Myrtle Beach more dangerous. Not even close. So what gives? How can the statistics lie?
Quite simple. Crime rates are computed as the total number of crimes divided by the population of the city (i.e., crime rate = total crimes / population). The problem is that Myrtle Beach has a steady population of only 35,000 people; however, millions of tourists visit the city throughout the year. In the spring and fall, hordes of golfers from up north, where it gets too cold to play, tee up in the warmer climate on one of Myrtle's 100+ golf courses. In the summer, throngs of families visit the coast to enjoy the beach. However, the crime rates don't include this transient population! Therein lies (no pun intended) the flaw of crime rate statistics: they do not include the number of tourists in the denominator.
If the crime rates were computed like this: crime rate = total crimes / (population + tourists), then the true crime rate of Myrtle Beach would drop substantially relative to Newark's because, let's face it, not too many people spend a nice family vacation in Newark, LOL. Residents of Myrtle Beach can now sleep easier!
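A quick back-of-the-envelope sketch makes the point concrete. Every number below is made up purely for illustration (the crime counts, Newark's population, and the tourist total are assumptions, not actual FBI or tourism data); the only thing that matters is what happens to the comparison when tourists enter the denominator:

```python
# Sketch of the denominator flaw. All figures are hypothetical,
# chosen only to illustrate the effect -- NOT real data for either city.

def crime_rate_per_100k(crimes, population):
    """Crimes per 100,000 people -- the standard way rates are reported."""
    return crimes / population * 100_000

# Hypothetical annual figures (assumptions, not data):
mb_crimes, mb_residents, mb_tourists = 3_500, 35_000, 14_000_000
nw_crimes, nw_residents = 25_000, 280_000  # Newark: far fewer tourists

# Naive rate: only year-round residents in the denominator
naive_mb = crime_rate_per_100k(mb_crimes, mb_residents)   # 10,000 per 100k
naive_nw = crime_rate_per_100k(nw_crimes, nw_residents)   # ~8,929 per 100k

# Adjusted rate: count the transient population too
adjusted_mb = crime_rate_per_100k(mb_crimes, mb_residents + mb_tourists)

print(f"Naive:    Myrtle Beach {naive_mb:,.0f} vs Newark {naive_nw:,.0f}")
print(f"Adjusted: Myrtle Beach {adjusted_mb:,.0f} vs Newark {naive_nw:,.0f}")
```

With the naive formula the beach town looks *more* dangerous than Newark; add the tourists and its rate collapses by two orders of magnitude. (A fairer adjustment would weight each tourist by their average length of stay rather than counting every visitor as a full-year resident, but even this crude version flips the ranking.)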