Also, the difference in crime rate might amount to something like “if you walk through one area once a day, you’ll be mugged on average once every ten years; in the other, once every thirty years.”
There may be a difference between the rate at which a resident (who probably has an income similar to the other residents, and is perceived as an insider) would be mugged, and the rate at which a visitor (seen as likely to be carrying more money, perceived as an outsider, and someone the residents won’t have to come face to face with again, nor their family) would be mugged.
That said, as I noted in a previous comment, there are definitely cases where people are prone to designate a place as “dangerous” when it’s not actually statistically more dangerous than where they already live. The fact that visiting these neighborhoods may be more dangerous per unit of time than living in them doesn’t make it likely that people who refuse to visit them are assessing risk realistically.
Either of those would be really high, though!
Getting mugged once every thirty years means that there’s a 3.3% chance that you will get mugged in any given year. According to this data, the robbery rate in Metropolitan Areas in 2009 was 133 per 100,000, meaning that each individual stood about a 0.133% chance of getting robbed that year. Note that this data likely includes instances of robbery that we wouldn’t think of as mugging, so the actual chance of getting mugged is probably lower.
Edit: apparently Metropolitan Areas include suburbs. To get a better picture of crime rates in urban areas, here are the robbery rates for selected big cities (all per 100,000 people): New York 221; LA 317; Chicago 557; Houston 500; Dallas 426; Detroit 661; San Francisco 423; Boston 365. So, somewhat higher than the 0.133% number I gave earlier, but still well below the numbers the grandparent post implied.
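For anyone who wants to check the arithmetic, here’s a quick Python sketch (mine, not from the data source) that converts both framings into annual percentages. It treats “mugged once every N years” as a steady rate, which is a rough but serviceable approximation at probabilities this low.

```python
# Convert the two framings in this thread into comparable annual
# probabilities: "mugged once every N years" vs. "X per 100,000 per year".

def annual_pct_from_interval(years_between_muggings):
    """Annual probability (%) implied by 'mugged once every N years'."""
    return 100 / years_between_muggings

def annual_pct_from_rate(per_100k):
    """Annual probability (%) implied by a per-100,000 incident rate."""
    return per_100k / 1000

print(f"once per 10 years : {annual_pct_from_interval(10):.2f}%")  # 10.00%
print(f"once per 30 years : {annual_pct_from_interval(30):.2f}%")  #  3.33%
print(f"metro areas, 2009 : {annual_pct_from_rate(133):.3f}%")     #  0.133%

# 2009 robbery rates per 100,000 for the cities listed above.
cities = {"New York": 221, "LA": 317, "Chicago": 557, "Houston": 500,
          "Dallas": 426, "Detroit": 661, "San Francisco": 423, "Boston": 365}
for city, rate in cities.items():
    print(f"{city:13s}: {annual_pct_from_rate(rate):.3f}%")
# Even Detroit's 0.661% is about a fifth of the 3.33% that "mugged once
# every thirty years" would imply.
```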