Tag Archives: probability

Micromorts and Understanding the Probability of Death

Understanding probabilities is hard (viz.) – and it’s especially so when we try to understand, and make rational decisions based on, very small probabilities such as one-in-a-million chance events. How, then, can we communicate risks at that scale?

The answer is to use a more understandable scale, such as micromorts: “a unit of risk measuring a one-in-a-million probability of death”. Some activities that increase your risk of death by one micromort (according to, among other sources, the Wikipedia entry; a rough conversion sketch follows the list):

  • smoking 1.4 cigarettes (cancer, heart disease)
  • drinking 0.5 liter of wine (cirrhosis of the liver)
  • living 2 days in New York or Boston (air pollution)
  • living 2 months in Denver (cancer from cosmic radiation)
  • living 2 months with a smoker (cancer, heart disease)
  • living 150 years within 20 miles of a nuclear power plant (cancer from radiation)
  • drinking Miami water for 1 year (cancer from chloroform)
  • eating 100 charcoal-broiled steaks (cancer from benzopyrene)
  • eating 40 tablespoons of peanut butter (liver cancer from Aflatoxin B)
  • eating 1000 bananas (cancer from radioactive potassium-40; roughly 1 kBED)
  • travelling 6 miles (10 km) by motorbike (accident)
  • travelling 16 miles (26 km) on foot (accident)
  • travelling 20 miles (32 km) by bike (accident)
  • travelling 230 miles (370 km) by car (accident)
  • travelling 6000 miles (9656 km) by train (accident)
  • flying 1000 miles (1609 km) by jet (accident)
  • flying 6000 miles (9656 km) by jet (cancer from cosmic radiation)
  • taking 1 ecstasy tablet
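Since one micromort is simply a one-in-a-million (0.000001) probability of death, independent exposures combine as 1 − (1 − 0.000001)^n, which for everyday totals is almost exactly n in a million. A minimal sketch of the conversion in Python (the activity figures are only the illustrative ones from the list above):

    # Rough conversion from micromorts to a probability of death, assuming each
    # exposure is independent and carries exactly the micromort value listed above.
    MICROMORT = 1e-6  # a one-in-a-million probability of death

    def risk_of_death(micromorts: float) -> float:
        """Probability of dying given `micromorts` worth of independent exposures."""
        return 1 - (1 - MICROMORT) ** micromorts

    print(risk_of_death(1))          # ~0.000001 (e.g. one 370 km car trip)
    print(risk_of_death(365))        # ~0.000365 (e.g. 0.5 liter of wine every day for a year)
    print(risk_of_death(1_000_000))  # ~0.632, not 1.0: tiny risks don't simply add up forever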

Issue fifty-five of Plus magazine looked at micromorts in more detail, thanks to David Spiegelhalter (the Winton Professor of the Public Understanding of Risk at the University of Cambridge) and Mike Pearson, both of Understanding Uncertainty.

via Schneier on Security

The Numbers in Our Words: Words of Estimative Probability

Toward the end of this month I will almost certainly post the traditional Lone Gunman Year in Review post. Exactly how likely am I to do this? Am I able to quantify the probability that I’ll do this? By using the phrase “almost certainly”, I already have.

To provide unambiguous, quantitative odds of an event occurring based solely on word choice, the “father of intelligence analysis”, Sherman Kent, developed and defined the Words of Estimative Probability (WEPs): words and phrases we use to suggest probability, along with the actual numerical probability range that accompanies each.

Kent’s idea has had a mixed reception in the intelligence community, and the disregarding of the practice has been blamed, in part, for the intelligence failings that led to 9/11.

The words, by decreasing probability (a minimal lookup-table sketch follows the list):

  • Certain: 100%
  • Almost Certain: 93% ± 6%
  • Probable: 75% ± 12%
  • Chances About Even: 50% ± 10%
  • Probably Not: 30% ± 10%
  • Almost Certainly Not: 7% ± 5%
  • Impossible: 0%
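Treated as data, Kent’s scale is just a lookup table from phrases to probability ranges. A minimal sketch using the ranges above (the function name and the handling of the gaps between ranges are my own assumptions):

    # Kent's Words of Estimative Probability as (phrase, low, high) probability
    # ranges, taken from the list above; the helper below is only illustrative.
    WEPS = [
        ("Certain",              1.00, 1.00),
        ("Almost Certain",       0.87, 0.99),
        ("Probable",             0.63, 0.87),
        ("Chances About Even",   0.40, 0.60),
        ("Probably Not",         0.20, 0.40),
        ("Almost Certainly Not", 0.02, 0.12),
        ("Impossible",           0.00, 0.00),
    ]

    def word_for(probability: float) -> str:
        """Return the first estimative phrase whose range covers `probability`."""
        for phrase, low, high in WEPS:
            if low <= probability <= high:
                return phrase
        return "(no agreed phrase)"  # Kent's ranges leave gaps, e.g. 13-19%

    print(word_for(0.93))  # Almost Certain
    print(word_for(0.50))  # Chances About Even
    print(word_for(0.15))  # (no agreed phrase) -- falls in a gap between ranges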

The practice has also gained some advocates in medicine, where the following definitions have been adopted:

  • Likely: Expected to happen to more than 50% of subjects
  • Frequent: Will probably happen to 10–50% of subjects
  • Occasional: Will happen to 1–10% of subjects
  • Rare: Will happen to less than 1% of subjects

It would be nice if there were such definitions for the many other ambiguous words we use daily.

Learn Statistics, Damn You!

Thanks to my moderate knowledge of statistics, I know that I have a lot more to learn in the field and should never make assumptions about data or analyses (even my own).

Because of this I share a grievance with Zed Shaw, who says that “programmers need to learn statistics or I will kill them all”. Required reading and advice not just for programmers, but for everyone who looks at data, creates models, or even reads a newspaper.

I have a major pet peeve that I need to confess. I go insane when I hear programmers talking about statistics like they know shit when it’s clearly obvious they do not. I’ve been studying it for years and years and still don’t think I know anything. This article is my call for all programmers to finally learn enough about statistics to at least know they don’t know shit. I have no idea why, but their confidence in their lacking knowledge is only surpassed by their lack of confidence in their personal appearance.

My recommendation? Read this article to realise that you know nothing, and then pick up a copy of John Allen Paulos’ Innumeracy and Darrell Huff’s How to Lie with Statistics in order to realise that you know even less than you thought (but a hell of a lot more than the average person).

Being Rational About Risk

Leonard Mlodinow, a physicist at Caltech and author of The Drunkard’s Walk (a highly praised book looking at randomness and our inability to take it into account), has an interview in The New York Times about understanding risk. Some choice quotes:

I find that predicting the course of our lives is like predicting the weather. You might be able to predict your future in the short term, but the longer you look ahead, the less likely you are to be correct.

I don’t think complex situations like [the current financial crisis] can be predicted. There are too many uncontrollable or unmeasurable factors. Afterwards, of course, it will appear that some people had gotten it just right: since there are many people making many predictions, no doubt some of them will get it right, if only by chance. But that doesn’t mean that, if not for some unforeseen random turn, things wouldn’t have gone the other way. […]

In some sense this idea is encapsulated in the cliché that “hindsight is always 20/20,” but people often behave as if the adage weren’t true. In government, for example, a “should-have-known-it” blame game is played after every tragedy.

As someone who has taken risks in life I find it a comfort to know that even a coin weighted toward failure will sometimes land on success. Or, as I.B.M. pioneer Thomas Watson said, “If you want to succeed, double your failure rate.”

I haven’t had a chance to watch it, but in May 2008 Mlodinow spoke for the Authors@Google series.

The Birthday Problem

I’ve heard of this ‘problem’ numerous times before, as I’m sure many others have too. Nonetheless, every time I hear it, it fascinates me.

The birthday problem (or paradox, as it’s often called) looks at the probability of two or more people in a randomly chosen group sharing a birthday.

In a group of at least 23 randomly chosen people, there is more than 50% probability that some pair of them will both have been born on the same day. For 57 or more people, the probability is more than 99%, and it reaches 100% when the number of people reaches 367 […]. The mathematics behind this problem leads to a well-known cryptographic attack called the birthday attack.
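It’s easy to check those figures directly: the probability that n people all have distinct birthdays is (365/365) × (364/365) × … × ((365 − n + 1)/365), and the collision probability is one minus that. A minimal sketch, assuming 365 equally likely birthdays and ignoring leap years:

    # Probability that at least two of n people share a birthday,
    # assuming 365 equally likely birthdays (leap years ignored).
    def birthday_collision(n: int, days: int = 365) -> float:
        p_all_distinct = 1.0
        for i in range(n):
            p_all_distinct *= (days - i) / days  # person i avoids all earlier birthdays
        return 1 - p_all_distinct

    print(birthday_collision(23))   # ~0.507 -- just past even odds
    print(birthday_collision(57))   # ~0.990
    print(birthday_collision(366))  # 1.0 here; the quote's 367 also accounts for 29 February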