10% ≈ 90%

My intuition about probability doesn’t match the linear distances between percentage points.
With some percentages, the difference between 51% and 49% is everything, as in company ownership and voting.
But with other percentages, 15% and 30% point estimates of a timeline prediction make me wish I had a “plus or minus Knightian uncertainty” emoji, screaming in ignorance: “that’s basically the same number, no?!?”.. the feeling is similar to my reaction when I first learned that 0.1 + 0.2 and 0.3 are two different numbers in float32[1]. I understand why, but also .. yuck!
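The floating-point surprise is easy to reproduce. A small sketch: Python’s built-in floats are float64 rather than the float32 the post mentions, but the same mismatch shows up, and routing a value through float32 makes the representation error visible directly.

```python
import struct

# 0.1 and 0.2 have no exact binary representation, so their sum
# is not exactly 0.3 -- in float64 just as in float32.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# Round-trip 0.1 through a float32: the nearest representable value differs.
as_float32 = struct.unpack("f", struct.pack("f", 0.1))[0]
print(as_float32)        # 0.10000000149011612
```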
If some of the variables in a world-model can swing the prediction from “that’s basically impossible” to “oh, it already happened”, I wish people smarter than me would invent better communication tools for that. Something less awkward than decibels or bits though.. something that would feel like cubic-bezier(1,0,0,1).
The least I can hope for is more examples and qualitative splits (if/else, scenarios, …) before collapsing an estimate to a single weighted number.
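As a toy version of a qualitative split before collapsing, here is a sketch; the scenario names, weights, and conditional probabilities are all invented for illustration:

```python
# Hypothetical scenario split for some yes/no prediction: each branch
# carries a weight (how likely the branch is) and a conditional
# probability of the event within that branch.
scenarios = {
    "fast takeoff":    (0.2, 0.90),
    "steady progress": (0.5, 0.30),
    "long winter":     (0.3, 0.02),
}

# Collapsing to a single weighted number...
collapsed = sum(w * p for w, p in scenarios.values())
print(f"point estimate: {collapsed:.1%}")  # 33.6%

# ...hides that the conditional estimates range from 2% to 90%,
# which is exactly the information the post asks to keep visible.
for name, (w, p) in scenarios.items():
    print(f"  {name}: weight {w:.0%}, conditional probability {p:.0%}")
```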
edit: a previous version said that I learned 0.1 and 1⁄10 are different numbers in float32. That’s not the case, and I forgot to say that what I learned was wrong and I had to unlearn it .. and now I have no idea what I was trying to say with that story