Expressing probabilities as percents gets a bit weird, because subtraction doesn’t really work like it should.
I don’t understand. Isn’t the difference just what you’d get by straight subtraction?
Are you implying that the difference between 2% (1 in 50) and 3% (1 in 33) should be 1 in 50−33=17, or about 5.9%? By that logic, the difference between 100% (1 in 1) and 99.99999% (1 in 1.0000001) would be 1 in 0.0000001, about a billion percent, when they’re really almost exactly the same.
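A quick numeric sketch of the mismatch (the variable names here are just for illustration): subtracting the "1 in N" denominators gives a very different answer than subtracting the probabilities themselves.

```python
# Difference between 2% and 3%, two ways.
p_small, p_large = 0.02, 0.03  # i.e. 1 in 50 and 1 in ~33

# Naive approach: subtract the "1 in N" denominators.
# 1 in (50 - 33.3) = 1 in ~16.7, which reads as about 6%.
naive = 1 / (1 / p_small - 1 / p_large)

# Correct approach: subtract the probabilities directly.
# 3% - 2% = 1 percentage point, i.e. 1 in 100.
correct = p_large - p_small

print(f"naive:   1 in {1 / naive:.1f} ({naive:.1%})")
print(f"correct: 1 in {1 / correct:.0f} ({correct:.1%})")
```

The denominator-subtraction rule inflates the gap here, and it blows up completely near 100%, where the denominators are nearly equal and their difference approaches zero.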