…look, at some point in life we have to try to triage our efforts and give up on what can’t be salvaged. There’s often a logistic curve for success probabilities, you know? The distances are measured in multiplicative odds, not additive percentage points. You can’t take a project like this and assume that by putting in some more hard work, you can increase the absolute chance of success by 10%. More like, the odds of this project’s failure versus success start out as 1,000,000:1, and if we’re very polite and navigate around Mr. Topaz’s sense that he is higher-status than us and manage to explain a few tips to him without ever sounding like we think we know something he doesn’t, we can quintuple his chances of success and send the odds to 200,000:1. Which is to say that in the world of percentage points, the chance of success goes from 0.0% to 0.0%. That’s one way to look at the “law of continued failure”.
If you had the kind of project where the fundamentals implied, say, a 15% chance of success, you’d then be on the right part of the logistic curve, and in that case it could make a lot of sense to hunt for ways to bump that up to a 30% or 80% chance.
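As an illustrative aside (not part of the quote), here is a minimal Python sketch of the odds arithmetic above; the helper function is mine, while the 1,000,000:1, 5x, and 15% figures come from the quoted text:

```python
# Illustrative sketch of multiplicative-odds updates on a logistic curve.
# Helper name and formatting are mine; the numbers come from the quote.

def prob_from_odds(success: float, failure: float) -> float:
    """Convert success:failure odds into a probability of success."""
    return success / (success + failure)

# Deep in the tail: quintupling the odds of success
# (1:1,000,000 -> 5:1,000,000) barely moves the percentage.
print(f"{prob_from_odds(1, 1_000_000):.4%} -> {prob_from_odds(5, 1_000_000):.4%}")
# 0.0001% -> 0.0005%

# Near the middle of the curve, the same multiplicative improvement is large:
# a 15% project (roughly 15:85 odds) quintupled lands near 47%.
print(f"{prob_from_odds(15, 85):.1%} -> {prob_from_odds(15 * 5, 85):.1%}")
# 15.0% -> 46.9%
```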
Capturing the point that, with a strong inside view, it’s not unreasonable to hold probabilities that look extreme to someone relying on outside-view and fuzzier reasoning. Strong Evidence is Common gets some of it, but there’s no nicely linkable doc to point someone to when they say “whoa, you have >95% / 99% / 99.99% p(doom)? That’s unjustifiable!”
Ideally the post would also capture how exchanging updates about the world by swapping gears is vastly more productive than swapping or averaging conclusion-probabilities, so the epistemically virtuous move is to speak from the world as you see it rather than from the mixture of other people’s black-box guesses you’d expect to win prediction markets.
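A toy illustration of that last point (my own numbers, assuming two observers with a shared 1:1 prior and independent pieces of evidence): pooling the underlying evidence extracts more than averaging the stated conclusions.

```python
# Toy example: averaging conclusion-probabilities vs. pooling evidence.
# Assumes the two likelihood ratios are independent; numbers are illustrative.

def prob_from_odds(odds_for: float, odds_against: float) -> float:
    return odds_for / (odds_for + odds_against)

# Two observers share a 1:1 prior; each sees independent evidence with a
# 9:1 likelihood ratio, so each individually reports 90%.
individual = prob_from_odds(9, 1)

# Averaging their conclusions gives back 90% -- the second piece of
# evidence is effectively thrown away.
averaged = (individual + individual) / 2

# Multiplying the likelihood ratios (1:1 prior * 9 * 9 = 81:1) uses both
# pieces of evidence and lands near 98.8%.
pooled = prob_from_odds(9 * 9, 1)

print(f"each: {individual:.1%}  averaged: {averaged:.1%}  pooled: {pooled:.1%}")
# each: 90.0%  averaged: 90.0%  pooled: 98.8%
```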
I’m not sure which logistic curve you mean?
Security Mindset and the Logistic Success Curve
Thanks to both you and Zack Stein-Perlman.
One thing I immediately note is “wow, the Logistic Success Curve jargon is particularly impenetrable”; I think this could use a normie-friendly name.