If you don’t use a precise method to arrive at your claim, you have no business making a precise claim. Remember significant figures from high school chemistry? Same principle.
That assumes that someone isn’t calibrated. If someone calibrates his intuition through frequent use of PredictionBook and by always thinking in terms of probability, he might be able to make precise claims without following a precise method.
If someone claimed a “1.21% chance of FAI success” by 2100, I would agree with you that the person didn’t learn the lesson about significant figures from high school chemistry. I don’t have that issue with someone claiming a 1% chance.
If you want to get calibrated, it’s also useful to start putting numbers on a lot of the likelihoods you think about, even if the precision is sometimes too high. It allows you to be wrong, and that’s good for learning.
I think it’s likely that calibration is domain-specific, so I’m not sure I buy this unless the calibration has occurred in the same domain, which is rare/impossible for the domains we’re talking about.
I think you can argue that the probability is inherently unknowable but I don’t see how a detailed process is much better than an intuitive process.
It’s very useful to have the mental ability to distinguish between 0.01, 0.001, and 0.0001 when thinking about XRisk events. I don’t think it’s good practice to call all of those events unlikely and to avoid making semantic distinctions between them.
But how do you arrive at them? Intuition doesn’t deal with 0.01 and 0.00001. Intuition deals with vague notions of likely and unlikely, which also change depending on what you ate for lunch and the phase of the moon. IOW, your intuition is useless to me unless I can confirm it myself. (But then it’s not intuition anymore.)
I think there are plenty of cases where I can give you an intuitive answer that won’t change from 0.01 to 0.00001 depending on what I ate for lunch.
The chance that I die in the next year is higher than 0.00001 but lower than 0.01.
If you don’t have an intuition that allows you to do so, I think it’s because you don’t have enough exposure to people making distinctions between 0.01 and 0.00001.
If there’s a 0.01 chance that something happens tomorrow, then if everything stays the same you’d expect that thing to happen about three or four times this year, whereas if it’s 0.00001 you’d be quite surprised if it ever happens (EDIT: during your lifetime, assuming no cryonics/antiagathics/uploads). (Of course with stuff like x-risk intuition will be much less reliable.)
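The arithmetic behind that intuition pump can be sketched quickly. This is a minimal illustration, assuming independent daily trials and an 80-year horizon (both my simplifying assumptions, not part of the original comment):

```python
# For an event with a fixed daily probability p, over n days:
#  - the expected number of occurrences is p * n
#  - the chance it ever happens is 1 - (1 - p)**n

def expected_and_ever(p_daily, days):
    expected = p_daily * days
    p_ever = 1 - (1 - p_daily) ** days
    return expected, p_ever

YEAR = 365
LIFETIME = 80 * 365  # assumed ~80-year horizon

e_year, _ = expected_and_ever(0.01, YEAR)
# e_year ~= 3.65: roughly three or four occurrences per year

e_life, p_life = expected_and_ever(0.00001, LIFETIME)
# e_life ~= 0.29, p_life ~= 0.25: more likely than not it never
# happens in a lifetime
```

So the two probabilities really do live in qualitatively different regimes: 0.01-per-day events are routine annual occurrences, while 0.00001-per-day events will probably never happen to you.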
And that’s a good assumption, since by my estimate, 99.9537% of people are not calibrated.
Part of LessWrong’s mission is moving toward a world where more people are calibrated. I don’t think it’s helpful to declare calibration a lost cause.