Thank you. Your explanation fits the “futurist/decision-maker” distinction, but I just don’t feel calling the decision-maker’s behavior “conservative” is appropriate. If your probability is already 10%, then treating it as 10% without adjustments is not worst-case thinking. It’s certainly not the (only) kind of conservatism that Eliezer’s quote talks about.
There is another perspective that can be called “conservative”, which observes that futurists’ predictions are commonly overdramatic and accordingly says that they should be moderated for the sake of accuracy.
This is the perspective I’m mostly interested in. And this is where I would like to see numbers that balance caution about being overdramatic with having a safety margin.