Currently most X-risk reduction resources are directed by a presumption that AGI is coming in less than a decade. I think this “consensus” is somewhat overconfident, and also somewhat unreal (i.e. it’s less of a consensus than it seems). That’s a very usual state of affairs, so I don’t want to be too melodramatic about it, but it still has concrete bad effects. I wish people would say “I don’t have additional clearly-expressible reasons to think AGI is coming very soon, that I’d defend in a debate, beyond the fact that everyone else seems to think so.” I also wish people would say “I mainly think AGI is coming soon because thoughtleaders Alice and Bob say so,” if that’s the case. Then I could critique Alice’s and/or Bob’s stated positions, rather than taking potshots at an amorphous, unaccountable ooze.
I’m a bit confused about whether that wish is actually good. I think I often run a heuristic that cuts against it… something like:
“When you act in accordance with a position and someone challenges you on it, it’s healthy for the ecosystem and culture to give the best arguments for it, and find out whether they hold up (i.e. whether the other person has good counterarguments). You don’t have to change your mind if you lose the argument—because often our reasons are illegible but accurate intuitions—but it’s good to help people figure out the state of the best arguments at the time.”
I guess this isn’t in conflict, if you just separately give the cause for your belief? e.g. “I believe it for cause A. But that’s kind of hard to discuss, so let me volunteer the best argument I can think of, B.”
Yes, absolutely—I would suggest giving both if feasible, and I think it usually would be feasible with practice, and with weaker social norms pressuring you to pretend to have already thought everything through for yourself.