It’s advice that you generally see from LessWrongers and rationality-adjacent people who are not actively working on technical alignment.
I don’t know if that’s true, but it might be. Either way, there is a lot of “stay realistic”-type advice circulating in these circles. I’d wager this advice does not generally come from a more lucid view of reality, but rather from (irrationally high) risk aversion.
To summarize this in one sentence: we need to be much more risk-tolerant and signalling-averse if we want a chance at solving the most important problems.