I think this is maybe implied by Aiyen’s comment, but to highlight an element here:
This way of thinking doesn’t have you trading sanity for slight x-risk decreases.
It has you trading sanity for *perceived* slight x-risk decreases.
If you start making those trades, over time your ability to perceive what’s real decays.
If there’s anything with anything whatsoever like agency that benefits from you feeding your sanity into a system like this, it’ll exploit you right into a Hell of your own making.
Any time you’re caught in a situation that asks for you to make a trade like this, it’s worth asking if you’ve somehow slipped into the thrall of an influence like this.
It’s way more common than you might think at first.
On the one hand, that’s literally true. On the other, the connotations feel dangerous to me. Existential risk is one of the worst possible things, and nearly anything is better than slightly increasing it. But we should be careful that this mindset doesn’t lead us into Pascal’s Muggings and/or burnout. We certainly aren’t likely to be able to fight existential risk if it drives us insane!
I strongly suspect that it’s not self-sacrificing researchers who will solve alignment and bring us safely through the current crisis, but ones who are able to address the situation calmly and without freaking out, even though freaking out seems potentially justified.
I’d rather go insane than “slightly increase” existential risk.