We run the risk of going extinct, and the irony is, we did it to ourselves. The ‘smarty pants’ brain that created advanced weapons, complex global economics [and more] is routinely bossed around by the brain that shoots from the hip, makes often terrible decisions, and reacts more to fear and greed than to reason....
No one in their right mind would deliberately create the means of their own extinction, but that’s what we seem to be doing. The only conclusion is that we’re not in our right minds...
I dislike this quote because it obscures the true nature of the dilemma, namely the tension between individual and collective action. Being “not in one’s right mind” is a red herring in this context. Each individual action can be perfectly sensible for the individual, while still leading to a socially terrible outcome.
The real problem is not that some genius invents nuclear weapons and then idiotically decides to incite global nuclear war, “shooting from the hip” to his own detriment. The real problem is that incentives can be aligned so that it is in everyone’s interest, at every step along the way, to do their part in their own ultimate destruction.
Of course, if “right mind” were defined to mean “socially optimal mind,” then fine, we aren’t in our right minds. But I don’t think that’s the default interpretation.
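The incentive structure described in this comment is the standard Prisoner's Dilemma. A minimal sketch, with payoff numbers that are purely illustrative (my own, not from the comment): each side in an arms race finds building to be the individually rational move against either opponent choice, yet mutual building is worse for both than mutual restraint.

```python
# Illustrative payoff table for a two-player "arms race" framed as a
# Prisoner's Dilemma. The numbers are hypothetical; only their ordering
# matters for the argument.
PAYOFFS = {  # (my_choice, their_choice) -> my payoff
    ("refrain", "refrain"): 3,  # mutual restraint
    ("refrain", "build"):   0,  # unilateral restraint: exploited
    ("build",   "refrain"): 4,  # unilateral buildup: advantage
    ("build",   "build"):   1,  # mutual buildup: costly and risky
}

def best_response(their_choice):
    """The individually rational choice against a fixed opponent move."""
    return max(("refrain", "build"),
               key=lambda mine: PAYOFFS[(mine, their_choice)])

# Building dominates: it is the best response to either opponent choice...
assert best_response("refrain") == "build"
assert best_response("build") == "build"

# ...yet mutual building leaves both sides worse off than mutual restraint.
assert PAYOFFS[("build", "build")] < PAYOFFS[("refrain", "refrain")]
```

Each step is individually sensible, which is exactly why "not in one's right mind" misdiagnoses the problem: the collectively bad outcome follows from locally rational play.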
If you’re consistently in your right mind, you can safely create the means of your own extinction, knowing that you are sufficiently sane not to use it to extinguish yourself. This can come in handy when the means of your own extinction has significant non-extinction-related uses.
This is true in theory, but do you think it’s an accurate description of our real world?
(Nuclear power is potentially great, but with a bit more patience and care, we could stretch our non-nuclear resources quite a bit further, which would have given us more time to build stable(r) political systems.)
No, I was responding to the “no one in their right mind” bit. It seems to me that when you are in your right mind is precisely the time to build artifacts that could destroy your civilization, and it doesn’t seem to me that you could conclude from building such artifacts that you are not in your right mind.
Rather, I think there’s other evidence that humanity can’t be trusted with e.g. nuclear weaponry, and this suggests that we should not build it. lukeprog’s quote seems to me to be of the form “Humanity can’t be trusted with nuclear weapons, yet builds them anyway, so it must be crazy, so it can’t be trusted with nuclear weapons.”
I think you set up a false dichotomy here—we can generate relatively safe nuclear power (e.g., thorium reactors) without existential risk, and without creating the byproducts needed to build nuclear weapons. This is not an argument against the root comment, however.
Sure, thorium reactors do not appear to immediately enable nuclear weapons—but the scientific and technological advances that lead to thorium reactors are definitely “dual-use”.
I’m not entirely convinced of either the feasibility or the ethics of the “physicists should never have told politicians how to build a nuke” argument that’s been made multiple times on LW (and in HPMOR). But the existence of thorium reactors doesn’t really constitute a valid argument against it: an industry capable of building thorium reactors is very likely able to think up, and eventually build, nukes.
Fallacy of composition? “We” didn’t create advanced weapons, for example—tiny fractions of “we” did. And if half of humanity nukes the other half to extinction, but not before the other half fires off the nukes that wipe out the first half, then is it really fair to say that “we” committed suicide? The outcome is the same, but you can’t begin to understand the problem by oversimplifying it.
K.C. Cole