Game theory will only help with people who aren’t crazy (if their utility function is the number of dead people, regardless of their own survival, then it is hard to concoct good deterrents).
Yeah, the risk of superpowered serial killers is larger than the risk of superpowered terrorists.
Perhaps—but /some/ sort of analysis, whatever it is called, seems important to try here. For example, looking around today, the number of people whose “utility function is the number of dead people” is very small compared to the number whose isn’t, implying that such people are the exception rather than the rule. That at least raises the hope that, if the causes of acquiring such a utility function can be understood, measures could be taken to prevent anyone from acquiring it in the first place.
Or maybe that is impossible—but it seems a problem worth investigating, no?
I think we are in agreement (with your above comment, not the entire OP) unless I misunderstand what you are saying.
Trying to stop such people from existing is different from trying to use game theory to deter such people. I am fully in favor of your suggestion, but I think it is naive to assume that we can deter everyone through things like mutually assured destruction.
I will note that what you suggest seems very difficult, but I don’t currently have any better ideas; the advantage of your suggestion is that even partial progress buys us time (decreasing the number of such people decreases the likelihood that one of them has the technical expertise to create powerful weapons).
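To make the deterrence point at the top of this thread concrete, here is a toy sketch (the payoff numbers are purely illustrative assumptions, not anything from the discussion): a credible retaliation threat flips the decision of an agent that values its own survival, but not of an agent whose utility is simply the number of dead people.

    # Toy illustration with made-up payoff numbers: why a retaliation
    # threat deters an agent that values its own survival, but not an
    # agent whose utility is simply the number of dead people.

    def utility(kills, survives, values_own_survival, survival_bonus=100):
        """Utility = body count, plus a bonus for surviving if the agent cares."""
        return kills + (survival_bonus if values_own_survival and survives else 0)

    # Outcomes under a credible retaliation threat:
    #   attack  -> some kills, but the attacker is killed in retaliation
    #   refrain -> no kills, and the attacker survives
    for cares in (True, False):
        attack = utility(kills=10, survives=False, values_own_survival=cares)
        refrain = utility(kills=0, survives=True, values_own_survival=cares)
        label = "values own survival" if cares else "only counts dead people"
        choice = "refrains" if refrain > attack else "attacks anyway"
        print(f"agent that {label}: attack={attack}, refrain={refrain} -> {choice}")

    # agent that values own survival:     attack=10, refrain=100 -> refrains
    # agent that only counts dead people: attack=10, refrain=0   -> attacks anyway

The threat’s entire cost falls on the survival term, which is exactly the term such an agent ignores, so the threat gives deterrence no traction.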