Both genes and minds place a strongly negative value on dying out. On that point, their values agree.
Minds value not dying out because dying out would mean that they could no longer pursue their "true values," not because not dying out is an end in itself. Imagine we were given a choice between:
A) The human race dies out.
B) The human race survives forever, but every human being, alive now or ever to be born, is tortured 24/7 by a sadistic AI.
Any sane person would choose A. That’s because in scenario B the human race, even though it survives, is unable to pursue any of its values, and is forced to pursue one of its major disvalues.
There is no point in the human race surviving if it can’t pursue its values.
I personally think the solution for the species is the same as it is for an individual: mix the pursuit of terminal and instrumental values. I do this every day, and I assume you do as well. I spend lots of time and effort making sure that I will survive and exist in the future. But I also take minor risks, such as driving a car, in order to lead a more fun and interesting life.
Carl’s proposal sounds pretty good to me. Yes, it has dangers, as you correctly pointed out. But some level of danger has to be accepted in order to live a worthwhile life.
There is no point in the human race surviving if it can’t pursue its values.
It’s likely not to be a binary decision. We may well be able to trade preserving values against a better chance of surviving at all. The more we deviate from universal instrumental values, the greater our chances of being wiped out by accidents or aliens. The more we adhere to universal instrumental values, the more of our own values get lost.
Since I see our values heavily overlapping with universal instrumental values, adopting them doesn’t seem too bad to me—while all our descendants being wiped out seems pretty negative—although also rather unlikely.
How to deal with this tradeoff is a controversial issue. However, it certainly isn’t obvious that we should struggle to preserve our human values—and resist adopting universal instrumental values. That runs a fairly clear risk of screwing up the future for all our descendants.
It’s likely not to be a binary decision. We may well be able to trade preserving values against a better chance of surviving at all. [...] How to deal with this tradeoff is a controversial issue. However, it certainly isn’t obvious that we should struggle to preserve our human values—and resist adopting universal instrumental values. That runs a fairly clear risk of screwing up the future for all our descendants.
If that’s the case, I don’t think we disagree about anything substantial. We probably just disagree about what percentage of resources should go to universal instrumental values (UIV) and what percentage should go to terminal values.
Since I see our values heavily overlapping with universal instrumental values, adopting them doesn’t seem too bad to me
You might be right to some extent. Human beings tend to place great terminal value on big, impressive achievements, and quickly colonizing the universe would certainly involve doing that.
If that’s the case, I don’t think we disagree about anything substantial. We probably just disagree about what percentage of resources should go to universal instrumental values (UIV) and what percentage should go to terminal values.
It’s a tricky and controversial issue. The cost of preserving our values looks fairly small—but any such expense diverts resources away from the task of surviving—and increases the risk of eternal oblivion. Those who are wedded to the idea of preserving their values will need to do some careful accounting on this issue, if they want the world to run such risks.
While the phrase “universal instrumental values” has the word “instrumental” in it, that’s just one way of thinking about them. You could also call them “nature’s values” or “god’s values”. You can contrast them with human values—but it isn’t really an “instrumental vs terminal” issue.