We should call those the Moral Convergent Values or some other fancy name.
These are the same as Universal Instrumental Values? Or is there a reason to think that something different would be valued?
Incidentally, value convergence could involve multiple attractors. There might be moral symmetry breaking. Treating value systems as natural attractors doesn't imply universal convergence on one set of values. This point seems to get lost in this post.
Tim, thanks for that commentary; it puts reading your book at the top of my leisure to-do list.
Yes, it could involve multiple attractors. I'm not sure which kind of symmetry you refer to, though. Do you mean some sort of radial symmetry, coming from everywhere else towards a unique set of values? Even in that case it would not be symmetric, because the acceleration (force) would differ from region to region, due to, for instance, the stuff in (Boyd, Richerson, and someone else, 2008).
About your main question: no, those are not the same as Universal Instrumental Values. Those who hold that claim would probably prefer to say something like: there are these two sets of convergent values. The instrumental ones, about which we don't care much more than Omohundro does. And the Convergent ones, which are named such because they converge despite not doing so for instrumental reasons.
Imagine the Lefties, who value driving on the left, and the Righties, who value driving on the right. Nature doesn't care much about this (metaphorically speaking, of course), but the Lefties and the Righties do. I would say that is an example of moral symmetry breaking. It may not be the greatest example (it is more likely that they actually care about not being killed), but I think it illustrates the general idea.
I suspect they are practically the same. Intelligent organisms probably won't deviate far from Universal Instrumental Values, for fear of meeting agents whose values more closely approximate them, and thus losing control of their entire future.
The Lefties and Righties are just a case of convention. If humans had three arms, two of them on the right, there might have been a fact of the matter about which arm preference makes things go better.
I think this fear of other agents taking over the world is reminiscent of in-group/out-group bias. To begin with, in the limit: if you value A, B, and C intrinsically but have to do D1, D2, and D3 instrumentally, you may initially think of doing D1, D2, and D3. But what use would it be to fill up your future with that instrumental stuff if you nearly never get A, B, and C? You'd become just one more stupid replicator fighting for resources. You'd be better off doing nothing and hoping that, by luck, A, B, and C were being instantiated by someone less instrumental than yourself.
Sure, but there are cases where rivals are evenly matched. Lions and tigers, for instance, have different—often conflicting—aims. However, it isn’t a walk-over for one team. Of course, you could say whether the lion or tiger genes win is “just a convention”—but to the lions and tigers, it really matters.
To begin with, in the limit: if you value A, B, and C intrinsically but have to do D1, D2, and D3 instrumentally, you may initially think of doing D1, D2, and D3. But what use would it be to fill up your future with that instrumental stuff if you nearly never get A, B, and C?
No use. However, our values are not that far from Universal Instrumental Values—because we were built by a process involving a lot of natural selection.
Our choice is more like: do we give up a few of the things we value now—or run the risk of losing many more of them in the future. That leads to the question of how big the risk is—and that turns out to be a tricky issue.
Agreed. That tricky issue, I suspect, might have enormous consequences if reason ends up hijacked by in-group/out-group biases, and the surviving memes end up being those that make us more instrumental, for fear of someone else doing the same.
I expect that the force that will eventually promote natural values most strongly will be the prospect of encountering unknown aliens. As you say, the stakes are high. If we choose incorrectly, much of our distinctiveness could be permanently obliterated.
Sorry, in your terminology I should have said "reproductor"? I forgot your substitute for "replicator"…
Replicator, reproductor, I can cope either way. It seems to be mostly critics who get into a muddle over this issue—though of course, we should try not to confuse people with misleading terminology.