It’s not clear to me what you mean by value. To say that something has value is to say that it is more valuable than other things.
This is why, at the end of your progression, valuing everything becomes equivalent to valuing nothing.
This is true for all definitions. If there is nothing that is not valuable, then the term “value” becomes semantically empty.
This has nothing inherently to do with altruism. Every agent makes value judgments, and value, rather than being treated as a binary, is typically treated as a real number. The agent is thus free to choose between any number of futures, and the infinitude of the real numbers ensures that those futures remain distinct, so Armstrong 7 should never be in turmoil. Additionally, this is more or less how humans operate now. A pretty rock or an animal may be valuable, but no one is confused as to whether that means they are equivalent in worth to a human.
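The point about real-valued value can be made concrete with a minimal sketch (the names and the toy utilities here are purely illustrative, not from the original discussion): an agent that scores futures with real numbers just picks the highest-scoring one, and distinct scores mean it is never paralyzed.

```python
# A minimal sketch of value as a real number rather than a binary:
# the agent assigns each candidate a real-valued utility and simply
# picks the maximum, so it is never left "in turmoil" between options.

def choose_future(futures, value):
    """Return the option with the highest real-valued utility."""
    return max(futures, key=value)

# Toy utilities: everything has *some* value, but not equal value.
utilities = {"human": 100.0, "animal": 5.0, "pretty rock": 0.5}

best = choose_future(utilities, utilities.get)
print(best)  # -> human
```

Nothing here requires value to be binary: the rock and the animal both get nonzero scores, yet the comparison remains unambiguous.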
Interestingly, you use and then deconstruct the binaries of sentient/nonsentient, living/nonliving, etc., but you don’t apply that same tool to the dichotomy of altruistic-toward/not-altruistic-toward.
It’s not clear to me what you mean by value. To say that something has value is to say that it is more valuable than other things. This is why, at the end of your progression, valuing everything becomes equivalent to valuing nothing.
Yes, that’s another attractor, to my mind. Stuart 7 doesn’t value everything, though; he values objects/beings, and dislikes the destruction of these. That’s why he still has preferences.
But the example was purely illustrative of the general idea.
I’m still not clear what constitutes an object/being and what does not. Is a proton an object?
Fundamentally, I think you’re having an understandable difficulty applying a binary classification system (value/not value) to a system that is really continuous. The continuity of value, where things are valuable based on their degree of sentience or degree of life, which I outlined above, resolves this to some extent.
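The continuous view can be sketched in a few lines (the weights and the example degrees below are illustrative assumptions, not anything from the original exchange): value becomes a function of continuous degrees rather than a yes/no label.

```python
# A sketch of continuous value: worth is a function of degree of
# sentience and degree of life (each in [0, 1]), with illustrative
# weights, instead of a binary value/not-value classification.

def value(degree_of_sentience, degree_of_life,
          sentience_weight=10.0, life_weight=1.0):
    """Real-valued worth as a weighted sum of two continuous degrees."""
    return sentience_weight * degree_of_sentience + life_weight * degree_of_life

# Many things can have nonzero value without being equal in worth:
print(value(1.0, 1.0))  # e.g. a human  -> 11.0
print(value(0.1, 1.0))  # e.g. a mouse  -> 2.0
print(value(0.0, 0.0))  # e.g. a proton -> 0.0
```

On this picture the proton question answers itself: it simply sits at (or near) zero on both axes, with no need to decide whether it is "an object" in the binary sense.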
I still don’t see how this is fundamentally about altruism. Altruism, loosely defined, is a value system that does not privilege the self over similar beings, but except for very extended definitions of self, that’s not what is going on in your example at all. The reason I bring this up is that the difficulty you pose is a difficulty we deal with every day. Your agent is suffering from choosing between many possible futures which all contain some things he/she/it values, such that choosing some of those things sacrifices other “valuable” things. I fail to see how this is substantially different from any trip I make to the grocery store. Your concern about animals preying on other animals (A and B are mutually exclusive) seems directly analogous to my decision to buy either name-brand Fruit Loops or store-brand Color Circles. Both my money and my preference for Fruit Loops have value, but I have no difficulty deciding that one is more valuable than the other, and I certainly don’t give up and burn the store down rather than make a decision.
Valuing everything means you want to go as far from nothingness as you can get. You value more types being instantiated over fewer types being instantiated.