I just expected people to expect me to understand basic SIAI arguments like “value is fragile” and “there’s no moral equivalent of a ghost in the machine” et cetera.
“Basic SIAI arguments like ‘value is fragile’”…? You mean this...?
The post starts out with:
If I had to pick a single statement that relies on more Overcoming Bias content I’ve written than any other, that statement would be:
Any Future not shaped by a goal system with detailed reliable inheritance from human morals and metamorals, will contain almost nothing of worth.
...it says it isn’t basic—and it also seems pretty bizarre.
For instance, what about the martians? I think they would find worth in a martian future.
Yeah, and paperclippers would find worth in a future full of paperclips, and pebblesorters would find worth in a future full of prime-numbered heaps of pebbles. Fuck ’em.
If the martians are persons and they are doing anything interesting with their civilization, or even if they’re just not harming us, then we’ll keep them around. “Human values” doesn’t mean “valuing only humans”. Humans are capable of valuing all sorts of non-human things.