disgust at utility maximization
I kind of find the entire notion of utility maximization in the Yudkowskian worldview to be gross: essentially trying to turn the raw material of the entire universe into stuff that just tickles your primitive drives. It’s bad when a wrapper mind does it, and it’s bad when anything else does it. Fun Theory refutes itself.
But at the same time, I don’t believe in liberalism: I don’t think resources should be allocated to other beings just because, or for the sake of preserving individual liberty. I see other humans as paperclippers. That is equally abhorrent to me; in fact, they’re the same thing.
Either way, it’s the same gross utility-maximization thing. A transhuman utopia would just be some implementation of Fun Theory, and it would convert the universe into stuff that tickles the primitive drives of humans.
There’s also something gross about a being that… acts desperately and selfishly for the sake of a utility function that has meaning only to it. “That last human child MUST be converted into paperclips. This is of the utmost importance.”
You wouldn’t wirehead an ant, so why would you try to give a human everything they want? It’s equally distasteful, in a way.