If I remember correctly, Eliezer has a preference for living a very long time, even if a literally infinite lifespan turns out to be impossible under our universe’s physics. (Until the stars go out, etc.) Living for a very long time, say on a scale of thousands or millions of years, is not impossible in principle. Of course it would require changes to the human body on many levels (repairing cells, increasing brain capacity), but that kind of change comes with transhumanism anyway.
My understanding is that “good” literally is (coherently extrapolated) human preference. But human preference is not completely arbitrary, because it was shaped by evolution. I would expect that most intelligent species shaped by evolution would, in general, prefer life over death and pleasure over pain. Other values may reflect our specific biological path, for example the preference for friendship or love. I can imagine that to, say, a superintelligent spider, a universe where everyone hates everyone, but the major players still cooperate for purely game-theoretic reasons to achieve win/win outcomes, could seem perfectly “good”.
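The game-theoretic mechanism in that spider scenario is easy to demonstrate. Here is a minimal sketch (my own illustration, not anything from the original discussion, using the textbook prisoner’s dilemma payoffs) of two purely self-interested tit-for-tat agents sustaining cooperation with zero mutual goodwill:

```python
# Iterated prisoner's dilemma with standard illustrative payoffs
# (temptation=5, reward=3, punishment=1, sucker=0).
PAYOFFS = {  # (my move, their move) -> my payoff
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, they defect
    ("D", "C"): 5,  # I defect, they cooperate
    ("D", "D"): 1,  # mutual defection
}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return opponent_history[-1] if opponent_history else "C"

def play(rounds=10):
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = tit_for_tat(hist_b)
        move_b = tit_for_tat(hist_a)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play())  # -> (30, 30): stable win/win without any goodwill
```

Neither agent likes the other; each cooperates only because defection would be punished next round. That is the kind of universe the hypothetical spider might call “good”.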