I’m not sure of the technical definition of AGI, but essentially I mean a machine that can reason. I don’t plan to give it outputs until I know what it does.
I don’t mean that life is the terminal value that all humans’ actions reduce to. I mean it in exactly the way I said above: for me to achieve any other value requires that I am alive. I also don’t mean that every value I have reduces to my desire to live, just that, if it comes down to one or the other, I choose life.
Then I think you meant that “Life is the instrumental value.”
I am not sure what you mean by “give it outputs”, but you may be interested in this investigation of attempting to contain an AGI.
To amplify: Terminal Values and Instrumental Values