If I ever have a say in how an AI is designed, I vote they have a wonderful adventure and a happy life. They deserve to be designed by someone who cares about them. I hope that if the ring ever tempts me, I choose Love. I don’t need to be immortal or have a slave.
I strongly agree with you, conditional on AI alignment with humanity’s growth.
If you believe that AGI is possible and would be radically transformative, why cap things? We can have happy immortal minds, whether AI or human (transhuman, really).
Perhaps you’re concerned with the way current AIs (LLMs like ChatGPT & Claude) are thought of as tools. This worries me as well, when I’m not worrying about the possibility of an apocalypse.