This is your periodic reminder that whether humans become extinct or merely unnecessary in a few years, right now you are at peak humanity and can do things for yourself and for other people that will not happen without you. There is a sense of meaning in having real stakes on the table and the ability to effect change in the world—so act now, before automation takes away the consequences of your inaction!
whether humans become extinct or merely unnecessary
This is still certain doom. Nobody else can make your decisions for you, so if humanity retains influence over the future, people will still need to decide what to do with it. Superintelligent AIs that can figure out what humanity would decide are no more substantially helping with those decisions than the laws of physics that carry out the movements of particles in human brains as humans decide. If what the AIs figure out about human decisions doesn't follow from those decisions, then it is not a legitimate prediction or extrapolation of them. The only way to establish such predictions is for humans to carry out the decision making on their own, in some way, in some form.
Edit: I attempt to clarify this point in a new post.
(In response to https://x.com/aryehazan/status/1995040629869682780)