The key intuition about the future might be simply that humans being around is an incredibly weird state of affairs. We shouldn’t expect it to continue by default.
I mean, yes this seems right. In which case, taking it as a premise that this weird state doesn’t last long, it follows that there’s no point trying to plan for a future where human-like things continue to exist. BUT: from where we stand right now, we do actually have some control over whether everybody dies and nothing human-like continues into the future. The simplest plan to avoid extinction by AI is “don’t build the thing that kills us”, but there are more sophisticated options too. As unlikely as it was for such a situation to arise in the first place, as weird as it is to be here, here we are. And we can try to aim, from here, for a future state that is vanishingly unlikely to happen by chance or by default, such as “not human extinction”.
I think the speculation about owning galaxies starts from the assumption that we succeeded in aiming the future in such a direction. And although that assumption may not be what actually happens, it would be unfortunate to get to that future state and then not have thought through what to do next because we didn’t think it was likely so we never planned for the possibility.
The whole thing people are doing when they talk about good futures and how to get there is trying to chart a path toward an unlikely future, one that is emphatically not the default outcome and will not happen unless humans try to make it happen.