Yep, agree that coverage is currently biased towards very short timelines. I think this makes sense, in that the worlds where things happen very soon are the worlds that, from the perspective of a reasonable humanity, require action now.[1]
I think despite the reasonable decision-making justification for focusing on shorter-timeline worlds, I do expect this to overall cause a bunch of people to walk away with the impression that people confidently predicted short timelines, and this in turn will cause a bunch of social conflict and unfortunate social accounting in most worlds.
On the margin, I would be excited to collaborate with people who want to do things similar to AI 2027 or Situational Awareness for longer timelines.
I.e., insofar as you model the government as making reasonable risk-tradeoffs in the future, the short-timeline worlds are the ones that require intervention now to change decision-making.
I am personally more pessimistic about humanity doing reasonable things, and think we might just want to grieve over short-timeline worlds, but I sure don't feel comfortable telling other people not to ring the alarm bell on potentially very large risks happening very soon, which seem plausible enough to me that they should absolutely be among the top considerations for most decision-makers out there.
Even if it does make sense strategically to put more attention on shorter timelines, that sure does not seem to be what actually drives the memetic advantage of short forecasts over long ones. If you want your attention steered in strategically reasonable ways, you should probably first fully discount for the apparent memetic biases, and only then decide how much it is reasonable to re-upweight short forecasts. Whatever bias the memetic advantage yields is unlikely to be the right bias, or even the right order of magnitude of relative attention bias.
I mean, I am not even sure it’s strategic given my other beliefs, and I was indeed saying that on the margin more longer-timeline coverage is worth it, so I think we agree.
What's the longest timeline you would still consider a short timeline by your own metric, and therefore a world "we might just want to grieve over"? I ask because in your original comment you mentioned 2037 as a reasonably short timeline, and personally, if we had an extra decade, I'd be a lot less worried.
About 15 years, I think?
Edit: Oops, I responded to the first part of your question, not the second. My guess is that worlds with timelines under 5 years seem really hard, though we should still try. I think there is lots of hope in the 5-15 year timeline worlds. 15 years is just roughly the threshold past which I would stop considering someone's timelines "short", as a category.
I admit, it’s pretty disheartening to hear that, even if we had until 2040 (which seems less and less likely to me anyway), you’d still think there’s not much we could do but grieve in advance.