I appreciate you saying that the 25th percentile timeline might be more important. I think that’s right and underappreciated.
One of your recent (excellent) posts also made me notice that AGI timelines probably aren’t normally distributed. At this point, breakthroughs, other large turns of events, or large theoretical misunderstandings probably play the dominant role, and only a very few of those will actually hit. The many small unpredictable events that produce normal distributions will play a lesser role.
I don’t know how you’d characterize that mathematically, but I don’t think it’s right to assume it’s normally distributed, or even close.
Back to your comment on the 25th percentile being important: I think there’s a common error where people round to the median and then think “ok, that’s probably when we need to have alignment/strategy figured out.” You’d really want to have it at least somewhat ready far earlier.
That’s both in case it’s on the earlier side of the predicted distribution, and because alignment theory and practice need to be ready far enough in advance of game time to have diffused and be implemented for the first takeover-capable model.
I’ve been thinking of writing a post called something like “why are so few people frantic about alignment?” making those points. Stated timeline distributions don’t seem to match the mood, IMO, and I’m trying to figure out why. I realize that part of it is a very reasonable “we’ll figure it out when/if we get there.” And perhaps others share my emotional dissociation from my intellectual expectations. But maybe we should all be a bit more frantic. I’d like some more half-assed alignment solutions in play and under discussion right now. The 80/20 rule probably applies here.