Yet almost everyone agrees the world will likely be importantly different by the time advanced AGI arrives.
Why do you think this? My default assumption is generally that the world won’t be super different from how it looks today in strategically relevant ways. (Maybe it will be, but I don’t see a strong reason to assume that, though I strongly endorse thinking about big possible changes!)
Maybe I was overconfident here. I was generalizing from the sample of people I’d talked to. Also, as you’ll see by reading the entries on the list, I have a somewhat low bar for strategic relevance.