I agree with most of the individual arguments you make, but this post still gives me “Feynman vibes.” I generally think there should be a stronger prior on things staying the same for longer. I also think the distribution of how AGI goes is so absurd that it's hard to reason about things like expectations for humans (you acknowledge this in the post).