Undergraduate student in math and computer science at McGill and Mila. Optimizing for the flourishing of all sentient life regardless of substrate. Aspiring philosopher.
george_adams
I agree with most of the individual arguments you make, but this post still gives me “Feynman vibes.” I generally think there should be a stronger prior on things staying the same for longer. I also think the distribution of how AGI could go is so absurd that it’s hard to reason about things like expectations for humans. (You acknowledge this in the post.)
I agree with most of what’s said here, but not with the conclusion. There is a massive chunk of the human (typically male) psyche that will risk death or major consequences in exchange for increased social status. Think of basically any war. A specific example is kamikaze pilots in WWII, who flew suicide missions for the good of the nation. The pilots were operating within a value system that rewarded individual sacrifice for the greater mission. The creators of AGI will keep gaining social status (and competition, thanks to Moloch) right up to the point of AGI ruin.
(Also, a minor point: some accelerationists are proudly anti-speciesist and don’t care about the wellbeing of humans.)
Where are the GPUs (mostly TPUs, in Google’s case)? I figured these would be a bigger share, given the capex Google, MSFT, etc. are pouring into building enormous clusters.