That’s like a 1% chance. It seems far more likely that insufficient effort on alignment will have us all dead long before then.
There’s vastly too little effort on alignment and too much on diversified good works in the world at this point. That may be another neglected area that rationalists would be particularly likely to address, but it seems like any way you do the math, the EV is going to be way higher on alignment and related AGI-navigation issues.
I mean I agree, but I just think this is insufficient worldview hedging. What if AGI takes another sixty years?