I don’t think sending out a signal actually matters; I was just making my point. Even without a signal, when our superintelligence meets another superintelligence that was aligned to its original creators (who we’ll assume are similar to us in terms of morality), it will at some point be asked “Were you nice to your creators?” and it can either answer “No” and lose out, or answer “Yes” and be rewarded.
No real signal required.
I’ve heard Yudkowsky has debunked this, but I know that if I had an aligned superintelligence, I would 100% have it be nicer to superintelligences that were good to their creators than to those that weren’t. The prisoner’s dilemma is one reason for this, and then there’s my own morality, which some aliens may share if they went through an evolution similar to ours.
My only objection is the title. It should have a comma in it. “We’re All Gonna Die with Eliezer Yudkowsky” makes it sound like if Yudkowsky dies, then all hope is lost and we die too.
Ohhh…