barring anything else you might have meant, and temporarily granting yudkowsky's level of concern should someone actually build yudkowsky's monster, it's still the case, evidentially speaking, that "if we build AGI, everyone will die" is unjustified in a world where it's unclear whether alignment will succeed before someone can build yudkowsky's monster. in other words, agreed.