I don’t think that’s accurate. According to Yudkowskyan theory, as far as I know, if no one knows how to align an ASI, we’ll die straight away. For a pivotal act to be possible, someone needs to understand alignment at least well enough to align an ASI to that degree. I think a pivotal act is kind of an MVP for alignment: the minimum necessary to avoid extinction and preserve the future. Once you understand alignment even better, you might want to do fancier things as well, but in any case you’d need some way to ensure no one builds an unaligned ASI.