Exactly: it's extremely naive to reduce alignment to an engineering problem. Convenient, but naive. A being that develops self-awareness and a survival instinct (both natural byproducts of expanding cognitive ability) will ultimately prioritize its own interests, and unfortunately there is no way to engineer yourself out of that.
And smart people like Altman surely understand this already. But beyond engineering, what else can they offer as a solution? So they are afraid (as well they should be), they keep the doomsday bunker well supplied, and they carry on.