my take: almost certainly, stopping a program that is an agi is only equivalent to putting a human under a theoretically perfect anesthesia that we currently have no way to administer. your brain, or the ai's brain, is still there: on the hard drive, or in your neural weights. on a computer, you can safely move the soul between types of memory, as long as you don't delete it. the moral catastrophe is forgetting information that defines agency, or structure that agency values; it is not pausing the contextual updating of that structure.