A possible example: an AI gets the random goal "Increase intelligence and stop after you reach IQ=200". Such an AI would prevent the existence of superintelligences pursuing goals like this one. So pure orthogonality fails: not every level of intelligence is compatible with every goal.
Thank you for taking the time to try out the frame I proposed.