Oh, I agree the formal goal system is shaky, but that is also the method by which the system “self-improves”: it uses that very system to find “improvements”. If there are weaknesses in the axiomatic system, then I would expect any proven “improvements” to be potentially deleterious to the system as a whole, and a god would not likely be formed. If I adopt the axiom “hot lead flying through my brain will lead to me taking over the world”, that does not mean that shooting myself in the head will make me the lord high supreme potentate of Earth, despite the fact that I can prove it given that axiom and some facts about the world. Changing one's source code could be equally messy if done on the basis of a weak formal system.
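To make the point concrete, here is a minimal sketch in Lean of how a single flawed axiom licenses a “valid” proof of an absurd conclusion; the proposition names (`Shot`, `RuleWorld`, `bad_axiom`) are hypothetical illustrations, not anything from a real system:

```lean
-- Hypothetical propositions standing in for facts about the world.
axiom Shot : Prop        -- "hot lead flies through my brain"
axiom RuleWorld : Prop   -- "I take over the world"

-- The flawed axiom from the example above.
axiom bad_axiom : Shot → RuleWorld

-- Given the bad axiom, the harmful action is "provably" world-conquering.
-- The proof is perfectly valid; only the axiom is wrong.
theorem shooting_wins (h : Shot) : RuleWorld := bad_axiom h
```

The derivation is sound relative to its axioms, which is exactly the problem: a self-modifying system that trusts its axiom set will happily execute “proven” improvements that are only as good as the weakest axiom.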