That backs my theory that anything which is strong enough to do good is strong enough to do harm.
I think that is related to the theory of why idiot-proofing is misguided. If you want to make something completely idiot-proof, you have to make it impossible to make a bad decision, which, in practice, means taking away the ability to make any decisions at all—meaning that anything idiot-proof is also pretty much guaranteed to be completely useless. If something is powerful enough to do good, it has to be powerful enough to change something, and, as in the case of idiot-proofing, it’s really, really hard to prevent every possible bad change without preventing all change whatsoever.
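To make the trade-off concrete, here is a toy sketch (purely illustrative; both functions are hypothetical and not drawn from anything above): the version that can do something useful can also do something harmful, and the version that can't be misused can't be used either.

    def flexible_transfer(amount: float, destination: str) -> str:
        """Can move money anywhere -- powerful enough to help, and to harm."""
        return f"Sent {amount:.2f} to {destination}"

    def idiot_proof_transfer() -> str:
        """No amount, no destination, no way to misuse it -- and no way to use it."""
        return "Nothing happened (nothing could go wrong)."

    print(flexible_transfer(25.0, "savings"))  # useful, but a typo could send it to the wrong place
    print(idiot_proof_transfer())              # perfectly safe, perfectly useless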
Good theory, but I also quite like the more traditional theory:
Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning.
Do you like it, or believe it?
Mostly like it for comedy value, but I think there is an element of truth.
I would agree, on reflection.
Edit: I am curious if we see the same element, however. It seems to me that that element is aptly summarized as “writing a program that cannot fail spectacularly when used by someone who doesn’t understand it is a tremendous challenge—one which is necessary to face, but one which has stood against the combined best efforts of at least a generation of programmers.”