Let me know if you do not find this argument convincing, and I will expand on it somewhat.
Lara, Eliezer-1995 is pre-Bayesian-enlightenment so he wouldn’t have spent a lot of time talking about “utility”. But yes, intelligence was a terminal value to him.
Dear Scott Aaronson:
Pffft.
I found Dr Aaronson’s comment to be wise and witty, whereas Mr Yudkowsky’s response was... pfffffft.