when you say ‘smart person’ do you mean someone who knows the orthogonality thesis or not? if not, shouldn’t that be the priority and therefore statement 1, instead of ‘hey maybe ai can self-improve someday’?
here’s a shorter ver:
“the first AIs smarter than the sum total of the human race will probably be programmed to make the majority of humanity suffer because that’s an acceptable side effect of corporate greed, and we’re getting pretty close to making an AI smarter than the sum total of the human race”