“I bring this up because Eliezer seems to be proposing large social undertakings [...] which would seem to me to come at an economic cost. If a pill or a treatment is cheaper and accomplishes the same outcome...”
If you’re referring to FAI development, then that’s not the argument at all. The argument is that most people working on AI don’t seem too concerned about the F part, which is very dangerous, so focusing on the F part needs to be the top priority. If you’re referring to something else, then I have no clue what you mean.