Or “worrying about this possibility would be a poor use of resources, what with the incredible urgency of creating AI before humanity wipes itself out—you’ve got to go with what you have”, this being uttered by people who just basically aren’t interested in the problem.
I think this is unfair. When I first encountered the problem of FAI I reasoned the same way, and I was very interested in the problem. I now know that argument has huge flaws, but my making it wasn't caused by a lack of interest.