If we build a powerful AI, it is likely to come to hate us and want to kill us, like in Terminator.
In Terminator, the AI is given the goal of protecting itself, and it kills everyone as an instrumental step toward that goal.
And in any case, taking a wrong idea from popular culture and trying to make a more plausible variation of it is not exactly unique or uncommon behaviour. What I am seeing is that a popular notion is likely to spawn and reinforce similar notions; what you seem to be claiming is that a popular notion is likely to somehow suppress similar notions, and I see no evidence in support of that claim.
As for arguments about humans in general: they apply to everyone, and if anything they undermine the position of outliers even more.
edit: also, if you have to strawman a Hollywood blockbuster to make the point that the brightest people fail to understand something… I think it’s time to seriously rethink your position.