Nature incentivizes behavior by making it feel good. Yes, humans (and lobsters, etc.) have a dominance instinct (they feel good when they are powerful and bad when they are oppressed), and AI does not.
But that is unrelated to whether an AI will seek power instrumentally, as the most likely means of achieving its goals.
As an analogy: humans do not have an instinct to poison ants they find in their homes. Most people probably do not even derive pleasure from doing so; more likely they are annoyed that they had to deal with the problem in the first place. And yet humans do this quite effectively. The concern is that the same might be true of AI.
I think even most humans don’t have a “dominance” instinct. The reasons we want to gain money and power are also mostly instrumental: we want to achieve other goals (e.g., as a CEO, getting ahead of a competitor to increase shareholder value and do a “good job”), impress our neighbors, be admired and loved by others, live in luxury, distract ourselves from other problems like getting older, etc. There are certainly people who want to dominate just for the feeling of it, but I think that explains only a small part of actual dominance behavior in humans. I myself have been the CEO of several companies, but I never wanted to “dominate” anyone. I wanted to do what I saw as a “good job” at the time, achieving the goals I had promised our shareholders I would try to achieve.