There are also better arguments, like
“We wouldn’t build a god AI and put it in charge of the world”
“We would make some sort of attempt at installing safety overrides”
“Tool AI is safer and easier, and easier to make safe, and wouldn’t need goals to be aligned with ours”
“We’ll be making ourselves smarter in parallel”