Engineers don’t need to build the same bridge 100 times to ensure that it will be safe, so long as the blueprints are sufficient.
That’s what the argument that “AIs are grown, not made” counters.
It’s easier to know what an AI can or can’t do than it is to build the AI.
Not if it’s intentionally general.
AGI isn’t the most efficient way to solve specialized problems, and narrower AIs can often deliver results that are good enough for most industries.
Yes, that’s why we are getting AI to write code instead of doing the job directly.
Companies don’t want to make AI more powerful so much as they want to make it more useful to humans, and it’s often more efficient to invest resources into understanding than into abstract “power.”
I almost agree. More power is saleable, but only if it’s controllable.