By “AI that does what we need AI to do” I meant “AI that can specifically do the things people ask of it” (as opposed to an AI that has already mastered every possible skill as preparation).
Perhaps I should have been clearer and more modest in my description of the target… Like I think that N LLMs that each know one subject/skill are 95% as useful as one LLM that knows all N subjects, and the N-LLMs setup is 20% as dangerous. But I come empty-handed in terms of real evidence; mixture-of-experts models still being SOTA is some weak evidence.
I have nothing against tools, but for many desired outcomes, I don’t want tools, I want someone who knows the tools and can do the work.
Same here, but I end up going for the tool over the human more often than not, because I can see and trust the tool. If I had a magic shapeshifting tool, I think I would have little need for the general agent.