I think that’s true, but the addition of LLMs at their current level of capability has added some new dynamics, resulting in a lot of people believing they have a breakthrough who previously wouldn’t. For people who aren’t intimately familiar with the failure modes of LLMs, it’s easy to believe them when they say your work is correct and important — after all, they’re clearly very knowledgeable about science. And of course, confirmation bias makes that much easier to fall for. Add to that the tendency for LLMs to be sycophantic, and it’s a recipe for a greatly increased number of people (wild guess: maybe an order of magnitude more?) believing they’ve got a breakthrough.