I think the blanket moral panic is a bit unfounded. My ratio of LLM to search engine use is about 4:1, and a lot of the time LLMs just explain things faster. I'd likely land on the same answer after searching three or four sites anyway, because I mostly already know what kind of information I'm looking for. If using an LLM the way I do is damaging, then using Google on a regular basis would be similarly damaging, and I don't think it is.
I also think the unhealthy use cases of LLMs don't scale as uniformly as people might expect, at least not currently. There was recent research from METR showing that experienced programmers actually lose time when using LLMs.