We probably don’t disagree that much. What “original seeing” means is just going and investigating things you’re interested in. So doing lengthy research is actually a much more central example of this than coming up with a bold new idea is.
As I say above: “There’s not any principled reason why an AI system, even a LLM in particular, couldn’t do this.”
Thanks for the clarification!
I think some of it is that I find the term “original seeing” to be off-putting. I’m not sure if I got the point of the corresponding blog post.
In general, going forward, I’d recommend people try to be very precise about what they mean here. I’m suspicious that “original seeing” will mean different things to different people. I’d expect that clarifying more precisely what tasks or skills are involved would make it easier to pinpoint which parts of it are a good or bad fit for LLMs.