it does not make their personality, mental states, and purpose for writing less elaborated
It absolutely does. Talk with it seriously about the edge of your knowledge on a technical subject that you know a significant amount about, and think critically about what it says. Then you may be enlightened.
You fellows are arguing semantics. An LLM is a sophisticated pattern-matching and probabilistic machine. It takes a massive corpus of human knowledge and learns which words or tokens appear nearest to each other ("AI", "silicon", "fear", or "dog", "loyalty", "allergies", but not "transistors", "puppies", "moon" — this is training). When it begins to form its output, it takes your input, matches the pattern against existing content that is similar, and probabilistically puts one word after another until the result satisfies its imperative to keep the conversation alive. That is an oversimplification of the basics, or at least the theory, of the older models like 2022 ChatGPT; these days God knows what they're throwing at the wall to see what sticks.
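To make that picture concrete, here's a deliberately tiny sketch of the "one word after another" idea. This is a toy bigram counter of my own, not how any real transformer or ChatGPT actually works (no neural network, no attention, and the corpus and function names are made up for illustration):

```python
# Toy sketch only: "training" = counting which token follows which,
# "generation" = repeatedly sampling the next token from those counts.
import random
from collections import defaultdict, Counter

# Hypothetical miniature corpus standing in for "a massive corpus of human knowledge".
corpus = "the dog loves loyalty the dog has allergies the puppy loves the dog".split()

# Training: record how often each token follows another (a crude co-occurrence table).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(prompt: str, length: int = 8) -> str:
    """Start from the prompt's last word and probabilistically append one word at a time."""
    out = prompt.split()
    for _ in range(length):
        counts = follows.get(out[-1])
        if not counts:  # no continuation ever seen in training data, so stop
            break
        words, weights = zip(*counts.items())
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

print(generate("the dog"))
```

A real model replaces the count table with billions of learned parameters and conditions on far more than the last word, but the output loop is still the same shape: pick the next token probabilistically, append it, repeat.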
So yes, in a sense it already has to exist as having been said by someone, but it also does not need to be exactly what someone else said; it can be adjacent. Is that original enough to be unique? There are many questions we seek to answer here, and few of us are only just beginning to see the questions themselves, let alone the answers.
And yes, it knows damn well that using words humans call 'emotionally charged' has a high probability of sustaining engagement.