I might save this as a scathing, totally unintentional pan of the hard problem of consciousness:
Ultimately, it’s called the Hard Problem of Consciousness, not because it is simply difficult, but because it is capital-H Hard in a way not dissimilar to how NP-Hard problems may be literally undecidable.
That’s actually misleading of me to pick on, because I thought that most of the section on consciousness was actually a highlight of the article. It’s because I read it with more interest that I noticed little oversights like the above.
A part of me wonders if an LLM said it was really deep.
I feel for this person, I really do. Anthropomorphizing LLMs is easy and often useful. But you can’t just ditch technical reasoning about how AI works for this kind of… ecstatic social-reasoning-based futurism. Or, you can, you’re just going to be wrong.
And please, if you run your theory of anything by an LLM and it tells you it’s genius, it’s important to remember that there’s currently a minor problem (or so I remember a moderator saying, take this with a grain of salt) on the physics subreddit with crank Theory of Everything submissions that got ChatGPT to “fill in the details,” and they must be right because ChatGPT made such a nice argument for why the idea was genius! These amateur physicists trusted ChatGPT to watch their (epistemic) back, but it didn’t work out for them.
That’s a pretty rude thing to say about my friend. I’d appreciate it if you could find a less condescending way to put this.
I would have just left this as a removable react, but I can’t delete replied-to comments, and a mod immediately downvoted my “too combative” react, so I’m replying as a comment.
edit: I agree with all your other points near-unequivocally. I just feel like that particular phrasing implies, without writing down the justification for why, that JC needs love and support for the fact that they got manipulated. I agree they got manipulated by o3; I don’t think being roundabout about it is a good way to communicate it. (JC just went to sleep, but will probably roll eyes at my enthusiastic defense of their character, heh.)
I’m not sure if you’re reading more rudeness into that phrase than I intended. I’ll try to clarify and then maybe you can tell me.
By “I feel for this person,” I mean “I think it’s understandable, even sympathetic, to have the mental model of LLMs that they do.” Is that how you interpreted it, and you’re saying it’s condescending for me to say that while also saying this person made a bunch of mistakes and is wrong?
One thing I do not mean, but which I now worry someone could take from it, is “I feel sorry (or mockingly pretend to feel sorry) for this person because they’re such a pitiable wretch.”
Well, thanks for the link.
edit: convinced by reply that I was misreading