I understand all this logically, but my emotional brain asks, “Yeah, but why should I care about any of that? I want what I want. I don’t want to grow, or improve myself, or learn new perspectives, or bring others joy. I want to feel good all the time with minimal effort.”
When wireheading—real wireheading, not the creepy electrode in the brain sort that few people would actually accept—is presented to you, it is very hard to reject it, particularly if you have a background of trauma or neurodivergence that makes coping with “real life” difficult to begin with, which is why so many people with brains like mine end up as addicts. Actually, by some standards, I am an addict, just not of any physical substance.
And to be honest, as a risk-averse person, it’s hard for me to rationally argue for why I ought to interact with other people when AIs are better, except for the people I already know, trust, and care about. Like, where exactly is my duty to “grow” (from other people’s perspective, by other people’s definitions, because they tell me I ought to do it) supposed to come from? The only thing that motivates me, sometimes, to try to do growth-and-self-improvement things is guilt. And I’m actually a pretty hard person to guilt into doing things.