This is the ultimate example of… there should be a name for this.
You figure out that something is true, like utilitarianism. Then you find a result that seems counterintuitive. Rather than going “huh, I guess my intuition was wrong, interesting” you go “LET ME FIX THAT” and change the system so that it does what you want...
man, if you trust your intuition more than the system, then there is no reason to have a system in the first place. Just do what is intuitive.
The whole point of having a system like utilitarianism is that we can figure out the correct answers in an abstract, general way, but not necessarily for each particular situation. Having a system tells us what is correct in each situation, not vice versa.
The utility monster is nothing to be fixed. It’s a natural consequence of doing the right thing that just happens to make some people uncomfortable. It’s hardly the only uncomfortable consequence of utilitarianism, either.
That looks like a category error. What does it mean for utilitarianism to be “true”? It’s not a feature of the territory.
Trust is not all-or-nothing. Putting ALL your trust into the system—no sanity checks, no nothing—seems likely to lead to regular epic fails.
The term you’re looking for is “apologist”.
I think the name you are looking for is ad hoc hypothesis.
Sometimes when explicit reasoning and intuition conflict, intuition turns out to be right, and there is a flaw in the reasoning. There’s nothing wrong with using intuition to guide yourself in questioning a conclusion you reached through explicit reasoning. That said, DragonGod did an exceptionally terrible job of this.
Yeah, you’re of course right. In the back of my mind I realized that the point I was making was flawed even as I was writing it. A much weaker version of the same would have been correct: “you should at least question whether your intuition is wrong.” In this case it’s just very obvious to me that there is nothing to be fixed about utilitarianism.
Anyway, yeah, it wasn’t a good reply.
No. I am improving the existing system. All individuals have the same capacity for desire.
This seems very unlikely, if “capacity for desire” corresponds to anything in the real universe.