In far mode most people think in terms of good and evil first, correct and incorrect second. They might think of their enemies as evil mutants, but most still sense that those enemies have their own unique truth (evil mutant truth). This leads to hatred and aggression, but it’s less bad than an impersonal, clinical, mechanistic approach.
I agree with the first sentence, but not with the second. Good and evil, for most people, implies correct and incorrect: ideological enemies are both wrong and evil, and they’re wrong because they’re evil. Also evil because they’re wrong, if you back them into a corner on that one. Christian conceptions of sin are tied pretty closely to correctness, for example; the Greek word for sin, hamartia, literally means “missing the mark”.
I’m honestly not sure unemotional, subjectively-objective hatred exists in neurotypical folks, human psychology being what it is. I’ve gotten pretty angry at software bugs before.
Might be mind projection on my part, true. But it genuinely seems to me that many people do feel this way, for example in the trolley problem: the math may say it’s more “correct” to end up with a net four lives saved, yet pulling the lever still feels like an “evil” act to them. They’d say that a solution can be the only technically correct one and still be less moral than the alternatives.