Some thoughts I had while reading (part of) the FAQ:
...our moral intuition...that we should care about other people.
Is it an established fact that this is a natural human intuition, or is it a culturally induced disposition?
If it is a natural human tendency, can we draw the line at caring about other people or do we also care about cute kittens?
Other moral systems are more concerned with looking good than being good...
Signaling is a natural human tendency. Just like caring about other people, humans care how they appear to other people.
Why should a moral theory satisfy one intuition but not the other?
You’ll have to differentiate what it means to be good from what it means to look good.
...but in fact people are generally very moral: they feel intense moral outrage at the suffering in the world...
People also feel intense moral outrage about others burning paper books or eating meat.
You can’t establish a fact about morality by cherry-picking certain behaviors.
So no particular intuition can be called definitely correct until a person has achieved a reflective equilibrium of their entire morality, which can only be done through careful philosophical consideration.
I am very skeptical of trying to apply game-theoretic concepts to human values.
Human values are partly inconsistent. Can you show that forcing them to be consistent won’t destroy what it means to be human?
People enjoy watching a large moon, even though it is an optical illusion. If you were to argue that we generally assign more importance to not falling prey to optical illusions, and therefore shouldn’t conclude that it is desirable to see a large moon, you would be messing with human nature: you would be creating the sort of rational agent assumed by economic and game-theoretic theories rather than protecting human nature.
It’s my moral intuition that if I failed to reflect on my disgust over homosexuality, and ended out denying homosexuals the right to marry based on that disgust, then later when I thought about it more I would wish I had reflected earlier.
You expect hindsight bias and therefore conclude that you should discard all your complex values in favor of...what?
I am not saying it is wrong, but we’ll have to decide whether we want to replace human nature with alien academic models, or swallow the bitter pill and accept that we are inconsistent beings without stable utility functions, constantly being reborn.
...that morality must live in the world, and that morality must weight people equally.
Define “people”? What is it that assigns less weight to a chimpanzee than to me?
Why should we assign a nonzero value to other people?
If I only assign weight to certain people, then other people with more power might do the same and I will lose. So everyone except those who could overpower all others combined would be better off weighing each other equally.
But just because there is an equilibrium doesn’t mean that it is desirable. Humans don’t work like that. Humans refuse blackmail and are willing to sacrifice their own lives rather than accept a compromise.
But guilt is a faulty signal; the course of action which minimizes our guilt is not always the course of action that is morally right. A desire to minimize guilt is no more noble than any other desire to make one’s self feel good at the expense of others, and so a morality that follows the principle of according value to other people must worry about more than just feeling guilty.
You are wrapping your argument in the very terms you are trying to explain. You are begging the question.
But just as guilt is not a perfect signal, neither are warm fuzzies. As Eliezer puts it, you might well get more warm fuzzy feelings from volunteering for an afternoon at the local Shelter For Cute Kittens With Rare Diseases than you would from developing a new anti-malarial drug, but that doesn’t mean that playing with kittens is more important than curing malaria.
You don’t explain why warm fuzzies aren’t more important than curing malaria.
If you go on to argue that what one really wants by helping cute kittens is to minimize suffering, you are introducing an unwarranted proposition. You need empirical evidence to show that what humans really want isn’t warm fuzzies.
Ironically, although these sorts of decisions are meant to prove the signaler is moral, they are not in themselves moral decisions: they demonstrate interest only in a good to the signaler...
So? What is a moral decision? You still haven’t shown why signaling is less important than saving your friend. All you are doing is telling people to feel guilty, by signaling that it is immoral to be more concerned with signaling...
...there’s a big difference between promoting your own happiness by promoting the happiness of others, and promoting your own happiness instead of promoting the happiness of others.
You are signaling again...