Origin of Morality

By origin, I’m referring to the source of the need for morality, and it’s clear that it’s mostly about suffering. We don’t like suffering and would rather not experience it, although we are prepared to put up with some (or even a lot) of it if that suffering leads to greater pleasure that outweighs it. We realised long ago that if we do a deal with the people around us to avoid causing each other suffering, we could all suffer less and have better lives. That’s far better than spending time hitting each other over the head with clubs and stealing the fruits of each other’s labour. By doing this deal, we ended up with greater fruits from our work and removed most of the brutality from our lives. Morality is clearly primarily about the management of suffering.

You can’t torture a rock, so there’s no need for rules to protect it against people who might seek to harm it. The same applies to a computer, even one running AGI: if it lacks sentience and cannot suffer, it doesn’t need rules to protect it from harm (other than rules to prevent the owner from suffering a loss if it were damaged, or to protect other people who might be harmed by the loss of the work the computer was carrying out). If we were able to make a sentient machine, though, and if that sentient machine could suffer, it would have to be brought into the range of things that morality protects. We could make an unintelligent sentient machine like a calculator and give it the ability to suffer, or we could make a machine with human-level intelligence and the same ability to suffer, suffering to the same degree as the less intelligent calculator. Torturing each of them to generate the same amount of suffering would be equally wrong in both cases. It is not intelligence that creates the need for morality, but sentience and the degree of suffering that may be generated in it.

With people, our suffering can perhaps be amplified beyond the suffering that occurs in other animals because there are many ways to suffer, and they can combine. When an animal is chased, brought down and killed by a predator, it most likely experiences fear, then pain. The pain may last a long time in some cases, such as when wolves eat a musk ox from the rear end while it’s still alive, but the victim lacks any real understanding of what’s happening to it. When people are attacked and killed, though, the suffering is amplified by the victim’s understanding of the situation and knowledge of just how much they are losing. The many people who care deeply about the victim will also suffer because of the loss, and many will suffer deeply for decades. This means that morality needs to give people greater protection, although when scores are put to the degree of suffering caused by pain and fear in an animal victim and a human victim, those should be measured on the same scale; in that regard, these sentiences are treated as equals.
