Your counterargument to Pinker is pretty central to this piece, but as it stands it seems to boil down to a not-yet-very-convincing “I don’t care for vegetarianism, and violence is occasionally entertaining.” This should be the part that makes the reader go, hm, maybe there’s a point there, but it’s currently doing nothing to make me stop classifying the factory-farming food industry and a preoccupation with violence as problems rather than things to cherish.
Moving on to
This leads me to suspect Homer’s FAI is probably different from my own FAI, which is different from the FAI of 2400 AD. If FAI2400 gets to play with the universe forever instead of FAI2012, I’d be rather pissed.
this is also confusing. You’re basically restating the exact problem CEV is for, without mentioning that CEV is for this problem. It also really only makes sense if you anthropomorphize FAI into something equivalent to the cultural norms of its era. There are way too many unknown unknowns in how the basic cultural backdrop would come out when operated on by an AI, as compared to when operated on by collective human minds, for equating it with the outcomes of a culture run by humans to make much sense. I’m basically assuming that the hopefully better understanding of just how intelligence works in 2400 would dominate over whatever the human cultural norms are like in determining how FAI2400, as opposed to FAI2012, would come out.
If I wanted to attack the thesis that we’re experiencing moral progress due to cultural evolution, I’d start by looking at how we currently have unprecedented energy resources at our disposal, can afford a great deal more social signaling of every sort than at pretty much any other point in history, and have spent the past 300 years on a rising gradient toward the current level of resource use.
From a historical perspective, I’d be interested in whether we can quantify any differences in moral progress, separate from material progress, across the geographically and culturally separate large historical civilizations, and in what we can make of the collapse of the Roman Empire into the Early Middle Ages.
The article might also try to say something about what it could mean for a society to be moral, independent of how technologically advanced and resource-rich the society is.