I don’t know if worrying about animal rights should count when we simultaneously practice factory farming...
And as for the trends in human rights, democratic inclusion, expansion of welfare states, decline in violence, and so on: they are real, but unfortunately they also coincide in time with rising demand for skilled human labor. In a hypothetical future world where human labor no longer mattered because of full automation by superhuman AGI, I fear these trends could easily reverse (though we may become extinct before that happens).
As a professor of economics and a self-proclaimed doomer myself, I greatly appreciate this post! It captures almost exactly my feelings when talking to fellow economists, who typically assume, without ever stating it, that AI will always be a normal technology, a tool in people’s hands.
I think your capital/labor point is particularly spot on. I’ve had a problem with that framing for several years now, which is why I proposed a “hardware-software” framework, elaborated in a few of my papers and in one book. The idea is simple: divide the production factors differently. The key distinction is not whether a factor is human or machine, but whether it performs physical work or information processing.
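To sketch it in the simplest possible terms (a deliberately stylized illustration; the additive aggregation and the symbols below are chosen here for exposition, not taken verbatim from the papers or the book):

\[
Y = F(H, S), \qquad H = L_{\text{phys}} + K, \qquad S = L_{\text{cog}} + A,
\]

where \(L_{\text{phys}}\) is human physical labor, \(K\) is machines and other physical capital, \(L_{\text{cog}}\) is human cognitive work, and \(A\) is digital software, including AI. Hardware \(H\) is everything that performs physical work; software \(S\) is everything that performs information processing. The point is that human physical labor sits in the same factor as machines, and human cognitive work sits in the same factor as AI, so “labor vs. capital” no longer lines up with “human vs. machine.”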
More in a LessWrong post, “The Hardware-Software Framework: A New Perspective on Economic Growth with AI,” and in my 2022 book, Accelerating Economic Growth: Lessons From 200,000 Years of Technological Progress and Human Development (Springer).