To me this looks like a knockdown argument to any non-solipsistic morality. I really do just care about my qualia.
In some sense it’s the same mistake the deontologists make, on a deeper level. A lot of their proposed rules strike me as heavily correlated with happiness. How were these rules ever generated? Whatever process generated them must have been a consequentialist one.
If deontology is just applied consequentialism, then maybe “happiness” is just applied “0x7fff5694dc58”.
Your post still leaves the possibility that “quality of life”, “positive emotions” or “meaningfulness” are objectively existing variables, and people differ only in their weighting. But I think the problem might be worse than that.
I think this makes the problem less bad, because if you get people to go up their chain of justification, they will all end up at the same point. I think that point is just predictions of the valence of their qualia.
It’s not. You may only care about your qualia, but I care about more than just mine. Perhaps what exactly I care about is not well-defined, but sure as shit my behavior is best modelled and explained as trying to achieve something in the world outside of my mind. Nozick’s experience machine argument shows as much. There’s also a good post by Nate Soares on the subject, IIRC.