An illuminating (no pun intended) example of how adjustment to the ambient level of sense-data affects what people think they want would be nice. Without one, the whole section seems to detract from your point.
But I’m not raising a puzzle about how people can think they want things even when they are behavioristic machines. I’m raising a puzzle about how we can be said to actually want things even when we are behavioristic machines that, for example, exhibit framing effects and whose neurons cannot encode values for the objective intensities of stimuli.
I wrote a response here.