nitpick: it’s still a probabilistic argument because, as you point out, there could be two totally unrelated mechanisms that produce speech talking about qualia and the actual experience of qualia. Obviously that’s super unlikely, but it’s still probabilistic.
Exactly my point. Because there should be no probabilistic element concerning qualia if there is no informational uncertainty about the physics of a given brain. Extremely (extremely extremely) likely to have qualia doesn’t cut it.
So I corrected Eliezer’s argument to state that it is us who are merely extremely confident that other human minds have qualia, while our hypothetical omniscient intelligence is certain about whatever qualia exist in the same way that it is certain about physics. My best understanding of Eliezer’s argument is that the omniscient intelligence is merely extremely confident about qualia existing in given minds, which is not the same thing. But it’s easy to mix the two up by arguing imprecisely.
Comments have focused on my speculation about the irreducibility of qualia, but that was intended as more or less an aside to the main focus of the article.
To be fair, I came into this thread expecting you to argue my nitpick, and then I read some of your post and got confused about what you were arguing. I’m also a bit confused about your comment. Where are you drawing the boundary around “a given brain”? Does it include any inaccessible qualia physics?
Where are you drawing the boundary around “a given brain”?
I’m not as such. But the existence of clusters in thingspace that correspond to the referent of our word “brain” is one of many (naturally) unspoken premises in the Yudkowsky-Chalmers debate. As such, I don’t believe that it is my duty to define “brain” precisely nor do I think that it is particularly relevant to the debate to do so.