I would like to emphasize that there is a difference between intelligence and reliability.
In that light, I would suggest implementing some kind of system to account for A’s and B’s reliability, and for which subjects one is more reliable than the other. This can be intractable if you don’t have a lot of history to go off of, so maybe you can toss out questions to A and B on other issues to try to determine where their opinions cluster. As long as the Scientologist was talking about something outside the cluster of crazy (if that is possible), you can probably take them seriously.
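To make that a bit more concrete, here is a minimal sketch of what such bookkeeping might look like. Everything in it is hypothetical (the topic names, the ReliabilityTracker class, the smoothing constants); it is just one way of turning “toss out questions and score the answers” into per-subject numbers:

```python
from collections import defaultdict

class ReliabilityTracker:
    """Track how often a person's checkable claims hold up, broken down by topic."""

    def __init__(self):
        # topic -> [claims that checked out, total claims scored]
        self.scores = defaultdict(lambda: [0, 0])

    def record(self, topic, claim_was_correct):
        correct, total = self.scores[topic]
        self.scores[topic] = [correct + int(claim_was_correct), total + 1]

    def reliability(self, topic, prior=0.5, prior_weight=2):
        # Smooth toward the prior so a single data point doesn't swing the estimate.
        correct, total = self.scores[topic]
        return (correct + prior * prior_weight) / (total + prior_weight)

# Toss out questions on other issues and score whatever answers you can verify:
b = ReliabilityTracker()
b.record("programming", True)
b.record("programming", True)
b.record("rationality", False)
print(b.reliability("programming"))  # 0.75 with the default prior
print(b.reliability("rationality"))  # ~0.33 with the default prior
```

The only real point of the sketch is that the unit being scored is (person, topic) rather than the person as a whole.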
I was actually recently thinking of a series of conversations I had with Person A and Person B regarding the nature of Eliezer. I shared an article from one of the Sequences (not sure which one) with A and B, and Person B told me that Eliezer’s writing struck him as that of someone pretending to be smart. I asked him to defend the claim, and he was very vague. (This could be a good test, too: if B can defend their beliefs better than A, maybe you should treat B’s arguments with more respect.) I spoke with Person A about it, and he pointed out several reasons why Person B wasn’t terribly reliable in situations involving disagreement, rationality, intellectual honesty, curiosity, etc. In this case, I already had a long history of analyzing A, but not much to go off of with B. Combined with A having excellent examples and arguments, it was pretty obvious that B was just wrong. So instead of judging Person B as less intelligent, I just updated my assessment of his reliability on issues relating to skepticism and rationality.
Of course, in practice, this isn’t neat or easy. It was painful to entertain the seemingly zero-sum conflict between Eliezer’s and Person B’s reliability, especially since I was independently reading about cults and death spirals at the time.
That does it. Eliezer is now a Greater Deity, and his followers argue about the Nature of God and exegesis of the sacred texts—I foresee a schism!
:-)
But reliability in different areas is correlated. Someone who is wrong about one thing is more likely to be wrong about other things, relative to someone who is right, in my experience.
(However, I’d bet there are a few cases where the people who believe the truth about some particular thing are less correct than average about everything else, like conspiracy theorists, who are right occasionally.)
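One toy way to see why that correlation should exist (the numbers below are made up purely for illustration): suppose reasoners are either “careful” or “careless,” and careless reasoners have a higher error rate on every topic. Then observing one wrong claim raises the probability that someone is careless, which raises the expected error rate on unrelated topics too:

```python
# Made-up numbers, purely illustrative.
p_careless = 0.3          # prior probability that someone is a careless reasoner
p_wrong_careless = 0.5    # per-topic error rate if careless
p_wrong_careful = 0.1     # per-topic error rate if careful

def posterior_careless(wrong_on_a):
    # Bayes' rule on the single observation "they were wrong (or right) about topic A".
    like_careless = p_wrong_careless if wrong_on_a else 1 - p_wrong_careless
    like_careful = p_wrong_careful if wrong_on_a else 1 - p_wrong_careful
    num = like_careless * p_careless
    return num / (num + like_careful * (1 - p_careless))

def p_wrong_on_b(wrong_on_a):
    pc = posterior_careless(wrong_on_a)
    return pc * p_wrong_careless + (1 - pc) * p_wrong_careful

print(p_wrong_on_b(True))   # ~0.37: wrong about A -> expect more errors about B
print(p_wrong_on_b(False))  # ~0.18: right about A -> expect fewer errors about B
```

The conspiracy-theorist case is the exception the parenthetical above points at: the toy model treats being right as evidence of carefulness, which breaks down when someone arrives at a true belief for careless reasons.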
But yeah. That article is really good; it pretty much discusses just what I’m talking about.
I agree. I was just trying to help come up with a way to find where that correlation breaks down, or is countered by other external factors. A trivial example: a Mormon dentist who is extremely irrational about religion, and yet exceedingly rational about tooth decay.
I like to think of myself as a proto-Borg: I seek out the strengths of other people and add them to my own. Except that, since I’m much more bounded than they are, I often just make a note that if I’m interested in increasing my uniqueness in the field of X, I could start by talking to Person B.
“When assessing someone’s reliability, do you ignore the issue you seek knowledge about?”
So getting back to that, I would say that instead of ignoring the issue, I try to see how the person fits into a network of subjects, and mark their reliability up or down in the subjects that most closely touch the issue at hand. I still think of Person B as excellent with programming, history, current events, and general knowledge. I just don’t trust his self-awareness as much as I did before discussing the Greater Deity’s nature with the heathen.
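As a rough sketch of that last step (all names and numbers here are hypothetical), the adjustment could be a similarity-weighted update: the closer a subject sits to the issue that prompted the update, the more of the adjustment it absorbs, so programming barely moves while rationality takes most of the hit:

```python
# Hypothetical map of how closely each subject touches the issue at hand (self-awareness).
similarity_to_issue = {
    "rationality": 0.9,
    "skepticism": 0.8,
    "history": 0.2,
    "programming": 0.1,
}

reliability_b = {"rationality": 0.8, "skepticism": 0.8, "history": 0.9, "programming": 0.95}

def apply_update(reliability, similarity, delta):
    """Shift each subject's reliability by delta, scaled by its similarity to the issue."""
    return {
        subject: min(1.0, max(0.0, score + delta * similarity.get(subject, 0.0)))
        for subject, score in reliability.items()
    }

# A bad showing on the issue at hand: nearby subjects get marked down, distant ones barely move.
print(apply_update(reliability_b, similarity_to_issue, delta=-0.3))
# roughly {'rationality': 0.53, 'skepticism': 0.56, 'history': 0.84, 'programming': 0.92}
```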