If you make a claim about the character of another person or the state of reality, do you or do you not need some evidence to support it?
I can make claims about anything without supporting it, whether or not it’s about someone’s character. The moon is made of green cheese. George Washington was more akratic than my mother. See, there, I did it twice.
It can often be rational to do so. For example, if someone trustworthy offers me a million dollars for making the claim “two plus two equals five”, I will assert “two plus two equals five” and accept my million dollars.
I’m confused that you do not understand this.
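The million-dollar example is a straightforward expected-utility calculation. Here is a toy sketch in Python; the probabilities and utility numbers are invented for illustration, since nothing in the thread specifies them:

```python
# Toy expected-utility comparison for the million-dollar offer above.
# All probabilities and utilities are illustrative assumptions.

def expected_utility(components):
    """Sum of probability-weighted utility components for an action."""
    return sum(p * u for p, u in components)

actions = {
    # (probability, utility) components: the offerer is trustworthy, so
    # the payout is near-certain; asserting a known falsehood carries a
    # small certain cost in discomfort.
    "assert 'two plus two equals five'": [(0.95, 100), (1.0, -1)],
    "refuse": [(1.0, 0)],
}

best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # asserting the falsehood maximizes expected utility here
```

Under these (assumed) numbers the assertion has expected utility 94 versus 0 for refusing, so making the false claim is the instrumentally rational act.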
If it helps resolve the confusion at all, my working theory is that HT believes unjustified and negative claims have been made about his/her character, and is trying to construct a formal structure that allows such claims to be rejected on formal grounds, rather than by evaluation of available evidence.
Thanks. That helps if true.
FWIW, I tend to respond to comments ignoring the context, as my main goal here is to improve the quality of the site by correcting minor mistakes (aside from cracking jokes and discussing Harry Potter).
Pretty sure he means epistemically irrational, not instrumentally. That he’s wrong about it, for the reasons you’ve listed, is another matter.
Probably. But I’m finding myself more and more in the “epistemic rationality is a case of instrumental rationality” camp, though not to any particular effect personally since I rate epistemic rationality very highly for its own sake.
I understand what you are saying; you are saying that for the speaker of the statement it is not irrational, because the false statement might meet their motives. Or in other words, that rationality is completely dependent on the motives of the actor. Is this the rationality that your group idealizes? That as long as what I say or do works towards my personal motives it is rational? So if I want to convince the world that God is real, it is rational to make up whatever lies I see fit to delegitimize other belief systems?
So religious zealots are rational because they have a goal that their lies and craziness is helping them achieve? That is what you are arguing.
If someone told you that the moon was made of cheese, being a rational person, without providing any evidence of the fact, if they had no reason to believe that, they just believed it, you would think they were being irrational. And you know it. You just want to pick a fight.
> Or in other words, that rationality is completely dependent on the motives of the actor.
In the sense I think you mean it, yes. Two equally rational actors with different motives will perform different acts.
> That as long as what I say or do works towards my personal motives it is rational?
Yes.
> So if I want to convince the world that God is real, it is rational to make up whatever lies I see fit to delegitimize other belief systems?
If that’s the most effective way to convince the world that God is real, and you value the world being convinced that God is real, yes.
> So religious zealots are rational because they have a goal that their lies and craziness is helping them achieve?
Not necessarily, in that religious zealots don’t necessarily have such goals. But yes, if a religious zealot who in fact values things that are in fact best achieved through lies and craziness chooses to engage in those lies and craziness, that’s a rational act in the sense we mean it here.
> If someone told you that the moon was made of cheese, being a rational person, without providing any evidence of the fact, if they had no reason to believe that, they just believed it, you would think they were being irrational.
Sure, that’s most likely true.
> You just want to pick a fight.
You may be right about thomblake’s motives, though I find it unlikely. That said, deciding how likely I consider it is my responsibility. You are not obligated to provide evidence for it.
Thanks—much more concise than my reply. Though I disagree about this bit, for reasons I’ve stated in a sibling.
(nods) I was taking the “if they had no reason to believe that, they just believed it” part of the problem specification literally. (e.g., it’s not a joke, etc.)
Aha—I glossed over that bit as irrelevant since the scenario is someone saying some words, which is clearly a case for instrumental rather than epistemic rationality. I should probably have read the “someone told you” as the irrelevant bit and answered as though we were talking about epistemic rationality.
(nods) Of course in the real world you’re entirely correct. That said, I find a lot of thought experiments depend on positing a situation I can’t imagine any way of getting into and asking what follows from there.
> I understand what you are saying; you are saying that for the speaker of the statement it is not irrational, because the false statement might meet their motives. Or in other words, that rationality is completely dependent on the motives of the actor.
Yes.
> Is this the rationality that your group idealizes?
See the twelfth virtue:
> Do not ask whether it is “the Way” to do this or that. Ask whether the sky is blue or green. If you speak overmuch of the Way you will not attain it.
> You may try to name the highest principle with names such as “the map that reflects the territory” or “experience of success and failure” or “Bayesian decision theory”. But perhaps you describe incorrectly the nameless virtue. How will you discover your mistake? Not by comparing your description to itself, but by comparing it to that which you did not name.
> If someone told you that the moon was made of cheese, being a rational person, without providing any evidence of the fact, if they had no reason to believe that, they just believed it, you would think they were being irrational. And you know it.
No, I would generally not think someone was “being irrational” without specific reference to their motivations. If I must concern myself with the fulfillment of someone else’s utility function, it would usually take the form of “You should not X in order to Z because Y will more efficiently Z.” ETA: I would more likely think that their statement was a joke, and failing that think that it’s false and try to correct it. In case anyone’s curious, “the moon is made of green cheese” was a paradigm of a ridiculous, unprovable statement before humans went to the moon; and “green cheese” in this context means “new cheese”, not the color green.
> You just want to pick a fight.
No, I’d rather be working on my dissertation, but I have a moral obligation to correct mistakes and falsehoods posted on this site.
> I understand what you are saying; you are saying that for the speaker of the statement it is not irrational, because the false statement might meet their motives. Or in other words, that rationality is completely dependent on the motives of the actor.
Correct. As noted on another branch of this comment tree, this interpretation characterizes “instrumental rationality”, though a similar case could be made for “epistemic rationality”.
> So religious zealots are rational because they have a goal that their lies and craziness is helping them achieve? That is what you are arguing.
That is not what I was arguing. If I understand you correctly however, you mean to say that what I’m arguing applies equally well to that case.
The important part of that statement is “X is rational”, where X is a human. Inasmuch as that predicate indicates that the subject behaves rationally most of the time, I would deny that it should be applied to any human. Humans are exceptionally bad at rationality.
That said, if a person X decided that course of action Y was the most efficient way to fulfill their utility function, then Y is rational by definition. (Of course, this applies equally well to non-persons with utility functions.) Even if Y = “lies and craziness” or “religious belief” or “pin an aubergine to your lapel”.
> So if I want to convince the world that God is real, it is rational to make up whatever lies I see fit to delegitimize other belief systems?
That’s a difficult empirical question, and outside my domain of expertise. You might want to consult an expert on lying, though I’d first question whether the subgoal of convincing the world that God is real really advances your overall goals.
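The definition given a few replies above, that an action Y is rational for an agent iff it most efficiently fulfills that agent's utility function, can be sketched as a simple argmax. The agents, actions, and utility numbers below are invented for illustration; the point is only that two agents with different utility functions pick different "rational" actions:

```python
# Sketch of the definition above: Y is (instrumentally) rational for an
# agent iff Y maximizes that agent's utility function. All names and
# numbers here are illustrative assumptions, not anything from the thread.

actions = ["tell the truth", "lies and craziness",
           "pin an aubergine to your lapel"]

# Two agents with different motives, expressed as utility functions.
utilities = {
    "truth-seeker": {"tell the truth": 10, "lies and craziness": -50,
                     "pin an aubergine to your lapel": 0},
    "zealot":       {"tell the truth": -5, "lies and craziness": 20,
                     "pin an aubergine to your lapel": 0},
}

def rational_action(agent):
    """The action that most efficiently fulfills the agent's utility function."""
    return max(actions, key=lambda y: utilities[agent][y])

print(rational_action("truth-seeker"))  # tell the truth
print(rational_action("zealot"))        # lies and craziness
```

With identical beliefs about the world, the two agents diverge purely because of their utility functions, which is the sense in which "rationality is completely dependent on the motives of the actor" as discussed above.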
I think the idea that you are grasping for (and which I don’t necessarily agree with) is that calling someone disingenuous is a dark side tool.