In many issues like this, my opinion is formed automatically, without my conscious will or any kind of deliberate reasoning.

I don’t have opinions on highly technical issues I don’t really care about, but on highly publicized issues like global warming it’s hard for me not to develop some kind of preference in one direction or the other. I had a certain feeling about global warming before I even knew enough about the relevant facts and the science behind them, and this feeling was probably formed by very straightforward subconscious heuristics. So the question is not “should I form an opinion on this”, but rather “should I trust my gut feeling enough to call it an opinion and make it a small part of my identity” or “should I investigate this further to get a better feel for the subject”.
But this is just me; I’m not sure how other people’s minds work. So do you mean you don’t have even a slight preference to either direction? Or do you mean that this preference is not on such firm ground that you should pay attention to it?
I don’t have even a slight preference to either direction, but this is the result of a deliberate attempt at deprogramming, noticing beliefs that I can’t back up and that appear to originate in downstream-from-politics status concerns and trying to remove them: I grew up with a vague sense that ‘climate deniers’ were stupid and inferior, but 1) vague senses are worse than zero information, 2) I grew up with other vague senses that turned out to be totally wrong, 3) the vague senses I grew up with are mostly historically recent enough [this one included, of course] that I can’t even use Burke to justify taking them as priors without being able to argue any side of them.
My original opinion did form automatically without my conscious will or any kind of deliberate reasoning; I had to notice that I didn’t know the first thing about the issue and stop having opinions about it. I decided a few years ago to root out and reject received status/authority-concerns masquerading as knowledge, to stop having opinions about factual matters that I don’t know enough about to form an opinion on.
I still have a vague emotional sense that AGW is more likely than its absence, but I try not to pay attention to that, and it certainly doesn’t qualify as a belief or a view: it’s a stupid thing that my brain does when I don’t tell it not to, since completely deprogramming is a lot harder than just removing beliefs.
Your beliefs about anthropogenic global warming affect your political position, and if you live in a democratic country, your political position affects government policy: you can vote, you can campaign with various levels of commitment, you can run for office. Even if you choose to do none of these things, you are still making a political decision.
And by the way, what do you mean by “I don’t have even a slight preference to either direction”? It’s not as if it were a binary question where you can assume a 50% prior probability. You must necessarily be using a non-trivial prior. Then why are you deliberately ignoring evidence that could be used to update it?
What’s your opinion on the situation in Mali?
What I mean to imply by “I don’t even have a slight preference to either direction” is not that I assume a 50% prior probability to each side; it’s that I don’t assume a prior probability. Assuming one would be uninformative noise, since I don’t know enough about it to have an opinion—and it would be both harmful and irresponsible. Harmful since it would commit me to a prior for something that I don’t want to have any priors about—once I write down a percentage, I’ve committed to that percentage, and I’ll be more likely to update based on it, even though it has next to nothing behind it and I don’t want to go anywhere near it—and irresponsible because, if I do it in public, it’ll give other people something to update on, even though it’s little better than getting a percentage out of a random number generator.
I’m deliberately ignoring evidence that could be used to update because I don’t think that I’m obligated to have an opinion on literally everything in the world. My time is better spent elsewhere. And I’m deliberately ignoring the non-trivial priors I would base a probability estimate on if I ended up in an absurd counterfactual world where I absolutely had to give one because they’re a product of the environment in which I was raised, they’re not old enough to be justified in the Burkean sense, and I don’t think that the processes by which those environmental opinions are formed have anything more than the most tenuous connection to the actual truth, whatever it is.
Background reading: http://www.gwern.net/Mistakes#mu
Democracy has the wonderful property that it obeys the Central Limit Theorem. Each person concentrating on the areas they know well and care about produces an overall bell curve that makes a fair bit of sense, even if no individual voter sees the big picture very well.
Whaaaaaat? 8-0
In a very handwavey sense. Voters are not, strictly speaking, IID. But it works out pretty similarly. I find that politically active folks (whether seriously or casually) mostly tend to care passionately about certain issues, but the actual election results are vastly smoother.
I don’t think this comparison works at all. Not only are the voters not IID, but the actual election results are discrete outcomes, and there is nothing “vastly smoother” about them. The political process is full of threshold functions.
Oh, certainly. But the overall motion is much less than one might think. A real blowout defeat of a party in the US is getting 45% of the vote to your opponent’s 54%. Most countries are similar, if with more parties. The government swings, but voters as a whole don’t do so too heavily.
So you are saying that the political views of populations are largely stable. Sure. But what does that have to do with either democracy or the Central Limit Theorem?
Not just stable. Moderate, rarely obsessed with single issues, and in retrospect they’re usually pretty good at making wise decisions on the broad strokes (even if they’re bad at micropolicy, which makes sense, because so few people care about it). It works for the same reasons as the CLT, which is why I named it: the distinguishing characteristics and individual madnesses of voters cancel each other out, and you’re left with a signal whose shape is determined by broad statistical properties, not by the quirks of a small group.
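The cancellation argument can be sketched with a toy simulation (all the numbers here are my own illustrative assumptions, not anything from this thread): give every voter a large idiosyncratic “quirk” on top of an even 50/50 baseline, and watch how the aggregate vote share still concentrates around the middle as the electorate grows.

```python
import random
import statistics

random.seed(0)

def election(n_voters, trials=200):
    """Simulate repeated elections where each voter's leaning is dominated
    by a big personal quirk. Returns party A's vote share per trial."""
    shares = []
    for _ in range(trials):
        # Each voter votes A with probability 0.5 plus a large independent
        # personal quirk (up to +/-0.4). Individually, voters are all over
        # the map; in aggregate, the quirks cancel out.
        votes_a = sum(
            1 for _ in range(n_voters)
            if random.random() < 0.5 + random.uniform(-0.4, 0.4)
        )
        shares.append(votes_a / n_voters)
    return shares

small = election(100)
large = election(10_000)

# The spread of the aggregate share shrinks as the electorate grows.
print(round(statistics.stdev(small), 3), round(statistics.stdev(large), 3))
```

Individual leanings here range from a 10% to a 90% chance of voting A, yet the election-level share clusters tightly around 50%, with a spread shrinking roughly as 1/sqrt(n). That concentration of independent individual noise is the CLT-flavored point; whether real, correlated voters behave this way is exactly what’s disputed above.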