But, but .. 13!
RobinHanson
The crazier a thing you believe as a result of trusting your community, the stronger a tie to your community that shows. So when we signal loyalty via beliefs, those beliefs can get pretty crazy.
“That’s what happens to a field when it unbinds itself from the experimental evidence”—so the million dollar question for Less Wrong is: what experimental evidence can this community bind itself to, to avoid the same outcome?
I was once that young and naive. But I’d never heard of this book Moral Mazes. Seems great, and I intend to read it. https://twitter.com/robinhanson/status/1136260917644185606
There need not be just one “true objection”; there can be many factors that together lead to an estimate. Whether you have a Ph.D., and whether folks with Ph.D.s have reviewed your claims, and what they say, can certainly be relevant. Also remember that you should care far more about the opinions of experts who could build on and endorse your work than about average Joe opinions. Very few things ever convince average folks of anything unusual; target a narrower audience.
You say don’t try to use game theory to figure out how to best “make a difference” but admit you will have virtually no influence in this election and instead just vote for the person you like best, among the candidates listed on the ballot. But why not continue with this logic, and “write-in” the person in the world you like best? Why not write them in even if write-ins aren’t officially allowed in this election? Why not skip the official elections and make up your own polling place to vote at? Why not just declare your vote for them in a blog post?
The problem here of course is how selective to be about rules to let into this protected level of “rules almost no one should think themselves clever enough to know when to violate.” After all, your social training may well want you to include “Never question our noble leader” in that set. Many a Christian has been told the mysteries of God are so subtle that they shouldn’t think themselves clever enough to know when they’ve found evidence that God isn’t following a grand plan to make this the best of all possible worlds.
Your faith in math is misplaced. The sort of math smarts you are obsessed with just isn’t that correlated with intellectual accomplishment. For accomplishment outside of math, you must sacrifice time that could be spent honing your math skills to actually think about other things. You could be nearly the smartest math-type guy anyone you meet knows, and still not accomplish much if math is not the key to your chosen subject.
You are a bit too quick to allow the reader the presumption that they have more algorithmic faith than the other folks they talk to. Yes if you are super rational and they are not, you can ignore them. But how did you come to be confident in that description of the situation?
To be clear, Foresight asked each speaker to offer a topic for participants to forecast on, related to our talks. This was the topic I offered. That is NOT the same as my making a prediction on that topic. Instead, it is to say that the chance on this question seemed an unusual combination of verifiable within a year and relevant to the chances on other topics I talked about.
I’m not sure of the point of outlining a research program in an area where you are not an expert and there are now many experts. First you just want to find out what the current experts think they know. Then if you want to know more, I’d think you’d either want to ask those experts to outline a research program, or you’d want to become an expert yourself and then outline a program.
The entries in a payoff matrix are supposed to sum up everything you care about, including whatever you care about the outcomes for the other player. Most every game theory text and lecture I know gets this right, but even when we say the right thing to students over and over, they mostly still hear it the wrong way you initially heard it. This is just part of the facts of life of teaching game theory.
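The point above can be made concrete with a small sketch (my own illustration, not from the comment; the payoff numbers and the altruism weight `alpha` are made up). Students often read a Prisoner’s Dilemma matrix as listing only material or selfish outcomes; but the matrix entries are supposed to already include everything each player cares about, such as concern for the other player’s outcome. Folding that concern in can change what the game even is:

```python
# Material payoffs: m[(row_action, col_action)] = (row_payoff, col_payoff).
# These are a standard Prisoner's Dilemma in material terms only.
material = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def true_payoffs(material, alpha):
    """Build the actual game-theoretic matrix for players who also
    weight the other player's material outcome by `alpha`:
    u_i = m_i + alpha * m_j."""
    return {
        acts: (mi + alpha * mj, mj + alpha * mi)
        for acts, (mi, mj) in material.items()
    }

def row_best_reply(payoffs, col_action):
    """Row player's best action given the column player's action."""
    return max(("C", "D"), key=lambda a: payoffs[(a, col_action)][0])

selfish = true_payoffs(material, alpha=0.0)
caring = true_payoffs(material, alpha=0.8)

# Purely selfish players defect no matter what the other does.
print(row_best_reply(selfish, "C"), row_best_reply(selfish, "D"))  # D D

# Once altruism is folded into the matrix entries, cooperation becomes
# the best reply: the "dilemma" lived in the material numbers, not in
# the true payoff matrix.
print(row_best_reply(caring, "C"), row_best_reply(caring, "D"))  # C C
```

The transformation `u_i = m_i + alpha * m_j` is just one way to model caring about the other player; the general lesson is that whatever such concern exists must already be baked into the entries before any game-theoretic analysis begins.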
This post has not at all misunderstood my suggestion from long ago, though I don’t think I thought about it very much at the time. I agree with the thrust of the post that a leverage factor seems to deal with the basic problem, though of course I’m also somewhat expecting more scenarios to be proposed to upset the apparent resolution soon.
To prohibit generalizations about gender without overwhelming hard data is usually, in effect, to silence the topic. We are all very interested in gender, and many of us have made interesting and relevant observations about the gender we see around us, but few of us have much in the way of overwhelming hard data. This post seems to be making generalizations about gender aspects of LW posts and comments without itself offering overwhelming hard data—why hold this meta gender discussion to a lower standard?
Wow Hal. You don’t post often, but when you do it’s a doozy! And that is amazing, that 90% choose to die. I’m glad you won’t, and now I’m more impressed that Hawking didn’t.
OK, well given this clarification, it seems to me just fine to objectify people, and in fact I recommend doing so when what one is trying to do is neutral analysis about the facts of some matter. Objectify your teacher when deciding if school is worth the effort, and objectify your doctor when deciding if medicine is worth the cost.
OK, now take the next step. Since most people who are choosing love, belonging, and esteem over accuracy are not aware they are giving up accuracy, then you have to wonder how you can tell when you are doing so. If you are tempted to think that you are an exception who is willing to choose accuracy instead, ask if this is just another kind of group you want to join, or another kind of esteem you hope to acquire. If so, when would this lead you to actually choose more accuracy, vs. just to tell yourself you so choose?
It may be enough if we find common cause in wanting to be rational in some shared topic areas. As long as we can clearly demarcate off-limits topics, we might productively work on our rationality on other topics. We’ve heard that politics is the mind killer, and that we will do better working on rationality if we stay away from politics. You might argue similarly about religion. That all said, I can also see a need for a place for people to gather who want to be rational about all topics. So, the question for this community to decide is: what topics, if any, should be off-limits here?
I’m an economics professor in one of the few departments where people specialize in Austrian economics, and after years of exposure to them, I still don’t understand what they are claiming. Of course each person ends up with many specific beliefs, and many of those beliefs are correlated with being in that group. But that is not the same as core claims that they share because they are in that group.
If I can’t understand someone’s claims, and I’m not sure they even have clear claims, then I can’t exactly say they are wrong. In contrast, theism does make relatively understandable claims.
You seem to be relying almost entirely on your intuitive sense of people being smart, fast, “sparkly,” etc. Yes, people at the top are good at giving other top people the impression they are smart. The question though is whether they are actually more productive in other ways. To evaluate that you need to look at metrics other than how sparkly they seem to you.