I don’t think people who feel comfortable posting average YouTube comments are going to be welcome or useful at LessWrong; there are a lot of people like that, and I don’t think this is a problem.
Raising the sanity waterline on a grand scale should affect the comments on YouTube, but we’re a long way from that.
This being said, I’d like to see more rationality materials for people of average intelligence, but that’s another long-term possibility. Not only does there not seem to be huge interest in the project, but figuring out simple explanations for new ideas is work, and it seems to be a relatively rare talent.
I only recently ran into a good simple explanation for Bayes—that the more detailed a prediction becomes, the less likely it is to be true. And I got it from a woman who doesn’t post on LW because she thinks the barriers to entry are too high. (It’s possible that this explanation was on LW, and I didn’t see it or it didn’t register—has anyone seen it here?)
There’s some degree of natural sorting on LW—I’m not the only person who doesn’t read the more mathematical or technical material here, and I’m not commenting on that material, either.
I don’t think having separate ranked areas is going to solve the problem of people living down to expectations.
that the more detailed a prediction becomes, the less likely it is to be true
That looks like part of the definition of probability.
Bayes would be more like ‘If you’ve got two ideas about what’s going on, and one of them says one thing’s going to happen, and the other says a different thing, but in the event it’s the first thing that happens, then you should believe the first idea more and the second idea less’.
Or to get a bit less abstract: say you’re playing Dungeons and Dragons, an orc hits you with a sword, and you’re pretty sure that orcs do either 1D12 or 2D6 of damage. If the orc does 2 damage, you should think ‘probably 1D12’, but if she does 7 damage, you should think ‘probably 2D6’.
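The orc example can be turned into a tiny Bayes calculation. This is just a sketch, assuming a 50/50 prior between the two damage dice and ordinary fair dice:

```python
from fractions import Fraction

def damage_distribution(dice, sides):
    """Probability of each total when rolling `dice` fair dice with `sides` sides."""
    totals = {0: Fraction(1)}
    for _ in range(dice):
        new = {}
        for total, p in totals.items():
            for face in range(1, sides + 1):
                new[total + face] = new.get(total + face, Fraction(0)) + p / sides
        totals = new
    return totals

def posterior(damage):
    """Posterior probabilities of 1D12 vs 2D6 given the damage, from a 50/50 prior."""
    p_1d12 = damage_distribution(1, 12).get(damage, Fraction(0))
    p_2d6 = damage_distribution(2, 6).get(damage, Fraction(0))
    total = p_1d12 + p_2d6
    return p_1d12 / total, p_2d6 / total

print(posterior(2))  # → (Fraction(3, 4), Fraction(1, 4)): probably 1D12
print(posterior(7))  # → (Fraction(1, 3), Fraction(2, 3)): probably 2D6
```

A damage of 2 is possible on both dice but three times likelier on 1D12 (1/12 vs 1/36), so the posterior shifts to 3/4 for 1D12; a 7 is the most common 2D6 total, so the posterior shifts the other way.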
Does anyone know of any games that normal human beings play that could be used in this sort of example? I mean, apart from, you know, life.
I’d like to see more rationality materials for people of average intelligence
Actually, search for “Center for Modern Rationality” and a post along the lines of “Name new Rationality Inst.”—the latter describes an organization Eliezer is making as a spinoff and explains that it’s going to have materials for high school students. They ARE trying to branch out to the rest of the population! This is exciting! (: I wonder how far they’ve gotten.
As for your other suggestions, I’ve begun talking about that again in my preventing endless September thread. You’re invited to check out the cliff notes version and request new pros and cons be added.
I only recently ran into a good simple explanation for Bayes—that the more detailed a prediction becomes, the less likely it is to be true.
That looks like a good way of explaining the conjunction and narrative fallacies, too. They could easily be looked at as adding details to a simpler argument. I wonder what other fallacies could be “generalized” similarly?
One thing I think we should be working on is a way of organizing the mass of fallacies and heuristics. There are too many to keep straight without some sort of organizing principles.
Can you give a link to that explanation?

It was an in-person conversation. Her phrasing may actually have been tighter. My extension to her explanation is that if you have probabilities for aspects of a prediction, then there’s math so that you can derive a probability for the whole prediction.
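That extension can be sketched in a few lines, under the simplifying assumption that the aspects are independent (in general you’d chain conditional probabilities instead); the aspects and numbers here are made up for illustration:

```python
# Hypothetical aspects of one compound prediction, with made-up probabilities.
aspects = {
    "the market crashes this year": 0.3,
    "the crash starts in tech stocks": 0.5,
    "it happens in October": 0.1,
}

# Assuming independence, the whole prediction's probability is the product
# of its parts -- so each added detail can only drive it down.
p_whole = 1.0
for claim, p in aspects.items():
    p_whole *= p

print(round(p_whole, 3))  # → 0.015, far less likely than any single aspect
```

This is the math behind “the more detailed a prediction becomes, the less likely it is to be true”: multiplying in another probability ≤ 1 can never raise the total.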
Her specific problem is that she’d like to post articles, but she’s put off by having to make sufficiently upvoted comments to do so. I’ve told her that if she writes an article I’ll post it with attribution for her.
I don’t have a general problem with the 20 karma requirement for posting articles.