I will start with: +1 for caring about community etiquette.
Less Wrong is a diverse community, but I was by and large under the impression that it leans towards a growth mindset. Indeed, in many ways the raison d'être of LW seems to rely on the assumption that it is possible to improve your intelligence.
Intelligence (IQ) is more or less static. If you have a scientifically proven method of increasing IQ, please post it here; I am sure many people will try it. But at this moment, LW is not about increasing human intelligence. It is about increasing human rationality (learning to make better use of the intelligence, i.e. the brain, we already have) and about machine intelligence. A hypothetical intelligent machine could increase its intelligence by changing its code or adding new hardware. For humans, a similar change would require surgery or implants beyond our current knowledge.
If it’s ridiculous to believe in “mutants born with unnaturally high anger levels,” then why the rush to believe in mutants with unnaturally high levels of intelligence?
How high is unnaturally high? Intelligence follows a bell curve. One in two people has an IQ above 100. Roughly one in six has an IQ above 115. One in fifty has an IQ above 130; one in a hundred above 135; one in a thousand above 146; one in ten thousand above 156… this is all within the bell curve. It is possible to search for people with these levels of intelligence. (Someone with an IQ of 300, now that would be unnatural.)
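The rarities above can be checked directly from the normal distribution. A minimal sketch, assuming the standard convention of IQ with mean 100 and standard deviation 15 (the specific parameters are my assumption, not stated in the comment):

```python
# Rarity of IQ thresholds, assuming IQ ~ Normal(mean=100, sd=15).
# The mean/sd values are the common Wechsler convention, assumed here.
from math import erfc, sqrt

def p_above(iq, mean=100.0, sd=15.0):
    """P(IQ > iq) under a normal distribution: upper tail via erfc."""
    z = (iq - mean) / sd
    return 0.5 * erfc(z / sqrt(2))

for threshold in (100, 115, 130, 135, 146, 156):
    p = p_above(threshold)
    print(f"IQ > {threshold}: about 1 in {round(1 / p):,}")
```

Running this reproduces the figures in the comment to within rounding: roughly 1 in 2 above 100, 1 in 6 above 115, 1 in 44 above 130, and on out into the tail.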
The question is how much real-world effect these levels of intelligence have. Clearly, intelligence alone is not enough to make people smart: a person with a high IQ can still believe and do stupid things. (This is why we usually don’t obsess over IQ, and discuss rationality instead.) On the other hand, some level of IQ may be necessary for some outcomes, or at least could let the same person reach the same outcome significantly faster. (This is easier to see by imagining people with very low IQs: even the best rationality training is not going to make them new Einsteins.) Being merely faster does not seem like a critical difference, but for sufficiently complex tasks, the difference between years and decades, or between decades and centuries, can determine whether a human is ever able to complete the task at all.
Is it possible that what we perceive as greater intelligence, as “the level above mine,” is just someone who has spent more time working on something, or working on something similar to it?
In the article, Eliezer considers alternative explanations. (Maybe Conway had more opportunities to demonstrate his mastery. Maybe he specializes in something different. Maybe Conway used the years of his youth better.) But maybe… it is a difference in general intelligence. All these explanations deserve to be considered.
What is the prior probability that someone picks up a new idea quickly because they’ve been exposed to a similar idea before, versus the prior probability that they are of mutant intelligence?
It depends on the circumstances. Did it happen once, or does it happen all the time? Does it happen consistently in a field where both people have spent a lot of time learning? Does it happen across different fields? The prior probability of someone having higher intelligence is not so small that evidence like this couldn’t change the result.
2. Does our love of static hierarchies, especially ones that privilege intelligence, affect our answer to question 1? I’m not sure about question 1, but I’m pretty sure the answer to question 2 is yes.
Just because we have a bias for X, it does not automatically mean that non-X must be true. People do love hierarchies. People are bad at estimating their own skills, or the skills of others. That does not mean different people can’t really have different traits.
Intelligence (IQ) is more or less static. If you have a scientifically proven method of increasing IQ, please post it here, and I am sure many people will try it. But at this moment, LW is not about increasing human intelligence. It is about increasing human rationality—learning a better way to use the intelligence (brain) we already have—and about machine intelligence.
Is it well established that IQ tests can distinguish between the intelligence we already have and our ability to use that intelligence?