Epistemic status: Practising thinking aloud. There might be an important question here, but I might be making a simple error.
There is a lot of variance in general competence between species. Here is the standard Bostrom/Yudkowsky graph illustrating this notion.
There’s a sense that while some mice are more genetically fit than others, they’re broadly all just mice, bound within a relatively narrow range of competence. Chimps should not be worried about most mice, in the short or long term, but neither should they worry especially about peak mice; there’s no incredibly strong or cunning mouse they ought to look out for.
However, my intuition is very different for humans. While I understand that humans are all broadly similar, and that a single human cannot have a complex adaptation that is not universal, I also believe that humans differ massively in cognitive capacities in ways that can lead to major disparities in general competence. The difference between someone who understands calculus and someone who does not is the difference between someone who can build a rocket and someone who cannot. I’ve tried to teach people that kind of math, and sometimes succeeded, and sometimes failed to even teach basic fractions.
I can try to operationalise my hypothesis: if average human intelligence were lowered to an IQ of 75, present-day society could not have built rockets or done much of its other engineering and science.
(Sidenote: I think the hope of iterated amplification is that this is false. That if I have enough humans with hard limits on how much thinking they can do, stacking lots of them can still produce all the intellectual progress we’re going to need. My initial thought is that this doesn’t make sense, because there are many intellectual feats, like writing a book or coming up with special relativity, that I generally expect individuals (situated within a conducive culture and institutions) to be much better at than groups of individuals (e.g. companies).
This is also my understanding of Eliezer’s critique: while it’s possible to get humans with hard limits on cognition to make mathematical progress, it’s by running an algorithm on them that they don’t understand, not an algorithm that they do understand; and only if they understand it do you get nice properties about them being aligned in the way you might feel many humans are today.
It’s likely I’m wrong about the motivation behind Iterated Amplification though.)
This hypothesis doesn’t imply that someone who can do successful abstract reasoning is strictly more competent than a whole society of people who cannot. The Secret of Our Success discusses how smart modern individuals stranded in forests fail to develop basic food-preparation techniques that other, “primitive” cultures were able to build.
I’m saying that a culture with no people who can do calculus will in the long run score basically zero against the accomplishments of a culture with people who can.
One question is why we’re in a culture so precariously balanced on this split between “can take off to the stars” and “mostly cannot”. An idea I’ve heard is that a culture easily able to reach technological maturity will arrive later than one barely able to, because evolution works over much longer timescales than culture and technological innovation. As such, if you observe yourself to be in a culture able to reach technological maturity, you’re probably “the stupidest such culture that could get there, because if it could be done at a stupider level then it would’ve happened there first.”
As such, we’re a species where, if we try as hard as we can, taking brains optimised for social coordination and making them do math, we can just about reach technological maturity (i.e. build nanotech, AI, etc.).
That may be true, but the question I want to ask is: what is it about humans, culture, and brains that allows for such high variance within the species, that isn’t true of mice and chimps? Something about this is still confusing to me. If some humans are able to do great feats of engineering, like building rockets that land, and some aren’t, what’s the difference between these humans that causes such massive changes in outcome? Because, as above, it’s not some big complex genetic adaptation that some have and some don’t. I think we’re all running pretty similar genetic code.
Is there some minimum amount of working memory that’s required to do complex recursion? Like, do 6 working-memory slots make things way harder than 7?
I can imagine that there are many hacks, and not a single thing. I’m reminded of the story of Richard Feynman learning to count time, where he’d practise counting out a whole minute in his head. He’d do it while doing the laundry, while cooking breakfast, and so on. He later met the mathematician John Tukey, who could do the same, but they had some fierce disagreements. Tukey said you couldn’t do it while reading the newspaper, and Feynman said he could. Feynman said you couldn’t do it while having a conversation, and Tukey said he could. They then both surprised each other by doing exactly what they said they could.
It turned out Feynman was hearing the numbers being spoken, whereas Tukey was visualising them ticking over. So Feynman could still read at the same time, and Tukey could still listen and talk.
The idea here is that if you’re unable to use one type of cognitive resource, you may be able to make up for it with another. This is analogous to trading off space against time in computational complexity.
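To make the space-time analogy concrete, here is a toy sketch (my own example, not from the post): two ways of counting the set bits in a byte, one that spends time recomputing on every query, and one that spends memory on a precomputed table so each query is a single lookup. Both produce the same answers; they just draw on different resources.

```python
# Toy illustration of a space-time trade-off: counting set bits in a byte.

# Time-heavy, space-light: loop over the bits on every call.
def popcount_loop(x: int) -> int:
    count = 0
    while x:
        count += x & 1
        x >>= 1
    return count

# Space-heavy, time-light: precompute a 256-entry table once,
# then answer each query with a single lookup.
POPCOUNT_TABLE = [popcount_loop(i) for i in range(256)]

def popcount_table(x: int) -> int:
    return POPCOUNT_TABLE[x]

# The two strategies agree everywhere on their shared domain.
assert all(popcount_loop(i) == popcount_table(i) for i in range(256))
```

The loose analogy to the Feynman/Tukey story: the same task can be accomplished by substituting one resource (a stored table, a visual image) for another (repeated computation, an inner voice).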
So I can imagine different humans finding different hacky ways to build up the skill to do very abstract truth-tracking thinking. Perhaps you have a little less working memory than average, but you have a great capacity for visualisation, and primarily work in areas that lend themselves to geometric / spatial thinking. Or perhaps your culture is very conducive to abstract thought in some way.
But even if this is right, I’m interested in the details of what the key variables actually are.
What are your thoughts?
Note: humans can lack important pieces of machinery.