AI-Foom Debate: conclusion?

I’ve been going through the AI-Foom debate, and both sides make sense to me. I intend to continue, but I’m wondering whether there are already insights in LW culture I can get just by asking for them.

My understanding is as follows:

The difference between a chimp and a human is only 5 million years of evolution. That’s not enough time for many changes.

Eliezer takes this as proof that the difference between the two in brain architecture can’t be much. Thus, you can have a chimp-intelligent AI that doesn’t do much, and then, with some very small changes, suddenly get a human-intelligent AI and FOOM!

Robin takes the 5-million-year gap as proof that the significant difference between chimps and humans is only partly in the brain architecture. Evolution simply can’t be responsible for most of the relevant difference; the difference must be elsewhere.

So he concludes that when our ancestors got smart enough for language, culture became a thing. Our species stumbled across various little insights into life, and these got passed on. An increasingly massive base of cultural content, made up of very many small improvements, is largely responsible for the difference between chimps and humans.

Culture assimilated new information into humans much faster than evolution could.

So he concludes that you can get a chimp-level AI, but getting up to human level will take not a very few insights but a very great many, each one slowly improving the computer’s intelligence. So no Foom; it’ll be a gradual thing.

So I think I’ve figured out the question. Is there a commonly known answer, or are there insights pointing toward one?