My Childhood Death Spiral

My parents always used to downplay the value of intelligence. And play up the value of—effort, as recommended by the latest research? No, not effort. Experience. A nicely unattainable hammer with which to smack down a bright young child, to be sure. That was what my parents told me when I questioned the Jewish religion, for example. I tried laying out an argument, and I was told something along the lines of: “Logic has limits, you’ll understand when you’re older that experience is the important thing, and then you’ll see the truth of Judaism.” I didn’t try again. I made one attempt to question Judaism in school, got slapped down, didn’t try again. I’ve never been a slow learner.

Whenever my parents were doing something ill-advised, it was always, “We know better because we have more experience. You’ll understand when you’re older: maturity and wisdom are more important than intelligence.”

If this was an attempt to focus the young Eliezer on intelligence uber alles, it was the most wildly successful example of reverse psychology I’ve ever heard of.

But my parents aren’t that cunning, and the results weren’t exactly positive.

For a long time, I thought that the moral of this story was that experience was no match for sheer raw native intelligence. It wasn’t until a lot later, in my twenties, that I looked back and realized that I couldn’t possibly have been more intelligent than my parents before puberty, with my brain not even fully developed. At age eleven, when I was already nearly a full-blown atheist, I could not have defeated my parents in any fair contest of mind. My SAT scores were high for an 11-year-old, but they wouldn’t have beaten my parents’ SAT scores in full adulthood. In a fair fight, my parents’ intelligence and experience could have stomped any prepubescent child flat. It was dysrationalia that did them in; they used their intelligence only to defeat itself.

But that understanding came much later, when my intelligence had processed and distilled many more years of experience.

The moral I derived when I was young was that anyone who downplayed the value of intelligence didn’t understand intelligence at all. My own intelligence had affected every aspect of my life and mind and personality; that was massively obvious, seen at a backward glance. “Intelligence has nothing to do with wisdom or being a good person”—oh, and does self-awareness have nothing to do with wisdom, or being a good person? Modeling yourself takes intelligence. For one thing, it takes enough intelligence to learn evolutionary psychology.

We are the cards we are dealt, and intelligence is the unfairest of all those cards. More unfair than wealth or health or home country, unfairer than your happiness set-point. People have difficulty accepting that life can be that unfair; it’s not a happy thought. “Intelligence isn’t as important as X” is one way of turning away from the unfairness, refusing to deal with it, thinking a happier thought instead. It’s a temptation, both to those dealt poor cards, and to those dealt good ones. Just as downplaying the importance of money is a temptation both to the poor and to the rich.

But the young Eliezer was a transhumanist. Giving away IQ points was going to take more work than if I’d just been born with extra money. But it was a fixable problem, to be faced up to squarely, and fixed. Even if it took my whole life. “The strong exist to serve the weak,” wrote the young Eliezer, “and can only discharge that duty by making others equally strong.” I was annoyed with the Randian and Nietzschean trends in SF, and as you may have grasped, the young Eliezer had a tendency to take things too far in the other direction. No one exists only to serve. But I tried, and I don’t regret that. If you call that teenage folly, it’s rare to see adult wisdom doing better.

Everyone needed more intelligence. Including me, I was careful to pronounce. Far be it from me to declare a new world order with myself on top—that was what a stereotyped science fiction villain would do, or worse, a typical teenager, and I would never have allowed myself to be so clichéd. No, everyone needed to be smarter. We were all in the same boat: A fine, uplifting thought.

Eliezer1995 had read his science fiction. He had morals, and ethics, and could see the more obvious traps. No screeds on Homo novis for him. No line drawn between himself and others. No elaborate philosophy to put himself at the top of the heap. It was too obvious a failure mode. Yes, he was very careful to call himself stupid too, and never claim moral superiority. Well, and I don’t see it so differently now, though I no longer make such a dramatic production out of my ethics. (Or maybe it would be more accurate to say that I’m tougher about when I allow myself a moment of self-congratulation.)

I say all this to emphasize that Eliezer1995 wasn’t so undignified as to fail in any obvious way.

And then Eliezer1996 encountered the concept of the Singularity. Was it a thunderbolt of revelation? Did I jump out of my chair and shout “Eurisko!”? Nah. I wasn’t that much of a drama queen. It was just massively obvious in retrospect that smarter-than-human intelligence was going to change the future more fundamentally than any mere material science. And I knew at once that this was what I would be doing with the rest of my life, creating the Singularity. Not nanotechnology like I’d thought when I was eleven years old; nanotech would only be a tool brought forth of intelligence. Why, intelligence was even more powerful, an even greater blessing, than I’d realized before.

Was this a happy death spiral? As it turned out later, yes: that is, it led to the adoption of even false happy beliefs about intelligence. Perhaps you could draw the line at the point where I started believing that surely the lightspeed limit would be no barrier to superintelligence. (It’s not unthinkable, but I wouldn’t bet on it.)

But the real wrong turn came later, at the point where someone said, “Hey, how do you know that superintelligence will be moral? Intelligence has nothing to do with being a good person, you know—that’s what we call wisdom, young prodigy.”

And lo, it seemed obvious to the young Eliezer that this was mere denial. Certainly, his own painstakingly constructed code of ethics had been put together using his intelligence and resting on his intelligence as a base. Any fool could see that intelligence had a great deal to do with ethics, morality, and wisdom; just try explaining the Prisoner’s Dilemma to a chimpanzee, right?

Surely, then, superintelligence would necessarily imply supermorality.

Thus is it said: “Parents do all the things they tell their children not to do, which is how they know not to do them.” To be continued, hopefully tomorrow.

Post Scriptum: How my views on intelligence have changed since then… let’s see: When I think of poor hands dealt to humans, these days, I think first of death and old age. Everyone’s got to have some intelligence level or other, and the important thing from a fun-theoretical perspective is that it ought to increase over time, not decrease like now. Isn’t that a clever way of feeling better? But I don’t work so hard now at downplaying my own intelligence, because that’s just another way of calling attention to it. I’m smart for a human, if the topic should arise, and how I feel about that is my own business. The part about intelligence being the lever that lifts worlds is the same. Except that intelligence has become less mysterious unto me, so that I now more clearly see intelligence as something embedded within physics. Superintelligences may go FTL if it happens to be permitted by the true physical laws, and if not, then not.