That Magical Click

Followup to: Normal Cryonics

Yesterday I spoke of that cryonics gathering I recently attended, where travel by young cryonicists was fully subsidized, leading to extremely different demographics from conventions of self-funded activists. 34% female, half of those in couples, many couples with kids—THAT HAD BEEN SIGNED UP FOR CRYONICS FROM BIRTH LIKE A GODDAMNED SANE CIVILIZATION WOULD REQUIRE—25% computer industry, 25% scientists, 15% entertainment industry at a rough estimate, and in most ways seeming (for smart people) pretty damned normal.

Except for one thing.

During one conversation, I said something about there being no magic in our universe.

And an ordinary-seeming woman responded, “But there are still lots of things science doesn’t understand, right?”

Sigh. We all know how this conversation is going to go, right?

So I wearily replied with my usual, “If I’m ignorant about a phenomenon, that is a fact about my state of mind, not a fact about the phenomenon itself; a blank map does not correspond to a blank territory—”

“Oh,” she interrupted excitedly, “so the concept of ‘magic’ isn’t even consistent, then!”

Click.

She got it, just like that.

This was someone else’s description of how she got involved in cryonics, as best I can remember it, and it was pretty much typical for the younger generation:

“When I was a very young girl, I was watching TV, and I saw something about cryonics, and it made sense to me—I didn’t want to die—so I asked my mother about it. She was very dismissive, but tried to explain what I’d seen; and we talked about some of the other things that can happen to you after you die, like burial or cremation, and it seemed to me like cryonics was better than that. So my mother laughed and said that if I still felt that way when I was older, she wouldn’t object. Later, when I was older and signing up for cryonics, she objected.”

Click.

It’s… kinda frustrating, actually.

There are manifold bad objections to cryonics that can be raised and countered, but the core logic really is simple enough that there’s nothing implausible about getting it when you’re eight years old (eleven years old, in my case).

Freezing damage? I could go on about modern cryoprotectants and how you can see under a microscope that the tissue is in great shape, and there are experiments underway to see if they can get spontaneous brain activity after vitrifying and devitrifying, and with molecular nanotechnology you could go through the whole vitrified brain atom by atom and do the same sort of information-theoretical tricks that people do to recover hard drive information after “erasure” by any means less extreme than a blowtorch...

But even an eight-year-old can visualize that freezing a sandwich doesn’t destroy the sandwich, while cremation does. It so happens that this naive answer remains true after learning the exact details and defeating objections (a few of which are even worth considering), but that doesn’t make it any less obvious to an eight-year-old. (I actually did understand the concept of molecular nanotech at eleven, but I could be a special case.)

Similarly: yes, really, life is better than death—just because transhumanists have huge arguments with bioconservatives over this issue doesn’t mean the eight-year-old isn’t making the right judgment for the right reasons.

Or: even an eight-year-old who’s read a couple of science-fiction stories and who’s ever cracked a history book can guess—not for the full reasons in full detail, but still for good reasons—that if you wake up in the Future, it’s probably going to be a nicer place to live than the Present.

In short—though it is the sort of thing you ought to review as a teenager and again as an adult—from a rationalist standpoint, there is nothing alarming about clicking on cryonics at age eight… any more than I should worry about my first schism with Orthodox Judaism coming at age five, when they told me that I didn’t have to understand the prayers in order for them to work so long as I said them in Hebrew. It really is obvious enough to see as a child, the right thought for the right reasons, no matter how much adult debate surrounds it.

And the frustrating thing was that—judging by this group—most cryonicists are people to whom it was just obvious. (And who then actually followed through and signed up, which is probably a factor-of-ten or worse filter for Conscientiousness.) It would have been convenient if I’d discovered some particular key insight that convinced people. If people had said, “Oh, well, I used to think that cryonics couldn’t be plausible if no one else was doing it, but then I read about Asch’s conformity experiment and pluralistic ignorance.” Then I could just emphasize that argument, and people would sign up.

But the average experience I heard was more like, “Oh, I saw a movie that involved cryonics, and I went on Google to see if there was anything like that in real life, and found Alcor.”

In one sense this shouldn’t surprise a Bayesian, because the base rate of people who hear a brief mention of cryonics on the radio and have an opportunity to click will be vastly higher than the base rate of people who are exposed to detailed arguments about cryonics...

Yet the upshot is that—judging from the generation of young cryonicists at that event I attended—cryonics is sustained primarily by the ability of a tiny, tiny fraction of the population to “get it” just from hearing a casual mention on the radio. Whatever part of one-in-a-hundred-thousand isn’t accounted for by the Conscientiousness filter.

If I suffered from the sin of underconfidence, I would feel a dull sense of obligation to doubt myself after reaching this conclusion, just like I would feel a dull sense of obligation to doubt that I could be more rational about theology than my parents and teachers at the age of five. As it is, I have no problem with shrugging and saying “People are crazy, the world is mad.”

But it really, really raises the question of what the hell is in that click.

There’s this magical click that some people get and some people don’t, and I don’t understand what’s in the click. There’s the consequentialist/utilitarian click, and the intelligence explosion click, and the life-is-good/death-is-bad click, and the cryonics click. I myself failed to click on one notable occasion, but the topic was probably just as clickable.

(In fact, it took that particular embarrassing failure in my own history—failing to click on metaethics, and seeing in retrospect that the answer was clickable—before I was willing to trust non-click Singularitarians.)

A rationalist faced with an apparently obvious answer must assign some probability that a non-obvious objection will appear and defeat it. I do know how to explain the above conclusions at great length, and defeat objections, and I would not be nearly as confident (I hope!) if I had just clicked five seconds ago. But sometimes the final answer is the same as the initial guess; if you know the full mathematical story of Peano Arithmetic, 2 + 2 still equals 4 and not 5 or 17 or the color green. And some people very quickly arrive at that same final answer as their best initial guess; they can swiftly guess which answer will end up being the final answer, for what seem even in retrospect like good reasons. Like becoming an atheist at eleven, then listening to a theist’s best arguments later in life, and concluding that your initial guess was right for the right reasons.

We can define a “click” as following a very short chain of reasoning, which in the vast majority of other minds is derailed by some detour and proves strongly resistant to re-railing.

What makes it happen? What goes into that click?

It’s a question of life-or-death importance, and I don’t know the answer.

That generation of cryonicists seemed so normal apart from that...

What’s in that click?

The point of the opening anecdote about the Mind Projection Fallacy (blank map != blank territory) is to show (anecdotal) evidence that there’s something like a general click-factor, that someone who clicked on cryonics was able to click on mysteriousness=projectivism as well. Of course I didn’t expect that I could just stand up amid the conference and describe the intelligence explosion and Friendly AI in a couple of sentences and have everyone get it. That high of a general click-factor is extremely rare in my experience, and the people who have it are not otherwise normal. (Michael Vassar is one example of a “superclicker”.) But it is still true AFAICT that people who click on one problem are more likely than average to click on another.

My best guess is that clickiness has something to do with failure to compartmentalize—missing, or failing to use, the mental gear that lets human beings believe two contradictory things at the same time. Clicky people would tend to be people who take all of their beliefs at face value.

The Hansonian explanation (not necessarily endorsed by Robin Hanson) would say something about clicky people tending to operate in Near mode. (Why?)

The naively straightforward view would be that the ordinary-seeming people who came to the cryonics gathering did not have any extra gear that magically enabled them to follow a short chain of obvious inferences, but rather, everyone else had at least one extra insanity gear active at the time they heard about cryonics.

Is that really just it? Is there no special sanity to add, but only ordinary madness to take away? Where do superclickers come from—are they just born lacking a whole lot of distractions?

What the hell is in that click?