The Reality of Emergence

  • Reply to The Futility of Emergence

    In The Futility of Emergence, Eliezer takes an overly critical position on emergence as a theory. In this (short) article, I hope to challenge that view.


    Emergence is not an empty phrase. The statements “consciousness is an emergent phenomenon” and “consciousness is a phenomenon” are not the same thing; the former conveys information that the latter does not. When we say something is emergent, we are referring to a well-defined concept.
    From Wikipedia:

    emergence is a phenomenon whereby larger entities arise through interactions among smaller or simpler entities such that the larger entities exhibit properties the smaller/simpler entities do not exhibit.

    “A is an emergent property of X” means that A arises from X in a way that is contingent on the interaction of the constituents of X (and not on those constituents themselves). If A is an emergent property of X, then the constituents of X do not possess A. A comes into existence as a categorial novum at the inception of X. The difference between system X and its constituent components with regard to property A is a difference of kind and not of degree; X’s constituents do not possess A in some tiny magnitude—they do not possess A at all.

    Taken literally, that description fits every phenomenon in our universe above the level of individual quarks, which is part of the problem.

    This is blatantly not true; size and mass, for example, are properties of elementary particles.

    You can make no new predictions. You do not know anything about the behavior of real-world minds that you did not know before. It feels like you believe a new fact, but you don’t anticipate any different outcomes. Your curiosity feels sated, but it has not been fed. The hypothesis has no moving parts—there’s no detailed internal model to manipulate. Those who proffer the hypothesis of “emergence” confess their ignorance of the internals, and take pride in it; they contrast the science of “emergence” to other sciences merely mundane.

    I respectfully disagree.

    When we say A is an emergent property of X, we say that X is more than the sum of its parts. Aggregation and amplification of the properties of X’s constituents do not produce the properties of X. The proximate cause of A is not the constituents of X themselves; it is the interaction between those constituents.

    Emergence is testable and falsifiable, and it makes advance predictions: if I say A is an emergent property of system X, then I say that none of the constituent components of system X possess A (in any form or magnitude).
    Statement: “consciousness (in humans) is an emergent property of the brain.”
    Prediction: “individual neurons are not conscious to any degree.”

    Observing a supposed emergent property in constituent components falsifies the theory of emergence (as far as that theory/phenomenon is concerned).

    The strength of a theory is not what it can predict, but what it can’t. Emergence excludes a lot of things; size and mass are not emergent properties of atoms (elementary physical particles possess both of them). Any property that the constituents of X possess (even to an astronomically lesser degree) is not emergent. This excludes a whole lot of properties: size, mass, density, electrical charge, etc. In fact, based on my (virtually non-existent) knowledge of physics, I suspect that no fundamental or derived quantities are emergent properties (I once again reiterate that I don’t know physics).

    Emergence does not function as a semantic stop sign or curiosity stopper for me. When I say consciousness is emergent, I have provided a skeletal explanation (at the highest level of abstraction) of the mechanism of consciousness. I have narrowed my search; I now know that consciousness is not a property of neurons, but arises from the interaction thereof. To use an analogy that I am (somewhat) familiar with, saying a property is emergent is like saying an algorithm is recursive; we are providing a high-level abstract description of both the phenomenon and the algorithm, and conveying (non-trivial) information about both. In the former case, we convey that the property arises as a result of the interaction of the constituent components of a system (and is not reducible to the properties of those constituents). In the latter case, we specify that the algorithm operates by taking as input its own output for smaller instances of the problem. Saying a phenomenon is an emergent property of a system is analogous to saying an algorithm is recursive: you do not have enough information to construct either the phenomenon or the algorithm, but you know more about both than you did before, and the knowledge you have gained is non-trivial.
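
    To make the analogy concrete, here is a minimal sketch of quicksort in Python (my own illustration, a standard textbook formulation; nothing here is taken from Eliezer's post). Knowing only that the algorithm is "recursive" does not hand you this code, but it does tell you to expect the function to call itself on smaller instances of the problem:

    def quicksort(xs):
        # Base case: a list of zero or one elements is already sorted.
        if len(xs) <= 1:
            return xs
        pivot, rest = xs[0], xs[1:]
        # Recursive case: the algorithm takes as input its own output
        # on smaller instances of the problem.
        smaller = quicksort([x for x in rest if x <= pivot])
        larger = quicksort([x for x in rest if x > pivot])
        return smaller + [pivot] + larger

    print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # prints [1, 1, 2, 3, 4, 5, 6, 9]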

    Before: Human intelligence is an emergent product of neurons firing.
    After: Human intelligence is a product of neurons firing.

    How about this:
    Before: “The quicksort algorithm is a recursive algorithm.”
    After: “The quicksort algorithm is an algorithm.”

    Before: Human intelligence is an emergent product of neurons firing.
    After: Human intelligence is a magical product of neurons firing.

    This seems to work just as well:
    Before: “The quicksort algorithm is a recursive algorithm.”
    After: “The quicksort algorithm is a magical algorithm.”

    Does not each statement convey exactly the same amount of knowledge about the phenomenon’s behavior? Does not each hypothesis fit exactly the same set of outcomes?

    It seems clear to me that in both cases, the original statement conveys more information than the edited version. I argue that this is the same for “emergence”; saying a phenomenon is an emergent property does convey useful, non-trivial information about that phenomenon.

    I shall answer the question below:

    If I showed you two conscious beings, one which achieved consciousness through emergence and one that did not, would you be able to tell them apart?

    Yes. For the being which achieved consciousness through means other than emergence, I know that the constituents of that being are themselves conscious.

    Emergent consciousness: A human brain.
    Non-emergent consciousness: A hive mind.

    The constituents of the hive mind are themselves conscious, and I think that’s a useful distinction.

    “Emergence” has become very popular, just as saying “magic” used to be very popular. “Emergence” has the same deep appeal to human psychology, for the same reason. “Emergence” is such a wonderfully easy explanation, and it feels good to say it; it gives you a sacred mystery to worship. Emergence is popular because it is the junk food of curiosity. You can explain anything using emergence, and so people do just that; for it feels so wonderful to explain things. Humans are still humans, even if they’ve taken a few science classes in college. Once they find a way to escape the shackles of settled science, they get up to the same shenanigans as their ancestors, dressed up in the literary genre of “science” but still the same species psychology.

    Once again, I disagree with Eliezer. Describing a phenomenon as emergent is (for me) equivalent to describing an algorithm as recursive: it merely provides relevant characterisation to distinguish the subject (phenomenon/algorithm) from other subjects. Emergence is nothing magical to me. When I say consciousness is emergent, I carry no illusions that I now understand consciousness, and my curiosity is not sated, but I argue that I am now more knowledgeable than I was before; I have an abstract conception of the mechanism of consciousness. It is very limited, but it is better than nothing. Telling you quicksort is recursive doesn’t tell you how to implement quicksort, but it does (significantly) constrain your search space; if you were going to run a brute-force search of algorithm design space to find quicksort, you now know to confine your search to recursive algorithms. Telling you that quicksort is recursive brings you closer to understanding quicksort than being told it is just an algorithm. The same is true for saying consciousness is emergent: you now understand more about consciousness than you did before; you know that it arises as a categorial novum from the interaction of neurons. Describing a phenomenon as “emergent” does not convey zero information, and thus I argue the category is necessary. Emergence is only as futile an explanation as recursion is.

    Now that I have (hopefully) established that emergence is a real theory (albeit one with limited explanatory power, not unlike describing an algorithm as recursive), I would like to add something else. The above is a defence of the legitimacy of emergence as a theory; I am not necessarily saying that emergence is correct. It may be the case that no property of any system is emergent, and that all properties of systems are properties of at least one of their constituent components. The question of whether emergence is correct (whether there exists at least one property of at least one system that is not a property of any of its constituent components, not necessarily consciousness/intelligence) is an entirely different question, and is neither the thesis of this write-up nor a question I am currently equipped to tackle. If it is of any relevance, I do believe consciousness is at least a (weakly) emergent property of sapient animal brains.

    Part of The Contrarian Sequences