Though Bostrom seems right to talk about better transmission (which could be parsed into more reliable, more robust, faster, more compact, nested, and so on), he stops short of looking deeply into what made cultural transmission better. To claim that a slight improvement in (general) mental faculties did it would be begging the question. Brilliant though he is, Bostrom is “just” a physicist, mathematical logician, philosopher, economist, and computational neuroscientist who invented the field of existential risks and revolutionized anthropics, so his knowledge of cultural evolution and this transition is somewhat speculative. That’s why we need other people :)
In that literature we have three main contenders for what allowed human prowess to reshape earth:
Symbolic ability: the ability to competently process symbols (which have a technical definition too involved to give here) and understand them in a timely fashion is unique to humans and some now-extinct anthropoids. Terrence Deacon argues that this is what matters in The Symbolic Species.
Iterative recursive processing: this has been argued for in many styles.
Chomsky argued for the primacy of recursion as a prerequisite for human language in the late fifties.
Pinker endorses this in The Language Instinct and in The Stuff of Thought.
The Mind Is A Computer metaphor (Lakoff 1999) has been widely adopted and very successful memetically. Though it has other distinctions, its main difference from “Mind Is A Machine” is that recursion is involved in computers but not in all machines. The computational theory of mind thrived in the hands of Pinker, Koch, Dennett, Kahneman and, more recently, Tononi. Within LW and among programmers, Mind Is A Computer is frequently taken to be the fundamental metaphysics of mind, and a final shot at the ontological constituent of our selves, a perspective I considered naïve here.
Ability to share intentions: the ability to share goals and intentions, and to parallelize in virtue of doing so with conspecifics (Tomasello 2005).
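To make the recursion contender above concrete: Chomsky’s point was that a single finite, self-referential rule can generate unboundedly nested sentences, something no fixed lookup table can do. A minimal illustrative sketch (the grammar fragment and function name are my own, not from any of the cited authors):

```python
def nested(depth):
    """Generate a sentence from one recursive rule:
    S -> "the mouse" | "the cat that saw " + S
    A finite rule yields arbitrarily deep nesting, the property
    Chomsky took as distinctive of human language."""
    if depth == 0:
        return "the mouse"
    return "the cat that saw " + nested(depth - 1)

nested(2)  # -> "the cat that saw the cat that saw the mouse"
```

The same self-application is what separates computers from machines in general in the Mind Is A Computer metaphor: a lever cannot apply itself to its own output, but a program can call itself.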
Great books on evolutionary transmission are Not By Genes Alone, The Meme Machine and LWer Tim Tyler’s Memetics.
When I was thinking about past discussions, I realized something like:
(selfish) gene → meme → goal.
When Bostrom thinks about the probability of a singleton, I am afraid he overlooks the possibility of running several ‘personalities’ on one substrate. (We could suppose several teams have the possibility to run their projects on one piece of hardware, just as several teams can use the Hubble telescope to observe different objects.)
And not only the possibility, but probably also the necessity.
If we want to prevent a destructive goal from being realized (and destroying our world), then we have to think about multipolarity.
We need to analyze how slightly different goals could control each other.
I’ll coin the term Monolithic Multipolar for what I think you mean here: one stable structure that has different modes activated at different times, where these modes don’t share goals, like a human, especially a schizophrenic one.
The problem with Monolithic Multipolarity is that it is fragile. In humans, what causes us to behave differently and want different things at different times is not accessible for revision; otherwise, each party would have an incentive to steal the other’s time. An AI would not need to deal with such a constraint, since, by the definition of explosive recursive self-improvement, it can rewrite itself.
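A minimal sketch of the arrangement being discussed, assuming a veto-based design where each mode can block the others’ actions (all class and function names here are my own illustrative inventions, not anything from Bostrom):

```python
class Mode:
    """One goal system time-sharing the substrate."""
    def __init__(self, name, propose, veto):
        self.name = name
        self.propose = propose  # () -> proposed action (str)
        self.veto = veto        # action -> True if this mode blocks it

def step(modes, active_index):
    """Run one time-slice: the active mode proposes an action, which
    executes only if no other mode vetoes it."""
    action = modes[active_index].propose()
    for i, mode in enumerate(modes):
        if i != active_index and mode.veto(action):
            return (action, False)  # blocked by another mode
    return (action, True)

# A goal-pursuing mode and a damping mode that vetoes expansion.
builder = Mode("builder", lambda: "expand infrastructure", lambda a: False)
moderator = Mode("moderator", lambda: "idle", lambda a: "expand" in a)

step([builder, moderator], 0)  # -> ("expand infrastructure", False)
```

The fragility point is visible in the sketch: the whole scheme depends on no mode being able to rewrite the `step` loop or the other modes’ veto functions, which is exactly the access a recursively self-improving system has by definition.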
We need other people, but Bostrom doesn’t leave simple things out easily.
One mode could have the goal of being something like the graphite moderator in a nuclear reactor: to prevent an unmanaged explosion.
At this moment I just wanted to improve our view of the probability of there being only one SI in the starting period.