grey goo has to out-compete the existing biosphere. This seems hard.
Really? Von Neumann machines (the universal assembler self-replicating variety, not the computer architecture) versus regular ol’ mitosis, and you think mitosis would win out?
I’ve only ever heard “building self-replicating machinery on a nano-scale is really hard” as the main argument against the immediacy of that particular x-risk, never “even if there were self-replicators on a nano-scale, they would have a hard time out-competing the existing biosphere”. Can you elaborate?
As one of my physics professors put it, “We already have grey goo. They’re called bacteria.”
The intuition behind the grey goo risk appears to be “as soon as someone makes a machine that can make itself, the world is a huge lump of matter and energy just waiting to be converted into copies of that machine.” That is, of course, not true: matter and energy are prized and fought over, and any new contender is going to have to join the fight.
That’s not to say it’s impossible for an artificial self-replicating nanobot to beat the self-replicating nanobots which have evolved naturally, just that it’s hard. For example, it’s not clear to me what part of “regular ol’ mitosis” you think is regular and easy to improve upon. Is it that the second copy is built internally, shielding it from attack and corruption?
Bacteria et al. are only a locally optimal solution after a long series of selection steps, each of which generally needed to be an improvement upon the previous step, i.e. the result of a greedy algorithm. There are few problems in which you’d expect a greedy algorithm to end up anywhere but in a very local optimum:
DNA is a hilariously inefficient way of storing partly superfluous data (all of which must be copied at every mitosis); informational density could be an order of magnitude or more higher with minor modifications, and the safety redundancies are precarious at best compared to e.g. a Hamming code. A few researchers in a poorly funded government lab can come up with deadlier viruses in a few years (remember the recent controversy) than what nature engineered in millennia. That’s not to say that, compared to our current macroscopic technology, the informational feats of biological data transmission, duplication etc. aren’t impressive, but that’s only because we’ve not yet achieved molecular manufacturing (a necessity for a Grey Goo scenario). (We could go into more details on gross biological inefficiencies if you’d like.)
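(To make the Hamming comparison concrete, here is a minimal Python sketch of the classic Hamming(7,4) scheme; it’s purely my own illustration, not a claim about how any actual nanobot would store data. Three parity bits protect four data bits, and any single flipped bit in the seven-bit block can be located and corrected.)

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Locate any single-bit error via the syndrome and flip it back."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit, 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

if __name__ == "__main__":
    word = hamming74_encode([1, 0, 1, 1])
    corrupted = list(word)
    corrupted[5] ^= 1                  # flip one bit "in transit"
    assert hamming74_correct(corrupted) == word
```

The point is only about the shape of the tradeoff: a deliberately designed redundancy scheme buys guaranteed single-error correction for a fixed, modest overhead, which is the kind of engineered efficiency I mean.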
Would you expect some antibodies and phagocytosis to defeat an intelligently engineered self-replicating nanobot the size of a virus (but which doesn’t depend on live cells, and without the telltale flaws and Pandemic-reminiscent tradeoffs of “can’t kill the host cell too quickly” etc.)?
To me it seems like saying “if you drowned the world in acid, the biosphere could well win the fight in a semi-recognizable form and claim the negentropy for itself” (yes, cells can survive in extremely adverse environments and persist in some sort of niche, but I wouldn’t exactly call such a pseudo-equilibrium winning, and the self-replicators wouldn’t exactly wait for their carbon food source to adapt evolutionarily).
A few researchers in a poorly funded government lab can come up with deadlier viruses in a few years (remember the recent controversy) than what nature engineered in millennia.
Killing one human is easier than converting the entire biosphere.
Would you expect some antibodies and phagocytosis to defeat an intelligently engineered self-replicating nanobot the size of a virus (but which doesn’t depend on live cells, and without the telltale flaws and Pandemic-reminiscent tradeoffs of “can’t kill the host cell too quickly” etc.)?
Well, that depends on what I think the engineering constraints are. It could be that in order to be the size of a virus, self-assembly has to be outsourced. It could be that in order to be resistant to phagocytosis, it needs exotic materials which limit its growth rate and maximum extent.
To me it seems like saying “if you drowned the world in acid, the biosphere could well win the fight in a semi-recognizable form and claim the negentropy for itself”
It’s more “in order to drown the world in acid, you need to generate a lot of acid, and that’s actually pretty hard.”
A few researchers in a poorly funded government lab can come up with deadlier viruses in a few years (remember the recent controversy) than what nature engineered in millennia.
Yes, and you may have noticed that bioengineered pandemic was voted top threat.