Oops oh no. I used the wrong word. I meant planetary civilization, e.g. humanity or an alien civilization. Sorry.
I’ll edit the post to replace “civilization” with “planetary civilization.” Thank you for commenting, you saved me from confusing everyone!
In the discussion of "You can, in fact, bamboozle an unaligned AI into sparing your life," the people from planet 1 can revive the people of another planet (planet 2) that was taken over by a misaligned ASI, provided that ASI saved the brain states of planet 2's people before killing them.
Both planet 1's people and the ASI from planet 2 might colonize the stars, expanding further and further until they meet each other. The ASI might then sell the brain states of planet 2's people to planet 1's people, so that planet 1's people can revive them.
Planet 1's people agree to this deal because they care about saving people from other planets. The ASI from planet 2 agrees because planet 1's people might give it a tiny bit more resources for making paperclips.
This was just one of many ideas for how one surviving planetary civilization could revive others.
Thanks for the clarification—I'm still a bit unsure whether "planetary civilization" is distinct from "the specific set of individuals inhabiting a planet." I should admit that I'm highly skeptical of the value (to an AGI, or even to other humans) of a specific individual's brain-state, and I have a lot of trouble following arguments that imply migration or resurrection of more than a few percent of biological intelligences.
Sorry, yes, a planetary civilization is simply the specific set of individuals inhabiting a planet. I'm not sure of a better way to say that in two words :/ What I described was only one of very many ideas proposed in that discussion; the overall point is that even a few surviving civilizations can do a lot of good.
How valuable a few surviving civilizations are depends on your ontology. If you believe in the many-worlds interpretation of quantum mechanics, or believe that the universe is infinitely big, then there are infinitely many exact copies of Earth. Even if only 0.1% of Earths were saved, there would still be infinitely many copies of future you alive, just at 0.1% of the density.
The planetary civilization saving Earth may have immense resources in the post-singularity world. After millions of years of technological progress, technology will be limited only by the laws of physics. They could expand outward at close to the speed of light and control the matter and energy of 10²² stars. Meanwhile, the energy required to simulate all of humanity, using the most efficient computers possible, is probably not much more than that of running one electric car.[1]
They could easily simulate 1000 copies of humanity.
This means that for every 1000 identical copies of you, 999 might die while one survives and is duplicated 1000 times—leaving the total number of copies of you unchanged.
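A tiny sketch of that copy-counting arithmetic, using the comment's own illustrative numbers (1000 copies, 999 of which die, with each surviving world simulating 1000 duplicates):

```python
# Back-of-the-envelope check of the copy-counting argument above.
# All numbers are the comment's illustrative assumptions, not predictions.
copies_before = 1000            # identical copies of you across worlds
surviving_worlds = 1            # worlds where the civilization survives (999 die)
duplicates_per_survivor = 1000  # simulated copies each surviving world runs

copies_after = surviving_worlds * duplicates_per_survivor
print(copies_before, copies_after)  # 1000 1000: the count of copies is preserved
```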
If you care not about personal survival but about whether the average sentient life in all of existence is happy or miserable, then it is also good for planetary civilizations to randomize their strategies: this ensures at least a few survive, and those few can use their immense resources to create far more happy lives than all the miserable lives of pre-singularity times.
The human brain uses 20 watts of energy, but is very inefficient: each neuron firing uses about 6×10⁸ ATP molecules. If a simulated neuron firing used only the energy equivalent of 60 ATP molecules, it would be 10⁷ times more efficient, and 8 billion people would use only 16,000 watts, similar to an electric car.
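The footnote's arithmetic can be checked directly; every figure below is the footnote's own assumption (20 W per brain, 6×10⁸ vs. 60 ATP-equivalents per firing, 8 billion people), not an established fact:

```python
# Rough check of the footnote's energy arithmetic.
brain_watts = 20            # assumed power draw of one human brain
atp_per_spike_bio = 6e8     # ATP molecules per biological neuron firing
atp_per_spike_sim = 60      # assumed energy-equivalent per simulated firing
population = 8e9            # people being simulated

efficiency_gain = atp_per_spike_bio / atp_per_spike_sim
total_watts = population * brain_watts / efficiency_gain
print(efficiency_gain, total_watts)  # 10000000.0 16000.0
```

So the claimed 10⁷-fold efficiency gain and the 16,000-watt total are internally consistent, whatever one thinks of the assumptions.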