which is still a small cost for a universe-wide superintelligent AI, which will control billions of billions of stars for tens of billions of years
All of the stars will be dead in 100 trillion years (though it’s likely a good org will aestivate and continue most of its activities beyond that, which supposedly would give it much higher operating efficiency than anything imaginable now). There are only about 50 billion stars in the local cluster, and as far as I know it’s not physically possible to spread beyond the local cluster. Everything beyond it is just a bunch of fading images that we’ll never touch. (I tried to substantiate this, and the only simple account I could find was a YouTube video. Such is our internet: https://www.youtube.com/watch?v=ZL4yYHdDSWs was the best I could do.)
(And it doesn’t seem sound, to me, to guess that we’ll ever find a way around the laws of relativity just because we really want to.)
It still seems profoundly hard to tell how much of the distribution of a history generator is going to be fictional, and it wouldn’t surprise me if the methods you have in mind generate mostly cosmically unlikely life-histories. You essentially have to get the measure of your results to match the measure of the people who really lived and died. We have access to a huge measure multiplier, but it’s finite, and the error rate might be just as huge.
How many lives-worth of energy are you trading away for every resurrection?
Personally, I think that it would not be computationally intensive for an AI capable of creating past simulations (and it will create them anyway for some instrumental reasons), so it will more likely take less than 1000 years and a small fraction of one star’s energy. This is based on some ideas about the limits of computation and the power of the human brain, and I think Bostrom had calculations along these lines in his article about simulations.
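To make the “small fraction of one star’s energy” claim concrete, here is a back-of-envelope sketch in the spirit of Bostrom’s simulation-argument estimates. All of the figures below are order-of-magnitude assumptions, not established values: ~100 billion humans ever born, an upper-end estimate of ~10^17 operations per second for a brain, and the idealized Landauer limit for energy per operation (real hardware is many orders of magnitude less efficient, so treat the result as a theoretical floor, not a forecast).

```python
import math

# Assumed order-of-magnitude figures (not established values):
humans_ever = 1e11               # ~100 billion people ever born
brain_ops_per_sec = 1e17         # upper-end estimate of brain compute
avg_lifetime_sec = 50 * 3.15e7   # ~50 years in seconds

# Total operations to simulate every past human life once
total_ops = humans_ever * brain_ops_per_sec * avg_lifetime_sec

# Landauer limit near the cosmic background temperature (~3 K):
# minimum energy dissipated per irreversible bit operation
k_B = 1.38e-23                   # Boltzmann constant, J/K
T = 3.0                          # kelvin
joules_per_op = k_B * T * math.log(2)

energy_needed = total_ops * joules_per_op   # joules, idealized floor
sun_output_watts = 3.8e26                   # one Sun-like star's luminosity

print(f"total ops:                 {total_ops:.1e}")
print(f"energy at Landauer floor:  {energy_needed:.1e} J")
print(f"seconds of one star:       {energy_needed / sun_output_watts:.1e}")
```

At the Landauer floor the whole computation costs a vanishing fraction of a second of one star’s output; even granting realistic hardware a factor of 10^10 worse efficiency, it remains far below one star’s lifetime energy budget, which is the shape of the argument above.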
However, I think that we are morally obliged to resurrect all the dead, as most people of the past dreamed of some form of life after death. They lived and died for us and for our ability to create advanced technology. We will pay the price back.