My question has only a little bit to do with the probability that an AI project is successful. It has mostly to do with P(universe goes to waste | AI projects are unsuccessful). For instance, couldn’t the universe go on generating human utility after humans go extinct?
How? By coincidence?
(I’m assuming you also mean no posthumans, if humans go extinct and AI is unsuccessful.)
Aliens. I would be pleased to learn that something amazing was happening (or was going to happen, long “after” I was dead) in one of those galaxies. Since it’s quite likely that something amazing is happening in one of those 80 billion galaxies, shouldn’t I be pleased even without learning about it?
Of course, I would be correspondingly distressed to learn that something horrible was happening in one of those galaxies.
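One way to make the disagreement precise is as an expected-value decomposition. This is only a sketch in my own notation, none of which comes from the original exchange: let S be the event that an AI project succeeds, W the event that the universe "goes to waste," and V the realized value of the universe.

```latex
% Illustrative decomposition (notation mine, not the original posters'):
%   S = "an AI project succeeds", W = "the universe goes to waste",
%   V = realized value of the universe.
\[
\mathbb{E}[V] \;=\; P(S)\,\mathbb{E}[V \mid S]
  \;+\; \bigl(1 - P(S)\bigr)\,\mathbb{E}[V \mid \neg S].
\]
% The question targets the second term. If aliens (or other moral
% patients) can generate value even when our AI projects fail, then
% P(W | \neg S) < 1, and the failure branch is not worthless:
\[
\mathbb{E}[V \mid \neg S]
  \;=\; P(W \mid \neg S)\cdot 0
  \;+\; \bigl(1 - P(W \mid \neg S)\bigr)\,
        \mathbb{E}[V \mid \neg S,\, \neg W]
  \;>\; 0.
\]
```

On this reading, the "Aliens" answer is a claim that P(W | not-S) is well below 1, which lowers the stakes of AI success relative to a picture where the failure branch has zero value.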