No—expected value is what matters. If enough successful FAI scenarios would result in negative value, then an outcome of zero value (universal extinction) would be preferable in expectation.
We should put some thought into whether a negative-value universe is plausible, and what it would look like.