When Anders Sandberg uses a simulated-atmosphere / simulated-brain analogy and says roughly “We’re interested in climate, not weather,” I’m tempted to reply “Speak for yourself, buddy.” Many people will be interested in the “weather” of the brain as well as the “climate”. This is especially true if the brain emulation is proposed as an upload of a particular person.
It is quite possible that both the brain and the course of an individual life (relationships, jobs, and so on) are chaotic enough that a relatively minor-looking variation in brain activity leads to a vastly different life. And of course, a vastly different life will in turn change the “climate” of brain activity.
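To make the chaos point concrete, here is a minimal sketch (my own illustration, not anything from Sandberg's work) using the logistic map as a toy chaotic system standing in for “brain activity”. A perturbation of one part in a billion swamps the trajectory within a few dozen steps:

```python
# Toy illustration of sensitive dependence on initial conditions.
# The logistic map stands in for any chaotic system; this is not a
# model of neural dynamics.
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate x -> r * x * (1 - x) and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

organic = logistic_trajectory(0.400000000)    # the "organic" run
emulated = logistic_trajectory(0.400000001)   # perturbed by one part in a billion

for t in range(0, 51, 10):
    print(f"step {t:2d}: |difference| = {abs(organic[t] - emulated[t]):.9f}")
```

By around step 30 the two runs are effectively uncorrelated: the “weather” has diverged completely even though the “climate”, the long-run statistics of the map, is unchanged.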
It’s probably arguable that the expected utility of the two lives, for an agent who obeys something like the von Neumann–Morgenstern axioms, is the same. The emulated brain might, due to random fluctuations, miss out on a wonderful relationship or career that the organic brain would have enjoyed. But the exact reverse might be true instead, with equal probability.
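Sketched in standard notation (my framing, under the assumption that both lives amount to the same lottery over outcomes $o_i$ with probabilities $p_i$, differing only in which branch the noise happens to pick out):

$$
\mathbb{E}[U_{\text{emulated}}] \;=\; \sum_i p_i\,U(o_i) \;=\; \mathbb{E}[U_{\text{organic}}],
$$

so an agent who satisfies the axioms should be indifferent before the fact, even though the realized lives may end up wildly different.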
Such an expected-utility argument will probably satisfy those who satisfy the axioms. In other words, a minority. It could in principle also satisfy those who, on rational reflection, would come to satisfy the axioms. In other words, still a minority.