I was modelling it as a superintelligence acting on eg GWB’s behalf, including doing his moral philosophy (ie GWB’s Extrapolated Volition). I see I wasn’t exactly explicit about that assumption.
Let’s put it this way: conditional on the BDFL doing well by their own standards (so, not the usual human failure modes), I would probably find that world superior to this one.
The only wrench in the works here is human corruption by power, but then it’s debatable whether the BDFL is doing well by their own (previous) standards.
I broadly agree with this. GWB probably thinks of eg minimising gay sex as a terminal value, but I would have thought that a superintelligence extrapolating GWB’s Extrapolated Volition would figure out that this value was conditional on there being a God, which there isn’t, and discard it.