The WBE question is ultimately a question of values: is the WBE part of mankind, and does its survival count as the survival of mankind, or not? If you give a WBE to someone who has no relatives, no friends, no children, and who is also a complete sociopath or, worse yet, a psychopath, then yeah, they might self-optimize into god knows what. But even then, they're far more likely to just keep on pleasing themselves, or something of the sort.
On top of this, even if you are ultra clever and running at 10^15 flops, you may be a brilliant engineer, but you won't forecast the weather much better than the best weather-prediction supercomputer, because there are no clever shortcuts around the Lyapunov exponent: in a chaotic system, tiny errors in the initial conditions grow exponentially, so extra raw intelligence buys you very little extra forecast horizon. Even an exact-but-for-one-atom replica of Earth wouldn't predict the weather very well. Pulling the trick from 'Limitless' (or Ted Chiang's 'Understand') on the stock market may likewise be impossible.
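To make the Lyapunov point concrete, here's a minimal illustrative sketch (not a weather model; the logistic map is just a standard stand-in for chaotic dynamics). Its Lyapunov exponent at r = 4 is ln 2, so an initial error of 10^-12 roughly doubles every step and swamps the signal after a few dozen iterations, no matter how much compute you throw at the forecast:

```python
# Sensitive dependence on initial conditions in the chaotic logistic map
# x_{n+1} = r * x_n * (1 - x_n) with r = 4, whose Lyapunov exponent is ln 2.
# An initial error of 1e-12 grows roughly like 1e-12 * 2^n, so the two
# trajectories decorrelate after a few dozen steps.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-12   # two initial conditions, 1e-12 apart
divergence_step = None
for n in range(1, 101):
    x, y = logistic(x), logistic(y)
    if divergence_step is None and abs(x - y) > 0.1:
        divergence_step = n  # first step where the error is macroscopic

print(divergence_step)
```

The only way to push `divergence_step` out further is to shrink the initial error, and each extra step of forecast horizon costs you another factor-of-two improvement in measurement precision, which is why a smarter forecaster can't beat the exponent.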
That's the thing with fiction: we make up the characters that are easy to make up, which means the super-intelligent characters end up as pretty average monkeys doing savant tricks, and powers of prediction are the easiest savant trick to speculate up (and the one we should expect least). The same goes for questions about what we should and shouldn't expect of AI.