This is unrealistic. It assumes:
Orders of magnitude more intelligence
That such intelligence would actually be useful in a physical world with hard physical limits
The more worrying prospect is that the AI might not necessarily fear suicide. Suicidal actions are quite prevalent among humans, after all.