For what it’s worth, I personally don’t worry about this concern much, for one specific reason:
I think that while it will be trivial to isolate yourself from the outside world, and living by yourself à la hikikomori will be far more practical than it is today, I don’t expect such people to control much of the long-run future; indeed, it’s very plausible they end up with essentially 0.001% control over the universe, or less.
So compared to basically every other problem, putting any effort into this one strikes me as the epitome of the perfect future being the enemy of a merely great future, and if this were the only problem we faced, I’d basically endorse building superintelligence now.