Not to mention the question of just how friendly a heavily enhanced human would be. Do I want an aggressive kingmaker with plenty of money to spend on upgrades, massively amplifying their intelligence to increase their power? How about a dictator who has been squirreling away massive, illegally obtained funds?
Power corrupts, and even if enhancements are made widely available, there’s a good possibility of an accelerating (or at least linearly increasing) gap in cognitive enhancement: I have the best enhancement, therefore I can find a quicker path to improving my own position, including inventing new enhancements as the need arises, thereby securing my place at the top long enough to seize a great deal of control. An average person may end up with greatly increased intelligence that is nonetheless minuscule relative to what they could attain given the resources to do so.
In a scenario where someone with access to vast resources can immediately begin to control the game at a level of precision far beyond what all but a handful of people can obtain, the result may be a vast improvement over a true UFAI let loose on an unsuspecting universe, but it is still a highly undesirable outcome. I would much rather have an FAI (and I suspect some of these hypothetical persons would decide it was in their best interest to block any effort to build something that outstrips their capacity to control their environment, FAI or not).