Seems to me that it’s a pretty serious bug that currently the only way we have for making more human-level intelligences involves causing one heck of a lot of pain, risk, and general discomfort to already-existing human-level intelligences.
And therefore?
Seems to me that all the potential ways of making more human-or-greater-level intelligences discussed at this site involve one heck of a lot of pain (in terms of man-hours of work), risk (of the existential kind), and general discomfort (in adjusting to AI technology) to already-existing human-level intelligences. And yet, when it happens, if we survive and all, it will be rather awesome.
I mean, is calling the Grand Canyon awesome minimizing the hazards that flash-flooding rivers may pose? Is calling a skyscraper awesome callous to those who have ever labored or died in its construction?
Can you point to any bug-free system?