The Singularity Institute exists to carry out the mission of the Singularity-aware – to accelerate the arrival of the Singularity in order to hasten its human benefits; …
This seems a rather gung-ho attitude, inconsistent with the message on the rest of the site. And this isn't just my misreading or quoting out of context: apparently that page is very out of date and no longer represents the worldview of the more mature, grown-up SIAI.
Machine intelligence is a race. I think everyone involved is aware of the time pressure element. About the only strategy that doesn’t involve attempting rapid progress is sabotaging other people’s projects—and that looks like a pretty ineffective strategy—not least because such destruction probably won’t “get” all of the projects.
I emailed them to ask about that particular sentence, and got back that it was out of date and doesn’t accurately reflect their current position.
The issue is a lot more nuanced than just “singularity is bad” or “singularity is good” and these subtleties need to be made clear. Don’t assume that your line of thinking will be immediately obvious to readers.
[EDIT: … obvious to readers of the SIAI website, that is.]
The available strategies involve accelerating progress, decelerating progress, or not affecting progress. We each have some probabilistic estimate of which of these is best.
The advantages of going fast include the possibility of actively taking positive steps towards a positive outcome. The disadvantages include the possibility of messing up. The disadvantages of attempting to go slowly include the difficulty of affecting very many teams. Unilaterally going slowly is likely to be especially pointless: that just means you will lose.
My current expectation is that the successful team is likely to prioritise going rapidly quite highly, that the race will be quite intense, and that there is little point in aiming to lose.
I'd be kind-of surprised if others thought differently. For instance, does anyone really think it is practical or desirable to try to slow things down? The Unabomber tried that. It doesn't look terribly practical or desirable to me.
I think you’re missing my point. I’m not arguing about which strategy is best but simply about whether what’s on the website reflects what SIAI actually believes.
You are not concerned with which strategy is best? I see.
On reflection, “to accelerate the arrival of the Singularity in order to hasten its human benefits” does sound bad. If someone told me that was their explanation for why they wanted their program to go rapidly, my expectation would be that they were either confused or not telling the truth.
About the only strategy that doesn’t involve attempting rapid progress is sabotaging other people’s projects—and that looks like a pretty ineffective strategy—not least because such destruction probably won’t get all the projects.
I'd go as far as to suspect that sabotage attempts are likely to speed up the rate of research, and so can only be expected to push back the critical date once the situation has become particularly urgent.
Sabotage and negative marketing seem rather common. For example, here is some baseless shit slinging:
And if Novamente should ever cross the finish line, we all die.
I'm not clear what net effect (if any) such FUD has on the overall rate of progress, though. Usually such strategies aim at hampering competitors, not at manipulating the overall rate of progress.
I think we should probably discourage the use of negative marketing in this area. It is more likely to be used by organisations with poor moral scruples, of the type we do not want to see gain an advantage. Public disapproval may not eliminate it, but it might at least drive it underground.