This all stems from a misunderstanding, which is that “Singularity” in that sentence implicitly means “positive Singularity”.
A misunderstanding on the part of the article’s author, presumably. The page continues:
If there’s a Singularity effort that has a strong vision of this future and supports projects that explicitly focus on transhuman technologies such as brain-computer interfaces and self-improving Artificial Intelligence, then humanity may succeed in making the transition to this future a few years earlier, saving millions of people who would have otherwise died.
This adds support to the “mad rush → reap benefits → yay!” interpretation. That is quite some distance from the current, more sensible position that the main concern should be how well the transition goes.
The SIAI didn’t really invent machine intelligence doom-mongering marketing. Hugo de Garis and Kevin Warwick were doing that before them—and machines-gone-wrong has been a staple of science fiction for far longer.