I especially like the following points:
1.1. Clarify the open problems relevant to our core mission.
1.5. Estimate current AI risk levels.
2.2.b. Make use of LessWrong.com for collaborative problem-solving (in the manner of the earlier LessWrong.com progress on decision theory).
2.3. Spread our message and clarify our arguments with public-facing academic deliverables.
What I would add to the list is to directly and publicly engage people like Holden Karnofsky from GiveWell or John Baez. They seem to have the necessary background knowledge and know the math. If you can convince them, or show that they are wrong, you will have defeated your strongest critics. Other people include Katja Grace and Robin Hanson. All of them are highly educated, have read the Sequences, and disagree with the Singularity Institute.
I admit that you have pretty much defeated Hanson and Baez, as they haven't been able or willing to put forth much substantive criticism regarding the general importance of an organisation like the Singularity Institute. I am unable to judge the arguments made by Grace and Karnofsky, as they largely exceed my current grasp of the math involved, but judging by the upvotes on Karnofsky's latest post, and by his position, I suspect it would be a productive exercise to refute his objections.
SI has an internal roadmap of papers it would like to publish to clarify and extend our standard arguments, and these would simultaneously address many public objections. At the same time, we don't want to be sidetracked from pursuing our core mission by taking the time to respond to every critic. It's a tough thing to balance.