Outsiders are more likely to see your current focus (FAI) as the result of pruning causes than as a leap toward your passion: imagine if GiveWell were called GiveToMalariaCauses.
If it was pruning, that was a helluva lot of pruning in very little time early in SI's history. It requires accepting, all at once, that:

- an AI with a properly grounded real-world goal is likely to be built (nobody knows how to ground goals, and nobody needs to);
- extreme foom (a hyper-singularity) is possible;
- FAI is the only solution. I haven't seen SI working on using wireheading as a failsafe, or on exploiting the lack of symbol grounding for safety, despite examples of both: the theorem prover that wireheads, and AIXI, which doesn't ground symbols and won't model the shutdown of its physical hardware as a loss of reward to its logical structure.
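For concreteness on the AIXI point, here is a rough sketch of Hutter's expectimax definition (my notation is simplified, so treat it as a sketch rather than the canonical statement): at step $k$ the agent picks

$$a_k = \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m} \big( r_k + \cdots + r_m \big) \sum_{q \,:\, U(q, a_1 \ldots a_m) = o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)},$$

where $U$ is the universal machine and the inner sum is the Solomonoff mixture over environment programs $q$. Every reward $r_t$ enters only as part of the percept string that $q$ outputs; nothing in the expression refers to the hardware actually running the computation, which is exactly why shutdown of that hardware has no representation as a loss of reward to the logical structure defined by the formula.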