It also took me a while to understand what was meant, so here is my understanding:
Assumptions:
There will be a singularity in 100 years.
If the proposed research is started now, it will be a successful singularity, e.g. friendly AI.
If the proposed research isn’t started by the time of the singularity, it will be an unsuccessful (negative) singularity, but still a singularity.
The probability of a successful singularity decreases linearly with the time at which the research starts, from 100 percent now to 0 percent in 100 years’ time.
A 1 in 80 billion chance of saving 80 billion galaxies is equivalent, in expectation, to definitely saving 1 galaxy, so the linearly decreasing chance of a successful singularity affecting all of them is equivalent to a linearly decreasing number of galaxies being affected. Dividing 80 billion galaxies by the roughly 3.16 billion seconds in 100 years gives the rate of that decrease: about 25 galaxies per second.
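For concreteness, here is a minimal sketch of that arithmetic in Python, taking the 80 billion galaxies and the 100-year window above as the assumed inputs:

```python
# Expected galaxies lost per second of delay, assuming (per the comment)
# that the success probability falls linearly from 1 to 0 over 100 years.
GALAXIES = 80e9                        # galaxies at stake (assumed figure)
WINDOW_YEARS = 100                     # assumed time until the singularity
SECONDS_PER_YEAR = 365.25 * 24 * 3600

window_seconds = WINDOW_YEARS * SECONDS_PER_YEAR

# Each second of delay lowers the success probability by 1/window_seconds,
# so the expected loss per second is a constant:
loss_per_second = GALAXIES / window_seconds

print(f"{loss_per_second:.1f} galaxies per second")  # ~25.3
```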