This seemed like a nice explainer post, though it’s somewhat unclear who the post is for – if I imagine being someone who didn’t really understand any arguments about superintelligence, I think I might bounce off the opening paragraph or title because I’m like “why would I care about eating the sun.”
There is something nice and straightforward about the current phrasing, but I suspect there’s an opening paragraph that would do a better job of explaining why you might care about this.
(But I’d be curious to hear from people who weren’t really sold on any singularity stuff who read it and can describe how it was for them)
I think partially it’s meant to go from some sort of abstract model of intelligence as a scalar variable that increases at some rate (like, on an x/y graph) to concrete, material milestones. Like, people can imagine “intelligence goes up rapidly! singularity!” and it’s unclear what that implies; I’m saying sufficient levels would imply eating the Sun, which makes it harder to confuse with things like “getting higher scores on math tests”.
I suppose a more general category would be: the relevant kind of self-improving intelligence is the sort that can repurpose mass-energy into more computation that can run its intelligence, and “eat the Sun” is an obvious target given this background notion of intelligence.
(Note: there is skepticism about feasibility on Twitter/X; that’s some info about how non-singularitarians react.)
I was already sold on the singularity. For what it’s worth, I found the post and comments very helpful on why you would want to take the Sun apart in the first place, and why it would be feasible and desirable for both superintelligent and non-superintelligent civilizations. (Turning the Sun into a smaller sun that doesn’t explode seems nicer than having it explode. Fusion gives off far more energy than lifting the material costs – gravity is the weakest of the four forces, after all. In a superintelligent civilization with reversible computers, not taking the Sun apart would make readily available mass a taut constraint.)
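A rough back-of-envelope check of that energy comparison (a minimal sketch using approximate textbook constants; the uniform-density binding-energy formula underestimates the true figure by a factor of a few, which doesn’t change the conclusion):

```python
# Back-of-envelope: energy to unbind (lift apart) the Sun vs. energy
# obtainable by fusing its hydrogen. Approximate constants throughout.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 3.0e8            # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m

# Gravitational binding energy, uniform-density approximation (3/5) G M^2 / R.
# The real value is a few times larger because the Sun is centrally concentrated.
E_bind = 0.6 * G * M_sun**2 / R_sun

# Fusion energy if the hydrogen (~74% of the Sun by mass) were fused to helium,
# releasing ~0.7% of its rest mass as energy.
E_fusion = 0.74 * 0.007 * M_sun * c**2

print(f"binding energy ~ {E_bind:.1e} J")    # ~2e41 J
print(f"fusion energy  ~ {E_fusion:.1e} J")  # ~9e44 J
# Ratio is ~4000x with this approximation, ~1000x with the true binding energy:
print(f"ratio          ~ {E_fusion / E_bind:.0f}x")
```

Either way, the fusion payoff exceeds the lifting cost by roughly three orders of magnitude.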
Ignoring such confusion is good for hardening the frame in which the content is straightforward. It’s inconvenient to always contextualize; refusing to do so carves out space for more comfortable communication.