Does this idea accelerate capabilities? (Someone might put more money into doing serial speedup after reading my post.)
Does it accelerate convincing people about AI risk? (It makes ASI more intuitive to visualise; Yudkowsky uses similar metaphors to describe ASI.)
Honestly, I have no idea.