either is robust and stable, or falls probabilistically after some time
do you see that statement as load-bearing in an argument chain that would be possible to unroll, please? I imagine one would have to believe it's a positive feedback loop that is unstoppable by any "natural" negative feedback loop, short of total plexish-value destruction, in order to be worried about probabilistic accumulation over time
I mostly expect that humanity won’t react anything like strongly enough to them
yeah sure, but why do you expect technocratic civilization artifacts to be more robust to the destruction of human habitat than the biological gene pool? why would that lead to extinction by default, and not the kind of collapse where data centers stop operating before hunter-gatherers do?
..because for me, I would have to imagine a certain kind of competence to be worried about fully automated robotics enabling extinction danger by artificial means, and I don't see that level of competence in the artifacts of current civilization.. yet
At each timestep there is some % chance of value decay. Either that % chance falls rapidly, and the value is stable, or it does not fall rapidly, and after some timesteps you should expect values to be decayed.
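The arithmetic behind that dichotomy can be sketched briefly. The numbers below (a 1% per-step decay chance, a geometric falloff rate) are illustrative assumptions, not figures from the conversation: if the per-step chance stays constant, survival probability decays exponentially toward zero; if it falls fast enough (e.g. geometrically), the infinite product converges to a positive limit and the value is stable.

```python
# Sketch: cumulative survival probability under per-timestep decay chances.
# All parameter values here are hypothetical, chosen only to illustrate
# the "falls rapidly vs. does not fall rapidly" dichotomy.

def survival_probability(decay_chances):
    """Probability the value survives every timestep in the sequence."""
    p = 1.0
    for c in decay_chances:
        p *= (1.0 - c)
    return p

T = 1000

# Constant 1% chance per step: survival ~ 0.99**1000, essentially gone.
constant = survival_probability([0.01] * T)

# Chance halves each step: the product converges to a positive limit,
# so the value survives with high probability even as T grows.
falling = survival_probability([0.01 * 0.5**t for t in range(T)])
```

Running this, `constant` ends up below 0.01% while `falling` stays above 98%, which is the sense in which only a rapidly falling decay chance yields stability.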
Humanoid robots are pretty near-term: https://fxtwitter.com/KyberLabsRobots/status/2036127368088080867