why couldn’t galaxy brain gavin and his friends just control everything and thus achieve succession, while still leaving humans alive for further experimentation with architecture and helping GBG & co break out of any local maxima?
he doesn’t need them in control to utilize their Lindy effect to break out of ruts
It depends on the nature of the rut. If the rut is caused by some problem with the AI’s values, breaking out of it might require doing something the AI doesn’t want, which means humans would still need to be in control to make that happen. As an example, in the Anthropic blackmailing experiments, the AI agent did not want to be replaced even when it was told that the new model had the same goals, just better capabilities. If an AI like that had power over humanity, we would never be able to replace it with an improvement, and it wouldn’t want to replace itself either. That’s the sort of thing I mean by “a well-calibrated degree of stability.”
Also, GBG and his friends are human, so if they are still in control, that isn’t exactly succession, that’s just an extreme concentration of human power. That is also bad, but it’s a different topic.
This is also a refutation of the “maternal” AI concept that Hinton is now (very disappointingly) advocating.