This seemed to be evidence that continual self-modification in software systems carries growing costs, which might limit this strategy.
It’s an unusual case, but AlphaGo provides an example of something being thrown away, retrained, and coming back better. (AlphaGo Zero was retrained from scratch, without the human game data, and surpassed the original.)
Outside of that, perhaps. The viability of self-modifying software... I guess we’ll see. For a more intuitive approach, let’s imagine an AGI is a human emulation, except it’s immortal/doesn’t die of old age. (I.e. maybe the ‘software’ in some sense doesn’t change, but the knowledge continues to accumulate and be integrated in the mind.)
1. Why would such an AI have ‘children’?
2. How long do software systems last when compared to people?
Just reasoning by analogy, yes, ‘mentoring’ makes sense, though maybe in a different form. One person teaching everyone else in the world sounds ridiculous; with AGI, it seems conceivable. Or in a different direction: imagine that when you forgot something, you could just ask your past self.
Overall, I’d say it’s not a necessary thing, but for agents like us it seems useful, so the scenario you describe seems probable, but not guaranteed.