Emergence is a subset of the word Surprise. It’s not meaningless, but you can’t use it to usefully predict what you’ll get from a system, because it’s equivalent to saying “If we put all these things together, maybe they’ll surprise us in an awesome way!”
Sort of. It is not surprising that incremental quantitative changes result in a qualitative change, but the exact nature of what emerges can indeed be quite a surprise. It is nevertheless useful to keep the general pattern in mind in order to not be blindsided by the fact of emergence in each particular case (“But… but… they are all nice people, I didn’t expect them to turn into a mindless murderous mob!”). And to be ready to take action when the emergent entity hits the fan.
Or in simpler terms, AI is a crapshoot.
Agreed. Like with surprises, you can try to be robust to them or agile enough to adapt.
If something is an emergent property, you can bet on it not being merely the sum of its parts. That has some use.
Aiming the tiny Friendly dot in AI-space is not one of them, though.