Not to spoil the joke, but on cosmological scales a number of things happen which erode your proposed foundations. For instance, singularities become plural throughout the universe. Resultant AGIs are subjected to selection pressure from the outside environment, which on a sufficient scale will always assert itself. Human progress reduces human exposure to traditional selection pressures, and most human selection pressures become a very deep recursion of human behavior, which is itself a more provincial type of aligning force, so humans lose special human-authenticity status as they climb the tech and social complexity trees. Also, because AIs train on biologically produced datasets, the chauvinism about whose form of selection pressure has the special sauce in it gets memetically mutated into AI form.
So, guy who likes practicality in the face of consequences. Meet consequences, and have fun relearning practicality.
All good and valid points, worthy of more exploration.
Some of these I do try to explore later in the sequence: for example, that humans are now evolving in a different and mostly less harsh environment, and that AI absorbs things that in humans were evolved, sometimes inappropriately — which I touch on in this post, but explore more in some later posts.
However, “What happens when we meet other alien sapient species/civilizations that have also gone through Singularities?” isn’t explored in this sequence — primarily because there are rather a lot of variables: it depends a lot on how that Singularity went for the first couple of civilizations we encounter, and as I say, I think that has far more possible outcomes than most people are considering. (Also, we are really very uncertain of a couple of terms in the Drake Equation, since we have a sample size of 1, and by the Bayesian equivalent of the Anthropic Argument, all that tells us is that it has happened at least once somewhere in the entire universe. So we have no good base-rate evidence on whether we’re alone in the galaxy, or even in the observable or entire universe, or whether it’s fairly packed — just not so packed that we’ve yet observed obvious evidence.)

The only place I’ve explored that is in the SF trilogy I never published (because I got too interested in trying to solve the problem in real life instead, before it killed us all). In that, we met two major examples, at around the same time and far enough apart to not immediately produce a 3-way interaction, each of which had gone dramatically differently. Briefly: one had gone the upload-into-mechanical-bodies route and was no longer biological at all — some of them vastly upgraded, though functionally not disempowered — while the other had turned into a unitary ASI sovereign computronium Dyson swarm that disempowered and/or uploaded and absorbed its sapient species, which had then decided to uplift its entire former ecosystem above a practicable size to sapience (and the larger ones to well above it), and have that ecosystem go colonize other star systems. All of this was assuming no faster-than-light travel or communication, as the system then becomes far more boringly uniform.
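To make the Drake-Equation point concrete, here is a toy calculation. The equation itself (N = R* · fp · ne · fl · fi · fc · L) is standard, but every parameter value below is a purely illustrative assumption, not a measurement — the point is just how wildly N swings when the poorly constrained biological and social terms vary across plausible ranges, which is why our sample size of 1 tells us so little:

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Expected number of detectable civilizations in the galaxy.

    r_star:   star formation rate (stars/year)
    f_p:      fraction of stars with planets
    n_e:      habitable planets per star with planets
    f_l:      fraction of those that develop life
    f_i:      fraction of those that develop intelligence
    f_c:      fraction of those that become detectable
    lifetime: years a civilization remains detectable
    """
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Same astronomical terms, divergent guesses for the biological/social terms:
optimistic = drake(r_star=2, f_p=0.9, n_e=0.4, f_l=0.5, f_i=0.5, f_c=0.5, lifetime=1e6)
pessimistic = drake(r_star=2, f_p=0.9, n_e=0.4, f_l=1e-6, f_i=1e-3, f_c=0.1, lifetime=1e3)

print(f"optimistic:  {optimistic:.0f}")     # prints 90000
print(f"pessimistic: {pessimistic:.1e}")    # prints 7.2e-08
```

Roughly twelve orders of magnitude of spread from defensible-sounding inputs — anywhere from a crowded galaxy to our being alone in the observable universe.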