The general trend has been to make computers user-friendly and to hide their complexity from the user. On the one hand, this has helped their adoption, and I'm sure it has benefited a lot of people in a lot of ways (besides making a lot of money). If I think of my parents, for instance, I can't believe they would ever have started using computers had they been more complicated.
On the other hand, this might be the fundamental obstacle in the way of coding literacy. To get by in the modern world, you actually have to know how to read and write. But to use computers at the level most people use them, you don't need to know how to program. If computers keep getting more intuitive, interactive, and user-friendly, why should people feel the need to understand them?
(One could imagine a future where technology is so attuned to people's intentions that they can just think, gesture, or say what they want and the machine provides; as a result, they lose interest in reading and writing, and a new dark age of illiteracy begins.)
I had trouble understanding how the different facts and judgments in your post are connected to each other and to the concept of upside decay.
But I want to say that I really appreciate the concept, because something very similar occurred to me once, though at the time I didn't give it a name. I was studying the careers of creative artists, and there is a lot of discrimination in those fields: against women, against people who start out in less prestigious institutions, and so on.
My idea was that because many people were excluded and diversity was stifled, this reduced the probability of "hitting the jackpot" with an extremely brilliant artist who would sit at the far right of the "artistic potential" distribution and end up being the next Picasso. I wanted to model this intuition and verify it in the data, but eventually my project changed and I moved on. The idea, anyway, is that shrinking the pool reduces the chance of getting outliers (or even black swans) in the tails, and here only the positive outliers matter.
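That intuition is easy to sketch with a toy simulation. The sketch below is purely illustrative (the function name, pool sizes, and the assumption that "artistic potential" is standard normal are all mine, not anything from the original project): the expected best artist in a pool grows with pool size, so excluding half the candidates lowers it.

```python
import random
import statistics

def expected_best(pool_size, trials=500, seed=0):
    """Average maximum 'artistic potential' over many simulated cohorts,
    where each candidate's potential is drawn from a standard normal."""
    rng = random.Random(seed)
    return statistics.mean(
        max(rng.gauss(0, 1) for _ in range(pool_size))
        for _ in range(trials)
    )

# A field open to 1000 candidates vs. one that excludes half of them:
open_field = expected_best(1000)
restricted_field = expected_best(500)
print(open_field, restricted_field)  # the open field's expected best is higher
```

The effect is modest per doubling (the expected maximum of normals grows only like the square root of the log of the pool size), but it compounds: systematically excluding large groups keeps trimming the right tail where the next Picasso would come from.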