There’s been 20 years of “Prompt programming” now, and so loads of apps have been built using it and lots of kinks have been worked out. Any thoughts on what sorts of apps would be up and running by 2040 using the latest models?
Prompt programming isn’t as good as it was cracked up to be. As the old-timers never shut up reminding everyone, back in the day:
“Programmers* just wrote programs and they *** worked! Or at least gave errors that made sense! And there was some consistency! When programs crashed for ‘no reason’ there used to be a reason! None of this ‘neural network gets cancer from the internet’s sheer stupidity and dies’ crap!”
In the present, programming is a more...mixed role. Debugging is a nightmare, and brute-forcing inputs to get good outputs remains a distressingly large part of the puzzle, despite all the fancy techniques that dress it up as something—anything—else in a world that no longer makes sense.
The people working on maintenance deal with the edge cases and see things they never wanted to see, as the price paid for wonderful tech. “The fat tails of the distribution,” as Taleb put it, lead to more disillusionment, burnout, etc. in the associated professions when people attempt things too ambitious. This isn’t some grand narrative about Atlantis and hubris—just more of the same, with technology that can do wonderful things but is consistently over budget and over-hyped. The dream of AGI remains hacked together, created half by ‘machines’ and half by people.
Imagine an ice cream machine that seems (far too often) to stop working just when you need it the most, a delicate piece of machinery that is always a pain...to deal with. This is the future of AI. (One second it works great, the next, a shitshow—a move nicknamed ‘the treacherous swan dive’. Commercial applications secretly, under the hood, involve way too much caching—saving good outputs—and a lot of people working to create stuff that extrapolates from past good outputs and reasons by input similarity to cover the holes that inevitably pop up. In other words, AI will secretly be 20% humans trying to answer the question ‘what would the AI do if it worked right on this input instead of spewing nonsense?’, when there’s enough stuff to cover the holes. The other 80% of the time, it works amazingly well and performs feats once considered miraculous, though amazement quickly fades and consumers soon return to having ridiculous expectations of this rather brittle technology.)
*original sentence with typos: Where once programs just wrote programs and they *** worked!
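The ‘caching good outputs, extrapolating by input similarity, falling back to humans’ serving pattern imagined above could be sketched as follows. This is a toy illustration only; the class and callback names are invented for the example, not any real product’s API:

```python
from difflib import SequenceMatcher

class PatchedAI:
    """Toy sketch: serve cached known-good outputs first, fall back to the
    most similar cached input when the model 'swan-dives', and finally
    route to humans (the hidden 20%)."""

    def __init__(self, model, sanity_check, human_queue, threshold=0.8):
        self.model = model              # flaky underlying model: str -> str
        self.sanity_check = sanity_check  # str -> bool: "is this output usable?"
        self.human_queue = human_queue  # fallback: a human answers the prompt
        self.threshold = threshold      # how similar a cached input must be
        self.cache = {}                 # prompt -> known-good output

    def _nearest_cached(self, prompt):
        # Linear scan for the most similar previously-seen prompt.
        best, score = None, 0.0
        for cached in self.cache:
            s = SequenceMatcher(None, prompt, cached).ratio()
            if s > score:
                best, score = cached, s
        return best, score

    def respond(self, prompt):
        if prompt in self.cache:          # exact hit: reuse the good output
            return self.cache[prompt]
        out = self.model(prompt)
        if self.sanity_check(out):        # model worked this time: cache it
            self.cache[prompt] = out
            return out
        near, score = self._nearest_cached(prompt)
        if near is not None and score >= self.threshold:
            return self.cache[near]       # reason by input similarity
        return self.human_queue(prompt)   # 'what would the AI do if it worked?'
```

A real deployment would use embedding similarity rather than string matching, but the shape is the same: the model’s failures are papered over by its own past successes, and by people.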
Some examples of products and services:
More people try to do things—like write books—thanks to encouragement from AIs. Editors are astounded by the dramatic variation in quality over the course of manuscripts.
Fanfiction enters a new age. When AIs alone write stories, new genres emerge that play to their strengths, but which are...difficult to understand. For instance, descendants of ‘Egg Salad’, which was when a story was rewritten with one of the characters...as an egg.
Eventually AIs get good, but conflicts arise, like: Is the ‘unofficial ending’ written by GPZ really how the author would have wrapped things up? Or is GPL’s ending, though less exciting, more true to the themes of the story? Debates emerge about whether an author is or isn’t human (and to what extent), and whether it’s really art if it’s not made by a person. Could a human being really have solved the whodunit, given that it wasn’t written by a human being? These arguments over the souls of fiction are taken seriously by some. Fans of Sherlock Holmes seem to care about the legacy/the future of mysteries. In other genres, it varies.
Starlink internet is fast, reliable, cheap, and covers the entire globe.
Starlink isn’t super cheap. But the quality, given the price, is a great deal, and eventually it becomes very popular as people get tired of ‘the internet being slow’ or not working, even for short periods of time. In order to cut costs, however, businesses** that ‘don’t really care about their customers’ don’t always invest in it, and remain a source of complaints.
**also schools, K-12.
3D printing is much better and cheaper now. Most cities have at least one “Additive Factory” that can churn out high-quality metal or plastic products in a few hours and deliver them to your door, some assembly required. (They fill downtime by working on bigger orders to ship to the various factories that use 3D-printed components—which is most factories at this point, since some components are best made that way.)
A battle begins for the label ‘artisanal’.
“But it looks like it was made by a real person!”
“It looks too good. The errors there, and here, and there—it’s too authentic.”
Unspeakable things happen in fashion, which becomes way more varied. In some places/groups, waste and conspicuous consumption (ridiculous number of clothes, changing all the time to keep up with fashion changing with speed unimaginable today) grow so extreme that it accidentally creates a competitor out of ‘minimalism’*** (a small set of amazing clothes, possibly designed to be tweaked (regularly) a little bit to fit in, but not too much, and not too obviously).
***As a result of very popular/fashionable people being left behind and fighting the trends, so they don’t have to work at keeping up literally every second of every day.
Small props start to creep in.
And ridiculous hats make an astounding comeback!