I think the problem with this is that markets are a complicated and high-overhead tool for coordinating resource consumption among competing individuals without needing an all-knowing resource-allocator. This is extremely useful when you need to coordinate resource consumption among competing individuals, but in the case of programming, the functions in your program aren’t really competing in the same way (there’s a limited pool of resources, but for the most part they each need a precise amount of memory, disk space, CPU time, etc. and no more and no less).
There is also a close-enough-to-all-knowing resource allocator (the programmer or system administrator). The market model actually sounds like a plausibly-workable way to do profiling, but it would be less overhead to just instrument every function to report what resources it uses and then central-plan your resource economy.
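To make the "instrument and central-plan" alternative concrete, here's a minimal sketch: a decorator records each function's wall-clock time and peak memory into a shared ledger that the programmer (the "central planner") reads afterward, no internal market required. All names here (`ledger`, `instrumented`, `build_table`) are made up for illustration, and the sketch isn't reentrant-safe since `tracemalloc` tracing is global.

```python
import time
import tracemalloc
from collections import defaultdict
from functools import wraps

# Shared ledger the "central planner" inspects after a run.
ledger = defaultdict(lambda: {"calls": 0, "wall_s": 0.0, "peak_bytes": 0})

def instrumented(fn):
    """Record wall time and peak memory of each call into the ledger."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        tracemalloc.start()
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - start
            _, peak = tracemalloc.get_traced_memory()
            tracemalloc.stop()
            entry = ledger[fn.__name__]
            entry["calls"] += 1
            entry["wall_s"] += elapsed
            entry["peak_bytes"] = max(entry["peak_bytes"], peak)
    return wrapper

@instrumented
def build_table(n):
    return [i * i for i in range(n)]

build_table(100_000)

# Central planning step: just read the ledger and allocate accordingly.
for name, stats in ledger.items():
    print(name, stats)
```

This is basically what existing profilers already do, which is the point: the information a market would "discover" through bidding is available for the price of a decorator.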
In short, if everyone is a mindless automaton who takes only what they need and performs exactly what others require of them, and if the central planner can easily know exactly what resources exist and who wants them, then central planning works fine and markets are overkill (at least in the sense of being a useful tool; capitalism-as-a-moral-system is out-of-scope when talking about computer programs).
Note that even in cases like Amazon Web Services, the resource tracking and currency is just there to charge the end-user. Very few programs take these costs into account while they’re executing (the exception is EC2 instance spot-pricing, but I think it’s a stretch to even call that agoric computing).
Also, one other thing to consider is that agoric computing trades off something really, really cheap (computing resources) for something really, really expensive (programmer time). Most people don’t even bother profiling because programmer time is dramatically more valuable than computer parts.
The central planner may know exactly what resources exist on the system they own, but they don’t know all the algorithms and data that are available somewhere on the internet.
Agoric computing would enable more options for getting programmers and database creators to work for you.
When dealing with resources on the internet, you’re running into the “trading off something cheap for something expensive” issue again. I could *right now* spend several days or weeks writing a program that dynamically looks up how expensive it is to run some algorithm on arbitrary cloud providers and runs it on the cheapest one (or waits if the price is too high), but it would be much faster for me to just do a quick Google search and hard-code the cheapest provider right now. They might not always be the cheapest, but it’s probably not worth thousands of dollars of my time to optimize this more than that.
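The core of that hypothetical program is trivial; the expensive part is integrating each provider's pricing API. A sketch of the trivial part, with entirely made-up provider names and prices standing in for real API calls:

```python
# Hypothetical hourly prices (USD). A real version would query each
# provider's pricing API here instead of using a static dict.
HOURLY_PRICES = {
    "provider_a": 0.096,
    "provider_b": 0.083,
    "provider_c": 0.104,
}

MAX_ACCEPTABLE = 0.10  # above this, wait instead of running

def pick_provider(prices, ceiling):
    """Return the cheapest provider at or under the ceiling, else None (wait)."""
    name, price = min(prices.items(), key=lambda kv: kv[1])
    return name if price <= ceiling else None

print(pick_provider(HOURLY_PRICES, MAX_ACCEPTABLE))  # provider_b
```

The `min()` call is the whole "market mechanism"; the weeks of work go into the scraping, authentication, and quota handling around it, which is exactly the programmer-time cost being traded away.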
Regarding writing a program to dynamically look up more complicated resources like algorithms and data: I don’t know how you would do this without a general-purpose programmer-equivalent AI. I think maybe your view of programming seriously underestimates how hard this is. Probably 95% of data science is finding good sources of data, getting them into a somewhat machine-readable form, cleaning them up, and doing various validations that the data makes any sense. If it were trivial for programs to use arbitrary data on the internet, there would be much bigger advancements than agoric computing.