Let’s consider the set of worlds where developing AGI does not destroy or permanently disempower humanity.
You make a good point: in many such scenarios, the investors in AI labs, or the AI companies themselves, may not be able to capture more than a tiny fraction of the value they generate.
Does this make the investment a mistake?
Personally, I don’t think so. The people making these investments generally have a level of wealth where no amount of additional money can make more than a small improvement to their well-being. In contrast, AGI and ASI could plausibly (within their lifetimes) render the world as a whole many orders of magnitude richer, with capabilities that seem centuries or millennia away in a non-AI future. Not being able to claim all of that wealth may cost them status/dominance/control, but they would also gain the status/prestige of having enabled such radical improvement. And in any case, they might (very reasonably, IMO) be willing to trade status for immortality + godlike technological powers.
Also, in the fraction of worlds where the AI labs or those funding them do manage to retain control of the wealth created, or to obtain and hold other forms of power, that’s quite a high payoff even at these valuations.
I think this is broadly correct. Investors in AI labs early on likely viewed their investment more like a donation than a traditional financial transaction.
My point is narrower: that recursive self-improvement (RSI) → homogeneous product → low profit margins in the long run, assuming similar hardware and data availability. The data and feedback signals these labs are able to collect post-AGI might even be enough for them to provide a large return on investment for their financially minded investors.
I think this kind of model is worth bearing in mind for individuals who believe capital will retain relevance post-AGI, and who are currently trying to grow their capital by investing in AI labs or companies dependent on them.
What would you say to the idea that other kinds of capital retain value post-AGI? Like land, mineral rights, or electricity-generating capacity? I think those are also unlikely to hold value, but I do come across such claims once in a while.