[Question] How much should you be willing to pay for an AGI?

Having spent quite a bit of time with GPT-3, my feelings can be expressed as:

This is really awesome, but it would be even better if it didn’t cost $0.06 per 1,000 tokens.

GPT-3 is slightly too expensive for many of the use cases I am interested in. The problem is made worse by the fact that one of the basic techniques I normally use in procedural generation is “generate 100 of something and then pick the best one”.
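As a rough illustration of how quickly that pattern gets expensive, here is a minimal back-of-envelope sketch. It takes the ~$0.06 per 1,000 tokens figure at face value; the ~500-token completion length and the number of candidates are assumptions of mine, not figures from this post.

```python
# Back-of-envelope cost of "generate 100, keep the best" with GPT-3.
PRICE_PER_1K_TOKENS = 0.06   # dollars; Davinci-tier rate taken from the text
TOKENS_PER_COMPLETION = 500  # assumed length of one generated sample
CANDIDATES = 100             # generate 100, pick the best one

cost_per_completion = PRICE_PER_1K_TOKENS * TOKENS_PER_COMPLETION / 1000
cost_per_kept_artifact = cost_per_completion * CANDIDATES

print(f"Cost per completion:    ${cost_per_completion:.3f}")
print(f"Cost per kept artifact: ${cost_per_kept_artifact:.2f}")
# -> roughly $0.03 per completion, about $3 per best-of-100 artifact
```

At those numbers, every kept artifact costs a few dollars, which adds up fast in an interactive procedural-generation loop.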

This is actually a noticeable problem with Deep Learning products generally at the moment. Tools like AI Dungeon and Artbreeder are intentionally handicapped in ways designed to minimize how much users actually invoke the Deep Learning that makes them interesting.

Now, if we look at the Metaculus prediction for this question, the bulk of the probability mass is at >=100 petaflops, roughly 10,000x more compute than GPT-3 requires.

So, how much would we be willing to pay for access to such an AGI?

To make this more concrete, imagine that the first AGI is approximately as smart as the smartest human who ever lived. An obvious lower bound is “how much do really smart people make on average?”. While this number varies widely from profession to profession, I think the roughly $250k/year, or about $125 per working hour, that a Senior Software Engineer at Google makes is probably a decent estimate.
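A minimal sketch of that lower-bound arithmetic, assuming roughly 2,000 working hours per year (40 hours/week for 50 weeks; the hours figure is my assumption):

```python
# Lower bound: price AGI time like a top engineer's time.
salary_per_year = 250_000       # total comp figure used in the text
working_hours_per_year = 2_000  # assumed: 40 h/week * 50 weeks

hourly_rate = salary_per_year / working_hours_per_year
print(f"~${hourly_rate:.0f} per working hour")  # -> ~$125/hour
```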

On the other hand, the upper bound is probably something like “how much money do we have?”. After all, von Neumann was responsible for ground-breaking innovations in fields such as quantum theory, the development of nuclear weapons, and the invention of the digital computer. Having access to the world’s smartest person might literally be a matter of national survival.

If you consider that the Manhattan Project cost about 1% of the GDP of the US, that comes to $227 billion/year, or about $25 million/hour.
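A quick sketch of that upper-bound arithmetic, assuming a US GDP of roughly $22.7 trillion (the value implied by the $227 billion figure above) and round-the-clock operation:

```python
# Upper bound: a Manhattan-Project-scale effort at ~1% of US GDP.
us_gdp = 22.7e12                 # dollars; assumed, consistent with $227B = 1%
budget_per_year = 0.01 * us_gdp  # 1% of GDP
hours_per_year = 24 * 365        # the AGI presumably runs around the clock

budget_per_hour = budget_per_year / hours_per_year
print(f"${budget_per_year / 1e9:.0f} billion/year, "
      f"~${budget_per_hour / 1e6:.0f} million/hour")
# -> $227 billion/year, ~$26 million/hour (rounded to ~$25 million above)
```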

Interestingly, if AGI really requires 100 petaflops, this number is not too far from the actual cost of running such an AGI. GPU compute is estimated to cost somewhere between $0.03 and $0.30 per GFLOPS-hour, which works out to $3 million to $30 million/hour for our hypothetical AGI (I have no idea why the range is so wide).
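For completeness, here is the same arithmetic as a sketch. Reading the range as dollars per GFLOPS-hour is the interpretation that reproduces the $3 million to $30 million/hour figure above; 100 petaflops is 10^8 GFLOPS.

```python
# Running cost of a hypothetical 100-petaFLOPS AGI, using the quoted range.
agi_flops = 100e15                   # 100 petaFLOPS of sustained compute
gflops = agi_flops / 1e9             # = 1e8 GFLOPS
cost_per_gflops_hour = (0.03, 0.30)  # dollars; range quoted in the text

low, high = (rate * gflops for rate in cost_per_gflops_hour)
print(f"~${low / 1e6:.0f} million to ~${high / 1e6:.0f} million per hour")
# -> ~$3 million to ~$30 million per hour
```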

This suggests that we might be nearing the moment when a Manhattan project to build an AGI is reasonable, but we are nowhere near the point where commercial applications of AGI are feasible*.

*Throughout this post, I assume that the development of AGI is primarily hardware-bound and that the first AGI will not lead to a hard takeoff. If the first AGI does lead to recursive self-improvement followed shortly thereafter by a singularity, then the expected value of the first AGI is either insanely high or insanely negative (probably the latter, though).