Studying a subject gets progressively harder the more you learn, and the effort required is conjectured to grow exponentially or worse: the initial “honeymoon” phase tends to peter out eventually.
In terms of AI, this would mean that model size and power consumption grow exponentially in “intelligence” (whatever that means in practice, probably some not-yet-saturated benchmark score). Do the last 3 years confirm or refute this? (A toy sketch of what “exponential in intelligence” would look like follows below.)
If confirmed, wouldn’t that give us some optimism that we are not all going to die? The “true” superintelligence we could never hope to control would require so many resources that we would have to colonize the lightcone as non-superintelligent humans to get there.
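To make the “exponential in intelligence” framing concrete, here is a minimal toy sketch. It assumes, purely for illustration, that a benchmark score scales logarithmically with training compute; the function name and the fit coefficients below are made up and not drawn from any real scaling data.

```python
# Toy model (all numbers hypothetical, for illustration only):
# assume a benchmark score saturates logarithmically in training compute,
#     score(C) = a * log10(C) + b,
# so inverting gives compute that is exponential in the target score:
#     C(score) = 10 ** ((score - b) / a)

a, b = 12.0, -240.0  # made-up fit coefficients, not from real benchmark data

def compute_for_score(score: float) -> float:
    """Training compute (FLOPs) needed to reach `score`, under the assumed log fit."""
    return 10 ** ((score - b) / a)

# Under these toy coefficients, each +20 benchmark points costs
# 10**(20/12) ≈ 46x more compute.
for s in (40, 60, 80):
    print(f"score {s}: ~{compute_for_score(s):.2e} FLOPs")
```

Under this assumed log-linear relationship, a linear gain in score costs a multiplicative factor in compute, which is the sense in which “resources exponential in intelligence” would cash out; whether real benchmark-versus-compute curves actually look like this is exactly the empirical question being asked.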
I think this depends on how you do the research. If your goal is to “know everything about X”, then yes, that expands exponentially, at least at the beginning. But if you have a specific goal and use some heuristic to focus on the parts that seem relevant, it should be more manageable.