Taboo “compute overhang”

There’s no consensus definition of “compute overhang” or “hardware overhang” (see e.g. 1 2 3 4 5), and I’ve recently seen researchers use several different definitions. When I asked some friends what “hardware overhang” means, they gave different responses (a plurality said it means that sufficient hardware for human-level AI already exists, which is not a useful concept). So if you say “compute overhang” without clarification, many people will misunderstand you.

Instead of tabooing the term, we could declare a canonical definition. I think the best candidate is something like: there is a compute overhang to the extent that the largest training runs could quickly be scaled up. But it’s probably better for people to avoid the term, or to define it whenever they use it.

P.S. You should also define “takeoff speed,” “transformative AI,” and “warning shot” whenever you use them in a precise way; their usage varies a lot too. (Using those terms to vaguely gesture at “the speed of progress around human-level AI,” “powerful AI,” and “a legible scary AI event,” respectively, seems fine.)