Glad to see some common sense/transparency about uncertainty. It seems to me that AGI/ASI is basically a black swan event: by definition unpredictable. Trying to predict it is a fool’s errand; it makes more sense to manage its possibility instead.
It’s particularly depressing when people who pride themselves on being rationalists basically ground their reasoning in “the line has been going up, therefore it will keep going up”, as if the mere existence of Moore’s law meant it extends to any and all technology-related lines in existence[1]. It’s even more depressing when those “line go up” claims come from badly flawed/contaminated benchmarks (like SWE-bench), or badly skewed ones (like the 50% success threshold in the METR long-tasks benchmark, which IMO is absolutely crucial for differentiating an autonomous agent vs. a supervised copilot).
Hopefully I’ll be able to mirror you, sipping eggnog and gloating, come Christmastime 2027.
[1] “Hume, I felt, was perfectly right in pointing out that induction cannot be logically justified.” (Popper)