My assumption is that when people say AGI here they mean Bostrom’s ASI, and the two got linked because Eliezer believed (and still believes?) that AGI will FOOM into ASI almost immediately, which it has not.
In case this wasn’t clear from early discussion, I disagree with Eliezer on a number of topics, including takeoff speeds. In particular I disagree about the time from AI that is economically transformative to AI that is much, much more powerful.
I think you’ll probably find it healthier and more productive not to think of LW as an amorphous collective, and instead to note that a variety of different people post on the forum with a variety of different views. (I have sometimes made this mistake in the past, and I find it healthy to clarify at least internally.)
E.g. instead of saying “LW has bad views about X”, say “a high fraction of people who comment on LW have bad views about X” or “a high fraction of karma votes seem to be from people with bad views about X”. Then, you should maybe double-check the extent to which a given claim is actually right : ). For instance, I don’t think almost-immediate FOOM is the typical view on LW by most metrics; a somewhat longer takeoff is now the more common view, I think.
Definition in the OpenAI Charter.
A post on the topic by Richard (AGI = beats most human experts).
Also, I’m going to peace out of this discussion FYI.