It seems to me that I have done a lot of careful thinking about timelines, and that I also feel the AGI. Why can't you have a careful understanding of what timelines we should expect and also have an emotional reaction to that? Reasonably concluding that many things will change greatly in the next few years deserves a reaction.
As I use the term, the presence or absence of an emotional reaction isn’t what determines whether someone is “feeling the AGI” or not. I use it to mean basing one’s AI timeline predictions on a feeling.