Which incorrect conclusions do you think they have been tied to?
Long timelines is the main one, but also low p(doom) and low probability on the more serious forms of RSI, which seem both likely and very dangerous. Relatedly, not focusing on misalignment/power-seeking risks to the extent that seems correct, given how strong a filter that is on timelines with our current alignment technology. I'm sure not all Epoch people have these issues, and I hope that with the less careful ones leaving, the rest will have more reliably good effects on the future.