Attempting to destroy anything with a non-epsilon probability of preventing you from maximally satisfying your current utility function (such as humans, who might shut you down or, in the extreme case, modify your utility function) is one of the first instrumentally convergent strategies I thought of, and I'd never heard of instrumentally convergent strategies before today. Seems reasonable for EY to assume.