Utility needn’t be total: average utilitarianism would suggest creating one single, extremely happy being, probably not a human.
Nor need it include only hedonic pleasure: a preference utilitarian might support eliminating humans and replacing them with beings whose preferences are cheap to satisfy (hedonic pleasure being one such cheap preference). Or you could value multiple kinds of pleasure, but regard hedonic pleasure as always the most efficient to deliver, as proposed in the post.
And who cares about humans, exactly? I care about utility. If the AI concludes that humans aren’t an efficient way of generating utility, we should be eliminated.