Utilitarianism: WBE (uploading) > FAI

If you're a utilitarian, why would you risk creating an AGI that could pose an existential risk, when whole brain emulation (WBE, i.e., uploading) would let you eliminate all suffering through virtual reality (or direct editing of your own source code) and hence achieve utopia? Wouldn't you want to discourage AGI research and promote WBE research instead? Or is it that AGI is more likely to arrive before WBE, so we should focus our efforts on making sure that AGI is friendly? Or is uploading impossible for technological or philosophical reasons (e.g., substrate dependence)?
Is there a link to a discussion of this that I'm missing?