It depends on how they did it. If they did it by formalizing the notion of “the values and preferences (coherently extrapolated) of (the living members of) the species that created the AI”, then even just blindly copying their design without any attempt to understand it has a very high probability of getting a very good outcome here on Earth.
The AI of course has to inquire into and correctly learn our values and preferences before it can start intervening on our behalf, so one way such blind copying might fail is if the method the aliens used to achieve this correct learning depended on specifics of the situation on the alien planet that don't obtain here on Earth.