Possibly a good idea (when you frame this as a trolley problem, with the whole of future potential on the other side), but too difficult to implement in a way that gives an advantage to the future development of FAI (otherwise you just increase existential risk if civilization never recovers, or replay the same race we face now).
Very good answer.
Also, depending on temporal discounting, even a perfect plan that trades current humanity for a future FAI with certainty could be incorrect: under enough discounting, we'd prefer to keep present humanity and reject the future FAI.
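To make that concrete (a minimal sketch; the discount rate $\gamma$, delay $t$, and utilities $U_{\text{FAI}}$ and $U_{\text{now}}$ are my own illustrative symbols, assuming simple exponential discounting):

$$\gamma^{t}\,U_{\text{FAI}} < U_{\text{now}}, \qquad 0 < \gamma < 1.$$

That is, a guaranteed FAI payoff $U_{\text{FAI}}$ arriving $t$ years from now is worth only $\gamma^{t}\,U_{\text{FAI}}$ today, and for any finite $U_{\text{FAI}}$ a long enough delay $t$ makes that less than the value $U_{\text{now}}$ of keeping present humanity, so the trade is rejected even with certainty of success.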
Very good answer.
Also a good point.