Arms races are bad things. First best, by far, is if nobody has doomsday devices; second best is if we attempt nonproliferation of doomsday devices.
As a parallel, we would still have been at risk in a world where DeepMind was working on building ASI but Elon didn’t freak out and start a competitor (followed by another competitor), just not as much risk. That’s not because DeepMind are “the good guys”; it’s because of race dynamics.
When it comes specifically to loss-of-control risks killing or sidelining all of humanity, I don’t believe Sam or Dario or Demis or Elon want that to happen, because it would happen to them too. (Larry Page is different on that count, of course.) There is a genuine conflict-theory angle in that some of them would like ASI to make them god-emperor of the universe, but all of them would definitely take a solution to “loss of control” if it were handed to them on a silver platter.