I disagree strongly. You can repeatedly get it wrong with failed states, and learn from your mistakes. The utility cost of each failure is additive, whereas the first FAI failure is fatal.
Distinguish the difficulty of developing an adequate theory from the difficulty of verifying that a theory is adequate. It's failure at the latter that might lead to disaster, and avoiding that failure requires a great deal of informed rational caution. On the other hand, failing to invent an adequate theory doesn't directly lead to disaster, and failure to invent an adequate theory of FAI is something you can learn from (the story of my life for the last three years).