p.s. I just realized that I did not answer your question:
> Is it about believing that systems have become safer and more controlled over time?
No, this is not my issue here. While I hope it won’t be the case, systems could well become more risky and less controlled over time. I just believe that if that happens, it would be observable as an increased rate of safety failures far before we get to the point where failure means that literally everyone on earth dies.
What’s the least-worrying thing we may see that you’d expect to lead to a pause in development?
(This isn’t a trick question; I genuinely don’t know what kind of thing gradualists would consider cause for concern, and I don’t find official voluntary policies to be much comfort, since they can simply be changed if they become too inconvenient. I’m asking for a prediction, not any kind of commitment!)