I talked about this issue with Buck in the comments (my comment, Buck's answer).
What I pointed out was that the spaceship examples had very specific features:
Both personal and economic incentives push against the problem occurring.
The problem is obvious when one is confronted with the situation.
At the point where the problem becomes obvious, you can still solve it.
My intuition is that the main disanalogies with the AGI case are the first feature (at least the economic incentives, which might push people to try dangerous things when the potential returns are great) and the last one, depending on your position on takeoffs.