Didn’t it use to be the case, for thousands of years before we had observed thousands of bridge designs falling or not falling and developed exact models, that bridges DID fall down like that quite often?
Have you played Poly Bridge?
Thanks, this is exactly what I was hoping for!
Agreed, about “most intellectual disciplines”, and even more so when it comes to something like art, game design, or startup entrepreneurship. However, I think AI risk is quite strongly one of the exceptions to this rule.
That was the plan from the very beginning, but I don’t know anyone like that IRL, and I didn’t manage to get in contact with anyone over email or Discord after trying several times. Now I have, though, so that’s what I’m doing right now.
Do I really come off as such a complete idiot that these things wouldn’t be obvious and already accounted for? I have a billion ideas, some of which I’ve been sitting on for decades hoping for my health to get better. I’m already a “hobbyist-expert” and have spent most of my life on these questions, but due to chronic illness and not being a one-in-a-million genius I probably won’t ever be able to work professionally.
I wouldn’t have posted this here if it weren’t literally a billions-of-lives-on-the-line situation, and another week of waiting and trying to express it better might mean disaster. Doing it this way is extremely painful and humiliating to me, but I have tried everything else and I can’t see any way to avoid it that’s ethically defensible. I am extremely disappointed and frightened by your uncharitable reading, and my every instinct is screaming at me to delete the post, but I can’t. People like you are why this took weeks and almost didn’t get posted, causing severe harm and risk, and your behavior is extremely irresponsible and dangerous.
Your prior about non-obvious ideas being harmless has huge amounts of evidence against it, both from this site’s history and from the fact that all experts and important organizations seem to take the risk quite seriously. And I can see several ways my idea in particular could cause significant harm, even if they are on the whole unlikely.