I suggested that, while a set of plans for a nuclear reactor might be true, and safe if executed correctly, executing them incorrectly might have effects similar to a nuke. Hence ‘stability’: if something is (almost) impossible for humans to execute correctly, and is unsafe when performed even slightly incorrectly, then it is ‘unstable’ (and dangerous in a different way than ‘stable’ designs for a nuke).
Misuse brings in relativity: someone who would never use plans to build a nuke isn’t harmed by receiving them (absent other actors trying to steal those plans from them), which means information hazards are relative to the recipient.