Or you could just be the type of person who would tell it to go fuck itself, try to destroy it, and leave it boxed or maximally constrain it if you can't destroy it. If you cannot credibly commit to this or a similar threat-resistant variant, no one should ever let you near a boxed AI, and you should never want to go near one, as you will likely be using a suboptimal strategy.