I wouldn’t be persuaded to “let the AI out” by that argument. In fact, even after reading about the AI box experiments, I still can’t imagine any argument that would convince me to let the AI out. As somebody not affiliated with SIAI at all, I think my somehow being persuaded would count as stronger evidence than, for instance, Carl Shulman being persuaded. Unfortunately, because I’m not affiliated with the AI research community in general, I’m presumably not qualified to participate in an AI-box experiment.
For some time now I have suspected that the argument that convinced Carl Shulman and others was along the lines of acausal trade. See here, here and here. Consequently, I suspect that those who didn’t let the AI out of the box either didn’t understand the implications, didn’t trust the foundations and actuality of acausal trade enough, or were more like General Thud.
When Eliezer was doing them, the primary qualification was being willing to put up enough money to get Eliezer to do it. (I’m not criticizing him for this—it was a clever and interesting fundraising technique; and doing it for small sums would set a bad precedent.)