It’s not useless, but it’s definitely risky, and the requirements for doing it safely would mean distillation has to be very cheap. And here we come to the question: how doomed by default are we if AGI is created? If the chance of doom is low, I agree it’s not a risk worth taking. If it’s high, then you’ll probably have to take it. The more doomed by default you think creating AGI makes us, the more risk you should accept, especially with short timelines. So MIRI would probably want to do this, given their atypically high levels of doominess, but most other organizations probably won’t, since they consider the risk from AGI fairly low.