I noticed that you found an archived copy of Roko’s description of UIV. I believe Roko originally thought his theory implied that we didn’t have to worry much about the terminal values of the AIs we create — that things would turn out OK due to UIV. Unfortunately, he keeps deleting his old writings, so I’m going on memory. I’m not sure exactly how he changed his mind, but I think he now believes we do have to worry about terminal values.