Suppose, hypothetically, that I had a brilliant AI idea (I haven't). I don't know much about AI strategy or arms races, and I'm not quite sure what follows from this idea, but I have reason to suspect it would make building a powerful AI much easier. Naturally, I don't want to share this idea if it's dangerous. If I specify too exactly what the implications are, people could work backwards to the idea; and even people merely knowing the idea exists could be bad, if it encourages them to try to entice it out of me. I suggest it would be useful to create some sort of dangerous-AI-idea decision tree and/or helpline.
As a concrete example: what would you do if you had discovered a linear-time AIXI?
[I’m a grad student at CHAI, but I am not officially speaking on behalf of CHAI or making any promises on anybody’s behalf]
If you reached out to a grad student at CHAI or one of our staff, I strongly suspect that we would at least sanity-check the idea, and if it passed that screening, I predict that we would seriously consider how dangerous it was and what to do with it.
Contact FHI.