Definitely barking up the wrong tree there.
Dynamists like me are under-represented here for such a technology-loving community—note that the whole basis of FAI is that rapidly self-improving technology by default results in a Bad End. Contrast EY’s notion of AGI with Ben Goertzel’s.
I am asking for Eliezer to apply the technique described in this essay to his own belief system. I don’t see how that could be barking up the wrong tree, unless you are implying that he is somehow impervious to “spontaneously self-attack[ing] strong points with comforting replies to rehearse, then to spontaneously self-attack the weakest, most vulnerable points.”