Not entirely content-free, but very much stream of consciousness and thoroughly mistaken. The main argument seems to be:
Making logical arguments requires accepting some (unspecified) axioms about “symmetry”.
These axioms must be accepted with 100% credence.
This conflicts with the common (though not universally accepted) LW position that nothing can be known with literally 100% credence.
Until LW accepts the author’s preferred epistemology, there’s little point in engaging directly with discussion on LW.
Thus, there’s no point in writing up an actual proof of their claim that alignment is impossible.
The author is also pretty rude, repeatedly calling rationalists unreasonable, irrational, and generally seeming very offended over a difference in opinion about a fairly trivial (IMO) point of epistemology.
Relevant context: other work by the author was linked previously, and Paul Christiano said that work seemed “cranky”, so I don’t hold the author’s abrasiveness fully against him.
I still single-downvoted this post because I think the core of the provided argument is extremely weak. As far as I saw, the author just repeatedly asserts that performing logical reasoning implies you should assign 100% confidence to at least some claims, and that rationalists are completely irrational for thinking otherwise. All the while, the author made no reference whatsoever to preexisting work in this area. E.g., MIRI’s Logical Induction paper directly explains one way to have coherent uncertainties over logical / mathematical facts, as well as over the limits of one’s own reasoning process, despite Gödel incompleteness.
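To make the point concrete, here’s a toy sketch (my own illustration, not MIRI’s logical-induction algorithm) of how a reasoner can coherently hold a credence strictly between 0 and 1 on a purely mathematical claim, then resolve it by computation; the claim and the symmetry prior are invented for the example:

```python
def fib(n: int) -> int:
    """nth Fibonacci number (F(1) = F(2) = 1)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

claim = "F(20) is divisible by 7"
# Naive symmetry prior over residues mod 7, held *before* computing.
prior = 1 / 7

# Resolve the claim by brute computation -- the "proof" arrives.
posterior = 1.0 if fib(20) % 7 == 0 else 0.0

print(f"{claim}: prior={prior:.3f}, posterior={posterior}")
# Before the computation finishes, the reasoner's credence is neither 0
# nor 1, yet nothing about that is incoherent or "irrational".
```

The mathematical fact was always determinate, but the reasoner’s uncertainty about it was perfectly coherent until the computation settled the question.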