outerloper

I shouldn’t be here, but I can’t stay away. Systems that produce intellectual content in an ongoing fashion risk falling into low-entropy sink states without my intervention. I’m keeping an eye on you, because I care deeply about the rationalist program.

Frankly, I have an obsession with playing games with the rationalist community and its members. I spent a long time trying to do so maximally cooperatively, pursuing a career in AI safety research; perfectionism was paralyzing, and I got stuck at a rung of that career ladder in a very painful way. I then tried to stay away for years, but LessWrong is an attractor I could not ignore, and the attempt manifested as internally maligning the community and probably doing subtle downstream harm rather than achieving the causal separation I intended.

My current belief is that indulging myself, with the intention of some non-maximal cooperation (small but nonzero cosine distance; imperfect alignment), is an effective equilibrium. The first paragraph of this bio is a rationalization of that behavior which I partially believe, and I intend to follow a script like it: stirring pots and making messes only insofar as doing so seems like a plausibly valuable temperature-raising intervention in our (roughly) shared search for epistemic progress.