LesserWrong is dead to me. Trust nothing here; it is not truth-tracking.
PDV
nit: *KaTeX
Blind Goaltenders: Unproductive Disagreements
I can try.
It’s someone who doesn’t understand your objection, and doesn’t seem to understand why you think it’s important that they understand it. (In stronger cases, they don’t even understand that they don’t understand it.) This generally feels like they are dodging the point of disagreement every way you bring it up, like it’s foreign to their entire worldview.
Any of those could count.
As mentioned in the comments on the crosspost: Ben and, independently, Zvi Mowshowitz have both been doing something similar but distinct, and finding that making it more like the original tradition fulfills their goals better. This seems to be a pretty good indication that it is not just a rationalization.
Also mentioned there, from Jason Green-Lowe: The actual tradition as she is practiced has drifted far from the thing that’s desirable.
I definitely agree with the “interacting with a calculative person is dangerous” interpretation. If someone is presenting their unfiltered view, and I know them, I can interpret it in light of their goals and values and how they relate to mine and to me (and in fact do this instinctively). If they are calculating (in excess of my discernment) I cannot.
Some people maintain that maneuvering people to better serve their own best interests is a distinct skill from manipulating people toward arbitrary goals. I don’t believe that, so it is hard to get evidence that someone—who by stipulation is manipulating me in ways I can’t perceive—has my best interests at heart.
>If you’re worried about an oncoming problem and discussing it with others to plan
Personal Model of Social Energy
[Moderation notice by Ben: I’ve blocked replies to this comment, as this comment reads as fairly political and quite aggressive. In particular this comment has personal attacks (‘obviously and dangerously confused’), incendiary language (‘cancer in the rationalist community’, ‘block off sound reasoning’), and little substance for gworley to respond to (or for me to even understand what exactly PDV believes and why).
PDV, there may be some valid points in there, but if you want to discuss them, you should create a comment that’s less directly insulting and more likely to promote productive discussion. This is a warning notice; if we read similar comments from you in the future, your account will likely be temporarily suspended.
Thanks to users who are hitting ‘report’ on comments—it’s super helpful to know quickly about things where moderators want to be able to watch/act.]
-----------------------------------------------------------------------------------------
I am not one of the most epistemically rational people in the community. Probably the top half, unlikely the top quarter. Even relative to me, though, you, Chapman, and your whole crowd, are very obviously and dangerously confused.
So confused that I genuinely believe that taking your ideas seriously corrodes good epistemic hygiene. They’re seductive, self-reinforcing, and block off sound reasoning, like the strongest conspiracy theories. They’re built into a superstructure that justifies feeling superior and ignoring counterargument, by distinguishing themselves as “gnostic” or “Kegan 5” or “postrationalist” and implying that this is better than standard-issue rationality.
They are not. They are just standard irrational thinking, dressed up in the language of the Sequences so that they can be embraced wholeheartedly while not rejecting the ingroup signalling.
They are a cancer in the rationalist community and have done more damage than RationalWiki. Only Intentional Insights and Leverage are real competition with Meaningness for organization most harmful to the rationalist project.
(I’m not sure how Simler relates to any of it, since he has not displayed any of the themes or ideas I’ve seen from this irrationalist cluster. If he self-identifies with it that is surprising and I will have to re-evaluate my good opinion of his work.)
There are people productively engaging with that concept, and they have none of these problems. Even if the concept is important and is what they’re trying to convey, accepting their framing is harmful, more harmful than any potential benefit could justify, compared with sticking to people who are grounded in true things instead of nice things.
I know I’m going to use the next 2 years thing.
I’m not sure I believe that isn’t a contradiction in terms.
Chapman’s entire shtick is pretending to be wise, and worse, he’s good enough at it that people take his ideas seriously. They then spend months or years of effort building a superstructure of LW-shaped mysterianism on top of it, losing sight of the actual ability to distinguish true things from false things and/or accomplish goals.
The basic deal is that it professes to include all the goals and purpose of rationality, while also using other methods. But those other methods are thinly disguised woo, attractive because they’re easy and comfortable, and comfortable because they’re not bound to effectiveness and accuracy. It keeps the style and language of the rationalist community (the bad parts) while pushing the simple view of truth so far back in its priorities that it’s just lip service.
I’ll grant that this isn’t quite the same flavor of anti-truth woo as Chapman. But the difference is unimportant to me.
The falsity of this argument follows directly from the computability of physics.
Your belief system is flawed, built on embracing not-even-wrong statements as truth. This makes every conclusion you draw suspect, and once enough of your confidently stated conclusions have borne out that suspicion, it is no longer reasonable to presume they are correct until proven otherwise. That does not constitute an ad hominem; it is merely updating priors in response to evidence.
Insofar as I understand what it’s pointing at, it is pointing at something I’d paraphrase as “logical thought is overrated”. There’s nuance to what exactly it’s being pushed aside in favor of, but that’s the core piece I object to.
I object to it most strongly because it comes from an intellectual lineage that draws adherents mostly from the rationalist community and is based around disparaging logical thought and a naive view of truth in favor of various woo-ish “instinct/social reasoning/tradition/spirituality without understanding is good” frameworks.
And while there’s value to system 1 reasoning, I think that A) CFAR is handling that quite fine with more care and purpose and B) Anything that hooks tightly to system 1 without being moderated by the strong endorsement of system 2 should be treated as BADSCARYATOMICFIRESPIDERS, even while trying to extract value from it.
I don’t know what he believes. I know only what he says. If he doesn’t believe what he says, that isn’t exactly a ringing endorsement, but would complicate things.
What I do know is that his entire notion of meaningness, and everything I’ve ever read from that blog, is anti-truth and anti-rationality. It’s grounded in assertions that rationality has problems which I do not accept are problems, and it makes bald-faced assertions that are just not true (see: eternalism, aka ‘Truth exists’, which is asserted to be wrong because it contains divine command theory as a subset) while laying the foundation for his other arguments. Ex falso sequitur quodlibet, and his writing style has all the bad qualities of Eliezer’s, so I can’t bring myself to read it in enough depth to write a point-by-point rebuttal.
The content may be fine, but the style is sneer and insight porn over substance. Please go back to good writing like your previous blogposts.
I would take this more seriously if I’d seen any evidence that Robin’s position had updated at all since the first Yudkowsky-Hanson FOOM debate. Which, despite seeing many discussions, between them and otherwise, I have not.
As it is, updating on this post would be double-counting.
I disagree. I think this is anti-epistemic and tends to devolve easily into bad manners, politics, and harassment.
See, e.g., the Sufficient Velocity forum, where reactions with a negative sentiment (including ones as mild as ‘sarcasm’ and ‘Picard facepalm’, and ones that were only available in limited quantities to paid subscribers) were quickly discontinued because they were used for flaming and Internet Arguments.