LesserWrong is dead to me. Trust nothing here; it is not truth-tracking.
PDV
nit: *KaTeX
I can try.
It’s someone who doesn’t understand your objection, and doesn’t seem to understand why you think it’s important that they understand it. (In stronger cases, they don’t even understand that they don’t understand it.) This generally feels like they are dodging the point of disagreement however you bring it up, as if it’s foreign to their entire worldview.
Any of those could count.
As mentioned in the comments on the crosspost: Ben and, independently, Zvi Mowshowitz, have both been doing something similar but different, and finding that making it more like the original tradition fulfills their goals better. This seems to be a pretty good indication that it is not just a rationalization.
Also mentioned there, from Jason Green-Lowe: The actual tradition as she is practiced has drifted far from the thing that’s desirable.
I definitely agree with the “interacting with a calculative person is dangerous” interpretation. If someone is presenting their unfiltered view, and I know them, I can interpret it in light of their goals and values and how they relate to mine and to me (and in fact do this instinctively). If they are calculating (in excess of my discernment) I cannot.
Some people maintain that maneuvering people to better serve their own best interests is a distinct skill from manipulating people toward arbitrary goals. I don’t believe that, and so it is hard to get evidence that someone (who by stipulation is manipulating me in ways I can’t perceive) has my best interests at heart.
>If you’re worried about an oncoming problem and discussing it with others to plan
[Moderation notice by Ben: I’ve blocked replies to this comment, as this comment reads as fairly political and quite aggressive. In particular this comment has personal attacks (‘obviously and dangerously confused’), incendiary language (‘cancer in the rationalist community’, ‘block off sound reasoning’), and little substance for gworley to respond to (or for me to even understand what exactly PDV believes and why).
PDV, there may be some valid points in there, but if you want to discuss them, you should create a comment that’s less directly insulting and more likely to promote productive discussion. This is a warning notice; if we read similar comments from you in the future, your account will likely be temporarily suspended.
Thanks to users who are hitting ‘report’ on comments—it’s super helpful to know quickly about things where moderators want to be able to watch/act.]
-----------------------------------------------------------------------------------------
I am not one of the most epistemically rational people in the community. Probably the top half, unlikely the top quarter. Even relative to me, though, you, Chapman, and your whole crowd, are very obviously and dangerously confused.
So confused that I genuinely believe that taking your ideas seriously corrodes good epistemic hygiene. They’re seductive, self-reinforcing, and block off sound reasoning, like the strongest conspiracy theories. They’re built into a superstructure that justifies feeling superior and ignoring counterargument, by distinguishing themselves as “gnostic” or “Kegan 5” or “postrationalist” and implying that this is better than standard-issue rationality.
They are not. They are just standard irrational thinking, dressed up in the language of the Sequences so that they can be embraced wholeheartedly while not rejecting the ingroup signalling.
They are a cancer in the rationalist community and have done more damage than RationalWiki. Only Intentional Insights and Leverage are real competition with Meaningness for the title of organization most harmful to the rationalist project.
(I’m not sure how Simler relates to any of it, since he has not displayed any of the themes or ideas I’ve seen from this irrationalist cluster. If he self-identifies with it that is surprising and I will have to re-evaluate my good opinion of his work.)
There are people productively engaging with that concept. They have none of these problems. Even if that concept is important and is what they’re trying to convey, accepting their framing does more harm than any potential benefit could justify; better to stick with people who are grounded in true things rather than nice things.
I know I’m going to use the next 2 years thing.
I’m not sure I believe that isn’t a contradiction in terms.
Chapman’s entire shtick is pretending to be wise, and worse, he’s good enough at it that people take his ideas seriously. They then spend months or years of effort building a superstructure of LW-shaped mysterianism on top of it, losing sight of their actual ability to distinguish true things from false things and/or accomplish goals.
The basic deal is that it professes to include all the goals and purpose of rationality, while also using other methods. But those other methods are thinly-disguised woo, which are attractive because they’re easy and comfortable, and comfortable because they’re not bound to effectiveness and accuracy. It keeps the style and language of the rationalist community—the bad parts—while pushing the simple view of truth so far back in its priorities that it’s just lip service.
I’ll grant that this isn’t quite the same flavor of anti-truth woo as Chapman. But the difference is unimportant to me.
The falsity of this argument follows directly from the computability of physics.
Your belief system is flawed, built on embracing not-even-wrong statements as truth. This makes every conclusion you draw suspect, and when you have confidently stated enough conclusions which bear out that suspicion, it is no longer reasonable to presume they are correct until proven otherwise. That does not constitute an ad hominem, merely updating priors in response to evidence.
Insofar as I understand what it’s pointing at, it is pointing at something I’d paraphrase as “logical thought is overrated”. There’s nuance about what exactly logical thought is being pushed aside in favor of, but that’s the core piece I object to.
I object to it the most strongly because it’s from an intellectual lineage that draws adherents mostly from the rationalist community and is based around disparaging logical thought and a naive view of truth in favor of various wooy “instinct/social reasoning/tradition/spirituality without understanding is good” frameworks.
And while there’s value to system 1 reasoning, I think that A) CFAR is handling that quite fine with more care and purpose and B) Anything that hooks tightly to system 1 without being moderated by the strong endorsement of system 2 should be treated as BADSCARYATOMICFIRESPIDERS, even while trying to extract value from it.
I don’t know what he believes. I know only what he says. If he doesn’t believe what he says, that isn’t exactly a ringing endorsement, but would complicate things.
What I do know is that his entire notion of meaningness, and everything I’ve ever read from that blog, is anti-truth and anti-rationality. It’s grounded in assertions that rationality has problems which I do not accept are problems, and it lays the foundation for his other arguments with bald-faced assertions that are just not true (see: eternalism, aka ‘Truth exists’, which is asserted to be wrong because it contains divine command theory as a subset). Ex falso sequitur quodlibet; and his writing style has all the bad qualities of Eliezer’s, so I can’t bring myself to read it in enough depth to write a point-by-point rebuttal.
The content may be fine, but the style is sneer and insight porn over substance. Please go back to good writing like your previous blogposts.
I would take this more seriously if I’d seen any evidence that Robin’s position had updated at all since the first Yudkowsky-Hanson FOOM debate. Which, despite seeing many discussions, between them and otherwise, I have not.
As it is, updating on this post would be double-counting.
Specifics, because otherwise I’ll be asked by people who doubt that my experiences are psychologically plausible.
I have no daily routine. Really. None. I don’t wake up at a consistent time, when I go into work is dictated by whether I’m thinking about it and varies back and forth by an hour without external stimulus. I have to remember to eat when there is not a defined schedule I have to stick to, and sometimes even then.
My hard-mode TAP attempt was as follows:
I had an en-suite bathroom in my apartment. I placed my water-flosser in a specific spot, visible from the doorway but not in direct line of sight walking in.
The TAP was “when I open the door, I will look toward the flosser.”
When I first conceived of it, I rehearsed this about a dozen times, with a few variations, such as swinging the door open different amounts and starting with it open or closed.
Over the next month or so, I regularly went into the bathroom. When I did, I sometimes (more often toward the beginning) would remember that I had a plan to install this TAP. If I remembered, I would carry it out. If I had entered the bathroom without doing it and then remembered later, I would usually leave and re-enter to rehearse it.

This never became any more automatic. By the time I gave up, several weeks after I began, I still regularly entered the room without the TAP firing. I think it was still occasionally coming to mind when I moved out a few months later, but it definitely wasn’t affecting the regularity with which I was actually remembering to floss. (I have a chart of that in Beeminder, so I’m quite sure.)
One thing I think is widely underappreciated about the color wheel as a classification system is the extent to which everything nonwhite is alien to modern society. We live in a culture and society dominated to a very large extent by white values. Our virtues are white; our fears are white: 1984, Stranger in a Strange Land, the Borg, FDR’s Four Freedoms, the UN Charter of Human Rights, the military, the Peace Corps. The major geopolitical and ideological struggles of the 20th century were between various flavors of white: royalism vs. democracy, communism and democracy vs. fascism, communism vs. democracy. Tradition vs. feminism and vs. racial equality. Religion and separation of church and state are also both white concepts.
There are important nonwhite concepts; capitalism is black at its most prosocial, atheism is blue, the sexual revolution brought on by The Pill is primarily red. Rationalism is, naturally, extremely blue.
But just like the correct answer to “What D&D alignment are you?” is almost always true neutral, the correct answer to “What main MtG color are you?” is almost always white. In order to get useful classification power you must adjust away from the baseline by treating someone who’s 70% white and evenly split between everything else as your zero point.
I disagree. I think this is anti-epistemic and tends to devolve easily into bad manners, politics, and harassment.
See, e.g., the Sufficient Velocity forum, where reactions with negative sentiment (including ones as mild as ‘sarcasm’ and ‘Picard facepalm’, and ones that were only available in limited quantities to paid subscribers) were quickly discontinued because they were used for flaming and Internet Arguments.