I notice that my default sense is that Jacob is making a reasonable ask here, but also that Squirrel seems to be trying to do something similar to what I just felt compelled to do on a different thread so I feel obliged to lean into it a bit.
I’m not sure...
a) how to handle this sort of disagreeing on vantage points, where it’s hard to disentangle ‘person has an important frame that you’re not seeing that is worth at least having the ability to step inside’ vs ‘person is just wrong’ and ‘person is trying to help you step inside a frame’ vs ‘person is making an opaque-and-wrong appeal to authority’ (or various shades of similar issues).
or, on the meta level:
b) what reasonable norms/expectations on LessWrong for handling that sort of thing are. Err on one side and a lot of people miss important things, err on another side and people waste a lot of time on views that maybe have interesting frames but… just aren’t very good. (I like that Jacob set pretty good discussion norms on this thread but this is a thing I’m thinking a lot about right now in the general case).
As of now I have not read anything about Peterson besides this post and one friend’s facebook review of his book, so I don’t have a horse in the object level discussion.
Jacob: I think what Squirrel is saying is that your focus on the object level claims from within your current frame is causing you to miss important insights you could grok if you were trying harder to step inside Jordan’s frame (as opposed to what you are currently doing, which looks more like “explaining his frame from inside your frame.”)
[To be clear, your frame, which I share, seems like a really super great way to see the world and possibly literally the best one, but I think the mental skill of deeply inhabiting other worldviews is important, albeit for reasons I’d probably need to spend 10 hours thinking about in order to fully justify]
[[Also, insofar as chains of authority are worth listening to and insofar as I get any authority cred, I think Squirrel is pretty worth listening to as a filter for directing your attention at things that might be nonsense or might be weirdly important]]
Squirrel: I’d tentatively guess that you’d make better headway trying to describe Jordan’s frame and what value you got out of it than the hard-to-tell-from-argument-by-authority thing you’re currently doing, although also I think it may have been correct to do the first two comments you did before getting to that point anyway, dunno.
Meta: I think it’s a reasonable norm on LW to expect people to acquire the “absorb weird frames you don’t understand” skill, but also reasonable to have the default frame be “the sort of approach outlined in the sequences”, and to try as best you can to make foreign frames legible within that paradigm.
Ray, are you 100% sure that’s what is actually going on?
Let’s introduce some notation, following the OP: there are (at least) two relevant frameworks of truth, the technical, which we’ll denote T, and the metaphorical, M. In this community we should be able to agree what T is, and I may or may not be confused about what M is and how it relates to T. I wrote this post specifically to talk about M, but I don’t think that’s where Squirrel and I are in disagreement.
My post explicitly said that I think that Peterson is M.right even though he’s T.wrong-on-many-things. Squirrel didn’t say they (he? she? ze?) “got some value” out of Peterson in the M-framework. They explicitly said that he’s not wrong-on-many-things in the T framework, the same way Eliezer is T.correct. Well, Eliezer told me how to assess whether someone is T.correct—I look at the evidence in the object-level claims.
If someone thinks I’m doing T wrong and misapplying rationality, I’m going to need specifics. Ditto if someone thinks that Eliezer is also T.wrong-on-many-things and I don’t notice that because I’m deluding myself. So far, I’m the only one who has come up with an example of where I think that Eliezer is T.wrong.
My point when talking about Squirrel’s authority isn’t to belittle them, but to say that changing my mind would require a bit more effort, if anyone feels up to it. It should be obvious that my own framework is such that saying “truth juice” is unlikely to move me. I want to be moved! I’ve been spelling out the details not because I want to fight over C-16 or low carb breakfasts, but to make it easier for people who want to convince me or change my framework to see where the handles are. And I’ve tried to introduce specific language so we don’t talk past each other (Rule 10: be precise in your speech).
Of course, that doesn’t make me entitled to people’s efforts. If you have something more fun to do on a Sunday, no hard feelings :)
Ray, are you 100% sure that’s what is actually going on?
Nope! (It was my best guess, which is why I used some words like “seems” and “I think that Squirrel is saying”)
But it sounds from the other comment like I got it about right.
I agree that persuading someone to step harder into a frame requires a fair bit more effort than what Squirrel has done so far (I’ve never seen anyone convince someone of this sort of thing in one sitting; it always seems to require direct chains of trust, often built over years, but I think the art of talking about this usefully has a lot of room for progress)
They explicitly said that he’s not wrong-on-many-things in the T framework, the same way Eliezer is T.correct.
Frustrating, that’s not what I said! Rule 10: be precise in your speech, Rule 10b: be precise in your reading and listening :P My wording was quite purposeful:
I don’t think you can safely say Peterson is “technically wrong” about anything
I think Raemon read my comments the way I intended them. I hoped to push on a frame that people seem to be (according to my private, unjustified, wanton opinion) obviously too stuck in. See also my reply below.
I’m sorry if my phrasing seemed conflict-y to you. I think the fact that Eliezer has high status in the community and Peterson has low status is making people stupid about this issue, and this makes me write in a certain style in which I sort of intend to push on status because that’s what I think is actually stopping people from thinking here.
Yeah, these are issues outside of his cognitive expertise and it’s quite clear that he’s getting them wrong… you are mostly accusing him of getting things wrong about which he never cared in the first place.
What exactly did you think I meant when I said he’s “technically wrong about many things” and you told me to be careful? I meant something very close to what your quote says, I don’t even know if we’re disagreeing about anything.
And by the way, there is plenty of room for disagreement. alkjash just wrote what I thought you were going to, a detailed point-by-point argument for why Peterson isn’t, in fact, wrong. There’s a big difference between alkjash’s “Peterson doesn’t say what you think he says” and “Peterson says what you think and he’s wrong, but it’s not important to the big picture”. If Peterson really says “humans can’t do math without terminal values” that’s a very interesting statement, certainly not one that I can judge as obviously wrong.
I did in fact have something between those two in mind, and was even ready to defend it, but then I basically remembered that LW is status-crazy and gave up on fighting that uphill battle. Kudos to alkjash for the fighting spirit.
I think you should consider the possibility that the not-very-positive reaction your comments about Peterson here have received may have a cause other than status-fighting.
(LW is one of the less status-crazy places I’m familiar with. The complaints about Peterson in this discussion do not look to me as if they are primarily motivated by status concerns. Some of your comments about him seem needlessly status-defensive, though.)
Not to sound glib, but what good is LW status if you don’t use it to freely express your opinions and engage in discussion on LW?
The same is true of other things: blog/Twitter followers, Facebook likes etc. are important inasmuch as they give me the ability to spread my message to more people. If I never said anything controversial for fear of losing measurable status, I would be foregoing all the benefits of acquiring it in the first place.
Not to sound glib, but what good is LW status if you don’t use it to freely express your opinions and engage in discussion on LW?
Getting laid, for one thing.
And, you know, LW is a social group. Status is its own reward. High-status people probably feel better about themselves than low-status people do, and an increase in status will probably make people feel better about themselves than they used to.
Eric Hoffer was a longshoreman who just happened to write wildly popular philosophy books, but I think he’d agree that that’s not terribly usual.
Yeah, I thought it could be something like that. I don’t live in Berkeley, and no woman who has ever slept with me cared one jot about my LW karma.
With that said, the kind of status that can be gained or lost by debating the technical correctness of claims JBP makes with someone you don’t know personally seems too far removed from anyone’s actual social life to have an impact on getting laid one way or another.