Datapoint: I thought handoflixue’s comment was much more reasonable and less uncharitable than cousin_it’s opening comment was; in particular, the points about needing an explicit abort procedure sounded very reasonable, and it makes me slightly worried to see you making a comment that implies you’re just disregarding them. (Only slightly, because of my personal trust in you and your abilities; I expect that people who don’t know you will get much more worried.)
EDIT: I wrote this comment before reading your reply to jbeshir’s comment; your response there further reduces my worry.
Kaj, I’m surprised. What do you think of this? Especially Ctrl+F “self-insert” and “horns effect”.
Not knowing the author, I can’t say much more than “someone freaked out”? I see mostly a strong emotional reaction, which looks to me similar to a bunch of other strong emotional reactions that people have had when they’ve pattern-matched things in the rationalist community to their stereotype of a cult, without actually understanding the community (or necessarily cults either).
Ah, now I see why some smart folks were okay with Duncan’s idea. They pattern-matched criticisms of it to criticisms of the rationalist community! That’s sneaky, even Scott fell prey to it, though he came around quickly (check his tumblr).
It seems like the only way “weird” groups can defend against such radicalization over time is by adopting “normie” ideas. I’ve been advocating that for a while, but I know it’s a hard sell here because many rationalists feel hurt by normies.
They pattern-matched criticisms of it to criticisms of the rationalist community!
Well, what else can you say to a criticism that’s mostly an emotional outburst? That post used every opportunity it could to interpret Duncan’s post in a maximally uncharitable light and turn things into ad hominems, such as “yes, dude, I too had a job in college”. I searched for the “self-insert” phrase like you asked me to, and it brought up a line where the author expressed not liking Duncan’s writing. What substantive point am I supposed to take from someone’s literary preferences? (Also, the author mischaracterizes “A Weekend with the Legion”—to the extent that it’s a self-insert fic, it’s one about joining a rationalist group house, not founding one, and I’m not sure where the “Mary Sue” thing came from.)
For me personally, a lot of what Duncan wrote resonated with me, in that I’ve long wished to live in a society arranged kind of like how he described Dragon Army, and it seemed clear that he’d seen the same things and worked off a similar model. Whereas few of the criticisms seemed to understand the intuitions/emotional needs that I presume we’re both operating out of, and so ended up missing the mark. E.g. I’m totally willing to buy it when he says that he doesn’t actually want to be the leader, both because I’ve met him, and also because not wanting to be the leader is a major part of why I’m not trying to create a similar project myself now that I’ve read his post (that, and because it would be too difficult to explain to people without them pattern-matching it into a cult).
It feels weird saying this to you, and please don’t take it too seriously, but if you feel an emotional need to live in a commune with salutes, push-up punishments and restrictions on criticism, have you considered that your emotions might be wrong (from an outside perspective)? For example, many of my emotions are wrong, that’s why I don’t text my exes while drunk.
The things you mentioned seem to me more like incidental than essential features of the commune; also, I’m not saying that I would agree with Duncan on exactly everything regarding the design—for one, I thought Ender’s Game was an okay book but didn’t see what all the fuss about it was. :) But then again, it’s his project, and I’m sure that my ideal aesthetics wouldn’t be his ideal aesthetics either.
The core things that do appeal to me are… well, this is a little hard to verbalize, since like him I’m operating more off a system 1, pattern-matching basis rather than any explicit first principles. But things like agreement with the sense that the pendulum of modern society has swung a little too far with regard to individualism and commitment; a sense that there is genuine value in being part of a group where everyone is genuinely, entirely committed to the project and each other’s welfare (“One for all, all for one”), where people are willing to try whatever weird thing works without needing to worry about what outsiders might think; and generally having a strong supportive social structure that offers you help when you’re struggling, pushes you to become the best possible version of yourself when you might otherwise slack off, and provides frequent feedback on how you’re doing regardless.
I think I’d be much better off in a situation like that, rather than the current situation where it feels like I mostly have to figure out everything myself, and where it’s a constant struggle to find allies for any project that would make things better and which I can’t pull off just by myself.
But sure, I’m open to the possibility that I’m wrong in this and such an environment wouldn’t actually be good for me, or that I’m reading too much into Duncan’s post and that the intuitions he’s operating out of are actually substantially different from the ones I’m having.
If the problem is lack of supporting structure in modern life, surely the answer is joining traditional institutions, not more costly and risky social experiments?
surely the answer is joining traditional institutions
I think this depends on how much alignment you can expect to have with traditional institutions. Quakers let in gays and atheists, but the politics of the typical member grated; joining the Mormons would involve celibacy until God calls up the prophet and tells them that being gay is okay (which I cautiously expect in less than ten years) and lying about beliefs in the supernatural. Joining the military involves participating in ‘wars’ that I disagree with strenuously, and when I was the right age to do it “don’t ask don’t tell” was still official policy (and, I later learned from an acquaintance who did go to the Academy I would’ve gone to, being openly atheistic was seen as an invitation for hazing by some of the instructors).
I’m not inviting people to join the Mormons. The OP’s curriculum would be better covered by joining a gym, meditation group, public speaking club or graphic design course, which don’t have the problems you mention.
I brought up the Mormons because I seriously considered joining them (and rejected it for the above reasons).
I think you’re fundamentally misunderstanding the nutrient being sought out if you think that the list of four things you mention (individually or all together) would actually satisfy the relevant hunger.
I thought the point was learning skills and interacting with people. If the real point is filling a tribe shaped hole in your soul, I can only repeat my question to Kaj. Are you sure that yearning for a tribe is an emotion that serves your interests?
Are you sure that yearning for a tribe is an emotion that serves your interests?
Given how yearning for a tribe is a “powerful, fundamental, and extremely pervasive motivation” (old paper, but later research has only served to further confirm the general notion), I would guess yes; for me personally, “being in a tribe” seems very much like the strongest unmet terminal goal that I have.
That seems like proving too much, since I don’t yearn for a tribe. Are you sure you aren’t confusing your social needs for a specific dream of fulfilling them?
A motivation can be “extremely pervasive” without being universal (very few things in psychology are truly universal). You may not share the yearning, but I’ve certainly run into plenty of people who do.
Are you sure you aren’t confusing your social needs with a specific way to fulfill them?
That is possible, and I have made that kind of a mistake before, but if there’s an alternative way of fulfilling them I haven’t found it.
It seems to me like there are flavors of ‘interacting with people’ that require tribe-mates.
Having a tribe is one of my interests.
I think you misunderstand the point. The goal is not to develop skills, the goal is to create an emotional web of support that comes from being a bona fide member of a tightly-knit tribe. You don’t (normally) get that at a gym or a public speaking group.
Possibly excluding some religious communities, which I wouldn’t want to join because I’m not religious, I don’t know of any traditional institutions that would provide general life support. Schools have some support structures in place that are aimed at helping you do better at school, martial arts training helps you become better at martial arts, etc. Which traditional institution is one that you can just join, and which is aimed at making all of its members become the best versions of themselves in all respects?
(By the way, I forgot to reply to this in the earlier comment, but I think that interpreting “start from the assumption of good faith when interacting with other members of the house” as “no criticizing the leader” is… not a particularly charitable interpretation.)
When deciding who to put in power and how much power to give them, the principle of charity is harmful.
It seems to me that institutions that claim to make you better in every way are always scams. The fact that a school will teach you only welding, and will give you a welder certificate in a certain number of weeks if you keep showing up, is a feature. If you join two or three institutions according to your interests, you’ll be fully booked in both self-improvement and social interaction, and it’s still less costly or risky than joining an authoritarian commune.
When deciding who to put in power and how much power to give them, the principle of charity is harmful.
There’s healthy skepticism and then there’s twisting words wildly beyond any reasonable interpretation...
Also, the level of skepticism should be proportionate to the level of authority requested; it makes sense to be more skeptical the more power someone wants. But my reading of the original post agrees with Sinal’s, who compares the level of authoritarianism to that of a Boy Scout troop leader. The original post has stuff like the first rule of conduct for a dragon being to protect themselves; it mentioned that people can “hard veto” proposed experimental norms; people are free to leave the experiment if they wish. Duncan’s authority seems to be limited to upholding policies that were agreed upon by group consensus and running them for a limited time; he has mentioned in the comments that he can be removed from power using the kind of procedures one would expect, e.g. a majority vote. The specific examples of his “tyrannical” powers that were given were things like deciding that a specific meeting will be held on Tuesdays even though not everyone wants the meeting to be on a Tuesday.
The Boy Scout troop leader probably has more power over his scouts than Duncan has in the house, and I doubt we’d consider people obviously unsuitable to be scout leaders for the sin of suggesting that scouts should assume good intent in their dealings with each other.
You’re talking like joining this commune would be an enormous risk, and I just don’t see that. Sure there’s a risk, but it’s on the same order as joining any other commune or moving in with other roommates—you risk having a miserable time for a while if it turns out you’re not a good fit for each other, and then things may be inconvenient for a while as you look for a new place to live.
Personally, I’ve made the mistake of moving in with some wildly incompatible roommates at least once, and have also on other occasions lived with people I’d strongly have preferred not to live with. Yes, it sucked a lot and made me much more miserable than I probably would have been otherwise. But then I moved out and don’t think I’ve suffered any lasting consequences, and despite the unpleasantness I still don’t consider it a risk on the order of “has to absolutely be avoided”.
It seems to me that institutions that claim to make you better in every way are always scams. The fact that a school will teach you only welding, and will give you a welder certificate in a certain number of weeks if you keep showing up, is a feature.
Agreed that this is a feature: sometimes one really does only want to learn welding. But if you want to learn dancing and everyone’s only teaching welding, with all the places that claim to teach dancing actually being scams… then that’s a major problem for you, and suggests that you’d get a lot out of it if someone did found a dancing school that actually taught dancing and wasn’t a scam.
I think claiming to teach skills that aren’t taught by any traditional institutions is fishy. (This isn’t an isolated demand, I’ve argued with CFAR folks that they should prioritize research into testing rationality, instead of jumping head first into teaching it.)
Yeah, when we want to learn things beyond the expertise of a house member (such as when we learned to use firearms during the weekend experiment) we bring in professional help.
The post says it will help you achieve three goals, of which self-improvement is the most important, and gives a list of 15 skills it will help you learn (many of which are fishy by my standard above).
Which traditional institution is one that you can just join, and which is aimed at making all of its members become the best versions of themselves in all respects?
I think what you’re referring to is something like the Holy Grail of institutions. So if someone claims that they’ve found the global optimum of institutions, the right reaction should be one of heavy skepticism. It’s not wrong to seek the global optimum, but when someone proposes that it exists in some well-explored territory based on a somewhat simple model, the argument they should present for it would probably look something like 1) We overlooked some seemingly trivial, but serious details that would have fixed the major issues we had previously and/or 2) Iterating on this idea for a while will not result in diminishing gains for a considerable time.
What we have in society right now is a bunch of local optima for specific needs. I think we should be prepared for the scenario in which the global optimum looks weird, and is composed of a sort of hodgepodge of various fixes and hacks and specific set-ups to meet different requirements for different people. And I know this looks ugly, but that’s typically what solutions look like as the output of optimization processes. I consider a single hierarchical institution to be a simple model, and therefore consider it unlikely that such an ambitious goal will be reached using such a solution.
So based on my above model of institutions, I place low probability on a solution that consists of a simple, already well-explored model, or one lacking a considerable number of details tacked on through consistent iteration and optimization. Right now I think this experiment will have to be run with significant fail-safe mechanisms in place and outside observation, so that this process can actually take place.
It’s not obvious to me that Duncan is proposing that. See my comment here. To me, it seems more like iterating and optimizing towards the optimum would get you something far from both the extremes of the libertarian egalitarian model and the one-person-in-charge-of-everything model.
I mentioned in another comment that Duncan’s role seems to be “upholding policies that were agreed upon by group consensus and running them for a limited time”; this does seem like it’s pretty distant from both rampant individualism and one-person-in-charge-of-everything to me.
I’m not sure of how to interpret your referenced comment; you seem to be talking about the “old model” being “cults”, but I don’t know what you mean by cults—I interpret a “cult” to be something like “a small group rallied around a charismatic leader with absolute authority”, but I don’t think that has been the predominant mode of social organization at any point in history?
I interpret “cult” as applicable to both small and large groups and not dependent on whether the leader has charisma or not (It could also refer to small tribes with chieftains, dictatorships, absolute monarchies, etc.). And I think in this regard it has been the predominant mode of social organization throughout history.
But after seeing Scott’s “on fourth thought” I have been more convinced that Duncan has been moving in the direction of placing limits on his power and making sure the appropriate safe-guards are in place, which has updated me away from seeing the pendulum as swinging too far in the opposite direction. I think the question remains whether or not continued updates and iterations will involve further limitations on his authority.
being part of a group where everyone is genuinely entirely commited to the project and each other’s welfare (“One for all, all for one”), where people are willing to try whatever weird things if it works without needing to worry about what outsiders might think, and generally having a strong supportive social structure that offers you help when you’re struggling, pushes you to become the best possible version of yourself when you might otherwise slack off, and provides frequent feedback of how you’re doing regardless.
Sure. You are describing a church group, or maybe an entire sect/denomination (see e.g. pretty much all early Protestant movements).
Is it a good idea? As usual, it depends :-/ Sometimes it works out and sometimes it doesn’t. Sometimes you spend a safe and content life doing good work, and sometimes you find yourself killing evil abominations like Catholics.
Besides, such groups evolve and usually not in a good direction. Becoming bureaucratic and ossified is relatively harmless, but being taken over by sociopaths (as per ribbonfarm) can be much worse.
Ok. If you don’t mind, I’ll use you as an interpreter for Duncan, since he doesn’t answer questions much. Can you explain why the idea of a group house with salutes, push-up punishments, restrictions on criticism etc. appeals to you? Is there any evidence that it would help learn skills more effectively, compared to taking a class? Why do you feel that the obvious dangers aren’t dangers, apart from knowing Duncan personally (many real world tyrants were reportedly charming in person) and seeing the list of excuses that’s identical to that of every other cult?
I resisted playing the fallacy game with Duncan because he’s clearly just parroting stuff, but I expected better from you. Okay, let’s go. “You’re being emotional” and “you’re pattern matching” are examples of the Bulverism fallacy. Your turn.
This person’s post, while containing some overlap with the more true and useful criticism here, is also not the sort of thing I expect people to cite on LW and not, I think, a useful entry in the back and forth here.
On the other hand, the difference in our levels of endorsement of it explains a lot about why our interaction went south in a hurry.
Quoting Qiaochu:
I would like everyone posting criticism, especially heated criticism, to keep very firmly in mind that Duncan did not have to write this. Whatever your opinion of him, at least make sure you’ve factored in the evidence that he wrote this whole, weird thing, complete with references to Ender’s Game, Fight Club, etc. instead of writing either 1) nothing or 2) something much more reassuring.
There are critics who think Duncan is incompetent and overconfident, and about this hypothesis I can say at least that it is consistent with Duncan having written this post. Then there are critics who think Duncan is, I dunno, evil or power-hungry or something, and I think those people are mostly failing to see what is in front of them.
I was tentatively willing to give you some benefit of the doubt even though I don’t know you but I’m really disappointed that you feel the need to score points against a rationalist-adjacent posting to her Tumblr about how your post looks to her from her outside vantage point. I brought a similar-amount-of-adjacent friend to the seder and it freaked her out. Rationalist shit looks bizarre from a couple steps away. You do not have to slam my friend for not being impressed with you.
Fair point. I will edit the above to remove point-scoring criticism; if this person wanted to be exposed to it, they would’ve posted here directly. I’ll ask you to leave your comment so it’s clear what originally occurred.
That being said, they certainly have no qualms about tearing into me. Like, my response to them was not a response to “I am unimpressed” or “I have a negative reaction to this,” and I think it’s a little disingenuous or unfair of you to summarize their content thusly. It’s … an asymmetric expectation of charity? Holding a double standard? Or something like that. I’d hope you’d offer feedback to them similar to what you said to me here, to see how they respond.
I know her and she has earned some charity from me. You’re a stranger soliciting a line of credit. Also, her task is “opine on Tumblr” and yours is “benevolent dictatorship”. If you want me to convey to her that your feelings were hurt I could do that for you, I suppose.
It’s less that my feelings were hurt (they were, a little, but I’ve developed a pretty thick skin around “strangers are wrong about me”), and more that you’re saying, to me, “hey, please don’t be uncharitable or overly critical or focus on point-scoring,” and I think the point-scoring exhibited in that post would cause me, in your shoes, to make a symmetric point to my friend. It’s a consistency thing, of supporting the norms I want to see in all places, ignoring partisan or loyalty lines (being willing to call out my allies as much as I’m willing to call out a stranger or an enemy).
I guess if I were to ask you to convey a message, it would be “this person thinks you’ve jumped to unfounded conclusions, and wonders what odds you’d put on ‘I might be wrong.’”
Thanks. As Lumifer has pointed out, I have become more defensive in the past 36 hours, but I claim it’s almost entirely limited to the two individuals who have shown themselves to be deontologically hostile and extremely overconfident in their models. There’s obviously wiggle room in there to say “Eh, even given that, Duncan, I think you’re overreacting,” but if so, it’s because I feel that after a hundred comments and a multithousand word post (that I didn’t have to make at all, in the first place) I deserve some credit à la I’ve clearly demonstrated willingness to engage positively with criticism and update publicly and admit wrong and so on and so forth (and therefore don’t like comments that presuppose me not being all those things).
I think you misunderstand the point. The goal is not to develop skills, the goal is to create an emotional web of support that comes from being a bona fide member of a tightly-knit tribe. You don’t (normally) get that at a gym or a public speaking group.
Possibly excluding some religious communities, which I wouldn’t want to join because I’m not religious, I don’t know of any traditional institutions that would provide general life support. Schools have some support structures in place that are aimed at helping you do better at school, martial arts training supports you become better at martial arts, etc. Which traditional institution is one that you can just join, and which is aimed at making all of its members become the best versions of themselves in all respects?
(By the way, I forgot to reply to this in the earlier comment, but I think that interpreting “start from the assumption of good faith when interacting with other members of the house” as “no criticizing the leader” is… not a particularly charitable interpretation.)
When deciding who to put in power and how much power to give them, the principle of charity is harmful.
It seems to me that institutions that claim to make you better in every way are always scams. The fact that a school will teach you only welding, and will give you a welder certificate in a certain number of weeks if you keep showing up, is a feature. If you join two or three institutions according to your interests, you’ll be fully booked in both self-improvement and social interaction, and it’s still less costly or risky than joining an authoritarian commune.
There’s healthy skepticism and then there’s twisting words wildly beyond any reasonable interpretation...
Also, the level of skepticism should be proportionate to the level of authority requested; it makes sense to be more skeptical the more power someone wants. But my reading of the original post agrees with Sinal’s, who compares the level of authoritarianism to that of a Boy Scout troop leader. The original post has stuff like the first rule of conduct for a dragon being to protect themselves; it mentioned that people can “hard veto” proposed experimental norms; people are free to leave the experiment if they wish. Duncan’s authority seems to be limited to upholding policies that were agreed upon by group consensus and running them for a limited time; he has mentioned in the comments that he can be removed from power using the kind of procedures one would expect, e.g. a majority vote. The specific examples of his “tyrannical” powers that were given were things like deciding that a specific meeting will be held on Tuesdays even though not everyone wants the meeting to be on a Tuesday.
The Boy Scout troop leader probably has more power over his scouts than Duncan has in the house, and I doubt we’d consider people obviously unsuitable to be scout leaders for the sin of suggesting that scouts should assume good intent in their dealings with each other.
You’re talking like joining this commune would be an enormous risk, and I just don’t see that. Sure there’s a risk, but it’s on the same order as joining any other commune or moving in with other roommates—you risk having a miserable time for a while if it turns out you’re not a good fit for each other, and then things may be inconvenient for a while as you look for a new place to live.
Personally, I’ve made the mistake of moving in with wildly incompatible roommates at least once, and on other occasions have lived with people I’d strongly have preferred not to live with. Yes, it sucked a lot and made me much more miserable than I probably would have been otherwise. But then I moved out, and I don’t think I’ve suffered any lasting consequences; despite the unpleasantness, I still don’t consider it a risk on the order of “must absolutely be avoided”.
Agreed that this is a feature: sometimes one really does only want to learn welding. But if you want to learn dancing and everyone’s only teaching welding, with all the places that claim to teach dancing actually being scams… then that’s a major problem for you, and suggests that you’d get a lot out of it if someone did found a dancing school that actually taught dancing and wasn’t a scam.
I think claiming to teach skills that aren’t taught by any traditional institutions is fishy. (This isn’t an isolated demand, I’ve argued with CFAR folks that they should prioritize research into testing rationality, instead of jumping head first into teaching it.)
Duncan’s project isn’t really about teaching skills, though.
Yeah, when we want to learn things beyond the expertise of a house member (such as when we learned to use firearms during the weekend experiment) we bring in professional help.
The post says it will help you achieve three goals, of which self-improvement is the most important, and gives a list of 15 skills it will help you learn (many of which are fishy by my standard above).
I think what you’re referring to is something like the Holy Grail of institutions. So if someone claims that they’ve found the global optimum of institutions, the right reaction should be one of heavy skepticism. It’s not wrong to seek the global optimum, but when someone proposes that it exists in some well-explored territory based on a fairly simple model, the argument they present for it should probably look something like: 1) we overlooked some seemingly trivial but serious details that would have fixed the major issues we had previously, and/or 2) iterating on this idea will not run into diminishing returns for a considerable time.
What we have in society right now is a bunch of local optima for specific needs. I think we should be prepared for the scenario in which the global optimum looks weird, and is composed of a sort of hodgepodge of various fixes and hacks and specific set-ups to meet different requirements for different people. And I know this looks ugly, but that’s typically what the outputs of optimization processes look like. I consider a single hierarchical institution to be a simple model, and therefore consider it unlikely that such an ambitious goal will be reached with such a solution.
So based on my model of institutions above, I place low probability on a solution that consists of an already well-explored simple model, or one without a considerable number of details tacked on through consistent iteration and optimization. Right now I think this experiment will have to be run with significant fail-safe mechanisms in place and outside observation, so that this process can actually take place.
Isn’t starting from a simple model and then iterating and optimizing (i.e. exactly what Duncan is proposing) the only way to get to that point?
It’s not obvious to me that Duncan is proposing that. See my comment here. To me, it seems more like iterating and optimizing towards the optimum would get you something far from both the extremes of the libertarian egalitarian model and the one-person-in-charge-of-everything model.
I mentioned in another comment that Duncan’s role seems to be “upholding policies that were agreed upon by group consensus and running them for a limited time”; this does seem like it’s pretty distant from both rampant individualism and one-person-in-charge-of-everything to me.
I’m not sure of how to interpret your referenced comment; you seem to be talking about the “old model” being “cults”, but I don’t know what you mean by cults—I interpret a “cult” to be something like “a small group rallied around a charismatic leader with absolute authority”, but I don’t think that has been the predominant mode of social organization at any point in history?
I interpret “cult” as applicable to both small and large groups, and not dependent on whether the leader has charisma (it could also cover small tribes with chieftains, dictatorships, absolute monarchies, etc.). And I think in this regard it has been the predominant mode of social organization throughout history.
But after seeing Scott’s “on fourth thought” I have been more convinced that Duncan has been moving in the direction of placing limits on his power and making sure the appropriate safeguards are in place, which has updated me away from seeing the pendulum as swinging too far in the opposite direction. I think the question remains whether or not continued updates and iterations will involve further limitations on his authority.
Sure. You are describing a church group, or maybe an entire sect/denomination (see e.g. pretty much all early Protestant movements).
Is it a good idea? As usual, it depends :-/ Sometimes it works out and sometimes it doesn’t. Sometimes you spend a safe and content life doing good work, and sometimes you find yourself killing evil abominations like Catholics.
Besides, such groups evolve and usually not in a good direction. Becoming bureaucratic and ossified is relatively harmless, but being taken over by sociopaths (as per ribbonfarm) can be much worse.
Ok. If you don’t mind, I’ll use you as an interpreter for Duncan, since he doesn’t answer questions much. Can you explain why the idea of a group house with salutes, push-up punishments, restrictions on criticism, etc. appeals to you? Is there any evidence that it would help one learn skills more effectively, compared to taking a class? Why do you feel that the obvious dangers aren’t dangers, apart from knowing Duncan personally (many real-world tyrants were reportedly charming in person) and seeing the list of excuses that’s identical to that of every other cult?
I resisted playing the fallacy game with Duncan because he’s clearly just parroting stuff, but I expected better from you. Okay, let’s go. “You’re being emotional” and “you’re pattern matching” are examples of the bulverism fallacy. Your turn.
OK. I’m even more surprised about you now but let’s drop this.
This person’s post, while containing some overlap with the more true and useful criticism here, is also not the sort of thing I expect people to cite on LW and not, I think, a useful entry in the back and forth here.
On the other hand, the difference in our levels of endorsement of it explains a lot about why our interaction went south in a hurry.
Quoting Qiaochu:
I was tentatively willing to give you some benefit of the doubt even though I don’t know you but I’m really disappointed that you feel the need to score points against a rationalist-adjacent posting to her Tumblr about how your post looks to her from her outside vantage point. I brought a similar-amount-of-adjacent friend to the seder and it freaked her out. Rationalist shit looks bizarre from a couple steps away. You do not have to slam my friend for not being impressed with you.
That’s kind of unfair, considering the sheer amount of point-scoring going on in the original post.
Fair point. I will edit the above to remove point-scoring criticism; if this person wanted to be exposed to it, they would’ve posted here directly. I’ll ask you to leave your comment so it’s clear what originally occurred.
That being said, they certainly have no qualms about tearing into me. Like, my response to them was not a response to “I am unimpressed” or “I have a negative reaction to this,” and I think it’s a little disingenuous or unfair of you to summarize their content thusly. It’s … an asymmetric expectation of charity? Holding a double standard? Or something like that. I’d hope you’d offer feedback to them similar to what you said to me here, to see how they respond.
I know her and she has earned some charity from me. You’re a stranger soliciting a line of credit. Also, her task is “opine on Tumblr” and yours is “benevolent dictatorship”. If you want me to convey to her that your feelings were hurt I could do that for you, I suppose.
It’s less that my feelings were hurt (they were, a little, but I’ve developed a pretty thick skin around “strangers are wrong about me”), and more that you’re saying, to me, “hey, please don’t be uncharitable or overly critical or focus on point-scoring,” and I think the point-scoring exhibited in that post would cause me, in your shoes, to make a symmetric point to my friend. It’s a consistency thing, of supporting the norms I want to see in all places, ignoring partisan or loyalty lines (being willing to call out my allies as much as I’m willing to call out a stranger or an enemy).
I guess if I were to ask you to convey a message, it would be “this person thinks you’ve jumped to unfounded conclusions, and wonders what odds you’d put on ‘I might be wrong.’”
I don’t really see the situations as symmetrical or calling for identical norms.
Thanks. As Lumifer has pointed out, I have become more defensive in the past 36 hours, but I claim it’s almost entirely limited to the two individuals who have shown themselves to be deontologically hostile and extremely overconfident in their models. There’s obviously wiggle room in there to say “Eh, even given that, Duncan, I think you’re overreacting,” but if so, it’s because I feel that after a hundred comments and a multithousand-word post (that I didn’t have to make at all, in the first place) I deserve some credit, à la “I’ve clearly demonstrated willingness to engage positively with criticism, update publicly, admit when I’m wrong,” and so on and so forth (and therefore don’t like comments that presuppose me not being all of those things).