I think the main issue here is culture. Like, I agree with you that most members of the rationalsphere wouldn’t do well in a military bootcamp, and I think this suggests a failing of the rationalist community—a pendulum that swung too far, and has weakened people in a way that’s probably better than the previous/alternative weakness, but still isn’t great and shouldn’t be lauded. I, at least, would do fine in a military bootcamp. So, I suspect, would the rationalists I actually admire (Nate S, Anna S, Eli T, Alex R, etc). I suspect Eliezer wouldn’t join a military bootcamp, but conditional on him having chosen to do so, I suspect he’d do quite well, also. There’s something in there about being able to draw on a bank of strength/go negative temporarily/have meta-level trust that you can pull through/not confuse pain with damage/not be cut off from the whole hemisphere of strategies that require some amount of battering.
It makes sense to me that our community’s allergic to it—many people entered into such contexts before they were ready, or with too little information, or under circumstances where the damage was real and extreme. But I think “AVOID AT ALL COSTS! RED FLAG! DEONTOLOGICAL REJECTION!” is the wrong lesson to take from it, and I think our community is closer to that than it is to a healthy, carefully considered balance.
Similarly, I think the people-being-unreliable thing is a bullshit side effect/artifact of people correctly identifying flexibility and sensitivity-to-fluctuating-motivation as things worth prioritizing, but incorrectly weighting the actual costs of making them the TOP priorities. I think the current state of the rationalist community is one that fetishizes freedom of movement and sacrifices all sorts of long-term, increasing-marginal-returns sorts of gains, and that a few years from now, the pendulum will swing again and people will be doing it less wrong and will be slightly embarrassed about this phase.
(I’m quite emphatic about this one. Of all the things rationalists do, this one smacks the most of a sort of self-serving, short-sighted immaturity, the exact reason why we have the phrase “letting the perfect be the enemy of the good.”)
I do think Problem 4 can probably be solved incrementally/with a smaller intervention, but when I was considering founding a house, one of my thoughts was “Okay, good—in addition to all the other reasons to do this, it’ll give me a context to really turn a bazooka on that one pet peeve.”
I suspect Eliezer wouldn’t join a military bootcamp, but conditional on him having chosen to do so, I suspect he’d do quite well, also.
Eliezer wasn’t able to complete high school, for what I suspect are related reasons. (The sleep thing may have contributed, but I think it was overdetermined.)
I think I would have been extremely miserable if I had gone through boot camp at 18; I think I would have been able to bear going through it by ~25.
I think a relatively tight analogy can be made between attitudes towards the authoritarianism of a military bootcamp and attitudes towards romantic relationships. Like, if you go through a string of really bad relationships with partners who consistently abused you, you might update that there’s something inherently abusive about relationships and that you just shouldn’t be in one again, ever, because your autonomy is too important. On the other hand, there is such a thing as a healthy relationship, even a healthy relationship in which you have less than perfect autonomy because you’ve made some commitments that you’re following through on, and you might be lucky enough to find yourself in one in the future if you’re open to the possibility and search carefully for someone to commit to.
I think I disagree that the pendulum will swing back in the future though. The rationality community being the way it is now, prioritizing flexibility the way it does now, probably has the property that it attracts people who are prioritizing flexibility and turns off people who are looking for reliability. So if anything I expect the problem to get worse over time unless someone makes a deliberate effort to attract looking-for-reliability sorts of people—hopefully Dragon Army can do this.
a relatively tight analogy can be made between attitudes towards the authoritarianism of a military bootcamp and attitudes towards romantic relationships
I don’t get the analogy. So, if you go through a string of really bad military bootcamps? But you need to stay open to the possibility of a really good bootcamp that you can and should commit to?
Yes, but using “military bootcamp” as a symbol of broader kinds of authorities you could submit to, e.g. schools, employers, governments, and keeping in mind that people are learning about how authorities work based on others’ experiences and not just their own.
As someone who’s done the whole military thing (am I alone?), I agree with your view that most members of the rationalsphere would struggle immensely in bootcamp, both in terms of physicality and culture (I’m referring mostly to the Army and Marines here, which focus on actual combat training, vs. the Air Force and Navy, which don’t).
I totally agree that you would have 0 problems (other than patience with the stupid parts) as you have a high degree of physical ability, emotional resilience, and general cognitive ability. You would very likely excel. I could say the same of Val and Pete, and I’m sure Eli would do well (I don’t know the others you listed well enough to venture a guess).
I have never met Eliezer. However, from what I’ve read and been told, I suspect he would struggle a great deal and be unlikely to succeed. I can’t imagine Eliezer playing, say, football well either. My model of him simply says he’s not optimized for that kind of environment, where his intellectual strengths would be limited and his weaknesses amplified. It’s just not a remotely optimal environment for someone who is (according to my model of him) built like a race car: extreme performance within strict parameters (flat track, regular maintenance, etc.).
And that’s okay. The military enlisted system at least typically focuses on taking both physical and intellectual generalists and training them to perform a specific job. It’s all about the averages. The cockpit is decidedly not adjusted for individual needs or specialized performance for the vast majority of military personnel.
I do hope you’re at least somewhat right about the long-term, increasing-marginal-returns sorts of gains, since that’s my current strategy for achieving high impact on important matters.
Similarly, I think the people-being-unreliable thing is a bullshit side effect
You may wish to consider that this community has a very high frequency of disabilities which render one non-consensually unreliable.
You may wish to consider that your stance is especially insulting towards those members of our community.
You may wish to reconsider making uncharitable comments about those members of our community. In case it is unclear: “this one smacks the most of a sort of self-serving, short-sighted immaturity” is not a charitable statement.
Oh, I missed this one in the shuffle. Note that you chose to quote less than half a sentence, because if you quoted the whole sentence you’d have a heck of a time setting up the strawman you wanted to knock down.
Hi Duncan, I’m a relative newcomer (this is my first LW thread, though I’ve participated in rationalsphere discussions elsewhere), so this may not carry much weight, but I want to somewhat agree with handoflixue here.
One of my stronger reactions to your post is “this is an impossible set of expectations for me and a lot of others”. Which is fine, obviously you can have expectations that some people can’t live up to, and of course it is very good that you are making these expectations very clear.
But I sort of get the sense that you are a person who is fundamentally capable of being reliable and regularly making good life choices pretty easily, and that you sort of don’t get that for a lot of people these things are really hard even if they understand what the right choice is and are legitimately trying their best to do that.
This is based only partly on your post and somewhat more on a mini-talk which (IIRC) you gave at a CFAR community night where you posed the question “does it even make sense for people to seek out advanced rationality techniques such as the ones discussed here when they’re not displaying basic rationality such as eating a reasonable diet and sleeping enough?”. Even then, this question struck me as dangerously wrong-headed, and now that you are proposing to be in charge of people, this seems to take on more importance.
Advanced rationality techniques, at least when applied to one’s self-conception and life choices, are basically therapy. “Failures of basic rationality” are often better described as “mental health issues”. Therapy is how you deal with mental health issues. People with mental health issues need more therapy/advanced rationality, not less! I’ve seen it hypothesized that one reason we have so many mentally ill rationalists is because people with mental health issues must learn rationality in order to function, at least to some degree that is more than most people need.
I don’t actually know you, so my information is pretty incomplete, but my impression is that if someone fails to act in a way you (and they!) think is reasonable, you’re likely to become baffled and frustrated and try to deal with the problem by imposing stricter expectations & consequences. This might work for some people, but for many, it will just make them miserable and less productive because they will be angry at themselves for failing at things that they “should” be able to do.
I think it’s likely that your way of dealing with this is basically to screen out the people who are likely to react poorly to your approach, in addition to causing others like me to self-select out. That’s fine, I guess, though I would still be on the lookout for this sort of issue as a possible failure mode, and maybe also just demonstrate more compassionate awareness that things like reliability are actually almost impossible for some people, and maybe not attribute all of this to having the wrong culture or mindset.
(My general opinion of your project is “this sounds scary and I want to stay very far away from it, and this makes me somewhat wary of the people involved, and I wouldn’t recommend participation to people I know, at the same time I am really curious about how this will go so selfishly I’m a little glad it’s happening so I can gain information from it”.)
Thanks for the long comment. I really appreciate your candor and perspective—I do think I get the fact that other minds don’t work like mine, but you’re right in sniffing out that a lot of that knowledge is top-down and parts of me are still instinctively typical-minding a lot. I work hard to remind myself, e.g. I have triggers on certain words or feelings that cause me to review memories of specific times when my assumptions about what was going on in someone else’s head were blindingly false.
I think I generally agree with you that there’s a large overlap between rationality and therapy, and I’m intrigued by the hypothesis re: mentally ill rationalists; it seems to be pretty plausible.
Here’s my actual plan if someone fails to act in a way that seems reasonable. Note that this is the “everything but the kitchen sink option,” including aaaaaallll of the steps one might take, and that for smaller disagreements, this can be done as a speed run or stepwise.
1. Determine whether to follow up in the moment or later based on the needs of the activity; determine whether to follow up in private, in group, or via delegation based on the apparent needs of the person.
2. Start by asking. What did they think was going on? What were their thought processes? Assume from the outset that people act in consistent, coherent ways, and that basically everyone is trying to make the world a better place.
3. Try to pass their ideological Turing test. In other words, try to reflect back to them the priorities they were holding and the goals they were attempting to achieve, and keep falsifying my hypotheses until they give a clear endorsement of my summary.
4. Ask them to model me, in return (note: one important subthread of how the house will run is a check-in along the lines of “is Duncan clear, consistent, and model-able?”). See if they can predict what my priorities were, and if they have a sense of what I’m reacting to. Do not make this some kind of sick high-pressure quiz dynamic … if they shrug and say “dunno,” I’ll just explain.
5. Try to lay out, from as bird’s-eye a perspective as possible, the conflicting goalsets. Point at the causal chains that brought them into conflict, and highlight my model of where things are broken. Ask them if they have a different model/let them update my picture with a better sense.
6. Form a new plan for the future; explicitly discuss weighing the goals against one another, and how they ought to stack up. Possibly include other people in the discussion at this point, particularly if the defection seemed to have externalities.
7. Assume that plan failed. Come up with a plausible explanation for why; try to patch the first or second obvious holes. Form an intention going forward.
8. Check whether reparations need to be made. Hopefully, there’s a standard formula (as in the pushups example). If not, do a similar process of attempting to converge on a good face-saving/balance-restoring action. If there isn’t a clear satisfactory solution, default to a compromise and schedule a future check-in.
Through all of this, run things by others if either party thinks that’d be beneficial. Also consider things like anxiety/introversion, and have the conversation at a deliberate time rather than forcing it if it’s not urgent.
So yeah, in a sense, this might result in stricter expectations and consequences, but not in a blind, top-down way. In situations where there needs to be an immediate response, I’ll take an action/give an order and expect it to work, but I’ll want to revisit any such quick authoritarian moves after the fact, to explain my thinking and confirm absence of undue harm (and apologize/make amends of my own if necessary).
Overall, though, the idea is to build a high trust environment, and trust goes both ways and is easier to lose than to gain. The thing I want people in the house to actually be justified in believing is “Duncan always has good intentions and is making decisions from some kind of a model. He’ll explain when he can, and if he doesn’t, it’s because he has another model saying why he can’t, and he’ll instead explain both models once the thing is over.”
The idea being that I prove trustworthiness in situations 1-8, and people grant me a little leeway in situation 9. But 1-8 definitely have to come first.
Advanced rationality techniques, at least when applied to one’s self-conception and life choices, are basically therapy. “Failures of basic rationality” are often better described as “mental health issues”. Therapy is how you deal with mental health issues. People with mental health issues need more therapy/advanced rationality, not less! I’ve seen it hypothesized that one reason we have so many mentally ill rationalists is because people with mental health issues must learn rationality in order to function, at least to some degree that is more than most people need.
Um. Quick reply before I go further—I’m really really confident that the community talk night thing you’re remembering either wasn’t me or that the quote doesn’t resemble what I said. I strongly agree with you that that’s a dangerously wrong-headed way to try carving up the world.
That’s not because he didn’t do the exercise. Bootcamp doesn’t care whether you lose weight; it only cares whether you execute the weight loss program. If you don’t meet the body proportion standards, you just have to perform extra exercise.
Bootcamp (i.e. the military) cares very much both about losing sufficient weight to meet the standard and about the ability to perform at a basic level of physical fitness. The different U.S. military services have differing standards, but the general requirements are all comparable.
In an environment where the food supply is tightly controlled and there is constant movement, people tend to lose a lot of weight quite rapidly.
However, if you don’t meet the body proportion standards after a certain time, you will be separated from the military.
Part of the program is separating people who don’t lose weight. That doesn’t mean they care about the height/weight, only that the next box is ‘process for separation’.
There’s not a lot other than adherence to procedure that most of the military actually does care about.
I’m not sure if I’m totally missing your point, or if you’re making a point that’s a distinction without a difference.
In Army basic training, there are two standards one must meet:
- height/weight, adjusted for age and gender
- PT test, which consists of push-ups, sit-ups, and a 2-mile run, with scoring adjusted for age and gender
Failing either one will get you chaptered out of the Army within certain timeframes. There is a lot of fine print for specific situations (basic training has some extra cushion), but that’s the ground truth. These same principles apply to the military at large, but the standards and fine print differ.
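(As a toy sketch: the structure described above is two independent gates, where failing either one, within the relevant timeframe, triggers separation, and passing one does not compensate for failing the other. The thresholds below are illustrative placeholders, not actual Army standards, which vary by age, gender, and regulation.)

```python
# Toy model of the two independent retention gates described above.
# All thresholds are hypothetical placeholders, NOT real Army standards.

def meets_height_weight(weight_lb: float, max_allowed_lb: float) -> bool:
    """Height/weight (body composition) gate."""
    return weight_lb <= max_allowed_lb

def passes_pt_test(pushups: int, situps: int, run_minutes: float) -> bool:
    """PT gate: push-ups, sit-ups, and a 2-mile run (placeholder minimums)."""
    return pushups >= 35 and situps >= 47 and run_minutes <= 16.6

def retained(weight_lb: float, max_allowed_lb: float,
             pushups: int, situps: int, run_minutes: float) -> bool:
    # Failing EITHER gate leads to separation; excelling on one gate
    # does not offset failing the other.
    return (meets_height_weight(weight_lb, max_allowed_lb)
            and passes_pt_test(pushups, situps, run_minutes))
```

So, e.g., a soldier who maxes the PT test but exceeds the weight standard still fails the combined check, which is the "either one will get you chaptered out" behavior.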
I don’t know how that squares with: “That doesn’t mean they care about the height/weight.”
In an organization so devoted to adherence to procedure, what the procedures are set up to be is often a pretty strong indicator of what the organization cares about...
No individual cares about anything other than the procedures. Thus, the organization as a whole cares only about the procedures. Given the procedures that exist, the behavior looks similar to caring about fitness, but there is also a procedure to change procedure.
If the organization cared about fitness, the procedure to change the height/weight standards would be based on fitness. As it is, it is more based on politics. Therefore I conclude that the Army cares more about politics and procedures than fitness, and any behavior that looks like caring about fitness is incidental to their actual values.
This reminds me of Romeo’s comment over here:
http://lesswrong.com/lw/oym/how_id_introduce_lesswrong_to_an_outsider/dryk
Um. Quick reply before I go further—I’m really really confident that the community talk night thing you’re remembering either wasn’t me or that the quote doesn’t resemble what I said. I strongly agree with you that that’s a dangerously wrong-headed way to try carving up the world.
Oh, sorry for that mistake, then! Probably it was someone else. *feels mildly embarrassed*
I’m glad to hear you agree with my assessment of that way of thinking. In that case not very much of my comment actually stands.
Thank you for your thoughtful response!
Doesn’t Eliezer delete comments on Facebook that suggest exercise as a means of weight loss?
That’s not because he didn’t do the exercise. Bootcamp doesn’t care whether you lose weight; it only cares whether you execute the weight loss program. If you don’t meet the body proportion standards, you just have to perform extra exercise.
Bootcamp (i.e. the military) cares very much about both losing sufficient weight to meet the standard as well as the ability to perform at a basic level of physical fitness. The different U.S. military services have differing standards, but the general requirements are all comparable.
In an environment where the food supply is tightly controlled and there is constant movement, people tend to lose a lot of weight quite rapidly.
However, if you don’t meet the body proportion standards after a certain time, you will be separated from the military.
Part of the program is separating people who don’t lose weight. That doesn’t mean they care about the height/weight, only that the next box is ‘process for separation’.
There’s not a lot other than adherence to procedure that most of the military actually does care about.
I’m not sure if I’m totally missing your point, or if you’re making a point that’s a distinction without a difference.
In Army basic training, there are two standards one must meet:
- height/weight, adjusted for age and gender
- the PT test, which consists of push-ups, sit-ups, and a 2-mile run, with scoring adjusted for age and gender
Failing either one will get you chaptered out of the Army within certain timeframes. There is a lot of fine print for specific situations (basic training has some extra cushion), but that’s the ground truth. These same principles apply to the military at large, though the standards and fine print differ by service.
I don’t know how that squares with: “That doesn’t mean they care about the height/weight.”
In an organization so devoted to adherence to procedure, what the procedures are set up to be is often a pretty strong indicator of what the organization cares about...
No individual cares about anything other than the procedures. Thus, the organization as a whole cares only about the procedures. Given the procedures that happen to exist, the behavior looks similar to caring about fitness, but there is also a procedure for changing procedure.
If the organization cared about fitness, the procedure to change the height/weight standards would be based on fitness. As it is, it is more based on politics. Therefore I conclude that the Army cares more about politics and procedures than fitness, and any behavior that looks like caring about fitness is incidental to their actual values.