I see what you’re saying about rationality being trained in a pure fashion (where engineering, the sciences in general, etc. are—hopefully—“applied rationality”). One thing I don’t see you mention here, though it was a theme in your 3 worlds story, and is also a factor in martial arts training, is emotional management. That’s crucial for rationality, since it will most likely be our feelings that lead us astray. Look at how the feeling of “trust” did in Madoff’s investors. Muay Thai and Aikido deal with emotions differently, but each trains people to overcome their basic fear reactions with something else. An awesome rationalist, to me, would be someone who can maintain rationality when the situation is one of high emotion.
The use of absurdity seems more like a tool to enforce group norms than a means of conversion. That doesn’t mean the beliefs aren’t absurd, just that pointing out the absurdity of outsiders is common practice for in-group members. Most creationist-minded believers would use some similarly absurd way of describing evolution, with the group benefit of passing along the “evolution is stupid” meme. That said, it is important to start to tease apart just how many other enforcement strategies are out there, as they are going to need to be dealt with one by one.
Maybe something that tests “certainty faking”? I really don’t know how to construct it, per se; maybe use a FACS test to see how much a person is trying to convey that they’re very certain of something when they aren’t. That would just catch conscious faking, of course; you’d still need something to assess when someone is expressing their feeling of certainty vs. the data. Maybe something like Texas Hold ’Em, except with bets being placed on how accurate the probabilities are (e.g. randomized variations of situations like the cancer scenario at EY’s Bayes page)?
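One way a betting game like that could be scored is with a proper scoring rule such as the Brier score, which penalizes probabilities in proportion to their squared distance from what actually happened. This is just a minimal sketch of the idea, not anything from the original suggestion; the function name and the sample numbers are made up for illustration.

```python
# Hypothetical sketch: score stated probabilities against outcomes with the
# Brier score (mean squared error). A well-calibrated player earns a lower
# (better) score than a "certainty faker" who bets 0.99 on everything.

def brier_score(forecasts):
    """forecasts: list of (stated_probability, outcome) pairs,
    where outcome is 1 if the event happened, else 0."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Five bets each; the events turn out the same way in both cases.
calibrated = [(0.8, 1), (0.8, 1), (0.8, 0), (0.3, 0), (0.3, 1)]
overconfident = [(0.99, 1), (0.99, 1), (0.99, 0), (0.01, 0), (0.01, 1)]

print(brier_score(calibrated))     # 0.26 — modest penalty
print(brier_score(overconfident))  # 0.3921 — faked certainty costs more
```

The nice property for this purpose is that a proper scoring rule makes honesty the best strategy: overstating your confidence can only hurt your expected score.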
Sorry if I’m not articulating this well; hopefully it’s good enough to live up to the stupid-idea criteria, if not the good-idea ones. Oh, and I didn’t read any of the comments, so I don’t know if this has already been suggested.
Error reduction?
Probability or possibility pruning?
Parachute venting? (The mechanism for letting hot air out of a hot-air balloon to bring it back to the ground.)
Shouldn’t the rationality school suggested by Eliezer, though, be able to train someone to do well on these tests, essentially by becoming very familiar with the literature? Just devil’s advocating against your devil’s advocacy; it seems like this would actually be pretty ideal, as you would have scientifically benchmarked tests showing what, let’s say, “naive” individuals think when encountering these problems, from which you could then measure the progress of the “trained” rationalists. The problem with gaming this system would come from people who study rationality but plan to subvert it at some point; the rationalist community would need frequent re-certifications so that rationalists don’t rest on their laurels and rely on status to convey an inferred rationality of their decisions.
I think it was just brainstorming based on Eliezer’s post; he also wrote about the sanity waterline, which I see your rational society approach fitting in with. Maybe a dojo is a bit extreme, but I think a zendo isn’t implausible, with people working on rationality koans. Or maybe rationality group therapy, where people can express potential irrationality and receive non-judgemental feedback on it. Grassroots, bottom-up approaches could work with larger top-down approaches to create the rational society, or whatever word Yvain might find less taboo :)
Maybe it’s from having done a smattering of postings here and there for a while, but I really don’t have any thoughts of fame from posting here, OB, or anywhere else. I think it’s cool if someone likes what I write, but I don’t have the time to really devote to becoming one of the karmic elite here. I’m sure there’s some unconscious signaling going on, but on the conscious level, it’s more about the feeling of participating in a conversation instead of just observing it.
Handle: zaph
Location: Baltimore, MD
Age: 35
Education: BA in Psychology, MS in Telecommunications
Occupation: System Performance Engineer
I’m mostly here to learn more about applied rationality, which I hope to use on the job. I’m not looking to teach anybody anything, but I’d love to learn more about tools people use (I’m mostly interested in software) to make better decisions.
I think the first one’s good to have: it’s positive, and gets people somewhat acclimated to the whole karma thing. I really don’t know what to say about the second; if there were a perfect boilerplate response to religious criticism of rationalism, I suppose this forum probably wouldn’t exist. Yours is still as good an effort as any, though could we possibly take debating evolution completely off the table? That, and calling any scientific theory “just a theory”?
We are all looking to be “less wrong”, so I can’t imagine why anyone would be barred.
I think that if there were such a straightforward hack like EY was looking for, he would know about it already. I just don’t really believe that a hack like that exists, based on my admittedly meager readings in experimental psychology. Further, while the idea of a “mind hack” is a cute metaphor, it can be misguided. Computer hackers literally create code that directs processes. We can at best manipulate our outside environment in ways that we hope will affect what is still a very mysterious brain. What EY’s looking for would be the result of a well-funded, decades-long research project. Unless there truly is a Dharma Initiative looking into these things while staying behind the scenes, I don’t think there’s going to be a journal article that provides the profound insight he’s looking to find.
I do want to mention something about Seth Roberts, which he sort of casually mentions in The Shangri-La Diet. He wrote something along the lines that he was eating much less frequently, probably one full meal a day. That’s a practice referred to as intermittent fasting. What The Shangri-La Diet book misses, I would postulate, is how Seth used the no-flavor calories to transition to that kind of diet. IF is suggested as a way to control calories because people’s bodies cue hunger to the times they’re accustomed to eating. If you aren’t accustomed to eating, you eat a bit less (since you’re only filling your stomach the once, or so goes the idea). I certainly don’t think that observation gives me the complete picture of how diets should now be constructed. But I do feel that Seth Roberts, attentive as he is, did not fully consider all the changes he had made, and treated his reduced meal frequency solely as an aftereffect. In writing his popular book, he did not consider all the hacks that he had put into place for himself.
Akrasia-conquerors will need to find ways to win against their lesser but still powerful drives. Teachers of akrasia-conquering will need to be able to honestly detail everything that they did, which will probably require very keen observers as peers and students. The need for a perfect system to be in place before one attempts to overcome akrasia is itself an example of akrasia.
Perhaps you could write an article discussing ways to identify the differences between rationality and rationalization? I for one would find it useful. I find myself using rationalizations that mask themselves as rationality (often noticing too late), and it would help me do that less.
It’s a line from a play called Love for Love. The quote is voiced by a character; so the presentation here lacks that context. The play was satirical, and I wouldn’t take the quote at face value. I think Congreve was voicing what had become the standard social games of his time—say like Barney on How I Met Your Mother.
“What’s left, when God is gone?” is greeted by a puzzled look and “What exactly is missing?”
I think that this is a very important question to ask, and to really seek answers on, if this discussion is to advance at all. Obviously, there are believers of all stripes who are in some sense getting a reward for their beliefs, be it socially rewarding for the culture or subculture they are in, psychologically rewarding by allowing them to be more hopeful about the future, etc. Saying their metaphysical beliefs are unlikely doesn’t get at the heart of what their belief is based on, in my opinion. As has been pointed out, they don’t have a robust logical structure to their beliefs to start with. So, what’s “missing” from the rationalist worldview in the eyes of a believer in the supernatural is not necessarily the gods, fairies, etc., as much as the sense of safety, community, what have you, that their particular faith grants them. So, in addition to EY’s equation, I’d like to suggest that atheism != anti-community or anti-loyalty or anti-love, though in the current political structures and fights we’re having in the US, it almost definitely does equal anti-religion (see the teaching of evolution in schools and the gay marriage debates). Untheism, by disregarding the supernatural, can still claim a lot of the territory that supernaturalists refer to as “spiritual”.
Point taken, Thom. In my eyes, those positions (atheist magicians or supernaturally minded Buddhists) are still seeking the same things as a single-deity theist, such as a sense of control over their destiny or the possibility of being reunited with lost loved ones. In saying “we can speak about those things you want when you look for God (or magic, or the paranormal)”, I believe rationalists open up a better avenue of dialog than by saying “you aren’t talking about anything important”. This might help people improve their lives not by looking into the arcana of the faith they were raised in, but into the possibilities held in the world around them.
I’d add that a good boycott has an end in mind. What’s the point of a boycott without returning once certain conditions are met? This, in my eyes, lends more credence to the idea that this is about drama and self-promotion. It would have been much less eventful had they merely demanded that, say, Michael Shermer appear in an interview dismantling creationism, or better yet, a creationist (’s arguments, of course).
I haven’t sifted through the comments fully to see if it’s been addressed, but I think it’s very important to clearly separate creationism amongst scientists from creationism in the public sphere. I am not at all worried about creationism making inroads amongst biologists, paleontologists, etc. What I am worried about is a bad idea catching on in the public arena where things like the education curriculum in public schools are decided. The perception among people not in the know could be that scientists are dodging the debate. It’s all very tedious, but it’s the situation we’re in. So, yes, someone should debate these folks. My fear with sending underlings (sorry, undergrads) is that these debates aren’t cool-headed discussions of proofs and evidence. A person experienced in public speaking, even one debating something that’s incorrect, can do so with effective rhetoric and theatrical flourish that will overtake the other side’s argument, at least in the mind of the audience. The advantage that materialists have in evidence is eventually insurmountable, but it still needs to be properly demonstrated to the people who don’t yet understand it.
Would the most vulnerable people exposed to creationist arguments really read these online debates, though? I don’t know; I consider this all more of a public education campaign than a “debate” per se. I’m not against creationism’s persistence merely because it’s wrong; I’m against it because it’s wrong and harms the public good.
Any chance a feature could be added so that an account’s display name can be changed (without changing the account name, email, etc.)?
Sorry for the pedantry, but I believe that’s Philip K. Dick’s quote.
To the “sky is green” idea, I’d counter that the verification path might not work for converting people to atheism. Mormons, for instance, suggest to people that they will feel a burning in their heart when they read the Book of Mormon, which proves the book’s veracity. You need to logically piece together that any such physical sensation wouldn’t be sufficient to objectively verify anything. There isn’t an easy falsification of religious/magical thinking, just following chains of inference from observation. Non-believers just make a commitment to the minimal contortion of facts to fit their paradigm. As obvious as the Silence seems to be, some people don’t seem to hear it.