Allow me to add an experimental test to this: try sleeping in a well-ventilated room (e.g., multiple windows open, or a box fan in a window) and see whether incidents decrease.
Intrism
Greetings, LessWrongers. I call myself Intrism; I’m a serial lurker, and I’ve been hiding under the cupboards for a few months already. As with many of my favorite online communities, I found this one multiple times, through Eliezer’s website, TVTropes, and Methods of Rationality (twice), before it finally stuck. I am a student of computer science, and greatly enjoy the discipline. I’ve already read many of the sequences. While I can’t say I’ve noticed an increase in rationality since I started, I have made some significant progress on my akrasia, including recently starting an interesting, little-known LW-inspired technique, which I’ll write up once I have a better idea of how well it’s performing.
Over 20% of women in the U.S. experience domestic violence. The incidence of sociopathy is at or below 5%, so it’s more likely that an abusive male in a relationship is not a sociopath.
That isn’t logically valid. It’s possible for a single person to abuse more than one woman. Therefore, the percentage of abusers in the population is likely lower than the percentage of abused. I don’t know how much lower that is, but “less than 10%” is entirely plausible.
I’d argue that being a repeat offender is, for any crime and especially those with low conviction rates, more likely than being a repeat victim, by simple logic of “an offender chooses, a victim does not.” You are right, though, in that I should have mentioned the possibility.
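The arithmetic behind this exchange is simple enough to sketch. This is an illustrative calculation only; the 20% victimization figure comes from the comment above, while the average number of victims per abuser is a made-up parameter, not a real statistic:

```python
def abuser_prevalence(victim_rate, victims_per_abuser):
    """Rough estimate: if each abuser harms some average number of
    victims, abuser prevalence is about the victim rate divided by
    that average (ignoring repeat victimization)."""
    return victim_rate / victims_per_abuser

# Hypothetical: if each abuser averages 2 victims, a 20% victim rate
# implies a roughly 10% abuser rate.
print(abuser_prevalence(0.20, 2))
```

The larger the average number of victims per offender, the further the abuser rate falls below the victim rate, which is the direction of the correction argued for above.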
Maybe later, but as it is the application’s a bit hacked together; I’d be a bit embarrassed to show it around, honestly. I’m going to clean it up soon, so I might open-source it then.
I can do that. I can’t promise soon, because I’ve got quite a bit of classwork to do, but I’ve set a deadline for two weeks from now. Expect to see something before then.
Could you try using smaller candy?
The way the feeder is built, that wouldn’t really help. It dispenses a constant volume, not a set number of candies. I could try to reduce the dispensed volume further, but I think other techniques would be best to try first.
if the reward is in the system, I tend not to wait very long before using it.
This seems OK to me.
It’s not a problem except insofar as it interferes with some of the rules.
Or perhaps giving 1 candy per N points? Or giving a candy with probability 1/N?
These are the two big options I’m considering for next time. I’m leaning towards the “1 candy per N points” model, because that allows me to “gamify” the system with a big XP bar.
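For illustration, the two schedules average the same payout, one candy per N points; they differ only in variance. A minimal sketch, with hypothetical function names (nothing here is from the actual feeder code):

```python
import random

def fixed_ratio_reward(points, n):
    """'1 candy per N points': dispense exactly when the running
    point total hits a multiple of N (the XP-bar style schedule)."""
    return points > 0 and points % n == 0

def probabilistic_reward(n, rng=random):
    """'Candy with probability 1/N': each point independently has a
    1-in-N chance of dispensing (a variable-ratio schedule)."""
    return rng.random() < 1.0 / n
```

Variable-ratio schedules are generally reported to be more resistant to extinction, while the fixed-ratio version, as noted above, is the easier one to gamify with a visible XP bar.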
every 10 seconds, with probability ~20%, show a message in the background. When that appears, I examine my thoughts just prior to it appearing and reward if they were about work or other productive things.
Have you had any problems with the context switching? It seems like being interrupted every ~50 seconds would make me less productive.
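The ~50-second figure follows directly from the schedule quoted above: with a check every 10 seconds and a 20% chance per check, the number of checks until a prompt is geometrically distributed with mean 1/0.2 = 5, i.e. about 50 seconds between prompts on average. A quick sketch (the simulation code is made up for illustration, not taken from the original setup):

```python
import random

TICK_SECONDS = 10   # check interval from the comment above
P_PROMPT = 0.2      # per-check probability of showing a prompt

def seconds_until_prompt(rng=random):
    """Simulate one wait: count 10-second ticks until a prompt fires."""
    ticks = 1
    while rng.random() >= P_PROMPT:
        ticks += 1
    return ticks * TICK_SECONDS

# Expected interval: TICK_SECONDS / P_PROMPT = 10 / 0.2 = 50 seconds.
```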
you can eat or drink whatever you want whenever you want
Not quite. I don’t have any candy easily available to me (I suppose I could buy more, but that would be a pain), aside from what’s in the machine. Theoretically, I could eat that whenever I want, but I have some pretty strong incentives not to do so. I’ve precommitted to not taking any candy out, and I don’t want to break the precommitment (plus doing so would probably ruin the system forever). And, of course, there’s a very well-placed Schelling fence helping me stay honest, so it’s not even that hard.
One of those syllables is unnecessary. Try Rationality: How to Be Less Wrong.
I wonder if it’s possible to hack an electronic air freshener in this way.
I’ve never known an electronic air freshener which wouldn’t be more useful as a punishment than a reward.
Right now, I’m using Tom’s Planner, a Gantt chart app, for one of my class projects. It’s working really well.
I’ve considered that sort of thing, but I’m not really very good at estimating how long things take. I had considered building a time-estimation game into my scheduler, but so far that hasn’t worked out.
If your typing speed has a strong correlation to how quickly you’re getting your work done
Unfortunately, work speed only loosely correlates with utility. Bad code written quickly can waste more time than it saves.
I’d be worried about eating candy constantly for the sake of your teeth.
It’s not that constant. I’m fairly sure I’ve been eating less candy under the system than I would if I had a no-strings-attached bag of candy corn, anyhow...
More concisely, the article presents a long and elaborate rebuttal to the name “tribalism” without actually discussing the concept of tribalism at all. It also spends a great deal of time pointing out that Eliezer’s fanciful example is, indeed, fanciful.
I’ve never really observed that. Actually, my impression has always been that there’s a profusion of firehose-style group blogs like Huffington Post or the Daily Kos (with LessWrong being an unusually successful version of these), but that slow, thoughtful, non-instant-response, essay-format content like More Right’s present lineup can be hard to find. The only thing I’d suggest regarding content volume is that regular, frequent updates would be helpful.
The name’s a bit clever. However, I don’t think it’s a very good idea to make it so close to the name of a better-known website, as that makes it unusually prone to accidental corruption. This is made doubly unfortunate by the fact that contamination with the name “LessWrong” will invert the meaning; I’ve nearly flubbed it as “More Wrong” multiple times already.
It appears that the tribalism post has vanished—the link has gone dead, and it’s not on the main page anymore. What’s up with that? Will it be coming back later?
I sometimes see refutations of pro-religious arguments on this site, but no refutations of good arguments.
What good arguments do you think LW hasn’t talked about?
My point in posting this is simply to ask you—what, in your opinion, are the most legitimate criticisms of your own way of thinking?
Religion holds an important social and cultural role that the various attempts at rationalist ritual or culture haven’t fully succeeded at filling yet.
There are as many atheists who have never heard a decent defense of religion as there are religious fundamentalists who have never bothered to think rationally.
This seems improbable, considering that there are vastly more religious people than atheists.
I’ve found many intelligent atheists, and I’m sure that there are rational intellectuals out there who disagree with LW. But where are they?
As far as I know, most criticism of LW focuses on its taking certain strange problems seriously, not on atheism. LW has an unusual focus on Pascal-like problems, on artificial intelligence, on acausal trade, on cryonics and death in general, and on Newcomb’s Problem. Many of these lead to beliefs that other rationalist communities consider “strange.” There is also some criticism of Eliezer’s position on quantum mechanics, but I’m not familiar enough with that issue to comment on it.
If lights that are turned off are flickering, I recommend getting an electrician in to look at them. That’s clearly not supposed to happen (should be impossible, actually), and might be an indication of a potential electrical fire hazard. Just curious, does this house often trip breakers?