Just another Bay Area Singulatarian Transhumanist Libertarian Rationalist Polyamor-ish coder & math nerd. My career focuses on competitive governance; personally I’m very into personal development (“Inward & upward”); lately I’ve gotten super into cultivation novels because I want to continuously self-improve until my power has grown to where I can challenge the very heavens to protect humanity.
patrissimo
Yeah, but you’d get lots of applause!
Eliezer Yudkowsky’s keyboard only has two keys: 1 and 0.
The speed of light used to be much lower before Eliezer Yudkowsky optimized the laws of physics.
Wow, SIAI has succeeded in monetizing Less Wrong by selling karma points. This is either a totally awesome blunder into success or sheer Slytherin genius.
(26 Feb 2011 15:27 UTC; 2 points) comment on “Making money with Bitcoin?”
One of the greatest benefits I’ve gotten from (westernized) Buddhism is the idea that a resistance to reality is at the root of much unhappiness.
It seems absurd to me that the human mind so constantly wishes that reality were different—I don’t see how it serves our evolutionary needs. But while I don’t have an explanation, it is amazing how often I find myself denying reality instead of “Immediate adaptation to the realities of the situation! Followed by winning!”. For example, when I encounter bad, unexpected auto traffic, whining is such a horribly unproductive reaction that it still boggles my mind every time I do it. Yet in many moods (already tired, stressed) it is my default response.
I think many rationalists would get a lot more personal happiness out of working on this single concept, as well as improving strategy for our causes, than many of the narrower and more complex ideas presented on OB/LW.
You seem to be assuming that what you want to hear is how people should be learning to communicate (“I’d prefer they skip it”), but part of the point is that we are not like most people. If you want to communicate effectively with the broader population, then you have to focus on what they like to hear, not judge communication suggestions based on whether you would like hearing it.
Also, I love brevity, but I charitably assumed that the politeness examples were exaggerated to make the point. Exaggerated examples, while they often bother analytical types who already get the point (“but that’s too far the other way!”), are (IMHO) quite useful at helping get across new ideas by magnifying them.
And compactness is hard, as is habit change. So developing compact politeness seems harder than developing politeness and then polishing it with brevity and clarity. Maybe too hard for some people—one habit at a time is often easier.
Couldn’t it just be an erroneous application of (an intuited version of) Newton’s law of cooling, which says that the rate of heat transfer is linearly proportional to the temperature difference? They assume that the thermostat setting directly sets the temperature of the heating element, and then apply their intuited Newton’s law.
Seems pretty rational to me.
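The intuition can be made concrete with a toy simulation (my own sketch, with invented numbers, not anything from the comment): under the mistaken model where the setpoint directly fixes the heating element’s temperature, Newton’s law of cooling really does imply that a higher setting warms the room faster.

```python
# Toy model: dT_room/dt = k * (T_element - T_room)  (Newton's law of cooling).
# If you (wrongly) believe the thermostat setpoint IS the element temperature,
# cranking the setpoint higher makes the room reach your target sooner.

def minutes_to_reach(target, element_temp, room_temp=15.0, k=0.05, dt=1.0):
    """Simulate the room warming toward the element temperature;
    return how many minutes it takes to reach `target` degrees."""
    t = 0.0
    while room_temp < target:
        room_temp += k * (element_temp - room_temp) * dt
        t += dt
    return t

# Cranking the "element" to 30 C vs. 21 C to get a 15 C room up to 20 C:
fast = minutes_to_reach(20.0, element_temp=30.0)
slow = minutes_to_reach(20.0, element_temp=21.0)
assert fast < slow  # under this (mistaken) model, cranking it really is faster
```

So the behavior is a rational inference from a wrong model of how thermostats work, which is exactly the comment’s point.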
Sure, there are two ways to work on the problem. One is to increase willpower. The other is to learn tricks not to use it. I agree the second one is better. But let’s take this back to the context of Less Wrong and its effects.
Paul Graham’s tricks include turning off the internet. The “distractions and temptations” he wants you to remove from your office are things like Less Wrong. The existence of Less Wrong is the existence of a temptation tuned to those who wish to become more rational and more effective at achieving their goals. This makes it just as bad a thing in Graham’s analysis as in mine!
“Working on stuff you like” and “rationalizing that stuff you like is work” are very different. The former is great when you can do it. The latter is the type of rationalization Paul talked about in his recent essay Self-Indulgence, where the worst time-wasters are those that don’t feel like time-wasters:
The most dangerous way to lose time is not to spend it having fun, but to spend it doing fake work. When you spend time having fun, you know you’re being self-indulgent. Alarms start to go off fairly quickly. If I woke up one morning and sat down on the sofa and watched TV all day, I’d feel like something was terribly wrong. Just thinking about it makes me wince. I’d start to feel uncomfortable after sitting on a sofa watching TV for 2 hours, let alone a whole day.
And yet I’ve definitely had days when I might as well have sat in front of a TV all day—days at the end of which, if I asked myself what I got done that day, the answer would have been: basically, nothing. I feel bad after these days too, but nothing like as bad as I’d feel if I spent the whole day on the sofa watching TV. If I spent a whole day watching TV I’d feel like I was descending into perdition. But the same alarms don’t go off on the days when I get nothing done, because I’m doing stuff that seems, superficially, like real work.
That is what I am claiming Less Wrong is—something that seems, superficially, like real personal growth work.
One possible solution is to have education financed by equity rather than loans: the third party who pays for your education does so in return for some share of your future income. Besides the obvious effect of funding profitable education, this has the totally awesome side-effect of giving an organization a strong incentive to figure out exactly how much each person’s income will be increased by each job—which includes predicting salary, probability of graduating, future macro trends, etc.
The third party wouldn’t have much incentive to predict what jobs will be most fun (only whether you will hate it so much you quit), but at least a big chunk of the problem would be solved. Personally I think the solution would involve “higher education is rarely worth it”, and direct people towards vocational training or just getting a damn job. But I could be wrong—the great thing about a mechanism is that I don’t have to be right about the results to know that it would make things more efficient :).
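The funder’s pricing problem can be sketched in a few lines (my own invented numbers and function name, purely illustrative): the expected return on a student is the income share times predicted earnings, discounted by the chance they don’t graduate.

```python
# Toy sketch of how an education funder might price an income-share deal.
# All numbers are made up for illustration.

def funder_expected_value(p_graduate, annual_income, share, years, cost):
    """Expected profit of funding one student's education via an income share:
    graduation probability * predicted salary * share * payment years - cost."""
    expected_payments = p_graduate * annual_income * share * years
    return expected_payments - cost

# A degree costing 40k, 80% graduation odds, 10% of a 60k salary for 10 years:
ev = funder_expected_value(p_graduate=0.8, annual_income=60_000,
                           share=0.10, years=10, cost=40_000)
# ev is roughly 8,000 -- positive, so this student is worth funding;
# drop the graduation odds to 50% and it goes negative.
```

Competing funders would be forced to get each of these inputs right, which is exactly the information-revealing mechanism the comment describes.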
I love saying crazy things that I can support, and I thrive on the attention given to the iconoclast, so I find it impossible to answer this.
The only beliefs that I wouldn’t feel comfortable saying here are beliefs that I want to be true, want to argue for, but I know would get shredded. This is one reason I try to hang out with smart, argumentative people—so that my concern about being shredded in an argument forces me to more carefully evaluate my beliefs. (With less intelligent people, I could say false things and still win arguments).
I am skeptical that we can win without the Dark Arts.
There are lots of people out there with bad goals and wrong beliefs and powerful skills at persuading and manipulating people to take on those beliefs and help those goals. Like marketers and politicians. If we want resources for our goals, and to spread our beliefs, we need to learn the techniques of persuasion and memetics.
This isn’t a video game, the world doesn’t care about Light and Dark, and it isn’t set up so that the good guys can win. Those who employ the best techniques for achieving their goals are more likely to achieve their goals. In a world where good people refuse to learn how to persuade others and gain power, the world will be ruled by bad people. That’s how it is now, and I’m sick of it.
I’m Gray and proud of it. Shades of gray matter—a lot—but White is for losers.
Play poker for significant amounts of money. While it only tests limited and specific areas of rationality, and of course requires some significant domain-specific knowledge, poker is an excellent rationality test. The main difficulty of playing the game well, once one understands the basic strategy, is in how amazingly well it evokes and then punishes our irrational natures. Difficulties updating (believing the improbable when new information comes in), loss aversion, takeover by the limbic system (anger / jealousy / revenge / etc), lots of aspects that it tests.
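As one concrete example of the arithmetic poker punishes you for fudging (a sketch with my own numbers, not from the comment): a call is only profitable when your probability of winning exceeds the pot odds, and tilt and loss aversion push players to ignore exactly this comparison.

```python
# Pot odds: calling `call` chips to win a pot of `pot` is profitable
# only if your win probability exceeds call / (pot + call).

def call_is_profitable(pot, call, win_probability):
    """True if expected value of calling is positive given pot odds."""
    pot_odds = call / (pot + call)
    return win_probability > pot_odds

# Calling 20 into a 100 pot needs more than 20/120 ~= 16.7% equity:
assert call_is_profitable(pot=100, call=20, win_probability=0.25)
assert not call_is_profitable(pot=100, call=20, win_probability=0.10)
```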
This post is wonderful! The general category of “codified knowledge about best practices on how to do something important gained from doing it for hundreds of hours” is way underrepresented on LW. The density of practical experience makes it harder to write than finding a study or bias and musing about it, but it also makes it a lot more useful.
I look forward to helping replicate these practices in the Bay Area. Although achieving gender balance here is going to be a pretty significant challenge...
I worry that new year’s resolutions are a Schelling point for failed self-improvement that, by using a fundamentally flawed approach, tend to fail and then discourage people from future attempts at positive change.
Can we try to switch to the meme of “Annual retreat & reflect about one’s life, goals, and habits”, rather than these so frequently failed “resolutions”, whose very name implies that the solution is more “resolve”, and thus the problem is insufficient “resolve”, rather than insufficient experimentation, knowledge about habit formation, realism about achievable change, or any of the other numerous actual reasons?
I mean, it’s 2010, and we know we lose weight through hacks, not the application of more willpower—same goes for anything else.
3 - It had better be tightly focused on in-person practice and feedback, with a strong cultural element that says “do this or you are doing nothing”; otherwise it will just be wankery—“social skills porn” posts that people read and write without ever learning anything. You know, kind of like Less Wrong is rationalist porn :). While I’m sure that lots of people read PUA without practicing it, there is a strong cultural tradition that PUA is all about “the field” and you can’t get very far without going into “the field”. If you don’t have that, you are doomed.
I have a long post about this coming up, with a pretty similar viewpoint to yours, just a more general goal and similar technical requirements; we should talk.
I’m productive, and I’ve been paid > $100/hr for my work (at Google, before moving to the non-profit sector), and could have multiple offers to do that again in multiple fields anytime I wanted.
I loved parts of my work, sure, but there were also large parts that I had to forcibly direct my attention to. The tasks that make you most productive are rarely the most fun. And in a world of compelling entertainment, the latest blogs and books, TV, and the web are always fighting for people’s attention. Mine, at least. To direct my attention to productive activities, to my consciously chosen goals and the best tasks to achieve them, is hard Work.
Yes, there are moments of flow, moments we love, moments that draw our attention. And the more of those, the better we’ve chosen our work. But I think you have a huge selection bias—it may be that the most productive people are the ones who enjoy a coincidence between what they do and what draws their attention, but I doubt that very many jobs offer that overlap or that we can employ very many people that way. Hence, for most people, the way to be more productive is to get better at directing their attention.
As another angle, I completely love my current employment role—running an organization trying to build startup countries on the ocean. I love the mission I work on, I love the people I work with, I am one of those incredibly fortunate people who is doing what they love. But the tasks I need to accomplish each day to work towards my audacious and inspiring goal? Yawn. Bleh. I think that’s just because inspiring goals often require boring subgoals and tasks, not because I haven’t picked the right job.
Feed confirmatory evidence to others, give them tests to run which you know beforehand are confirmatory
This is not a way to take advantage of confirmation bias. Confirmation bias means that others look for confirming evidence for their true theories, and ignore disconfirming evidence. This process is not much affected by you adding extra confirmatory evidence—they can find plenty on their own. Instead, it is a way to fool rational people—for example, Bayesians who update based on evidence will update wrong if fed biased evidence. Which doesn’t really fit here.
The way to actually use confirmation bias to convince people of things is to present beliefs you want to transmit to them as evidence for things they already believe. Then confirmation bias will lead them to believe this new evidence without question, because they wish to believe it to confirm their existing beliefs.
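The biased-evidence point can be made numerically with a toy model (my own example, with made-up likelihoods): an honest Bayesian updater converges on a false hypothesis when a manipulator forwards only the confirming observations.

```python
# An honest Bayesian fed selectively filtered evidence updates "correctly"
# toward the wrong answer. Likelihoods below are invented for illustration.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H | E) via Bayes' rule."""
    num = p_e_given_h * prior
    return num / (num + p_e_given_not_h * (1 - prior))

belief = 0.5  # start undecided about a (secretly false) hypothesis H
# Each experiment confirms H 60% of the time even when H is false, but the
# manipulator forwards only the confirming runs, each a 0.9 : 0.6 likelihood:
for _ in range(10):
    belief = bayes_update(belief, p_e_given_h=0.9, p_e_given_not_h=0.6)

assert belief > 0.95  # rational updating, biased evidence, wrong conclusion
```

Exploiting confirmation bias, by contrast, needs no filtering at all: the target does the selective weighting themselves, which is the distinction the comment draws.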
At the risk of provoking defensiveness I will say that it really sounds like you are trying to rationalize your preferences as being rational when they aren’t.
I say this because the examples that you were giving (local food kitchen, public radio), when compared to truly efficient charities (save lives, improve health, foster local entrepreneurship), are nothing like “save 9 kids + some other benefits” vs. “save 10 kids and nothing else”. It’s more like “save 0.1 kids that you know are in your neighborhood” vs. “save 10 kids that you will never meet” (and that’s probably an overestimate on the local option). Your choice of a close number is suspicious because it is so wrong and so appealing (by justifying the giving that makes you happy).
The amount of happiness that you create through local first world charities is orders of magnitude less than third world charities. Therefore, if you are choosing local first world charities that help “malnourished” kids who are fabulously nourished by third world standards, we can infer that the weight you put on “saving the lives of children” (and with it, “maximizing human quality-adjusted life years”) is basically zero. Therefore, you are almost certainly buying warm fuzzies. That’s consumption, not charity. I’m all for consumption, I just don’t like people pretending that it’s charity so they can tick their mental “give to charity” box and move on.
But part of my point is that LW isn’t “focusing on rationality”, or rather, it is focusing on fun theoretical discussions of rationality rather than practical exercises that are hard work to implement but actually make you more rational. The self-help / life hacking / personal development community is actually better (in my opinion) at helping people become more rational than this site ostensibly devoted to rationality.
I’m disappointed at how few of these comments, particularly the highly-voted ones, are about proposed solutions, or at least proposed areas for research. My general concern about the LW community is that it seems much more interested in the fun of debating and analyzing biases, rather than the boring repetitive trial-and-error of correcting them.
Anna’s post lays out a particular piece of poor performance which is of core strategic value to pretty much everyone—how to identify and achieve your goals—and which, according to me and many people and authors, can be greatly improved through study and practice. So I’m very frustrated by all the comments about the fact that we’re just barely intelligent and debates about the intelligence of the general person. It’s like if Eliezer posted about the potential for AI to kill us all and people debated how they would choose to kill us instead of how to stop it from happening.
Sorry, folks, but compared to the self-help/self-development community, Less Wrong is currently UTTERLY LOSING at self-improvement and life optimization. Go spend an hour reading Merlin Mann’s site and you’ll learn way more instrumental rationality than you do here. Or take a GTD class, or read a top-rated time-management book on Amazon.
Talking about biases is fun, working on them is hard. Do Less Wrongers want to have fun, or become super-powerful and take over (or at least save) the world? So far, as far as I can tell, LW is much worse than the Quantified Self & time/attention-management communities (Merlin Mann, Zen Habits, GTD) at practical self-improvement. Which is why I don’t read it very often. When it becomes a rationality dojo instead of a club for people who like to geek out about biases, I’m in.