LW is to rationality as AIXI is to intelligence

Apparently LW does a great job of refining rationality and dissolving confusions. But is it helpful when it comes to anything apart from designing Friendly AI, apart from a purely academic treatment of rationality? I’m currently unable to benefit from what I have so far read on LW; it has actually made me even more unproductive, to the extent that I get nothing done anymore. Let me explain...

You have to know that I’m still in the process of acquiring a basic education. And when I say basic, I mean basic. Since I got almost no formal education, what I do know (or know about) is largely at a very low level, yet I am plagued by problems that require the intellect and education of the folks here on LW. The trouble is that I still lack most of the skills, tools, and requisite know-how, while the problems in question concern me all the same. This often causes me to get stuck: I can’t decide what to do. It also doesn’t help that I am the kind of person who is troubled by problems others probably don’t even think about.

An example from when I was much younger (around the age of 13): I was troubled by the fact that I could accidentally squash insects when walking over the grass in our garden. Since I have never been a prodigy, far from it, it was an unsolvable problem for me at the time, especially since I am unable to concentrate for very long and other similar problems accumulate in my mind all the time. So what happened? After a period of paralysis and distress, as so often happens, I simply became reluctant and unwilling, angry at the world. I decided that it is not my fault that the world is designed like that, or that I am not smart enough to solve the problem and do what is right. I finally managed to ignore it.

But this happens all the time, and the result is never satisfactory. The process too often ends in simply ignoring the problem or becoming unwilling to do anything at all. What I’m doing is not effective, it seems; it has already stolen years of my life in which I could have learnt mathematics or other important things, or done what I would have liked to do. You might wonder: shouldn’t this insight cause me to ignore subsequent problems and just learn something or do what I want to do, something that is more effective?
Nope. It is exactly the kind of mantra that LW teaches that always makes me think about the problem rather than ignoring it and trying to reach my goals: namely, that the low probability of a certain event might be outweighed by the possible positive or negative ‘utility’ the problem implies, especially once ethical considerations enter. What could happen if I just ignore it and pursue another goal instead?
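To make that mantra concrete, here is a minimal sketch of the expected-utility comparison that keeps running in my head. Every number below is invented purely for illustration, not a real estimate:

```python
# Hypothetical expected-utility comparison: ignore a low-probability
# problem vs. keep deliberating about it. All numbers are made up.

def expected_utility(outcomes):
    """Sum of probability * utility over (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Ignoring the problem: almost always fine, but a tiny chance of a
# catastrophic (moral) cost keeps the sum from being easy to wave away.
ignore = expected_utility([(0.999, 1.0), (0.001, -10_000.0)])

# Deliberating instead: a guaranteed small cost in lost time.
deliberate = expected_utility([(1.0, -5.0)])

# Under these invented numbers, deliberation "wins" — which is exactly
# why the low-probability, high-stakes term never lets me drop a problem.
print(ignore < deliberate)
```

The point is not the particular numbers but the structure: a tiny probability multiplied by a huge stake can dominate the whole calculation.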

It’s partly the choice that is killing me: do X, or do Y, or continue thinking about whether to do X or Y, or maybe search for some superior unknown unknown activity Z? For how long should I think about a decision, and for how long should I think about how long I should be thinking about it? Maybe the best analogy is browsing Wikipedia on a subject that is unknown to you and over your head, clicking the first link to a page that explains a term you don’t know, and repeating that process until you end up with 10 additional problems on an entry that is only vaguely relevant to the original problem you tried to solve. The problem is still there, and you have to decide whether to ignore it, pursue it further, or think about what to do.
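One way to cut the regress short, at least in principle, is to fix a deliberation budget in advance and commit to the best option found when the budget runs out. A hedged sketch of that idea, with toy options and a toy scoring function (both invented for illustration):

```python
# Sketch of bounded deliberation: evaluate at most `budget` options,
# then commit to the best one seen so far. Options and the scoring
# function are toys; the point is the pre-committed stopping rule.

def decide(options, score, budget):
    """Score at most `budget` options, then commit to the best so far."""
    best_option, best_score = None, float("-inf")
    for option in options[:budget]:
        s = score(option)
        if s > best_score:
            best_option, best_score = option, s
    return best_option

# With a budget of 2, the search for "activity Z" is simply never reached.
choice = decide(["do X", "do Y", "search for activity Z"], score=len, budget=2)
```

This doesn’t answer how long to deliberate about the budget itself, but at least the regress stops after one level instead of never.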

Recently a blood vessel in my eye burst. Nothing to worry about, but I searched for it and subsequently became worried whether something like that could happen in my brain too. It turned out that about 6 out of 100 people are predisposed to such brain aneurysms, especially people with high blood pressure. Now, I might have somewhat abnormal blood pressure, and additional activity might make some blood vessel in my brain leak. Should I stop doing sports? Should I even stop thinking too much, since it increases the blood circulation in my brain (I have noticed that I hear my blood flow when thinking too hard)? But how can I decide that without thinking? So I looked up how to check whether I am predisposed, and it turned out that all the tests are too risky. But maybe it would be rational to stop doing anything that could increase my blood pressure until less risky tests exist? And so I lost a few more days without accomplishing anything I wanted to accomplish.
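To make the worry concrete: even taking the 6-in-100 predisposition figure at face value, what actually matters is the probability of a rupture, which requires a further conditional probability I don’t know. A back-of-the-envelope sketch, with that conditional probability invented purely for illustration:

```python
# Back-of-the-envelope base-rate check. The 0.06 predisposition figure
# is the one I found; the annual rupture probability given a
# predisposition is invented purely for illustration.

p_predisposed = 0.06          # ~6 in 100 people (figure from above)
p_rupture_given_pre = 0.01    # hypothetical annual rupture chance if predisposed
p_rupture_given_not = 0.0     # assume (unrealistically) zero otherwise

# Total probability: weight each conditional by how likely it applies.
p_rupture = (p_predisposed * p_rupture_given_pre
             + (1 - p_predisposed) * p_rupture_given_not)

# 0.06 * 0.01 = 0.0006, i.e. roughly 6 in 10,000 per year
# under these made-up numbers.
```

Of course, this is exactly the kind of calculation I can’t bring myself to trust, because the made-up conditional probability could be off by orders of magnitude in either direction.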

How I feel about LW

LW makes me aware of various problems and tells me how important it is to do this or that, but it doesn’t provide the tools to choose my instrumental goals. Thanks to LW I learnt about Solomonoff induction. Great... fascinating! But wait, I also learnt that there is a slight problem: “the only problem with Solomonoff induction is that it is incomputable.” Phew, thanks for wasting my time! See what I mean? I’m not saying that there is something wrong with what LW is doing, but people like me are missing some mid-level decision procedures for approaching all the implications. I wish LW would also teach usable rationality skills by showing how rationality applies to, and dissolves, real-life problems, with the decision procedures broken down step by step.

Take for example some of the top-scoring posts. I intuitively understood them, agreed, and upvoted them. My initial reaction was something along the lines of “wow, great, those people think like me but are able to write down everything I thought to be true.” Yes, great, but that doesn’t help me. I’m not a politician who is going to create a new policy for dealing with diseases. And even if I were, that post would be completely useless, because it is utopian and not implementable. The same could be said about most other posts: awesome, but almost completely useless when it comes to living your life. ‘Confidence levels inside and outside an argument’ was a really enlightening post, but it only made me even more uncertain. If there is often no reason to assume very low probabilities, then I’m still left with the very high risks of various possibilities, except that in some cases they have suddenly become much more likely.

The problem with LW is that it tells me about those low-probability, high-risk events, but I don’t know enough, and don’t trust myself enough, to overpower my gut feeling and my urge to do other things. I’d like to learn math etc., but maybe I should just work as a baker or road builder to earn money and donate it to the SIAI? Maybe I should read the Sequences to become certain enough to persuade myself? But maybe I should first learn some math to be able to read the Sequences? But maybe I don’t need that, and would waste too much time learning math when I could be earning money? And how do I know what math is important without reading the Sequences? And what about what I really want to do, intuitively; should I just ignore that?