Exactly, it’s tricky. I don’t know if anyone else will find this funny, but here’s a conversation I had recently:
Me: “Alright, I think I’ve decided to make myself even more horribly afraid of the consequences of flinching away from examining lost purposes and not thinking things through from first principles.”
Other: “Um um um um, so I’m not sure that’s a good idea...”
Me: “Why? See, it’s possible that it will destroy my motivation system, but the best thinkers I know by far all seem to have this tendency. My only comparative advantage at this point is in thinking well. Therefore...”
Other: “You bastard.”
Could you say that in English?
Probably not, but I’ll try to restate the message and motivation:
“I notice that wanting to do something is psychologically very different from aversion to not doing something. I have observed that attraction to saving far-mode people and the like, if taken very seriously, is often the result of the latter. I observe and assert that the type of mind that does this is a disproportionately important mind to influence with “rationalist” or SingInst memes. This is the type of mind that truly groks Eliezer’s aversion to lost purposes. I theorize that this type of mind is sometimes formed by being around an abundance of double binds, though I am unwilling to put forth evidence strongly favoring this hypothesis. I think it is important to make a good impression on that type of mind and to avoid negatively reinforcing the anti-anti-virtuous behaviors associated with it, especially as it is a type of mind that is generally oversensitive to negative reinforcement and could become completely paralyzed. I notice that we specifically do not know how to create the skill of avoiding lost purposes, which also makes it important to avoid negatively influencing those who already happen to have the skill. I have created this post to further the agenda of setting up a culture that doesn’t repel, and perhaps even attracts, this type of mind.
As a related side note, I notice that the skill of avoiding lost purposes is very important and wish to express some distress that no apparent effort has been put into addressing the problem. I assert that most “aspiring rationalists” do not seem to even aspire to attain this fundamental skill of rationality, and thus cannot actually be aspiring to rationality, even if they aspire to aspire to what they think is rationality. I thus implicitly claim that I would be able to tell if they were averse to lost purposes, but am unwilling to give evidence of this. I choose to be deliberately misleading about my confidence in this judgment to provoke interesting people to reply in indignation.”
From a Singularity perspective, the importance of rationality evangelism is way overrated. There is still a tendency to mix up rationality and intelligence, as if becoming more rational will produce radically superior problem-solving skills. But if we’re talking about how to solve a problem like Friendly AI design, then what you need above all are people with high intelligence and relevant knowledge. “Aversion to lost purposes”, whatever that is, might be a trait of talented idealistic personalities who get distressed by dead hopes and organizational dysfunction, but some people learn early that that is normality, and their own progress is all the more streamlined for not fighting these facts of life.
In my opinion, the main source of the morale needed to sustain an effort like FAI research, in the midst of general indifference and incomprehension, is simply a sense among the protagonists that they are capable of solving the problem or of otherwise making a difference, and that derives in turn from a sense of one’s own abilities. If the objective is to solve the most difficult problems, and not just to improve the general quality of problem-solving in society, then rationality evangelism is a rather indiscriminate approach.
Agreed that rationality evangelism (edit:) might be overrated; the important thing is spreading the Friendliness-might-be-important memes far, and apparently SingInst is using “rationality” as one of its memetic weapons of choice. I’m not personally suggesting this memetic strategy is a well-thought-out one. “Aversion to lost purposes” doesn’t mean getting distressed because this world isn’t the should-world; it means the thing that Eliezer talks about in his post “Lost Purposes”.
How much effort has been put into teaching an aversion to lost purposes? What has been tried and what have the failures looked like?
Moreover, given what’s being said here, teaching an aversion may be the wrong tack. I suspect it’s more motivating to get strong, positive feedback when your efforts align with your goals. It’s hard to state the positive condition clearly; it’s far easier to point at instances of lost purposes and disapprove than to point at clear goal-oriented behavior and approve. It might be useful to learn, though.
We must thoroughly research this. :j
I recognize this mental state! I don’t know if that’s hilarious or terrifying. :/
This actually got me thinking, though… I’m working on a top-level comment now.