Idk, for me it’s a matter of instinct. The stuff written by Vassar, Benquo, and others like them feels like mindfuckery to me. Also, I met Vassar in person, and he came across as someone who’s very confident and very wrong at the same time.
If I remember correctly, the topic of disagreement was healthcare systems. I said I liked the Swiss one. Vassar replied that it couldn’t possibly work well, because the only working healthcare system in the world was MetaMed, his startup at the time, which later failed. A roomful of rationalists (quite high caliber; we were at a MIRI workshop) nodded along to him. I think Eliezer was in the room too but noncommittal? Anyway, I was almost in disbelief about this, left the party pretty soon, and it must’ve played a role in my drifting away from the rationalist scene overall.
I’ve never actually lived there, but this is what I imagine the Bay Area in general is like, not just rationalists.
(I live in the Bay Area, and this really seems extremely far from a representative experience of what it’s like to live here. There are definitely many weird and often interesting contrarians, but ‘seeing a room nod along with a crazy statement’ is really not what I’ve observed happening here. Vigorous debate and disagreement are quite common.)
I do have the impression that the Vassarites are underestimating how expensive information is, and overestimating how much is known.
I think this makes them mistaken about some things, both on the object level (plausibly as in your case) and on the meta level, and that it is an obstacle to some of their communication.
Maybe if I got involved with them, I could better convince them of the cost of information.
However, I think they also have very different standards for what counts as acceptable performance: standards that seem achievable in principle, given improvements in some of the areas the Vassarites point to, but that are rarely achieved otherwise.
Update: In a Twitter chatroom, I was trying to ask the Vassarites what complaints they have about rationalism and EA, so I could write them up in a list and interrogate rationalists/EAs about them, etc.
We got about halfway through making the list, and then Michael Vassar got frustrated and decided to leave, reasoning that giving me their complaints requires too much effort/attention, and that I should already have enough information to notice that “EA is in general fraudulent if taken literally and is an attempt by non-literal language to prevent literal language if taken non-literally”.
… I am not active in EA, so I am not sure where I would get the information from, to be honest.
Edit: Update 2: He might have changed his mind. We will see.
Ok NOW he left.