Beware Evaporative Cooling of Group Beliefs.
I am for the policy, though with a heavy heart. I feel that one of the pillars of rationality is that there should be no Stop Signs, and this policy might produce some. On the other hand, I think PR is important, and that we must be aware of the evaporative cooling that might happen if it is not applied.
On a neutral note: we aren’t enemies here. We all have very similar utility functions, with slightly different weights on certain terminal values (such as PR), which is understandable, since some of us have more or less to lose from LW’s PR.
To convince Eliezer, you must show him a model of the world in which the policy causes ill effects he finds worse than its positive effects. If you just tell him “Your policy is flawed due to ambiguity in its description” or “You have, in the past, said things that are inconsistent with this policy”, I place low probability on him significantly changing his mind. You should take this as a sign that you are straw-manning Eliezer when you should be steel-manning him.
Also, how about some creative solutions? A special post tag, required on posts that condone hypothetical violence, that makes them visible only to registered users and displays a disclaimer above the post warning about its nature? That should mitigate 99% of the PR effect. Or your better, more creative idea. Go.
tl;dr: I went to the July minicamp, met interesting, ambitious people, and am still applying things I learnt at camp months later, to subjectively great effect. Also, the instructors and speakers were good, the food was good, and I had lots of fun.
I went to a previous CFAR camp, so I can offer evidence from personal experience about how helpful this might be. (I am not affiliated with CFAR.)
I signed up for the July minicamp not really knowing what to expect (and flew halfway around the world to get there). Having gone, I’m very happy that I did (although that might just be me rationalizing my choice). Here are some more objective things that happened (note: that camp had a different syllabus from the workshop):
I learned new skills I use every day. For example: curiosity (I often think about how to be more curious), value of information and micro-econ-related ideas (which I now use to convince my friends to do research before buying expensive things, and to gauge how much effort to put into negotiating with clients), habit formation, and how to have more rational discussions. It’s hard to assess whether they brought me a significant improvement (I don’t know how life would be without them), but for the more concrete ones (GTD, Anki) the results are immediate. I feel that the fuzzier ones are also helping me (measured, for example, by how much better I feel when using them than when not), but that’s harder to say for sure.
I met a diverse group of people. Some of them I would call ambitious, and most I would call interesting, but that’s a judgment call. They included managers of small companies, programmers, software freelancers, scientists from various fields, journalists, and a movie producer. I think I gained from talking to every single one of them, and I am still in touch with a couple of them.
I learned about x-risk. While I am very into the instrumental-rationality side of Less Wrong, I was never sold on the FAI and cryonics subjects, and at camp I got to talk to smart people with opinions different from mine and explore the subject. I am still not sold, btw.
I learned about subjects outside the curriculum, from neuroscience to professional poker, through casual discussions with other participants.
I had lots of fun. Hanging out with cool, smart, rationality-minded people is awesome.
From a personal perspective, the CFAR staff did a tremendous job. The speakers were well prepared, spoke well, and cared about their material. The camp instructors were constantly available for one-on-one conversations, and were knowledgeable and passionate about rationality in general and about improving me as a camp-goer specifically. The environment was clean and the food was good. (I think the workshop is at a different venue, so no guarantees on the food :) )
Again, all in all, I am very happy that I went and think it was a good investment. This doesn’t mean you should go, but if you are similar to me (analytic, loves instrumental rationality, software entrepreneur, doesn’t already know the subject matter, doesn’t hang out with lots of rationalists day to day, social), I would give better than 50% odds that you would be happy you did.