I would not worry about that, for three reasons: 1) I am very shy online. Even posting this took several days, and I did not look at the comments for almost a day afterward. 2) I am bringing this here first to see whether it is worth considering, and because I want input not only on the idea itself but on the idea of spreading it further. 3) I would never identify myself with MIRI, etc. — not because I would not want to be identified that way, but because I have absolutely not earned it. I also give everyone full permission to disavow me as a lone crackpot should that somehow become a problem. That said, thank you for raising this concern. I had already thought about it, which is one of the reasons I framed this as a tentative consideration for further deliberation by others; had I not, it could have been a problem. A lot of material in this area is genuinely sensitive and needs to be handled carefully, which is also why I was nervous even to post it.
All of that said, I would like to make another tentative proposal for further consideration: some of these ideas ARE worth getting out to more people. I have been involved in international NGO work for over a decade, studied it at university, and have lived and worked in half a dozen countries doing this work, yet I had no exposure to Effective Altruism, FHI, existential risk, etc. I move in policy/law/NGO circles, and none of my friends in those circles talk about these topics either. These ideas are simply not reaching the people who should be exposed to them. I found EA/MIRI/existential risk through the simulation argument, which I read about on a blog I found via reddit while clicking around the internet about a year ago. That is kind of messed up. I really wish I had stumbled onto it earlier, and I tentatively think there is a lot of value in making it easier for others to stumble onto it in the future — especially policy/law types, who are going to be needed at some point in the near future anyway.
I also feel that the cost of being seen as having "weird ideas" should be weighed against the benefit of flying the flag for like-minded people to see. For the most part, people disliking you is not much worse than people not knowing about you, while gaining allies and fellow travelers adds real value: more minds attacking difficult problems from more angles, more policymakers listening when it is time to make proposals, and more money finding its way to MIRI/FHI/etc. It might be worth trying to make existential risk a more widely known concern, somewhat like climate change. This would not necessarily water down LW: those interested in the LW approach would still come here, while those from other backgrounds, especially less technical ones, could find lateral groups. Climate change works this way now — there are core scientists, scientists who dabble, and a large group of activists, policy people, and regulators with little interest in the science, doing their own thing laterally to the core researchers.