Existential Risks
More specifically: topics other than Friendly AI; groups other than SIAI and FHI that are working on projects to reduce specific x-risks that might strike before anyone has a chance to create an FAI; and cost/benefit analysis of donating to these projects instead of, or in addition to, SIAI and FHI.
I thought the recent post on How to Save the World was awesome, and I would like to see more like it. Each of the points from that post could be expanded into a post of its own.
Is LW big enough to support sub-groups of people interested in specific topics? Maybe with sub-reddits, or a sub-wiki? Regular IRC/Skype/whatever chat meetings? I haven't thought through the details of how this would work. Does anyone else have ideas about this?