There seem to be more and more people who think sacrificing their lives to help build FAI is an ethical imperative.
I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?
It seems like the only people doing something are those who would be doing it anyway. I mean, I don’t think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.
Are there people who’d rather play games all day but sacrifice their lives to solve friendly AI?
If developing AGI were an unequivocally good thing, as Eliezer used to think, then I guess he’d be happily developing AGI instead of trying to raise the rationality waterline. I don’t know what Luke would do if there were no existential risks, but I don’t think his current administrative work is very exciting for him. Here’s a list of people who want to save the world and are already changing their lives accordingly. There have also been many LW posts by people who want to choose careers that maximize the probability of saving the world. Judge the proportion of empty talk however you want, but I think there are quite a few fanatics.
Indeed, Eliezer once told me that he was a lot more gung-ho about saving the world when he thought it just meant building AGI as quickly as possible.
I don’t think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.
I think at one point Eliezer said that, if not for AGI/FAI/singularity stuff, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks he realized that he had to change his life completely.
I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?
I sacrificed some very important relationships and the life that could have gone along with them so I could move to California, and the only reason I really care about humans in the first place is because of those relationships, so...
This is the use of metaness: for liberation—not less of love but expanding of love beyond local optima.

— Nick Tarleton’s twist on T.S. Eliot
Due to comparative advantage, not changing much is actually a relatively good, straightforward strategy: just farm and redirect money.
As an example of these Altruistic Ones, user Rain has been mentioned, so they are out there. May they all be praised!
Factor in time and demographics. A lot of LWers are young people still looking for ways to make money; they are not able to spend much yet, and haven’t had much impact yet. Time will have to show whether they stay true to their goals, or whether they are tempted to go down the vicious path of ever-growing investments into status.
I’m too irreparably lazy to actually change my life, but my charitable donations are definitely affected by believing in FAI.