Having been born and brought up in Germany, where religion is almost a non-topic, I always read science fiction and fantasy and always felt inclined toward rational decisions. In my 12th/13th year of school I had an exceptionally good philosophy teacher and found utilitarianism logical on some level. Some years later (somewhat before the time of this comment) I finally updated my mind and reached the rational conclusion:
I should choose the most efficient path to reduce suffering in the world. I saw only two options: getting really rich, or becoming a successful, uncorrupted politician. Since I was too lazy, felt unable to assign any relevant probability to reaching either goal, and did not want to overthrow my other ethical views (don't become a power-hungry politician or a bad capitalist), I did not pursue it further. Instead I started to study mechanical engineering: it pays, like, A LOT of money and will enable me to further suffering-reducing technology.
Then I stumbled upon Eliezer Yudkowsky's argument: the most bang for the buck (or euro) comes from accelerating FAI. A better writer than myself wrote:
"I felt my entire ethical system restructuring over the course of about five seconds—a very peculiar feeling, let me tell you." Then I lost focus again due to a completely broken motivation system and psychological problems; I am running on even faultier hardware than most. I am working on both topics now. Around nine months later I discovered HPMOR via TV Tropes' recommended fanfics, and voilà, here I am. Right now I am taking my time to decide whether I really assign FAI à la Yudkowsky any relevant probability or whether it's just a lazy excuse, and whether its cultish salvation-aspects should worry me or whether that's just some kind of bias. Updating one's mind is hard!