Edit: quote syntax anyone?
I feel like embracing humanity while actively striving to overcome the “outright obnoxious” parts, like biases, IS humanism. At least, more so than just adopting an “I love humanity unconditionally” attitude. I think Harry’s patronus, as Eliezer’s own would likely be, represents not just simple anthropocentrism, but the hope for a better future for humanity without losing those “my-values” that make us distinctively human.
Having a patronus that takes the shape of an intelligent life form with your values and none of the obnoxiousness is just an abstract representation of that hope for the future of humanity.
I think the underlying values are one and the same, and the difference in shape does not correspond to a difference in concept.
Hi, my name is Tyler and I’ve been lurking LW for the last few months. I’m a full-time university student in California. Like others, I’ve refrained from posting because I feel I’m not yet quite up-to-date on many of the issues discussed here, though I’d considered many of them before ever finding LW.
I found LW through Yudkowski.net, which I found through one of Eli’s more technical articles that popped up in a Google search when I was first becoming interested in Artificial Intelligence. Since then, I’ve developed an interest in the big R.
As I read the sequences (I’m nearly through, and I’ve been at it a while now), I am often pleasantly surprised when Eli brings up a topic that I’d previously considered, and even more so when he explains it. Overall, the zeitgeist of the LW community really appeals to me. I’m often frustrated listening to people I know say things that would get torn apart here on LW. I guess I’m just glad to know that there’s a community here from which I can learn tremendously, and to which I can hopefully contribute.
I’m working on filling in the holes right now, and the old adage “the more you know, the more you know you don’t know” is really having its way with me.