We got married in a small town near St. Catharines, Ontario, a few weeks after it became legal there.
Thanks for the charity links. I find practical and aesthetic value in the challenging aspect of “shut up and multiply” (http://lesswrong.com/lw/n3/circular_altruism/), particularly in the example you linked about purchasing charity efficiently. However, it seems to me that oversimplification can occur when we talk about human suffering.
(Please forgive me if the following rehashes something written earlier.) For example, treating a billion people suffering for one second each as equivalent to a billion consecutive seconds of suffering, and therefore as far worse than a million consecutive seconds (almost 12 straight days) suffered by one person, is just plainly, rationally wrong. One proof of that: distributing those million seconds as one-second bursts at regular intervals over a person’s life is better than the million consecutive seconds, because the person is not otherwise unduly hampered by the occasional one-second annoyances, but would probably become unable to function well in the consecutive case, and might be permanently injured (à la PTSD). My point is that something is missing from the equation, and that missing term lies at the heart of the human impulse to choose differently when the same choice is framed as comparative gain vs. comparative loss.
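As a quick sanity check on the arithmetic above (a minimal sketch; the variable names and framing are mine, not anything from the linked post):

```python
# Arithmetic behind the suffering comparison above.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# Naive "shut up and multiply" totals:
many_people_total = 1_000_000_000 * 1  # a billion people, one second each
one_person_total = 1_000_000           # one person, a million consecutive seconds

# By raw multiplication, the distributed case "wins" by three orders of magnitude:
print(many_people_total > one_person_total)  # True: 10^9 > 10^6

# But a million consecutive seconds for one person is:
print(one_person_total / SECONDS_PER_DAY)    # ~11.57, i.e. "almost 12 straight days"
```

The point of the example is only that the raw totals say nothing about consecutiveness, which is exactly the term the argument claims is missing.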
My name is Scott Starin. I toyed with the idea of using a pseudonym, but I decided that this site is related enough to my real world persona that I should be safe in claiming my LW persona.
I am a spacecraft dynamics and control expert working for NASA. I am a 35-year-old man married to another man, and we have a year-old daughter. I am an atheist, and in the past held animist and Christian beliefs. I would describe my ethics as rationally guided, with one instinctive impulse toward the basic Christian idea of valuing and respecting one’s neighbor, and another instinctive impulse to mistrust everyone and growl at anyone who looks like they might take my food. Understanding my own humanity and human biases seems a good path toward suppressing the instinctive impulses when they are inappropriate.
I came to this site from an unrelated blog that briefly said something like “Eliezer Yudkowsky is frighteningly intelligent” and linked to this site. So, I came to see for myself. I’ve read through a lot of the sequences. I really enjoyed the Three Worlds Collide story and forced my husband to read it. EY does seem to be intelligent, but I’m signing up because he and the rest of the community seem to shine brightest when new ideas are brought in. I have some ideas that I haven’t seen expressed, so I hope to contribute.
One area where I might contribute is from my professional interest in managing the catastrophic risk of spacecraft failure, which shares some concepts with the study of biases associated with existential risk to the human species. Yudkowsky’s book chapter on the topic was really helpful.
Another area is in the difference between religious belief and religious practice. The strong tendency to reject religious belief by members of the LW community may come at the expense of really understanding what powerful emotional, and yet rational, needs may be met by religious practice. This is probably a disservice to those religious readers you have who could benefit from enhanced conversation with LW atheists. Religious communities serve important needs in our society, such as charitable support for the poor or imprisoned and helping loved ones who are in real existential crisis (e.g. terminally ill or suicidal), etc. (Some communities may even produce benefits that outweigh the costs of whatever injury to truth and rationality they may do.) It struck me that a Friendly AI that doesn’t understand these needs may not be feasible, so I thought I should bring it up.
I hope readers will note my ample use of “may” and “might” here. I haven’t come to any firm conclusions, but I have good reasons for my thoughts. (I’ll have to prove that last claim, I know. As a good-faith opener, I do go to a church that has a lot of atheist members—not agnostics, true atheists, like me.) I confess the whole karma thing at this site causes me some anxiety, but I’ve decided to give it a try anyway. I hope we can talk.
(Since I’m identifying myself, I am required by law to say: Nothing I write on this site should be construed as speaking for my employer. I won’t put a disclaimer in every post—that could get annoying—only those where I might reasonably be thought to be speaking for or about my work at NASA.)