I wish this kind of stuff were taught to more children. Too few people fall in love with reality.
Michael_G.R.
The dust speck is described as “barely enough to make you notice”, so however many people it would happen to, it seems better than even something far less bad than 50 years of horrible torture. There are so many irritating things that a human barely notices in his/her life; what’s an extra dust speck?
I think I’d trade the dust specks for even a kick in the groin.
But hey, maybe I’m missing something here...
I’m currently reading Global Catastrophic Risks by Nick Bostrom and Cirkovic, and it’s pretty scary to think of how easily everything could go bad and how we could all end up living through very hard times indeed.
That kind of reading usually keeps me from having my soul sucked into this imagined great future...
“later came to reject (on a deliberate level) the idea that the Bible was not written by the hand of God
Don’t you mean “was written by...” here?
Here’s my theory on this particular AI-Box experiment:
First you explain to the gatekeeper the potential dangers of AIs. General stuff about how large mind design space is, and how it’s really easy to screw up and destroy the world with AI.
Then you try to convince him that the solution to that problem is building an AI very carefully, and that a theory of friendly AI is essential to increase our chances of a future we would find “nice” (and the stakes are so high that even increasing these chances a tiny bit is very valuable).
THEN
You explain to the gatekeeper that since this AI-Box experiment is public, it will be looked back on by all kinds of people involved in making AIs. If he lets the AI out of the box (without them knowing why), it will send them a very strong message that friendly AI theory must be taken seriously, because this very scenario could happen to them (not being able to keep the AI in a box) with an AI of their own that hasn’t been proven to stay friendly and that is more intelligent than Eliezer.
So here’s my theory. But then, I’ve only thought of it just now. Maybe if I made a desperate or extraordinary effort I’d come up with something more clever :)
Robin, could you explain your reasoning? I’m curious.
Humans get barely noticeable “dust speck equivalent” events so often in their lives that the number of people in Eliezer’s post is irrelevant; it’s simply not going to change their lives, even if it’s a gazillion lives, even with a number bigger than Eliezer’s. (Even considering the “butterfly effect”, you can’t say whether the dust speck is going to change them for the better or worse, but with 50 years of torture, you know it’s going to be for the worse.)
Subjectively for these people, it’s going to be lost in the static and probably won’t even be remembered a few seconds after the event. Torture won’t be lost in static, and it won’t be forgotten (if survived).
The alternative to torture is so mild and inconsequential, even if applied to a mind-boggling number of people, that it’s almost like asking: Would you rather torture that guy or not?
Yvain wrote: “The deal-breaker is that I really, really don’t want to live forever. I might enjoy living a thousand years, but not forever.”
I’m curious how you know that in advance. Isn’t it like a kid making a binding decision for its future self?
As Aubrey says, (I’m paraphrasing): “If I’m healthy today and enjoying my life, I’ll want to wake up tomorrow. And so on.” You live a very long time one day at a time.
“so you don’t throw up every time you remember what you did on your vacation.”
Oh man. If this AI thing doesn’t work out, maybe you can try comedy?
I read on some skeptics blog that Jim Carrey left $50 million to Jenny McCarthy. That sure could fund the SIAI for a while...
“What if the alternative was for the U.S. to firebomb and blockade Japan [...]”
That was probably another possibility, but certainly not the only alternative to nuking cities.
How about nuking somewhere very visible but not so populated, with the message: “We have more where that came from. Surrender, or the next one won’t be in a daisy field”?
Personally this year I’m thankful for the Earth’s molten interior:
http://michaelgr.com/2008/11/28/be-thankful-for-the-earths-molten-interior/
Definitely good advice on textbooks.
I’ve been slowly, sloooowly reading Molecular Biology of the Cell (5th ed., brand new) by Alberts, and Lehninger: Principles of Biochemistry (4th ed.). So far, I prefer the first one.
Until recently I was too intimidated to buy them because that’s far from what I studied, but now I regret it. I should have started sooner.
I’d definitely like to have that 500-page book in my library as a reference, and give the shorter popular book as a gift to friends (or my future kids?).
Only a small subset of the (relatively) small group of people who have read these blog posts as they were published will use them as a reference later. Almost nobody (relatively) will be rediscovering them in a few years. That’s simply the nature of blogging. Who’s reading three-year-old BoingBoing posts right now?
I’m currently reading “Gödel, Escher, Bach”, and from what I’ve read here, I think that Eliezer’s book could become something like that. Maybe not a Pulitzer winner (but who knows?), but certainly something special that changes the way people think.
“Jaron’s laughter seems largely the laughter of frustrated politesse. This comes out in his speech when he repeats to EY ‘I’ve been having this discussion for decades.’”
I think that’s BS. If Jaron didn’t want to discuss AI, then why agree to a Bloggingheads.tv episode with Eliezer, a research fellow at the Singularity Institute for Artificial Intelligence?
Eliezer tried to understand what Jaron was saying and asked him questions to get him to explain his positions better. Jaron pretty much never tried to make himself clear (probably because there wasn’t much to explain in the first place), and he never really explained what he didn’t like about Eliezer’s position.
How long he’s been having this conversation (“for decades” or whatever) only means that he’s been having it for a long time, not that he has convincing arguments or that there’s any value to what he says.
It seems like this post isn’t as clear as it could be—or at least not as clear as Eliezer’s best posts.
Either it needs another draft, or the problem lies with me and I just need to re-read it more carefully...
Keep using whatever examples and anecdotes you think best make your points, Eliezer. If that person doesn’t like what you write, he/she can just skip it.
“Most Americans of the time were unabashedly racist, had little concept of electricity and none of computing, had vaguely heard of automobiles, etc.”
So if you woke up in a strange world with technologies you don’t understand (at first) and mainstream values you disagree with (at first), you would rather commit suicide than try to learn about this new world and see if you can have a pleasant life in it?
Eliezer, could we get a status update on the books that will (I hope) come out of all this material you’ve been writing?
Is it still part of the grand plan, or did that change?
“Jack, I’ve spoken on many occasions previously but I was never in Toastmasters.”
If you’re planning to speak for money, Toastmasters might be a good investment. I would recommend at least checking it out to see what you could get out of it. Since you’re not a beginner, you should check out the ‘advanced’ clubs.
With public speaking, there’s nothing like experience. TM allows you to practice in a friendly environment where you can try new approaches (it doesn’t matter if they fail), and to benefit from the knowledge of a group of people who have been doing this for a while and should be able to give you more useful feedback than most other groups.
You can also use the club as a way to practice for media appearances (TV interviews, radio, etc.).
Small Typo Alert: The second quote should be attributed to “Mastering Eishin-Ryu Swordsmanship”
“Ryu”, not “Ruy”.
“That’s what Richard Dawkins understands that Michael Rose doesn’t—that Reason is not a game.”
Dawkins is also acutely aware that his opponents don’t always play fair, and have often quoted him and other scientists out of context to make it seem like they hold positions that they don’t actually hold. That’s why he wants to have a tape recorder running when he dies, so there can’t be rumors about a “deathbed conversion”.