(Emotional level note: Please raise your politeness level. I’ve been rude earlier, but escalating is a bad move even then; I’m de-escalating now. Your current tone is showing signs of treating debate as a conflict, and of trolling.)
“Basing your ethics”
Can you clarify that phrase? I can only parse it as “deriving your ethics from”, but ethical systems are derived from everyday observations like “Hey, it seems bad when people die”, followed by reasoning about them. The ethics then exist, and “intergalactic civilizations are desirable” follows from them.
Maybe you meant “designating those notions as the most desirable things”? They are consequences of the ethical system, yes, but “the thing you desire most is impossible”, while bad news, is no reason to change what you desire. (Which is why I called it sour grapes.)
“delusion”
You seem to be confusing “A positive Singularity is desirable” (valuing lives, ethical systems) with “A positive Singularity is likely” (pattern-matching with sci-fi).
“science fiction delusion”
You are invoking the absurdity heuristic: “Intergalactic civilizations and singularities pattern-match science fiction, rather than newspapers.” That isn’t bad if you need a three-second judgement, but it is quite fallible (e.g., relativity, interracial marriage, atheism). It would be better to engage with the meat of the argument (why smarter-than-human intelligence is possible in principle, why AIs either go flat or FOOM, why the stakes are high, why a supercritical AI is likely in practice (I don’t actually know that one)), pinpoint something in particular, and say “That can’t possibly be right”, backing it up with a model, a set of historical observations, or a gut feeling.
[religious vocabulary]
It’s common knowledge on LW that both the rationality thing (LW) and the AI thing (SIAI) are at unusually high risk of becoming cultish. If you can point to a particular problem, please do so; but reasoning by analogy (“They believe weird things, so do religions, therefore they’re like a religion”) proves little. (You know what else contained carbon? HITLER!)
“you have a holy mission to save the universe”
Are we talking feasibility, or desirability?
If feasibility, again, please point out specific problems. Or alternate ways to save the universe (incrementally, maybe, with charity for the poorest like VillageReach, or specialized research like SENS). Or more urgent risks to address (“civilization crumbles”). Or reasons why all avenues for big change are closed, and actions that might possibly slightly increase the probability of improving the world a little.
If desirability, well, yeah. People are dying. I need to stop that. Sure, it’s hubris and reaching above myself, sure I’m going to waste a lot of money on the equivalent of alchemy and then do it again on the next promising project (and maybe get outright scammed at some point), sure after all that I’m going to fail anyway, but, you know, Amy is dead and that shouldn’t happen to anyone else.
Right, so why feed him?
Because honest debaters can think they’re matching each other’s politeness level and go from “Hey, you have a bug there” to “Choke on a bucket of cocks”. If AlphaOmega refuses to de-escalate, or if ey still looks like a troll when polite, I’ll shrug and walk away.
“Yo momma’s a cultist” is worthless, but be wary of ignoring all dissenters—evaporative cooling happens. (OTOH, Usenet.)
Edit: Aaand yup, ey’s an ass. Oh well, that’ll teach me a lesson.
Trolls serve an important function in the memetic ecology. We are the antibodies against outbreaks of ideological insanity and terminal groupthink. I’ve developed an entire philosophy of trolling, and am obligated to engage in it as a kind of personal jihad.
According to the web site linked in your profile, you are attempting to actively poison the memetic ecology by automated means. I’m not sure how to answer that, given that the whole site goes far over the top with comic book villainy, except to say that this particular brand of satire is probably dangerous to your mental health.
That site is obsolete. I create new sites every few months to reflect my current coordinates within the Multiverse of ideas. I am in the process of launching new “Multiversalism” memes which you can find at seanstrange.blogspot.com
There is no Universal truth system. In the language of cardinal numbers, Nihilism = 0, Universalism = 1, and Multiversalism = infinity.