Not everyone who punishes by killing intends to affect the behavior of the punished, which in this case would mean canceling all future behaviors. Some want to punish by killing because they feel it is the proper response to the crime, even if the punisher somehow knows that the one they are erasing would never behave that way again. Such people see certain behaviors as a permanent stain on a person’s life record, and they believe the only correct punishment is to end that life.
PeteG
>A punishment is when one agent (the punisher) imposes costs on another (the punished) in order to affect the punished’s behavior.
If a person punishes another by ending the other’s life, this is not done to affect the other’s behavior.
>And I certainly see a very great difference between humanity continuing forever, versus humanity continuing to Graham’s Number and then halting.
You can’t use “humanity” and “Graham’s Number” in the same sentence.
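For a sense of the scale mismatch behind that quip (a standard definition, added here for reference): Graham’s number $G$ is built from Knuth’s up-arrow notation,

$$g_1 = 3\uparrow\uparrow\uparrow\uparrow 3, \qquad g_n = 3\uparrow^{g_{n-1}} 3, \qquad G = g_{64},$$

where $3\uparrow^{k}3$ means iterated exponentiation with $k$ arrows. Already $g_1$ dwarfs any physically meaningful count of years, people, or atoms, which is the point of the joke.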
Oh! Oh! Oh! Oh! Oh! U shitheads think you are doing something to me with your insect downvotes? Oh! Oh! Oh! Oh! Oh! The only thing U did is prove to me that U all eat SHIT! Not a single one of U can leave a sensible comment here. The next person who downvotes me is gonna get their ass beat by a kangaroo!
To all of you who downvote my post and my comments: EAT SHIT AND DIE!
Another retarded person downvoted my post and my comment. Is this post too much for you idiots to handle, or what? You would rather read and upvote some garbage about football?
Whoever downvoted my post: you are a stupid sack of shit that did not understand anything I wrote.
Layers Of Mind
The AI tells me that I believe something with 100% certainty, but I can’t for the life of me figure out what it is. I ask it to explain, and I get: “ksjdflasj7543897502ijweofjoishjfoiow02u5”.
I don’t know if I’d believe this, but it would definitely be the strangest and scariest thing to hear.
Rationality is winning that doesn’t generate a surprise; randomly winning the lottery generates a surprise. A good measure of rationality is the complexity of the method required to win, together with how little surprise the win generates. If winning at a certain task requires a method with many complex steps, and the win comes as no surprise, then the method used was a very rational one.
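One hypothetical way to cash this out (my own gloss, not a standard measure): score a method by its complexity divided by the surprise of the win, with surprise measured as negative log probability under the winner’s own model,

$$R = \frac{C(\text{method})}{1 + S(\text{win})}, \qquad S(\text{win}) = -\log_2 P(\text{win}),$$

so a complex method that makes the win nearly certain ($S \to 0$) scores high, while a lottery win ($P$ tiny, $S$ enormous) scores near zero no matter what method was used.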
“You should have spent much more of your time in this debate convincing your tangled friend that, if she were to abandon her religious belief (or belief in belief, or whatever), she would still be able to feel good about herself and good about life; that life would still be a happy meaningful place to be.”
I don’t think Eliezer cared so much about correcting someone’s single wrong belief as about correcting the core that makes many such beliefs persist. Would he really have helped her if all his rational arguments had failed but his emotional one had succeeded? My guess is that it wouldn’t have been a win for him or for her.
‘How to Bind Yourself To Reality’ is the number one thing people should GET. But my guess is that this one might not be teachable.
Most frequent would have to be my avoidance of settling for cached thoughts. I notice, revise, and completely discard conclusions much more regularly and effectively when I recognize that a conclusion was generated the instant the question was asked.
The Wrong Question sequence was amazing: one of the very unintuitive sequences that greatly improved my categorization methods, especially through the ‘Disguised Queries’ post.
Like I said, some people would punish by killing not to affect the behavior of the punished (neither to deter nor to incapacitate), but because they would see it as the morally right thing to do, given the crime.
Zack, you are mistaken in highlighting Nick’s sentence as “hitting the mark”.