Holy crap, why wasn’t I invited to this? I’m only a short train ride away!
Bayeslisk
Just curious: has anyone explored the idea of utility functions as vectors, and then extended this to the idea of a normalized utility function dot product? Because having thought about it for a long while, and remembering after reading a few things today, I’m utterly convinced that the happiness of some people ought to count negatively.
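The "normalized utility function dot product" idea can be made concrete with a minimal sketch: represent each agent's utility function as a vector of weights over some shared set of outcomes, and use cosine similarity as a rough compatibility measure. The agent names and numbers below are purely illustrative assumptions, not anything from the thread.

```python
import math

def cosine_similarity(u, v):
    """Normalized dot product of two utility vectors; ranges from -1 to 1."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Three hypothetical agents scoring the same three outcomes:
alice = [1.0, 0.5, -0.2]
bob   = [0.9, 0.4, -0.1]   # broadly compatible with alice
carol = [-1.0, -0.5, 0.2]  # alice's utilities, exactly negated

print(cosine_similarity(alice, bob))    # close to +1: compatible values
print(cosine_similarity(alice, carol))  # -1: perfectly anti-aligned
```

On this toy measure, a similarity near -1 is exactly the "counts negatively" case: the more such an agent gets what they want, the less you get what you want.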
I’m efficient, you have a least effort bias, he’s just lazy.
Some people ought to have pain inflicted on them until their utility functions become sensible in the face of the threat of more pain from the same source for the same reason. This need not take the form of limitless pain: the marginal utility curve could easily fall off really fast. Not having to deal with such people will make lots of people very happy, and in the long run will make them happy as well. See: sociopaths and, ostensibly, this guy.
To figure out how much you care about other people being happy as defined by how much they want similar or compatible things to you, in a reasonably well-defined mathematical framework.
This still doesn’t change the fact that hearing about Mr. Rich Misogynist here enjoying a 7-figure trust fund, mistreating women, and generally being happy at the expense of others makes me generally unhappy, indicating a negative term for his happiness in my utility function.
You and another comment here are making me reevaluate my categories for why I weight something negatively. Let me get back to you after I’ve had a chance to think about it more.
EDIT: For purposes of clarity, I’m going to respond to your post as well as this one there.
I hadn’t been aware that those five things were so badly tangled up for me. This and another comment here are making me reevaluate my categories for why something should be weighted negatively for me. Let me get back to you when I’ve had a chance to think a little.
Yeah, but Wedding Planner 1's deep vitriolic moral hatred of the lemon chiffon cake that delights Wedding Planner 2, who abused her as a young girl, or Wedding Planner 2's thunderous personal objection to the enslavement of his family that went into making the cocoa for the devil's food cake that Wedding Planner 1 adores, could easily make them refuse to share said delicious white cake with raspberry buttercream, to the point where either would very happily destroy it to prevent the other from getting any. This seems suboptimal, though.
OK. Having had a chance to think about it, I think I have a reasonable idea of why it is I desire any of those things in some situations. I thought it over with three examples: first, the person I linked to. Second, an ex of mine, with whom I parted on really bad terms. Third, a hypothetical sociopath who would like nothing more than for me to suffer infinitely, as a unique terminal value.
*Wishing that person X would behave otherwise. My desire for this seems self-evident: when people do things I disapprove of, I desire that they stop. The odd thing is that in all three cases, I would award them points just for stopping: the stopping just removes disutility already there, and can't go above 0.
*Being glad if person X suffers. I definitely wouldn't be happy if they just suffered for no reason. I would still feel a little bad for them if someone ran over their cat. That said, types of suffering you could classify as "poetic" in some sense appeal to me very much: said "banker bro" getting swindled and catching Space AIDS (or even being forcibly transitioned into a woman!), or, as is seeming increasingly likely, said ex's current relationship ending as badly as it seems to be. My brain locks up and crashes when presented with the third case, though. I think I'd just be happy for them to suffer regardless.
*Believing that making person X suffer will cause them to behave otherwise. On balance, I'm not sure that it would make a difference in any of the three cases. Case 1 is too self-assured, and the other two just don't care about me.
*The world will be a better place if person X behaves otherwise. Case 1 could actually be this. He might actually achieve success, and then screw up, at best, several people's lives. Case 2 is too small-scale. Case 3, I actually can't justify this at all: the only people who will care are people who want to see me happy.
*The world will be a better place if person X suffers. I don't delude myself that this is ever true, except very indirectly.
In the interest of full disclosure, I'm half-Korean, and for reasons of familial history, feel rather strongly about the whole Japan thing. That doesn't stop me from enjoying tasty age tofu or losing my shit laughing whenever I watch Gaki no Tsukai, and indeed seeking out both. But I do have something of a stake of pride in seeing people who deny war crimes, particularly these, suffer similarly to the above. Political opponents are similar: I wouldn't derive satisfaction from Rick Santorum breaking his leg, but I'd be very happy to learn that he's a closeted gay man whose wife will have to have an abortion.
I made one when I was bored, long ago when my grandmother still ran her store and my uncle still ran his immigration law firm on the third floor, and when I was obsessed with knot theory, out of computer paper, tape, and a lot of hard pencil. I still use it, and it cost me next to nothing.
EDIT: If requested (however unlikely) I will happily deliver a picture, and either a push or a bouillon cube (your choice). EDIT THE SECOND: it was requested! http://imgur.com/a/kxanI
As much as I love Banks, this sounds like a massive set of applause lights, complete with sparkling Catherine wheels. Sometimes, you have to do shitty things to improve the world, and sometimes the shitty things are really shitty, because we’re not smart enough to find a better option fast enough to avoid the awful things resulting from not improving at all. “The perfect must not be the enemy of the good” and so on.
It’s good as an exhortation to build a Schelling fence, but without that sentiment, it’s pretty hollow. Reading the context, though, I agree with you: it’s a reminder that feeling really sure about something and being willing to sacrifice a lot of yourself and other (possibly unwilling) people to create a putative utopia probably means you’re wrong.
“Sorrow be damned, and all your plans. Fuck the faithful, fuck the committed, the dedicated, the true believers; fuck all the sure and certain people prepared to maim and kill whoever got in their way; fuck every cause that ended in murder and a child screaming. She turned and ran...”
(As an aside, I now have the perfect line for if I ever become an evil mastermind and someone quotes that at me: “But you see, murder and children screaming is only the beginning!”)
Well, no. It’s against the promise of how many utilons you can pile up on the other arm of the scale, which may well not pay off at all. I’m reminded of a post here at some point whose gist was: "if your model tells you that the odds against your being wrong are 3^^^3:1, it is more likely that your model is wrong than that you are right."
It’s not a matter of “the plan might go wrong”, it’s a matter of “the plan might be wrong”, and the universal part comes from “no, really, yours too, because you aren’t remotely special.”
Well, that’s nice in principle, and easy to think, but how do you actually go about convincing yourself to consistently feel it? If you have an answer, I sincerely want to know it, because this summer (doing the first lab work of my life) I’ve become acquainted with feeling like an absolute fraud, despite reasonable success and complete inexperience.
I know of it. I was trying to avoid the term because it feels wrong, feeds the wrong instincts, and also rationalist taboo. I'm also reminding myself that feeling dumb means you're learning, and feeling really dumb means you're learning a lot. Just ignoring it won't help, I don't think, and other mental issues do suggest that I'm really, really badly calibrated.
As it turns out, I consider myself worthy only when I'm better than someone, which sometimes takes the form of being able to help others, exert control over situations, or solve problems myself. This tends to spiral into (self-)loathing when I read about some fictional people; Lazarus Long is a good example. At the moment, mental issues prevent me from consistently feeling worthy just for existing.
Done! Do you want a bouillon cube or a push? Think wisely.
This is intriguing and a little scary. I can’t wait for the posts.