IMO, even E is problematic: where did the torture-information come from in the first place?
I do not feel up to defending myself against multiple relatively hostile people. My apologies for having a belief that does not correspond to the prevailing LW memeplex. Kindly leave me alone to be wrong.
Yes, that was what I was getting at. Like I said elsewhere: game theory is not evil. It’s just horrifyingly neutral. I am not using “inhuman” to mean “bad”; I am using “inhuman” to mean “unfriendly”.
I have no idea how the Holodomor is germane to this discussion.
That is not true at all. I was actually planning on abandoning this trainwreck of an attempt at dissent. But since you’re so nice:
http://en.wikipedia.org/wiki/RAND_Corporation
http://en.wikipedia.org/wiki/Thomas_Schelling#The_Strategy_of_Conflict_.281960.29
OK, I think I was misunderstood, and I was also tired and phrased things poorly. Game theory itself is not a bad thing; it is somewhat like a knife, or a nuke. It has no intrinsic morality, but the things it tends to be used for, for several reasons, wind up ejecting negative externalities like crazy.
Yes, but this seems to be most egregious when you advocate letting millions of people starve because the precious Market might be upset.
I guess I’m mostly reacting to RAND and its ilk, having read the article about Schelling’s book (which I intend to buy), and am thinking of market failures, as well.
Nonhuman agents use X → X does not necessarily, and quite likely does not, preserve human values → your overuse of X will cause you not to preserve human values. By “being a jerk in the style of Cthulhu” I mean being a jerk incidentally. Eyesight is not a means of interacting with people, and maximization is not a bad thing if you maximize for the right things, which game theory does not necessarily do.
I have a strong desire to practice speaking in Lojban, and I imagine that this is the second-best place to ask. Any takers?
Alternatively, aerosolized agonium, for adequate values of sufficiently long-lived and finely-tuned agonium.
Observation: game theory is not uniquely human, and does not inherently cater to important human values.
Immediate consequence: game theory, taken to extremes already found in human history, is inhuman.
Immediate consequence the second: Austrian school economics, in its reliance on allowing markets to come to equilibrium on their own, is inhuman.
Conjecture: if you attempt to optimize by taking your own use of game theory and similar arts to similar extremes, you will become a monster of a similar type.
Observation: refusing to use game theory in your considerations results in a strictly worse life than otherwise, and possibly using it more often, more intensely, and with less puny human mercy will result in a better life for you alone.
Conjecture: this really, really looks like the scary and horrifying spawn of a Red Queen race, defecting on the Prisoner’s Dilemma, and being a jerk in the style of Cthulhu.
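To make the “defecting on the Prisoner’s Dilemma” part concrete, here is a minimal Python sketch with purely hypothetical payoff numbers, showing why defection dominates in the one-shot game even though mutual defection leaves everyone worse off:

```python
# One-shot Prisoner's Dilemma with hypothetical payoffs.
# Key: (my_move, their_move) -> my payoff.
PAYOFF = {
    ("cooperate", "cooperate"): 3,
    ("cooperate", "defect"):    0,
    ("defect",    "cooperate"): 5,
    ("defect",    "defect"):    1,
}

def best_response(their_move):
    """Return the move that maximizes my payoff against a fixed opponent move."""
    return max(["cooperate", "defect"], key=lambda my: PAYOFF[(my, their_move)])

# Defection is the best response no matter what the other player does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual defection (1, 1) is worse for both than mutual cooperation (3, 3).
```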
Thoughts?
Continue laying siege to me; I’m done here.
This seems interesting in the horrifying way I have been considering excising from myself, due to the prevalence of hostile metastrategic bashes: that is, people find out you are the kind of person who flat-out welcomes game theory making a monster of em, and then refuse to deal with you; good day, enjoy being a sociopath, and without the charm, to boot.
YOU COUNT TWELVE.
I am mildly deathly allergic to pretty much all nuts.
EBWOP: On further reflection I find that since most of Thingspace instantaneously destroys the universe,
EV(U(spoopy creppy black box)) >>>
EV(U(object from Thingspace)).
However, what I was trying to get at was that
EV(U(spoopy creppy black box)) <=
EV(U(representative object from-class: chance-based deal boxes with “normal” outcomes)) <=
EV(U(representative object from-class: chance-based deal boxes with Thingspace-like outcomes)) <=
EV(U(representative object from-class: chance-based deal boxes with terrifyingly creatively imaginable outcomes))
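For concreteness, here is a toy Python sketch of what I mean by EV(U(·)) above; the box names and every probability/utility number are made up purely for illustration:

```python
# Toy expected-utility comparison between hypothetical "deal boxes",
# each modeled as a list of (probability, utility) outcomes.
def expected_utility(outcomes):
    """EV(U(box)) for a box given as [(probability, utility), ...]."""
    return sum(p * u for p, u in outcomes)

normal_box     = [(0.5, 10), (0.5, -10)]          # mundane upsides and downsides
thingspace_box = [(0.01, 100), (0.99, -10**6)]    # mostly universe-destroying draws
black_box      = [(0.4, 50), (0.6, -10**9)]       # terrifyingly creative downsides

for name, box in [("normal", normal_box),
                  ("thingspace", thingspace_box),
                  ("black box", black_box)]:
    print(name, expected_utility(box))
```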
I think that avatar-of-you-in-this-presented-scenario does not remotely have avatar-of-me-in-this-scenario’s best interests at heart, yes.
We should do more of these.
The high-energy exotic plasma not from this universe does not love or hate you. Your universe is simply a false vacuum with respect to its home universe’s, which it accidentally collapses.
“Interesting” tends to mean “whatever it would be, it does that more” in the context of possibly pseudo-Faustian bargains and signals of probable deceit. From what I know, I do not start with reason to trust you, and the evidence found in the OP suggests that I should update the probability that you are concealing information (information which, if I updated on it, would lead me not to use the black box) to “much higher”.
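To make “update to much higher” concrete, here is a toy Bayes-rule calculation; all of the probabilities are entirely made-up numbers:

```python
# Toy Bayes update: how much a suspiciously sweet offer should raise my
# probability that the offerer is concealing something.
prior_concealing = 0.3            # P(concealing) before seeing the offer
p_offer_given_concealing = 0.9    # P(this kind of offer | concealing)
p_offer_given_honest = 0.2        # P(this kind of offer | honest)

evidence = (p_offer_given_concealing * prior_concealing
            + p_offer_given_honest * (1 - prior_concealing))
posterior_concealing = p_offer_given_concealing * prior_concealing / evidence
print(posterior_concealing)  # ~0.66: "much higher" than the 0.3 prior
```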
Something’s brewing in my brain lately, and I don’t know what. I know that it centers around:
-People were probably born during the Crimean War/US Civil War/the Boxer Rebellion who then died of a heart attack in a skyscraper/a passenger plane crash/being caught up in, say, WWII.
-Accurate descriptions of people from a decade or two ago tend to seem tasteless (casual homophobia). Accurate descriptions of people from several decades ago seem awful and bizarre (hitting your wife, blatant racism). Accurate descriptions of people from centuries ago seem alien in their flat-out implausible awfulness (royalty shitting on the floor at Versailles, the Albigensian Crusade, etc.).
-We seem no less shocked now by social changes and technological developments, and no less convinced that everything major under the sun has been done and only tweaks and refinements remain, than people of past eras did.
I guess what I’m saying is that the Singularity seems a lot better supported by the evidence than it otherwise might have been, but we won’t realize we’re going through it until it’s well underway, because our perception of such things will also wind up speeding up for most of it.