Would you trade those base emotions for a paperclip?
No. I intend to revive one. Possibly all four, if necessary. Consider it thawing technology so advanced it can revive even the pyronics crowd.
That doesn’t help maximize paperclips, though. If you make all decisions based on two criteria—paperclip count and emotions—then the only situation in which those decisions differ from what you would have decided based solely on paperclip count is one in which you choose an outcome with fewer paperclips but a better emotional result.
If you were to refuse my offer, you would not only be losing a paperclip now, but also increasing the likelihood that in the future, you will decide to sacrifice paperclips for emotion’s sake. Perhaps you will one day build a paperclip-creator that creates one paperclip per second, and I will threaten to destroy a paperclip unless you shut it down. If you care too much about the threatened paperclip you might comply, and then where would you be? Sitting in an empty room where paperclips should have been.
Well, I would have done some research and gotten a warm fuzzy feeling out of expanding my knowledge, but if you’re going to displace that motivation with only a chance at a measly $10 I guess it’s not worth my time.
On further consideration of the Moody fight—as soon as Harry walked into the office, shouldn’t he have seen all his invisible copies as well? From Ch. 56:
Bellatrix was still transparent within the Cloak, but to Harry she was no longer hidden, he knew that she was there, as obvious to him as a Thestral. For Harry had only loaned his Cloak, not given it; and he had comprehended and mastered the Deathly Hallow that had been passed down through the Potter line.
How is WrongBot going to learn to think and write more skillfully by moving to a place that’s collectively worse at doing so?
We can tweak the experiment a bit to clarify this. Suppose the coin is flipped before she goes to sleep, but the result is hidden. If she’s interviewed immediately, she has no reason to answer other than 1⁄2 - at this point it’s just “flip a fair coin and estimate P(heads)”. What information does she get the next time she’s asked that would cause her to update her estimate? She’s woken up, yes, but she already knew that would happen before going under and still answered 1⁄2. With no new information she should still guess 1⁄2 when woken up.
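A quick simulation of the setup (a sketch, not a settled answer): counted per coin flip, heads comes up half the time, which is the 1/2 defended above; counted per awakening instead, the standard thirder answer of 1/3 falls out, so much of the disagreement is about which thing is being counted.

```python
import random

random.seed(0)
trials = 100_000

heads_flips = 0        # experiments where the coin landed heads
heads_awakenings = 0   # awakenings that occur in a heads-world
total_awakenings = 0

for _ in range(trials):
    heads = random.random() < 0.5
    # Heads: Beauty is woken once; tails: she is woken twice.
    wakes = 1 if heads else 2
    heads_flips += heads
    heads_awakenings += wakes if heads else 0
    total_awakenings += wakes

print(heads_flips / trials)                 # per-experiment frequency, ~0.5
print(heads_awakenings / total_awakenings)  # per-awakening frequency, ~0.33
```

The first number matches the comment's "no new information, so still 1/2"; the second is what you get if each awakening is treated as a separate guess.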
So you’re replacing ‘lose a portion of magic’ with ‘risk being sent to Azkaban’? It changes the cost of binding but it certainly doesn’t remove it.
Makes sense. Six possible one-directional (A loves B, B loves A, etc.) relationships that can each be either present or absent, so 2^6 = 64. Each person has 3 graphs where they’re disconnected but the others are not (e.g. for C: A loves B, B loves A, or both), and there’s one graph with no connections at all. 64 − 3*3 − 1 = 54.
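A brute-force check of that count, treating each of the six directed "loves" relations as a bit (a quick sketch; the result should match the arithmetic above):

```python
from itertools import product

people = ["A", "B", "C"]
# The six possible one-directional relations among three people.
edges = [(a, b) for a in people for b in people if a != b]

count = 0
for bits in product([0, 1], repeat=6):  # 2^6 = 64 possible graphs
    present = [e for e, bit in zip(edges, bits) if bit]
    # A person is "connected" if they appear in at least one relation,
    # in either direction.
    involved = {p for e in present for p in e}
    if all(p in involved for p in people):
        count += 1

print(count)  # 54
```

Inclusion–exclusion gives the same thing: 3×4 graphs leave some one person out, but that triple-counts overlaps, which collapse to the single empty graph, so 64 − (12 − 3 + 1) = 54.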
I don’t. GL’s canon strategy provides a perfectly reasonable explanation for all his supposed feats, and I didn’t see anything in 86 to suggest this is going to be a major divergence. I expect it’ll go something like the other CoS reference—at some future point we’ll get a “gee, looks like he’s just a fraud, moving on” with a possible joke about him teaching Defense.
Guilt is an added cost to making decisions that benefit you at the expense of others. (Ideally, anyways.) It encourages people to cooperate to everyone’s benefit. Suppose we have a PD matrix with these payoffs:

(defect, cooperate) = (3, 0)
(defect, defect) = (1, 1)
(cooperate, cooperate) = (2, 2)
(cooperate, defect) = (0, 3)

Normally we say that ‘defect’ is the dominant strategy since regardless of the other person’s decision, your ‘defect’ payoff is 1 higher than your ‘cooperate’ payoff.
Now suppose you (both) feel guilty about betrayal to the tune of 2 units:

(defect, cooperate) = (1, 0)
(defect, defect) = (−1, −1)
(cooperate, cooperate) = (2, 2)
(cooperate, defect) = (0, 1)
The situation is reversed: ‘cooperate’ is now the dominant strategy. Total payoff in this situation is 4. Total payoff in the guiltless case is 2, since both will defect. In the OP’s $10-button example the total payoff is −$90, so people as a group lose out if anyone pushes the button. Guilt discourages you from pushing the button, and society is better for it.
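The dominance flip can be checked mechanically by applying a flat guilt penalty to every outcome where you defect (a sketch using the payoff numbers above):

```python
# Base prisoner's dilemma payoffs: payoffs[(my_move, their_move)] = my payoff.
base = {
    ("defect", "cooperate"): 3,
    ("defect", "defect"): 1,
    ("cooperate", "cooperate"): 2,
    ("cooperate", "defect"): 0,
}

def with_guilt(payoffs, guilt):
    """Subtract a flat guilt cost from every outcome where I defect."""
    return {(me, them): p - (guilt if me == "defect" else 0)
            for (me, them), p in payoffs.items()}

def dominant(payoffs):
    """Return the strictly dominant move, if one exists."""
    for move, other in (("defect", "cooperate"), ("cooperate", "defect")):
        if all(payoffs[(move, them)] > payoffs[(other, them)]
               for them in ("cooperate", "defect")):
            return move
    return None

print(dominant(base))                 # defect
print(dominant(with_guilt(base, 2)))  # cooperate
```

Any guilt penalty strictly greater than 1 (the gap between the defect and cooperate payoffs) produces the same reversal.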
So, random possible goals… make invincible, intelligent horcrux. Provoke Harry to attempted murder.
I am somewhat suspicious of announcing one’s own vulnerability as opposed to just killing her in the half-second it takes for AK.
It should already be pretty high though—Harry even points it out at the time (Rule 1 of Unforgivable Curse Safety), and Quirrell explains it away by conflating etiquette rules with safety rules. That might just as easily have ended with “I just shot Bahry in the face”, considering how fast the spell must be going—probably <100 ms to recognize that he can’t dodge in time and push him away.
You might want to check out TVTropes.
There’s a difference between learning a skill and learning a skill while remaining human. You need to decide which you want.
Learning how to transplant a kidney is much easier when you have a few dozen people to experiment on. (I think that was the idea, anyways...)
I’m not using it as an example of why they’re good. I’m offering it as an example because it’s relevant to the topic.
Adding a cost to circumventing the law makes you less likely to do so, though. If you keep hiring people who are decidedly suboptimal because you have to rely on a lousy proxy for whatever characteristic you actually want, you might give up on it.
I get that you would rather, given that you’re going to be rejected for your age/skin color/gender/etc., be told why. But if you want to reduce the use of those criteria, then banning them will at least stop the people who care only a small amount (i.e., not enough to bother getting around the ban).
Then we can suggest that they’re temporarily dead, but they’re still dead, so it’s a “grave”. Religions have been saying that death is temporary for thousands of years anyways, it wouldn’t be anything new.
Well, Jack doesn’t want any thinking at all, so I’m not sure if that’s better or worse than fuzziness.
The pinnacle of cryonics technology would be a time machine that can, at the very least, take a snapshot of someone before they died and reconstitute them in the future. I have three living grandparents, and I intend to have four living grandparents when the last star in the Milky Way burns out. (50%)