“One thousand five hundred years ago, everybody knew the Earth was the center of the universe. Five hundred years ago, everybody knew that the Earth was flat… and fifteen minutes ago, you knew people were alone on this planet. Think about what you’ll know tomorrow.”—Agent K, “Men in Black”
Bindbreaker
Yes in all cases, but only if it's reversible.
I am asexual and thus have not experienced any of the romantic/sexual emotions. I feel as if doing so would almost certainly help my understanding of others, as well as broaden my emotional range. However, I seem to do quite fine without these emotions, and they seem to cause more problems than they are worth in many of the people around me. Therefore I would only take such pills if they were reversible, as my present state is quite happy and the alternative could certainly be worse.
Isn’t that exactly the sort of thing that this community is supposed to avoid doing, or at least recognize as undesirable and repress?
I’ve found this to be true as well. Calling someone a fool in casual conversation is bizarrely more insulting than calling them a damn fool, as everyone will understand that the latter is a joke but the former might be taken seriously.
It’s my impression that, regardless of whether or not you actually have status, acting like you do is probably undesirable, as it gets you thinking in the wrong patterns.
Why is this post so highly rated? As far as I can tell, the author is essentially saying that immortality will not happen in the future because it has not already happened. This seems obviously, overtly false.
- 21 Jan 2010 23:54 UTC; 6 points; comment on "Easy Predictor Tests"
I prefer this one, and yes, it really is that short.
Started reading the first one—from the prologue alone, Kellhus seems absurdly strong/skilled/fast. He reads people’s minds by looking at the patterns of their facial muscles, catches arrows out of the air, kills large groups of enemies by himself in hand-to-hand combat, etc. I’m not sure what lessons could really be derived from this, since these actions are far beyond the realm of normal human ability. Does the series/book get any better, or am I missing something here?
I don’t find that that’s necessarily correct. For example, this post of mine expressing skepticism about cryonics or this one questioning a highly rated post were both fairly highly rated. I think needless contrarianism gets downvoted, but reasonable arguments generally don’t, even if they advance unpopular cases.
It is the result of a net negative or zero vote on each. Independent of any action by other members, I know I’ve upvoted three of your posts—“I’ve already retracted the word ‘legitimate’ as being redundant...”, “I may have memory of always existing...”, and “Anyway, not to worry. We can still be sure of taxes.” I am not sure why you would doubt me on this.
Did you read any of the articles here or on Overcoming Bias before signing up?
This post was obviously a joke, but “we should kill this guy so as to avoid social awkwardness” is probably a bad sentiment, revival or no revival.
“My interest is in the future because I am going to spend the rest of my life there.”
Charles F. Kettering
Fifthed.
The user name “Alicorn” seems gender-indeterminate to me.
I’m pretty sure this would indicate that the AI is definitely not friendly.
Fake difficulty applies to multiplayer too. Anything that adds barriers to entry or needless clicks is fake difficulty. Games like Starcraft, where you sometimes end up fighting the interface instead of your opponent, have a lot of fake difficulty. If you’re going by That Other Site’s definition of fake difficulty, the #1 thing on the list is “Bad technical aspects make it difficult,” which certainly seems to apply!
For example, in Starcraft you have to micro all your workers to different mineral patches at the start of the game in order to get the most efficient economy possible. This is fake difficulty because games with real interfaces allow you to select all and click once, then the workers automatically fan out. Starcraft requires at least 8 (in practice usually 10) clicks in order to accomplish what other games do in 2. Further, some of the Starcraft community actually wants this “feature” to be preserved for Starcraft 2, as it “adds skill.” Fortunately, I don’t think Blizzard is going to acquiesce.
I suspect that short, concise posts and long, thought-out ones both get higher karma than ones that fall in between.
I’m in the “amassing resources” phase at present. Part of the reason I’m on this site is to try and find out what organizations are worth donating to.
I am in no way a hero. I’m just a guy who did the math, and at least part of my motivation is selfish anyway.
This might get me blasted off the face of the Internet, but by my (admittedly primitive) calculations, there is a >95% chance that I will live to see the end of the world as we know it, whether that be a positive or negative end. I do not see any reason to sign up for cryonics, as it will merely constitute a drain on my currently available resources with no tangible benefit. I am further unconvinced that cryonics is a legitimate industry. I am, of course, open to argument, but I really can’t see cryonics as something that would rationally inspire this sort of reaction.
In one of the discussions surrounding the AI-box experiments, you said that you would be unwilling to use a hypothetical fully general argument/”mind hack” to cause people to support SIAI. You’ve also repeatedly said that the friendly AI problem is a “save the world” level issue. Can you explain the first statement in more depth? It seems to me that if anything really falls into “win by any means necessary” mode, saving the world is it.