The lesswrong community never claimed to answer prayers.
Still, if people here knew how to cure cancer, most of them would.
So does true communism. Has there ever been such a democracy? How did it find out the utility for all its voters? Sounds to me like a “no true Scotsman” encompassing all known government systems. I think you should modify your statement to something like “A democracy should try to protect the interests of its minorities, as well as those of the majority.”
A gun could blow minds in any era.
I’m sorry, I couldn’t help myself.
Hello lesswrong community!
“Who am I?” I am a Network Engineer who once knew a bit of math (sadly, no longer). Male, around 30, works in IT, atheist—I think I’ll blend right in.
“How did I discover lesswrong?” Like the vast majority, I discovered lesswrong after reading HPMOR many years ago. It remains my favourite book to this day. HPMOR and the Sequences taught me a lot of new ideas and, more importantly, put what I already knew into a proper perspective. By the time HPMOR was finally finished, I was no longer sure where my worldview happened to coincide with Mr. Yudkowsky’s, and where it was shaped by him entirely. This might be due to me learning something new, or a mixture of wishful thinking, hindsight bias and the illusion of transparency, I don’t know. I know this—HPMOR nudged me from nihilism to the much rosier and downright cuddly worldview of optimistic nihilism, for which I will be (come on singularity, come on singularity!) eternally grateful.
“When did I become a rationalist?” I like to think of myself as rational in my day-to-day, but I would not describe myself as a rationalist—by the same logic that says a white belt doesn’t get to assume the title of master just for showing up. Or have I mixed those up, and “rational” is the far loftier description?
“Future plans?” I am now making a second flyby over the Sequences, this time with comments. I have a few ideas for posts that might be useful to someone and a 90% complete plotline for an HPMOR sequel (Eliezer, you magnificent bastard, did you have to tease a Prologue?!!!).
Looking forward to meeting some of you (or anyone, really) in the comments and may we all survive this planet together.
“Neither true nor false...” Not so. We gather such stories and treasure them. But at the end of the day, we label them fiction (or mythology, if some portion of humanity believed them to be true at some point) and know better than to go looking for Hogwarts. We know fiction does not correspond to reality, is not part of the map—in other words, not true. In every sense that matters, we treat fiction as false.
All that is good and proper—as long as such works don’t claim to describe factual events.
OG Darwin—harsh, but also unfair. Terribly, terribly unfair.
What Liliet B said. Low priors will screw with you even after a “definitive” experiment. You might also want to take a look at this: https://www.lesswrong.com/posts/XTXWPQSEgoMkAupKt/an-intuitive-explanation-of-bayes-s-theorem
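To make the low-prior point concrete, here is a minimal sketch of Bayes’ theorem applied to a “definitive” experiment (the numbers, a one-in-a-million prior and a 99.9%-accurate test, are illustrative assumptions, not taken from the linked post):

```python
def posterior(prior, true_pos, false_pos):
    """P(hypothesis | positive result) via Bayes' theorem."""
    # Total probability of seeing a positive result at all.
    evidence = true_pos * prior + false_pos * (1 - prior)
    return true_pos * prior / evidence

# A 99.9%-accurate experiment comes back positive for a
# one-in-a-million hypothesis...
p = posterior(prior=1e-6, true_pos=0.999, false_pos=0.001)
print(f"{p:.4f}")  # ~0.001: the hypothesis is still almost certainly false
```

Even after a strongly positive result, the posterior barely clears a tenth of a percent; the false positives from the 999,999 cases where the hypothesis is false swamp the one true positive.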
Backing up… everything. Deploying changes to test environment before deploying to production. Accepting Murphy’s Law unto yourself. Looking twice before crossing the street. Developing a blanket policy of general paranoia. Promoting a blanket policy of general paranoia. Developing alcoholism. Promoting alcoholism. Etc...
Edit: I forgot arguably the most important one: admitting you cannot reliably do better than the market by picking individual stocks (nobody can!) and buying market ETFs instead.
Time for some necroing. People who suffer from depression are trying to achieve levels of happiness that correspond with reality (maybe not with the express purpose of perceiving reality more clearly, but still...)
I can imagine a condition causing someone to experience excessive happiness—such a person could conceivably want to lower his level of happiness so he could grieve for the loss of a loved one.
Feelings should be rational—https://www.lesswrong.com/posts/SqF8cHjJv43mvJJzx/feeling-rational
As Carlin said in one of his routines, self-confidence (in relation to achievement) is like the fuel gauge in a car. Turns out, messing with it doesn’t actually let you go further (he claimed to base this assertion on studies, so I am sure it’s true). Happiness may be similar, serving better as a motivator than as a terminal value in itself.
In any case, I suspect most people here would not climb into a tub filled with orgasmium.
But if you want to mess with the gauge regardless, I know a stupid method that works: stand with your back straight, shoulders wide, head held high. Smile broadly (showing teeth). Hold this pose for 5 minutes (by an actual clock).
Thinking happy/funny thoughts is optional. Being grateful for the state of (at least relatively) good health and trying to enjoy each breath (you might have a finite amount, after all) are also optional.
With this method, I could be happy during my own funeral. And yet, I am not maintaining MAXIMUM HAPPINESS 24/7. Why? Turns out, constant happiness can be quite boring. Still, the method is not at all useless—sometimes the gauges actually need calibration and I do enjoy the option very much indeed. (And to think some people pay for drugs… What a waste.)
Necroing, because this seems to be a frequently expressed sentiment that I strongly oppose:
Be it due to poor education at a younger age, traumatic experiences, simply having the wrong education for the current playing field, being sacked at an older age, having no available finances to support further education, a lack of intelligence, or simply spiraling down the road of depression due to a lack of chances or being stuck in a debt one can never recover from in a lifetime… these are all scenarios in which the player of the Lotto actually rationally pays for the soothing dream of a better (financial) future. A future that will not happen unless one wins the lottery.
Was that a fully general excuse for stupid behaviour on behalf of poor people, or are you just happy to meet me?
I wonder, would this be used for gambling only, or could it also cover drug abuse, alcoholism, crime?
Spare me the pity party. Reality doesn’t care about sob stories—it doesn’t matter how much in debt you are, how unfortunate your circumstances and how great your need—you are not going to win the lottery. Shielding people from this fact is not doing them any favours.
Edited: the first sentence was needlessly rude and confrontational, also spelling.
They actually don’t. Glossing over all the details, anyone who bought bitcoin 13 years ago (and just left it alone) received a far better return than anyone buying into the proposed lottery would have. Results matter.
God showing up and granting all humans Wolverine’s healing factor would be evidence he exists. Providing a good explanation of why he permitted disease in the first place might convince me he is not as evil as described in the Bible.
Edit: Aliens playing god would still be far more likely, but the above scenario would be evidence in favour of the god hypothesis.
This was meant as a joke. Sorry if the intent is not obvious.
Thank you! You have no idea how happy your reply makes me! In an irrationally large part, because I’ve seen your name in a book, but I just cannot help myself. You are alive! (Duh!) More importantly, the lesswrong community is alive! (Double Duh!, but going through the Sequences’ comments can be a bit discouraging—like playing the first levels of an MMORPG while the experienced player base has moved on to level 50.) Hopefully, we’ll have many interesting discussions once I catch up. So much to look forward to! Will Alicorn be there? Will TheOtherDave explain what happened to the original Dave? You guys are legends.
P.S. Sorry for the delayed response, I didn’t notice the number next to the bell earlier. I’ll make sure to check it frequently from now on.
No worries :) and no reason to be sorry—the bell is quite obvious on PC, but my Android phone only shows it when scrolling. Probably an issue on my side.
If there are any true Relativists or Selfishes, we do not hear them—they remain silent, non-players.
A truly selfish person will still be concerned with PR. Concealing one’s selfishness might prove impossible, or at the very least, inconvenient. Far better to convince society it’s a virtue. Especially if one is already standing on a heap of utility and does not feel like sharing.
Yep. Young males have engaged in high risk/high reward behaviour for personal glory/the good of the tribe since the dawn of time. One of the socially accepted and encouraged outlets for this behaviour is called being a warrior.
Necroing, I couldn’t help myself:
An accurate estimate of anyone else’s psychology is a dubious benefit in strategic interactions that depend solely on being able to predict the actions of friend and foe.
An accurate estimate of anyone else’s psychology should improve your ability to predict their actions.
By parity of reasoning, the rational principle of seeking our own advantage allows us to use our enemies at our pleasure, and treat them as is most convenient for us. For our civic nature is defined by the constitution of our state; and to the extent that foreign subjects do not agree in nature with us, and their affects are different in nature from our affects, we would be ill served by extending our habitual notions of humanity, formed through intercourse with our compatriots, to anyone that does not partake of our social compact.
Beautiful. If I may misquote Thanos—“Using Christianity to destroy Christianity.” I am not sure Spinoza would agree with the notion that foreigners are non-human, though.
Leaving questions of theology and morality aside, the point of the above post is that thinking of your enemies as non-human will interfere with your ability to accurately model their motivations and predict/influence their future behaviour.
But compared to outright lies, either honesty or silence involves less exposure to recursively propagating risks you don’t know you’re taking.
Only if you value unblemished reputation over the short term gain provided by the lie. Fooling some of the people some of the time might be sufficient for an unscrupulous agent.
Just so. And a belief that leads to correct predictions will (generally) be more useful than a belief that doesn’t.
I think I see some confusion with the term “eviction” here. There is a difference between believing X exists (knowing about X) and believing X is true (believing X). So, “evicting X” should be understood as “no longer believing X”, rather than “erasing all knowledge of X” (which only happens involuntarily anyway).
I hope this was helpful, as this is my first comment, too. Anyway, I’ve lurked awhile and I don’t think anyone here would begrudge you raising an honest question.
P.S. Welcome to less wrong :) !!!
Edit: formatting.