What makes Less Wrong awesome? Its members.

People who believe that a small group is going to take over the universe to save it by making the seed of an artificial general intelligence that is undergoing explosive recursive self-improvement extrapolate the coherent volition of humanity, all while acausally trading with other superhuman intelligences across the multiverse.
Do you really need more awesomeness?! Don’t tell your doctor!
Well, Less Wrong is awesome if only for statements like this:
I bet there’s at least one up-arrow-sized hypergalactic civilization folded into a halting Turing machine with 15 states, or something like that. [...] It might perhaps be more limited than this in mere practice, if it’s just running on a laptop computer or something.

Surely that loses points for speculating about what we already know. A simple counter would produce a bitmap of this universe’s space-time matrix after a little while.
And that only scratches the surface! There are...

...hamsters wearing skirts materializing in midair.
...baby-eating aliens.
...muggers who use their powers from outside the Matrix.
...3^^^3 people getting dust specks in their eyes.
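For a sense of scale, 3^^^3 is Knuth's up-arrow notation: one arrow is exponentiation, and each extra arrow iterates the operation below it, so 3^^^3 = 3^^(3^^3). A minimal sketch of the notation (the function name and structure are my own illustration; actually evaluating 3^^^3 is hopeless, since the result dwarfs anything physically representable):

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow: a followed by n arrows, then b.
    n=1 is plain exponentiation; each extra arrow iterates
    the operation one level down."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1  # base case of the iteration
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
print(up_arrow(2, 3, 3))  # 2^^^3 = 2^^4 = 65536
```

Calling up_arrow(3, 3, 3) would exhaust both the recursion limit and the lifetime of the universe, which is rather the point of the dust-speck thought experiment.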
Once you grasp the full scope of Less Wrong, statements that would otherwise seem extraordinary begin to pale in comparison:
Whoever knowingly chooses to save one life, when they could have saved two—to say nothing of a thousand lives, or a world—they have damned themselves as thoroughly as any murderer.
What makes Less Wrong awesome is that it shows how the most extraordinary beliefs are actually held by atheist rationalists. Nowhere but here can you find similar ideas:
According to the article, the AGI was almost completed, and the main reason his effort failed was that the company ran out of money due to the bursting of the bubble. Together with the anthropic principle, this seems to imply that Ben is the person responsible for the stock market crash of 2000.
or
For example, you can convince everyone that quantum immortality works by killing them along with yourself. (This shouldn’t pose any risk if you’ve already convinced yourself :-) Paul Almond has proposed that this can solve the Fermi paradox: we don’t see alien civilizations because they have learned to solve complex computational problems by civilization-level quantum suicide, and thus disappeared from our view.