https://www.wattpad.com/myworks/263500574-singularity-soon
Flaglandbase
This makes me wonder why there is no religion that says that God is infinitely evil, but has to allow limited non-evil things to exist to maximize possible tortures in the future.
A micromort is one millionth of a fatality, so that works out to one fatality every 91 million miles.
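The arithmetic, treating the one-micromort-per-91-miles figure as the comment's assumption:

```python
# A micromort is a one-in-a-million chance of death,
# so it takes a million micromorts to add up to one expected fatality.
micromorts_per_fatality = 1_000_000

# Assumed figure from the comment: roughly one micromort per 91 miles driven.
miles_per_micromort = 91

miles_per_fatality = miles_per_micromort * micromorts_per_fatality
print(miles_per_fatality)  # 91000000, i.e. one fatality every 91 million miles
```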
Quite a few folks “believe” in a rapid AI timeline because it’s their only hope to escape a horrible fate. They may have some disease that’s turning their brain into mush from the inside out, and know there is exactly zero chance the doctors will figure something out within the next century. Only superhuman intelligence could save them. My impression is that technological progress is MUCH slower than most people realize.
Maybe the problem is figuring out how to realistically simulate a SINGLE neuron, which could then be extended 302 or 100,000,000,000 times. Also, due to their much shorter generation times, any given C. elegans has had something like 50 times more ancestral generations than any human, so evolution may have had time to make their neurons more complex.
Cryptocurrencies can make you less vulnerable to lawsuit extortion by making you appear poorer. The government will suck out your bank account before you know what happened.
This seems like it deserves many more upvotes and should be expanded upon.
Looks like all possible minds are always being generated in the complexity of all places.
What I’m about
Yes, but it would be nice to have a backup research effort to try to find a way to record the contents of a human brain “from the outside” without having to scan all the neurons, by inventing a series of brilliantly clever mind and memory tests. Like an extension of DARPA’s LifeLog project that was cancelled for useless “privacy” reasons in 2004. This would of course be an extreme longshot.
Not if it’s an incomplete or low-fidelity mind reconstruction, and that may be the only type possible with this method.
Almost all possible minds that think their existence is meaningful are really just chaotically delusional pattern combinations.
But the minds of which the most identical copies exist throughout reality may all have evolved in consistent universes.
Putin might want to get rid of anyone who opposes him by letting them emigrate. Even worse, he could follow Castro’s playbook from the Mariel Boatlift of 1980, and send all his most dangerous and violent criminals to the West.
Personal imitation software
I suspect there are infinitely many copies of each of our minds spread throughout the Omniverse (or certainly more than a hundred).
These minds have identical experiences, but may live under different laws of physics without knowing it. A lucky minority must live in universes where vacuum decay is impossible, including almost all of our distant descendants.
But it is worrying and unpleasant that we seem to live so close to the beginning of time rather than an endless utopia—almost as if that won’t happen at all. The only solution may be that young universes are somehow constantly being generated within older universes.
I’ve read about so many horrible scenarios here on LW alone that it seems to me the highest universal law should be that euthanasia should always be allowed. So I’m definitely not going to criticize the mouse whatever it does.
The first such programs would only predict a few common activities. Less common activities would require more specialized software to predict. Deep learning of this kind would require many specialized sub-programs working together in a hierarchy.
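A toy sketch of that hierarchy idea (all class and activity names here are hypothetical, not from any real system): cheap predictors handle common activities first, and rarer activities fall through to more specialized sub-programs.

```python
class ActivityPredictor:
    """Predicts the next activity from a lookup table, or defers (None)."""
    def __init__(self, known):
        self.known = known  # maps an activity to its predicted successor

    def predict(self, activity):
        return self.known.get(activity)

class Hierarchy:
    """Tries sub-predictors in order, from common activities to rare ones."""
    def __init__(self, predictors):
        self.predictors = predictors

    def predict(self, activity):
        for p in self.predictors:
            guess = p.predict(activity)
            if guess is not None:
                return guess
        return "unknown"  # no sub-program covered this activity

# Hypothetical example data: a common-activity model plus a rare-activity one.
common = ActivityPredictor({"wake": "coffee", "coffee": "email"})
rare = ActivityPredictor({"skydive": "celebrate"})
model = Hierarchy([common, rare])

print(model.predict("coffee"))   # email
print(model.predict("skydive"))  # celebrate
print(model.predict("mystery"))  # unknown
```

The point of the design is that each sub-program only needs to cover its own slice of behavior; the hierarchy supplies the fallback structure.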
J.K. Rowling could probably manipulate LessWrong as she sees fit by buying the site, shadowbanning all commenters, and posting new comments under their names (while preventing the real users from seeing these), where they slowly become convinced that witchcraft is real.
If AI is ever solved it will be trivially easy to provide minds with intensely deep meaning, even if they spend forever solving simple problems for higher-level minds.
“If you never open the transparent envelope then the opaque envelope will have always contained a debt of one pound”.
A debt from who to whom?