I personally regard this entire subject as a memetic hazard, and will rot13 accordingly.
Jung qbrf rirelbar guvax bs Bcra Vaqvivqhnyvfz, rkcynvarq ol Rqjneq Zvyyre nf gur pbaprcg juvpu cbfvgf:
… gung gurer vf bayl bar crefba va gur havirefr, lbh, naq rirelbar lbh frr nebhaq lbh vf ernyyl whfg lbh.
Gur pbaprcg vf rkcynvarq nf n pbagenfg sebz gur pbairagvbany ivrj bs Pybfrq Vaqvivqhnyvfz, va juvpu gurer ner znal crefbaf naq gur Ohqquvfg-yvxr ivrj bs Rzcgl Vaqvivqhnyvfz, va juvpu gurer ner ab crefbaf.
V nfxrq vs gurer jrer nal nethzragf sbe Bcra Vaqvivqhnyvfz, be whfg nethzragf ntnvafg Pybfrq naq Rzcgl Vaqvivqhnyvfz gung yrnir BV nf gur bayl nygreangvir. Vpbcb Irggbev rkcynvarq vg yvxr guvf:
PV pnaabg znantr fngvfsnpgbevyl gur “pbagvahvgl ceboyrz” (jung znxrf lbh gb pbagvahr gb erznva lbh va gvzr). Guvf vf jul va “Ernfba naq Crefbaf”, Qrerx Cnesvg cebcbfrq RV nf n fbyhgvba. Va “V Nz Lbh”, Qnavry Xbynx cebcbfrq BV, fubjvat gung grpuavpnyyl gurl ner rdhvinyrag. Fb pubbfvat orgjrra RV naq BV frrzf gb or n znggre bs crefbany gnfgr. Znlor gurve qvssreraprf zvtug or erqhprq gb n grezvabybtl ceboyrz. Bgurejvfr, V pbafvqre BV zber fgebat orpnhfr vg pna rkcynva jung V pnyyrq “gur vaqvivqhny rkvfgragvny ceboyrz” [Jung jr zrna jura jr nfx bhefryirf “Pbhyq V unir arire rkvfgrq?”]
Gur rrevrfg cneg nobhg gur Snprobbx tebhc “V Nz Lbh: Qvfphffvbaf va Bcra Vaqvivqhnyvfz” vf gung gur crbcyr va gung tebhc gerng gur pbaprcg bs gurer orvat bayl bar crefba gur fnzr jnl gung Puevfgvnaf gerng gur pbaprcg bs n Tbq gung jvyy qnza gurve ybirq barf gb Uryy sbe abg oryvrivat va Uvz. Vg’f nf vs ab bar va gur tebhc ernyvmrf gur frpbaq yriry vzcyvpngvbaf bs gurer abg orvat nalbar ryfr, be znlor gurl qba’g rira pner.
I forgot to mention this, but I also tried my hand at writing an essay about this sort of thing: finding the physical manifestation of consciousness. If I could vouch for its rigor, I’d have posted it to the Facebook group already, but alas, I can’t. It may still be of some use here.
Identifying the physical manifestation of consciousness.
Identifying the final place where physical cause and mental effect meet has been one of neuroscience’s top questions, and, as many of us know, it is known as the “Hard Problem”. I’d like to try my hand at making a set of rules for the development of a procedure that would pry out the location of that “final destination”. The procedure works by elimination, ruling out as many intermediaries between consciousness and cause as possible until no intermediary remains. At that point, it must be concluded that the cause in question is consciousness itself. The principles outlined identify the characteristics of an intermediary, so that it may be cut out. A cause is merely an intermediary if it violates any one of these principles:
Instantaneous Change: A change to this physical thing must create an immediate change in mental state. For example, if the heart were our soul, shooting a person in the heart shouldn’t leave even a millisecond of perception, or people with heart disease should also develop psychiatric symptoms, not attributable to stress, in the course of their illness.
Predictable Change: If a small change in physical state produces a small change in mental state, then increasing the magnitude of that same change should increase the corresponding mental state without producing any surprises. If increasing that physical change begins to produce the effects of a smaller but different physical change, then there is still an intermediary between physical and mental. For example, SSRIs lift certain kinds of depression, but continued usage can “burn out” serotonin receptors, which means that chemicals like SSRIs cannot possibly be considered “units of consciousness”.
Unique Change/Repeatability: A change in the state of this physical thing must create a mental state that is unique to that physical change. In graphing terms, a value x cannot map to more than one value of y. If there is more than one possible y value, or multiple x’s can create the same y, then there is still an intermediary between physical and mental. For example, continuing from above, one could start to wonder whether “receptors” are the “units of consciousness” and work from there by asking whether it is possible to reproduce a mental state using something other than neurotransmitter receptors. If this is possible, then the unique-change clause is violated by having multiple x’s mapping onto the same y, which implies that there is an intermediary between neurotransmitter receptors and mental states.
Suppose an LED and its switch are the same thing. To demonstrate this, we put the system through the three principles to see how it behaves. Failing any one of these tests indicates that we need to go deeper.
For the Instantaneous Change principle, we can just grab a hypothetical Planck-time high-speed camera. If the state changes of the light and the switch are perfectly in sync with each other even at Planck-time resolution, then they are the same object. This is not the case, as even the femtosecond camera demonstrated in a TED Talk could show.
The Predictable Change principle is inapplicable, because there are only two possible states, on/off for the switch, and their two directly correlated states, on/off for the light, so we move on. We can’t very well add a third state for the switch and expect any kind of change.
Unique Change can be tested by looking at the switch. It sits between a power source and the LED. The method of Alexander the Great would have us cut the switch out of the circuit and see what happens when we pull the wires together. Do the wires, which have the two states connected/unconnected, correlate directly with the LED’s states of on/off? If so, then the switch was not the LED, for the states of the LED are not permanently changed.
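The cut-out test just described can be sketched as a toy program. This is purely a hypothetical illustration: the circuit model and the function `led_state` are invented for the sketch, not a claim about real electronics.

```python
# Toy model of the switch-elimination test. The circuit model is invented
# purely for illustration; it is not a claim about real electronics.

def led_state(wires_connected: bool, switch_on: bool, switch_present: bool) -> bool:
    """The LED lights exactly when current can flow from the power source."""
    if switch_present:
        return wires_connected and switch_on
    # Alexander's method: the switch is cut out, so the wires alone decide.
    return wires_connected

# With the switch removed, the LED still correlates directly with the
# connected/unconnected state of the wires ...
for connected in (False, True):
    assert led_state(connected, switch_on=False, switch_present=False) == connected

# ... so multiple x's (switch present or absent) map onto the same y
# (LED on/off), and by the Unique Change principle the switch was only
# an intermediary, not the LED itself.
print("switch eliminated: the LED tracks the wires alone")
```

The same skeleton works for any proposed "unit of consciousness": bypass the candidate and check whether the downstream state still tracks the upstream cause.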
I think it is true. Self-awareness is not hardware (wetware, whatever-ware) dependent. Just upload yourself and everything would be just fine. You’ll be in two places at the same time, but with no communication between your instances, the old one and the new one.
The same situation holds here, only you have more than one natural-born upload. Many billions, in fact.
Naturalism leads to this (frightening) conclusion.
Doesn’t that black box the process of uploading?
I am not sure what you mean by this blackboxing.
But to think that the process of consciousness will work inside a computer, yet not inside some other human skull, is naive.
It should work either in both places or nowhere.
People respond to this with “My memories are crucial, they are my unique identifier!”. Well, you can forget pretty much everything and you will feel the same way. Besides, at every moment that you are self-aware, you are remembering different little pieces of everything; it doesn’t matter what exactly. It might be a memory of a total solar eclipse; millions have almost the same short movie in their heads. Nothing unique here.
Consciousness is a funny algorithm, running everywhere. This is why you should care about the future and behave accordingly in the present.
Black boxing is when a complicated process is skipped over in reasoning. You supposed that mind uploading was possible for the sake of argument, to support a conclusion outside of the argument.
I see no reason why uploading would be impossible, just as I see no reason why interstellar travel would be impossible.
I have no idea how to actually do both, but that’s another matter.
If the naturalistic view is valid, it is difficult to see a reason why those two would be impossible. But if the Universe is a magic place, then of course. It’s possible that they are both impossible due to some spell of a witch, or something.
Still, I do assign a small probability to the possibility that consciousness is something not entirely computable and therefore not executable on some model of a Turing machine. But then again, I consider that probability quite negligible.
Does it matter what consciousness is made out of for mind uploading to be possible?
Of course. If some of us are right, consciousness is an algorithm running on a substrate able to compute it.
Then transplantation to another substrate is surely possible. How difficult this copying actually is, I wonder.
That is all assuming no magic is involved here. No spirituality, no soul, and no other holy crap.
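The computational core of this claim, one algorithm running on many substrates, can be sketched in a few lines. As an illustration only: a Collatz step stands in for "the algorithm", two deliberately different implementations stand in for substrates, and every name here is invented for the sketch.

```python
# "Same algorithm, different substrate": the rule is one thing, the
# machinery computing it is another. All names here are illustrative.

RULE = {"even": "halve", "odd": "triple_plus_one"}  # the Collatz step as data

def step_substrate_a(n: int) -> int:
    # Substrate A: direct arithmetic, no lookup.
    return n // 2 if n % 2 == 0 else 3 * n + 1

def step_substrate_b(n: int) -> int:
    # Substrate B: consults the rule table, then acts on it.
    action = RULE["even"] if n % 2 == 0 else RULE["odd"]
    return n // 2 if action == "halve" else 3 * n + 1

def trajectory(step, n: int, k: int = 10) -> list:
    """The first k states the algorithm passes through, starting from n."""
    out = []
    for _ in range(k):
        out.append(n)
        n = step(n)
    return out

# Physically different code paths, identical computed trajectory:
assert trajectory(step_substrate_a, 27) == trajectory(step_substrate_b, 27)
print(trajectory(step_substrate_a, 27))  # → [27, 82, 41, 124, 62, 31, 94, 47, 142, 71]
```

The point of the sketch is that the trajectory is a property of the rule, not of either implementation, which is the sense in which the comment says the algorithm could be "transplanted".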
But when we embrace the algorithmic nature of consciousness, intelligence, memories and so on, we lose the unique identifier so dear to most otherwise rational people. Their mantra goes “You only live once!” or “Everyone is a unique and unrepeatable person!”. Yes, sure. So when I was born, a signal traveled across the Universe to change it from a place where I could be born to a place where this possibility has expired for good? May I ask, is this signal faster than light? If it isn’t … well, it isn’t good enough.
I am just an algorithm, being computed here and there, before and now.
Wait, so that’s where the whole ‘YOLO’ thing/meme comes from? I notice that I am confused...
How does this square with chaos theory, which models behaviour that diverges greatly due to infinitesimal changes at the start?
What has it got to do with chaos theory?
Suppose you have two similar but extremely complicated systems that put compound pendulums to shame, each with different starting conditions. Would the state of the first system ever be identical to any state that has occurred, or will occur, in the second system?
No, with extremely high probability.
How does that relate to whatever Thomas was saying? For that matter, what is Thomas saying?
Are you sure?
That’s a really cool proof, but phase space can be exponentially large, especially for an “extremely complicated” system. It also requires finite bounds on system parameters.
For that to break my “extremely high probability”, there would have to be relatively few orbits in the phase space approaching a space-filling set of curves, which is itself extremely unlikely, unless you can think up some pathological example.
It does weaken my statement, though.
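The exchange above can be checked numerically on a small scale. As a sketch only: the logistic map stands in for an "extremely complicated" system, and the parameter `r`, the tolerance, and the step count are arbitrary choices.

```python
# Two chaotic systems with slightly different starting conditions: do their
# states ever (numerically) coincide? The logistic map is a stand-in here.

def logistic(x: float, r: float = 3.9) -> float:
    # A standard chaotic map on [0, 1] for r near 4.
    return r * x * (1.0 - x)

def orbit(x0: float, steps: int = 2000) -> list:
    xs, x = [], x0
    for _ in range(steps):
        xs.append(x)
        x = logistic(x)
    return xs

a = orbit(0.2)
b = orbit(0.2000001)  # starting conditions differ in the seventh decimal

# Sensitive dependence: the trajectories separate by order one anyway.
print("max separation:", max(abs(x - y) for x, y in zip(a, b)))

# Count near-coincidences of any state of one system with any state of the other.
matches = sum(1 for x in a for y in b if abs(x - y) < 1e-9)
print("near-coincidences within 1e-9:", matches)
```

With this toy, exact recurrence of one system in the other's history is what the "extremely high probability" claim predicts against; the caveat about large phase spaces and finite parameter bounds applies to it just as stated above.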
He suggested that it was possible for a person to be repeated, mental state and all, given enough time. I thought to conceptualize people’s minds as extremely complicated systems with chaotic interactions, in order to ask whether his belief could be true.
How does the identity of a single person square with it? Wouldn’t a tiny change convert me into somebody else?
At no point has one cubic centimeter of air been exactly like another cubic centimeter of air.
At no point are you exactly the same as you were seconds ago.
Oh, I see what you meant now. You don’t become somebody else, which implies an existing mental state that has existed before; you become somebody new.
No, not somebody new. The same consciousness algorithm is running and I am indistinguishable from the consciousness algorithm.
It is not “I am you”; it is “I equal consciousness” and “you equal consciousness”. Therefore, *I am you*.
For you can change every part of your body and every piece of your memories. As long as you are self-aware, it’s you. Even with a different body somewhere else.
Just wondering, does Less Wrong have a procedure for understanding concepts that are incredibly distant from direct experience?
What would you do on the hypothesis that this was true that you wouldn’t do on the hypothesis that it was false?
Honestly? I’d start taking antidepressants, and then embark on a life-long quest to destroy the Universe via high-energy particle experiments, or perhaps an unfriendly AI.
Honestly, I’d start taking antidepressants, and then embark on a life-long quest to destroy the Universe via high-energy particle experiments. (Still not used to how the commenting works; this comment was not retracted.)
I endorse this theory and it all adds up to normality: in the end, the theories that you offer as alternatives are all true. (I have not read anything other than your comment.)
How can they, if they’re mutually exclusive?
Whew, Karma. Also, why did this get downvoted so much? I’d appreciate the skepticism a lot more in the form of an argument. (No, seriously, I’d appreciate skeptical argument way more than any abstract philosophical argument should be appreciated)
The belief that they are mutually exclusive is confusion.
I don’t understand.
(Partially derot13ing for clarity:)
Nonsense on stilts. Next!
I like your phrasing, but how is this so?
I just have a robust memetic immune defence system that at once recognises the absurdity of the suggested viewpoint, and that apart from the warm fuzzies it may induce from contemplating the Deep Wisdom that “we are all One!”, it has no implications for anticipated experiences.
I don’t understand why everyone thinks this is such a good thing. I wouldn’t have rot13′d this post if I thought this was a good thing.
Well, I don’t think warm fuzzies from Deep (i.e. fake) Wisdom are a good thing. Does anyone here? I prefer to get mine from reality, or from fiction, not from the latter passed off as the former.
I mean, I don’t understand why this would be a source of warm fuzzies. Everyone else is really you? That means none of the people I care about ever existed! I can’t imagine people continuing to function with a belief like that, and yet there it is, a Facebook group whose members smile knowingly at each other, each member fully complacent with the idea that none of the others really exist.
Maybe if your life is miserable (e.g., let’s say you are estranged from your family, you are unemployed or have a soul-crushing job, and/or you have no close friends and no romantic prospects) you get a thrill out of believing that none of it is real, that those bothersome people you interact with are in fact only aspects of yourself.
This is a kind of META argument. “How miserable you must be, to suggest something like this …”
Doesn’t matter how miserable or not he is. It only matters if he is right or not.
I’m just answering Rukifellth’s question as to how could someone derive warm fuzzies from such a belief, not making any kind of argument against it.
I would derive a great number of warm fuzzies from an argument against it.
“People are crazy, the world is mad.” Having boggled at them, I pass by.
If there’s only one person and everyone else is simulated in their mind, then that simulation is powerful and uncontrollable enough that, for all practical purposes, they can act as if there were other people.
The concept is unlike traditional solipsism, if that’s what you’re referring to?
I haven’t read past what you posted but it seems identical to me.
This concept is unlike your example, because in solipsism it is still possible for the one person carrying the simulation to create an offspring or clone, which would in time become a second, separate person. Open Individualism states that even if the one person carrying the simulation were to somehow reproduce themselves, there would still be only one person.
Past what I posted? Where are you?
In your head.