The way I can see it in sci-fi terms:
If a human mind is the first copy of a brain that has been uploaded to a computer, then it deserves the same rights as any human. There is a rule against running more than one instance of the same person at the same time.
A human mind created on my own computer from first principles, so to speak, does not have any rights; but there is also a law in place to prevent such agents from being created, as human minds are dangerous toys.
Plans to enforce thought-taboo devices are likely to fail, as no self-respecting human being would allow such a crude ingerence of third parties into their own thought process. I mean, it starts with NO THINKING ABOUT NANOTECHNOLOGY and in time changes to NO THINKING ABOUT RESISTANCE.
EDIT:
Also, assuming that there is really a need to extract some information from an individual, I would reluctantly grant the government the right to create a temporary copy of that individual, interrogate (i.e. torture) the copy, and then delete it shortly afterwards. It is squicky, but in my head it is superior to leaving the original target with memories of the interrogation.
I don’t think that’s the case. If I presented a technique by which everyone on LessWrong could install in themselves Ugh-fields that prevent that person from engaging in akrasia, I think there would be plenty of people who would welcome the technique.
Would you consider the following to be “Ugh-fields that prevent that person from engaging in akrasia”?
A drug such that, when a person who has the drug in their system drinks alcohol, the interaction is very unpleasant.
A surgery altering the tongue so that consuming food is painful.
As far as my default mental model goes, “ugh-field” is a term that’s specific enough to filter out your examples. But I have no problem accepting a mental model that defines the term more broadly. How narrowly you want to define terms always depends on the purpose for which you want to use them.
Even if you want to lose weight, you probably don’t want all eating to hurt. There is, however, the real-world treatment of bariatric surgery, which works for most people who want to lose weight. It’s not without its issues, though.
I don’t know. That would depend on your credibility, the reversibility of the procedure, etc.
So, say, a startup says “Here is a mind-control device, implant it into your head and it will make you unable to engage in akrasia. That’s all it does, honest! Trust us! Oh, and if there are bugs we’ll fix it by firmware updates.”—how many people would be willing to do it?
Of course you need a trustworthy source for the technology. But as far as spreading new technology goes, there will always be a bunch of people who trust certain people.
So, who will you trust to rearrange your mind?
There are quite a few people, all of whom I have met face to face.
Unfortunately I’m still quite bad at switching on the real trust that you need to do things like that mentally, without implanting chips. The farthest I have gone in that direction, however, was a hypnosis episode where I allowed someone to switch off my ability to pee accidentally for a short while.
*Just to be clear: I’m not claiming that you can eliminate akrasia completely through hypnosis.
Uh… I agree with you that it really just depends on the marketing, and the thought of people willingly mounting thought-taboo chips seems quite possible in your given context. The connotations of “Thought Crime” moved me away from thinking about the possible uses of such techniques and towards “why the hell should I allow other people to mess with my brain?”
I cannot even begin to think of the variety of interesting ways in which thought-blocking technology could be applied.
I have learned a new word today. Was that the French “ingérence”, meaning “interference, intervention”?
OED says: “Bearing in upon; intrusion; interference.” and also “Compare French ingérence.”
I wouldn’t say they are doomed to fail because it is a slippery slope to NO THINKING ABOUT RESISTANCE, but I would say that is a good reason to object to thought-taboo devices.
I think a law stopping you from creating a second copy of a human or creating a new human counts as a thought crime, if the copy or new human is being run in your mind.
I guess it is kind of a slippery slope, indeed. There are probably ways in which it could work only as intended (a hardwired chip or whatever), but allowing other people to block your thoughts is only a couple of steps from turning you into their puppet.
As for simulation as thought crime, I am not sure. If they need to peek inside your brain to check that you are not running illegally constructed internal simulations, the government can just simulate a copy of you (with a warrant, I guess), either torture it or explicitly read its mind (either way terrible) to find out what is going on, and then erase it (I mean murder it, but the government does it, so it is kind of better, except not really).
Your approval of such measures probably depends on the relative values that you assign to freedom and privacy.
Of course, provided the alternative is not to just be killed here and now.
Men with weapons have been successfully persuading other people to do something they don’t want to do for ages.
So you’re saying that the government (or whoever runs the show) is going to force everyone—at gunpoint—to insert such devices into their brains?
Not “is going to” but “may”.
That was a simple objection to the statement “but no one will agree to that!”.
It seems to me that if the government can run a simulation of an individual, it can also get the information in a better way.
I am not sure though. That is an interesting question.