and also seems deeply connected to biological systems.
I think I agree. Of course, all of the suffering that we know about so far is instantiated in biological systems. It depends on what you mean by “deeply connected,” though. Do you mean that you think the biological substrate is necessary, i.e., that you hold a biological theory of consciousness?
AI/computers are just a “picture” of these biological systems.
What does this mean?
Now, we could someday crack consciousness in electronic systems, but I think it would be winning the lottery to get there not on purpose.
Can you elaborate? Are you saying that, unless we deliberately try to build in some complex stuff that is necessary for suffering, AI systems won’t ‘naturally’ have the capacity for suffering? (i.e. you’ve ruled out the possibility that Steven Byrnes raised in his comment)
1.) Suffering seems to need a lot of complexity, because it demands consciousness, which is the most complex thing that we know of.
2.) I personally suspect that the biological substrate is necessary (though of course I can’t be sure), for reasons I mentioned, like sleep and death. I can’t imagine a computer that doesn’t sleep and can operate for trillions of years being conscious, at least in any way that resembles an animal. It may be superintelligent but not conscious. Again, just my suspicion.
3.) I think it’s obvious—it means that we are trying to recreate things that biological systems do (arithmetic, image recognition, playing games, etc.) on these electronic systems called computers or AI. Just like we might try to recreate a murder scene with pencil and paper: the murder drawing isn’t remotely a murder, it’s only a basic representation of a person’s idea of a murder.
4.) Correct. I’m not completely excluding that possibility, but like I said, it would take great luck to get there not on purpose. Maybe not “winning the lottery” luck, as I put it earlier—maybe a 1 to 5% probability.
We must understand that suffering takes consciousness, and consciousness takes a nervous system—animals without one aren’t conscious. The nature of computers is so drastically different from that of a biological nervous system (and, at least so far, much less complex) that I think it would be quite unlikely for us to unintentionally generate this very complex, unique, and poorly understood property of biological systems that we call consciousness. I think it would be a great coincidence.