However, reverse engineering empathy probably does not require exactly copying biological mechanisms.
This is just a pet theory (and, being new to cognitive science, I may well be wrong): physical pain is some sort of hardwired thought disturbance, and the brain appears to have a kind of clarity attractor, which would also explain intrinsic motivation and the reward we get from fun and Eureka moments (cf. Schmidhuber). The brain seems to borrow the machinery of physical pain for high-level action selection whenever something severely limits anticipated prospects, which is why rejection, getting something wrong, and losing something all hurt. Empathy, then, is the ability to have pain triggered by mirror neurons: an activation pattern generated in an auto-associative neural network because the activation patterns of firsthand and non-firsthand experiences overlap.

If so, the body of an AI would need to be sufficiently similar to a human body for this auto-association to work. One (admittedly extreme) way to achieve that would be to replace the brain of a deceased volunteer with an artificial one. The fact that we feel empathy for animals hints that the body need not be all that similar, but on the other hand we are far more comfortable killing a bug than killing a mammal.
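The pattern-completion idea behind this can be sketched with a toy Hopfield-style auto-associative memory. This is only an illustration of the mechanism I'm gesturing at, not a model of mirror neurons: a "firsthand" experience is stored as a bipolar pattern, and a partially overlapping "non-firsthand" cue (standing in for observing someone else) is completed back to the stored pattern. All names and pattern sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # number of units

# Store one "firsthand" pattern via the Hebbian outer-product rule.
firsthand = rng.choice([-1, 1], size=N)
W = np.outer(firsthand, firsthand).astype(float)
np.fill_diagonal(W, 0)  # no self-connections

# A "non-firsthand" cue: only part of the same units are driven the same
# way (here ~60% guaranteed overlap, the remaining 40 units randomized).
cue = firsthand.copy()
flip = rng.choice(N, size=40, replace=False)
cue[flip] = rng.choice([-1, 1], size=40)

# Iterate synchronous updates until the state reaches a fixed point.
state = cue.copy()
for _ in range(10):
    new_state = np.sign(W @ state)
    new_state[new_state == 0] = 1
    if np.array_equal(new_state, state):
        break
    state = new_state

overlap_before = np.mean(cue == firsthand)
overlap_after = np.mean(state == firsthand)
print(f"overlap before: {overlap_before:.2f}, after: {overlap_after:.2f}")
```

With a single stored pattern the network completes any cue that agrees on more than half the units, so the partial "observed" pattern settles into the full "firsthand" one; the analogy is that sufficient overlap between the two bodies' activation patterns is what lets the borrowed pain response fire at all.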