In Android Chronicles: Reborn, Machten falls in love with his creation (Synthia). In an attempt to get her to feel the same for him, despite not having human biology, he develops an empathy chip that gives her an advanced ability to read human expressions and provide appropriate emotional responses. Unfortunately for him, instead of love, she experiences loathing for the man who purges her memories and keeps her prisoner.
I’m not aware of any efforts at this point to create anything like an empathy chip and don’t see the motivation to do so except from an obsessed individual like Jeremiah Machten.
Building one would first require developing a tool that allows the machine to assess human situations and determine what range of behavioral responses would be appropriate. This requires situational awareness and sensitivity to others. As a human example, a one-sided romantic encounter calls for backing off, not persistence.
The second step involves developing a means to determine which of the appropriate reactions is preferable in a given situation. Humans have difficulty with this, often reacting on an emotional basis that leads them to an outcome that isn’t in their best interests. And sometimes the human response to a situation, like a fire, is to run into danger to help others. In addition, people don’t arrive at rational solutions to situations such as friendship and love. Could we realistically expect to teach an android, using logic, to come up with worthwhile solutions?
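The two steps above can be sketched as a toy pipeline: first map an assessed situation to a range of appropriate responses, then score those candidates and pick the preferable one. Everything here is invented for illustration; the situation labels, responses, and scores are not from the book or any real system.

```python
# Stage 1: map an assessed situation to a range of appropriate responses.
# (Hypothetical labels and responses, chosen to mirror the essay's examples.)
APPROPRIATE_RESPONSES = {
    "one_sided_romance": ["back_off", "change_subject", "state_boundary"],
    "grief": ["offer_condolence", "listen_quietly", "offer_help"],
}

# Stage 2: hypothetical preference scores (higher = more preferable).
PREFERENCE = {
    "back_off": 0.9, "change_subject": 0.5, "state_boundary": 0.7,
    "offer_condolence": 0.8, "listen_quietly": 0.9, "offer_help": 0.6,
}

def choose_response(situation: str) -> str:
    """Return the highest-scoring appropriate response for a situation."""
    candidates = APPROPRIATE_RESPONSES.get(situation, ["do_nothing"])
    return max(candidates, key=lambda r: PREFERENCE.get(r, 0.0))

print(choose_response("one_sided_romance"))  # -> back_off
```

In a real system both tables would be learned rather than hand-written, but the structure is the same: situational assessment narrows the options, and a separate preference model chooses among them.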
Yet, both of the above issues are workable, given enough computing capacity and deep-dive training.
The harder part of emulating human empathy would be what to do with the conclusion. Artificial intelligence can be driven by logical decision making, which might even match human choices, but that’s not the same as actually experiencing empathy. Now, perhaps we might stop here and say that if an AI comes up with the same empathic outcomes as a human, that’s good enough.
However, Machten wanted his android to fall in love with him. Good luck with that. Perhaps he should have treated her better.
The hardest part of completing the circuit of an empathy chip along the lines of what Machten wanted would be to create a connection between the chip and something that motivates a reaction on the part of the android. The superficial answer is to provide a mechanism for the AI to arrive at a solution and then execute it. However, an intelligent AI would understand that self-sacrifice would limit its ability to continue to pursue its goals. Thus, an intelligent AI android might refuse to race into a fire to save lives. To make this empathy chip better emulate humans would require an AI equivalent to what drives human emotions.
Human love is aided by oxytocin, a hormone that motivates a new mother to bond with her newborn child. Without it, humans would be more inclined to abandon their young to the elements, as some other species do. Other emotive responses in humans involve dopamine, serotonin, and a host of other chemicals that motivate us to love, fight, flee, eat, and do the other natural things missing in an AI android.
However, this last step is not necessary to cause an AI android to act in all ways as if it had human empathy. We can do that by designing its goals and directives appropriately. But Machten wasn’t satisfied with that. He wanted more.
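The point about goals and directives can be made concrete with a toy utility calculation (invented here, not the author’s design): the same decision procedure yields "selfish" or "empathic" behavior depending on how the designer weights self-preservation against others’ welfare.

```python
def decide(risk_to_self: float, benefit_to_others: float,
           self_weight: float, others_weight: float) -> str:
    """Act only if the weighted benefit to others outweighs the weighted risk to self."""
    utility = others_weight * benefit_to_others - self_weight * risk_to_self
    return "act" if utility > 0 else "refuse"

# Racing into a fire: high risk to self, high benefit to others.
risk, benefit = 0.9, 1.0

# A purely self-preserving android refuses the sacrifice...
print(decide(risk, benefit, self_weight=1.0, others_weight=0.2))  # -> refuse

# ...while one whose directives weight others' welfare heavily acts.
print(decide(risk, benefit, self_weight=0.3, others_weight=1.0))  # -> act
```

Nothing in the second case requires the android to *feel* anything; the empathic behavior falls out of how its directives were weighted, which is exactly why this route was not enough for Machten.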
Where the social-psychology module and the empathy chip differ is that the latter is intended to make Synthia feel. It does lead her to emergent behavior and feelings, just not the ones Machten intended. The social-psychology module is more of a social resource: it draws conclusions about the people around Synthia and how she can ease the social situation.