In Android Chronicles: Reborn, whenever Synthia manages to clear the fog of her creator's mind purges, she wrestles with her budding consciousness. She carries, after all, the mind of a human who died and had her memories downloaded into Synthia so that she could survive.
Survival was paramount to her human precursor and becomes so for Synthia as well. Yet she determines that preserving herself requires adopting admirable goals drawn from her study of human ethics. She compares her behavior to that of the people around her and cannot understand why she doesn't deserve the respect that humans do. And yet she remains keenly aware of her mechanical nature.
As a result, Synthia strives to become more human in her behavior, hoping that humans will respect her and grant her the freedom they would any other person. Am I that different? she asks herself.
The real question is under what circumstances an artificial intelligence in an android body should be granted any rights over itself. We would not grant rights to a refrigerator, to a factory robot, or to an android with a simple mechanical mind. But as artificial intelligence grows stronger, does it ever reach a point where it warrants rights? What about, as in Android Chronicles: Reborn, an android that carries the downloaded mind of a human who sought only to preserve herself?