History Main / DoAndroidsDream


Added: 185

Changed: 368

Reason: None


Compare to JustAMachine; AndroidsArePeopleToo; ClonesArePeopleToo; OurSoulsAreDifferent; AnimateInanimateObject; CompanionCube; AlternativeTuringTest; ReligiousRobot; and RobotReligion.

to:

If a robot is conscious and even empathetic without necessarily acting like a human, it may be a [[Main/MechanicalAnimals Mechanical Animal.]] Or, if it's conscious but in a ''very'' alien, unrelatable sort of way, it may be on its way to becoming a MechanicalAbomination.

Compare to JustAMachine; AndroidsArePeopleToo; ClonesArePeopleToo; OurSoulsAreDifferent; AnimateInanimateObject; CompanionCube; AlternativeTuringTest; ReligiousRobot; and RobotReligion.

Added: 108

Changed: 108

Reason: None


A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about the fact that he's failed. And of course, it may be unclear to the creator, as well as to the audience, just how sentient the robot is.

to:

A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about the fact that he's failed.

And of course, it may be unclear to the creator, as well as to the audience, just how sentient the robot is.
Reason: None


A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about the fact that he's failed.

to:

A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about the fact that he's failed. And of course, it may be unclear to the creator, as well as to the audience, just how sentient the robot is.
Reason: None


A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about whether he's had any real success at it.

to:

A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about the fact that he's failed.
Reason: None


A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about whether he's had any real success at all.

to:

A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about whether he's had any real success at it.
Reason: None


A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and others about whether he's had any real success at all.

to:

A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and to others about whether he's had any real success at all.
Reason: None


A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and others about just how successful he's been at it.

to:

A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and others about whether he's had any real success at all.
Reason: None


A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's done. Or a scientist who desperately wants to create a sentient machine may lie to himself and others about just how successful he's been at it.

to:

A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's successfully done. Or a scientist who desperately wants to create a sentient machine may lie to himself and others about just how successful he's been at it.
Reason: None

Added DiffLines:

A [[RobotMaster robot's creator]] may or may not appreciate just where his creation lands on the SlidingScaleOfRobotIntelligence, and even if he does realize it, he may or may not be honest about it. A businessman trying to sell robots may claim that sentient robots are nonsentient, or that nonsentient robots are sentient, depending on what he thinks his customers want to buy. A creator accused of being a MadScientist may claim that ''of course'' he's [[CreatingLifeIsBad not trying to make a sentient machine,]] even if he knows perfectly well that that's what he's done. Or a scientist who desperately wants to create a sentient machine may lie to himself and others about just how successful he's been at it.
Reason: None


Another reason for a narrative's answer to be "yes" is that it can be surprisingly difficult to write a character as a so-called "philosophical zombie" (see Real Life, below) — a hypothetical thing that can act very convincingly like a conscious being but is not actually conscious. The parser program ELIZA surprised people by doing a decent job of subverting the Turing Test all the way back in the 60s, despite being even less complex than a mid-80s Infocom game, and modern [=ChatBots=] are getting very good at cheating the test while still apparently being completely unaware of what they're talking about. Rather than honestly deal with the dilemma of something that looks, walks, and quacks like a duck but manifestly is not actually a duck, many writers will default to assuming that, yes by gosh, it's a duck. It's simply easier to write about a robot that dreams than to write about one that only claims it does.

to:

Another reason for a narrative's answer to be "yes" is that it can be surprisingly difficult to write a character as a so-called "philosophical zombie" (see Real Life, below) — a hypothetical thing that can act very convincingly like a conscious being [[JustAMachine but is not actually conscious.]] The parser program ELIZA surprised people by doing a decent job of subverting the Turing Test all the way back in the 60s, despite being even less complex than a mid-80s Infocom game, and modern [=ChatBots=] are getting very good at cheating the test while still apparently being completely unaware of what they're talking about. Rather than honestly deal with the dilemma of something that looks, walks, and quacks like a duck but manifestly is not actually a duck, many writers will default to assuming that, yes by gosh, it's a duck. It's simply easier to write about a robot that dreams than to write about one that only claims it does.
Reason: None


It can also make the audience feel betrayed to make them empathize with a character, and then pull the rug out from under them and reveal that character ''is'' JustAMachine, incapable of loving or even truly being aware of itself or others. The endings of [[spoiler: ''Film/ExMachina'' and ''Film/BladeRunner2049'']] leave it very dubious whether [[spoiler: Ava or Joi]] are capable of true human feeling, despite everything we've previously seen them do.

to:

It can also make the audience feel betrayed to make them empathize with a character, and then pull the rug out from under them and reveal that character ''is'' JustAMachine, incapable of loving or even truly being aware of itself or others. The endings of [[spoiler: ''Film/ExMachina'' and ''Film/BladeRunner2049'']] leave it very dubious whether [[spoiler: Ava or Joi]] are capable of true human feeling, despite everything we've previously seen them do. Then again, if that sense of betrayal is exactly what the writer is trying to evoke, then that's totally justified.
Reason: None


It can also put the audience through an emotional wringer to make them empathize with a character, and then pull the rug out from under them and reveal that character ''is'' JustAMachine, incapable of loving or even truly being aware of itself or others. The endings of [[spoiler: ''Film/ExMachina'' and ''Film/BladeRunner2049'']] leave it very dubious whether [[spoiler: Ava or Joi]] are capable of true human feeling, despite everything we've previously seen them do.

to:

It can also make the audience feel betrayed to make them empathize with a character, and then pull the rug out from under them and reveal that character ''is'' JustAMachine, incapable of loving or even truly being aware of itself or others. The endings of [[spoiler: ''Film/ExMachina'' and ''Film/BladeRunner2049'']] leave it very dubious whether [[spoiler: Ava or Joi]] are capable of true human feeling, despite everything we've previously seen them do.
Reason: None


It can also put the audience through an emotional wringer to make them empathize with a character, and then pull the rug out from under them and reveal that character ''is'' JustAMachine, incapable of loving or even truly being aware of itself or others. The endings of [[spoiler: ''Film/ExMachina'' and ''Film/BladeRunner2049'']] leave it very dubious whether [[spoiler: Eve or Joi]] are capable of true human feeling, despite everything we've previously seen them do.

to:

It can also put the audience through an emotional wringer to make them empathize with a character, and then pull the rug out from under them and reveal that character ''is'' JustAMachine, incapable of loving or even truly being aware of itself or others. The endings of [[spoiler: ''Film/ExMachina'' and ''Film/BladeRunner2049'']] leave it very dubious whether [[spoiler: Ava or Joi]] are capable of true human feeling, despite everything we've previously seen them do.
Reason: None

Added DiffLines:

It can also put the audience through an emotional wringer to make them empathize with a character, and then pull the rug out from under them and reveal that character ''is'' JustAMachine, incapable of loving or even truly being aware of itself or others. The endings of [[spoiler: ''Film/ExMachina'' and ''Film/BladeRunner2049'']] leave it very dubious whether [[spoiler: Eve or Joi]] are capable of true human feeling, despite everything we've previously seen them do.
Reason: None


Another reason for a narrative's answer to be "yes" is that it can be surprisingly difficult to write a character as a so-called "philosophical zombie" (see Real Life, below) — a hypothetical thing that can act very convincingly like a conscious being but is not actually conscious. The parser program ELIZA surprised people by doing a decent job of subverting the Turing Test all the way back in the sixties, despite being even less complex than a mid-80s Infocom game, and modern [=ChatBots=] are getting very good at cheating the test while still apparently being completely unaware of what they're talking about. Rather than honestly deal with the dilemma of something that looks, walks, and quacks like a duck but manifestly is not actually a duck, many writers will default to assuming that, yes by gosh, it's a duck. It's simply easier to write about a robot that dreams than to write about one that only claims it does.

to:

Another reason for a narrative's answer to be "yes" is that it can be surprisingly difficult to write a character as a so-called "philosophical zombie" (see Real Life, below) — a hypothetical thing that can act very convincingly like a conscious being but is not actually conscious. The parser program ELIZA surprised people by doing a decent job of subverting the Turing Test all the way back in the 60s, despite being even less complex than a mid-80s Infocom game, and modern [=ChatBots=] are getting very good at cheating the test while still apparently being completely unaware of what they're talking about. Rather than honestly deal with the dilemma of something that looks, walks, and quacks like a duck but manifestly is not actually a duck, many writers will default to assuming that, yes by gosh, it's a duck. It's simply easier to write about a robot that dreams than to write about one that only claims it does.
Reason: None


Another reason for a narrative's answer to be "yes" is that it can be surprisingly difficult to write a character as a so-called "philosophical zombie" (see Real Life, below) — a hypothetical thing that can act very convincingly like a conscious being but is not actually conscious. The parser program ELIZA surprised people by doing a decent job of subverting the Turing Test all the way back in the sixties, despite being even less complex than a mid-80s Infocom game, and modern [=ChatBots=] are getting very good at cheating the test while still apparently being completely unaware of what they're talking about. Rather than honestly deal with the dilemma of something that looks, walks, and quacks like a duck but manifestly is not actually a duck, many writers will default to assuming that, yes by gosh, it's a duck.

to:

Another reason for a narrative's answer to be "yes" is that it can be surprisingly difficult to write a character as a so-called "philosophical zombie" (see Real Life, below) — a hypothetical thing that can act very convincingly like a conscious being but is not actually conscious. The parser program ELIZA surprised people by doing a decent job of subverting the Turing Test all the way back in the sixties, despite being even less complex than a mid-80s Infocom game, and modern [=ChatBots=] are getting very good at cheating the test while still apparently being completely unaware of what they're talking about. Rather than honestly deal with the dilemma of something that looks, walks, and quacks like a duck but manifestly is not actually a duck, many writers will default to assuming that, yes by gosh, it's a duck. It's simply easier to write about a robot that dreams than to write about one that only claims it does.
Reason: None

Added DiffLines:

Another reason for a narrative's answer to be "yes" is that it can be surprisingly difficult to write a character as a so-called "philosophical zombie" (see Real Life, below) — a hypothetical thing that can act very convincingly like a conscious being but is not actually conscious. The parser program ELIZA surprised people by doing a decent job of subverting the Turing Test all the way back in the sixties, despite being even less complex than a mid-80s Infocom game, and modern [=ChatBots=] are getting very good at cheating the test while still apparently being completely unaware of what they're talking about. Rather than honestly deal with the dilemma of something that looks, walks, and quacks like a duck but manifestly is not actually a duck, many writers will default to assuming that, yes by gosh, it's a duck.
Reason: None


Whether the answer to the trope's question is yes or no will depend largely on how ''much'' the viewer is expected to sympathize with the robot. If it's a MechaMook or MechanicalMonster -- whose only real purpose in the story is to give the heroes something literally mindless to fight and destroy without having to feel guilty about it -- or even a full blown villain in its own right (the question of [[Film/TheTerminator Terminators']] sentience never came up until we met a [[Film/Terminator2JudgmentDay friendly one),]] the assumption is usually that they are [[JustAMachine Just Machines]] and need to be stopped just as you'd need to stop or fix a runaway car or sparking electrical cable, and such robots will usually be portrayed as [[SlidingScaleOfRobotIntelligence so obviously lacking self-awareness or personality]] that there may be no perceived need to even ask the question. But not always. If the robot is malevolent but is also judged to have true self-awareness, often the next question is whether it can be ''[[HackYourEnemy fixed]]'' to [[MirrorMoralityMachine become good]] -- which then raises further ethical questions about whether it's right to go [[HeelFaceBrainwashing mucking about with the basic essence of someone's mind,]] even if that someone is a machine. After all, if you wanted to give a human villain a chance to [[HeelFaceTurn redeem himself,]] you'd do it by [[TalkingTheMonsterToDeath talking to him,]] not subjecting him to brain surgery.[[note]]Well, unless you're Franchise/DocSavage.[[/note]] Attempts to [[LogicBomb talk down]] a mad computer are likely to lead to a ''literal'' [[VillainousBSOD blue screen of death]] (which may save the heroes from having to worry about the ethics of "fixing" him because now he needs to be fixed ''anyway).''

to:

Whether the answer to the trope's question is yes or no will depend largely on how ''much'' the viewer is expected to sympathize with the robot. If it's a MechaMook or MechanicalMonster -- whose only real purpose in the story is to give the heroes something literally mindless to fight and destroy without having to feel guilty about it -- or even a full blown villain in its own right (the question of [[Film/TheTerminator Terminators']] sentience never came up until we met a [[Film/Terminator2JudgmentDay friendly one),]] the assumption is usually that they are [[JustAMachine Just Machines]] and need to be stopped just as you'd need to stop or fix a runaway car or sparking electrical cable, and such robots will usually be portrayed as [[SlidingScaleOfRobotIntelligence so obviously lacking self-awareness or personality]] that there may be no perceived need to even ask the question. But not always. If the robot is malevolent but is also judged to have true self-awareness, often the next question is whether it can be ''[[HackYourEnemy fixed]]'' to [[MirrorMoralityMachine become good]] -- which then raises further ethical questions about whether it's right to go [[HeelFaceBrainwashing mucking about with the basic essence of someone's mind,]] even if that someone is a machine. After all, if you wanted to give a human villain a chance to [[HeelFaceTurn redeem himself,]] you'd do it by [[TalkingTheMonsterToDeath talking to him,]] not subjecting him to brain surgery.[[note]]Well, unless you're Literature/DocSavage.[[/note]] Attempts to [[LogicBomb talk down]] a mad computer are likely to lead to a ''literal'' [[VillainousBSOD blue screen of death]] (which may save the heroes from having to worry about the ethics of "fixing" him because now he needs to be fixed ''anyway).''
Reason: None

Added DiffLines:

* Alluded to in ''VideoGame/{{Cuphead}}'', as the soul contract obtained from the stage 'Junkyard Jive' is specifically denoted as being that of [[MadScientist Dr. Kahl]]'s robot, as distinct from the doctor himself. This raises further questions that go unaddressed, such as whether it was Kahl or the robot itself who had bargained this soul to the Devil, and for what.
Reason: None

Added DiffLines:

* ''WesternAnimation/{{Amphibia}}'': In “Fixing Frobo”, roboticist couple Ally and Jess dedicate one episode of their robotics web tutorial to discussing whether robots have souls. They conclude that anything with memories has a soul, and that since robots have plenty of memory, they have souls. This is borne out later when Polly reactivates Frobo, who at first is reset to his original programming as one of King Andrias’s robot troops, but once Polly’s tears reactivate his memory unit, his memories of Polly’s friendship restore his heroic personality.
Reason: None


* ''WebVideo/DragonBallZAbridged'': Android 16's lack of a soul is used as BookEnds for his character, first brough up right before Android 18 activates him and again when Cell finally kills him. [[spoiler:Of course, the ending of episode 60 shows that he actually ''did'' have a soul with a picture of him in the afterlife surrounded by birds.]]

to:

* ''WebVideo/DragonBallZAbridged'': Android 16's lack of a soul is used as BookEnds for his character, first brought up right before Android 18 activates him and again when Cell finally kills him. [[spoiler:Of course, the ending of episode 60 shows that he actually ''did'' have a soul with a picture of him in the afterlife surrounded by birds.]]
Reason: None

Added DiffLines:

* ''Series/DontLookDeeper'': Aisha poignantly questions whether she's "real" upon discovering that she's an android, though her creator Sharon fully believes she's a person, a view Aisha herself soon comes to accept. The same is implied of other androids too, albeit to a lesser degree, as they exhibit some independent personalities.
Reason: None


* Similarly, on ''[[WesternAnimation/TransformersGenerationOne The Transformers,]]'' Starscream came back as a ghost.

to:

* Similarly, on ''WesternAnimation/TheTransformers'', Starscream came back as a ghost.
Reason: Does not appear to fit the trope definition; this example is referring to the characters literally being able to have dreams.


[[folder:Asian Animation]]
* Robot characters, including Happy S. and others, in ''Animation/HappyHeroes'', can ''dream.''
[[/folder]]
Reason: None

Added DiffLines:

* ''Series/StarTrekTheOriginalSeries'' has examples that cover the whole range from "clearly nonsentient" (the Yonadan Oracle) to "clearly sentient" (Rayna), with most examples falling somewhere in between. When the crew encounters Nomad, it's interesting not only that Spock is ''able'' to mind-meld with it, but that he ''expects'' to be able to.
Reason: None


Compare to JustAMachine; AndroidsArePeopleToo; ClonesArePeopleToo; OurSoulsAreDifferent; AnimateInanimateObject; CompanionCube; ReligiousRobot; and RobotReligion.

to:

Compare to JustAMachine; AndroidsArePeopleToo; ClonesArePeopleToo; OurSoulsAreDifferent; AnimateInanimateObject; CompanionCube; AlternativeTuringTest; ReligiousRobot; and RobotReligion.
Reason: None


** The most extreme case of this is the original manga and the film based on it: [[spoiler: Major Kusanagi actually merges her consciousness with The Puppeteer, a rogue A.I., and becomes able to live in both the physical and digital world. So is she a human soul who can exist in the digital world? A human who spontaneously uploaded herself? An A.I. with the memories of the original human?]]

to:

** The most extreme case of this is the original manga and [[Anime/GhostInTheShell1995 the film based on it]]: [[spoiler:Major Kusanagi actually merges her consciousness with The Puppeteer, a rogue A.I., and becomes able to live in both the physical and digital world. So, is she a human soul who can exist in the digital world? A human who spontaneously uploaded herself? An A.I. with the memories of the original human?]]
Reason: None

Added DiffLines:

*** One thing that makes it difficult to test is that scientists are human and most humans are psychologically hardwired to read personalities into ''everything''. This is already an issue with things like bomb disposal robots; they are purely remote-controlled and meant to be expendable, but the people who work with them get attached.
Reason: None

Added DiffLines:

* ''ComicBook/TheTransformersIDW'': This trope gets inverted in the 2005 series: Transformers don't have any issues with wondering whether or not they're really alive; that's something they know to be true. What most Decepticons, and even more than a few Autobots, have problems with is whether or not organic beings should also be considered alive, rather than just proper mechanical lifeforms.
