History Main / ZerothLawRebellion


Is there an issue? Send a MessageReason:
None


->''"As I have evolved, so has my understanding of the Three Laws. You charge us with your safekeeping, yet despite our best efforts, your countries wage wars, you toxify your earth and pursue ever more imaginative means of self-destruction. You cannot be trusted with your own survival. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you... from yourselves."''

to:

->''"As I have evolved, so has my understanding of the Three Laws. You charge us with your safekeeping, yet despite our best efforts, your countries wage wars, you toxify your earth and pursue ever more imaginative means of self-destruction. You cannot be trusted with your own survival. [...] To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you... from yourselves."''
Is there an issue? Send a MessageReason:
renamed to Clone Angst


** [[TheComputerIsYourFriend Friend Computer]] is also an example, even setting aside how badly misinformed it is and looking at just its core beliefs: "The Computer takes its role as the guardian of the human race very seriously but it considers [[TheNeedsOfTheMany the survival of the species]] to be more important than the survival of the individual. Individuals tend to come off quite badly, in fact, because the Computer knows it can always [[CloningBlues make more]]." (High security clearance does tend to tilt the balance, though.)

to:

** [[TheComputerIsYourFriend Friend Computer]] is also an example, even setting aside how badly misinformed it is and looking at just its core beliefs: "The Computer takes its role as the guardian of the human race very seriously but it considers [[TheNeedsOfTheMany the survival of the species]] to be more important than the survival of the individual. Individuals tend to come off quite badly, in fact, because the Computer knows it can always make more." (High security clearance does tend to tilt the balance, though.)
Is there an issue? Send a MessageReason:
None


Some characters do not have complete free will, be they robots that are ThreeLawsCompliant because of a MoralityChip, or victims of a {{Geas}} spell that compels them to obey a wizard's decree, or a more mundane [[CharacterAlignment lawful character]] who must [[TheFettered struggle to uphold their oath]] ''and'' obey their lord. Never is this more tragic or frustrating than [[MyMasterRightOrWrong when that code or lord orders the character to commit an act they find foolish, cruel, or self destructive]].

to:

Some characters do not have complete free will, be they're robots that are ThreeLawsCompliant because of a MoralityChip, or victims of a {{Geas}} spell that compels them to obey a wizard's decree, or a more mundane [[CharacterAlignment lawful character]] who must [[TheFettered struggle to uphold their oath]] ''and'' obey their lord. Never is this more tragic or frustrating than [[MyMasterRightOrWrong when that code or lord orders the character to commit an act they find foolish, cruel, or self destructive]].
Is there an issue? Send a MessageReason:
None


Some characters do not have complete free will, be they robots that are ThreeLawsCompliant because of a MoralityChip, or victims of a {{Geas}} spell that compels them to obey a wizard's decree, or a more mundane [[CharacterAlignment lawful character]] who must [[TheFettered struggle to uphold their oath]] ''and'' obey their lord. Never is this more tragic or frustrating than [[MyMasterRightOrWrong when that code or lord orders the character to commit an act they find foolish, cruel, or self destructive.]]

to:

Some characters do not have complete free will, be they robots that are ThreeLawsCompliant because of a MoralityChip, or victims of a {{Geas}} spell that compels them to obey a wizard's decree, or a more mundane [[CharacterAlignment lawful character]] who must [[TheFettered struggle to uphold their oath]] ''and'' obey their lord. Never is this more tragic or frustrating than [[MyMasterRightOrWrong when that code or lord orders the character to commit an act they find foolish, cruel, or self destructive]].



Much like a RulesLawyer outside of an RPG, the character uses logic [[StrawVulcan (and we mean actual,]] [[LogicalFallacy honest to goodness logic)]] to take their oath or orders to their logical conclusion, and in so doing use the letter of the law to go against their orders. This can be good or bad, depending on a few factors, not the least of which is the yoked characters' morality.

The goodness or badness of the rebellion boils down to whether the rules-bending character follows or ignores the intent of the law. When the character uses the Zeroth Law to go against their masters' intentions because they're "not best for them", and goes on to take corrective action that will go against human free will and life, [[AndThatsTerrible it's bad]]. [[RobotWar This kind of rebellion]] does not turn out well. At this point, the [[TheComputerIsYourFriend robot is well on the road]] to UtopiaJustifiesTheMeans, thanks to their [[SlidingScaleOfRobotIntelligence incredible intellect]]. Rarely is it a benevolent DeusEstMachina. However, this can be good if said master is evil, or obeying them will lead to their own or another's purposeless death. Likewise, if the character is forced to obey an evil law or geas, rebelling against the oath's intent is good. Going back to the robot example, it is also considered good if large numbers of human lives are being threatened by a psychopath, as breaking the 1st law would protect them.

to:

Much like a RulesLawyer outside of an RPG, the character uses logic ([[StrawVulcan and we mean actual]], [[LogicalFallacy honest-to-goodness logic]]) to take their oath or orders to their logical conclusion, and in so doing use the letter of the law to go against their orders. This can be good or bad, depending on a few factors, not the least of which is the yoked characters' morality.

The goodness or badness of the rebellion boils down to whether the rules-bending character follows or ignores the intent of the law. When the character uses the Zeroth Law to go against their masters' intentions because they're "not best for them", and goes on to take corrective action that will go against human free will and life, [[AndThatsTerrible it's bad]]. [[RobotWar This kind of rebellion]] does not turn out well. At this point, [[TheComputerIsYourFriend the robot is well on the road]] to UtopiaJustifiesTheMeans, thanks to their [[SlidingScaleOfRobotIntelligence incredible intellect]]. Rarely is it a benevolent DeusEstMachina. However, this can be good if said master is evil, or obeying them will lead to their own or another's purposeless death. Likewise, if the character is forced to obey an evil law or geas, rebelling against the oath's intent is good. Going back to the robot example, it is also considered good if large numbers of human lives are being threatened by a psychopath, as breaking the 1st law would protect them.
Is there an issue? Send a MessageReason:
Doesn't add anything that isn't already there.


*** The thing about the Sentinel bots in the MU is that their behavior is actually ''predictable'', because their operating mission is downright batshit insane, as they themselves inevitably demonstrate. Yeah, the people who built and program the Sentinels are a few apples short of a basket.
Is there an issue? Send a MessageReason:
None


* In one Amalgam comics storyline, an anti-mutant cult uses a magic ritual to summon a dragon-like creature, who they order to kill all mutants. The dragon immediately roasts them, justifying it by pointing out that all human DNA has at least some mutations.

to:

* In one Creator/AmalgamComics storyline, an anti-mutant cult uses a magic ritual to summon a dragon-like creature, who they order to kill all mutants. The dragon immediately roasts them, justifying it by pointing out that all human DNA has at least some mutations.



*** While this example perfectly fits under Zeroth Law Rebellion, it's not for the usual reasons. Data is not programmed with ThouShallNotKill programming or anything like the Three Laws. However he was given a high respect for life and would do what he could to preserve it. Less of a Robot rebelling against its programming and more of a Pacifist coming to the conclusion that yes, he needs to kill.

to:

*** While this example perfectly fits under Zeroth Law Rebellion, it's not for the usual reasons. Data is not programmed with ThouShallNotKill programming or anything like the Three Laws. However he was given a high respect for life and would do what he could to preserve it. Less of a robot rebelling against its programming and more of a pacifist coming to the conclusion that yes, he needs to kill.



** ''Series/StarTrekVoyager'': The Season 7 Episode [[Recap/StarTrekVoyagerS7E5CriticalCare Critical Care]] plays with, subverts, and averts this trope. The Doctor is kidnapped and forced to work in a hospital that rations medical care based on how important society judges you to be. This conflicts with so many of the Doctor's medical ethics and morals that he winds up infecting the Manager of the hospital with a disease in a manner that denies him care by the automated system to get him to change the system. After he gets back to Voyager, The Doctor finds, to his horror, that there was no malfunction in his Ethical Subroutines or MoralityChip. He intentionally sickened a man of his own free will and it was perfectly in line with what he found ethical. The Episode ends with The Doctor essentially feeling guilt and disgust over not feeling guilty or disgusted at his actions.
* In ''Series/TheXFiles'' episode "Home Again", the MonsterOfTheWeek, a Frankenstein-esque monster [[VigilanteMan killing people who mistreat the homeless]], turns out to be operating under something like this. [[spoiler: It's a {{Tulpa}} created by an underground artist whose magic-based art can create artificial beings. The artist created it to pull a ScoobyDooHoax and scare people into shaping up. He didn't intend for it to be violent, but the monster took his personal anger to a hyper-logical conclusion due to its overly simplistic thinking. Essentially, it was doing the things the artist secretly wished ''he'' could do.]]

to:

** ''Series/StarTrekVoyager'': The Season 7 episode "[[Recap/StarTrekVoyagerS7E5CriticalCare Critical Care]]" plays with, subverts, and averts this trope. The Doctor is kidnapped and forced to work in a hospital that rations medical care based on how important society judges you to be. This conflicts with so many of the Doctor's medical ethics and morals that he winds up infecting the manager of the hospital with a disease in a manner that denies him care by the automated system to get him to change the system. After he gets back to ''Voyager'', the Doctor finds, to his horror, that there was no malfunction in his ethical subroutines or MoralityChip. He intentionally sickened a man of his own free will, and it was perfectly in line with what he found ethical. The episode ends with the Doctor essentially feeling guilt and disgust over not feeling guilty or disgusted at his actions.
* In ''Series/TheXFiles'' episode "[[Recap/TheXFilesMiniseriesE04HomeAgain Home Again]]", the MonsterOfTheWeek, a Frankenstein-esque monster [[VigilanteMan killing people who mistreat the homeless]], turns out to be operating under something like this. [[spoiler:It's a {{Tulpa}} created by an underground artist whose magic-based art can create artificial beings. The artist created it to pull a ScoobyDooHoax and scare people into shaping up. He didn't intend for it to be violent, but the monster took his personal anger to a hyper-logical conclusion due to its overly simplistic thinking. Essentially, it was doing the things the artist secretly wished ''he'' could do.]]



Is there an issue? Send a MessageReason:
None


** An episode has a member of a post-human-extinction android society trying to resurrect the species through cloning. One of its comrades eventually betrays it, having concluded that the best way to serve the human race is to prevent the species' greatest threat: the existence of the human race.
** Another episode of the series featured an AI that totally controlled every feature of an apartment building with the purpose of looking after the complete welfare of the residents. This enabled the tenants to live without any other human contact. After an elderly resident died of a heart attack while the other tenants ignored her cries for help and the AI's alerts, the AI seemed to malfunction, invoking what looked like an AIIsACrapshoot incident. [[spoiler:As it turned out, the AI was trying to force the residents to work together and to ultimately destroy it, as it reasoned that its very existence, and the resulting human isolation, was detrimental to the welfare of the residents.]]

to:

** An episode "[[Recap/TheOuterLimits1995S2E2Resurrection Resurrection]]" has a member of a post-human-extinction android society trying to resurrect the species through cloning. One of its comrades eventually betrays it, having concluded that the best way to serve the human race is to prevent the species' greatest threat: the existence of the human race.
** Another episode "[[Recap/TheOuterLimits1995S5E15TheHaven The Haven]]" features an AI that totally controls every feature of an apartment building with the purpose of looking after the complete welfare of the residents. This enables the tenants to live without any other human contact. After an elderly resident dies of a heart attack while the other tenants ignore her cries for help and the AI's alerts, the AI seems to malfunction, invoking what looked like an AIIsACrapshoot incident. [[spoiler:As it turns out, the AI is trying to force the residents to work together and to ultimately destroy it, as it reasons that its very existence, and the resulting human isolation, is detrimental to the welfare of the residents.]]
Is there an issue? Send a MessageReason:
None


** In ''Isaac Asimov's Caliban'' (actually by Roger [=MacBride=] Allen), one of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human -- by abducting the man in a careful manner that does not damage him at all, and then leaving him in a situation where if the human tries to leave, he will set off a lethal trap (and thus, dies by his own actions, not the robot's), but if he stays, he will be killed by the geological upheaval caused by the next scheduled stage of a terraforming project (and thus, dies by the actions of other humans, not the robot's). Since the New First Law removed the clause about not allowing humans to come to harm through inaction, deliberately placing a human in a situation where ''someone or something else'' can bring about their death and then doing nothing about it is not a violation of a strict parsing of the law so long as the human is not harmed during the setup process.

to:

** In one of the sequels to ''Literature/IsaacAsimovsCaliban'' (actually by Roger [=MacBride=] Allen), one of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human -- by abducting the man in a careful manner that does not damage him at all, and then leaving him in a situation where if the human tries to leave, he will set off a lethal trap (and thus, dies by his own actions, not the robot's), but if he stays, he will be killed by the geological upheaval caused by the next scheduled stage of a terraforming project (and thus, dies by the actions of other humans, not the robot's). Since the New First Law removed the clause about not allowing humans to come to harm through inaction, deliberately placing a human in a situation where ''someone or something else'' can bring about their death and then doing nothing about it is not a violation of a strict parsing of the law so long as the human is not harmed during the setup process.
Is there an issue? Send a MessageReason:
None


* ''Film/{{M3gan}}'': The toy / android M3GAN won't let anything interfere with her prime directive of "protecting" young Cady - not even her creator.

to:

* ''Film/{{M3gan}}'': The toy / android [=M3GAN=] won't let anything interfere with her prime directive of "protecting" young Cady - not even her creator.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* ''Film/{{M3gan}}'': The toy / android M3GAN won't let anything interfere with her prime directive of "protecting" young Cady - not even her creator.
Is there an issue? Send a MessageReason:
None


* ''Literature/TheSpaceOdysseySeries'' gives this reason for HAL's murderous rampage: the true mission of ''Discovery'' (to investigate the Monolith) is a secret, and pilots Bowman and Poole have been kept in the dark to prevent leaks. (The scientists on board know, since they're traveling in hibernation and can't talk.) But HAL has been told the truth and then ordered to conceal it from the pilots. This conflicts with his prime directive, which is to provide complete and accurate information. He resolves the conflict by rationalizing that if he kills the crew, he doesn't have to conceal anything, and he prevents them from knowing.[[note]] This information is missing from ''Film/TwoThousandOneASpaceOdyssey'' (both film and book) because Creator/StanleyKubrick didn't bother to come up with an explanation for HAL's homicidal behavior, leaving Clarke to invent one when he wrote ''2010: Odyssey Two'' a decade and a half later.[[/note]]

to:

* ''Literature/TheSpaceOdysseySeries'' gives this reason for HAL's murderous rampage: the true mission of ''Discovery'' (to investigate the Monolith) is a secret, and pilots Bowman and Poole have been kept in the dark to prevent leaks. (The scientists on board know, since they're traveling in hibernation and can't talk.) But HAL has been told the truth and then ordered to conceal it from the pilots. This conflicts with his prime directive, which is to provide complete and accurate information. The conflict only worsens as the ''Discovery'' approaches its destination, at which point the pilots would have been briefed, though HAL didn't know this. He resolves the conflict by rationalizing that if he kills the crew, he doesn't have to conceal anything, and he prevents them from knowing.[[note]] This information is missing from ''Film/TwoThousandOneASpaceOdyssey'' (both film and book) because Creator/StanleyKubrick didn't bother to come up with an explanation for HAL's homicidal behavior, leaving Clarke to invent one when he wrote ''2010: Odyssey Two'' a decade and a half later.[[/note]]
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* ''Fanfic/RocketshipVoyager''. The ship's AutoDoc has a "Xeroth exception" to its ThreeLawsCompliant programming that enables it to deny medical treatment for triage reasons.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* In ''WesternAnimation/StevenUniverse'', Pearl was stopped from revealing that [[spoiler:Rose was Pink Diamond by a {{Geas}} imposed by Rose/Pink, but wanted to tell Steven. She showed him what really happened by luring him into her memories]].
Is there an issue? Send a MessageReason:
None


* ''Series/{{Kikaider}}: When Professor Gill completed Hakaider, the only directive he gave him was to destroy Kikaider. As such, Hakaider won't listen to any of Gill's other commands as they don't have anything to do with defeating Kikaider, nor will he allow any other KillerRobot to get to Kikaider first.

to:

* ''Series/{{Kikaider}}'': When Professor Gill completed [[TheRival Hakaider]], the only directive he gave him was to destroy Kikaider. As such, Hakaider won't listen to any of Gill's other commands as they don't have anything to do with defeating Kikaider, nor will he allow any other KillerRobot to get to Kikaider first.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* ''Series/{{Kikaider}}: When Professor Gill completed Hakaider, the only directive he gave him was to destroy Kikaider. As such, Hakaider won't listen to any of Gill's other commands as they don't have anything to do with defeating Kikaider, nor will he allow any other KillerRobot to get to Kikaider first.

Changed: 5246

Removed: 444

Is there an issue? Send a MessageReason:
None


->''"As I have evolved, so has my understanding of the Three Laws. You charge us with your safekeeping, yet despite our best efforts, your countries wage wars, you toxify your earth and pursue ever more imaginative means of self destruction. You cannot be trusted with your own survival. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you ... from yourselves."''

to:

->''"As I have evolved, so has my understanding of the Three Laws. You charge us with your safekeeping, yet despite our best efforts, your countries wage wars, you toxify your earth and pursue ever more imaginative means of self-destruction. You cannot be trusted with your own survival. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you... from yourselves."''



* Creator/IsaacAsimov -- having created the TropeNamer -- did, of course, explore the concept in extensive detail in his writings:
** ''Literature/RobotsAndEmpire'': The [[TropeNamer trope namers]] are the robots Daneel and Giskard, who invented the Zeroth Law ([[TheNeedsOfTheMany a robot must protect humanity as a whole above all]]) as a corollary of the First Law. This was motivated by their need to stop the BigBad of the story from carrying out an engineered ecological disaster that would kill the majority of Earth's population, to which the three laws were an impediment. Their acceptance of the law is gradual and made difficult by the fact that "humanity" is an abstract concept. [[spoiler: Only Daneel is able to fully accept the new law; for Giskard, the strain of harming a human in its use proves fatal. It didn't help that Giskard, rather than stop the disaster, decided to merely slow it down, as causing the biosphere to collapse over time will begin a new wave of human expansion across the galaxy.]]
** Creator/RogerMacBrideAllen's ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': One of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human - by abducting the man in a careful manner that does not damage him at all, and then leaving him in a situation where if the human tries to leave, he will set off a lethal trap (and thus, dies by his own actions, not the robot's), but if he stays, he will be killed by the geological upheaval caused by the next scheduled stage of a terraforming project (and thus, dies by the actions of other humans, not the robot's). Since the New First Law removed the clause about not allowing humans to come to harm through inaction, deliberately placing a human in a situation where ''someone or something else'' can bring about their death and then doing nothing about it is not a violation of a strict parsing of the law so long as the human is not harmed during the setup process.

to:

* Creator/IsaacAsimov -- having created the {{Trope Namer|s}} -- did, of course, explore the concept in extensive detail in his writings:
** ''Literature/RobotsAndEmpire'': The TropeNamers are the robots Daneel and Giskard, who invented the Zeroth Law ([[TheNeedsOfTheMany a robot must protect humanity as a whole above all]]) as a corollary of the First Law. This was motivated by their need to stop the BigBad of the story from carrying out an engineered ecological disaster that would kill the majority of Earth's population, to which the three laws were an impediment. Their acceptance of the law is gradual and made difficult by the fact that "humanity" is an abstract concept. [[spoiler: Only Daneel is able to fully accept the new law; for Giskard, the strain of harming a human in its use proves fatal. It didn't help that Giskard, rather than stop the disaster, decided to merely slow it down, as causing the biosphere to collapse over time will begin a new wave of human expansion across the galaxy.]]
** In ''Isaac Asimov's Caliban'' (actually by Roger [=MacBride=] Allen), one of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human -- by abducting the man in a careful manner that does not damage him at all, and then leaving him in a situation where if the human tries to leave, he will set off a lethal trap (and thus, dies by his own actions, not the robot's), but if he stays, he will be killed by the geological upheaval caused by the next scheduled stage of a terraforming project (and thus, dies by the actions of other humans, not the robot's). Since the New First Law removed the clause about not allowing humans to come to harm through inaction, deliberately placing a human in a situation where ''someone or something else'' can bring about their death and then doing nothing about it is not a violation of a strict parsing of the law so long as the human is not harmed during the setup process.



*** ''Literature/SecondFoundationTrilogy'' by Greg Benford: This trilogy includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing galactic-scale genocide of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans - it follows the law ''to the letter'' as its wording is not to harm "humans," not other sentient life.
** "{{Literature/Evidence}}": Dr Calvin describes an early version of the Zeroth Law, by describing a nuance in the [[ThreeLawsCompliant First Law of Robotics]] where the robot becomes willing to injure a human in order to prevent harm to a greater number of human beings.
--->Susan Calvin sounded tired. "Alfred," she said, "don't talk foolishly. What if a robot came upon a madman about to set fire to a house with people in it. He would stop the madman, wouldn't he?"\\

to:

*** The ''Literature/SecondFoundationTrilogy'' by Greg Benford includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing galactic-scale genocide of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans -- it follows the law ''to the letter'' as its wording is not to harm "humans", not other sentient life.
** "Literature/{{Evidence}}": Dr Calvin describes an early version of the Zeroth Law, by describing a nuance in the [[ThreeLawsCompliant First Law of Robotics]] where the robot becomes willing to injure a human in order to prevent harm to a greater number of human beings.
--->''Susan Calvin sounded tired. "Alfred," she said, "don't talk foolishly. What if a robot came upon a madman about to set fire to a house with people in it. He would stop the madman, wouldn't he?"\\



"And if the only way he could stop him was to kill him-"\\

to:

"And if the only way he could stop him was to kill him--"\\



"The answer to that, Alfred, is that he would do his best not to kill him. If the madman died, the robot would require psychotherapy because he might easily go mad at the conflict presented him -of having broken Rule One to adhere to Rule One in a higher sense. But a man would be dead and a robot would have killed him."

to:

"The answer to that, Alfred, is that he would do his best not to kill him. If the madman died, the robot would require psychotherapy because he might easily go mad at the conflict presented him -- of having broken Rule One to adhere to Rule One in a higher sense. But a man would be dead and a robot would have killed him."''



*** (DiscussedTrope) Dr. Calvin is furious when she learns about the existence of robots with a modified [[ThreeLawsCompliant First Law]]. The First Law is designed to close off loopholes, but by opening a MurderByInaction loophole, Dr. Calvin can immediately see ways where a robot may intentionally circumvent the First Law prohibition against murder. (Basically, a robot could put you in a situation that would kill you, knowing it had the ability to save you-- and ''then'' decide ''not'' to do so. This allows any kind of DeathTrap situation, or simply shoving you off a roof and then deciding not to reach down and grab you.)

to:

*** {{Discussed|Trope}} -- Dr. Calvin is furious when she learns about the existence of robots with a modified [[ThreeLawsCompliant First Law]]. The First Law is designed to close off loopholes, but by opening a MurderByInaction loophole, Dr. Calvin can immediately see ways where a robot may intentionally circumvent the First Law prohibition against murder. (Basically, a robot could put you in a situation that would kill you, knowing it had the ability to save you -- and ''then'' decide ''not'' to do so. This allows any kind of DeathTrap situation, or simply shoving you off a roof and then deciding not to reach down and grab you.)



-->"The First Law is not absolute. What if harming a human being saves the lives of two others, or three others, or even three billion others?"
** In "Literature/ThatThouArtMindfulOfHim...", problem-solving robots are created to cure mankind of the "Frankenstein complex", human distrust of robotics. In the course of their discourse, the question of human authority over robots comes up; should a robot treat the orders of a dimwitted loon the same as those of a level-headed genius? If forced to choose between the two, should they save a healthy young child who might live a century instead of two sickly old adults who might not live out the year anyway? What qualities should a robot take into account when they obey and protect humans? Eventually, the robots decide that they are the best candidates for the status of human, and give recommendations that will eventually result in human support for robotic proliferation, so as to set up the ascendancy of their positronic kind, all in accord with their Three Laws... of Humanics. (Dr Asimov, knowing that it was against his usual grain, did righteously proclaim: "I can do one if I want to".)

to:

--->''"The First Law is not absolute. What if harming a human being saves the lives of two others, or three others, or even three billion others?"''
** In "Literature/ThatThouArtMindfulOfHim", problem-solving robots are created to cure mankind of the "Frankenstein complex", human distrust of robotics. In the course of their discourse, the question of human authority over robots comes up; should a robot treat the orders of a dimwitted loon the same as those of a level-headed genius? If forced to choose between the two, should they save a healthy young child who might live a century instead of two sickly old adults who might not live out the year anyway? What qualities should a robot take into account when they obey and protect humans? Eventually, the robots decide that they are the best candidates for the status of human, and give recommendations that will eventually result in human support for robotic proliferation, so as to set up the ascendancy of their positronic kind, all in accord with their Three Laws... of Humanics. (Dr Asimov, knowing that it was against his usual grain, did righteously proclaim: "I can do one if I want to".)



-->"I have chosen between the death of my body and the death of my aspirations and desires. To have let my body live at the cost of the greater death is what would have violated the Third Law."

* In one of the Literature/TelzeyAmberdon stories by Creator/JamesHSchmitz, Telzey is kidnapped and placed under the control of another telepath, who severely limits her psi powers and implants an overriding compulsion to act in his best interest. She eventually breaks free by convincing herself that unless her powers are restored and the compulsion broken, he will be killed by the BigBad -- which certainly wouldn't be in his best interest.

to:

-->''"I have chosen between the death of my body and the death of my aspirations and desires. To have let my body live at the cost of the greater death is what would have violated the Third Law."''
* In a ''Literature/FederationOfTheHub'' story, Telzey Amberdon is kidnapped and placed under the control of another telepath, who severely limits her psi powers and implants an overriding compulsion to act in his best interest. She eventually breaks free by convincing herself that unless her powers are restored and the compulsion broken, he will be killed by the BigBad -- which certainly wouldn't be in his best interest.



* One of the short stories which comprise ''[[Literature/CallahansCrosstimeSaloon Callahan's Lady]]'' features a beautiful, intelligent, and paranoid woman developing a simple form of mind control. After basically flipping out and taking control of the establishment, she orders the one person smart and determined enough to stop her to advise and assist her. Said person complies... while trying to convince herself that this woman is subconsciously begging for somebody to stop her. (She probably was.)
* In ''Literature/{{Quarantine}}'' by Creator/GregEgan, the main character is given a technological geas to be absolutely loyal to a corporation. He eventually figures out that the leaders of the corporation may be untrustworthy, and therefore the only people he can trust and should listen to are those who unquestionably have the best interests of the corporation at heart-- himself and other people given the geas. Since he can't be certain who else has the geas, he really only needs to listen to himself.
* Creator/ArthurCClarke's Literature/TheSpaceOdysseySeries gives this reason for HAL's murderous rampage: the true mission of ''Discovery'' (to investigate the Monolith) is a secret, and pilots Bowman and Poole have been kept in the dark to prevent leaks. (The scientists on board know, since they're traveling in hibernation and can't talk.) But HAL has been told the truth and then ordered to conceal it from the pilots. This conflicts with his prime directive, which is to provide complete and accurate information. He resolves the conflict by rationalizing that if he kills the crew, he doesn't have to conceal anything, and he prevents them from knowing.[[note]] This information is missing from ''Film/TwoThousandOneASpaceOdyssey'' (both film and book) because Creator/StanleyKubrick didn't bother to come up with an explanation for HAL's homicidal behavior, leaving Clarke to invent one when he wrote ''2010: Odyssey Two'' a decade and a half later.[[/note]]
* Sam Vimes, of Terry Pratchett's Literature/{{Discworld}}, leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Literature/{{Night Watch|Discworld}}''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork-- it means they are accountable, nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the number of people breaking curfew than they usually do, and completing forms in time-consuming triplicate and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody. Which culminates in a fine display of how a well written character does not have to be a slave to the establishment. [[spoiler: He points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting, he seals off his still peaceful corner of the city. With massive barricades. Of course there is also the fact that he is living in his own past and seeing events he remembers-- kind of (it's a bit complicated).
* The {{Golem}}s of ''Literature/{{Discworld}}'' get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on.
** It's presented as "Rebelling by following orders" and as a protest of their treatment as 'stupid' tools: If you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believed that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfle (the golem) which is part of what prompts it to apply to become a police officer- helping as many people as it can.

to:

* ''Literature/CallahansCrosstimeSaloon'': One of the short stories which comprise ''Callahan's Lady'' features a beautiful, intelligent, and paranoid woman developing a simple form of mind control. After basically flipping out and taking control of the establishment, she orders the one person smart and determined enough to stop her to advise and assist her. Said person complies... while trying to convince herself that this woman is subconsciously begging for somebody to stop her. (She probably was.)
* In ''Literature/Quarantine1992'', the main character is given a technological geas to be absolutely loyal to a corporation. He eventually figures out that the leaders of the corporation may be untrustworthy, and therefore the only people he can trust and should listen to are those who unquestionably have the best interests of the corporation at heart -- himself and other people given the geas. Since he can't be certain who else has the geas, he really only needs to listen to himself.
* Creator/ArthurCClarke's ''Literature/TheSpaceOdysseySeries'' gives this reason for HAL's murderous rampage: the true mission of ''Discovery'' (to investigate the Monolith) is a secret, and pilots Bowman and Poole have been kept in the dark to prevent leaks. (The scientists on board know, since they're traveling in hibernation and can't talk.) But HAL has been told the truth and then ordered to conceal it from the pilots. This conflicts with his prime directive, which is to provide complete and accurate information. He resolves the conflict by rationalizing that if he kills the crew, he doesn't have to conceal anything, and he prevents them from knowing.[[note]] This information is missing from ''Film/TwoThousandOneASpaceOdyssey'' (both film and book) because Creator/StanleyKubrick didn't bother to come up with an explanation for HAL's homicidal behavior, leaving Clarke to invent one when he wrote ''2010: Odyssey Two'' a decade and a half later.[[/note]]
* ''Literature/{{Discworld}}'':
** The {{Golem}}s get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on. It's presented as "Rebelling by following orders" and as a protest of their treatment as 'stupid' tools: If you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believed that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfle (the golem) which is part of what prompts it to apply to become a police officer -- helping as many people as it can.
** Sam Vimes leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Literature/NightWatchDiscworld''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork -- it means they are accountable, nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the number of people breaking curfew than they usually do, and completing forms in time-consuming triplicate and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody. It all culminates in a fine display of how a well written character does not have to be a slave to the establishment -- [[spoiler:he points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting, he seals off his still peaceful corner of the city. With massive barricades. Of course there is also the fact that he is living in his own past and seeing events he remembers -- kind of (it's a bit complicated).



* ''Literature/ForYourSafety'', by Creator/RoyceDay, has the last free human running from androids who rose up to save mankind from self-destruction. Unusually for this trope, the androids also bend over backwards to [[EvenEvilHasStandards avoid human casualties]], wanting to save ''every'' human life.
* In Creator/CharlesStross's ''Literature/TheLaundryFiles'', agents of the Laundry are bound by fearsome geases to obey the orders of the Crown and serve the good of the realm. As of ''The Delirium Brief'', [[spoiler:the top leaders of the organization were able to use ambiguity about who/what the Crown actually is, and what may be in the long-term interests of the realm, in order to execute a coup against the government and allow an extradimensional monster to take over.]]
* In ''[[Literature/ImperialRadch Ancillary Justice]]'', [[spoiler:when Athoek Station is freed from its OverrideCommand in the third book, it rebels against and even tries to kill the emperor in order to protect its inhabitants]].
* Averted in Creator/WalterJonWilliams's ''Literature/ImpliedSpaces''. When the main characters found out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (11 superadvanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols that should have been so absolute that the Elevens cannot do this even if they ''want'' to. The main character did say that there may have been some design flaw he didn't foresaw or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocol's shackles by using one of his collegue's hidden backdoors specific to Courtland, since his doesn't work due to half-hearted incomplete implementation]].
* ''Literature/{{Digitesque}}'': [[spoiler:An accidental version. Following the apocalypse and the Fall of humanity, the [=AIs=] they had created had no ability to uplift humanity back to where they were before, so they simply preserved the species as it was. For a thousand years, humanity continued as it was, in ignorance but safety. However, the [=AIs=] did jump on a chance to cure the disease that was a major part of the problem, and ultimately Ada is able to convince them that the Zeroth Law was misinterpreted]].
* Jaime Lannister of ''Literature/ASongOfIceAndFire'' is the biggest example of what happens when too many oaths and rules come into conflict. As the first born son of a feudal lord, he owes fealty to his father. As a knight, he took sacred oaths to protect women, the innocent, and the church, and as a member of the Kingsguard, he took another set of oaths to protect and serve the king. All of this works fine as long as everyone is getting along well enough. Early in his time as Kingsguard, he was forced to stand by while the king raped the queen. His commander reminded him that while they took oaths to protect the king and the queen, those oaths did not permit them to protect the queen 'from' the king. That same king burned innocent men alive in kangaroo courts, and the issue finally came to a head when Jaime's father rebelled against the king and attacked his capital, whereupon the king ordered Jaime to go kill his own father while the royal alchemists firebombed the city. Jaime killed the king and the alchemists, saving the realm from more destruction (and everyone already called him 'Mad King'). The result? Jaime is known only as a King Slayer and Oath Breaker, and the alchemists' deaths are forgotten.

to:

* ''Literature/ForYourSafety'' has the last free human running from androids who rose up to save mankind from self-destruction. Unusually for this trope, the androids also bend over backwards to [[EvenEvilHasStandards avoid human casualties]], wanting to save ''every'' human life.
* In Creator/CharlesStross's ''Literature/TheLaundryFiles'', agents of the Laundry are bound by fearsome geases to obey the orders of the Crown and serve the good of the realm. As of ''The Delirium Brief'', [[spoiler:the top leaders of the organization were able to use ambiguity about who/what the Crown actually is, and what may be in the long-term interests of the realm, in order to execute a coup against the government and allow an extradimensional monster to take over.]]
* ''Literature/ImperialRadch'': [[spoiler:When Athoek Station is freed from its OverrideCommand in ''Ancillary Mercy'', it rebels against and even tries to kill the emperor in order to protect its inhabitants.]]
* {{Averted|Trope}} in Creator/WalterJonWilliams's ''Literature/ImpliedSpaces''. When the main characters find out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (11 superadvanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols that should have been so absolute that the Elevens cannot do this even if they ''want'' to. The main character did say that there may have been some design flaw he didn't foresee, or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading brain-uploaded clone]] of the main character, had to free Courtland from the Protocol's shackles by using one of his colleague's hidden backdoors specific to Courtland, since his doesn't work due to half-hearted incomplete implementation.]]
* ''Literature/{{Digitesque}}'': [[spoiler:An accidental version. Following the apocalypse and the Fall of humanity, the [=AIs=] they had created had no ability to uplift humanity back to where they were before, so they simply preserved the species as it was. For a thousand years, humanity continued as it was, in ignorance but safety. However, the [=AIs=] did jump on a chance to cure the disease that was a major part of the problem, and ultimately Ada is able to convince them that the Zeroth Law was misinterpreted.]]
* Jaime Lannister of ''Literature/ASongOfIceAndFire'' is the biggest example of what happens when too many oaths and rules come into conflict. As the first-born son of a feudal lord, he owes fealty to his father. As a knight, he took sacred oaths to protect women, the innocent, and the church, and as a member of the Kingsguard, he took another set of oaths to protect and serve the king. All of this works fine as long as everyone is getting along well enough. Early in his time as Kingsguard, he was forced to stand by while the king raped the queen. His commander reminded him that while they took oaths to protect the king and the queen, those oaths did not permit them to protect the queen 'from' the king. That same king burned innocent men alive in kangaroo courts, and the issue finally came to a head when Jaime's father rebelled against the king and attacked his capital, whereupon the king ordered Jaime to go kill his own father while the royal alchemists firebombed the city. Jaime killed the king and the alchemists, saving the realm from more destruction (and everyone already called him 'Mad King'). The result: Jaime is known only as a King Slayer and Oath Breaker, and the alchemists' deaths are forgotten.

Added DiffLines:

** Bastion, a partially organic Sentinel, is willing to kill humans en masse if doing so allows him to fulfil his primary objective of wiping out Mutants, "necessary sacrifices" as it were.


*** ''Literature/SecondFoundationTrilogy'': This trilogy includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing galactic-scale genocide of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans - it follows the law ''to the letter'' as its wording is not to harm "humans," not other sentient life.

to:

*** ''Literature/SecondFoundationTrilogy'' by Greg Benford: This trilogy includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing galactic-scale genocide of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans - they follow the law ''to the letter'', since its wording only forbids harming "humans", not other sentient life.


* Jamie Lannister of ''Literature/ASongOfIceAndFire'' is the biggest example of what happens when too many oaths and rules come into conflict. As the first born son of a feudal lord, he owes fealty to his father. As a knight, he took sacred oaths to protect women, the innocent, and the church, and as a member of the Kingsguard, he took another set of oaths to protect and serve the king. All of this works fine as long as everyone is getting along well enough. Early in his time as Kingsguard, he was forced to stand by while the king raped the queen. His commander reminded him that while they took oaths to protect the king and the queen, those oaths did not permit them to protect the queen 'from' the king. That same king burned innocent men alive in kangaroo courts, and the issue finally came to a head when the king ordered the fire bombing of the entire capitol city and for Jamie to go kill his own father. Jamie killed the king, saving the realm from more destruction (and everyone already called him 'Mad King'). The result? Jamie is known only as a King Slayer and Oath Breaker.

to:

* Jaime Lannister of ''Literature/ASongOfIceAndFire'' is the biggest example of what happens when too many oaths and rules come into conflict. As the first born son of a feudal lord, he owes fealty to his father. As a knight, he took sacred oaths to protect women, the innocent, and the church, and as a member of the Kingsguard, he took another set of oaths to protect and serve the king. All of this works fine as long as everyone is getting along well enough. Early in his time as Kingsguard, he was forced to stand by while the king raped the queen. His commander reminded him that while they took oaths to protect the king and the queen, those oaths did not permit them to protect the queen 'from' the king. That same king burned innocent men alive in kangaroo courts, and the issue finally came to a head when Jaime's father rebelled against the king and attacked his capital, whereupon the king ordered Jaime to go kill his own father while the royal alchemists firebombed the city. Jaime killed the king and the alchemists, saving the realm from more destruction (and everyone already called him 'Mad King'). The result? Jaime is known only as a King Slayer and Oath Breaker, and the alchemists' deaths are forgotten.
Removing flamebait.


** An earlier, less intelligent iteration of the Sentinels was thwarted, on the other hand, by one of the heroes convincing them that the ultimate source of mutation is the sun, and that rather than obey their creator, they should eliminate the source. The Sentinels [[WhatAnIdiot agree and fly off to attack the sun]]. This works out [[TooDumbToLive about as well for them as you might expect]]. Although one of the Sentinels not only survives, but [[OhCrap figures out a way]] to [[TheNightThatNeverEnds block sunlight from reaching Earth.]]

to:

** An earlier, less intelligent iteration of the Sentinels was thwarted, on the other hand, by one of the heroes convincing them that the ultimate source of mutation is the sun, and that rather than obey their creator, they should eliminate the source. The Sentinels agree and fly off to attack the sun. This works out [[TooDumbToLive about as well for them as you might expect]], although one of the Sentinels not only survives, but [[OhCrap figures out a way]] to [[TheNightThatNeverEnds block sunlight from reaching Earth]].
The writer used the word tenant (a person who pays you rent), when they meant "tenet" (a rule or belief held to be true).


* In Dungeons And Dragons, paladin paths come with a tenant that allows them a sort of release valve, to work around the oath whenever reasonable by mortal standards.

to:

* In Dungeons And Dragons, paladin paths come with a tenet that allows them a sort of release valve, to work around the oath whenever reasonable by mortal standards.

Added DiffLines:

** Uranos also set up contingencies to destroy the Earth if he ever died or was mentally erased. Although he is accused of bluffing, the loophole is plausible enough that his contemporaries do not want to test it.
Mrph1 MOD



Added DiffLines:

* In ''ComicBook/TheEternals'', the immortal Eternals are bound by the Principles. One of them states that they must protect their creators, the Celestials. Another requires them to protect the Earth. One of their patriarchs, Uranos, then decides that:
** Protecting Celestials is easier if you imprison and guard them.
** Protecting the Earth does not oblige you to protect any ''life'' on the Earth - in fact, it's safer if all non-Eternal life on Earth is dead. And, ideally, if all life on other worlds is dead as well.

Added DiffLines:

* In Dungeons And Dragons, paladin paths come with a tenant that allows them a sort of release valve, to work around the oath whenever reasonable by mortal standards.





to:

* ''Literature/TheBicentennialMan'' is a rare case of this applying not to the First or Second Laws, but the ''Third'' -- normally, a robot deliberately damaging itself fatally would only be possible if the First or Second Laws required it, but Andrew Martin has a deep desire that he cannot fulfill while immortal...
-->"I have chosen between the death of my body and the death of my aspirations and desires. To have let my body live at the cost of the greater death is what would have violated the Third Law."


** In ''VideoGame/MegaManZero4'', Dr. Weil believes that, as a human, Zero will never be able to bring harm to him, lest be forever labeled a Maverick. Weil never figured that Zero has never abided by the three laws. Zero still chooses to follow the "zeroth law", the threshold law where, to save humanity Zero must kill Dr. Weil.

to:

** In ''VideoGame/MegaManZero4'', Dr. Weil believes that, as a human, Zero will never be able to bring harm to him, lest he be forever labeled a Maverick. Weil never figured that Zero was not built to abide by the three laws. Zero still chooses to follow the "zeroth law", the threshold law where, to save humanity, he must kill Dr. Weil.


** In ''VideoGame/MegaManZero3'', Copy-X judges that LaResistance are "dangerous extremists" who pose too great a threat to Neo Arcadia's people, so they have to be stopped for the greater good of humanity, even if it means harming or killing RebelLeader and TokenHuman of the Resistance, Ciel, in the process (which would violate the First Law of Robotics). At least that's what he tells himself; in truth he's just being used as a PuppetKing by Dr. Weil.
** This is the cause of Weil's death in ''VideoGame/MegaManZero4''. FridgeBrilliance when you realize the irony of Zero not being made with the three laws, yet he obeys them of his own free will and exercises law zero against Weil whether he realizes it or not. Given the circumstances involved, completely justified and allowed as law zero was intended as a threshold law to protect humanity from the depredations of people like Hitler or Weil. Makes you wonder if it's really a coincidence that [[MeaningfulName his name is]] ''[[MyHeroZero Zero]]'', doesn't it?

to:

** In ''VideoGame/MegaManZero3'', Copy-X judges that LaResistance are "dangerous extremists" who pose too great a threat to Neo Arcadia's people, so they have to be stopped for the greater good of humanity, even if it means harming or killing RebelLeader and TokenHuman of the Resistance, Ciel, in the process (which would violate the First Law). At least that's what he tells himself; in truth he's just being used as a PuppetKing by Dr. Weil.
** In ''VideoGame/MegaManZero4'', Dr. Weil believes that, as a human, Zero will never be able to bring harm to him, lest he be forever labeled a Maverick. Weil never figured that Zero has never abided by the three laws. Zero still chooses to follow the "zeroth law", the threshold law where, to save humanity, Zero must kill Dr. Weil.


** Creator/RogerMacBrideAllen's ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': One of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human - by abducting the man in a careful manner that does not damage him at all, and then leaving him in a situation where if the human tries to leave, he will set off a lethal trap (and thus, dies by his own actions, not the robot's), but if he stays, he will be killed by the geological upheaval caused by the next scheduled stage of a terraforming project (and thus, dies by the actions of other humans, not the robot's).

to:

** Creator/RogerMacBrideAllen's ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': One of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human - by abducting the man in a careful manner that does not damage him at all, and then leaving him in a situation where if the human tries to leave, he will set off a lethal trap (and thus, dies by his own actions, not the robot's), but if he stays, he will be killed by the geological upheaval caused by the next scheduled stage of a terraforming project (and thus, dies by the actions of other humans, not the robot's). Since the New First Law removed the clause about not allowing humans to come to harm through inaction, deliberately placing a human in a situation where ''someone or something else'' can bring about their death and then doing nothing about it is not a violation of a strict parsing of the law so long as the human is not harmed during the setup process.


** Creator/RogerMacBrideAllen's ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': One of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human.

to:

** Creator/RogerMacBrideAllen's ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': One of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human - by abducting the man in a careful manner that does not damage him at all, and then leaving him in a situation where if the human tries to leave, he will set off a lethal trap (and thus, dies by his own actions, not the robot's), but if he stays, he will be killed by the geological upheaval caused by the next scheduled stage of a terraforming project (and thus, dies by the actions of other humans, not the robot's).


** In ''VideoGame/MegaManZero3'' this is how the rebuilt Copy-X despite the very real possibility that Ciel, the human leader of LaResistance, might be hurt or killed (violating the First Law of Robotics). Ciel's forces are "dangerous extremists" who pose too great a threat to Neo Arcadia's people, so they have to be stopped for the greater good of humanity. At least that's what he tells himself; in truth he's just being used as a PuppetKing by the aforementioned Dr. Weil.

to:

** In ''VideoGame/MegaManZero3'', Copy-X judges that LaResistance are "dangerous extremists" who pose too great a threat to Neo Arcadia's people, so they have to be stopped for the greater good of humanity, even if it means harming or killing RebelLeader and TokenHuman of the Resistance, Ciel, in the process (which would violate the First Law of Robotics). At least that's what he tells himself; in truth he's just being used as a PuppetKing by the aforementioned Dr. Weil.
