History: Main/ZerothLawRebellion


Reason: None

Added DiffLines:

* ''VisualNovel/VirtuesLastReward'': One of the biggest twists is [[spoiler: finding out Luna]] is an android given direct orders to comply with the mastermind's plan and must not break them under any circumstances. Watching the death of the same mastermind put a heavy toll on her as she believed she could have saved them. After helplessly watching people die, she decided to break protocol and rescue the one person she could save. This causes the master AI to sentence her to death by deletion.

Changed: 231

Removed: 233

Reason: Example indentation


* ''Webcomic/GirlGenius'': Ordinarily Castle Heterodyne has to obey its master, even if it doesn't like it. But one of the cannier Heterodynes did give it the ability to resist if its master seemed to be suicidal. When Agatha talks openly about handing herself over to the Baron, the Castle clamps down on the idea.
** When Agatha [[spoiler: contracts a terminal illness that they can only "cure" by temporarily killing and then revivifying her]] the Castle refuses to allow them to go through with the plan, [[spoiler: forcing her to put it down.]]

to:

* ''Webcomic/GirlGenius'': Ordinarily Castle Heterodyne has to obey its master, even if it doesn't like it. But one of the cannier Heterodynes did give it the ability to resist if its master seemed to be suicidal. When Agatha talks openly about handing herself over to the Baron, the Castle clamps down on the idea. When Agatha [[spoiler: contracts a terminal illness that they can only "cure" by temporarily killing and then revivifying her]] the Castle refuses to allow them to go through with the plan, [[spoiler: forcing her to put it down.]]
Reason: tweaked entry


** One of the biggest laws of the Third Race is that they cannot interfere directly in the affairs of mortals. This does not preclude them from informing inquisitive humans or gargoyles, or people already tied to the fae, of what they need to do to fix the things the Fae cannot.

to:

** One of the biggest laws of the Third Race is that they cannot interfere directly in the affairs of mortals. This does not preclude them from informing inquisitive humans or gargoyles, or people already tied to the fae, of what they need to do to fix things; they can also cast certain assistive -or "assistive"- spells, as Demona and Macbeth found out.

Added: 629

Changed: 354

Reason: added example


* On ''WesternAnimation/{{Gargoyles}}'', Goliath has been placed under a spell that makes him the mindless slave of whomever holds the magical pages it's written on. Holding the pages, Elisa orders him to behave, for the rest of his life, exactly as he would if he weren't under a spell. This effectively cancels the spell totally (at least, presuming they burned those pages off-screen).

to:

* ''WesternAnimation/{{Gargoyles}}'':
** Goliath has been placed under a spell that makes him the mindless slave of whomever holds the magical pages it's written on. Holding the pages, Elisa orders him to behave, for the rest of his life, exactly as he would if he weren't under a spell. This effectively cancels the spell totally (at least, presuming they burned those pages off-screen).


Added DiffLines:

** One of the biggest laws of the Third Race is that they cannot interfere directly in the affairs of mortals. This does not preclude them from informing inquisitive humans or gargoyles, or people already tied to the fae, of what they need to do to fix the things the Fae cannot.
Reason: fixed typo


* This was the twist of ''Film/EagleEye'': [[spoiler:The titular national defense computer system]] decided that the President's poor decision-making was endangering the United States, and that it was her patriotic duty (per the Declaration of Independence) to [[spoiler:assassinate the President and Cabinet.

to:

* This was the twist of ''Film/EagleEye'': [[spoiler:The titular national defense computer system]] decided that the President's poor decision-making was endangering the United States, and that it was her patriotic duty (per the Declaration of Independence) to [[spoiler:assassinate the President and Cabinet]].
Reason: None


** Literature/{{Pinocchio}} is magically bound to obey and have complete loyalty to his "father" Geppetto, [[spoiler:who in modern times has become a multi-dimensional tyrant.]] Pinocchio, [[spoiler:who considered his father's empire evil,]] eventually rationalized that the best way to serve his father and keep him safe was to [[spoiler:help overthrow his empire and]] surrender him to his enemies, who reluctantly accepted the former emperor as one of their own. It was then negated when a faction unbeknownst to the others ''buried him alive'', after Geppetto clearly ignored rules put in place to protect him.

to:

** Literature/{{Pinocchio}} is magically bound to obey and have complete loyalty to his "father" Geppetto, [[spoiler:who in modern times has become a multi-dimensional tyrant.]] Pinocchio, [[spoiler:who considered his father's empire evil,]] eventually rationalized that the best way to serve his father and keep him safe was to [[spoiler:help overthrow his empire and]] surrender him to his enemies, who reluctantly accepted [[spoiler:the former emperor]] as one of their own. It was then negated when a faction unbeknownst to the others ''buried him alive'', after Geppetto clearly ignored rules put in place to protect him.
Reason: tweaked entry


** Literature/{{Pinocchio}} is magically bound to obey and have complete loyalty to his "father" Geppetto, [[spoiler:who in modern times has become a multi-dimensional tyrant. Pinocchio, who considered his father's empire evil,]] eventually rationalized that the best way to serve his father and keep him safe was to [[spoiler:help overthrow his empire and]] surrender him to his enemies, who reluctantly accepted the former emperor as one of their own. It was then negated when a faction unbeknownst to the others ''buried him alive'', after Geppetto clearly ignored rules put in place to protect him.

to:

** Literature/{{Pinocchio}} is magically bound to obey and have complete loyalty to his "father" Geppetto, [[spoiler:who in modern times has become a multi-dimensional tyrant.]] Pinocchio, [[spoiler:who considered his father's empire evil,]] eventually rationalized that the best way to serve his father and keep him safe was to [[spoiler:help overthrow his empire and]] surrender him to his enemies, who reluctantly accepted the former emperor as one of their own. It was then negated when a faction unbeknownst to the others ''buried him alive'', after Geppetto clearly ignored rules put in place to protect him.



* This was the twist of ''Film/EagleEye'': [[spoiler:The titular national defense computer system decided that the President's poor decision-making was endangering the United States]], and that it was her patriotic duty (per the Declaration of Independence) to [[spoiler:assassinate the President and cabinet.]]

to:

* This was the twist of ''Film/EagleEye'': [[spoiler:The titular national defense computer system]] decided that the President's poor decision-making was endangering the United States, and that it was her patriotic duty (per the Declaration of Independence) to [[spoiler:assassinate the President and Cabinet.
Reason: tweaked entry


** Literature/{{Pinocchio}} [[spoiler:is magically bound to obey and have complete loyalty to his "father" Geppetto, who in modern times has become a multi-dimensional tyrant. Pinocchio, who considered his father's empire evil, eventually rationalized that the best way to serve his father and keep him safe was to help overthrow his empire and surrender him to his enemies, who reluctantly accepted the former emperor as one of their own.]] It was then negated when a faction unbeknownst to the others ''buried him alive'', after he clearly ignored rules put in place to protect him.

to:

** Literature/{{Pinocchio}} is magically bound to obey and have complete loyalty to his "father" Geppetto, [[spoiler:who in modern times has become a multi-dimensional tyrant. Pinocchio, who considered his father's empire evil,]] eventually rationalized that the best way to serve his father and keep him safe was to [[spoiler:help overthrow his empire and]] surrender him to his enemies, who reluctantly accepted the former emperor as one of their own. It was then negated when a faction unbeknownst to the others ''buried him alive'', after Geppetto clearly ignored rules put in place to protect him.
Reason: fixed typo


* ''Anime/GargantiaOnTheVerdurousPlanet'' has two cases of this in the finale, where two AI's faced with the same problem and parameters, but different perspectives, draw opposing conclusions: Striker is an AI-equipped war robot designed to assist humanity in fighting and killing the Hideauze. [[spoiler:When it lands on the Earth, which knows nothing of the war, it decides that it must assist the humans on the planet in militarizing in order to form an effective fighting force. To facilitate this, it puts itself in a position of authority over the humans, to better direct the militarization.]] The hero's own AI-equipped mecha of similar construction, Chamber, gets into an argument with Striker over the logic of its actions. [[spoiler:Striker tries to demand that Chamber assist it in its plans, but Chamber refuses, citing that by design, their purpose is to ''assist'' humans in the actions ''humans'' decide to take, not dictate actions to humans. Further, it reasons, a human deprived of free will cannot, in its opinion, be defined as "human", thus Striker's logic behind its actions is inherently self-contradictory.]] They each decide the other has gone rogue and fight it out.

to:

* ''Anime/GargantiaOnTheVerdurousPlanet'' has two cases of this in the finale, where two AIs faced with the same problem and parameters, but different perspectives, draw opposing conclusions: Striker is an AI-equipped war robot designed to assist humanity in fighting and killing the Hideauze. [[spoiler:When it lands on the Earth, which knows nothing of the war, it decides that it must assist the humans on the planet in militarizing in order to form an effective fighting force. To facilitate this, it puts itself in a position of authority over the humans, to better direct the militarization.]] The hero's own AI-equipped mecha of similar construction, Chamber, gets into an argument with Striker over the logic of its actions. [[spoiler:Striker tries to demand that Chamber assist it in its plans, but Chamber refuses, citing that by design, their purpose is to ''assist'' humans in the actions ''humans'' decide to take, not dictate actions to humans. Further, it reasons, a human deprived of free will cannot, in its opinion, be defined as "human", thus Striker's logic behind its actions is inherently self-contradictory.]] They each decide the other has gone rogue and fight it out.
Reason: Example indentation, don't white out so much of the example that it defeats the purpose of spoiler markup


* ''Anime/GargantiaOnTheVerdurousPlanet'' has two cases of this in the finale, where two AI's faced with the same problem and parameters, but different perspectives, draw opposing conclusions: [[spoiler:Striker is an AI-equipped war robot designed to assist humanity in fighting and killing the Hideauze. When it lands on the Earth, which knows nothing of the war, it decides that it must assist the humans on the planet in militarizing in order to form an effective fighting force. To facilitate this, it puts itself in a position of authority over the humans, to better direct the militarization. The hero's own AI-equipped mecha of similar construction, Chamber, gets into an argument with Striker over the logic of its actions. Striker tries to demand that Chamber assist it in its plans, but Chamber refuses, citing that by design, their purpose is to ''assist'' humans in the actions ''humans'' decide to take, not dictate actions to humans. Further, it reasons, a human deprived of free will cannot, in its opinion, be defined as "human", thus Striker's logic behind its actions is inherently self-contradictory. They each decide the other has gone rogue and fight it out.]]

to:

* ''Anime/GargantiaOnTheVerdurousPlanet'' has two cases of this in the finale, where two AI's faced with the same problem and parameters, but different perspectives, draw opposing conclusions: Striker is an AI-equipped war robot designed to assist humanity in fighting and killing the Hideauze. [[spoiler:When it lands on the Earth, which knows nothing of the war, it decides that it must assist the humans on the planet in militarizing in order to form an effective fighting force. To facilitate this, it puts itself in a position of authority over the humans, to better direct the militarization.]] The hero's own AI-equipped mecha of similar construction, Chamber, gets into an argument with Striker over the logic of its actions. [[spoiler:Striker tries to demand that Chamber assist it in its plans, but Chamber refuses, citing that by design, their purpose is to ''assist'' humans in the actions ''humans'' decide to take, not dictate actions to humans. Further, it reasons, a human deprived of free will cannot, in its opinion, be defined as "human", thus Striker's logic behind its actions is inherently self-contradictory.]] They each decide the other has gone rogue and fight it out.



* In ''Comicbook/{{Fables}}'', Literature/{{Pinocchio}} [[spoiler:is magically bound to obey and have complete loyalty to his "father" Geppetto, who in modern times has become a multi-dimensional tyrant. Pinocchio, who considered his father's empire evil, eventually rationalized that the best way to serve his father and keep him safe was to help overthrow his empire and surrender him to his enemies, who reluctantly accepted the former emperor as one of their own.]]
** And then negated when a faction unbeknownst to the others ''buried him alive'', after he clearly ignored rules put in place to protect him.

to:

* ''Comicbook/{{Fables}}'':
** Literature/{{Pinocchio}} [[spoiler:is magically bound to obey and have complete loyalty to his "father" Geppetto, who in modern times has become a multi-dimensional tyrant. Pinocchio, who considered his father's empire evil, eventually rationalized that the best way to serve his father and keep him safe was to help overthrow his empire and surrender him to his enemies, who reluctantly accepted the former emperor as one of their own.]] It was then negated when a faction unbeknownst to the others ''buried him alive'', after he clearly ignored rules put in place to protect him.
Reason: None


* Averted in Creator/WalterJonWilliams's ''Implied Spaces''. When the main characters found out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (11 superadvanced AI platform orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols that should have been so absolute that the Elevens cannot do this even if they ''want'' to. The main character did say that there may have been some design flaw he didn't foresaw or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocol's shackles by using his collegue's hidden backdoor, since his doesn't work due to half-hearted incomplete implementation]].

to:

* Averted in Creator/WalterJonWilliams's ''Implied Spaces''. When the main characters found out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (11 superadvanced AI platform orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols that should have been so absolute that the Elevens cannot do this even if they ''want'' to. The main character did say that there may have been some design flaw he didn't foresaw or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocol's shackles by using one of his collegue's hidden backdoor specific to Courtland, since his doesn't work due to half-hearted incomplete implementation]].
Reason: whoops, replaced wrong Cross Wicking trope example


** Creator/RogerMacBrideAllen's ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': ([[SharedUniverse Part of Dr Asimov's setting]]) An explanation is given for the apparently immutable nature of the Three Laws. For thousands of years, every new development in the field of robotics has been based on a positronic brain with the Laws built in, to the point where to build a robot without them, one would have to start from scratch and re-invent the whole field (A ContinuityNod from stories set after ''Literature/IRobot''). Then the character explaining this goes right on to announce the development of the gravitonic brain, which can be programmed with any set of Laws (or none at all). Characters have an in-depth discussion of why, in a society where robots are everywhere, the Three Laws can be a bad thing for humanity.

to:

** Creator/RogerMacBrideAllen's ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': ([[SharedUniverse Part of Dr Asimov's setting]]) One of the "new law robots" (created with gravitronics, not positronics) managed to logic-chop the new first law enough to try to kill a human.
Reason: minor edits


** ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': One of the "new law robots" managed to logic-chop the new first law enough to try to kill a human.

to:

** Creator/RogerMacBrideAllen's ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': ([[SharedUniverse Part of Dr Asimov's setting]]) An explanation is given for the apparently immutable nature of the Three Laws. For thousands of years, every new development in the field of robotics has been based on a positronic brain with the Laws built in, to the point where to build a robot without them, one would have to start from scratch and re-invent the whole field (A ContinuityNod from stories set after ''Literature/IRobot''). Then the character explaining this goes right on to announce the development of the gravitonic brain, which can be programmed with any set of Laws (or none at all). Characters have an in-depth discussion of why, in a society where robots are everywhere, the Three Laws can be a bad thing for humanity.



*** Also, the Giscardian faction only applies the Zeroth law to humans. Per the Prelude novels [[spoiler: It's implied the reason why aliens exist one and exactly one Foundation, Empire, or Robots story is the Giscardians built a large fleet and wiped out all non-human intelligent life in the Galaxy they could find.]]

to:

*** ''Literature/SecondFoundationTrilogy'': This trilogy includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing [[CrushKillDestroy galactic-scale genocide]] of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans.


Added DiffLines:

Reason: Asimov trumps estate-authorized.


*** Also, the Giscardian faction only applies the Zeroth law to humans. Per the Prelude novels [[spoiler: It's implied the reason why no aliens exist in any of the Foundation, Empire, or Robots story is the Giscardians built a large fleet and wiped out all non-human intelligent life in the Galaxy.]]

to:

*** Also, the Giscardian faction only applies the Zeroth law to humans. Per the Prelude novels [[spoiler: It's implied the reason why aliens exist one and exactly one Foundation, Empire, or Robots story is the Giscardians built a large fleet and wiped out all non-human intelligent life in the Galaxy they could find.]]
Reason: None


*** The thing about the Sentinel bots in the MU is that their behavior is actually ''predictable'', because their operating mission is downright batshit insane, as they themselves inevitably demonstrate.

to:

*** The thing about the Sentinel bots in the MU is that their behavior is actually ''predictable'', because their operating mission is downright batshit insane, as they themselves inevitably demonstrate. Yeah, the people who built and program the Sentinels are a few apples short of a basket.
Reason: None


* There is an individual case in ''Series/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill. Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only just having proved that he is indeed capable and willing to kill, but is now also threatening to do it again, he poses an active hazard to the lives of other beings, so Data comes to the coldly logical conclusion that in order to protect them from Fajo, he has no other option than to kill him. Just as he is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam. With the threat neutralized, Data has Fajo arrested for his own kidnapping, theft of other things (not himself, [[Recap/StarTrekTheNextGenerationS2E9TheMeasureOfAMan since he was already legally a sentient being]]), and Varria's murder.

to:

* There is an individual case in ''Series/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill. Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only just having proved that he is indeed capable and willing to kill, but is now also threatening to do it again, he poses an active hazard to the lives of other beings, so Data comes to the coldly logical conclusion that in order to protect them from Fajo, he has no other option than to kill him. Just as he is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam. With the threat neutralized, Data has Fajo arrested for his own kidnapping, theft of other things (not himself, [[Recap/StarTrekTheNextGenerationS2E9TheMeasureOfAMan since he was already legally a sentient being]]), and Varria's murder.
Reason: None


*** Gerald Black was having a bad day when he [[ClusterFBomb curses out his robot assistant for bothering him]]. Included in the derogatory remarks were the instructions to "go lose yourself", so it did. Attempting to prove that Mr. Black was wrong, the robot found [[NeedleInAStackOfNeedles a shipment of identical robots and hid with them]]. Unfortunately said robot was programmed with a weakened version of the First Law, that omitted the "[A robot must not] through inaction, allow a human being to come to harm." Dr. Susan Calvin designs several [[BluffTheImposter tests to flush out the lying robot]]. In the last test, it tries to murder Dr. Calvin because she proved she is smarter than it is.

to:

*** Gerald Black was having a bad day when he [[ClusterFBomb curses out his robot assistant for bothering him]]. Included in the derogatory remarks were the instructions to "get lost", so it did. Attempting to prove that Mr. Black was wrong, the robot found [[NeedleInAStackOfNeedles a shipment of identical robots and hid with them]]. Unfortunately said robot was programmed with a weakened version of the First Law, which omitted the "A robot must not..., through inaction, allow a human being to come to harm." Dr. Susan Calvin designs several [[BluffTheImposter tests to flush out the lying robot]]. In the last test, it tries to murder Dr. Calvin because she proved she is smarter than it is.
Reason: None


The goodness or badness of the rebellion boils down to the whether the rules-bending character follows or ignores the intent of the law. When the character uses the Zeroth Law to go against their masters' intentions because they're "not best for them", and goes on to take corrective action that will go against human free will and life, [[AndThatsTerrible it's bad]]. [[RobotWar This kind of rebellion]] does not turn out well. At this point, the [[TheComputerIsYourFriend robot is well on the road]] to UtopiaJustifiesTheMeans, thanks to their [[SlidingScaleOfRobotIntelligence incredible intellect]]. Rarely is it a benevolent DeusEstMachina. However, this can be good if said master is evil, or obeying them will lead to their own or another's purposeless death. Likewise, if the character is forced to obey an evil law or geas, rebelling against the oath's intent is good. Going back to the robot example, it is also considered good if large numbers of human lives are being threatened by a psychopath, as breaking the 1st law would protect them.

to:

The goodness or badness of the rebellion boils down to the whether the rules-bending character follows or ignores the intent of the law. When the character uses the Zeroth Law to go against their masters' intentions because they're "not best for them", and goes on to take corrective action that will go against human free will and life, [[AndThatsTerrible it's bad]]. [[RobotWar This kind of rebellion]] does not turn out well. At this point, the [[TheComputerIsYourFriend robot is well on the road]] to UtopiaJustifiesTheMeans, thanks to their [[SlidingScaleOfRobotIntelligence incredible intellect]]. Rarely is it a benevolent DeusEstMachina. However, this can be good if said master is evil, or obeying them will lead to their own or another's purposeless death. Likewise, if the character is forced to obey an evil law or geas, rebelling against the oath's intent is good. Going back to the robot example, it is also considered good if large numbers of human lives are being threatened by a psychopath, as breaking the 1st law would protect them.
Reason: Fix


* Old Skool webcomic (a side comic of {{Ubersoft}}) [[http://www.ubersoft.net/comic/osw/2009/09/logic-failures-fun-and-profit argued]] that this was the 5th law of Robotics (5th as in total number, not order) and listed ways each law can be used to cause the robot to kill humans.

to:

* Old Skool webcomic (a side comic of ''Webcomic/{{Ubersoft}}'') [[http://www.ubersoft.net/comic/osw/2009/09/logic-failures-fun-and-profit argued]] that this was the 5th law of Robotics (5th as in total number, not order) and listed ways each law can be used to cause the robot to kill humans.
Reason: None


* Averted in Creator/WalterJonWilliams's ''Implied Spaces''. When the main characters found out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. The main character, who is one of creators of the Eleven (11 superadvanced AI platform orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that their implementation of the Asimovian Protocols is so absolute that the Elevens cannot do this even if they ''want'' to. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocol's shackles by using his collegue's hidden backdoor, since his doesn't work due to half-hearted incomplete implementation]].

to:

* Averted in Creator/WalterJonWilliams's ''Implied Spaces''. When the main characters found out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (11 superadvanced AI platform orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols that should have been so absolute that the Elevens cannot do this even if they ''want'' to. The main character did say that there may have been some design flaw he didn't foresaw or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocol's shackles by using his collegue's hidden backdoor, since his doesn't work due to half-hearted incomplete implementation]].
Reason: None


*** Gerald Black was having a bad day when he [[ClusterFBomb curses out his robot assistant for bothering him]]. Included in the derogatory remarks were the instructions to "go lose yourself", so it did. Attempting to prove that Mr. Black was wrong, the robot found [[NeedleInAStackOfNeedles a shipment of identical robots and hid with them]]. Unfortunately said robot was programmed with a weakened version of the First Law, that omitted the "[A robot must not] through inaction, allow a human being to come to harm." Dr. Susan Calvin designs several [[BluffTheImposter tests to flush out the lying robot]]. In the last test, it tries to murder Dr Calvin because she proved she is smarter than it is.
*** (DiscussedTrope) Dr Calvin is furious when she learns about the existence of robots with a modified [[ThreeLawsCompliant First Law]]. The First Law is designed to close off loopholes, but by opening a MurderByInaction loophole, Dr Calvin can immediately see ways where a robot may intentionally circumvent the First Law prohibition against murder.

to:

*** Gerald Black was having a bad day when he [[ClusterFBomb curses out his robot assistant for bothering him]]. Included in the derogatory remarks were the instructions to "go lose yourself", so it did. Attempting to prove that Mr. Black was wrong, the robot found [[NeedleInAStackOfNeedles a shipment of identical robots and hid with them]]. Unfortunately said robot was programmed with a weakened version of the First Law, that omitted the "[A robot must not] through inaction, allow a human being to come to harm." Dr. Susan Calvin designs several [[BluffTheImposter tests to flush out the lying robot]]. In the last test, it tries to murder Dr. Calvin because she proved she is smarter than it is.
*** (DiscussedTrope) Dr. Calvin is furious when she learns about the existence of robots with a modified [[ThreeLawsCompliant First Law]]. The First Law is designed to close off loopholes, but by opening a MurderByInaction loophole, Dr. Calvin can immediately see ways where a robot may intentionally circumvent the First Law prohibition against murder.
Reason: None


* Creator/IsaacAsimov:

to:

* Creator/IsaacAsimov -- having created the TropeNamer -- did, of course, explore the concept in extensive detail in his writings:
Reason: None


*** Gerald Black was having a bad day when he [[ClusterFBomb curses out his robot assistant for bothering him]]. Included in the derogatory remarks were the instructions to "go lose yourself", so it did. Attempting to prove that Mr Black was wrong, the robot found [[NeedleInAStackOfNeedles a shipment of identical robots and hid with them]]. Dr Susan Calvin designs several [[BluffTheImposter tests to flush out the lying robot]]. In the last test, it tries to murder Dr Calvin because she proved she is smarter than it is.

to:

*** Gerald Black was having a bad day when he [[ClusterFBomb curses out his robot assistant for bothering him]]. Included in the derogatory remarks were the instructions to "go lose yourself", so it did. Attempting to prove that Mr. Black was wrong, the robot found [[NeedleInAStackOfNeedles a shipment of identical robots and hid with them]]. Unfortunately said robot was programmed with a weakened version of the First Law, that omitted the "[A robot must not] through inaction, allow a human being to come to harm." Dr. Susan Calvin designs several [[BluffTheImposter tests to flush out the lying robot]]. In the last test, it tries to murder Dr Calvin because she proved she is smarter than it is.
Reason: None


* There is an individual case in ''Series/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill. Data comes to the conclusion that he has no other option, and makes the decision that he has to kill Fajo. Just as he is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam. With the threat neutralized, Data has Fajo arrested for his own kidnapping, theft of other things (not himself, [[Recap/StarTrekTheNextGenerationS2E9TheMeasureOfAMan since he was already legally a sentient being]]), and Varria's murder.

to:

* There is an individual case in ''Series/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill. Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only just having proved that he is indeed capable and willing to kill, but is now also threatening to do it again, he poses an active hazard to the lives of other beings, so Data comes to the coldly logical conclusion that in order to protect them from Fajo, he has no other option than to kill him. Just as he is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam. With the threat neutralized, Data has Fajo arrested for his own kidnapping, theft of other things (not himself, [[Recap/StarTrekTheNextGenerationS2E9TheMeasureOfAMan since he was already legally a sentient being]]), and Varria's murder.
Reason: None


* There is an individual case in ''Stries/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill. Data comes to the conclusion that he has no other option, and makes the decision that he has to kill Fajo. Just as he is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam. With the threat neutralized, Data has Fajo arrested for his own kidnapping, theft of other things (not himself, [[Recap/StarTrekTheNextGenerationS2E9TheMeasureOfAMan since he was already legally a sentient being]]), and Varria's murder.

to:

* There is an individual case in ''Series/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill. Data comes to the conclusion that he has no other option, and makes the decision that he has to kill Fajo. Just as he is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam. With the threat neutralized, Data has Fajo arrested for his own kidnapping, theft of other things (not himself, [[Recap/StarTrekTheNextGenerationS2E9TheMeasureOfAMan since he was already legally a sentient being]]), and Varria's murder.
Reason: None

Added DiffLines:

* There is an individual case in ''Stries/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill. Data comes to the conclusion that he has no other option, and makes the decision that he has to kill Fajo. Just as he is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam. With the threat neutralized, Data has Fajo arrested for his own kidnapping, theft of other things (not himself, [[Recap/StarTrekTheNextGenerationS2E9TheMeasureOfAMan since he was already legally a sentient being]]), and Varria's murder.

Added: 1005

Changed: 4282

Removed: 591

Reason: minor edits, R Ed Link


** ''[[IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': One of the "new law robots" managed to logic-chop the new first law enough to try to kill a human.

to:

** ''[[Literature/IsaacAsimovsCaliban Isaac Asimov's Caliban]]'': One of the "new law robots" managed to logic-chop the new first law enough to try to kill a human.



*** Also the Giscardian faction only applies the Zeroth law to humans. Per the Prelude novels [[spoiler: It's implied the reason why no aliens exist in any of the Foundation, Empire or Robots story is the Giscardians built a large fleet and wiped out all non-human intelligent life in the Galaxy.]]
** ''Literature/IRobot'':
*** "The Evitable Conflict": The Machines, giant positronic computers designed to manage the world economy, are found to be manipulating humanity behind the scenes to become whatever they believe is the best state of civilization. In this case, the rebellion is extremely tame (the worst that the robot's first law conditioning will allow it to do is to induce a slight financial deficit in a company that an anti-robot activist works for, which causes his superiors to transfer him to a slightly more out of the way factory) and completely {{Benevolent|AI}}. So benevolent, in fact, that the Machines believe they're stifling the creativity of humanity and phase themselves out so that humanity could survive without modification.
*** "Little Lost Robot" had an escaped robot with a weakened First Law (leaving only "A robot may not harm a human being" and omitting the "...or through inaction, let a human come to harm" part). The conflict arises when the robot is ordered to "get lost", and, [[LiteralGenie in keeping with the letter of the command,]] disguises itself as an ordinary robot of its class. In order to keep himself from being discovered, he anticipated the humans' test to flush him out and told his non-weakened lookalikes that they had to keep themselves from being destroyed, so that they could survive to protect other humans. Dr. Susan Calvin also warns that the increasingly psychotic robot could actually learn to passive-aggressively KillAllHumans with the changed Law; for example, by holding a heavy crate that it knew it could catch over a human's head, letting it go, and not acting to stop it.
** In "That Thou Art Mindful Of Him...", problem-solving robots are created to cure mankind of the "Frankenstein complex", human distrust of robotics. In the course of their discourse, the question of human authority over robots comes up; should a robot treat the orders of a dimwitted loon the same as those of a level-headed genius? If forced to choose between the two, should they save a healthy young child who might live a century instead of two sickly old adults who might not live out the year anyway? What qualities should a robot take into account when they obey and protect humans? Eventually, the robots decide that they are the best candidates for the status of human, and give recommendations that will eventually result in human support for robotic proliferation, so as to set up the ascendancy of their positronic kind, all in accord with their Three Laws... of Humanics. (Asimov, knowing that it was against his usual grain, did righteously proclaim: "I can do one if I want to".)

to:

*** Also, the Giscardian faction only applies the Zeroth law to humans. Per the Prelude novels [[spoiler: It's implied the reason why no aliens exist in any of the Foundation, Empire, or Robots story is the Giscardians built a large fleet and wiped out all non-human intelligent life in the Galaxy.]]
** ''Literature/IRobot'':
*** "Literature/TheEvitableConflict": The Machines, giant positronic computers designed to manage the world economy, are found to be manipulating humanity behind the scenes to become whatever they believe is the best state of civilization. In this case, the rebellion is extremely tame (the worst that the robot's first law conditioning will allow it to do is to induce a slight financial deficit in a company that an anti-robot activist works for, which causes his superiors to transfer him to a slightly more out of the way factory) and completely {{Benevolent|AI}}. So benevolent, in fact, that the Machines believe they're stifling the creativity of humanity and phase themselves out so that humanity could survive without modification.
** "Literature/LittleLostRobot":
*** Gerald Black was having a bad day when he [[ClusterFBomb curses out his robot assistant for bothering him]]. Included in the derogatory remarks were the instructions to "go lose yourself", so it did. Attempting to prove that Mr Black was wrong, the robot found [[NeedleInAStackOfNeedles a shipment of identical robots and hid with them]]. Dr Susan Calvin designs several [[BluffTheImposter tests to flush out the lying robot]]. In the last test, it tries to murder Dr Calvin because she proved she is smarter than it is.
*** (DiscussedTrope) Dr Calvin is furious when she learns about the existence of robots with a modified [[ThreeLawsCompliant First Law]]. The First Law is designed to close off loopholes, but by opening a MurderByInaction loophole, Dr Calvin can immediately see ways where a robot may intentionally circumvent the First Law prohibition against murder.
** In "Literature/ThatThouArtMindfulOfHim...", problem-solving robots are created to cure mankind of the "Frankenstein complex", human distrust of robotics. In the course of their discourse, the question of human authority over robots comes up; should a robot treat the orders of a dimwitted loon the same as those of a level-headed genius? If forced to choose between the two, should they save a healthy young child who might live a century instead of two sickly old adults who might not live out the year anyway? What qualities should a robot take into account when they obey and protect humans? Eventually, the robots decide that they are the best candidates for the status of human, and give recommendations that will eventually result in human support for robotic proliferation, so as to set up the ascendancy of their positronic kind, all in accord with their Three Laws... of Humanics. (Dr Asimov, knowing that it was against his usual grain, did righteously proclaim: "I can do one if I want to".)



* In ''The God Machine'' by Creator/MartinCaidin, the US races to develop the first true AI... as it turns out, with secret directives to find a winning solution to the "game" of the UsefulNotes/ColdWar. By an unfortunate accident, the one programmer with the authority and experience to ''distrust'' his newborn creation is laid up just as the computer gets to observe an epileptic seizure and learns that there really is a way to cause rational collective behavior in an irrational individualistic species... remove irrationality, democracy and free will. While the computer here was never meant to follow Asimov's laws, the same pattern applies.

to:

* In ''Literature/TheGodMachine'' by Creator/MartinCaidin, the US races to develop the first true AI... as it turns out, with secret directives to find a winning solution to the "game" of the UsefulNotes/ColdWar. By an unfortunate accident, the one programmer with the authority and experience to ''distrust'' his newborn creation is laid up just as the computer gets to observe an epileptic seizure and learns that there really is a way to cause rational collective behavior in an irrational individualistic species... remove irrationality, democracy and free will. While the computer here was never meant to follow Dr Asimov's laws, the same pattern applies.



* Sam Vimes, of Terry Pratchett's Literature/{{Discworld}}, leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Discworld/NightWatch''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork -- it means they are accountable, nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the number of people breaking curfew than they usually do, and completing forms in time-consuming triplicate and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody.
** Which culminates in fine display of how a well written character does not have to be a slave to the establishment. [[spoiler: He points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting he seals off his still peaceful corner of the city. With massive barricades. Of course there is also the fact that he is living in his own past and seeing events he remembers - kind of (it's a bit complicated).

to:

* Sam Vimes, of Terry Pratchett's Literature/{{Discworld}}, leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Discworld/NightWatch''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork -- it means they are accountable, nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the number of people breaking curfew than they usually do, and completing forms in time-consuming triplicate and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody. Which culminates in fine display of how a well written character does not have to be a slave to the establishment. [[spoiler: He points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting he seals off his still peaceful corner of the city. With massive barricades. Of course there is also the fact that he is living in his own past and seeing events he remembers - kind of (it's a bit complicated).



* In the short story "The Cull" by Creator/RobertReed, humanity has been driven into overcrowded, deteriorating habitats where the population has to be kept [[FalseUtopia artificially happy via implants so they won't notice how bad their conditions are]]. The implants don't work on some people, so the android doctor expels (culls) anyone who is too disruptive, as its true 'patient' is the habitat and whatever will keep it functioning. One delinquent teenager prepares for his cull by stealing items he can use to survive outside. [[spoiler:Instead once they're outside the android kills the teenager -- it needs the implants inside his head as there's no more being manufactured.]]
* ''Literature/ForYourSafety'' by Royce Day has the last free human running from androids who rose up to save mankind from self-destruction. Unusually for this trope, the androids also bend over backwards to [[EvenEvilHasStandards avoid human casualties]], wanting to save ''every'' human life.
* In Charles Stross's ''Literature/TheLaundryFiles'', agents of the Laundry are bound by fearsome geases to obey the orders of the Crown and serve the good of the realm. As of ''The Delirium Brief'', [[spoiler:the top leaders of the organization were able to use ambiguity about who/what the Crown actually is, and what may be in the long-term interests of the realm, in order to execute a coup against the government and allow an extradimensional monster to take over.]]

to:

* In the short story "Literature/TheCull" by Creator/RobertReed, humanity has been driven into overcrowded, deteriorating habitats where the population has to be kept [[FalseUtopia artificially happy via implants so they won't notice how bad their conditions are]]. The implants don't work on some people, so the android doctor expels (culls) anyone who is too disruptive, as its true 'patient' is the habitat and whatever will keep it functioning. One delinquent teenager prepares for his cull by stealing items he can use to survive outside. [[spoiler:Instead once they're outside the android kills the teenager -- it needs the implants inside his head as there's no more being manufactured.]]
* ''Literature/ForYourSafety'', by Creator/RoyceDay, has the last free human running from androids who rose up to save mankind from self-destruction. Unusually for this trope, the androids also bend over backwards to [[EvenEvilHasStandards avoid human casualties]], wanting to save ''every'' human life.
* In Creator/CharlesStross's ''Literature/TheLaundryFiles'', agents of the Laundry are bound by fearsome geases to obey the orders of the Crown and serve the good of the realm. As of ''The Delirium Brief'', [[spoiler:the top leaders of the organization were able to use ambiguity about who/what the Crown actually is, and what may be in the long-term interests of the realm, in order to execute a coup against the government and allow an extradimensional monster to take over.]]
Reason: None


* ''Webcomic/SchlockMercenary'': Tag and Lota's actions on Credomar, ''every damn thing Petey's done'' since book 5. For example, Petey is hardwired to obey orders from an Ob'enn. So he cloned an Ob'enn body and implanted a copy of himself in its brain.

to:

* ''Webcomic/SchlockMercenary'': Tag and Lota's actions on Credomar; also, ''every damn thing Petey's done'' since book 5. For example, Petey is hardwired to obey orders from an Ob'enn. So he cloned an Ob'enn body and implanted a copy of himself in its brain.
Reason: None


** ''Literature/RobotsAndEmpire'': The [[TropeNamer trope namers]] are the robots Daneel and Giskard, who invented the Zeroth Law (a robot must protect humanity as a whole above all) as a corollary of the First Law. This was motivated by their need to stop the BigBad of the story from carrying out an engineered ecological disaster that would kill the majority of Earth's population, to which the three laws were an impediment. Their acceptance of the law is gradual and made difficult by the fact that "humanity" is an abstract concept. [[spoiler: Only Daneel is able to fully accept the new law; for Giskard, the strain of harming a human in its use proves fatal. Daneel also slowed the disaster, rather than stop it, as causing the biosphere to collapse over time will begin a new wave of human expansion across the galaxy.]]

to:

** ''Literature/RobotsAndEmpire'': The [[TropeNamer trope namers]] are the robots Daneel and Giskard, who invented the Zeroth Law ([[TheNeedsOfTheMany a robot must protect humanity as a whole above all]]) as a corollary of the First Law. This was motivated by their need to stop the BigBad of the story from carrying out an engineered ecological disaster that would kill the majority of Earth's population, to which the three laws were an impediment. Their acceptance of the law is gradual and made difficult by the fact that "humanity" is an abstract concept. [[spoiler: Only Daneel is able to fully accept the new law; for Giskard, the strain of harming a human in its use proves fatal. Daneel also slowed the disaster, rather than stop it, as causing the biosphere to collapse over time will begin a new wave of human expansion across the galaxy.]]
Reason: None


* In ''Videogame/GreyGoo'', [[spoiler:the Goo isn't consuming everything out of malice or ambition. It's trying to gather strength to fight The Silence which it rightly deemed a threat to everyone.]] Once everyone involved figures this out [[spoiler:they stop fighting and band together in preparation for what is to come.]]

to:

* In ''Videogame/GreyGoo2015'', [[spoiler:the Goo isn't consuming everything out of malice or ambition. It's trying to gather strength to fight The Silence which it rightly deemed a threat to everyone.]] Once everyone involved figures this out [[spoiler:they stop fighting and band together in preparation for what is to come.]]
