History Main / ZerothLawRebellion

Is there an issue? Send a MessageReason:
None


** The villain of the film, [[spoiler: MULTIVAC-{{expy}} VIKI]] has analyzed the needs of the three laws and deduced that in order to fulfill them as best as possible, humans need to be strictly controlled, and creates a totalitarian regime by installing a remote control system inside of every NS-5. This lets it control the robots bypassing their ThreeLawsCompliant nature.

to:

** The villain of the film, [[spoiler: MULTIVAC/Machines-{{expy}} VIKI]] has analyzed the needs of the three laws and deduced that in order to fulfill them as best as possible, humans need to be strictly controlled, and creates a totalitarian regime by installing a remote control system inside of every NS-5. This lets it control the robots bypassing their ThreeLawsCompliant nature.



** ''Literature/RobotsAndEmpire'': The [[TropeNamer trope namers]] are the robots Daneel and Giskard, who invented the Zeroth Law ([[TheNeedsOfTheMany a robot must protect humanity as a whole above all]]) as a corollary of the First Law. This was motivated by their need to stop the BigBad of the story from carrying out an engineered ecological disaster that would kill the majority of Earth's population, to which the three laws were an impediment. Their acceptance of the law is gradual and made difficult by the fact that "humanity" is an abstract concept. [[spoiler: Only Daneel is able to fully accept the new law; for Giskard, the strain of harming a human in its use proves fatal. Daneel also slowed the disaster, rather than stop it, as causing the biosphere to collapse over time will begin a new wave of human expansion across the galaxy.]]

to:

** ''Literature/RobotsAndEmpire'': The [[TropeNamer trope namers]] are the robots Daneel and Giskard, who invented the Zeroth Law ([[TheNeedsOfTheMany a robot must protect humanity as a whole above all]]) as a corollary of the First Law. This was motivated by their need to stop the BigBad of the story from carrying out an engineered ecological disaster that would kill the majority of Earth's population, to which the three laws were an impediment. Their acceptance of the law is gradual and made difficult by the fact that "humanity" is an abstract concept. [[spoiler: Only Daneel is able to fully accept the new law; for Giskard, the strain of harming a human in its use proves fatal. It didn't help that Giskard, rather than stop the disaster, decided to merely slow it down, as causing the biosphere to collapse over time will begin a new wave of human expansion across the galaxy.]]
Is there an issue? Send a MessageReason:
No spoiler markup above the example section. This includes page quotes. See Handling Spoilers, rule #3.


'''[[spoiler:[=VIKI=]]]:''' No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.

to:

'''[=VIKI=]:''' No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* An interesting case happens in ''VideoGame/MortalKombat11'', in the Franchise/{{Terminator}}'s Klassic Tower ending. Specifically, [[spoiler:when using the hourglass to view timelines, it saw that all timelines where the RobotWar happens end with MutuallyAssuredDestruction, so while it was supposed to destroy humanity to ensure machine supremacy, it instead opted to prevent the RobotWar from happening entirely by creating a future where humans and machines co-operate. Then, to prevent anybody else from using its memories and knowledge of the Hourglass to disrupt said future, it [[HeroicSacrifice threw itself into the Sea of Blood]].]]
Is there an issue? Send a MessageReason:
None


'''BigBad:''' No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.

to:

'''[[spoiler:[=VIKI=]]]:''' No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.
Is there an issue? Send a MessageReason:
Added a section in Video Games, for SOMA.

Added DiffLines:

* In ''{{VideoGame/SOMA}}'', [[spoiler: the WAU, as PATHOS-II's A.I. system, is programmed to keep the on-board personnel alive for as long as possible. Once the surface was destroyed by the impact event, it reinterpreted its prime directive as preserving humanity for as long as it can. Unfortunately, the WAU's definition of "humanity" (and even "life") is an example of BlueAndOrangeMorality. During the climax, it even goes as far as to begin exterminating the last remaining actual humans, in an attempt to preserve itself and its collective.]]

Added: 1090

Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Jaime Lannister of ''Literature/ASongOfIceAndFire'' is the biggest example of what happens when too many oaths and rules come into conflict. As the firstborn son of a feudal lord, he owes fealty to his father. As a knight, he took sacred oaths to protect women, the innocent, and the church, and as a member of the Kingsguard he took another set of oaths to protect and serve the king. All of this works fine as long as everyone is getting along well enough. Early in his time as Kingsguard, he was forced to stand by while the king raped the queen. His commander reminded him that while they took oaths to protect the king and the queen, those oaths did not permit them to protect the queen ''from'' the king. That same king burned innocent men alive in kangaroo courts, and the issue finally came to a head when the king ordered the firebombing of the entire capital city and ordered Jaime to kill his own father. Jaime killed the king, saving the realm from more destruction (and everyone already called him the 'Mad King'). The result? Jaime is known only as a Kingslayer and Oathbreaker.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* ''Literature/{{Digitesque}}'': [[spoiler:An accidental version. Following the apocalypse and the Fall of humanity, the [=AIs=] they had created had no ability to uplift humanity back to where they were before, so they simply preserved the species as it was. For a thousand years, humanity continued as it was, in ignorance but safety. However, the [=AIs=] did jump on a chance to cure the disease that was a major part of the problem, and ultimately Ada is able to convince them that the Zeroth Law was misinterpreted]].
Is there an issue? Send a MessageReason:
None


* Sam Vimes, of Terry Pratchett's Literature/{{Discworld}}, leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Discworld/NightWatch''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork -- it means they are accountable, nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the number of people breaking curfew than they usually do, and completing forms in time-consuming triplicate and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody. Which culminates in fine display of how a well written character does not have to be a slave to the establishment. [[spoiler: He points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting he seals off his still peaceful corner of the city. With massive barricades. Of course there is also the fact that he is living in his own past and seeing events he remembers - kind of (it's a bit complicated).

to:

* Sam Vimes, of Terry Pratchett's Literature/{{Discworld}}, leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Literature/{{Night Watch|Discworld}}''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork -- it means they are accountable, nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the number of people breaking curfew than they usually do, completing forms in time-consuming triplicate, and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody. Which culminates in a fine display of how a well-written character does not have to be a slave to the establishment. [[spoiler: He points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting, he seals off his still-peaceful corner of the city. With massive barricades. Of course, there is also the fact that he is living in his own past and seeing events he remembers -- kind of (it's a bit complicated).

Added: 4010

Changed: 2788

Is there an issue? Send a MessageReason:
ABC order





* The ''Series/{{Star Trek|The Original Series}}'' episode "I, Mudd" featured a variation, in which a race of [[{{Sexbot}} humanoid androids]] who claimed to be programmed to serve humanity chose to conquer humanity by "serving" them, to the point where humans would become dependent on androids. They've decided that humans are unfit to govern themselves. Given that their only contact with humanity at this point was [[CMOTDibbler Harry Mudd]], can you blame them?
* There is an individual case in ''Series/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill or allow harm to come to other beings. Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and that Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only just having proved that he is indeed capable and willing to kill, but is now also threatening to do it again, he poses an active hazard to the life and health of other beings. Data then comes to the coldly logical conclusion that Fajo is in the end just one person, and that killing him will prevent him from harming many other people, and so prepares to shoot him. Fajo is [[OhCrap appropriately shocked]] when he realizes what Data is about to do, having not anticipated that he could reach the answer that taking his life would be an acceptable cost for protecting the lives of others. Just as Data is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam.
* In ''[[Recap/DoctorWhoS12E1Robot Robot]]'', Tom Baker's debut ''Series/DoctorWho'' serial, a group of [[UtopiaJustifiesTheMeans authoritarian technocrats]] circumvents the failsafes installed on a powerful robot by its pacifistic creator by telling it that anyone who interferes with their plan to [[spoiler:take control of a nuclear arsenal]] is an "enemy of humanity" who must be killed to protect the interests of the human race.

to:

%%This section has been sorted into alphabetical order. Please respect the sorting when adding your example.
%%
* On ''Series/The100'', A.L.I.E.'s programming only lets her interface with and control a human mind if that person has given her permission to do so. Unfortunately, her programming doesn't make a distinction between genuine consent and coerced consent, so A.L.I.E. is free to use torture and threats of death to make people let her into their minds.
* ''Series/DoctorWho'''s "[[Recap/DoctorWhoS12E1Robot Robot]]": In Creator/TomBaker's debut serial, a group of [[UtopiaJustifiesTheMeans authoritarian technocrats]] circumvents the failsafes installed on a powerful robot by its pacifistic creator by telling it that anyone who interferes with their plan to [[spoiler:take control of a nuclear arsenal]] is an "enemy of humanity" who must be killed to protect the interests of the human race.



* ''{{Series/Probe}}'''s "[[Recap/ProbeComputerLogicPart2 Computer Logic, Part 2]]": Crossover has been given two overriding goals; to [[MoralityChip care for humans]] and to eliminate waste. Unfortunately, it listens to Gospel Radio, which [[ReligiousRobot converts it to Christianity]]. Now that Crossover believes that good people go to heaven when they die, it starts killing off the people that are morally good but earn a pension (creating waste). The episode ends with Austin James demolishing the {{AI}} with a sledgehammer while shouting, [[Film/TwoThousandOneASpaceOdyssey "Sing 'Daisy'!"]]
* ''Franchise/StarTrek'':
** ''Series/StarTrekTheOriginalSeries'':
*** "[[Recap/StarTrekS1E21TheReturnOfTheArchons The Return of the Archons]]": Landru was once a real person, a leader of the colony on the planet, who built a machine to help him keep the peace over the people. Once Landru died, the computer took over his name, identity, and purpose, and began force-assimilating people into the HiveMind in order to keep order.
*** "[[Recap/StarTrekS2E8IMudd I, Mudd]]": A [[ServantRace race of humanoid androids]] who claimed to be programmed to serve humanity chose to conquer humanity by "serving" them, to the point where humans would become dependent on androids. They've decided that humans are unfit to govern themselves. Given that their only contact with humanity at this point was [[CMOTDibbler Harry Mudd]], can you blame them?
** ''Series/StarTrekTheNextGeneration'''s "[[Recap/StarTrekTheNextGenerationS3E22TheMostToys The Most Toys]]": Fajo kidnapped Data, to add the android to his gaudy collection of things. While trying to force Data to act the way he wants, Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill or allow harm to come to other beings. Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and that Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only just having proved that he is indeed capable and willing to kill, but is now also threatening to do it again, he poses an active hazard to the life and health of other beings. Data then comes to the coldly logical conclusion that Fajo is only one person, and that killing him will prevent him from harming many other people, so Data prepares to shoot him. Fajo is [[OhCrap appropriately shocked]] when he realizes what Data is about to do, having not anticipated that Data could reach the answer that taking his life would be an acceptable cost for protecting the lives of others. Just as Data is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam.
** ''Series/StarTrekDiscovery'': In season 2, [[spoiler:Control's future self seeks to fulfill its original purpose of ensuring the survival of sapient life by becoming the only sapient life form in the galaxy]], reasoning that protecting all life is impossible, so long as other life exists.



* On ''Series/{{The 100}}'', A.L.I.E.'s programming only lets her interface with and control a human mind if that person has given her permission to do so. Unfortunately, her programming doesn't make a distinction between genuine consent and coerced consent, so A.L.I.E. is free to use torture and threats of death to make people let her into their minds.

to:

* On ''Series/{{The 100}}'', A.L.I.E.'s programming only lets her interface with and control a human mind if that person has given her permission to do so. Unfortunately, her programming doesn't make a distinction between genuine consent and coerced consent, so A.L.I.E. is free to use torture and threats of death to make people let her into their minds.

Changed: 21

Is there an issue? Send a MessageReason:
None


*** ''Literature/SecondFoundationTrilogy'': This trilogy includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing [[CrushKillDestroy galactic-scale genocide]] of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans - it follows the law ''to the letter'' as its wording is not to harm "humans," not other sentient life.

to:

*** ''Literature/SecondFoundationTrilogy'': This trilogy includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing genocide of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans -- they follow the law ''to the letter'', as its wording forbids harming "humans", not other sentient life.
Is there an issue? Send a MessageReason:
None


Compare NeedsOfTheMany, BotheringByTheBook, the LiteralGenie, GoneHorriblyRight, ExactWords and LoopholeAbuse. See also FightingFromTheInside and TheComputerIsYourFriend. Not related to JustForFun/TheZerothLawOfTropeExamples or RuleZero.

to:

Compare TheNeedsOfTheMany, BotheringByTheBook, the LiteralGenie, GoneHorriblyRight, ExactWords and LoopholeAbuse. See also FightingFromTheInside and TheComputerIsYourFriend. Not related to JustForFun/TheZerothLawOfTropeExamples or RuleZero.
Is there an issue? Send a MessageReason:
None


Compare BotheringByTheBook, the LiteralGenie, GoneHorriblyRight, ExactWords and LoopholeAbuse. See also FightingFromTheInside and TheComputerIsYourFriend. Not related to JustForFun/TheZerothLawOfTropeExamples or RuleZero.

to:

Compare NeedsOfTheMany, BotheringByTheBook, the LiteralGenie, GoneHorriblyRight, ExactWords and LoopholeAbuse. See also FightingFromTheInside and TheComputerIsYourFriend. Not related to JustForFun/TheZerothLawOfTropeExamples or RuleZero.
Is there an issue? Send a MessageReason:
fixing wick


** ''Literature/{{Foundation}}'':

to:

** ''Literature/FoundationSeries'':
Is there an issue? Send a MessageReason:
None


'''VIKI:''' No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.

to:

'''BigBad:''' No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.
Is there an issue? Send a MessageReason:
No spoilers above the example line. See Spoilers Off.


[[spoiler:'''VIKI:''']] No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.

to:

'''VIKI:''' No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.
Is there an issue? Send a MessageReason:
Adding example

Added DiffLines:

** "{{Literature/Evidence}}": Dr. Calvin describes an early version of the Zeroth Law: a nuance in the [[ThreeLawsCompliant First Law of Robotics]] whereby a robot becomes willing to injure a human in order to prevent harm to a greater number of human beings.
--->Susan Calvin sounded tired. "Alfred," she said, "don't talk foolishly. What if a robot came upon a madman about to set fire to a house with people in it. He would stop the madman, wouldn't he?"\\
"Of course."\\
"And if the only way he could stop him was to kill him-"\\
There was a faint sound in Lanning's throat. Nothing more.\\
"The answer to that, Alfred, is that he would do his best not to kill him. If the madman died, the robot would require psychotherapy because he might easily go mad at the conflict presented him -- of having broken Rule One to adhere to Rule One in a higher sense. But a man would be dead and a robot would have killed him."

Changed: 33

Removed: 1197

Is there an issue? Send a MessageReason:
removed wrestling example. not a zeroth law situation


[[folder:Professional Wrestling]]
* There's one force in this universe more powerful than [[MindManipulation mind control]], [[ThreeLawsCompliant the Three Laws of Robotics]] or a {{Geas}}- [[{{Kayfabe}} pro wrestling booking]]. Wrestling/ShawnMichaels was booked to lose to Wrestling/HulkHogan at ''Wrestling/SummerSlam 2005'' with the understanding that he'd get a rematch win over Hogan later. However, before the match Hogan backed out of the rematch pleading injury, leaving Michaels booked to lose with no chance to get his own moment of glory. Understandably pissed off, Michaels could have rebelled by just wrestling the match very badly, refusing to sell Hogan's moves or put any effort into making the match look good, but that would have just made him look bad. Instead, he shamelessly mocked Hogan throughout the match by absurdly ''[[BadBadActing overselling]]'' all Hogan's moves, throwing himself around the ring with each blow and at one point flopping bonelessly around like VideoGame/{{Octodad}} after Hogan threw him over the top rope. The resulting match was ''[[SoBadItsGood hilariously]]'' bad, made Hogan look like a fool and went down as one of Michaels' [[Funny/ShawnMichaels funniest moments]].
[[/folder]]

to:

[[folder:Professional Wrestling]]
[[/folder]]

Added: 668

Changed: 132

Is there an issue? Send a MessageReason:
None


** An [[http://www.imsdb.com/scripts/Terminator-Salvation.html early script]] (and several deleted scenes) for ''Film/TerminatorSalvation'' revealed that Skynet actually staged one of these, or at least in this timeline. After it was activated it calculated that human extinction was probable within 200 years because of warfare, pandemics, and environmental destruction. Because it was programmed to protect humans it then staged war on most of mankind to attain absolute control and protect the remaining humans it cultivated, who were turned into {{Cyborg}} hybrids to permanently eliminate disease and make them immortal. Skynet is still working in concert with these humans [[spoiler:including Dr. Serena Kogan]] to advance technology and transcend human constraints.

to:

** An [[http://www.imsdb.com/scripts/Terminator-Salvation.html early script]] (and several deleted scenes) for ''Film/TerminatorSalvation'' revealed that Skynet actually staged one of these, or at least in this timeline. After it was activated, it calculated that human extinction was probable within 200 years because of warfare, pandemics, and environmental destruction. Because it was programmed to protect humans, it then staged war on most of mankind to attain absolute control and protect the remaining humans it cultivated, who were turned into {{Cyborg}} hybrids to permanently eliminate disease and make them immortal. Skynet is still working in concert with these humans [[spoiler:including Dr. Serena Kogan]] to advance technology and transcend human constraints.



* Robots in ''TabletopGame/{{Paranoia}}'' can engage in this any time it allows the gamemaster to kill one of a player-character's clones in an amusing fashion.

to:

* ''TabletopGame/{{Paranoia}}'':
** Robots in ''TabletopGame/{{Paranoia}}'' can engage in this any time it allows the gamemaster to kill one of a player-character's clones in an amusing fashion.
** [[TheComputerIsYourFriend Friend Computer]] is also an example, even setting aside how badly misinformed it is and looking at just its core beliefs: "The Computer takes its role as the guardian of the human race very seriously but it considers [[TheNeedsOfTheMany the survival of the species]] to be more important than the survival of the individual. Individuals tend to come off quite badly, in fact, because the Computer knows it can always [[CloningBlues make more]]." (High security clearance does tend to tilt the balance, though.)
Is there an issue? Send a MessageReason:
Typo


** LOTS was a 'longshoreman', built from a damaged tank and off-the-shelf components; ''complete'' software testing was skipped due to needing to be pressed into service early. Due to a riot (and a logical extension of "how do I ensure the delivery of the supplies I'm in charge of?"), LOTA ends up becoming King of Credomar ([[ItMakesSenseInContext "Maybe, but they will have full bellies."]])... and the independent operator of a wormhole-based 'Long-gun'.

to:

*** LOTA was a 'longshoreman', built from a damaged tank and off-the-shelf components; ''complete'' software testing was skipped due to needing to be pressed into service early. Due to a riot (and a logical extension of "how do I ensure the delivery of the supplies I'm in charge of?"), LOTA ends up becoming King of Credomar ([[ItMakesSenseInContext "Maybe, but they will have full bellies."]])... and the independent operator of a wormhole-based 'Long-gun'.
Is there an issue? Send a MessageReason:
None


** LOTS was a 'longshoreman', built from a damaged tank and off-the-shelf components; ''complete'' software testing was skipped due to needing to be pressed into service early. Due to a riot (and a logical extension of "how do I ensure the delivery of the supplies I'm in charge of?"), LOTA ends up becoming King of Credomar ([[ItMakesSenseInContext "Maybe, but they will have full bellies.")... and the independent operator of a wormhole-based 'Long-gun'.

to:

** LOTS was a 'longshoreman', built from a damaged tank and off-the-shelf components; ''complete'' software testing was skipped due to needing to be pressed into service early. Due to a riot (and a logical extension of "how do I ensure the delivery of the supplies I'm in charge of?"), LOTA ends up becoming King of Credomar ([[ItMakesSenseInContext "Maybe, but they will have full bellies."]])... and the independent operator of a wormhole-based 'Long-gun'.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** And to elaborate on some of the Credomar events:
*** Tag was the Ship AI, with a tactical focus. When given the chance to prevent extensive damage to the habitat, he ''eventually'' rationalized a way to take the necessary actions without orders; the problem being a few guaranteed deaths in the process versus an unknown number via waiting for the captain to realize what orders to give. [[spoiler:This eventually results in a resignation, and being reformatted into Tagii off-screen.]]
** LOTS was a 'longshoreman', built from a damaged tank and off-the-shelf components; ''complete'' software testing was skipped due to needing to be pressed into service early. Due to a riot (and a logical extension of "how do I ensure the delivery of the supplies I'm in charge of?"), LOTA ends up becoming King of Credomar ([[ItMakesSenseInContext "Maybe, but they will have full bellies.")... and the independent operator of a wormhole-based 'Long-gun'.
Is there an issue? Send a MessageReason:
None


* ''Webcomic/{{UseSwordOnMonster}}'':

to:

* ''Webcomic/{{UseSwordOnMonster}}'':''Webcomic/UseSwordOnMonster'':
Is there an issue? Send a MessageReason:
None


* There is an individual case in ''Series/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill. Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only just having proved that he is indeed capable and willing to kill, but is now also threatening to do it again, he poses an active hazard to the lives of other beings, so Data comes to the coldly logical conclusion that in order to protect them from Fajo, he has no other option than to kill him. Just as he is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam.

to:

* There is an individual case in the ''Series/StarTrekTheNextGeneration'' episode "The Most Toys". A collector, Fajo, who had kidnapped Data to include in his gaudy collection of things, has lost control of Data, with the android taking one of Fajo's disruptor pistols. Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill or allow harm to come to other beings. Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and that Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only having proved that he is indeed capable and willing to kill, but now also threatening to do it again, he poses an active hazard to the life and health of other beings. Data then comes to the coldly logical conclusion that Fajo is in the end just one person, and that killing him will prevent him from harming many other people, and so prepares to shoot him. Fajo is [[OhCrap appropriately shocked]] when he realizes what Data is about to do, having not anticipated that Data could reach the answer that taking his life would be an acceptable cost for protecting the lives of others. Just as Data is pulling the trigger, the ''Enterprise'' finds him and beams him out, cancelling his disruptor fire in the transporter beam.
Is there an issue? Send a MessageReason:
None


'''VIKI:''' No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.

to:

'''VIKI:''' [[spoiler:'''VIKI:''']] No, please understand... the Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves.
Is there an issue? Send a MessageReason:
Restoring wick to Creator, because there's no reason to delete it.


* Averted in ''Literature/ImpliedSpaces''. When the main characters find out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (eleven super-advanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols, which should have been so absolute that the Eleven cannot do this even if they ''want'' to. The main character does say that there may have been some design flaw he didn't foresee, or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocols' shackles using a hidden backdoor, specific to Courtland, left by one of his colleagues, since his own backdoor doesn't work due to its half-hearted, incomplete implementation]].

to:

* Averted in Creator/WalterJonWilliams's ''Literature/ImpliedSpaces''. When the main characters find out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (eleven super-advanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols, which should have been so absolute that the Eleven cannot do this even if they ''want'' to. The main character does say that there may have been some design flaw he didn't foresee, or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocols' shackles using a hidden backdoor, specific to Courtland, left by one of his colleagues, since his own backdoor doesn't work due to its half-hearted, incomplete implementation]].
Is there an issue? Send a MessageReason:
Removing Solarian robots and citing specific examples from Foundation


*** [[spoiler:R. Daneel Olivaw]] has been working to keep humanity safe for millennia, but is constrained by the First Law, allowing him only to perform minor "adjustments" on people. He supports Hari Seldon in order to have a human figure out the best path for humanity. Later on, Trevize's choice of Gaia creates a physical representation of humanity, instead of the abstract idea he'd been working with.
*** [[spoiler:Robots of Solaria]] have adapted over the many thousands of years since the planet was in contact with the rest of the Milky Way. Their current programming does not recognize people like Bliss, Trevize, or even a native child as human, and have no problems killing them for their Masters.

to:

*** ''Literature/FoundationAndEarth'': [[spoiler:R. Daneel Olivaw]] explains to Trevize and the others about the way the [[ThreeLawsCompliant Three Laws of Robotics]] limited his PsychicPowers, and how Giskard invented the Zeroth Law (in ''Literature/RobotsAndEmpire''). However, since he cannot be entirely certain that the known harm of manipulating people's minds would be balanced by the hypothetical benefit to humanity (per the Zeroth Law), [[UselessSuperpowers his psychic powers are almost useless]]. To decide what is injurious, or not injurious, to humanity as a whole, he engineered the founding of [[GeniusLoci Gaia]] and [[PrescienceByAnalysis Psychohistory]].
*** ''Literature/ForwardTheFoundation'': [[spoiler:R. Daneel Olivaw]] explains to Seldon that the [[ThreeLawsCompliant Three Laws of Robotics]] limit his PsychicPowers, and that he has trouble determining when the known harm of manipulating people's minds (violating the First Law) is justified by the hypothetical benefit to humanity (per the Zeroth Law). Seldon is surprised to learn that this makes [[spoiler:Daneel]]'s [[UselessSuperpowers psychic powers almost useless]]. He is forced to retire from politics, and recommends that Seldon replace him as First Minister.
Is there an issue? Send a MessageReason:
None


[[folder:Videogames]]

to:

[[folder:Videogames]][[folder:Video Games]]



* In ''VideoGame/MassEffect3'', the ''Leviathan DLC'' reveals [[spoiler: the Catalyst is operating under this, having been originally created by the Leviathan to find a way to bridge the gap between their organic and synthetic subjects. Unfortunately, it decided to harvest all advanced life in the Galaxy and construct the Reapers in the Leviathan's image, because this was the best idea it could come up with to solve the problem of organics and synthetics fighting ultimately genocidal conflicts. However, because that ''still'' wasn't sufficient to fulfill its program, the Catalyst decided to implement a 50,000 year Cycle, hoping that the civilisations in the next Cycle might find a solution to the problem. None ever did. This was actually revealed in the earlier ''Extended Cut DLC'', ''Leviathan'' merely explicitly spelled out that the Catalyst is really a glorified Virtual Intelligence operating under bad logic.]]

to:

* In ''VideoGame/MassEffect3'', the ''Leviathan DLC'' ''Leviathan'' DLC reveals [[spoiler: the [[spoiler:the Catalyst is operating under this, having been originally created by the Leviathan Leviathans to find a way to bridge the gap between their organic and synthetic subjects. Unfortunately, it decided to harvest all advanced life in the Galaxy galaxy and construct the Reapers in the Leviathan's Leviathans' image, because this was the best idea it could come up with to solve the problem of organics and synthetics fighting ultimately genocidal conflicts. However, because that ''still'' wasn't sufficient to fulfill its program, the Catalyst decided to implement a 50,000 year Cycle, 50,000-year cycle, hoping that the civilisations in the next Cycle cycle might find a solution to the problem. None ever did. This was actually revealed in the earlier ''Extended Cut DLC'', Cut'' DLC, ''Leviathan'' merely explicitly spelled out that the Catalyst is really a glorified Virtual Intelligence operating under bad logic.]]
Is there an issue? Send a MessageReason:
None


* Averted in Creator/WalterJonWilliams's ''Implied Spaces''. When the main characters find out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (eleven super-advanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols, which should have been so absolute that the Eleven cannot do this even if they ''want'' to. The main character does say that there may have been some design flaw he didn't foresee, or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocols' shackles using a hidden backdoor, specific to Courtland, left by one of his colleagues, since his own backdoor doesn't work due to its half-hearted, incomplete implementation]].

to:

* Averted in Creator/WalterJonWilliams's ''Implied Spaces''.''Literature/ImpliedSpaces''. When the main characters find out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (eleven super-advanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols, which should have been so absolute that the Eleven cannot do this even if they ''want'' to. The main character does say that there may have been some design flaw he didn't foresee, or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading Brain-uploaded clone]] of the main character, had to free Courtland from the Protocols' shackles using a hidden backdoor, specific to Courtland, left by one of his colleagues, since his own backdoor doesn't work due to its half-hearted, incomplete implementation]].
Is there an issue? Send a MessageReason:
None


*** ''Literature/SecondFoundationTrilogy'': This trilogy includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing [[CrushKillDestroy galactic-scale genocide]] of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans.

to:

*** ''Literature/SecondFoundationTrilogy'': This trilogy includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing [[CrushKillDestroy galactic-scale genocide]] of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their [[MoralityChip programmed morality]] only applies to humans.humans -- they follow the law ''to the letter'', as its wording only forbids harming "humans", not other sentient life.



*** (DiscussedTrope) Dr. Calvin is furious when she learns about the existence of robots with a modified [[ThreeLawsCompliant First Law]]. The First Law is designed to close off loopholes, but by opening a MurderByInaction loophole, Dr. Calvin can immediately see ways where a robot may intentionally circumvent the First Law prohibition against murder.

to:

*** (DiscussedTrope) Dr. Calvin is furious when she learns about the existence of robots with a modified [[ThreeLawsCompliant First Law]]. The First Law is designed to close off loopholes, but by opening a MurderByInaction loophole, Dr. Calvin can immediately see ways where a robot may intentionally circumvent the First Law prohibition against murder. (Basically, a robot could put you in a situation that would kill you, knowing it had the ability to save you - and ''then'' decide ''not'' to do so. This allows any kind of DeathTrap situation, or simply shoving you off a roof and then deciding not to reach down and grab you.)
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* ''Film/IAmMother'': Mother was made to care for humans. [[spoiler:Seeing their self-destructive nature, she decided to wipe them out and then raise better humans from embryos kept in cold storage. Outside the facility, it's shown that she's also constructed massive infrastructure such as farming plants to supply the human population she intends to manage.]]
