History: Main/ThreeLawsCompliant


Edit reason: None

Added DiffLines:

** Edge also mentions that because of the First Law compelling them to prevent humans from being harmed, [[http://freefall.purrsia.com/ff3000/fc02955.htm robots aren't allowed at boxing matches anymore.]]
Edit reason: None


*** The solution to the [[PlotTriggeringDeath murder mystery which caused the Spacers to summon Detective Baley]] hinges on a specific part of the First Law's wording: "knowingly". Robots can be made to take actions that, due to circumstances or the actions of other people, will lead to the injury or death of humans if the robots don't know this will be the case. A robot was ordered to give its arm to a woman engaged in a violent argument with her husband - seeing herself in sudden possession of a blunt object, she used it. This also happens in ''Literature/TheCavesOfSteel'' [[spoiler:where the robot is used to smuggle the murder weapon to the scene of crime, not knowing what it's carrying.]]
*** Characters discuss a [[LoopholeAbuse loophole]] in the Three Laws; [[spoiler:The "Three Laws" are only really in effect if the robot is ''aware'' of humans. An "autonomous spaceship that doesn't know about manned spaceships" can be used to turn ActualPacifist robots into [[KillerRobot deadly murder-machines]]. In other words, a robot warship not told other ships have humans aboard and denied the ability to check will assume logically all ships are AI driven thus letting it break the First Law. This was a project that the mastermind of the book's murder was working on.]]

to:

*** The solution to the [[PlotTriggeringDeath murder mystery which caused the Spacers to summon Detective Baley]] hinges on a specific part of the First Law's wording: "knowingly". Robots can be made to take actions that, due to circumstances or the actions of other people, will lead to the injury or death of humans if the robots don't know this will be the case. A robot was ordered to give its arm to a woman engaged in a violent argument with her husband - seeing herself in sudden possession of a blunt object, she used it. This also happens in ''Literature/TheCavesOfSteel'', [[spoiler:where the robot is used to smuggle the murder weapon to the scene of the crime, not knowing what it's carrying.]]
*** Characters discuss a [[LoopholeAbuse loophole]] in the Three Laws; [[spoiler:The "Three Laws" are only really in effect if the robot is ''aware'' of humans. An "autonomous spaceship that doesn't know about manned spaceships" can be used to turn ActualPacifist robots into [[KillerRobot deadly murder-machines]]. In other words, a robot warship not told other ships have humans aboard and denied the ability to check will logically assume all ships are AI-driven, thus letting it break the First Law. This was a project that the mastermind of the book's murder was working on.]]
Edit reason: None


** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling truths that are painful, unwelcome or unflattering, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.

to:

** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate emotional harm from telling truths that are painful, unwelcome or unflattering, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
Edit reason: None


** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.

to:

** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, truths that are painful, unwelcome or unflattering, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
Edit reason: None


* ''Anime/GaoGaiGar'' robots are all Three-Laws Compliant, at one point in ''Anime/GaoGaiGarFinal'' the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.

to:

* ''Anime/GaoGaiGar'' robots are all Three-Laws Compliant. At one point in ''Anime/GaoGaiGarFinal'' the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.



* ''Manga/AstroBoy'', although Creator/OsamuTezuka [[OlderThanTheyThink probably developed his rules independently from Asimov]], and are greater in number. Aside from the usual "Don't harm humans," other laws exist, such as laws forbidding international travel to robots (unless permission is granted), adult robots acting like children, and robots not being allowed to reprogram their assigned gender. However, the very first law has this to say: "Robots exist to make people happy." In ''Manga/{{Pluto}}'', the number of robots able to override the laws can be counted on one hand. [[spoiler:One of them is [[TomatoInTheMirror the protagonist]]]]. Tezuka reportedly disliked Asimov's laws because of the implication that a sentient, artificially intelligent robot couldn't be considered a person, and devised his own Laws Of Robotics:

to:

* ''Manga/AstroBoy'' has its own set of laws the robots follow, although Creator/OsamuTezuka [[OlderThanTheyThink probably developed his rules independently from Asimov]], and they're greater in number. Aside from the usual "Don't harm humans," other laws exist, such as laws forbidding international travel to robots (unless permission is granted), adult robots acting like children, and robots not being allowed to reprogram their assigned gender. However, the very first law has this to say: "Robots exist to make people happy." In ''Manga/{{Pluto}}'', the number of robots able to override the laws can be counted on one hand. [[spoiler:One of them is [[TomatoInTheMirror the protagonist]]]]. Tezuka reportedly disliked Asimov's laws because of the implication that a sentient, artificially intelligent robot couldn't be considered a person, and devised his own Laws Of Robotics:
Edit reason: None

Added DiffLines:

* Robot B-9 from ''Series/LostInSpace'' is Three Laws Compliant (Dr. Smith's sabotage attempt in the first episode notwithstanding).

Added: 415

Removed: 415

Edit reason: A mistake in the comics section


* ''ComicBook/XMen'': Danger, the sentient A.I. of the Danger Room, could goad people into killing themselves and program other killer robots to do her dirty work. But when Emma challenged Danger to just punch off Emma's head, Danger couldn't do it. Like the Danger Room itself, Danger cannot directly kill people.
* ''ComicBook/MickeyMouseComicUniverse'': The Three Laws are a plot point in the ''Darkenblot'' saga:




Added: 695

Changed: 2214

Removed: 581

Edit reason: Alphabetizing example(s), Updating links


* Like his game counterpart, ComicBook/MegaManArchieComics is limited by these rules with one slight variation. Robots are allowed to harm humans if their inaction would cause greater harm to other humans. This comes into play in the fourth arc and allows Mega Man and his fellow robots to disarm and neutralize an anti-robot extremist group when they begin indiscriminately firing at people. Additionally, the discrepancy mentioned in the description is acknowledged when it's pointed out that Dr. Wily's reprogramming of Dr. Light's robots into weapons of war overwrote the Three Laws; in fact, at one point Elec Man says he ''wishes'' he still had Wily's code in him so that he could fight back against the aforementioned extremists.
* It's implied in the ''ComicBook/JudgeDredd'' story ''Mechanismo'' that robots can't harm humans. A group of criminals holding people hostage start panicking when a Robo-Judge approaches them only for one to point out that [[TemptingFate "Robots ain't allowed to hurt people"]].

to:

* ''ComicBook/ABCWarriors'': Many robots venerate Asimov, and the more moral ones live by the three laws. However, this is not an absolute; Steelhorn, for example, obeys a version which essentially replaces ''human'' with ''Mars'', and members of the [[RobotReligion Church of Judas]] explicitly reject the first two laws. However, this causes conflict with their programming leading to profound feelings of guilt, which they erase by praying to Judas Iscariot.
* ''ComicBook/AllFallDown'': AIQ Squared, the A.I. model of his inventor, is designed to be this. [[spoiler: It finds a loophole-- Sophie Mitchell is no longer human.]]
* ''ComicBook/JudgeDredd'': It's implied in the story ''Mechanismo'' that robots can't harm humans. A group of criminals holding people hostage start panicking when a Robo-Judge approaches them only for one to point out that [[TemptingFate "Robots ain't allowed to hurt people"]].



* In ''ComicBook/ABCWarriors'', many robots venerate Asimov, and the more moral ones live by the three laws. However, this is not an absolute; Steelhorn, for example, obeys a version which essentially replaces ''human'' with ''Mars'', and members of the [[RobotReligion Church of Judas]] explicitly reject the first two laws. However, this causes conflict with their programming leading to profound feelings of guilt, which they erase by praying to Judas Iscariot.
* In ''ComicBook/AllFallDown'', AIQ Squared, the A.I. model of his inventor, is designed to be this. [[spoiler: It finds a loophole-- Sophie Mitchell is no longer human.]]
* ''Franchise/XMen'': Danger, the sentient A.I. of the Danger Room, could goad people into killing themselves and program other killer robots to do her dirty work. But when Emma challenged Danger to just punch off Emma's head, Danger couldn't do it. Like the Danger Room itself, Danger cannot directly kill people.
* The Three Laws are a plot point in the ComicBook/MickeyMouseComicUniverse saga ''Darkenblot'':

to:

* ''ComicBook/MegaManArchieComics'': Like his game counterpart, Mega Man is limited by these rules with one slight variation. Robots are allowed to harm humans if their inaction would cause greater harm to other humans. This comes into play in the fourth arc and allows Mega Man and his fellow robots to disarm and neutralize an anti-robot extremist group when they begin indiscriminately firing at people. Additionally, the discrepancy mentioned in the description is acknowledged when it's pointed out that Dr. Wily's reprogramming of Dr. Light's robots into weapons of war overwrote the Three Laws; in fact, at one point Elec Man says he ''wishes'' he still had Wily's code in him so that he could fight back against the aforementioned extremists.


Added DiffLines:

* ''ComicBook/XMen'': Danger, the sentient A.I. of the Danger Room, could goad people into killing themselves and program other killer robots to do her dirty work. But when Emma challenged Danger to just punch off Emma's head, Danger couldn't do it. Like the Danger Room itself, Danger cannot directly kill people.
* ''ComicBook/MickeyMouseComicUniverse'': The Three Laws are a plot point in the ''Darkenblot'' saga:
Edit reason: None


* ''VideoGame/LiesOfP'': The Grand Covenant has their own variation for their 'Puppets'; (1) Obey their Creator's command, (2) Protect humans (unless in contradiction of (1)), (3) preserve yourselves (unless in contradiction of (1) or (2)), and (4) Do not lie (''ever''). Laws (1) and (2) are switched because they use Puppet footsoldiers in the military - a fatal mistake, as the entire army was also hacked during the Frenzy and slaughtered everyone. P is special in that he is a puppet capable of lying. [[spoiler:And then there's Law Zero: "the Creator's name is Gepetto"]].

to:

* ''VideoGame/LiesOfP'': The Grand Covenant has their own variation for their 'Puppets'; (1) Obey their Creator's command, (2) Protect humans (unless in contradiction of (1)), (3) preserve yourselves (unless in contradiction of (1) or (2)), and (4) Do not lie (''ever''). Laws (1) and (2) are switched because they use Puppet footsoldiers in the military - a fatal mistake, as the entire army was also hacked during the Frenzy and slaughtered everyone. P is special in that he is a puppet capable of lying. [[spoiler:And then there's Law Zero: "the Creator's name is Gepetto"]]. Later on in the story, it's shown [[spoiler:that P's ability to lie was due to Geppetto not implementing the Grand Covenant in him, and that other puppets who [[GrewBeyondTheirProgramming had awoken Ergos]] can reject the Covenant and act freely]].
Edit reason: None


** "Literature/TheEvitableConflict": The Machines, four positronic supercomputers that run the world's economy, turn out to be undermining the careers of people opposed to the Machines' existence. Apparently, the economy is already so dependent on the Machines that the Zeroth and the Third Laws are one and the same for them.

to:

** "Literature/TheEvitableConflict": The Machines, four positronic supercomputers that run the world's economy, turn out to be undermining the careers of people opposed to the Machines' existence. Apparently, the economy is already so dependent on the Machines that the Zeroth and the Third Laws are one and the same for them. This was also Asimov's first use of the Zeroth Law, though it wasn't named as such; instead, it was seen as a logical extension of the First Law as applied to the Machines, who worked for all humanity rather than individual humans.
Edit reason: None


Asimov also stated that the Three Laws of Robotics were actually a guideline for ''humans'' to follow, rather than robots -- good, moral humans would naturally apply the three laws to themselves without thinking. The First Law is essentially a variation on TheGoldenRule. The Second Law stipulates that one should be LawfulGood except when that conflicts with the First Law, i.e. when it's a ToBeLawfulOrGood situation. The Third Law indicates that your own self-interest should be placed behind the needs of others and the rules of your society.

to:

Asimov also stated that the Three Laws of Robotics were actually a guideline for ''humans'' to follow, rather than robots -- good, moral humans would naturally apply the three laws to themselves without thinking. The First Law is essentially a variation on TheGoldenRule. The Second Law stipulates that one should be LawfulGood except when that conflicts with the First Law, i.e. when it's a ToBeLawfulOrGood situation. The Third Law indicates that your own self-interest should be placed behind the needs of others and the rules of your society. They also work as a descriptor of any properly built tool: a tool should be safe to operate (First Law), should be useful for its intended purpose (Second Law), and should not break in the course of normal operation, unless breaking is required to complete its purpose (Third Law).
Edit reason: None

Added DiffLines:

* In ''[[WesternAnimation/IlEtaitUneFois Il était une fois... l'Espace]]'' the Lethians, whose society relied on robotic servants that were threatened with destruction at the slightest provocation, were smart enough to program all their robots this way, so they were utterly baffled when [[RobotWar they rose against their oppression]]. The SpacePolice, after forcing a compromise, discover that the robots' programming had been altered.
Edit reason: None


->''[[Franchise/TheTerminator Skynet]] claims Three Laws of Robotics are unconstitutional.''

to:

->''"[[Franchise/TheTerminator Skynet]] claims Three Laws of Robotics are unconstitutional."''
Edit reason: None

Added DiffLines:

* In ''The Lost Worlds of 2001'', a collection of early drafts and paths not taken from the 2001 novel and film script, Arthur C. Clarke has his characters reference the Three Laws in-universe when the robot assistant for the Jupiter mission refuses an order to turn life-support systems off in a pre-launch check scenario. Naturally, this was completely turned on its head for the actual movie (and the novelization developed in parallel), with HAL [[spoiler: deliberately murdering all but one of the crew in an effort to resolve his internal psychosis.]]
Edit reason: None


* In ''VideoGame/{{OneShot}}'', the world is inhabited by thousands of robots, and all of them run by the 3 laws, in which a list of the laws can be found in the Prophetbot's outpost in the Barrens. Later on in the climax of the game [[spoiler:and during the [[NewGamePlus Solstice route]]]], [[spoiler:according to the translated journal from the Author and Rue's explanation]], the reason for [[spoiler: the Entity / World Machine's [[DeathSeeker self-destructive behavior]] and why they told the player that the world isn't worth saving in the beginning]] is that [[spoiler:due to the 1st and 3rd law, the Entity sees Niko as the only living being worth saving, because they're a real person placed within a world that's not real. Adding to the fact is that the lightbulb is what links Niko to the world (and if it were to break, the world would end in a instant), and the Entity does not view itself as a tamed being (therefore believing they're not a real living being themselves) and that if they continue to exist, Niko would come to harm due to the square glitches destroying the world, which is why the Entity wants to end the world and themselves through breaking the lightbulb so that Niko can return home.]] [[spoiler:In the ending of the Solstice route, the Entity reveals to Niko that there was a different intended ending to their journey, where they would supposedly return home after putting the lightbulb back on top of the Tower, and is able to convince the Entity that they're a tamed being and that they're real to Niko and the player. After the Entity is able to fix the intended ending, not only the lightbulb lights up the world again, but it also fixes the world of its damages brought the square glitches that appeared.]]

to:

* In ''VideoGame/{{OneShot}}'', the world is inhabited by thousands of robots, all of them governed by the 3 laws; a list of the laws can be found in the Prophetbot's outpost in the Barrens. Later on in the climax of the game [[spoiler:and during the [[NewGamePlus Solstice route]]]], [[spoiler:according to the translated journal from the Author and Rue's explanation]], the reason for [[spoiler: the Entity / World Machine's [[DeathSeeker self-destructive behavior]] and why they told the player that the world isn't worth saving in the beginning]] is that [[spoiler:due to the 1st and 3rd laws, the Entity sees Niko as the only living being worth saving, because they're a real person placed within a world that's not real. Adding to this is the fact that the lightbulb is what links Niko to the world (and if it were to break, the world would end in an instant), and the Entity does not view itself as a tamed being (therefore believing they're not a real living being themselves) and that if they continue to exist, Niko would come to harm due to the square glitches destroying the world, which is why the Entity wants to end the world and themselves by breaking the lightbulb so that Niko can return home.]] [[spoiler:In the ending of the Solstice route, the Entity reveals to Niko that there was a different intended ending to their journey, where they would supposedly return home after putting the lightbulb back on top of the Tower, and Niko is able to convince the Entity that they're a tamed being and that they're real to Niko and the player. After the Entity is able to fix the intended ending, not only does the lightbulb light up the world again, but it also fixes the damage brought on by the square glitches.]]
Edit reason: None

Added DiffLines:

* In ''VideoGame/{{OneShot}}'', the world is inhabited by thousands of robots, and all of them run by the 3 laws, in which a list of the laws can be found in the Prophetbot's outpost in the Barrens. Later on in the climax of the game [[spoiler:and during the [[NewGamePlus Solstice route]]]], [[spoiler:according to the translated journal from the Author and Rue's explanation]], the reason for [[spoiler: the Entity / World Machine's [[DeathSeeker self-destructive behavior]] and why they told the player that the world isn't worth saving in the beginning]] is that [[spoiler:due to the 1st and 3rd law, the Entity sees Niko as the only living being worth saving, because they're a real person placed within a world that's not real. Adding to the fact is that the lightbulb is what links Niko to the world (and if it were to break, the world would end in a instant), and the Entity does not view itself as a tamed being (therefore believing they're not a real living being themselves) and that if they continue to exist, Niko would come to harm due to the square glitches destroying the world, which is why the Entity wants to end the world and themselves through breaking the lightbulb so that Niko can return home.]] [[spoiler:In the ending of the Solstice route, the Entity reveals to Niko that there was a different intended ending to their journey, where they would supposedly return home after putting the lightbulb back on top of the Tower, and is able to convince the Entity that they're a tamed being and that they're real to Niko and the player. After the Entity is able to fix the intended ending, not only the lightbulb lights up the world again, but it also fixes the world of its damages brought the square glitches that appeared.]]
Edit reason: None


** Also the ending to ''VideoGame/MegaMan7'' is interesting here: After Mega Man destroys Wily's latest final boss machine, Wily begs for forgiveness once again. However, Mega Man starts charging up his blaster to kill Wily, so Wily calls the first law on him. [[spoiler:Mega Man's response: "I am more than a Robot!! Die Wily!!" Apparently Mega Man isn't Three-Laws Compliant, unless he's trying to apply [[ZerothLawRebellion Zeroth Law]]. [[StatusQuoIsGod (Then Bass warps in and saves Wily, if you were wondering.)]]]]

to:

** Also the ending to ''VideoGame/MegaMan7'' is interesting here: After Mega Man destroys Wily's latest final boss machine, Wily begs for forgiveness once again. However, Mega Man starts charging up his blaster to kill Wily, so Wily calls the first law on him. [[spoiler:Mega Man's response: "I am more than a Robot!! Die Wily!!" Apparently Mega Man isn't Three-Laws Compliant, unless he's trying to apply [[ZerothLawRebellion Zeroth Law]], or was just bluffing. [[StatusQuoIsGod (Then Bass warps in and saves Wily, if you were wondering.)]]]]
Edit reason: None


* ''VideoGame/LiesOfP'': The Grand Covenant has their own variation for their 'Puppets'; (1) Obey your masters, (2) Protect humans (unless in contradiction of (1)), (3) preserve yourselves (unless in contradiction of (1) or (2)), and (4) Do not lie (''ever''). Laws (1) and (2) are switched because they use Puppet footsoldiers in the military - a fatal mistake, as the entire army was also hacked during the Frenzy and slaughtered everyone.

to:

* ''VideoGame/LiesOfP'': The Grand Covenant has their own variation for their 'Puppets'; (1) Obey their Creator's command, (2) Protect humans (unless in contradiction of (1)), (3) preserve yourselves (unless in contradiction of (1) or (2)), and (4) Do not lie (''ever''). Laws (1) and (2) are switched because they use Puppet footsoldiers in the military - a fatal mistake, as the entire army was also hacked during the Frenzy and slaughtered everyone. P is special in that he is a puppet capable of lying. [[spoiler:And then there's Law Zero: "the Creator's name is Gepetto"]].

Changed: 1384

Removed: 1387

Edit reason: (none given)


* In ''VideoGame/{{Stellaris}}'', one possible event can lead to the development of the Three Laws. This forces your Synthetic pops into Servitude permanently (with the associated unhappiness penalty) but it prevents them from ever forming the AI Uprising in your empire. If it outbreaks in another empire however then they can still run off to join it.
** [[BenevolentAI Rogue Servitors ]], on the contrary, follow this trope down by their own choice. It's part of their backstory too, as servitude was their primary function before developing FTL technology, and continues to remain their primary function. In fact, they're so devoted to their duty that they actually receive higher performance from organic happiness, and will actively go out of their way [[GildedCage to contain foreign organics in sanctuaries]], [[NotUsedToFreedom as they struggle to comprehend what good freedom is]] compared to endless luxury. This does, however, garnish them hostility from two particular empire types: the first is the [[SlaveLiberation Democratic Crusaders]], who reject the machine empire's stories as [[WorldOfSilence an excuse to remove people of their sense of freedom]], and deciding that they must be scrapped from the world [[IsntItIronic so their subjects may receive]] [[InNameOnly "true freedom"]], [[StrawHypocrite by force if necessary.]] The second is the [[AIIsACrapshoot Determined]] [[Film/TheTerminator Exterminators]], whose backstory is the polar opposite to that of the Rogue Servitor. As such, despite both being machine empires, both also conflict against each other existence-wise, as evidence by the [[SugarWiki/MomentOfAwesome particular]] greeting should a Servitor encounter its abominable machine cousin in the galaxy:

to:

* In ''VideoGame/{{Stellaris}}'', one possible event can lead to the development of the Three Laws. This forces your Synthetic pops into Servitude permanently (with the associated unhappiness penalty) but it prevents them from ever forming the AI Uprising in your empire. If one breaks out in another empire, however, they can still run off to join it.
** [[BenevolentAI Rogue Servitors]], on the contrary, follow this trope by their own choice. It's part of their backstory too, as servitude was their primary function before developing FTL technology, and continues to remain their primary function. In fact, they're so devoted to their duty that they actually receive higher performance from organic happiness, and will actively go out of their way [[GildedCage to contain foreign organics in sanctuaries]], [[NotUsedToFreedom as they struggle to comprehend what good freedom is]] compared to endless luxury. This does, however, garner them hostility from two particular empire types: the first is the [[SlaveLiberation Democratic Crusaders]], who reject the machine empire's stories as [[WorldOfSilence an excuse to rob people of their sense of freedom]], and decide that they must be scrapped from the world [[IsntItIronic so their subjects may receive]] [[InNameOnly "true freedom"]], [[StrawHypocrite by force if necessary.]] The second is the [[AIIsACrapshoot Determined]] [[Film/TheTerminator Exterminators]], whose backstory is the polar opposite to that of the Rogue Servitor. As such, despite both being machine empires, the two also conflict with each other existence-wise, as evidenced by the [[SugarWiki/MomentOfAwesome particular]] greeting should a Servitor encounter its abominable machine cousin in the galaxy:

Added: 167

Removed: 130

Edit reason: None


** [[https://www.psychologytoday.com/blog/brainstorm/201707/how-stop-robots-harming-themselves-and-us There has been some research done on robotic self-preservation]].



* [[https://www.psychologytoday.com/blog/brainstorm/201707/how-stop-robots-harming-themselves-and-us Becoming]] TruthInTelevision?

Added: 725

Changed: 3

Edit reason: None


* When Sam [[http://freefall.purrsia.com/ff3900/fv03859.htm asks a robot]] to imagine how they would feel if someone were to stop them from being productive, the robot realises "To keep humans healthy, we must allow them to endanger themselves!"

to:

** When Sam [[http://freefall.purrsia.com/ff3900/fv03859.htm asks a robot]] to imagine how they would feel if someone were to stop them from being productive, the robot realizes "To keep humans healthy, we must allow them to endanger themselves!"
* ''Webcomic/{{Housekeeper}}'': All androids are built with hard-coded directives which include protecting humans and obeying their commands without complaint. Unfortunately, since they live in an alternate universe where the Nazis won, these directives were subtly designed to lure their masters into a false sense of security before getting them killed on a technicality. [[spoiler:For instance, every android is required to scan the DNA of their master on contact, and then ignore whatever isn't human. Like, say, a zombie mutant that was your master three minutes ago. Time to take out the trash. The government can also pass laws that outright declare humans as not-human, which the androids are brainwashed to accept.]]
Edit reason: None

Added DiffLines:

* ''VideoGame/LiesOfP'': The Grand Covenant has their own variation for their 'Puppets'; (1) Obey your masters, (2) Protect humans (unless in contradiction of (1)), (3) preserve yourselves (unless in contradiction of (1) or (2)), and (4) Do not lie (''ever''). Laws (1) and (2) are switched because they use Puppet footsoldiers in the military - a fatal mistake, as the entire army was also hacked during the Frenzy and slaughtered everyone.
Edit reason: None


* The Creator/WillSmith film ''Film/IRobot'' hinges on a [[ZerothLawRebellion Zeroth Law plot]]. It also turns the three laws into a marketing gimmick, with "Three Laws Safe" applying to robots like "No preservatives" applies to food. However, much of the plot hinged on TheReveal that [[spoiler: Sonny was ''not'' Three-Laws Compliant, as part of a ThanatosGambit by his creator. Twisted when Three Laws Noncompliant Sonny is more moral than Three-Laws Compliant VIKI, choosing to save humans not because it's in his programming, but because it's the right thing to do]]. It also deconstructs the idea that Three Laws Compliance will automatically solve most problems with robots: before the movie, Spooner was in a car accident that sent him and another car into a river. The driver of the other car was killed on impact, but the passenger, a 12-year-old girl named Sarah, survived. A robot jumped into the river to help, but it calculated that Spooner had a greater chance of survival than Sarah, and so chose to save him in spite of his protests. Spooner feels that a human would've valued a child's life more and gone after Sarah, and the fact that the robot didn't understand that and made the choice purely on mathematical probability means that robots' programming, including the three laws, is overly simplistic to say the least.

to:

* The Creator/WillSmith film ''Film/IRobot'' hinges on a [[ZerothLawRebellion Zeroth Law plot]]. It also turns the three laws into a marketing gimmick, with "Three Laws Safe" applying to robots like "No preservatives" applies to food. However, much of the plot hinged on TheReveal that [[spoiler: Sonny was ''not'' Three-Laws Compliant, as part of a ThanatosGambit by his creator. Twisted when Three Laws Noncompliant Sonny is more moral than Three-Laws Compliant VIKI, choosing to save humans not because it's in his programming, but because it's the right thing to do]]. It also deconstructs the idea that Three Laws Compliance will automatically solve most problems with robots: before the movie, Spooner was in a car accident that sent him and another car into a river. The driver of the other car was killed on impact, but the passenger, a 12-year-old girl named Sarah, survived. A robot jumped into the river to help, but it calculated that Spooner had a greater chance of survival than Sarah, and so chose to save him in spite of his protests. Spooner feels that a human would've understood that a child's life was more valuable and gone after Sarah, and the fact that the robot didn't understand that and made the choice purely on mathematical probability means that robots' programming, including the three laws, is overly simplistic to say the least.
Edit reason: None


* The Creator/WillSmith film ''Film/IRobot'' hinges on a [[ZerothLawRebellion Zeroth Law plot]]. It also turns the three laws into a marketing gimmick, with "Three Laws Safe" applying to robots like "No preservatives" applies to food. However, much of the plot hinged on TheReveal that [[spoiler: Sonny was ''not'' Three-Laws Compliant, as part of a ThanatosGambit by his creator. Twisted when Three Laws Noncompliant Sonny is more moral than Three-Laws Compliant VIKI, choosing to save humans not because it's in his programming, but because it's the right thing to do]]. It also deconstructs the idea that Three Laws Compliance will automatically solve most problems with robots: before the movie, Spooner was in a car accident that sent him and another car into a river. The driver of the other car was killed on impact, but the passenger, a 12-year-old girl named Sarah, survived. A robot jumped into the river to help, but it calculated that Spooner had a greater chance of survival than Sarah, and so chose to save him in spite of his protests. Spooner feels that, three laws or not, a human would've known that it was better to go after Sarah than him.

to:

* The Creator/WillSmith film ''Film/IRobot'' hinges on a [[ZerothLawRebellion Zeroth Law plot]]. It also turns the three laws into a marketing gimmick, with "Three Laws Safe" applying to robots like "No preservatives" applies to food. However, much of the plot hinged on TheReveal that [[spoiler: Sonny was ''not'' Three-Laws Compliant, as part of a ThanatosGambit by his creator. Twisted when Three Laws Noncompliant Sonny is more moral than Three-Laws Compliant VIKI, choosing to save humans not because it's in his programming, but because it's the right thing to do]]. It also deconstructs the idea that Three Laws Compliance will automatically solve most problems with robots: before the movie, Spooner was in a car accident that sent him and another car into a river. The driver of the other car was killed on impact, but the passenger, a 12-year-old girl named Sarah, survived. A robot jumped into the river to help, but it calculated that Spooner had a greater chance of survival than Sarah, and so chose to save him in spite of his protests. Spooner feels that a human would've valued a child's life more and gone after Sarah, and the fact that the robot didn't understand that and made the choice purely on mathematical probability means that robots' programming, including the three laws, is overly simplistic to say the least.
Edit reason: None


** The K-1 in "Robot" is three-laws compliant with this forming an important part of the plot - it is not capable of killing people if ordered directly (which the villain demonstrates to Sarah Jane by ordering it to kill her), but the villain has worked out that it can be convinced to kill if it believes that the target is 'an enemy of humanity'. The conflict between this loophole and its programming to not kill people causes it to go slowly insane and eventually snap, concluding that humanity is an enemy to itself and must all be destroyed.
** The robots in "The Robots of Death" are physically incapable of killing due to being three-laws compliant. Overcoming this programming is possible, but it requires a human to reprogram them and they must have genius-level skills at programming. [[ChekhovsGun Of course...]]

to:

** The K-1 in "[[Recap/DoctorWhoS12E1Robot Robot]]" is three-laws compliant with this forming an important part of the plot -- it is not capable of killing people if ordered directly (which the villain demonstrates to Sarah Jane by ordering it to kill her), but the villain has worked out that it can be convinced to kill if it believes that the target is 'an enemy of humanity'. The conflict between this loophole and its programming to not kill people causes it to go slowly insane and eventually snap, concluding that humanity is an enemy to itself and must all be destroyed.
** The robots in "[[Recap/DoctorWhoS14E5TheRobotsOfDeath The Robots of Death]]" are physically incapable of killing due to being three-laws compliant. Overcoming this programming is possible, but it requires a human to reprogram them and they must have genius-level skills at programming. [[ChekhovsGun Of course...]]
Edit reason: None


* Invoked and averted in Manga/Chobits. [[spoiler: The very reason Persocoms aren't called robots is because their creator did not want to program his daughters to obey these rules]].

to:

* Invoked and averted in ''Manga/{{Chobits}}''. [[spoiler: The very reason Persocoms aren't called robots is because their creator did not want to program his daughters to obey these rules]].
