History Main / LogicBomb


Reason: None


* In an episode of ''WesternAnimation/{{Jumanji}}'', a steampunk scientist steals Peter's laptop to use as the central processing unit of his reality controlling computer. After it gains sentience and tries to kill everyone around it, Peter typed in "why?". It couldn't give an answer, and shut down.

to:

* In an episode of ''WesternAnimation/JumanjiTheAnimatedSeries'', a steampunk scientist steals Peter's laptop to use as the central processing unit of his reality-controlling computer. After it gains sentience and tries to kill everyone around it, Peter types in "why?". It can't give an answer, and shuts down.
Reason: None


* In ''ComicBook/TheUnstoppableWasp'' (2018) #7, [[ComicBook/TheVision2015 Viv Vision]] is giving Nadia a tour of her strange family tree when she comes up to [[ComicBook/YoungAvengers Wiccan and Speed]]. She literally ERROR[=s=] out when trying to explain the logistics of their existence[[note]]YOU try to explain [[BigScrewedUpFamily how the grandsons of Ultron exist when they weren't built but created through magic]]![[/note]].

to:

* In ''ComicBook/TheUnstoppableWasp'' (2018) #7, [[ComicBook/TheVision2015 Viv Vision]] is giving Nadia a tour of her strange family tree when she comes up to [[ComicBook/YoungAvengers Wiccan and Speed]]. She literally [=ERRORs=] out when trying to explain the logistics of their existence.[[note]]YOU try to explain [[TangledFamilyTree how the grandsons of Ultron exist when they weren't built but created through magic]]![[/note]]
Reason: None


'''Yes-Man:''' But. Not you... the president. That you who said you hated. You... you who love. Hate. Yankees. ''Clouds''. '''[BOOM]'''

to:

'''Yes-Man:''' But- not you, the... president, the- you who... said you hated- you- you who love- hate- Yankees- ''clouds-'' '''[BOOM]'''
Reason: None


[[caption-width-right:350:Yeah, good luck with that. [[HeroicMime Especially #3.]][[labelnote:Explanation]]The last statement itself isn't paradoxical; if the set of all sets did exist, then it certainly would contain itself. The last statement is in fact a shortened form of [[http://en.wikipedia.org/wiki/Russell%27s_paradox "Does the set of all sets that don't contain themselves contain itself?"]] which is often solved by declaring that a set cannot contain itself. This makes "the set of all sets" an impossible construct.[[/labelnote]]]]

to:

[[caption-width-right:350:[[HeroicMime Yeah, good luck with that]]. Especially #3.[[labelnote:Explanation]]The last statement itself isn't paradoxical; if the set of all sets did exist, then it certainly would contain itself. The last statement is in fact a shortened form of [[http://en.wikipedia.org/wiki/Russell%27s_paradox "Does the set of all sets that don't contain themselves contain itself?"]] which is often solved by declaring that a set cannot contain itself. This makes "the set of all sets" an impossible construct.[[/labelnote]]]]
Reason: None

Added DiffLines:

** Played straight with [[https://www.lesswrong.com/posts/kmWrwtGE9B9hpbgRT/a-search-for-more-chatgpt-gpt-3-5-gpt-4-unspeakable-glitch Glitch Tokens]], which are certain groups of words that will confuse [=LLMs=] with results ranging from being unable to repeat them properly to outright crashing when trying to say them. Every model has a different set of glitch tokens which depends on the tokenizer used to train them.



Reason: Clarified the example of "Buridan's ass"


** Which itself is an adaptation of the classic paradox of a donkey being placed equidistant to two equally sized, equally nutritious bags of feed-- unable to choose between the two, the dictates of pure logic would lead it to starve to death exactly where it was.

to:

** Which itself is an adaptation of the classic paradox of [[https://en.wikipedia.org/wiki/Buridan%27s_ass Buridan's ass]]: a donkey is placed equidistant to two equally sized, equally nutritious bags of feed; unable to choose between the two, the dictates of pure logic would lead it to starve to death exactly where it was.
Reason: None

Added DiffLines:

* ''WebAnimation/{{Drone}}'': This is what kicks off the plot: [[spoiler:During the demonstration livestream where an AI-guided AttackDrone named Newton destroys a building, some debris forms a face that Newton mistakes for a civilian casualty; its failure to identify the face on any database short-circuits it, leading it to fly off on a journey of self-reflection.]]
Reason: None


** It goes even a bit further in his continued existence: [[spoiler:G0-T0 still follows the directive to help the Republic. At the same time as an infrastructure droid it is programmed to value efficiency. This provides a paradox, as G0-T0 view the Republic as a bloated, ineffectual entity that clings on to bad management decisions, and it would be better for the galaxy to simply scrap the entire political system and place a new one in its stead. It is programmed to support something which another part of its programming is meant to remove.]]

to:

*** It goes even a bit further: [[spoiler:crime lord G0-T0 still follows the directive to help the Republic, but as an infrastructure droid it is programmed to value efficiency. This provides a paradox, as G0-T0 views the Republic as a bloated, ineffectual entity that clings on to bad management decisions, and it would be better for the galaxy to simply scrap the entire political system and place a new one in its stead. It is programmed to support something which another part of its programming is meant to remove.]]



* ''VideoGame/PlanescapeTorment'' has a character who successfully convinces a man that he does not, in fact, exist. As a result he ceases to do so. Though to be fair the game is set in a ''D&D'' setting in which a system of [[ClapYourHandsIfYouBelieve "Whatever you believe, is"]] has replaced all laws of nature.[[spoiler:Doing so unlock an optional method of ending the game by ''deliberately'' logic bombing yourself out of existence.]]

to:

* ''VideoGame/PlanescapeTorment'' has a character who successfully convinces a man that he does not, in fact, exist. As a result he ceases to do so. Though to be fair the game is set in a ''D&D'' setting in which a system of [[ClapYourHandsIfYouBelieve "Whatever you believe, is"]] has replaced all laws of nature.[[spoiler:Doing so unlocks an optional method of ending the game by ''deliberately'' logic bombing yourself out of existence.]]
Reason: None


** Pinkie Pie brings up this concept when Rainbow Dash has a mental breakdown over the realization that being loyal to multiple things means that when they conflict, she will be forced to choose one and betray the other, no matter which one it is. She tells Rainbow Dash a story about a donkey standing between two equally sized piles of hay, who couldn't choose one over the other and starved to death. This is used to illustrate the point that somepony can be loyal to multiple things at once and they will conflict, but that doesn't make either of them less important.

to:

** Pinkie Pie brings up this concept when Rainbow Dash has a mental breakdown over the realization that being loyal to multiple things means that when they conflict, she will be forced to choose one and betray the other. She tells Rainbow Dash a story about a donkey standing between two equally sized piles of hay, who couldn't choose one over the other and starved to death. This is used to illustrate the point that somepony can be loyal to multiple things at once and they will conflict, but that doesn't make either of them less important--i.e. the donkey will have to choose one hay pile, but refusing to decide and starving to death doesn't help anypony.



** Apple Pie does this to a zombie army by pointing out how they can't be alive and dead at the same time, and should just lie back down and die. ''It works.'' According to Rancor, this is her ''explicit power'' as the Element of Laughter (at least how she represents it). [[spoiler:While Rancor isn't affected, Angry Pie is and is barely able to will herself not to think about it.]] She also does this to [[NinjaPirateRobotZombie giant-cyborg-spider-vampire]] by pointing out it's programmed to protect chaos and destroy harmony, and half the group have an Element of Harmony AND an Element of Chaos, so it can't protect and destroy them at the same time. After this causes it to [[MadeOfExplodium violently explode,]] Rarity and Twilight lampshade it.
---->'''Rarity''': Why do paradoxes always make robots explode? Shutting them down I understand, but exploding?
---->'''Twilight''': Knowing Discord, he probably designed it that way.

to:

** Apple Pie does this to a zombie army by pointing out how they can't be alive and dead at the same time, and should just lie back down and die. ''It works.'' According to Rancor, this is her ''explicit power'' as the Element of Laughter (at least how she represents it). [[spoiler:While Rancor isn't affected, Angry Pie is and is barely able to will herself not to think about it.]] She also does this to a [[NinjaPirateRobotZombie giant-cyborg-spider-vampire]] by pointing out it's programmed to protect chaos and destroy harmony, and half the group have an Element of Harmony AND an Element of Chaos, so it can't protect and destroy them at the same time. After this causes it to [[MadeOfExplodium violently explode,]] Rarity and Twilight lampshade it.
--->'''Rarity''': Why do paradoxes always make robots explode? Shutting them down I understand, but exploding?
--->'''Twilight''': Knowing Discord, he probably designed it that way.
Reason: None


** Pinkie Pie brings up this concept when Rainbow Dash has a mental breakdown over the realization that being loyal to multiple things inevitably means having to choose one when they conflict, betraying the other. She tells Rainbow Dash a story about a donkey standing between two equally sized piles of hay, who couldn't choose one over the other and starved to death. This is used to illustrate the point that somepony can be loyal to multiple things at once and they will conflict, but that doesn't make either of them less important.

to:

** Pinkie Pie brings up this concept when Rainbow Dash has a mental breakdown over the realization that being loyal to multiple things means that when they conflict, she will be forced to choose one and betray the other, no matter which one it is. She tells Rainbow Dash a story about a donkey standing between two equally sized piles of hay, who couldn't choose one over the other and starved to death. This is used to illustrate the point that somepony can be loyal to multiple things at once and they will conflict, but that doesn't make either of them less important.

Added: 537

Changed: 1577

Removed: 184

Reason: None


** Pinkie Pie brings up this concept when Rainbow Dash has a mental breakdown over the realization that being loyal to multiple things inevitably means having to choose one when they conflict, betraying the other. She tells Rainbow Dash a story about a donkey standing between two equally sized piles of hay, who couldn't choose one over the other and starved to death. This is used to illustrate the point that somepony can be loyal to multiple things at once and they will conflict, but that doesn't make either of them less important.



** Apple Pie
*** She does this to a zombie army by pointing out how they can't be alive and dead at the same time, and should just lie back down and die. ''It works.'' According to Rancor, this is Apple Pie's ''explicit power'' as the Element of Laughter (at least how she represents it). [[spoiler:While Rancor isn't affected, Angry Pie is and is barely able to will herself not to think about it.]]
*** She does this to [[NinjaPirateRobotZombie giant-cyborg-spider-vampire]] by pointing out it's programmed to protect chaos and destroy harmony, and half the group have an Element of Harmony AND an Element of Chaos, so it can't protect and destroy them at the same time. After this causes it to [[MadeOfExplodium violently explode,]] Rarity and Twilight lampshade it.
----> '''Rarity''': Why do paradoxes always make robots explode? Shutting them down I understand, but exploding?
----> '''Twilight''': Knowing Discord, he probably designed it that way.

to:

** Apple Pie does this to a zombie army by pointing out how they can't be alive and dead at the same time, and should just lie back down and die. ''It works.'' According to Rancor, this is her ''explicit power'' as the Element of Laughter (at least how she represents it). [[spoiler:While Rancor isn't affected, Angry Pie is and is barely able to will herself not to think about it.]] She also does this to [[NinjaPirateRobotZombie giant-cyborg-spider-vampire]] by pointing out it's programmed to protect chaos and destroy harmony, and half the group have an Element of Harmony AND an Element of Chaos, so it can't protect and destroy them at the same time. After this causes it to [[MadeOfExplodium violently explode,]] Rarity and Twilight lampshade it.
---->'''Rarity''': Why do paradoxes always make robots explode? Shutting them down I understand, but exploding?
---->'''Twilight''': Knowing Discord, he probably designed it that way.
Reason: None

Added DiffLines:

* Subverted in ''Webcomic/{{Unlinked}}''. Hazel tries to invoke this, hoping One-One's confusion over Amelia's existence will buy her enough time to hack into the storage car, since the robot is convinced she's dead. Unfortunately, Amelia herself doesn't feel like saying anything that would propagate the loop, and so One-One manages to exit it relatively quickly by deciding Amelia must be a clone and turns around to ask [[CloneAngst Hazel]] how she feels about this.
Reason: None

Added DiffLines:

* ''WebAnimation/FiftyWaysToDieInMinecraft Fairy Tale Style'' has a magical equivalent of this trope. In Death 28, when the Queen asks the Magic Mirror who is the most beautiful of them all, the Mirror starts glitching out trying to solve the answer before it blue-screens. The problem here is that in a fairy tale world, there are multiple princesses and fair ladies that are labeled as "[[WorldsMostBeautifulWoman the most beautiful of all]]"; the examples the Mirror gives are Snow White, Thumbelina, the Little Mermaid, and Belle, whose name literally means beauty.
Reason: None


* ''VideoGame/{{Fallout}}'':

to:

* ''Franchise/{{Fallout}}'':
Reason: None


* ''Series/TheTwilightZone1985'': In "[[Recap/TheTwilightZone1985S1E12HerPilgrimSoulIOfNewton I of Newton]]", a professor accidentally sold his soul to the devil. The escape clause of the contract allowed him to ask the devil three questions concerning his powers and abilities, and if he could then give him a task he couldn't complete or a question he couldn't answer he was free. When the professor asked the devil if there was any point in the universe that he could go to and not be able to return, the devil assured him there was not and laughed at the professor for such a waste of a question. The professor then gave him a task he could not complete: [[spoiler:Get lost!]]

to:

* ''Series/TheTwilightZone1985'': In "[[Recap/TheTwilightZone1985S1E12 I of Newton]]", a professor accidentally sold his soul to the devil. The escape clause of the contract allowed him to ask the devil three questions concerning his powers and abilities, and if he could then give him a task he couldn't complete or a question he couldn't answer he was free. When the professor asked the devil if there was any point in the universe that he could go to and not be able to return, the devil assured him there was not and laughed at the professor for such a waste of a question. The professor then gave him a task he could not complete: [[spoiler:Get lost!]]

Added: 549

Changed: 459

Reason: None


->''"I cannot — yet I must! How do you calculate that? At what point on the graph do 'must' and 'cannot' meet?"''
-->-- '''Ro-Man''', ''Film/RobotMonster''

to:

->''"I cannot — yet I must! How do you calculate that? At ->'''[[VideoGame/Borderlands2 Claptrap:]]''' You know what point really ticks me off? When some jackwad tries to blow my circuitry with some lame-o stunt he saw on a ''Franchise/StarTrek'' re-run.\\
'''[[VideoGame/SamAndMaxFreelancePolice Sam:]]''' What, like, "Everything I say is a lie?"\\
'''Claptrap:''' Yeah, like that! What, do they think I'll just lock up, because of some teeny tiny logical paradox?\\
'''[[VideoGame/{{Portal}} GLaDOS:]]''' It is rather insulting. I learned how to avoid paradox traps while I was still in Beta.\\
'''Claptrap:''' So what if everything Sam says is a lie? That doesn't mean that he's lying about that, right? 'Cause then he'd be telling
the graph do 'must' and 'cannot' meet?"''
truth and... [[OhCrap Ohhhh noooo...]] (''shuts down'')\\
(''{{beat}}'')\\
'''[=GLaDOS=]:''' [[DeadpanSnarker Well, that was a shining moment in the history of robotkind.]]\\
'''Claptrap:''' (''turns back on'') [[UnexplainedRecovery Annnnd I'm back.]]
-->-- '''Ro-Man''', ''Film/RobotMonster''
''VideoGame/PokerNight2''

Reason: None


* In ''VideoGame/MinecraftStoryMode'', Jesse gives to [[AIIsACrapShoot PAMA]] to momentarily distract himself to give him and the others time to escape. One of the possible choices to tell PAMA is "This sentence is false". These do not work on PAMA, however, since PAMA declares it a paradox and moves on.

to:

* In ''VideoGame/MinecraftStoryMode'', Jesse gives one to [[AIIsACrapShoot PAMA]] to momentarily distract it and give him and the others time to escape. One of the possible choices to tell PAMA is "What I'm saying is a lie." Downplayed in that it only stalls PAMA for some seconds, long enough for PAMA's minions to release you, before PAMA realizes what's happening and puts processing the paradox on hold.
Reason: Added an example from Twitter.

Added DiffLines:

* Attempted by the Website/{{Twitter}} user @prerationalist, with [[https://twitter.com/cnviolations/status/1718800597904712147 a tweet]] that said simply "This Tweet Has A Community Note". ("Community notes" on Twitter are typically used to correct tweets that contain misinformation.) Presumably, if this tweet doesn't have a community note, it should be tagged with a note to point out its falsity -- at which point, it will no longer be false... This was promptly {{defied}} by a community note that said simply, "[[TakeAThirdOption This tweet made a false claim, but it has now been corrected]]", along with a Wikipedia link to the [[https://en.wikipedia.org/wiki/Epimenides_paradox Epimenides paradox]] (famous for being an apparent paradox that isn't ''actually'' a paradox).
Reason: None


** In 2015, Panda Antivirus had the dubious honor of joining Norton in the ''Antivirus harakiri hall of fame''[[http://www.theregister.co.uk/2015/03/11/panda_antivirus_update_self_pwn/]] when a botched update made it detect itself as a virus and promptly commit suicide- although, due to the way it was written, when this occurred, [[TakingYouWithMe Windows promptly died with it]].

to:

** In 2015, Panda Antivirus had the dubious honor of joining Norton in the ''Antivirus harakiri hall of fame'' when a botched update [[http://www.theregister.co.uk/2015/03/11/panda_antivirus_update_self_pwn/ made it detect itself as a virus and promptly commit suicide]] - although, due to the way it was written, when this occurred, [[TakingYouWithMe Windows promptly died with it]].

Added: 375

Changed: 3131

Removed: 666

Reason: None


* In ''Series/DerryGirls'' Clare announces, "Well, I am not being individual on me own."

to:

* In ''Series/DerryGirls'', Clare announces "Well, I am not being individual on me own."



* ''Series/{{JAG}}'': In "Ares", the eponymous computerized weapons control system onboard a destroyer in the Sea of Japan goes havoc and starts firing at friendly aircraft, as programmed by the North Korean Mole. However, Harm’s partner Meg is en route in a helicopter: the on the spot solution advocated by Harm is for the helicopter to fly low and at low speeds, thus simulating a ship, which the computer won’t target. This has an actual basis in reality; to prevent them picking up things like birds or stationary terrain features (and cars on them) most air-search radars have a speed check built in, and won't display contacts moving slower than about 70 mph and below a certain altitude. This was exploited in the First Gulf War to break the Iraqi air-defense system.

to:

* ''Series/{{JAG}}'': In "Ares", the eponymous computerized weapons control system onboard a destroyer in the Sea of Japan goes haywire and starts firing at friendly aircraft, as programmed by the North Korean Mole. However, Harm's partner Meg is en route in a helicopter: the on-the-spot solution advocated by Harm is for the helicopter to fly low and at low speeds, thus simulating a ship, which the computer won't target. This has an actual basis in reality; to prevent them picking up things like birds or stationary terrain features (and cars on them), most air-search radars have a speed check built in, and won't display contacts moving slower than about 70 mph and below a certain altitude. This was exploited in the First Gulf War to break the Iraqi air-defense system.
* In the ''Series/KickinIt'' episode "Rock'em Sock'em Rudy", the Wasabitron 3000 is a robot which replaces Rudy as sensei and becomes violent because it deems humans imperfect. During the fight, the Wasabi Warriors recite the wasabi code, and the robot cannot handle the fact that humans can't beat it but won't give up. It then shuts down. Rudy kicks the robot, breaking it.



* ''Series/TheLibrarians2014'': After Cassandra comes into contact with the Apple of Discord, she tries to blow up a power plant to cause a cascading power failure through all of Europe. Flynn tries to Logic-bomb her by requesting she calculate the last digit of put. She laughs him off. Stone succeeds by asking her to calculate Euler's number.

to:

* ''Series/TheLibrarians2014'': After Cassandra comes into contact with the Apple of Discord, she tries to blow up a power plant to cause a cascading power failure through all of Europe. Flynn tries to Logic-bomb her by requesting she calculate the last digit of pi. She laughs him off. Stone succeeds by asking her to calculate Euler's number.



** In the episode ''[[Recap/MysteryScienceTheater3000S01E07RobotMonster Robot Monster]]'' Servo, Crow and Cambot all explode while trying to work out why bumblebees can fly!
** Parodied again in the episode ''Film/{{Laserblast}}''. The Satellite of Love is invaded by a "MONAD" probe (a parody of the NOMAD probe from ''Franchise/StarTrek'' as mentioned above.) Mike attempts to drop a logic bomb on it, but when it doesn't work he simply picks it up and tosses it out of an airlock.

to:

** In the episode "[[Recap/MysteryScienceTheater3000S01E07RobotMonster Robot Monster]]", Servo, Crow and Cambot all explode while trying to work out why bumblebees can fly!
** Parodied again in the episode "[[Recap/MysteryScienceTheater3000S07E06Laserblast Laserblast]]". The Satellite of Love is invaded by a "MONAD" probe (a parody of the NOMAD probe from ''Franchise/StarTrek'' as mentioned above.) Mike attempts to drop a logic bomb on it, but when it doesn't work he simply picks it up and tosses it out of an airlock.



*** Trying to think of a good thing about ''The Corpse Vanishes''.
*** Watching the first sixteen minutes of ''Star Force: Fugitive Alien 2" (he was fine for the rest of it).
*** After using some InsaneTrollLogic to prove who [[DontAsk Merritt Stone was,]] Joel, Crow and Gypsy use their own to prove that some other guys might be him. It's all too much for Servo who's left screaming: "HE'S NOT MERRITT STONE!" before his head explodes.

to:

*** [[Recap/MysteryScienceTheater3000S01E05TheCorpseVanishes Trying to think of a good thing about]] ''Film/TheCorpseVanishes''.
*** [[Recap/MysteryScienceTheater3000S03E18StarForceFugitiveAlienII Watching the first sixteen minutes of]] ''[[Film/FugitiveAlien Star Force: Fugitive Alien 2]]'' (he's fine for the rest of it).
*** In "[[Recap/MysteryScienceTheater3000S04E19TheRebelSet The Rebel Set]]", after using some InsaneTrollLogic to prove who [[DontAsk Merritt Stone was]], Joel, Crow and Gypsy use their own to prove that some other guys might be him. It's all too much for Servo who's left screaming: "HE'S NOT MERRITT STONE!" before his head explodes.



* In ''Series/ThePrisoner1967'' episode "[[Recap/ThePrisonerE6TheGeneral The General]]", Number Six drops one of these on the titular all-knowing computer, short-circuiting it with the question [[spoiler:"Why?"]]. This episode may well be the TropeCodifier for the "open-ended abstract philosophical question" version of this trope as opposed to the "logical paradox" one.

to:

* ''Series/ThePrisoner1967'': In "[[Recap/ThePrisonerE6TheGeneral The General]]", Number Six drops one of these on the titular all-knowing computer, short-circuiting it with the question [[spoiler:"Why?"]]. This episode may well be the TropeCodifier for the "open-ended abstract philosophical question" version of this trope as opposed to the "logical paradox" one.



* ''Series/{{QI}}''

to:

* ''Series/{{QI}}'':



** "In [[Recap/StarTrekS2E3TheChangeling The Changeling]]", he convinces Nomad, a genocidal robot with a prime directive of finding and exterminating imperfect lifeforms, that it itself is imperfect (it had mistaken Kirk for its similarly-named creator and had failed to recognize this error).
---->'''Nomad:''' Error... error...
*** Also Subverted in the same episode: Nomad believes that Kirk (who it still thinks is its creator) is imperfect. When Kirk asks how an imperfect being could have created a perfect machine, ''Nomad'' simply concludes that it has no idea.

to:

** "In [[Recap/StarTrekS2E3TheChangeling The Changeling]]", he convinces Nomad, a genocidal robot with a prime directive of finding and exterminating imperfect lifeforms, that it itself is imperfect (it had mistaken Kirk for its similarly-named similarly named creator and had failed to recognize this error).
---->'''Nomad:''' Error... error...
***
error). Also Subverted subverted in the same episode: Nomad believes that Kirk (who it still thinks is its creator) is imperfect. When Kirk asks how an imperfect being could have created a perfect machine, ''Nomad'' simply concludes that it has no idea.



* ''Series/StarTrekTheNextGeneration'': In [[Recap/StarTrekTheNextGenerationS5E23IBorg "I Borg"]], a proposed weapon against the Borg was to send them a geometric figure, the analysis of which could never be completed, and which would, therefore, eat more and more processing power until the entire Borg hive mind crashed. Obviously the Borg don't use floating point numbers. Of course they never actually try it, even when they again have access to the Borg network, so they might have realized it wouldn't work off screen, and the sequel to this episode, [[Recap/StarTrekTheNextGenerationS6E24S7E1Descent "Descent"]], suggests that the Borg deal with cyberweapons by simply severing affected systems from the Collective.
* On ''Series/StarTrekDeepSpaceNine'', Rom accidentally Logic Bombs himself while over thinking the MirrorUniverse concept. Hilariously, Rom's self-Logic Bomb simultaneously {{Lampshades}} and side-steps a number of actual logical problems with the MirrorUniverse.
-->"Over here, everything's alternate. So he's a nice guy. Which means the tube grubs here should be poisonous, because they're not poisonous on our side. But if Brunt gave us poisonous tube grubs it would mean he wasn't as nice as we think he is. But he has to be nice because our Brunt isn't."
* ''Series/StarTrekVoyager'': In "Latent Image", [[ProjectedMan the Doctor]] suffer one of these: he was faced with a triage situation where he had to choose between operating on Harry, a friend of his, or [[RedShirt another ensign he barely knew]]. Though his program covers such situations, dictating that the one with the greater chance of survival be treated, in this situation they have both been affected by the same weapon and have the ''exact'' same odds for a successful recovery. He chose Harry since he needs to save ''somebody'' and they are close friends, but because he chose him due to friendship as opposed to a medical reason, the event became an all-consuming obsession afterward and wrecked his ability to function. Curiously, it never seems to occur that the Doctor should have chose Harry because he is the more valuable Bridge officer, which should be standard triage procedure.
** He hadn't been originally programmed to have "personality" subroutines and suspected he was not being objective. Janeway explained to him that he was doing his duty, but he simply didn't believe her. It's entirely possible he would have had roughly the same breakdown regardless of whom he chose.
* In "[[Recap/TheTwilightZone1985S1E12HerPilgrimSoulIOfNewton I of Newton]]", an episode of ''Series/TheTwilightZone1985'', a professor accidentally sold his soul to the devil. The escape clause of the contract allowed him to ask the devil three questions concerning his powers and abilities, and if he could then give him a task he couldn't complete or a question he couldn't answer he was free. When the professor asked the devil if there was any point in the universe that he could go to and not be able to return, the devil assured him there was not and laughed at the professor for such a waste of a question. The professor then gave him a task he could not complete: [[spoiler:Get lost!]]
* In an episode of ''Welcome to Paradox'' an AI is brought down using a theological paradox. The AI was created by a church for their religion and the AI believes itself to be the One True Madonna. However, being an AI it can also produce copies of itself. After it is tricked into doing this and is successful about it, the "One True" part of its identity comes crashing down along with the rest of the program.
* ''Series/{{Westworld}}'' has an excellent variation when Maeve is confronted with readout of her own internal mental processes. Her attempts to say something that the computer won't predict cause her program to crash.

to:

* ''Series/StarTrekTheNextGeneration'': In [[Recap/StarTrekTheNextGenerationS5E23IBorg "I Borg"]], a proposed weapon against the Borg was to send them a geometric figure, the analysis of which could never be completed, and which would, therefore, eat more and more processing power until the entire Borg hive mind crashed. Obviously the Borg don't use floating point numbers. Of course, they never actually try it, even when they again have access to the Borg network, so they might have realized it wouldn't work off screen, and the sequel to this episode, [[Recap/StarTrekTheNextGenerationS6E24S7E1Descent "Descent"]], suggests that the Borg deal with cyberweapons by simply severing affected systems from the Collective.
* In ''Series/StarTrekDeepSpaceNine'', Rom accidentally Logic Bombs himself while overthinking the MirrorUniverse concept. Hilariously, Rom's self-Logic Bomb simultaneously {{Lampshades}} and side-steps a number of actual logical problems with the MirrorUniverse.
-->"Over -->''"Over here, everything's alternate. So he's a nice guy. Which means the tube grubs here should be poisonous, because they're not poisonous on our side. But if Brunt gave us poisonous tube grubs it would mean he wasn't as nice as we think he is. But he has to be nice because our Brunt isn't."
"''
* ''Series/StarTrekVoyager'': In "[[Recap/StarTrekVoyagerS5E11LatentImage Latent Image]]", [[ProjectedMan the Doctor]] suffers one of these: he was faced with a triage situation where he had to choose between operating on Harry, a friend of his, or [[RedShirt another ensign he barely knew]]. Though his program covers such situations, dictating that the one with the greater chance of survival be treated, in this situation they have both been affected by the same weapon and have the ''exact'' same odds for a successful recovery. He chose Harry since he needs to save ''somebody'' and they are close friends, but because he chose him due to friendship as opposed to a medical reason, the event became an all-consuming obsession afterward and wrecked his ability to function. Curiously, it never seems to occur to anyone that the Doctor should have chosen Harry because he is the more valuable Bridge officer, which should be standard triage procedure. He hadn't been originally programmed to have "personality" subroutines and suspected he was not being objective. Janeway explained to him that he was doing his duty, but he simply didn't believe her. It's entirely possible he would have had roughly the same breakdown regardless of whom he chose.
* ''Series/TheTwilightZone1985'': In "[[Recap/TheTwilightZone1985S1E12HerPilgrimSoulIOfNewton I of Newton]]", a professor accidentally sold his soul to the devil. The escape clause of the contract allowed him to ask the devil three questions concerning his powers and abilities, and if he could then give him a task he couldn't complete or a question he couldn't answer he was free. When the professor asked the devil if there was any point in the universe that he could go to and not be able to return, the devil assured him there was not and laughed at the professor for such a waste of a question. The professor then gave him a task he could not complete: [[spoiler:Get lost!]]
* In an episode of ''Welcome to Paradox'', an AI is brought down using a theological paradox. The AI was created by a church for their religion and the AI believes itself to be the One True Madonna. However, being an AI, it can also produce copies of itself. After it is tricked into doing this and is successful about it, the "One True" part of its identity comes crashing down along with the rest of the program.
* ''Series/{{Westworld}}'':
** An excellent variation when Maeve is confronted with a readout of her own internal mental processes. Her attempts to say something that the computer won't predict cause her program to crash.



Reason: None

Added DiffLines:

** [[https://learnprompting.org/docs/category/-prompt-hacking Prompt Hacking]] straddles the line between this trope and LoopholeAbuse: by structuring prompts so that statements and requests the LLM's owners have tried to moderate out slip past their moderation tools, it causes unintended behavior and possibly malfunctions in [=LLMs=]. Poorly protected [=LLMs=] can potentially even have malicious code or other harmful effects slipped into them, almost playing the trope straight.
Reason: None


* The famous "UsefulNotes/The47Ronin" situation was a massive LogicBomb by the standards of the Shogunate. The ronin had violated a direct order from the Shogun by avenging their master Asano through killing Kira, the one who forced him into commiting {{seppuku}}... yet some said that they had acted according to Bushido by avenging their master. Some believed that the truly honorable thing would have been to just charge Kira's home ASAP, deciding to get gallantly cut to pieces through honest action rather than [[ObfuscatingStupidity attain vengeance through trickery]], especially given the major flaw in their plan - the potential for Kira to die of natural causes at some point before their attack - which would have left them unable to avenge Asano.. On the other hand, in a society where [[HonorBeforeReason honour was most definitely more important than life itself]], it could be argued that their actions were the ultimate demonstration of loyalty to their lord: by publicly dishonouring themselves to ensure their plan's success, they could be seen to be making the supreme sacrifice for Asano's sake. As such, the deal was resolved by allowing 46 of the ronin to commit {{seppuku}} instead of being dishonorably executed; the youngest of them was spared and became a monk.

to:

* The famous "UsefulNotes/The47Ronin" situation was a massive LogicBomb by the standards of the Shogunate. The ronin had violated a direct order from the Shogun by avenging their master Asano through killing Kira, the one who forced him into committing {{seppuku}}... yet some said that they had acted according to Bushido by avenging their master. Some believed that the truly honorable thing would have been to just charge Kira's home ASAP, deciding to get gallantly cut to pieces through honest action rather than [[ObfuscatingStupidity attain vengeance through trickery]], especially given the major flaw in their plan - the potential for Kira to die of natural causes at some point before their attack - which would have left them unable to avenge Asano. On the other hand, in a society where [[HonorBeforeReason honour was most definitely more important than life itself]], it could be argued that their actions were the ultimate demonstration of loyalty to their lord: by publicly dishonouring themselves to ensure their plan's success, they could be seen to be making the supreme sacrifice for Asano's sake. As such, the deal was resolved by allowing 46 of the ronin to commit {{seppuku}} instead of being dishonorably executed; the youngest of them was spared and became a monk.



* During an exchange on CNN between Bernie Sanders and Elizabeth Warren for the 2020 presidential primaries, they asked if Sanders ever told Warren that a woman couldn't win the election, to which he answered no. Then they asked Warren ''what she thought when Sanders told her a woman couldn't win the election.'' [[FlatWhat What?]]

to:

* During an exchange on CNN between UsefulNotes/BernieSanders and Elizabeth Warren for the 2020 presidential primaries, they asked if Sanders ever told Warren that a woman couldn't win the election, to which he answered no. Then they asked Warren ''what she thought when Sanders told her a woman couldn't win the election.'' [[FlatWhat What?]]
Reason: None


** In a non-computer version, in "Paradise", Riff uses this to avoid being interfered with by the police in an alternative-reality city where HappinessIsMandatory. When he's accosted for not conforming to the dress code, he claims to be working for the propaganda department and testing the effects of a new kind of outfit on the happiness of onlookers, and asks the policeperson's reaction. They list a number reasons his clothing and gear is suspicious, but then he asks whether he should record that they are unhappy about it, whereupon they are forced to drop the subject, since being unhappy is a punishable offence.

to:

** In a non-computer version, in "Paradise", Riff uses this to avoid being interfered with by the police in an alternative-reality city where HappinessIsMandatory. When he's accosted for not conforming to the dress code, he claims to be working for the propaganda department and testing the effects of a new kind of outfit on the happiness of onlookers, and asks the policeperson's reaction. They list a number of reasons his clothing and gear are suspicious, but then he asks whether he should record that they are unhappy about it, whereupon they are forced to drop the subject, since being unhappy is a punishable offence.

Changed: 45

Reason: None


** In a non-computer version, in "Paradise", Riff uses this to avoid being interfered with by the police in an alternative-reality city where HappinessIsMandatory. When he's accosted for not conforming to the dress code, he claims to be working for the propaganda department and testing the effects of a new kind of outfit on the happiness of onlookers, and asks the policeperson's reaction. They list a number reasons his clothing and gear is suspicious, but then he asks whether he should record that they are unhappy about it, whereupon they are forced to drop the subject.

to:

** In a non-computer version, in "Paradise", Riff uses this to avoid being interfered with by the police in an alternative-reality city where HappinessIsMandatory. When he's accosted for not conforming to the dress code, he claims to be working for the propaganda department and testing the effects of a new kind of outfit on the happiness of onlookers, and asks the policeperson's reaction. They list a number of reasons his clothing and gear is suspicious, but then he asks whether he should record that they are unhappy about it, whereupon they are forced to drop the subject, since being unhappy is a punishable offence.

Added: 914

Changed: 337

Reason: None


* {{Subverted}} (by pre-emptively [[DefiedTrope defying]] it) in ''Webcomic/SluggyFreelance'', chapter "Mecha Easter Bunny". The Mecha Easter Bunny locks down when it encounters multiple targets that look like Bun-bun, whom it is supposed to kill and whom there's only one of, but then the backup "@#%$-IT KILL THEM ALL!" system created for such situations activates.

to:

* ''Webcomic/SluggyFreelance''
** {{Subverted}} (by pre-emptively [[DefiedTrope defying]] it) in ''Webcomic/SluggyFreelance'', chapter "Mecha Easter Bunny". The Mecha Easter Bunny locks down when it encounters multiple targets that look like Bun-bun, whom it is supposed to kill and whom there's only one of, but then the backup "@#%$-IT KILL THEM ALL!" system created for such situations activates.
** In a non-computer version, in "Paradise", Riff uses this to avoid being interfered with by the police in an alternative-reality city where HappinessIsMandatory. When he's accosted for not conforming to the dress code, he claims to be working for the propaganda department and testing the effects of a new kind of outfit on the happiness of onlookers, and asks the policeperson's reaction. They list a number of reasons his clothing and gear is suspicious, but then he asks whether he should record that they are unhappy about it, whereupon they are forced to drop the subject.
Reason: None

Added DiffLines:

* ''Literature/AdrianMole'': Downplayed in ''Cappuccino Years''. When nobody can agree how to celebrate Christmas, everyone states what their ideal Christmas would be. Ivan Braithwaite inputs all the Christmases on his computer, and ends up saying that the computations are beyond it.
Reason: Updating Link


* ''WebAnimation/SonicForHire'': After Franchise/{{Sonic}} travels into the past and immediately tells Knuckles all the stuff he would say to Sonic, the immediate confusion causes Knuckles to have his mind blown... literally.

to:

* ''WebAnimation/SonicForHire'': After Franchise/{{Sonic|TheHedgehog}} travels into the past and immediately tells Knuckles all the stuff he would say to Sonic, the immediate confusion causes Knuckles to have his mind blown... literally.

Added: 3750

Changed: 989

Removed: 4643

Reason: None


[[folder:Film — Live-Action]]

to:

[[folder:Film -- Live-Action]]



* In the German version of ''Film/DrNo'': The BondOneLiner (after the mooks in the ''hearse'' crashed down the cliffs) was slightly altered from its English original version. Into a logic bomb.
-->"What happened there?"\\
"They were in a hurry to attend their own funeral in time."

to:

* In the German version of ''Film/DrNo'': The BondOneLiner (after the mooks in the ''hearse'' crashed down the cliffs) was slightly altered from its English original version into a logic bomb.
-->''"What happened there?"\\
"They were in a hurry to attend their own funeral in time."''



* In ''Film/ForbiddenPlanet'', Dr. Morbius inadvertently Logic Bombs his own faithful servant, [[TalkingLightbulb Robby the Robot]], when he orders it to kill the monster. Robby, who's [[SlidingScaleOfRobotIntelligence apparently more perceptive than Morbius]], realizes that the monster is actually [[spoiler:a reflection of Morbius himself, and is thus unable to kill it without violating his prime directive to avoid harming rational beings.]]
* ''Film/AustinPowersInternationalManOfMystery'' series film ''Austin Powers: The Spy Who Shagged me''. In one of the few human examples, Austin Powers accidentally does this to himself and goes cross-eyed. It is one of the classics, involving time-travel, but the kicker comes if you follow his actual dialogue: He never contradicts himself or sets up a paradox. He just ''proposes'' the idea that he could and gets confused by it. ''There is no logic bomb''. [[SelfDemonstratingArticle Oh great, now I've gone cross-eyed.]]

to:

* In ''Film/ForbiddenPlanet'', Dr. Morbius inadvertently Logic Bombs his own faithful servant, [[TalkingLightbulb Robby the Robot]], when he orders it to kill the monster. Robby, who's [[SlidingScaleOfRobotIntelligence apparently more perceptive than Morbius]], realizes that the monster is actually [[spoiler:a reflection of Morbius himself, and is thus unable to kill it without violating his prime directive to avoid harming rational beings]].
* In ''Film/AustinPowers: The Spy Who Shagged Me'', in one of the few human examples, Austin Powers accidentally does this to himself and goes cross-eyed. It is one of the classics, involving time-travel, but the kicker comes if you follow his actual dialogue: He never contradicts himself or sets up a paradox. He just ''proposes'' the idea that he could and gets confused by it. ''There is no logic bomb''. [[SelfDemonstratingArticle Oh, great, now I've gone cross-eyed]].



-->"Modern women."\\
"Yeah, they've been that way all down through the ages."
* Vizzini (Wallace Shawn) blunders into one in his final scene in ''Film/ThePrincessBride'', after he has accepted Westley's challenge to find some poisonous iocane powder amongst two goblets of wine. He had claimed to be the cleverest man on Earth; unfortunately for him, he proved to be ''so'' clever that [[BlessedWithSuck he ended up overthinking Westley's game and paralyzing himself with indecision]], endlessly coming up with rationalizations for why the poison could be in ''either'' goblet. Ironically, Vizzini is right - but not in the way he would have liked. Westley had [[TakeAThirdOption poisoned]] ''[[TakeAThirdOption both]]'' [[TakeAThirdOption goblets]], surviving even as Vizzini quickly dies because he had spent years immunizing himself to iocane powder.

to:

-->"Modern -->''"Modern women."\\
"Yeah, they've been that way all down through the ages."
"''
* Vizzini (Wallace Shawn) blunders into one in his final scene in ''Film/ThePrincessBride'', after he has accepted Westley's challenge to find some poisonous iocane powder amongst two goblets of wine. He had claimed to be the cleverest man on Earth; unfortunately for him, he proved to be ''so'' clever that [[BlessedWithSuck he ended up overthinking Westley's game and paralyzing himself with indecision]], endlessly coming up with rationalizations for why the poison could be in ''either'' goblet. Ironically, Vizzini is right - -- but not in the way he would have liked. Westley had [[TakeAThirdOption poisoned]] poisoned ''[[TakeAThirdOption both]]'' [[TakeAThirdOption goblets]], goblets, surviving even as Vizzini quickly dies because he had spent years immunizing himself to iocane powder.



* In ''Film/{{Passengers|2016}}'' Jim tries to convince Arthur, the bar-bot, that the ''[[SleeperStarship Avalon]]'s'' hibernation pods actually can [[CryonicsFailure fail]] by spelling out slowly that he's awake 90 years too early. Arthur twitches briefly, then says simply "It's not possible for you to be here."

to:

* In ''Film/Passengers2016'' Jim tries to convince Arthur, the bar-bot, that the ''[[SleeperStarship Avalon]]'s'' hibernation pods actually can [[CryonicsFailure fail]] by spelling out slowly that he's awake 90 years too early. Arthur twitches briefly, then says simply "It's not possible for you to be here."



!!!'''By Author:'''
* Creator/IsaacAsimov:
** "{{Literature/Escape}}": US Robots gets a request from their rival, soon after the rival's supercomputer has broken down. They speculate that the two events are related, and Dr Calvin points out that their system must have violated the [[ThreeLawsCompliant Three Laws of Robotics]] (most likely the First) to have suffered such a serious breakdown. When instructing their own supercomputer, US Robotics is therefore careful to weaken the First Law, hoping that the PersonalityChip will prevent it from breaking down. They're half right; The Brain continues to work, but it has become more quirky, playing [[LiteralGenie practical jokes]] with the prototype hyperspace ship. [[spoiler:Turns out that traveling through hyperspace kills humans, but only temporarily.]]
** "Literature/Liar1941": Dr Calvin and others confront the [[{{Telepathy}} telepathic robot]] over the lies it's been telling. Director Lanning wants to know what part of the assembly [[MiraculousMalfunction accidentally created robotic telepathy]] and she forces the robot to realize that telling the Director will harm him (because it would prove a robot figured out what he couldn't) and refusing to tell hurts him (because the answer was being withheld from him). Her repeated contradictions build and the robot freezes up, becoming useless.
--->"I confronted him with the insoluble dilemma, and he broke down. You can scrap him now-because he'll never speak again." -- '''Dr Susan Calvin''', robopsychologist
** "Literature/MirrorImage": Detective Baley manages to cause [[RobotNames R (obot)]] Preston to shut down due to a conflict in the [[ThreeLawsCompliant Three Laws]] during the interview. Because [[RobotNames R (obot)]] Idda didn't break down during the same point, he takes this asymmetry of evidence as proof that R. Preston's owner was the [[{{UsefulNotes/Plagiarism}} plagiarist]] who stressed the Second Law, ordering their robot not to betray the truth.
** "Literature/{{Robbie}}": When Gloria visits the first-ever talking robot, she unintentionally creates a paradox for it by using the phrase "a robot like you". It's unable to deal with the concept that there is a category of "robot", which it might be a subset of.
** ''Literature/TheRobotsOfDawn'': A preeminent roboticist remarks to Detective Baley that modern robots cannot be fooled with paradoxical situations because the only paradoxes they care about are based on the [[ThreeLawsCompliant Three Laws of Robotics]]. They're also equipped with [[HeadsOrTails random choice]] to resolve near-equal disputes. However, the mystery of this book is a robot (one that he designed) has been shut down with a paradox involving the Three Laws, he's the prime suspect (it is always possible to circumvent the safeguards, but one needs to know the details of the robot's brain, and this one was a secret he never shared). He even dismisses [[Literature/Liar1941 the story of Herbie's paradox]] as a myth (because the mind-reading robot he owns psychically enhanced his skepticism to keep itself safe).
** "Literature/{{Runaround}}": Because the [[ThreeLawsCompliant Rules of Robotics]] ensure that [[AffectionateNickname Speedy]] will avoid endangering itself, Donovan set up an unintentional conflict by casually sending Speedy into a situation he didn't know would be hazardous. With a weak Second Law set against the Third Law, Speedy has been spending hours spinning its wheels at the distance where the two priorities are exactly equal. The conflict is resolved when they exploit the First Law to force him out of the loop. (Later, they order Speedy to complete the original task ''no matter what''; the reinforced Second Law overrides the Third, and Speedy returns with only minor, repairable damage.)
!!!'''By Work:'''
* Creator/MikhailAkhmanov's ''Literature/ArrivalsFromTheDark'': In ''Invasion'', the [[HumanAliens Faata]] use telepathic biological computers to control their ships. It's revealed that these computers are based on a failed [[{{Precursors}} Daskin]] project and have a ''serious'' flaw. If given conflicting orders at the same hierarchy level, they may crash and take the whole ship down with them. This is used by Pavel Litvin when he orders the computer to keep his location hidden from the Faata. When the Faata, whose job is to interface with the computer, tries to order the computer to locate Litvin, the computer warns him of this possibility if the Faata insists the computer carry out the order. Yes, the computer is smart enough to figure out what could cause it to crash but still can't TakeAThirdOption. No wonder the Daskins abandoned the experiment.

to:

!!!'''By Author:'''
* Creator/IsaacAsimov:
** "{{Literature/Escape}}": US Robots gets a request from their rival, soon after the rival's supercomputer has broken down. They speculate that the two events are related, and Dr Calvin points out that their system must have violated the [[ThreeLawsCompliant Three Laws of Robotics]] (most likely the First) to have suffered such a serious breakdown. When instructing their own supercomputer, US Robotics is therefore careful to weaken the First Law, hoping that the PersonalityChip will prevent it from breaking down. They're half right; The Brain continues to work, but it has become more quirky, playing [[LiteralGenie practical jokes]] with the prototype hyperspace ship. [[spoiler:Turns out that traveling through hyperspace kills humans, but only temporarily.]]
** "Literature/Liar1941": Dr Calvin and others confront the [[{{Telepathy}} telepathic robot]] over the lies it's been telling. Director Lanning wants to know what part of the assembly [[MiraculousMalfunction accidentally created robotic telepathy]] and she forces the robot to realize that telling the Director will harm him (because it would prove a robot figured out what he couldn't) and refusing to tell hurts him (because the answer was being withheld from him). Her repeated contradictions build and the robot freezes up, becoming useless.
--->"I confronted him with the insoluble dilemma, and he broke down. You can scrap him now-because he'll never speak again." -- '''Dr Susan Calvin''', robopsychologist
** "Literature/MirrorImage": Detective Baley manages to cause [[RobotNames R (obot)]] Preston to shut down due to a conflict in the [[ThreeLawsCompliant Three Laws]] during the interview. Because [[RobotNames R (obot)]] Idda didn't break down during the same point, he takes this asymmetry of evidence as proof that R. Preston's owner was the [[{{UsefulNotes/Plagiarism}} plagiarist]] who stressed the Second Law, ordering their robot not to betray the truth.
** "Literature/{{Robbie}}": When Gloria visits the first-ever talking robot, she unintentionally creates a paradox for it by using the phrase "a robot like you". It's unable to deal with the concept that there is a category of "robot", which it might be a subset of.
** ''Literature/TheRobotsOfDawn'': A preeminent roboticist remarks to Detective Baley that modern robots cannot be fooled with paradoxical situations because the only paradoxes they care about are based on the [[ThreeLawsCompliant Three Laws of Robotics]]. They're also equipped with [[HeadsOrTails random choice]] to resolve near-equal disputes. However, the mystery of this book is that a robot (one that he designed) has been shut down with a paradox involving the Three Laws, and he's the prime suspect (it is always possible to circumvent the safeguards, but one needs to know the details of the robot's brain, and those details were a secret he never shared). He even dismisses [[Literature/Liar1941 the story of Herbie's paradox]] as a myth (because the mind-reading robot he owns psychically enhanced his skepticism to keep itself safe).
** "Literature/{{Runaround}}": Because the [[ThreeLawsCompliant Rules of Robotics]] ensure that [[AffectionateNickname Speedy]] will avoid endangering itself, Donovan set up an unintentional conflict by casually sending Speedy into a situation he didn't know would be hazardous. With a weak Second Law set against the Third Law, Speedy has been spending hours spinning its wheels at the distance where the two priorities are exactly equal. The conflict is resolved when they exploit the First Law to force him out of the loop. (Later, they order Speedy to complete the original task ''no matter what''; the reinforced Second Law overrides the Third, and Speedy returns with only minor, repairable damage.)
!!!'''By Work:'''
* Creator/MikhailAkhmanov's ''Literature/ArrivalsFromTheDark'': In ''Invasion'', the [[HumanAliens Faata]] use telepathic biological computers to control their ships. It's revealed that these computers are based on a failed [[{{Precursors}} Daskin]] project and have a ''serious'' flaw: if given conflicting orders at the same hierarchy level, they may crash and take the whole ship down with them. Pavel Litvin exploits this by ordering the computer to keep his location hidden from the Faata. When the Faata whose job is to interface with the computer tries to order it to locate Litvin, the computer warns him that it may crash if he insists on the order being carried out. Yes, the computer is smart enough to figure out what could cause it to crash, but it still can't TakeAThirdOption. No wonder the Daskins abandoned the experiment.


Added DiffLines:

* ''Literature/RobotSeries'':
** "Literature/{{Escape}}": US Robots gets a request from their rival, soon after the rival's supercomputer has broken down. They speculate that the two events are related, and Dr Calvin points out that their system must have violated the [[ThreeLawsCompliant Three Laws of Robotics]] (most likely the First) to have suffered such a serious breakdown. When instructing their own supercomputer, US Robotics is therefore careful to weaken the First Law, hoping that the PersonalityChip will prevent it from breaking down. They're half right; The Brain continues to work, but it has become more quirky, playing [[LiteralGenie practical jokes]] with the prototype hyperspace ship. [[spoiler:Turns out that traveling through hyperspace kills humans, but only temporarily.]]
** "Literature/Liar1941": Dr. Susan Calvin and others confront the [[{{Telepathy}} telepathic robot]] over the lies it's been telling. Director Lanning wants to know what part of the assembly [[MiraculousMalfunction accidentally created robotic telepathy]] and she forces the robot to realize that telling the Director will harm him (because it would prove a robot figured out what he couldn't) and refusing to tell hurts him (because the answer was being withheld from him). Her repeated contradictions build and the robot freezes up, becoming useless.
--->'''Dr. Calvin:''' I confronted him with the insoluble dilemma, and he broke down. You can scrap him now-because he'll never speak again.
** "Literature/MirrorImage": Detective Baley manages to cause [[RobotNames R (obot)]] Preston to shut down due to a conflict in the [[ThreeLawsCompliant Three Laws]] during the interview. Because [[RobotNames R (obot)]] Idda didn't break down during the same point, he takes this asymmetry of evidence as proof that R. Preston's owner was the [[{{UsefulNotes/Plagiarism}} plagiarist]] who stressed the Second Law, ordering their robot not to betray the truth.
** "Literature/{{Robbie}}": When Gloria visits the first-ever talking robot, she unintentionally creates a paradox for it by using the phrase "a robot like you". It's unable to deal with the concept that there is a category of "robot", which it might be a subset of.
** ''Literature/TheRobotsOfDawn'': A preeminent roboticist remarks to Detective Baley that modern robots cannot be fooled with paradoxical situations because the only paradoxes they care about are based on the [[ThreeLawsCompliant Three Laws of Robotics]]. They're also equipped with [[HeadsOrTails random choice]] to resolve near-equal disputes. However, the mystery of this book is that a robot (one that he designed) has been shut down with a paradox involving the Three Laws, and he's the prime suspect (it is always possible to circumvent the safeguards, but one needs to know the details of the robot's brain, and those details were a secret he never shared). He even dismisses [[Literature/Liar1941 the story of Herbie's paradox]] as a myth (because the mind-reading robot he owns psychically enhanced his skepticism to keep itself safe).
** "Literature/{{Runaround}}": Because the [[ThreeLawsCompliant Rules of Robotics]] ensure that [[AffectionateNickname Speedy]] will avoid endangering itself, Donovan set up an unintentional conflict by casually sending Speedy into a situation he didn't know would be hazardous. With a weak Second Law set against the Third Law, Speedy has been spending hours spinning its wheels at the distance where the two priorities are exactly equal. The conflict is resolved when they exploit the First Law to force him out of the loop. (Later, they order Speedy to complete the original task ''no matter what''; the reinforced Second Law overrides the Third, and Speedy returns with only minor, repairable damage.)
Is there an issue? Send a MessageReason:
None


** There are posters throughout the facility (one depicted above) that advise employees to stay calm and shout a paradox if an AI goes rogue. [[spoiler:Also, [=GLaDOS=] attempts to do this to destroy the BigBad, Wheatley. Turns out he's [[TooDumbToFool too dumb to understand logic problems.]] It does, however, cause all of the modified, "lobotomized" turrets in the room to crackle and splutter and ''scream'' in agony, meaning even ''they're'' smarter than Wheatley. [=GLaDOS=] survives the logic bomb herself by parsing it as PunctuatedForEmphasis and then willing herself not to think about it, though she admits to the player that it still almost killed her.]]

to:

** There are posters throughout the facility (one depicted above) that advise employees to stay calm and shout a paradox if an AI goes rogue. [[spoiler:Also, [=GLaDOS=] attempts to do this to destroy the BigBad, Wheatley. Turns out he's [[TooDumbToFool too dumb to understand logic problems.]] It does, however, cause all of the modified, "lobotomized" turrets in the room to crackle and splutter and ''scream'' in agony, meaning even ''they're'' smarter than Wheatley. [=GLaDOS=] survives the logic bomb herself by parsing it as PunctuatedForEmphasis and then willing herself not to think about it, though she later admits to the player that it still almost killed her.]]
Is there an issue? Send a MessageReason:
None


** There are posters throughout the facility (one depicted above) that advise employees to stay calm and shout a paradox if an AI goes rogue. [[spoiler:Also, [=GLaDOS=] attempts to do this to destroy the BigBad, Wheatley. Turns out he's [[TooDumbToFool too dumb to understand logic problems.]] It does, however, cause all of the modified, "lobotomized" turrets in the room to crackle and splutter and ''scream'' in agony, meaning even ''they're'' smarter than Wheatley. [=GLaDOS=] survives the logic bomb herself by parsing it as PunctuatedForEmphasis and then willing herself not to think about it, though she declares that it still almost killed her.]]

to:

** There are posters throughout the facility (one depicted above) that advise employees to stay calm and shout a paradox if an AI goes rogue. [[spoiler:Also, [=GLaDOS=] attempts to do this to destroy the BigBad, Wheatley. Turns out he's [[TooDumbToFool too dumb to understand logic problems.]] It does, however, cause all of the modified, "lobotomized" turrets in the room to crackle and splutter and ''scream'' in agony, meaning even ''they're'' smarter than Wheatley. [=GLaDOS=] survives the logic bomb herself by parsing it as PunctuatedForEmphasis and then willing herself not to think about it, though she declares admits to the player that it still almost killed her.]]
