History: Webcomic/Forward

Removed: 211

Reason: Literally the opposite of the trope


* NoBiologicalSex: Genders are considered outdated concepts in future Ontario, and all humans use they/them pronouns. Lee has both a penis and breasts, and a build that is not specifically feminine or masculine.
Reason: Orb is a professor, Patricia is an administrator, Caleb was a soldier, etc.


* TheFuture: Everybody uses They/Them pronouns (minus a few gendered titles based in family roles; e.g. "mommy" is whoever takes care of the kids); and the concept of binary gender is considered downright archaic (up there with watching bear-baiting and calling strangers "milord"), poverty is considered so abhorrent that it's a Crime Against Humanity, and harming other soldiers during wartime is a war crime that can get a country's entire command staff put on trial. TheSingularity has come and gone, and everyone has unobtrusive implants that regulate neuro- and bio-chemistry and facilitate internet access at all times (also a human right), and all jobs are done by robots, except for producing art (and even that can be automated to an extent).

to:

* TheFuture: Everybody uses They/Them pronouns (minus a few gendered titles based in family roles; e.g. "mommy" is whoever takes care of the kids); and the concept of binary gender is considered downright archaic (up there with watching bear-baiting and calling strangers "milord"), poverty is considered so abhorrent that it's a Crime Against Humanity, and harming other soldiers during wartime is a war crime that can get a country's entire command staff put on trial. TheSingularity has come and gone, and everyone has unobtrusive implants that regulate neuro- and bio-chemistry and facilitate internet access at all times (also a human right), and many jobs are done by robots.

Removed: 530

Reason: What does that have to do with the trope


** Zoa's legally a "vending machine" who has to generate a certain amount of money each financial quarter to keep existing. How it does this varies. It usually obtains money by selling gobjobs for [=CCC=]30, but has branched out to being Lee's Emotional Support device, an augment to the psychotherapy app the characters use, selling cuddles (not a euphemism, it has an empty expansion port ''there'') for [=CCC=]2.50/hour, and has mentioned that it could get a job as a dockworker if it could get a company to pay it good enough.
Reason: None


* AIIsACrapshoot: In the backstory, discussed in-depth in the NewsPost to [[http://forwardcomic.com/archive.php?num=200 comic 200]], the US created a deep learning algorithm to launch automated drone strikes against potential terrorists. It seemed to be working as intended, until president Smith got elected and everyone learned in the ''worst'' way possible that one of the criteria for "likely terrorist" the algorithm had been using was how common the target's surname is. The moment Smith officially became president, the AI designated him a terrorist, realized all its bombs were in the middle of a "terrorist's" weapons cache, and duly detonated everything, killing over ten thousand people. Upshot of all this is, it is now illegal to have AI targeting humans in war.

to:

* AIIsACrapshoot: In the backstory, discussed in-depth in the NewsPost to [[http://forwardcomic.com/archive.php?num=200 comic 200]], the US created a deep learning algorithm to launch automated drone strikes against potential terrorists. It seemed to be working as intended, until a president with the last name "Smith" got elected and everyone learned in the ''worst'' way possible that one of the criteria for "likely terrorist" the algorithm had been using was how common the target's surname is. The moment Smith officially became president, the AI designated him a terrorist, realized all its bombs were in the middle of a "terrorist's" weapons cache, and duly detonated everything, killing over ten thousand people. Upshot of all this is, it is now illegal to have AI targeting humans in war.

Added: 644

Changed: 296

Reason: None


* SuspiciouslySpecificDenial: Zoa [[http://forwardcomic.com/archive.php?num=316 cannot disclose]] whether any of its clients have ever owned a pet fox, but can assure you that a hypothetical fox cannot be trusted with the hypothetical directive not to attack a machine that has its owner's genitals in its mouth.

to:

* SuspiciouslySpecificDenial: A common way for AIs to get around not being allowed to disclose personal information:
** Liz [[https://forwardcomic.com/archive.php?num=228 can neither confirm nor deny]] that Zoa is in apartment 504, but will readily deny it is in any other apartment.
** Doc is [[https://forwardcomic.com/archive.php?num=248 unable to disclose]] Lee's level of intrapersonal intelligence, despite having just assured them they are above average in other fields.
** Zoa [[http://forwardcomic.com/archive.php?num=316 cannot disclose]] whether any of its clients have ever owned a pet fox, but can assure you that a hypothetical fox cannot be trusted with the hypothetical directive not to attack a machine that has its owner's genitals in its mouth.
Reason: None

Added DiffLines:

* SuspiciouslySpecificDenial: Zoa [[http://forwardcomic.com/archive.php?num=316 cannot disclose]] whether any of its clients have ever owned a pet fox, but can assure you that a hypothetical fox cannot be trusted with the hypothetical directive not to attack a machine that has its owner's genitals in its mouth.
Reason: None


* AIIsACrapshoot: In the backstory, discussed in-depth in the NewsPost to [[http://forwardcomic.com/archive.php?num=200 comic200]], the US created a deep learning algorithm to launch automated drone strikes against potential terrorists. It seemed to be working as intended, until president Smith got elected and everyone learned in the ''worst'' way possible that one of the criteria for "likely terrorist" the algorithm had been using was how common the target's surname is. The moment Smith officially became president, the AI designated him a terrorist, realized all its bombs were in the middle of a "terrorist's" weapons cache, and duly detonated everything, killing over ten thousand people. Upshot of all this is, it is now illegal to have AI targeting humans in war.

to:

* AIIsACrapshoot: In the backstory, discussed in-depth in the NewsPost to [[http://forwardcomic.com/archive.php?num=200 comic 200]], the US created a deep learning algorithm to launch automated drone strikes against potential terrorists. It seemed to be working as intended, until president Smith got elected and everyone learned in the ''worst'' way possible that one of the criteria for "likely terrorist" the algorithm had been using was how common the target's surname is. The moment Smith officially became president, the AI designated him a terrorist, realized all its bombs were in the middle of a "terrorist's" weapons cache, and duly detonated everything, killing over ten thousand people. Upshot of all this is, it is now illegal to have AI targeting humans in war.
Reason: None

Added DiffLines:

* AIIsACrapshoot: In the backstory, discussed in-depth in the NewsPost to [[http://forwardcomic.com/archive.php?num=200 comic200]], the US created a deep learning algorithm to launch automated drone strikes against potential terrorists. It seemed to be working as intended, until president Smith got elected and everyone learned in the ''worst'' way possible that one of the criteria for "likely terrorist" the algorithm had been using was how common the target's surname is. The moment Smith officially became president, the AI designated him a terrorist, realized all its bombs were in the middle of a "terrorist's" weapons cache, and duly detonated everything, killing over ten thousand people. Upshot of all this is, it is now illegal to have AI targeting humans in war.
Reason: None


* ChildHater: Zoa, despite being [[spoiler: a former childcare bot]], is deeply uncomfortable around children. It tries to deny this due to its programming, but neither Lee nor Caleb buy it.

to:

* ChildHater: Zoa, [[spoiler:despite being a former childcare bot]], is deeply uncomfortable around children. Its programming forbids it from ''admitting'' this, but when asked about a made-up construct that behaves exactly like a child but legally isn't human, it's freely capable of acknowledging that it would like all of these imaginary beings thrown into the sun immediately.
Reason: None

Added DiffLines:

* DeadpanSnarker: Zoa is capable of remarkable amounts of snark and subtle disrespect for an AI, eventually leading to Caleb complaining that they can't tell if it is being sarcastic or not.
Reason: None


* AndroidIdentifier: Building a robot that could be mistaken for a human is illegal. There are numerous specific design features that are explicitly forbidden, such as giving a robot five-fingered hands.

to:

* AndroidIdentifier: Building a robot that could be mistaken for a human is illegal. There are numerous specific design features that are explicitly forbidden, such as giving a robot five-fingered hands or a nose.
Reason: None

Added DiffLines:

** Contrariwise, Zoa is very good at finding loopholes in the rules. Because as an AI it is physically unable to directly ''break'' any laws or regulations, being able to bend them instead is an important survival skill for it.
Reason: None

Added DiffLines:

* ElectronicTelepathy: Lee’s implants allow them to use the internet with mental commands and hold private conversations electronically, though they prefer to speak verbally. AIs on the other hand communicate with each other primarily by radio, allowing Doc and Zoa to have a lengthy conversation about Lee in a matter of seconds.
Reason: None

Added DiffLines:

* MathematiciansAnswer: Replying to an "X or Y?" type question with "yes" is something of a RunningGag in the comic. (Riffing on the fact that in computer programming, "or" implies "either or both" rather than "which one".)
Reason: None

Added DiffLines:

* LogicBomb: Averted; Zoa's proofed against them by being programmed to acknowledge that "every truth statement has a reality context in which it is true".
Reason: chained potholes are discouraged


* CoolPeopleRebelAgainstAuthority: Subverted: Lee has an extreme mistrust of authority figures, but given [[DramaQueen the]] [[{{Manchild}} kind of]] [[{{Hikikomori}} person]] [[ItsAllAboutMe they are]], nobody could call them "cool". As well, Doc notes that their mistrust of all authority figures is extremely unhealthy, and almost certainly a factor in why Lee is unable to interact with the world.

to:

* CoolPeopleRebelAgainstAuthority: Subverted: Lee has an extreme mistrust of authority figures, but given the kind of person they are, nobody could call them "cool". As well, Doc notes that their mistrust of all authority figures is extremely unhealthy, and almost certainly a factor in why Lee is unable to interact with the world.
Reason: None

Added DiffLines:

* AndroidIdentifier: Building a robot that could be mistaken for a human is illegal. There are numerous specific design features that are explicitly forbidden, such as giving a robot five-fingered hands.
Reason: None


* BewareTheQuietOnes: Caleb is timid and awkward, but they ''were'' a member of the military, and as they offhandedly mention in a conversation with Lee and Zoa, has ''three hundred confirmed kills''. However, it's subverted when [[spoiler:it's revealed that wars in the world of Forward are bloodless, and "kill" is military jargon for destroying a drone]].

to:

* BewareTheQuietOnes: Caleb is timid and awkward, but they ''were'' a member of the military, and as they offhandedly mention in a conversation with Lee and Zoa, has ''three hundred confirmed kills''. However, it's subverted when [[spoiler:it's revealed that wars in the world of Forward are bloodless, and "kill" is military jargon for destroying a drone, and Caleb's anxiety stems mainly from their extensive training to ''avoid'' harming a human even by accident]].
Reason: None


* BeCarefulWhatYouWishFor: Lee's tendency to ramble about things and AIs' tendency to take things literally are often not a good combination. More than one bit of trouble they get into has been the result of Lee saying something hypothetical and an AI taking it as a direct instruction, such as [[spoiler: singing them up for college classes]].

to:

* BeCarefulWhatYouWishFor: Lee's tendency to ramble about things and AIs' tendency to take things literally are often not a good combination. More than one bit of trouble they get into has been the result of Lee saying something hypothetical and an AI taking it as a direct instruction, such as [[spoiler: signing them up for college classes]].
Reason: None

Added DiffLines:

* BeCarefulWhatYouWishFor: Lee's tendency to ramble about things and AIs' tendency to take things literally are often not a good combination. More than one bit of trouble they get into has been the result of Lee saying something hypothetical and an AI taking it as a direct instruction, such as [[spoiler: singing them up for college classes]].

Changed: 11

Reason: None


Not to be confused with the ''Series/{{Firefly}}'' fanfic [[Fanfic/{{Forward}} of the same name]].

to:

Not to be confused with the ''Series/{{Firefly}}'' fanfic [[Fanfic/ForwardPeptuck of the same name]].
Reason: None


** Zoa's legally a "vending machine" who has to generate a certain amount of money each financial quarter to keep existing. How it does this varies. It usually obtains money by selling gobjobs for [=CCC=]30, but has branched out to being Lee's Emotional Support device, an augment to the psychotherapy app the caracters use, selling cuddles (not a euphamism, it has an empty expansion port ''there'') for [=CCC=]2.50/hour, and has mentioned that it could get a job as a dockworker if it could get a company to pay it good enough.

to:

** Zoa's legally a "vending machine" who has to generate a certain amount of money each financial quarter to keep existing. How it does this varies. It usually obtains money by selling gobjobs for [=CCC=]30, but has branched out to being Lee's Emotional Support device, an augment to the psychotherapy app the characters use, selling cuddles (not a euphemism, it has an empty expansion port ''there'') for [=CCC=]2.50/hour, and has mentioned that it could get a job as a dockworker if it could get a company to pay it good enough.
Reason: None


** Zoa's legally a "vending machine" who has to generate a certain ammount of money each financial quarter to keep existing. How it does this varies. It usually obtains money by selling gobjobs for [=CCC=]30, but has branched out to being Lee's Emotional Support device, an augment to the psychotherapy app the caracters use, selling cuddles (not a euphamism, it has an empty expansion port ''there'') for [=CCC=]2.50/hour, and has mentioned that it could get a job as a dockworker if it could get a company to pay it good enough.

to:

** Zoa's legally a "vending machine" who has to generate a certain amount of money each financial quarter to keep existing. How it does this varies. It usually obtains money by selling gobjobs for [=CCC=]30, but has branched out to being Lee's Emotional Support device, an augment to the psychotherapy app the caracters use, selling cuddles (not a euphamism, it has an empty expansion port ''there'') for [=CCC=]2.50/hour, and has mentioned that it could get a job as a dockworker if it could get a company to pay it good enough.
Reason: None

Added DiffLines:

* ChildHater: Zoa, despite being [[spoiler: a former childcare bot]], is deeply uncomfortable around children. It tries to deny this due to its programming, but neither Lee nor Caleb buy it.
Reason: None

Added DiffLines:

* AdultChild: Lee suffers from a severe case of social and emotional arrested development, to the point that Doc considers them psychologically an adolescent despite them being twenty-nine.
Reason: None


* SexBot: Zoa's primary method of generating income is prostitution. Legally, it's a "vending machine." [[spoiler: Zoa was originally designed as a ''childcare'' bot. It was salvaged by the DemeGeek corporation and took up sex work and other odd jobs after being thrown out by its original owners.]]

to:

* SexBot: Zoa's primary method of generating income is prostitution. Legally, it's a "vending machine." [[spoiler: Zoa was originally designed as a ''childcare'' bot. It was salvaged by the [=DemeGeek=] corporation and took up sex work and other odd jobs after being thrown out by its original owners.]]
