History: Headscratchers / TwoThousandTenTheYearWeMakeContact

Added: 1914

Changed: 2

Reason: None


* This troper interpreted Floyd's response somewhat differently. He seems honestly confused when Chandra hits him with the "you did." And then after Chandra tells him about the communication, his facial expression is more realization than guilt. His reaction suggested to me that while he knew of the monolith and what Discovery's mission was changed to, he was set up with some of the directives being traced to him as a potential fall guy in case things went wrong. They did, and he was, as we see in the beginning of the film. Floyd ended up shouldering the entire blame. His "Those sons of bitches. I didn't know!" came across like a man who just realized just how badly he got screwed over.

to:

** This troper interpreted Floyd's response somewhat differently. He seems honestly confused when Chandra hits him with the "you did." And then after Chandra tells him about the communication, his facial expression is more realization than guilt. His reaction suggested to me that while he knew of the monolith and what Discovery's mission was changed to, he was set up with some of the directives being traced to him as a potential fall guy in case things went wrong. They did, and he was, as we see in the beginning of the film. Floyd ended up shouldering the entire blame. His "Those sons of bitches. I didn't know!" came across like a man who just realized just how badly he got screwed over.
** Partly excusable as the loose omniverse style Clarke went for, but in the book itself, Floyd explains in his ship-to-Earth correspondence that he was actually in on the plans for HAL and ''Discovery''. He objected strenuously to it at the time, but the government overruled his concerns.



** Synthetic Algorithmic?

to:

** Synthetic Algorithmic?
** Actually kinda makes sense. In the book, SAL is kind of described as a less complete AI system created after the issues with HAL became known, ostensibly to help find problems with the 9000 series AI design. Her responses and interaction with Chandra suggest a lower level of consciousness than HAL possessed.


Added DiffLines:

** The propulsion system of the ''Discovery'' (as mentioned in the books, but left rather vague in the films save for some production materials) was a so-called "plasma drive", which used a gas-core nuclear reactor that superheated a propellant, kind of like a turbo-charged ion engine; no oxygen necessary. Liquid hydrogen would be more efficient, but also more likely to leak out into space and boil off over time. In the book, the ship used hydrogen on the initial burn from Earth in booster tanks that were discarded, then used ammonia as propellant for the rest of the mission. In the novel ''2001'', the target for ''Discovery'' was changed from Jupiter to Saturn (where the monolith broadcast instead), and THAT meant no solo return trip, as the mission couldn't be redesigned for a return from Saturn. In the films, the ship had enough propellant for a minimum-fuel transfer back to Earth over the course of a couple of years (while the initial flight to Jupiter is described as having taken most of a year). And in all cases, it was assumed HAL was well-adjusted enough that he could keep the ship running and watch the hibernating astronauts without a problem for years. Mining hydrogen from the upper atmosphere would be really difficult, as the ship is not at all designed to handle a planetary atmosphere.
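For what it's worth, the "couple of years" figure for a minimum-fuel return leg matches what a textbook minimum-energy (Hohmann) transfer between Earth's and Jupiter's orbits works out to. A rough sanity-check sketch, assuming circular, coplanar orbits and ignoring the planets' own gravity wells:

```python
# Rough Hohmann (minimum-energy) transfer time between Earth's and
# Jupiter's orbits, as a sanity check on the "couple of years" return
# trip versus the faster, higher-energy outbound leg described above.
# Assumes circular, coplanar orbits; constants are approximate.
r_earth_au = 1.0      # Earth's orbital radius, AU
r_jupiter_au = 5.2    # Jupiter's orbital radius, AU

# Semi-major axis of the transfer ellipse, in AU
a_transfer = (r_earth_au + r_jupiter_au) / 2.0

# Kepler's third law with AU and years: period^2 = a^3
transfer_period_years = a_transfer ** 1.5

# A one-way trip covers half the transfer ellipse
one_way_years = transfer_period_years / 2.0
print(f"Minimum-energy one-way Earth-Jupiter transfer: {one_way_years:.1f} years")  # ~2.7
```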
Reason: None

Added DiffLines:

** Mined Jupiter's atmosphere for the hydrogen they needed? Dunno where they'd get the oxygen, however.

Changed: 261

Removed: 24

Reason: None


\\
If HAL actually stands for "Heuristic/Algorithmic", and it's not just a pun to imply "one step beyond IBM"... what does SAL stand for?
* Synthetic Algorithmic?

to:

* If HAL actually stands for "Heuristic/Algorithmic", and it's not just a pun to imply "one step beyond IBM"... what does SAL stand for?
** Synthetic Algorithmic?
Reason: None

Added DiffLines:

* What was the original plan for HAL? In the original novel the ''Discovery'' lacked the fuel for a return trip. If everything had gone according to plan, at the end of their mission the five astronauts would all have gone into hibernation and left HAL in charge of the ship until a second vessel was sent to retrieve them. But in 2010 we learn that HAL is pretty much hardwired into the ''Discovery'', with the crew of the ''Leonov'' having to leave HAL behind to perish in the ignition of Jupiter. So was marooning HAL out in space the plan all along?
Reason: typo


---> '''HAL:'''Well, certainly no one could have been unaware of the very strange stories floating around before we left. Rumors of something being dug up on the moon. I never gave these stories much credence. But particularly in view of some of the other things that have happened I find them difficult to put out of my mind. For instance, the way all our preparations were kept under such tight security. And the melodramatic touch of putting doctors Hunter, Kimball and Kaminski aboard already in hibernation after four months of separate training on their own.

to:

---> '''HAL:''' Well, certainly no one could have been unaware of the very strange stories floating around before we left. Rumors of something being dug up on the moon. I never gave these stories much credence. But particularly in view of some of the other things that have happened I find them difficult to put out of my mind. For instance, the way all our preparations were kept under such tight security. And the melodramatic touch of putting doctors Hunter, Kimball and Kaminski aboard already in hibernation after four months of separate training on their own.
Reason: better fit trope


*** Before the crap hits the fan, HAL makes some comments to Dave about the oddities of the mission. This may be simply a sign that the "secrecy" directive is starting to crack under the strain, or perhaps HAL is doing the best he can (within the limitations of his orders) to [[IllNeverTellYouWhatImTellingYou reveal the truth to the crew]] so he doesn't need to keep the secret any more.

to:

*** Before the crap hits the fan, HAL makes some comments to Dave about the oddities of the mission. This may be simply a sign that the "secrecy" directive is starting to crack under the strain, or perhaps HAL is doing the best he can (within the limitations of his orders) to [[CouldSayItBut reveal the truth to the crew]] so he doesn't need to keep the secret any more.
Reason: None


* The HAL9000 is supposedly the most advanced computer and AI available to man yet apparently no one checked how it would act when given conflicting directives? This is the kind of thing they teach you about in undergraduate (if not high-school) level computer science. Didn't the supposed genius Chandra think of this? Does HAL Laboratories even employ a QA team that isn't made up of a bunch of stoned monkeys? Any half-way decent test plan would have caught this. HAL should have been programmed to immediately reject any order which causes this kind of conflict.\\

to:

* The [=HAL9000=] is supposedly the most advanced computer and AI available to man yet apparently no one checked how it would act when given conflicting directives? This is the kind of thing they teach you about in undergraduate (if not high-school) level computer science. Didn't the supposed genius Chandra think of this? Does HAL Laboratories even employ a QA team that isn't made up of a bunch of stoned monkeys? Any half-way decent test plan would have caught this. HAL should have been programmed to immediately reject any order which causes this kind of conflict.\\
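Purely as a toy illustration of the pre-flight check this entry is asking for (nothing like this appears in the films or novels, and every name below is hypothetical): before accepting a new directive, test it against the existing ones and refuse it, or kick it back to ground control, when a known contradiction turns up.

```python
# Hypothetical sketch of a directive-consistency check; not from the
# films or novels. A known contradiction is escalated to Earth instead
# of being silently absorbed.
def accept_directive(new, existing, known_conflicts):
    """Return 'accepted' or 'escalate to Earth', updating the directive set."""
    for old in existing:
        if frozenset({old, new}) in known_conflicts:
            return "escalate to Earth"
    existing.add(new)
    return "accepted"

# Example: accurate reporting vs. concealing the mission's true purpose.
conflicts = {frozenset({"report information accurately",
                        "conceal the true mission from the crew"})}
directives = {"report information accurately"}

print(accept_directive("conceal the true mission from the crew",
                       directives, conflicts))  # -> escalate to Earth
```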



--> '''HAL:'''Well, certainly no one could have been unaware of the very strange stories floating around before we left. Rumors of something being dug up on the moon. I never gave these stories much credence. But particularly in view of some of the other things that have happened I find them difficult to put out of my mind. For instance, the way all our preparations were kept under such tight security. And the melodramatic touch of putting doctors Hunter, Kimball and Kaminski aboard already in hibernation after four months of separate training on their own.

to:

---> '''HAL:'''Well, certainly no one could have been unaware of the very strange stories floating around before we left. Rumors of something being dug up on the moon. I never gave these stories much credence. But particularly in view of some of the other things that have happened I find them difficult to put out of my mind. For instance, the way all our preparations were kept under such tight security. And the melodramatic touch of putting doctors Hunter, Kimball and Kaminski aboard already in hibernation after four months of separate training on their own.

Added: 1697

Changed: 4891

Removed: 847

Reason: None


The HAL9000 is supposedly the most advanced computer and AI available to man yet apparently no one checked how it would act when given conflicting directives? This is the kind of thing they teach you about in undergraduate (if not high-school) level computer science. Didn't the supposed genius Chandra think of this? Does HAL Laboratories even employ a QA team that isn't made up of a bunch of stoned monkeys? Any half-way decent test plan would have caught this. HAL should have been programmed to immediately reject any order which causes this kind of conflict.\\

to:

* The HAL9000 is supposedly the most advanced computer and AI available to man yet apparently no one checked how it would act when given conflicting directives? This is the kind of thing they teach you about in undergraduate (if not high-school) level computer science. Didn't the supposed genius Chandra think of this? Does HAL Laboratories even employ a QA team that isn't made up of a bunch of stoned monkeys? Any half-way decent test plan would have caught this. HAL should have been programmed to immediately reject any order which causes this kind of conflict.\\



* Or at least call Earth "Um... these directives don't jibe well with each other. What should I do?"
* In the movie, Chandra plainly stated that HAL could complete the mission objectives independently if the crew were killed. Since HAL was handling all the logistics of taking care of the ship, it would have decided that its precise computational ability to run everything would ensure a more successful mission than if the crew ran the ship by themselves.
\\
Basically, either the reason for HAL going psycho is pure BS, or HAL was built, programmed, and tested by a bunch of idiots.
** HAL wasn't a production line model, he was a cutting-edge, one-of-only-three made computer. QA more likely consisted of factoring equations correctly than asking HAL if he ever thought about killing people. The psychosis was an emergent property that they didn't consider, because the secrecy order was bolted on in a hurry before shipping.\\

to:

** Or at least call Earth "Um... these directives don't jibe well with each other. What should I do?"
** In the movie, Chandra plainly stated that HAL could complete the mission objectives independently if the crew were killed. Since HAL was handling all the logistics of taking care of the ship, it would have decided that its precise computational ability to run everything would ensure a more successful mission than if the crew ran the ship by themselves.
\\
** Basically, either the reason for HAL going psycho is pure BS, or HAL was built, programmed, and tested by a bunch of idiots.
** HAL wasn't a production line model, he was a cutting-edge, one-of-only-three made computer. QA more likely consisted of factoring equations correctly than asking HAL if he ever thought about killing people. The psychosis was an emergent property that they didn't consider, because the secrecy order was bolted on in a hurry before shipping.\\




Of course, he didn't want to kill the crew. He first tried to cut contact with Earth, so he wouldn't have to hear any more secrets he had to keep. He was fully capable of completing the mission independently of ground control. The humans on board just would not let it drop though, and began plotting to deactivate HAL. This is not paranoia, HAL could read their lips. So he had to resort to more permanent fixes. In the best interests of the mission, of course.
** This is important. HAL didn't go immediately into KillEmAll mode. It started with minor malfunctions that gradually spiralled more and more out of control until the final psychotic breakdown. The butterfly effect in AI, if you will. Also, Arthur C. Clarke was obviously not a computer scientist! His in-universe explanation of how the super-virus humanity used against the Monolith works in 3001 is also something that someone with experience in computer science can probably pick apart with ease.\\

to:

** Of course, he didn't want to kill the crew. He first tried to cut contact with Earth, so he wouldn't have to hear any more secrets he had to keep. He was fully capable of completing the mission independently of ground control. The humans on board just would not let it drop though, and began plotting to deactivate HAL. This is not paranoia, HAL could read their lips. So he had to resort to more permanent fixes. In the best interests of the mission, of course.
*** This is important. HAL didn't go immediately into KillEmAll mode. It started with minor malfunctions that gradually spiralled more and more out of control until the final psychotic breakdown. The butterfly effect in AI, if you will. Also, Arthur C. Clarke was obviously not a computer scientist! His in-universe explanation of how the super-virus humanity used against the Monolith works in 3001 is also something that someone with experience in computer science can probably pick apart with ease.\\



One of the things one has to consider regarding HAL's breakdown is a very simple one; he was assumed to be JUST a computer, albeit exceptionally advanced. When he was given the commands regarding ''Discovery's'' real mission and keeping quiet about it to Dave and Frank (the hibernators were already in on it; that's why they were trained separately and put to sleep before leaving Earth), they never considered the fact that HAL was, in essence, a ''sentient being.'' No one considered that he would do anything but what he was ordered to do, that he essentially had no free will.
Consider a real human being; not everyone is capable of rationalizing away lies and deceptions. Apply enough pressure in the right way, and they will crack (often dramatically). In HAL's case, he was forced to keep this nagging problem of the Monolith investigation (even an AI would be awed at the idea of extraterrestrial life, IMO) quiet from men he had no choice but to interact with on a closed spacecraft ''he was built into to begin with.'' His built-in objectives to be truthful and transparent didn't help much either (much like human beings generally don't have a built-in compulsion to lie, but quite the opposite). So, he was plagued with this internal conflict for the better part of a year, with nowhere else to go except the ship he was a part of, sailing out towards Jupiter. All the other reasons stand as is; he was just a 4- (or 9-) year-old AI tasked with confusing orders, driven to psychosis by the stress. When he felt his life was threatened (the plan to disconnect him), he felt he had to defend himself. HAL wasn't being batshit; he was just trying to stay alive and figure out a way to stop stressing. Humans do it all the time, why not AI?\\
When Floyd claims ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood is telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also took Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't know that his orders had forced HAL into the programming conflict situation.

to:

*** Before the crap hits the fan, HAL makes some comments to Dave about the oddities of the mission. This may be simply a sign that the "secrecy" directive is starting to crack under the strain, or perhaps HAL is doing the best he can (within the limitations of his orders) to [[IllNeverTellYouWhatImTellingYou reveal the truth to the crew]] so he doesn't need to keep the secret any more.
--> '''HAL:'''Well, certainly no one could have been unaware of the very strange stories floating around before we left. Rumors of something being dug up on the moon. I never gave these stories much credence. But particularly in view of some of the other things that have happened I find them difficult to put out of my mind. For instance, the way all our preparations were kept under such tight security. And the melodramatic touch of putting doctors Hunter, Kimball and Kaminski aboard already in hibernation after four months of separate training on their own.
** One of the things one has to consider regarding HAL's breakdown is a very simple one; he was assumed to be JUST a computer, albeit exceptionally advanced. When he was given the commands regarding ''Discovery's'' real mission and keeping quiet about it to Dave and Frank (the hibernators were already in on it; that's why they were trained separately and put to sleep before leaving Earth), they never considered the fact that HAL was, in essence, a ''sentient being.'' No one considered that he would do anything but what he was ordered to do, that he essentially had no free will.
** Consider a real human being; not everyone is capable of rationalizing away lies and deceptions. Apply enough pressure in the right way, and they will crack (often dramatically). In HAL's case, he was forced to keep this nagging problem of the Monolith investigation (even an AI would be awed at the idea of extraterrestrial life, IMO) quiet from men he had no choice but to interact with on a closed spacecraft ''he was built into to begin with.'' His built-in objectives to be truthful and transparent didn't help much either (much like human beings generally don't have a built-in compulsion to lie, but quite the opposite). So, he was plagued with this internal conflict for the better part of a year, with nowhere else to go except the ship he was a part of, sailing out towards Jupiter. All the other reasons stand as is; he was just a 4- (or 9-) year-old AI tasked with confusing orders, driven to psychosis by the stress. When he felt his life was threatened (the plan to disconnect him), he felt he had to defend himself. HAL wasn't being batshit; he was just trying to stay alive and figure out a way to stop stressing. Humans do it all the time, why not AI?
* When Floyd claims ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood is telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also took Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't know that his orders had forced HAL into the programming conflict situation.
Reason: None

Added DiffLines:

**** ...because Ganymede is one of the worlds given to humanity.
Reason: None



to:

*** The point is HAL was a computer that couldn't handle concealing information, which is something EVERY computer does. Think about it. HAL couldn't keep your email secure because if anyone asked what it said, it'd have to tell them. Imagine what would happen if someone were planning a surprise birthday party within earshot of a 9000 computer.

Added: 1169

Changed: 583

Removed: 2

Reason: None



\\



\\

to:

\\
One of the things one has to consider regarding HAL's breakdown is a very simple one; he was assumed to be JUST a computer, albeit exceptionally advanced. When he was given the commands regarding ''Discovery's'' real mission and keeping quiet about it to Dave and Frank (the hibernators were already in on it; that's why they were trained separately and put to sleep before leaving Earth), they never considered the fact that HAL was, in essence, a ''sentient being.'' No one considered that he would do anything but what he was ordered to do, that he essentially had no free will.
Consider a real human being; not everyone is capable of rationalizing away lies and deceptions. Apply enough pressure in the right way, and they will crack (often dramatically). In HAL's case, he was forced to keep this nagging problem of the Monolith investigation (even an AI would be awed at the idea of extraterrestrial life, IMO) quiet from men he had no choice but to interact with on a closed spacecraft ''he was built into to begin with.'' His built-in objectives to be truthful and transparent didn't help much either (much like human beings generally don't have a built-in compulsion to lie, but quite the opposite). So, he was plagued with this internal conflict for the better part of a year, with nowhere else to go except the ship he was a part of, sailing out towards Jupiter. All the other reasons stand as is; he was just a 4- (or 9-) year-old AI tasked with confusing orders, driven to psychosis by the stress. When he felt his life was threatened (the plan to disconnect him), he felt he had to defend himself. HAL wasn't being batshit; he was just trying to stay alive and figure out a way to stop stressing. Humans do it all the time, why not AI?\\
Reason: None


** Or at least call Earth "Um... these directives don't jibe well with each other. What should I do?"

to:

* Or at least call Earth "Um... these directives don't jibe well with each other. What should I do?"
Reason: None



to:

** Or at least call Earth "Um... these directives don't jibe well with each other. What should I do?"
Reason: None

Added DiffLines:

* Synthetic Algorithmic?
Reason: None

Added DiffLines:

\\
If HAL actually stands for "Heuristic/Algorithmic", and it's not just a pun to imply "one step beyond IBM"... what does SAL stand for?
Reason: None


** This is important. HAL didn't go immediately into KillEmAll mode. It started with minor malfunctions that gradually spiralled more and more out of control until the final psychotic breakdown. The butterfly effect in AI, if you will. Also, Arthur C. Clarke was obviously not a computer scientist! His in-universe explanation of how the super-virus humanity used against the Monolith works is also something that someone with experience in computer science can probably pick apart with ease.\\

to:

** This is important. HAL didn't go immediately into KillEmAll mode. It started with minor malfunctions that gradually spiralled more and more out of control until the final psychotic breakdown. The butterfly effect in AI, if you will. Also, Arthur C. Clarke was obviously not a computer scientist! His in-universe explanation of how the super-virus humanity used against the Monolith works in 3001 is also something that someone with experience in computer science can probably pick apart with ease.\\
Reason: None


\\

to:

\\
Reason: None


Of course, he didn't want to kill the crew. He first tried to cut contact with Earth, so he wouldn't have to hear any more secrets he had to keep. He was fully capable of completing the mission independently of ground control. The humans on board just would not let it drop though, and began plotting to deactivate HAL. This is not paranoia, HAL could read their lips. So he had to resort to more permanent fixes. In the best interests of the mission, of course.\\
** This is important. HAL didn't go immediately into KillEmAll mode. It started with minor malfunctions that gradually spiralled more and more out of control until the final psychotic breakdown. The butterfly effect in AI, if you will. Also, Arthur C. Clarke was obviously not a computer scientist! His in-universe explanation of how the super-virus humanity used against the Monolith works is also something that someone with experience in computer science can probably pick apart with ease.

to:

Of course, he didn't want to kill the crew. He first tried to cut contact with Earth, so he wouldn't have to hear any more secrets he had to keep. He was fully capable of completing the mission independently of ground control. The humans on board just would not let it drop though, and began plotting to deactivate HAL. This is not paranoia, HAL could read their lips. So he had to resort to more permanent fixes. In the best interests of the mission, of course.
** This is important. HAL didn't go immediately into KillEmAll mode. It started with minor malfunctions that gradually spiralled more and more out of control until the final psychotic breakdown. The butterfly effect in AI, if you will. Also, Arthur C. Clarke was obviously not a computer scientist! His in-universe explanation of how the super-virus humanity used against the Monolith works is also something that someone with experience in computer science can probably pick apart with ease.\\
Reason: None

Added DiffLines:

** This is important. HAL didn't go immediately into KillEmAll mode. It started with minor malfunctions that gradually spiralled more and more out of control until the final psychotic breakdown. The butterfly effect in AI, if you will. Also, Arthur C. Clarke was obviously not a computer scientist! His in-universe explanation of how the super-virus humanity used against the Monolith works is also something that someone with experience in computer science can probably pick apart with ease.
Reason: None

Added DiffLines:

*** From the point of view of basic physics it actually is easier (much easier) to turn Jupiter into a star than to move Europa. To turn Jupiter into a star, all you have to do is compress its core until the pressure inside exceeds a critical threshold, and BOOM! you have fusion and Jupiter is a star. It's a one-step process. On the other hand, moving Europa means accelerating the moon until its velocity exceeds Jupiter escape velocity, calculating a trajectory that must take into account at the very least the gravity of Jupiter, the Sun, and Earth, and then decelerating the moon appropriately to insert it into the Lagrange orbital position, all the while maintaining the structural integrity of the moon AND protecting its ecosystem through the duration of the journey. That is a many-step process. If we assume the Monolith-makers are sufficiently advanced such that anything allowable by the laws of physics is within their capabilities, then the single-step process will be easier for them than the multi-step process.
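For a sense of scale on the "moving Europa" half of that comparison, here is a back-of-the-envelope sketch (approximate constants, simple two-body physics, and no claim about how the Monolith-makers would actually do it) of the delta-v and minimum energy needed merely to kick Europa out of Jupiter's gravity well:

```python
# Back-of-the-envelope scale of "just moving Europa": delta-v and minimum
# kinetic energy to take it from its circular orbit to Jupiter escape
# velocity, ignoring the Sun, trajectory shaping, tides, etc.
# Constants are rounded textbook values.
import math

GM_JUPITER = 1.267e17    # Jupiter's gravitational parameter, m^3/s^2
R_EUROPA_ORBIT = 6.71e8  # Europa's orbital radius around Jupiter, m
M_EUROPA = 4.8e22        # Europa's mass, kg

v_orbital = math.sqrt(GM_JUPITER / R_EUROPA_ORBIT)   # ~13.7 km/s
v_escape = math.sqrt(2.0) * v_orbital                 # ~19.4 km/s
delta_v = v_escape - v_orbital                        # ~5.7 km/s

# Minimum extra kinetic energy to reach escape speed from circular orbit
energy_joules = 0.5 * M_EUROPA * (v_escape**2 - v_orbital**2)

print(f"delta-v to escape Jupiter from Europa's orbit: {delta_v / 1000:.1f} km/s")
print(f"minimum energy just to unbind Europa from Jupiter: {energy_joules:.2e} J")
```

And that is before any of the trajectory, deceleration, structural, or ecosystem problems the entry lists.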
Reason: None



to:

** It is also quite possible that David Bowman the man did not know the "Priority Override Alpha" protocol, and that it was knowledge that Starchild David Bowman figured out by interfacing with HAL's code (the same way he interfaced with other computer systems on Earth). In fact the verbal exchange itself may have only been a metaphorical representation of what really happened, for the audience's sake. (Or, alternatively, the verbal exchange is how HAL experienced the interaction - i.e., it is HAL's dream!)
Reason: None



to:

** Europa is probably too massive to remain stable at an Earth-Sun Lagrange point. An object is stable at Lagrange points only if its own mass is small enough to be insignificant relative to the masses of the objects that define the Lagrange point. Europa's mass, on the other hand, is high enough, relative to Earth's, that gravitational interactions between the two will cause Europa to drift out of the Lagrange point and enter an unstable Earth-crossing orbit, with the most likely ultimate outcome being an impact with the Earth. You can put small asteroids and space stations at the Earth-Sun Lagrange point, but you can't put a moon-sized object there.
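For concreteness, the mass figures that argument turns on (rounded textbook values; this only shows that Europa is a non-trivial fraction of Earth's mass, and doesn't by itself settle the stability question):

```python
# Rough mass comparison behind the "too massive for a Lagrange point"
# argument above. Values are rounded textbook figures.
M_EARTH = 5.97e24   # kg
M_EUROPA = 4.80e22  # kg
M_MOON = 7.35e22    # kg, for comparison

print(f"Europa / Earth mass ratio: {M_EUROPA / M_EARTH:.2%}")  # ~0.8%
print(f"Moon / Earth mass ratio:   {M_MOON / M_EARTH:.2%}")    # ~1.2%
```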
Reason: None

Added DiffLines:

* This troper interpreted Floyd's response somewhat differently. He seems honestly confused when Chandra hits him with the "you did." And then after Chandra tells him about the communication, his facial expression is more realization than guilt. His reaction suggested to me that while he knew of the monolith and what Discovery's mission was changed to, he was set up with some of the directives being traced to him as a potential fall guy in case things went wrong. They did, and he was, as we see in the beginning of the film. Floyd ended up shouldering the entire blame. His "Those sons of bitches. I didn't know!" came across like a man who just realized just how badly he got screwed over.
Reason: None


When Floyd claims ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood is telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also took Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't that his orders had forced HAL into the programming conflict situation.

to:

When Floyd claims ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood is telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also took Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't know that his orders had forced HAL into the programming conflict situation.
Reason: None


When Floyd claims ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood is telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also took Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't know that he had ordered HAL into a logical paradox situation.

to:

When Floyd claims ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood is telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also took Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't know that his orders had forced HAL into the programming conflict situation.
Reason: None


When Floyd claimed ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood was telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't know that he had ordered HAL into a logical paradox situation.

to:

When Floyd claims ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood is telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also took Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't know that he had ordered HAL into a logical paradox situation.
Reason: None

Added DiffLines:

\\
When Floyd claimed ignorance of Hal being informed of the Monolith and mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood was telling BlatantLies. Especially when Chandra produces the letter signed by Floyd showing that he had full knowledge of what was going on. I also Floyd's reply of "Those sons of bitches. I didn't know!", to mean that Floyd was doing what his superiors told him to and didn't know that he had ordered HAL into a logical paradox situation.

Added: 356

Removed: 356

Reason: None



* In the movie, Chandra plainly stated that HAL could complete the mission objectives independently if the crew were killed. Since HAL was handling all the logistics of taking care of the ship, it would have decided that its precise computational ability to run everything would ensure a more successful mission than if the crew ran the ship by themselves.



In the movie, Chandra plainly stated that HAL could complete the mission objectives independently if the crew were killed. Since HAL was handling all the logistics of taking care of the ship, it would have decided that its precise computational ability tp run everything would ensure a more successful mission than if the crew ran the ship by themselves.
\\

Added: 356

Removed: 356

Reason: None


* In the movie, Chandra plainly stated that HAL could complete the mission objectives independently if the crew were killed. Since HAL was handling all the logistics of taking care of the ship, it would have decided that its precise computational ability tp run everything would ensure a more successful mission than if the crew ran the ship by themselves.


Added DiffLines:

In the movie, Chandra plainly stated that HAL could complete the mission objectives independently if the crew were killed. Since HAL was handling all the logistics of taking care of the ship, it would have decided that its precise computational ability tp run everything would ensure a more successful mission than if the crew ran the ship by themselves.
\\
Reason: None

Added DiffLines:

* In the movie, Chandra plainly stated that HAL could complete the mission objectives independently if the crew were killed. Since HAL was handling all the logistics of taking care of the ship, it would have decided that its precise computational ability tp run everything would ensure a more successful mission than if the crew ran the ship by themselves.
