Does not compute

{{Short description|A phrase often uttered by computers and robots in popular culture}}

{{Use dmy dates|date=July 2021}}

{{more citations needed|date=February 2009}}

"Does not compute", and variations of it, are phrases often uttered by computers, robots, and other artificial intelligences in popular culture. The phrase indicates a type of cognitive dissonance on the part of the machine in question. The expression of the phrase "does not compute" by robots or computers attempting to process emotions, contradictions or paradoxes is frequently satirized in popular culture, often leading to the machine's inaction, malfunction or self-destruction. The phrase was used as a catchphrase by the television show My Living Doll in 1964.[https://listserv.linguistlist.org/pipermail/ads-l/2001-September/017179.html Does not compute] (Jesse Sheidlower, American Dialect Society mailing list, 2001-09-15) – cites The Random House Historical Dictionary of American Slang It was further popularized in Lost in Space (1965) as a catchphrase often uttered by The Robot character.

The problem of how to represent the result of a computation that has no numerical answer (for example, 1/0) is a genuine one: early computers could crash on divide-by-zero errors and other undefined operations that their software had not yet been written to handle. The NaN ("not a number") value of IEEE 754 floating-point arithmetic and related representations were introduced to solve this problem.
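As an illustration (a minimal Python sketch, not drawn from any particular historical system), a modern language either raises an error for an operation with no representable result or returns the IEEE 754 value NaN, which propagates through later arithmetic instead of halting the program:

<syntaxhighlight lang="python">
# Minimal sketch: how an operation with no numerical result is handled today.
import math

# Division by zero has no representable result; Python raises an error
# rather than crashing the machine.
try:
    1 / 0
except ZeroDivisionError as error:
    print("does not compute:", error)

# IEEE 754 floating-point arithmetic instead reserves special values.
not_a_number = float("inf") - float("inf")  # undefined, so the result is NaN
print(not_a_number)                         # nan
print(math.isnan(not_a_number))             # True

# NaN propagates through further arithmetic instead of halting the program.
print(not_a_number + 1.0, not_a_number * 0.0)  # nan nan
</syntaxhighlight>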

History and usage

{{Original research section|date=May 2019}}

The phrase often appeared in stories with a theme of the superiority of human emotion over the limitations of machine logic. Despite computers' superior ability to calculate and process information, their lack of emotion and randomness left them unable to resolve cognitive dissonance, which often led to the output "Does not compute". It was usually the computer's response to information it had received but could not reconcile with other information it already held to be true. It could also be seen as a depiction of the limited (and thus flawed) nature of a machine's programming: being pre-programmed, the machine could not adapt to circumstances beyond the scope of its programming, whereas humans could adapt to such unforeseen events.

The phrase was used regularly in the sitcom My Living Doll, in which the android protagonist, Rhoda Miller, uttered it when confronted with contradictory information, usually in relation to human behavior. On the few occasions when she did understand the information, her response was "that does compute".

Perhaps the most famous use of the phrase is in the television series Lost in Space, in which the Robot often said "It does not compute!", to which Dr Smith would reply "What do you mean it doesn't compute, you ninny?!" or words to that effect. The Robot, however, did not shut down or explode; it simply refused to continue working until it was given a more logical command.

In some cases, presenting a computer or robot with such a contradiction would cause it to violently self-destruct. This occurs in several episodes of the original series of Star Trek (e.g. "I, Mudd", "Requiem for Methuselah", "The Return of the Archons" and "The Changeling"), as well as in the finale of Logan's Run. In "The General", an episode of the 1960s television series The Prisoner, Patrick McGoohan's character causes a supercomputer to explode by feeding it the question "Why?".

Such depictions reflect common perceptions of real computers at the time, which usually lacked friendly user interfaces. Computers often responded to bad input with an error message about as helpful as "does not compute", although self-destruction was an unlikely result of bad input or an insoluble problem. The concept of a "killer poke", however, refers to user input intended to induce hardware damage. (See also "Halt and Catch Fire".)

Although it does not use the phrase "does not compute", the short story "Liar!" (1941) by Isaac Asimov is a striking early example of cognitive dissonance leading to a robot's self-destruction: the robot realizes that whether it lies, tells the truth or says nothing, it will cause humans injury, and it is therefore unable to avoid breaking Asimov's First Law of Robotics: "A robot may not harm a human being, or, through inaction, allow a human being to come to harm." This is a more sophisticated treatment of cognitive dissonance leading to self-destruction than most examples from later television science fiction. Asimov explored the theme of AI cognitive dissonance at length in his robot stories.

In the Doctor Who story "The Green Death", the Doctor attempts to put the computer BOSS, which claims to be infallible, out of action by posing the liar paradox. BOSS feigns confusion as he appears to try to resolve the paradox, but has in fact summoned security.

By the 1990s, with the rise of personal computers and the graphical user interface, the public conception of computers became friendlier and more sophisticated, and the image of a computer intelligence unable to respond gracefully to unexpected input gradually faded from fiction. The phrase did appear as comic relief in Star Wars: Episode I – The Phantom Menace (1999), and it re-appeared in the CGI series Star Wars: The Clone Wars in an episode set on the planet Ryloth, in which a number of Twi'lek characters attacked a robotic general, much to the robot's fatal surprise.

References