"2001: A Space Odyssey": HAL's Legacy

In 2001: A Space Odyssey, Stanley Kubrick and Arthur C. Clarke attempted to portray as accurately as possible the technology of future space travel from their vantage point in 1968, with the help of computer pioneers Honeywell, Burroughs and IBM. Now that we have finally reached that popular milestone, how does the vision of the film stand up to the reality of 2001? What did the filmmakers accurately predict and what did they miss? Why don't we have a HAL yet?

2001, the film, has always functioned more as metaphor than as a literal forecast of how things would actually work. In the late 1960s, computer scientists speculated that by reverse-engineering the human brain, we could make computers "think" like humans. Much progress was made initially; unfortunately, artificial intelligence (AI) turned out to be a far harder problem than first thought, on the order of the origins of life or the basis of matter. While computer scientists since the 1970s have made important forays into machine learning, neural networks, global information networks and chess, the reality of a supremely intelligent, humanlike HAL running every aspect of Discovery's mission to Jupiter is still far off. This is because AI progress has tended toward narrow areas such as chess and voice recognition, rather than toward a broad understanding of reasoning, learning and creativity.

The question, according to David G. Stork, editor of HAL's Legacy, is "should we try to make computers intelligent by mimicking a human brain or, instead, exploit their particular strengths — such as rapid search and large memories?" For example, early chess-playing computers focused on reproducing the methods of human grandmasters, whereas Deep Blue, the machine that beat Garry Kasparov in 1997, achieved its victory by rapidly searching enormous numbers of possible move sequences. Similarly, many modern computer applications do not function at all like humans, performing such mundane and repetitive tasks as assembling auto parts or indexing millions of websites for a search engine. Conversely, computers have proven not to be very good at certain tasks at which humans excel, including such seemingly effortless activities as reading comprehension and facial recognition.
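Deep Blue's search-based approach can be illustrated with the classic minimax algorithm, which exhaustively explores a game tree and assumes the opponent always replies with the move worst for you. This is only a toy sketch of the general technique, not Deep Blue's actual program; the tiny tree and its evaluation scores below are invented for illustration:

```python
# Minimax: search a game tree and back up leaf evaluations.
# A leaf is an integer score (from the maximizing player's viewpoint);
# an inner node is a list of child positions.
def minimax(node, maximizing):
    if isinstance(node, int):          # leaf: static evaluation of the position
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# A toy two-ply tree: our move leads to one of two positions, from each of
# which the opponent picks the reply that is worst for us.
tree = [[3, 5], [2, 9]]
print(minimax(tree, maximizing=True))  # -> 3: the opponent denies us the 5 and the 9
```

Deep Blue ran essentially this idea at vast scale, evaluating on the order of 200 million positions per second — brute search exploiting a computer's particular strengths rather than imitating a grandmaster's intuition.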

Meanwhile, society's conception of computers has evolved dramatically over the past 33 years. Kubrick and Clarke could not have foretold the development, by the early 1970s, of the microchip and of easily updatable software running on general-purpose hardware. Computers in the 1960s were hardware-based, and indeed the spacecraft Discovery, including HAL, bristled with control buttons and mechanical consoles. Software-based personal computers, laptops, PDAs and networks have since taken computers out of the laboratory and democratized them, weaving them into virtually every aspect of our lives, from cellular telephones and cameras to refrigerators and even clothing.

Interestingly, while the movie did not foretell the pervasiveness of microcomputers in our daily lives, it did get two important details correct: 
First, in the 1960s the thought of a computer as world chess champion was preposterous. Not only is a computer the world chess champion in 2001, in both the film and in reality, but we accept it nonchalantly. Second, Kubrick and Clarke boldly envisioned joint U.S.-Soviet missions by 2001, and we have achieved that goal with flying colors. Witness the present crew of the International Space Station!

The reality of 2001 is that in narrowly defined fields of AI, such as voice recognition and chess playing, we have surpassed HAL, but in commonsense abilities such as language comprehension and emotion, we still have a very long way to go. Whereas the film envisioned humanlike, generally intelligent computers, our current technology consists of collections of specialized, task-oriented applications in small domains, connected by distinctly nonhuman mainframe computers.

Next week I will discuss the differing visions of space exploration between the 1960s and the early 21st century.


Michael Tanenbaum