Artificial intelligence is impacting the future of virtually every industry and every human being, and we owe it all to an Irishman named McCarthy.
By Ray Cavanaugh
Few phrases evoke the rapid high-tech acceleration of our era like “artificial intelligence” (also known by the somewhat less-threatening acronym “AI”). Until very recently, intelligence, for better or worse, was ours alone. But, having ascended to the apex of the animal kingdom on the strength of that intelligence, we now find ourselves increasingly eclipsed by something that has no “life.”
AI was originally known as “Automata Studies,” but one man, John McCarthy, had the presence of mind to recognize that such a name was less than ideal, largely because few people knew what “automata” meant. So he rechristened it as “artificial intelligence.” Aside from supplying this much-needed name change, McCarthy was, at the very least, one of the seminal figures in AI, and many consider him the field’s progenitor.
McCarthy was born in Boston on Sep. 4, 1927. His mother, Ida McCarthy née Glatt, was a Lithuanian Jewish immigrant who worked at different times as a journalist and social worker. His father, John Patrick McCarthy, was from Cromane, County Kerry. He worked as a longshoreman and union organizer. “The family politics was Marxist, the family religion, atheism,” as related by Philip J. Hilts in his book Scientific Temperaments: Three Lives in Contemporary Science.
As one might expect of a future computer science icon, McCarthy did well academically. Despite starting school late due to childhood illness, he still managed to skip enough grades to finish high school early in California, where his family moved to live in a climate more conducive to his health.
The whiz kid stayed reasonably healthy, but at the California Institute of Technology (Caltech) he encountered a serious obstacle – compulsory gym class. Like many a geek, McCarthy was less than athletic, and he flatly refused to attend. But Caltech wasn’t bluffing about the class being compulsory: he was kicked out of school.
McCarthy then entered the U.S. Army, where he claimed to find the physical demands more agreeable than the ones at university. He never saw combat and spent most of his time working as a supply clerk. He eventually returned to Caltech, which apparently had relaxed its phys ed requirement. After graduating with his bachelor’s degree in 1948, he headed to Princeton University, where he obtained his Ph.D. in mathematics in 1951.
At Princeton, he taught for a short time before relocating to Dartmouth, where in 1956 he organized the now-famous summer conference that gave the field of artificial intelligence its name. He then headed to MIT, where he co-founded the Artificial Intelligence Laboratory in 1958. Around this time, he invented the LISP programming language, which became the dominant language for AI research and remains one of the oldest programming languages still in use. McCarthy was also among the first to propose and develop time-sharing, in which multiple users share one central processing unit (the “CPU,” or, in layman’s terms, the “brains of the computer”).
In 1962, he relocated to Stanford, where he spent the next four decades as a Professor of Computer Science and where he founded the Stanford Artificial Intelligence Laboratory. By the 1960s, the cutting-edge subject of AI was generating much excitement, but some critics were unconvinced of its capabilities. Among the field’s most notable detractors was UC Berkeley professor Hubert Dreyfus, who viewed AI as a modern incarnation of alchemy.
But McCarthy knew he was no alchemist and remained “driven by an insatiable desire to model human reasoning using computers,” as stated at jmc.stanford.edu (a website dedicated to McCarthy). Among the many formal distinctions he would receive: the 1971 Turing Award (often called the “Nobel Prize of computing,” as the most prestigious honor in computer science); the National Medal of Science (1990); the Benjamin Franklin Medal (2003); membership in the National Academy of Sciences; and a slew of honorary doctorates. He also served as President of the American Association for Artificial Intelligence (1983-84).
For someone who had garnered such recognition, McCarthy could be far from gleeful when discussing his career. Sam Williams’s book Arguing A.I.: The Battle for 21st Century Science includes a quote from the renowned scientist talking about one of his AI programs: “I somehow thought that getting this group of people working together would produce great results. As it happened, they didn’t work together and it wouldn’t have done any good if they had, because they didn’t have good enough ideas.”
But even the critical McCarthy, along with the hardest of AI skeptics, had to acknowledge an AI triumph in 1997, when Garry Kasparov – the Mike Tyson of chess – lost a six-game match to IBM’s Deep Blue. This conquest of AI over the human mind (and an exceedingly high-functioning human mind, no less) gave fodder to hysterical predictions involving intelligent machines.
For his part, McCarthy was quick to curb wild speculation. One cannot deny, however, that recent years have seen AI rise in relevance. The website emerj.com provides a number of examples of AI with widespread impact, among them ride-sharing apps, autopilot features on the planes we fly in, spam filters on our email, and mobile check deposits. Those ubiquitous social media platforms also make heavy use of AI. And a much stronger AI presence (for example, streets full of self-driving cars) seems to be nearing our collective horizon.
McCarthy, who retired as a professor at the end of 2000, explored an AI-inundated society in some of his science fiction writing. His short story, “The Robot and the Baby,” involves a late-21st century “single mother addicted to alcohol and crack” who receives a government-issued robot to assist with parenting and household chores. When this robot, named R781, fails to retrieve a sandwich fast enough for the mother’s liking, she kicks both the robot and her own baby out of the house. She later accuses R781 of “kidnapping” her child and successfully sues the robot’s manufacturer.
McCarthy, the part-time writer and full-time computer science icon, died of a heart attack at his California home on Oct. 24, 2011, at age 84. He left behind two daughters and one son, along with his first and third wives (his second wife died in a climbing accident in the Himalayas). He also left behind an ever-burgeoning legacy of inorganic self-regulating entities that can imitate or, perish the thought, even surpass human intelligence. Indeed, AI lacks blood and flesh, but it has a human father, and an Irish one at that.
Ray Cavanaugh is a freelance scribe from Massachusetts. His mother is from Kerry and his father is a few generations removed from Wexford. He’s a regular Window on the Past contributor to Irish America.