November 22, 2024

Debugging the mythology: was Ada Lovelace really the first computer programmer?

Ada Lovelace—in the background, an illustration of the Difference Engine. Image: Alexander Bertram-Powell / Alamy

Resembling the myths of the ancient Greeks, romanticised stories about science feature brilliant geniuses who make spectacular breakthroughs. Apart from Marie Skłodowska Curie, virtually all these intellectual heroes are male, so campaigners have energetically searched out role models to encourage female students to choose a scientific career. None of them can yet rival Curie, but among British pioneers, the leading candidate for celebrity status is the 19th-century aristocrat Ada Lovelace.

Misleadingly dubbed “the first computer programmer,” Lovelace had her iconic reputation confirmed in the late 1970s, when the United States Defense Department immortalised her by naming a specialised programming language, Ada, in her honour. Does she deserve such prestige? Rich, glamorous and charming, Lovelace was a visionary mathematician who presciently grasped the potential of computers, speculating that one day they might be able to compose poetry or music. But that is very different from compiling a functioning program that can instruct a computer how to process information and make its own decisions.

A remarkable woman in her own right, Lovelace has been elevated to stardom by being associated with two men. The first was her father, the poet Lord Byron, who abandoned the family a few months after Ada was born and died in Greece when she was eight. Dominated by her autocratic mother, the child was subjected to a rigorous education in mathematics interspersed with long periods of lying flat on a board. Desperate to escape this oppressive upbringing, Ada Byron attempted to elope with a tutor but later married William King, who was created Earl of Lovelace. As the Earl and Countess of Lovelace, the couple devoted their lives to studying and reading as they travelled between their various houses.

The flamboyant daughter of Britain’s most notorious poet, Lovelace was soon adopted into London’s elite intellectual circles. Her distinguished acquaintances included Charles Dickens, Michael Faraday and Mary Somerville, although her fairy-tale story has no happy ending—she gambled away much of her fortune on horse racing and died from cancer when she was only 36. As a teenager, she had met the talented but temperamental inventor Charles Babbage, and they established a close and mutually advantageous relationship.

A frequent family guest, Babbage tactfully encouraged Lovelace’s scientific interests and helped expand her mathematical expertise; at the same time, he gained a young and influential admirer with contacts in the right places. Babbage assiduously promoted his prestigious protégée, describing her rapturously to Faraday as “that Enchantress who has thrown her magical spell around the most abstract of Sciences [mathematics].” But other scientists were more sceptical. The astronomer John Herschel commented scathingly on her mathematical abilities: “I find myself in reading her notes at a loss in the same kind of way I feel as when trying to understand any other thing which the explainer himself has not clear ideas of.”

Like Lovelace, Babbage has acquired a heroic position that he does not fully merit. Although certainly a brilliant and inspired inventor, this so-called “Father of Computing” had little impact on either the birth or development of modern digital computers. He began acquiring his posthumous glory in 1971, the centenary of his death, when Maurice Wilkes—who led the Cambridge team building EDSAC, an early electronic computer—acclaimed him as a “Computer Pioneer.” This suggestion was eagerly picked up: by restoring a forgotten ancestor to prominence, the nascent computer industry could immediately acquire a glorious national history. Alan Turing was not even a candidate—his picture now adorns the £50 banknote, but in that era he was regarded with suspicion as a convicted homosexual, and his triumphs at Bletchley Park were still largely unknown.

Paradoxically, Wilkes also accused Babbage of holding back progress by failing to deliver finished products, thereby discouraging successors. Babbage’s first project was his Difference Engine, intended to be a giant calculating machine that could churn out long tables of figures automatically and accurately. The government awarded him funding to realise this device, hoping that it would eventually prove cheaper than hiring human “computers,” many of whom were women.
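For readers curious about the principle, the short Python sketch below is a modern illustration rather than a description of Babbage’s mechanism (the quadratic and its starting values are invented for the example): the method of finite differences lets a machine tabulate a polynomial using nothing but repeated addition.

```python
# A sketch of the finite-difference idea behind the Difference Engine:
# once the starting value and its differences are loaded, every further
# table entry is produced by repeated addition alone.

def difference_table(initial_values, steps):
    """initial_values holds [f(0), first difference, second difference, ...]
    for a polynomial tabulated at unit intervals, where the highest-order
    difference is constant. Returns the first `steps` values of f."""
    registers = list(initial_values)
    table = []
    for _ in range(steps):
        table.append(registers[0])
        # add each lower column into the one above it, as the Engine's
        # stacks of cog wheels did mechanically
        for i in range(len(registers) - 1):
            registers[i] += registers[i + 1]
    return table

# Example polynomial f(x) = 2x^2 + 3x + 5:
# f(0) = 5, first difference f(1) - f(0) = 5, constant second difference = 4
print(difference_table([5, 5, 4], 6))   # [5, 10, 19, 32, 49, 70]
```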

After about ten years, money and enthusiasm ran out before the machine was fully functioning, and Babbage turned his attention to a far more ambitious scheme: his Analytical Engine. Although never completed, this was based on the unprecedented concept that it would “eat its own tail”—that it could modify its own behaviour by using the results of its calculations to determine its next step. This internal decision-making process is a fundamental feature of computers, although Babbage’s machine operated on the familiar decimal system based on the digits zero to nine, not the either/or binary of electronic equipment. Relying on complex systems of interacting cog wheels, the Engine degenerated into a technological nightmare that fruitlessly consumed vast amounts of time and energy.
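The Python loop below is a minimal modern sketch of that “eat its own tail” idea, not a rendering of Babbage’s machinery: each pass feeds its result back in, and the decision to stop or continue is taken from the value the loop has just computed (the square-root example and its tolerance are chosen purely for illustration).

```python
# Each iteration refines the estimate, and the newly computed value
# decides whether the loop carries on or halts.

def square_root(n, tolerance=1e-12):
    guess = n / 2.0
    while True:
        better = (guess + n / guess) / 2.0   # result of this step...
        if abs(better - guess) < tolerance:  # ...determines what happens next
            return better
        guess = better

print(square_root(2.0))   # approximately 1.4142135623730951
```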

Grace Hopper—digital revolutionary. Image: Granger Historical Picture Archive / Alamy

Lovelace became involved while the venture still represented an exciting path towards the future. Babbage invited her to translate a French paper about his Analytical Engine by the mathematician Luigi Menabrea, who later became prime minister of Italy. After nine months of working under Babbage’s guidance, her published version included a commentary that was twice as long as Menabrea’s original. It was attributed to her, but—as Herschel hinted—Babbage may have had an input; it is impossible to know how much. Most famously, one of her additional notes, G, sets out a table for calculating what are called the Bernoulli numbers, which carry great mathematical significance. Even if she was solely responsible for it, the chart is not a program, but shows the stages that would occur in a pre-programmed machine if one existed.
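For the curious, the Python sketch below computes the Bernoulli numbers that Note G targets, using a standard textbook recurrence rather than Lovelace’s tabulated procedure; it is offered only to show the quantity her table was driving at, and it follows the modern indexing convention rather than the numbering used in her note.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B.append(-acc / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```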

Heroes are made, not born. If computer scientists feel they need a 19th-century ancestor, then perhaps Herman Hollerith should supplant Babbage? To tabulate the 1890 US census, Hollerith invented his eponymous punched cards, which remained in use a century later—and he also founded a company that became the international giant IBM.

And as a female role model, the American mathematics graduate Grace Hopper seems eminently more suitable than London’s flighty Victorian socialite. This programming pioneer joined the US Navy Reserve during the Second World War, eventually rose to the rank of rear admiral, and gave her name to a powerful supercomputer. Hopper revolutionised the digital world by insisting that, instead of forcing people to communicate in symbolic code, computers should be taught to speak English. She also made a permanent mark on the English language—the term “debugging” is popularly traced to her team’s removal of a moth that had flown inside some circuitry.