
Alan Turing: Princeton’s Celebrated Alumnus and the Digital Universe

By Linda Arntzenius

Photo illustration by Jorge Naranjo

Mathematical logic seems an unlikely field for a national hero. And yet, Winston Churchill described Alan Turing as having “made the single biggest contribution to Allied victory in the war against Nazi Germany.” Churchill was referring to Turing’s code-breaking for the British intelligence service. As astounding as that was, Turing’s impact goes far beyond his efforts to break the German “Enigma” at Bletchley Park. To computer scientists he’s venerated as a pioneer whose theoretical “Universal Turing Machine” laid the groundwork for today’s digital revolution.

So why did it take so long for Turing to become a household name?

The answer to that question can now be quite simply stated. Turing was a practicing homosexual at a time when homosexual acts were subject to criminal prosecution in the U.K. Not only that, in the paranoid post-war period, homosexuals in Britain, as in the United States, were highly suspect, assumed to be vulnerable to blackmail during the spy-games of the Cold War.

When Turing was charged with “gross indecency” in 1952, just a year after he’d been elected a Fellow of the Royal Society, he lost his security clearance. Rather than go to prison, he agreed to be treated with estrogen injections, a chemical method of castration then thought to “cure” homosexuality, or at least to diminish sexual urges. Two years later, just two weeks before his 42nd birthday, Turing committed suicide by eating a cyanide-laced apple. His war work remained classified for decades. Some of it still is.

ALAN TURING: THE ENIGMA

In 2009, acknowledging the injustice done to the mathematical genius, then British Prime Minister Gordon Brown issued an apology. In 2013, Turing received a Royal Pardon from Her Majesty Queen Elizabeth II. Actor Stephen Fry immediately tweeted, “At bloody last. Next step a banknote if there’s any justice!” No banknote so far, but Turing’s face graces a British postage stamp.

Oxford mathematician Andrew Hodges has been credited with bringing about this long-overdue public recognition of Turing through his definitive 1983 biography Alan Turing: The Enigma. Hodges’s research led to Hugh Whitemore’s 1986 play Breaking the Code, which starred Derek Jacobi in a role he reprised on British television in 1996. It also inspired the recent drama-documentary Code Breaker and the British-American movie The Imitation Game, with Benedict Cumberbatch, Keira Knightley and Charles Dance.

The latter film portrays Turing as a lone hero battling hard-nosed army brass who can’t, or won’t, take the time to understand him. In fact, Turing was part of a collaborative effort at Bletchley Park, and the code-breaking machine was not a “computer” in the way we use the word today, to mean a programmable digital electronic device with built-in memory. Rather, it was an electro-mechanical machine that speeded up the work already being attempted by dozens of mostly female “computers” laboriously working through all the possible settings of the German Enigma machines.
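
To get a feel for the shape of that search, here is a minimal sketch in Python. It uses an invented toy shift cipher rather than the real Enigma wiring, together with a known plaintext fragment, or “crib,” of the kind the Bletchley teams relied on; everything in it is a stand-in, but the brute-force loop is the work the Bombe mechanized.

    ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

    def decrypt(ciphertext, setting):
        # Undo a simple rotating shift; one "setting" stands in for a rotor position.
        return "".join(ALPHABET[(ALPHABET.index(c) - setting) % 26] for c in ciphertext)

    def search(ciphertext, crib):
        # Try every possible setting; keep those whose decryption contains the crib.
        return [s for s in range(26) if crib in decrypt(ciphertext, s)]

    ciphertext = decrypt("WETTERBERICHT", -7)   # encrypt "weather report" by shifting forward 7
    print(search(ciphertext, "WETTER"))         # -> [7], the recovered setting

A military Enigma had on the order of 10 to the 20th power possible configurations rather than 26, which is why the search had to be mechanized at all.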

When Hodges spoke at the Princeton Public Library a month or so before The Imitation Game arrived in American cinemas, the room was packed to capacity. But Hodges made it plain that he would be talking about Turing’s accomplishments as a pioneer of computer science and artificial intelligence rather than the film.


PRINCETON POSTGRAD

Born in London in 1912, Turing grew up with his older brother John in St. Leonards-on-Sea, where his parents had left their sons in the care of a retired Army colonel and his wife. Turing’s father served in the Indian Civil Service, and his parents spent most of their time in colonial India. From 1926 until 1931, Turing attended a boys’ boarding school. From there he went on to King’s College, Cambridge, where he studied quantum mechanics, probability and logic, graduating in 1935.

It was at Cambridge that he wrote the paper that historian of science George Dyson says “would lead the way from logic to machines,” and that earned him renown as the father of theoretical computer science. Titled “On Computable Numbers,” it was published in the Proceedings of the London Mathematical Society shortly after Turing arrived in Princeton in 1936 as a graduate student at the University.

In the 1930s, Princeton University was a magnet for talented young mathematicians from Europe. The newly founded Institute for Advanced Study, which didn’t get its own building until 1939, was sharing space with the University’s stellar mathematics department in Fine Hall (now Jones Hall), which Oswald Veblen had built into a leading center. Veblen and John von Neumann had joined the Institute, whose faculty mingled with University mathematicians and with the Princeton students drawn to work alongside them and others such as Alonzo Church and Kurt Gödel, not to mention Albert Einstein.

“On Computable Numbers” introduced Turing’s idea that a machine could compute anything that a human could compute with paper, pencil and time. He envisioned a simple machine reading and writing symbols on a one-dimensional tape according to “programmed” instructions; a machine that could manipulate numbers according to a set of instructions that would themselves be expressed in numbers. He conceived of “automata,” or “universal Turing machines,” that would be capable of performing any calculation using paper tape and binary digits. In effect, he invented the idea of software.
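
The idea is concrete enough to sketch in a few lines of Python. What follows is a modern illustration, not Turing’s own notation: the “program” is nothing but a table of data saying what to write, which way to move, and which state to enter next, and the sample table below inverts a string of binary digits.

    def run(program, tape, state="start", blank="_"):
        tape = list(tape)
        head = 0
        while state != "halt":
            symbol = tape[head]
            # Look up what to write, which way to move, and the next state.
            write, move, state = program[(state, symbol)]
            tape[head] = write
            head += 1 if move == "R" else -1
            if head == len(tape):           # grow the tape on demand
                tape.append(blank)
        return "".join(tape).rstrip(blank)

    # The "program" is just data: flip each bit and move right; halt on a blank.
    invert = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }

    print(run(invert, "10110"))  # -> 01001

Handing the same simulator a different table makes it compute something else, without touching the simulator itself; that is the sense in which Turing invented software.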

Von Neumann, who had met Turing when Turing was still a student at Cambridge and had recognized his talents, asked the new graduate to stay on in Princeton as his research assistant. Knowing that war with Germany was imminent, however, Turing was keen to return home. “I hope Hitler will not have invaded England before I come back,” he wrote to a friend.

In May 1938, after defending his doctoral dissertation, he sailed back to England, where war was soon declared. As a talented mathematician, Turing was recruited to work on breaking the German Enigma cipher. At Bletchley Park, he worked to build the decoding machine known as “the Bombe,” which would successfully decode German U-boat messages and save lives during the Battle of the Atlantic.

FROM LOGIC TO MACHINE

Today’s digital universe can be traced to the “physical realization” of Turing’s dreams, says Dyson. That realization was constructed by von Neumann and a team of engineers at the Institute, an unlikely place for such a practical, hands-on project. Beginning in 1945, in the Institute’s basement, von Neumann’s team went beyond the sort of electro-mechanical device with switches and rotors that had been used during the war to one that used vacuum tubes to store “programs.” That way, the machine’s hardware settings did not have to be reset for each new calculation. The machine would have a memory.
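
The stored-program idea can be suggested in a short sketch. The four-instruction machine below is invented for illustration, not the IAS machine’s actual order code; the point is that instructions and data sit in the same memory, so changing the computation means writing new values into memory rather than resetting hardware.

    def run(memory):
        acc, pc = 0, 0                    # accumulator and program counter
        while True:
            op, arg = memory[pc]          # fetch and decode the next instruction
            pc += 1
            if op == "LOAD":
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "HALT":
                return memory

    memory = [
        ("LOAD", 4),     # 0: acc <- memory[4]
        ("ADD", 5),      # 1: acc <- acc + memory[5]
        ("STORE", 6),    # 2: memory[6] <- acc
        ("HALT", 0),     # 3: stop
        20, 22, 0,       # 4-6: data living alongside the program
    ]
    print(run(memory)[6])  # -> 42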

The computer von Neumann built was an advance on earlier high-speed machines that had been built to compute ballistics tables. Known as the IAS Machine or MANIAC (Mathematical Analyzer, Numerical Integrator, and Computer), it would be used to determine the feasibility of developing a hydrogen bomb. Von Neumann had worked on the Manhattan Project during the war, and during the summer of 1951, Los Alamos scientists used his Institute machine for a complex, classified thermonuclear calculation, running it without interruption, 24 hours a day, over a period of some 60 days. It was also used to solve fundamental problems in meteorology.

AHEAD OF HIS TIME

As Hodges shows, Turing was fascinated, perhaps even obsessed, by questions of mind, soul, free will, and creativity; in other words, by what it means to be human. Could a machine learn, he wondered? Could a machine make mistakes? Could it feel emotions? By the late 1940s, Turing was anticipating the field we know today as artificial intelligence. His 1950 paper, “Computing Machinery and Intelligence,” posed the question “can machines think?” To answer it he proposed a test whose name inspired the title of the recent movie: he called it “the imitation game.” A questioner would put questions to a computer and to a human being, both located in a separate room and therefore unseen by the questioner. If the questioner cannot tell the computer’s responses from the human’s, then, in Turing’s view, the computer is a thinking machine capable of simulating human behavior.
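
The structure of the test, if not any serious contestant, is easy to sketch. In the toy Python harness below, everything is a placeholder of our own devising; in particular, the “machine” is a deliberately feeble canned-answer program, where Turing imagined a genuinely conversational one.

    import random

    def human(question):
        return input(question + "\n> ")          # a real person types the reply

    def machine(question):
        return "I would rather not say."         # placeholder chatbot

    def imitation_game(questions):
        # Randomly hide which respondent is which; the judge sees only text.
        players = {"A": human, "B": machine}
        if random.random() < 0.5:
            players = {"A": machine, "B": human}
        for question in questions:
            for label, player in players.items():
                print(label + ": " + player(question))
        guess = input("Which respondent is the machine, A or B? ")
        actual = "A" if players["A"] is machine else "B"
        print("Correct!" if guess.strip().upper() == actual else "Fooled.")

    imitation_game(["What is your favorite memory?"])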

Turing realized that being human means being fallible. He understood that human intelligence involves making mistakes and learning from them. Instead of seeking to make machines that would be infallible in their calculations, he envisioned the development of “learning machines.” “What we want is a machine that can learn from experience,” he wrote. “The possibility of letting the machine alter its own instructions provides the mechanism for this.”
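
That remark, too, admits a toy illustration. The sketch below is invented for this article rather than drawn from Turing’s papers; its one honest point is that the machine’s rule table is ordinary data, which the machine itself rewrites whenever experience proves it wrong.

    def learn(examples, rounds=5):
        rules = {}                                # the machine's own "instructions"
        for _ in range(rounds):
            mistakes = 0
            for question, answer in examples:
                if rules.get(question) != answer:
                    rules[question] = answer      # alter its own instructions
                    mistakes += 1
            if mistakes == 0:                     # experience no longer contradicts it
                break
        return rules

    rules = learn([("2+2", "4"), ("capital of France", "Paris")])
    print(rules["capital of France"])  # -> Paris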

According to Dyson, “Turing gave provocative hints about what might lie ahead.” When asked by a friend “under what circumstances he would say that a machine is conscious,” Turing quipped that “if the machine was liable to punish him for saying otherwise then he would say that it was conscious.”

FURTHER READING

Described as “one of the best scientific biographies ever written,” Alan Turing: The Enigma by Andrew Hodges was re-issued last year as a 700-page paperback with a new preface and a foreword by Douglas Hofstadter. It draws from primary sources and interviews with those who knew Turing and explains how his revolutionary idea laid the foundation for modern computing.

Hodges maintains a highly informative website with everything and anything one would wish to know about Turing’s life, work, and legacy. Take a look: www.turing.org.uk.

Turing’s Cathedral: The Origins of the Digital Universe by George Dyson pays tribute to Turing’s vision and relates the development of his ideas by others, principally John von Neumann at the Institute for Advanced Study. Dyson grew up in Princeton and is the son of Institute for Advanced Study faculty member Freeman Dyson. As a child, Dyson played in the barn where spare parts for the Institute’s computer were stored, and he knew many of those who worked on it.

The Computer from Pascal to von Neumann by Herman H. Goldstine traces the modern computer from the days of Charles Babbage and Ada Lovelace through the developments of World War II, the ENIAC at the Moore School of Electrical Engineering of the University of Pennsylvania and von Neumann’s Electronic Computer Project, acknowledging the contribution of George Boole, Alan Turing and John von Neumann along the way.
