Dozens of names come to mind when considering a single individual who had a significant impact on Western culture in the Age of Pluralism and Modernism. The 20th century, spanning January 1, 1901 through December 31, 2000, produced influential musical masters such as B.B. King, Elvis Presley, and Michael Jackson, while scientists of the same period counted Albert Einstein, Marie Curie, and Ivan Pavlov in their number. Sigmund Freud revolutionized the field of psychology, and while Adolf Hitler promoted war, Mahatma Gandhi promoted peace. Any of these people, and many others, might be considered the most influential genius of the 20th century, yet perhaps no single person changed, and continues to change, global culture to the extent of Alan Turing, the father of the modern computer.
Alan Turing (1912–1954) was born in London and showed signs of genius at an early age (Turing & Copeland, 2004). By the age of 15 he was solving advanced mathematical problems without ever having been taught the basics of calculus, and at 16 he grasped the meaning of Einstein's refutation of Newton's laws of motion from a text that did not treat the topic in depth (Hodges, 1983). He continued to pursue his interest in mathematics and, on the basis of his dissertation on the central limit theorem, was elected a Fellow of King's College, Cambridge, when he was only 22 years old (Aldrich, 2009).
The beginning of Turing's impact on the world came in 1936, when he reinterpreted the work of Kurt Gödel, the Austrian logician and mathematician. Turing replaced Gödel's formal, arithmetic-based language with algorithms carried out by a hypothetical device that could perform any mathematical computation expressible as a sequence of simple steps. This device became known as the "Turing machine" and was the precursor to present-day computers: a computer that existed only in theory, yet was capable of running programs supplied to it as data (Crow, 2012). Although the idea is relatively simple, such a machine can run an extremely large number of programs. Turing's work later converged with that of Alonzo Church, and the resulting Church-Turing thesis holds that anything computable at all can be computed by a Turing machine, the insight underlying the general-purpose computer that can run any software.
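To make the idea concrete, the following sketch in Python shows how little machinery such a device requires: a tape of symbols, a read/write head, and a table of rules. The rule format and the example program are purely illustrative assumptions for this essay, not drawn from Turing's 1936 paper.

def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (written_symbol, move, next_state), where
    move is -1 (left) or +1 (right); the machine stops in the state "halt".
    """
    cells = dict(enumerate(tape))            # sparse tape; unused cells read as "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        written, move, state = rules[(state, symbol)]
        cells[head] = written
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Hypothetical example program: flip every bit, then halt on the first blank.
rules = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", +1, "halt"),
}
print(run_turing_machine(rules, "1011"))     # prints 0100_ (the blank marks where the head stopped)

Everything a modern computer does can, in principle, be reduced to a table of rules of this kind, which is precisely why the Turing machine remains the standard theoretical model of computation.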
Over the next two years, Turing continued his work in mathematics while becoming increasingly interested in cryptology. In 1938 he earned his PhD from Princeton with a dissertation on systems of logic based on ordinals, which also introduced the notion of relative computing. During World War II, Turing assisted the British war effort by breaking German codes, in part by developing crib-based decryption, in which a guessed fragment of plaintext is used to attack an intercepted message (Clark & Steadman, 2012). He was the head of Hut 8, the section dedicated to deciphering the signals of the German navy. He continued working with the war department and produced significant breakthroughs in coding and encryption, including breaking the German naval Enigma through the use of an electromechanical cryptanalytic device. This work was crucial to a number of British victories at sea against the Nazis and may have shortened the war by as many as four years. Turing's participation in government decryption efforts was later cut short when his homosexuality became known, leading to the revocation of his security clearance and his exclusion from Britain's cryptological program.
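As an illustration of the crib idea only, and not of the actual Bombe procedure, the short Python sketch below exploits one well-known property of Enigma, that no letter was ever enciphered as itself, to rule out positions where a guessed plaintext fragment cannot align with an intercepted message. The ciphertext shown is invented for the example.

def possible_crib_positions(ciphertext, crib):
    """Return offsets where the crib could plausibly align with the ciphertext."""
    positions = []
    for offset in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[offset:offset + len(crib)]
        # A "crash" (the same letter in the same slot) excludes this alignment,
        # because Enigma never enciphered a letter as itself.
        if all(c != p for c, p in zip(window, crib)):
            positions.append(offset)
    return positions

ciphertext = "QFZWRWIVTYRESXBFOGKUHQBAISE"   # made-up intercept for illustration
crib = "WETTERBERICHT"                       # German for "weather report", a classic crib
print(possible_crib_positions(ciphertext, crib))

Narrowing the candidate alignments in this way was only the first step; the surviving positions then fed the electromechanical search for rotor settings.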
In 1948, Turing was appointed Reader at the Victoria University of Manchester, and the following year he began developing software for one of the earliest computers to run stored programs. Treating artificial intelligence as a simple, basic "mind" that could be educated, Turing devised the "Turing test," which is used to judge whether a computer can be called "intelligent." The CAPTCHA challenge, a reversed form of the Turing test, is used on the Internet today to determine whether a user is a human or a computer, and fifty years on, the test is still contributing to the concept of artificial intelligence (Pinar Saygin, Cicekli, & Akman, 2000). Matrix equations are still solved today using the LU decomposition that Turing analyzed, in which a matrix is factored into lower and upper triangular parts, allowing computers to solve systems of equations and compute determinants (Bunch & Hopcroft, 1974). Near the end of his life, Turing turned his mathematical genius toward computational biology (Kunz, 2016). In 1952 he published a paper on morphogenesis, the process by which an organism develops its shape, anticipating the theories of researchers such as Boris Belousov and Anatol Zhabotinsky concerning oscillating chemical reactions. His work led to the mathematical modeling of molecular data, a technique biological scientists use today to study the processes occurring in living organisms.
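A minimal Python sketch of the factorization, in Doolittle form and assuming no row exchanges are needed (the example matrix is arbitrary), shows how the determinant falls out of the diagonal of the upper triangular factor.

def lu_decompose(a):
    """Factor a square matrix a into unit lower triangular L and upper triangular U."""
    n = len(a)
    lower = [[0.0] * n for _ in range(n)]
    upper = [[0.0] * n for _ in range(n)]
    for i in range(n):
        # Row i of U.
        for j in range(i, n):
            upper[i][j] = a[i][j] - sum(lower[i][k] * upper[k][j] for k in range(i))
        # Column i of L, with ones on the diagonal.
        lower[i][i] = 1.0
        for j in range(i + 1, n):
            lower[j][i] = (a[j][i] - sum(lower[j][k] * upper[k][i] for k in range(i))) / upper[i][i]
    return lower, upper

a = [[4.0, 3.0], [6.0, 3.0]]
lower, upper = lu_decompose(a)
det = upper[0][0] * upper[1][1]
print(lower, upper, det)   # det(a) = 4*3 - 3*6 = -6

In practice, numerical libraries add row pivoting for stability, but the underlying factorization is the one analyzed in Turing's work on rounding errors in matrix processes.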
The impact of Alan Turing's work continues to this day. Time magazine named Turing one of the 100 most important people of the 20th century, stating, "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine" (Gray, 1999). Turing invented the concept of computing, and John von Neumann, the architect of the modern computer, was strongly influenced by his work. Today and into the future, computers remain at the heart of world communications, commerce, economics, transportation, and politics. Through computer applications and the Internet, a cultural revolution has enveloped millions of users, and access to information technology rivals the Industrial Revolution in the changes it has brought to world culture. It has altered methods of education, the preservation of ethnic heritage, the administration of health care, and discussions of ethics. The middle class has access to a lifestyle previously reserved for the wealthy as mental abilities replace physical ones. In the future, it may be possible to interface with computers by tracking eye and head movements, and work is already under way on full body suits that feed information audibly, visually, and tactilely. Without the genius of Turing, none of this might have happened at all. For these reasons, Alan Turing should receive the 20th Century Genius Award.
References
Aldrich, J. (2009). England and Continental Probability in the Inter-War Years. Journal Électronique d'Histoire des Probabilités et de la Statistique, 5(2). Retrieved from http://www.jehps.net/Decembre2009/Aldrich.pdf
Bunch, J., & Hopcroft, J. (1974). Triangular Factorization and Inversion by Fast Matrix Multiplication. Mathematics of Computation, 28(125), 231. http://dx.doi.org/10.2307/2005828
Clark, L., & Steadman, I. (2012). Turing's achievements: Codebreaking, AI and the birth of computer science. Wired UK. Retrieved 20 May 2016, from http://www.wired.co.uk/news/archive/2012-06/18/turing-contributions?page=all
Crow, D. (2012). Why we owe it all to Alan Turing. Gigaom. Retrieved 20 May 2016, from https://gigaom.com/2012/06/23/why-we-owe-it-all-to-alan-turing/
Hodges, A. (1983). Alan Turing: The Enigma. New York: Simon and Schuster.
Kunz, P. (2016). Alan Turing and Systems Biology. ERCIM News. Retrieved 20 May 2016, from http://ercim-news.ercim.eu/en91/special/alan-turing-and-systems-biology
Pinar Saygin, A., Cicekli, I., & Akman, V. (2000). Turing Test: 50 Years Later. Minds and Machines, 10(4), 463-518. http://dx.doi.org/10.1023/a:1011288000451
Turing, A. M. (2004). The Essential Turing (B. J. Copeland, Ed.). Oxford: Clarendon Press.