Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist, cryptographer, and inventor. He is known as the "father of information theory" and is credited with starting the Information Age.

Shannon was the first to show how Boolean algebra—the logic that underlies all digital electronic circuits—could be used to design those circuits. He also helped create the field of artificial intelligence. Rodney Brooks, a roboticist, said Shannon was the 20th-century engineer who contributed most to 21st-century technologies. Solomon W. Golomb, a mathematician, called Shannon’s intellectual achievements "one of the greatest of the twentieth century."

At the University of Michigan, Shannon earned two degrees: a Bachelor of Science in electrical engineering and a Bachelor of Science in mathematics, both in 1936. At age 21, while a master’s student in electrical engineering at MIT, he wrote his 1937 thesis, "A Symbolic Analysis of Relay and Switching Circuits," which showed how Boolean algebra could be used to build any logical relationship in digital circuits. This work laid the theoretical foundation of digital computing and has been called the "birth certificate of the digital revolution." He earned a PhD in mathematics from MIT in 1940. His thesis on genetics contained important findings, though it went unpublished at first.

During World War II, Shannon worked on code-breaking and secure communications for the United States. His 1948 paper, "A Mathematical Theory of Communication," founded the field of information theory. The work has been called a "blueprint for the digital era" and "the Magna Carta of the Information Age." Golomb compared Shannon’s impact on the digital age to the influence the inventor of the alphabet had on literature. Shannon’s theories underpin technologies such as the compact disc, the Internet, and mobile phones, and have even informed the study of black holes. He introduced the term "bit" and co-invented pulse-code modulation and the first wearable computer. He also created the signal-flow graph.

Shannon joined the Central Intelligence Agency’s Special Cryptologic Advisory Group in 1951. From 1956 to 1978, he taught at MIT. He also helped found the field of artificial intelligence by co-organizing the 1956 Dartmouth workshop, considered the beginning of the discipline. His Theseus machine, a maze-solving mechanical mouse, was one of the first electrical devices to learn through trial and error, an early example of artificial intelligence.

Biography

The Shannon family lived in Gaylord, Michigan, and Claude was born in a hospital in nearby Petoskey. His father, Claude Sr. (1862–1934), was a businessman and, for a time, a judge who handled legal matters related to estates in Gaylord. His mother, Mabel Wolf Shannon (1880–1945), was a teacher of languages and also served as the principal of Gaylord High School. Claude Sr. was a descendant of settlers from New Jersey, while Mabel was the child of people who had moved to the United States from Germany. The Shannon family was active in their Methodist Church during Claude’s childhood.

Most of the first 16 years of Claude’s life were spent in Gaylord, where he attended public school and graduated from Gaylord High School in 1932. He showed an interest in subjects related to machines and electricity. His favorite subjects were science and math. At home, he built models of planes, a radio-controlled boat, and a telegraph system that connected to a friend’s house half a mile away. While growing up, he also worked as a messenger for the Western Union company.

Claude’s childhood hero was Thomas Edison, whom he later learned was a distant cousin. Both Shannon and Edison were descendants of John Ogden (1609–1682), a leader in the American colonies and an ancestor of many important people.

In 1932, Shannon entered the University of Michigan, where he studied the work of George Boole. He graduated in 1936 with two bachelor’s degrees: one in electrical engineering and the other in mathematics.

In 1936, Shannon began graduate studies in electrical engineering at the Massachusetts Institute of Technology (MIT). There, he worked on Vannevar Bush’s differential analyzer, an early analog computer that used electrical and mechanical parts to solve differential equations. While studying the machine’s complicated control circuits, Shannon designed switching circuits based on Boole’s ideas. In 1937, he wrote his master’s thesis, A Symbolic Analysis of Relay and Switching Circuits; a paper drawn from it was published in 1938. The work founded switching circuit theory by showing that relay circuits could carry out tasks of logic and arithmetic. Shannon proved that his circuits could simplify the arrangements of electromechanical relays then used in telephone exchanges, and that they could solve any problem Boolean algebra could solve. The final chapter of the thesis included circuit diagrams, among them a digital 4-bit full adder. Unlike earlier engineers, who relied on ad hoc practical methods, Shannon took an abstract, mathematical approach, which made his ideas groundbreaking and shaped modern electrical engineering.
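The idea behind the thesis—that Boolean operations on switches can perform arithmetic—can be sketched in software. The following is a hedged illustration, not Shannon's notation: a 1-bit full adder written as Boolean relations, chained into a 4-bit ripple-carry adder of the kind diagrammed in his thesis.

```python
def full_adder(a, b, carry_in):
    """One-bit full adder expressed purely with Boolean operations."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_4bit(x, y):
    """Add two 4-bit numbers by rippling the carry through four full adders."""
    carry, result = 0, 0
    for i in range(4):
        bit_sum, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit_sum << i
    return result, carry  # 4-bit sum and final carry-out

print(add_4bit(0b0110, 0b0111))  # 6 + 7 = 13 -> (13, 0)
```

Each `full_adder` corresponds to a small relay circuit; wiring four of them in series is exactly the "build any logical relationship from switches" insight of the thesis.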

Using switches to perform logic is the basic idea behind all electronic digital computers. Shannon’s work became the foundation of digital circuit design. His ideas became widely known in the engineering community during and after World War II, and his rigorous mathematical approach replaced earlier, less systematic methods. In 1987, Howard Gardner called Shannon’s thesis "possibly the most important and also the most famous master’s thesis of the century." In 1972, Herman Goldstine described it as "surely … one of the most important master’s theses ever written … It helped to change digital circuit design from an art to a science." A reviewer of his work said, "To the best of my knowledge, this is the first application of the methods of symbolic logic to so practical an engineering problem. From the point of view of originality I rate the paper as outstanding." Shannon’s master’s thesis won the 1939 Alfred Noble Prize, an award of the American engineering societies (not the Nobel Prize).

Shannon received his PhD in mathematics from MIT in 1940. Vannevar Bush suggested that Shannon write his dissertation at the Cold Spring Harbor Laboratory, developing a mathematical formulation of Mendelian genetics. This research became Shannon’s PhD thesis, An Algebra for Theoretical Genetics. The thesis went unpublished after Shannon lost interest in the topic, though it contained important results: he was one of the first to apply an algebraic system to population genetics, and he derived a general formula, new at the time, for how traits are passed through generations in a population.

In 1940, Shannon became a National Research Fellow at the Institute for Advanced Study in Princeton, New Jersey. In Princeton, he discussed his ideas with scientists and mathematicians like Hermann Weyl and John von Neumann. He also met Albert Einstein and Kurt Gödel occasionally. Shannon worked across different fields, and this ability may have helped him later develop mathematical information theory.

Shannon worked at Bell Labs for a few months in the summer of 1937 and returned there during World War II to work on fire-control systems and cryptography under a contract with the National Defense Research Committee (NDRC).

Shannon is credited with inventing signal-flow graphs in 1942. He discovered the topological gain formula while studying how an analog computer worked.

In early 1943, Shannon met Alan Turing, a leading British mathematician. Turing had been sent to Washington to share methods used by the British to break German military codes. He was also interested in how to encode speech and spent time at Bell Labs. Shannon and Turing met during tea time in the cafeteria. Turing showed Shannon his 1936 paper that described what is now called the “universal Turing machine.” This impressed Shannon, as many of its ideas matched his own.

Shannon and his team developed anti-aircraft fire-control systems that tracked enemy missiles and planes and computed trajectories for intercepting them.

In 1945, as the war ended, the NDRC published a summary of its technical reports. The section on fire control included a special essay, Data Smoothing and Prediction in Fire-Control Systems, coauthored by Shannon, Ralph Beebe Blackman, and Hendrik Wade Bode, which treated the problem of smoothing fire-control data as one of separating a signal from noise in a communication system. This framing of fire control as a signal-processing problem foreshadowed the Information Age.

Shannon’s work on cryptography was closely connected to his later research on communication theory. At the end of the war, he wrote a classified report for Bell Telephone Labs titled A Mathematical Theory of Cryptography, dated September 1945. A version of this paper was published in 1949 as Communication Theory of Secrecy Systems in the Bell System Technical Journal. This paper included many ideas and math similar to those in his A Mathematical Theory of Communication. Shannon said that his wartime work on communication theory and cryptography developed at the same time and were closely linked.

The Mathematical Theory of Communication

Shannon's book, The Mathematical Theory of Communication, begins with an introduction by Warren Weaver. Although the technical work is Shannon's, Weaver explained the ideas in a way accessible to readers without advanced mathematics, and the combined presentation became known as the Shannon–Weaver model. The mathematical and theoretical content, however, comes entirely from Shannon's portion of the book, which follows Weaver's introduction. For non-experts, Weaver's essay is a helpful way into the material, but it was Shannon's reasoning, mathematics, and clear exposition that defined the problem itself.

Other work

In 1949, Shannon completed a paper that was published in March 1950. It estimated the game-tree complexity of chess—the number of possible games—at about 10^120. This figure is now called the "Shannon number" and is still considered a good estimate of the game's complexity. It is often cited to show why chess cannot be solved by checking every possible move.
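The estimate can be reproduced with a quick back-of-envelope calculation based on the assumptions Shannon stated in the paper: roughly 10^3 possibilities per pair of moves, over a typical game of about 40 moves.

```python
# Back-of-envelope reproduction of the Shannon number, using the
# assumptions from Shannon's 1950 chess paper: about 10^3 variations
# per move pair, over a typical 40-move game.
possibilities_per_move_pair = 10**3
typical_game_length_moves = 40
shannon_number = possibilities_per_move_pair ** typical_game_length_moves
print(f"10^{len(str(shannon_number)) - 1}")  # -> 10^120
```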

On March 9, 1949, Shannon presented a paper titled "Programming a Computer for Playing Chess" at the National Institute for Radio Engineers Convention in New York. In the paper, he explained how to program a computer to play chess by using scoring for positions and selecting moves. He suggested ways to reduce the number of possibilities a computer needs to consider during a game. The paper was published in Philosophical Magazine in March 1950 and is seen as one of the first articles about programming computers to play chess. In 1950, Shannon also wrote an article titled "A Chess-Playing Machine," which was published in Scientific American. Both papers greatly influenced future chess programs.

Shannon’s method for choosing a move used a process called minimax, together with an evaluation function to score a chess position. In his example, the value of the black pieces was subtracted from the value of the white pieces. Piece values followed the standard valuations: 1 point for a pawn, 3 points for a knight or bishop, 5 points for a rook, and 9 points for a queen. He also included positional factors, such as subtracting 0.5 point for each doubled, backward, or isolated pawn, and adding 0.1 point for each legal move available to a player.
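The scoring terms above can be sketched as a simple evaluation function. The piece values, the 0.5 pawn-weakness penalty, and the 0.1 mobility bonus come from Shannon's paper; the input format (plain piece counts rather than a board) is an illustrative assumption of this sketch, not Shannon's representation.

```python
# Piece values from Shannon's 1950 paper.
PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9}

def evaluate(material_white, material_black,
             weak_pawns_white=0, weak_pawns_black=0,
             mobility_white=0, mobility_black=0):
    """Score a position from White's point of view, in the spirit of
    Shannon's evaluation function: material difference, minus 0.5 per
    doubled/backward/isolated pawn, plus 0.1 per available move."""
    score = sum(PIECE_VALUES[p] * n for p, n in material_white.items())
    score -= sum(PIECE_VALUES[p] * n for p, n in material_black.items())
    score -= 0.5 * (weak_pawns_white - weak_pawns_black)
    score += 0.1 * (mobility_white - mobility_black)
    return score

# Example: White is up a knight but has one isolated pawn and less mobility.
print(evaluate({"P": 8, "N": 2, "B": 2, "R": 2, "Q": 1},
               {"P": 8, "N": 1, "B": 2, "R": 2, "Q": 1},
               weak_pawns_white=1, mobility_white=30, mobility_black=34))
```

In a minimax search, White picks the move leading to the highest score and Black the move leading to the lowest, each assuming the opponent replies optimally.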

Shannon adapted Kerckhoffs' principle, stating that "The enemy knows the system." This version is now known as "Shannon's maxim."

Shannon also contributed to combinatorics and detection theory. His 1948 paper introduced tools used in combinatorics. In 1944, he worked on detection theory, producing one of the earliest explanations of the "matched filter" principle.

Shannon was a successful investor who gave lectures on investing. A Barron’s report from August 11, 1986, showed that Shannon’s returns beat those of 1,025 of the 1,026 mutual funds it tracked. Comparing his investment performance from the late 1950s to 1986 with Warren Buffett’s from 1965 to 1995, Shannon earned about 28%, while Buffett earned about 27%. One of Shannon’s methods, called "Shannon's demon," involved splitting a portfolio equally between cash and a stock, then regularly rebalancing it to profit from random swings in the stock’s price. Shannon planned to write about investing but never published his ideas, despite giving many lectures. He was among the first investors to download stock prices. A snapshot of his portfolio in 1981 showed a value of $582,717.50, worth about $1.5 million in 2015, not counting another stock he owned.

Commemorations

The Shannon centenary in 2016 honored the life and contributions of Claude Elwood Shannon on the 100th anniversary of his birth on April 30, 1916. This event was partly inspired by the Alan Turing Year. A group of people from the IEEE Information Theory Society, including Christina Fragouli, Rüdiger Urbanke, Michelle Effros, Lav Varshney, and Sergio Verdú, organized global events. The plan was shared during the History Panel at the 2015 IEEE Information Theory Workshop in Jerusalem and in the IEEE Information Theory Society newsletter.

A full list of confirmed events was posted on the IEEE Information Theory Society website.

Some of the activities included:

  • Bell Labs held the First Shannon Conference on the Future of the Information Age on April 28–29, 2016, in Murray Hill, New Jersey, to celebrate Claude Shannon and his lasting influence. The event featured speeches by important people in the information age, informal stories about Shannon, and technical talks on topics like bioinformatics, economic systems, and social networks. It also included a student competition.
  • Bell Labs created a website on April 30, 2016, that told the story of Shannon’s work at Bell Labs (under an NDRC contract with the U.S. Government) from 1942 to 1957. The site included details about the Mathematics Department, bios of Shannon’s colleagues, and original technical papers he wrote.
  • The Republic of Macedonia released a special stamp to honor Shannon. A similar stamp by the U.S. Postal Service is being planned, with a public petition supporting it.
  • A documentary titled The Bit Player, about Shannon and information theory, was made by Sergio Verdú and Mark Levinson.
  • A joint celebration of George Boole’s 200th birthday and Shannon’s 100th birthday was organized by University College Cork and the Massachusetts Institute of Technology. The first event was a workshop in Cork called When Boole Meets Shannon, followed by exhibits in Boston and at MIT.
  • Many organizations worldwide held events, including the Boston Museum of Science, Heinz-Nixdorf Museum, Institute for Advanced Study, Technische Universität Berlin, University of South Australia, Unicamp, University of Toronto, Chinese University of Hong Kong, Cairo University, Telecom ParisTech, National Technical University of Athens, Indian Institute of Science, Indian Institute of Technology Bombay, Indian Institute of Technology Kanpur, Nanyang Technological University of Singapore, University of Maryland, University of Illinois at Chicago, École Polytechnique Fédérale de Lausanne, Pennsylvania State University, University of California Los Angeles, Massachusetts Institute of Technology, Chongqing University of Posts and Telecommunications, and University of Illinois at Urbana-Champaign.
  • A Shannon centenary logo was created through a crowdsourced design contest on Crowdspring.
  • A presentation titled Saving Face: Information Tricks for Love and Life was held on May 4, 2016, at the National Museum of Mathematics in New York. It discussed Shannon’s work in information theory. A video and other materials from the event are available.

Selected works

  • Claude E. Shannon: A Symbolic Analysis of Relay and Switching Circuits, master's thesis, Massachusetts Institute of Technology, 1937.
  • Claude E. Shannon: "A Mathematical Theory of Communication," Bell System Technical Journal, Volume 27, pages 379–423 and 623–656, 1948 (abstract).
  • Claude E. Shannon and Warren Weaver: The Mathematical Theory of Communication. University of Illinois Press, Urbana, Illinois, 1949. ISBN 0-252-72548-4.
  • Neil Sloane, editor (1993): Claude Shannon: Collected Works, IEEE Press.
