Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory". Shannon is noted for having founded information theory with a landmark paper, A Mathematical Theory of Communication, that he published in 1948.
He is also well known for founding digital circuit design theory in 1937, when—as a 21-year-old master's degree student at the Massachusetts Institute of Technology (MIT)—he wrote his thesis demonstrating that electrical applications of Boolean algebra could construct any logical numerical relationship. Shannon contributed to the field of cryptanalysis for national defense during World War II, including his fundamental work on codebreaking and secure telecommunications.
|Born||April 30, 1916|
Petoskey, Michigan, United States
|Died||February 24, 2001 (aged 84)|
Medford, Massachusetts, United States
|Alma mater||University of Michigan, Massachusetts Institute of Technology|
|Awards||Stuart Ballantine Medal (1955)|
IEEE Medal of Honor (1966)
National Medal of Science (1966)
Harvey Prize (1972)
Claude E. Shannon Award (1972)
Harold Pender Award (1978)
John Fritz Medal (1983)
Kyoto Prize (1985)
National Inventors Hall of Fame (2004)
|Fields||Mathematics and electronic engineering|
|Institutions||Institute for Advanced Study|
|Doctoral advisor||Frank Lauren Hitchcock|
|Doctoral students||Danny Hillis|
Shannon was born in Petoskey, Michigan and grew up in Gaylord, Michigan. His father, Claude, Sr. (1862–1934), a descendant of early settlers of New Jersey, was a self-made businessman, and for a while, a Judge of Probate. Shannon's mother, Mabel Wolf Shannon (1890–1945), was a language teacher, and also served as the principal of Gaylord High School.
Most of the first 16 years of Shannon's life were spent in Gaylord, where he attended public school, graduating from Gaylord High School in 1932. Shannon showed an inclination towards mechanical and electrical things. His best subjects were science and mathematics. At home he constructed such devices as models of planes, a radio-controlled model boat and a barbed-wire telegraph system to a friend's house a half-mile away. While growing up, he also worked as a messenger for the Western Union company.
His childhood hero was Thomas Edison, who, he later learned, was a distant cousin. Both Shannon and Edison were descendants of John Ogden (1609–1682), a colonial leader and an ancestor of many distinguished people.
In 1932, Shannon entered the University of Michigan, where he was introduced to the work of George Boole. He graduated in 1936 with two bachelor's degrees: one in electrical engineering and the other in mathematics.
In 1936, Shannon began his graduate studies in electrical engineering at MIT, where he worked on Vannevar Bush's differential analyzer, an early analog computer. While studying the complicated ad hoc circuits of this analyzer, Shannon designed switching circuits based on Boole's concepts. In 1937, he wrote his master's degree thesis, A Symbolic Analysis of Relay and Switching Circuits. A paper from this thesis was published in 1938. In this work, Shannon proved that his switching circuits could be used to simplify the arrangement of the electromechanical relays that were used then in telephone call routing switches. Next, he expanded this concept, proving that these circuits could solve all problems that Boolean algebra could solve. In the last chapter, he presented diagrams of several circuits, including a 4-bit full adder.
Using this property of electrical switches to implement logic is the fundamental concept that underlies all electronic digital computers. Shannon's work became the foundation of digital circuit design, as it became widely known in the electrical engineering community during and after World War II. The theoretical rigor of Shannon's work superseded the ad hoc methods that had prevailed previously. Howard Gardner called Shannon's thesis "possibly the most important, and also the most noted, master's thesis of the century."
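As a sketch of that concept in modern terms (Python standing in for relay logic; the bit-list convention is an assumption of this example, not Shannon's notation), a 1-bit full adder built only from AND, OR, and XOR can be chained into the kind of 4-bit adder diagrammed in the thesis's final chapter:

```python
# A 1-bit full adder expressed purely with Boolean operations,
# mirroring how relay circuits realize Boolean algebra.
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
    return s, carry_out

def add_4bit(x: list[int], y: list[int]) -> list[int]:
    """Add two 4-bit numbers given as bit lists, least significant bit first."""
    carry, out = 0, []
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 0b0101 (5) + 0b0110 (6) = 0b1011 (11)
print(add_4bit([1, 0, 1, 0], [0, 1, 1, 0]))  # [1, 1, 0, 1, 0] -> 11, LSB first
```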
Shannon received his Ph.D. degree from MIT in 1940. Vannevar Bush had suggested that Shannon should work on his dissertation at the Cold Spring Harbor Laboratory, in order to develop a mathematical formulation for Mendelian genetics. This research resulted in Shannon's PhD thesis, called An Algebra for Theoretical Genetics.
In 1940, Shannon became a National Research Fellow at the Institute for Advanced Study in Princeton, New Jersey. In Princeton, Shannon had the opportunity to discuss his ideas with influential scientists and mathematicians such as Hermann Weyl and John von Neumann, and he also had occasional encounters with Albert Einstein and Kurt Gödel. Shannon worked freely across disciplines, and this ability may have contributed to his later development of mathematical information theory.
Shannon then joined Bell Labs to work on fire-control systems and cryptography during World War II, under a contract with section D-2 (Control Systems section) of the National Defense Research Committee (NDRC).
For two months early in 1943, Shannon came into contact with the leading British mathematician Alan Turing. Turing had been posted to Washington to share with the U.S. Navy's cryptanalytic service the methods used by the British Government Code and Cypher School at Bletchley Park to break the ciphers used by the Kriegsmarine U-boats in the North Atlantic Ocean. He was also interested in the encipherment of speech and to this end spent time at Bell Labs. Shannon and Turing met at teatime in the cafeteria. Turing showed Shannon his 1936 paper that defined what is now known as the "universal Turing machine"; this impressed Shannon, as many of its ideas complemented his own.
In 1945, as the war was coming to an end, the NDRC was issuing a summary of technical reports as a last step prior to its eventual closing down. Inside the volume on fire control, a special essay titled Data Smoothing and Prediction in Fire-Control Systems, coauthored by Shannon, Ralph Beebe Blackman, and Hendrik Wade Bode, formally treated the problem of smoothing the data in fire-control by analogy with "the problem of separating a signal from interfering noise in communications systems." In other words, it modeled the problem in terms of data and signal processing and thus heralded the coming of the Information Age.
Shannon's work on cryptography was even more closely related to his later publications on communication theory. At the close of the war, he prepared a classified memorandum for Bell Telephone Labs entitled "A Mathematical Theory of Cryptography," dated September 1945. A declassified version of this paper was published in 1949 as "Communication Theory of Secrecy Systems" in the Bell System Technical Journal. This paper incorporated many of the concepts and mathematical formulations that also appeared in his A Mathematical Theory of Communication. Shannon said that his wartime insights into communication theory and cryptography developed simultaneously and that "they were so close together you couldn’t separate them". In a footnote near the beginning of the classified report, Shannon announced his intention to "develop these results … in a forthcoming memorandum on the transmission of information."
While he was at Bell Labs, Shannon proved that the cryptographic one-time pad is unbreakable in his classified research that was later published in October 1949. He also proved that any unbreakable system must have essentially the same characteristics as the one-time pad: the key must be truly random, as large as the plaintext, never reused in whole or part, and be kept secret.
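A minimal Python sketch of the scheme (illustrative only; the variable names and example message are mine): encryption and decryption are the same XOR operation, and the security argument rests entirely on the key being truly random, as long as the message, and used once.

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    # The key must be truly random, as long as the data, and never reused.
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # fresh random key, used exactly once
ciphertext = otp(message, key)
recovered = otp(ciphertext, key)          # XOR is its own inverse
assert recovered == message
```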
In 1948, the promised memorandum appeared as "A Mathematical Theory of Communication," an article in two parts in the July and October issues of the Bell System Technical Journal. This work focuses on the problem of how best to encode the information a sender wants to transmit. In this fundamental work, he used tools in probability theory, developed by Norbert Wiener, which were in their nascent stages of being applied to communication theory at that time. Shannon developed information entropy as a measure of the uncertainty in a message while essentially inventing the field of information theory. In 1949 Claude Shannon and Robert Fano devised a systematic way to assign code words based on probabilities of blocks. This technique, known as Shannon-Fano coding, was first proposed in the 1948 article.
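As a small illustration (a sketch, not code from the paper), the entropy $H = -\sum_i p_i \log_2 p_i$ of a discrete distribution can be computed directly:

```python
import math

def entropy(probs: list[float]) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))   # ~0.469 bits: a biased coin carries less information
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely outcomes
```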
The book The Mathematical Theory of Communication, co-authored with Warren Weaver, reprints Shannon's 1948 article together with Weaver's popularization of it, which is accessible to the non-specialist. Weaver pointed out that the word "information" in communication theory is not related to what you do say, but to what you could say. That is, information is a measure of one's freedom of choice when one selects a message. Shannon's concepts were also popularized, subject to his own proofreading, in John Robinson Pierce's Symbols, Signals, and Noise.
Information theory's fundamental contribution to natural language processing and computational linguistics was further established in 1951, in his article "Prediction and Entropy of Printed English", showing upper and lower bounds of entropy on the statistics of English – giving a statistical foundation to language analysis. In addition, he proved that treating whitespace as the 27th letter of the alphabet actually lowers uncertainty in written language, providing a clear quantifiable link between cultural practice and probabilistic cognition.
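For scale (a standard zeroth-order reference point, not a figure taken from the paper): if all 27 symbols were equally likely and independent, the per-symbol entropy would be

$$H_0 = \log_2 27 \approx 4.75 \text{ bits per symbol},$$

and Shannon's bounds show that the statistical structure of actual English pulls the true rate far below this ceiling.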
Another notable paper published in 1949 is "Communication Theory of Secrecy Systems", a declassified version of his wartime work on the mathematical theory of cryptography, in which he proved that all theoretically unbreakable ciphers must have the same requirements as the one-time pad. He is also credited with the introduction of sampling theory, which is concerned with representing a continuous-time signal from a (uniform) discrete set of samples. This theory was essential in enabling telecommunications to move from analog to digital transmission systems in the 1960s and later.
In 1956 Shannon returned to MIT to hold an endowed chair, working in the Research Laboratory of Electronics (RLE). He continued to serve on the MIT faculty until 1978.
Shannon developed Alzheimer's disease and spent the last few years of his life in a nursing home in Massachusetts oblivious to the marvels of the digital revolution he had helped create. He died in 2001. He was survived by his wife, Mary Elizabeth Moore Shannon, his son, Andrew Moore Shannon, his daughter, Margarita Shannon, his sister, Catherine Shannon Kay, and his two granddaughters. His wife stated in his obituary that, had it not been for Alzheimer's disease, "He would have been bemused" by it all.
Outside of Shannon's academic pursuits, he was interested in juggling, unicycling, and chess. He also invented many devices, including a Roman numeral computer called THROBAC, juggling machines, and a flame-throwing trumpet. One of his more humorous devices was a box kept on his desk called the "Ultimate Machine", based on an idea by Marvin Minsky. Otherwise featureless, the box possessed a single switch on its side. When the switch was flipped, the lid of the box opened and a mechanical hand reached out, flipped off the switch, then retracted back inside the box. In addition, he built a device that could solve the Rubik's Cube puzzle.
Shannon met his second wife Betty Shannon (née Mary Elizabeth Moore) when she was a numerical analyst at Bell Labs. They were married in 1949. Betty assisted Claude in building some of his most famous inventions.
Claude and Betty Shannon had three children, Robert James Shannon, Andrew Moore Shannon, and Margarita Shannon, and raised their family in Winchester, Massachusetts. Their oldest son, Robert Shannon, died in 1998 at the age of 45.
To commemorate Shannon's achievements, there were celebrations of his work in 2001.
There are currently six statues of Shannon sculpted by Eugene Daub: one at the University of Michigan; one at MIT in the Laboratory for Information and Decision Systems; one in Gaylord, Michigan; one at the University of California, San Diego; one at Bell Labs; and another at AT&T Shannon Labs. After the breakup of the Bell System, the part of Bell Labs that remained with AT&T Corporation was named Shannon Labs in his honor.
According to Neil Sloane, an AT&T Fellow who co-edited Shannon's large collection of papers in 1993, the perspective introduced by Shannon's communication theory (now called information theory) is the foundation of the digital revolution, and every device containing a microprocessor or microcontroller is a conceptual descendant of Shannon's publication in 1948: "He's one of the great men of the century. Without him, none of the things we know today would exist. The whole digital revolution started with him." The unit shannon is named after Claude Shannon.
"Theseus", created in 1950, was a magnetic mouse controlled by an electromechanical relay circuit that enabled it to move around a labyrinth of 25 squares. Its dimensions were the same as those of an average mouse. The maze configuration was flexible and it could be modified arbitrarily by rearranging movable partitions. The mouse was designed to search through the corridors until it found the target. Having travelled through the maze, the mouse could then be placed anywhere it had been before, and because of its prior experience it could go directly to the target. If placed in unfamiliar territory, it was programmed to search until it reached a known location and then it would proceed to the target, adding the new knowledge to its memory and learning new behavior. Shannon's mouse appears to have been the first artificial learning device of its kind.
In 1949, Shannon completed a paper (published in March 1950) that estimates the game-tree complexity of chess to be approximately 10^120. This number is now often referred to as the "Shannon number", and is still regarded today as an accurate estimate of the game's complexity. The number is often cited as one of the barriers to solving the game of chess using an exhaustive analysis (i.e., brute-force analysis).
On March 9, 1949, Shannon presented a paper called "Programming a Computer for Playing Chess" at the national convention of the Institute of Radio Engineers (IRE) in New York. He described how to program a computer to play chess based on position scoring and move selection, and he proposed basic strategies for restricting the number of possibilities to be considered in a game of chess. The paper was published in Philosophical Magazine in March 1950, and is considered one of the first articles on programming a computer to play chess and on using a computer toward solving the game. His process for having the computer decide which move to make was a minimax procedure based on an evaluation function of a given chess position. Shannon gave a rough example of an evaluation function in which the value of the black position was subtracted from that of the white position. Material was counted according to the usual chess piece relative values (1 point for a pawn, 3 points for a knight or bishop, 5 points for a rook, and 9 points for a queen). He considered some positional factors, subtracting ½ point for each doubled pawn, backward pawn, and isolated pawn. Another positional factor was mobility, adding 0.1 point for each legal move available. Finally, he treated checkmate as the capture of the king, giving the king the artificial value of 200 points.
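Assembled from the values just given (a reconstruction, with assumed symbols: K, Q, R, B, N, P count White's kings, queens, rooks, bishops, knights, and pawns; primed letters count Black's; D, S, I count doubled, backward, and isolated pawns; M counts available legal moves), the evaluation function takes the form

$$
\begin{aligned}
f(P) = {} & 200(K - K') + 9(Q - Q') + 5(R - R') + 3(B - B' + N - N') + (P - P') \\
& - \tfrac{1}{2}(D - D' + S - S' + I - I') + 0.1(M - M'),
\end{aligned}
$$

which White seeks to maximize and Black to minimize.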
The evaluation function was clearly for illustrative purposes, as Shannon stated. For example, according to the function, pawns that are doubled as well as isolated would have no value at all, which is clearly unrealistic.
Shannon formulated a version of Kerckhoffs' principle as "The enemy knows the system". In this form it is known as "Shannon's maxim".
The Shannon Centenary, 2016, marked the life and influence of Claude Elwood Shannon on the hundredth anniversary of his birth on April 30, 1916. It was inspired in part by the Alan Turing Year. An ad hoc committee of the IEEE Information Theory Society, including Christina Fragouli, Rüdiger Urbanke, Michelle Effros, Lav Varshney, and Sergio Verdú, coordinated worldwide events. The initiative was announced in the History Panel at the 2015 IEEE Information Theory Workshop in Jerusalem and in the IEEE Information Theory Society Newsletter.
A detailed listing of confirmed events was available on the website of the IEEE Information Theory Society.
Shannon described himself as an atheist and was outwardly apolitical.
"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948. It was renamed The Mathematical Theory of Communication in the book of the same name, a small but significant title change after realizing the generality of this work.A Symbolic Analysis of Relay and Switching Circuits
A Symbolic Analysis of Relay and Switching Circuits is the title of a master's thesis written by computer science pioneer Claude E. Shannon while attending the Massachusetts Institute of Technology (MIT) in 1937. In his thesis, Shannon, a dual degree graduate of the University of Michigan, proved that Boolean algebra could be used to simplify the arrangement of the relays that were the building blocks of the electromechanical automatic telephone exchanges of the day. Shannon went on to prove that it should also be possible to use arrangements of relays to solve Boolean algebra problems.
The utilization of the binary properties of electrical switches to perform logic functions is the basic concept that underlies all electronic digital computer designs. Shannon's thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II. At the time, the methods employed to design logic circuits were ad hoc in nature and lacked the theoretical discipline that Shannon's paper supplied to later projects.
Psychologist Howard Gardner described Shannon's thesis as "possibly the most important, and also the most famous, master's thesis of the century". A version of the paper was published in the 1938 issue of the Transactions of the American Institute of Electrical Engineers, and in 1940 it earned Shannon the Alfred Noble Prize.

Biographical Memoirs of Fellows of the Royal Society
The Biographical Memoirs of Fellows of the Royal Society is an academic journal on the history of science published annually by the Royal Society. It publishes obituaries of Fellows of the Royal Society. It was established in 1932 as Obituary Notices of Fellows of the Royal Society and obtained its current title in 1955, with volume numbering restarting at 1. Prior to 1932, obituaries were published in the Proceedings of the Royal Society.
The memoirs are a significant historical record and most include a full bibliography of works by the subjects. The memoirs are often written by a scientist of the next generation, often one of the subject's own former students, or a close colleague. In many cases the author is also a Fellow. Notable biographies published in this journal include Albert Einstein, Alan Turing, Bertrand Russell, Claude Shannon, Clement Attlee, Ernst Mayr, and Erwin Schrödinger. Each year around 20 to 25 memoirs of deceased Fellows of the Royal Society are collated by the Editor-in-Chief, currently Malcolm Longair, who succeeded Trevor Stuart in 2016. All content more than one year old is freely available to read.

Bit
The bit is a basic unit of information in information theory, computing, and digital communications. The name is a portmanteau of binary digit. In information theory, one bit is typically defined as the information entropy of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known. As a unit of information, the bit has also been called a shannon, named after Claude Shannon.
As a binary digit, the bit represents a logical value, having only one of two values. It may be physically implemented with a two-state device. These state values are most commonly represented as either 0 or 1, but other representations such as true/false, yes/no, +/−, or on/off are possible. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
The symbol for the binary digit is either simply bit, per recommendation by the IEC 80000-13:2008 standard, or the lowercase character b, as recommended by the IEEE 1541-2002 and IEEE Std 260.1-2004 standards. A group of eight binary digits is commonly called one byte, but historically the size of the byte is not strictly defined.

Communication Theory of Secrecy Systems
"Communication Theory of Secrecy Systems" is a paper published in 1949 by Claude Shannon discussing cryptography from the viewpoint of information theory. It is one of the foundational treatments (arguably the foundational treatment) of modern cryptography. It is also a proof that all theoretically unbreakable ciphers must have the same requirements as the one-time pad.
Shannon published an earlier version of this research in the classified report A Mathematical Theory of Cryptography, Memorandum MM 45-110-02, Sept. 1, 1945, Bell Laboratories. This classified report also precedes the publication of his "A Mathematical Theory of Communication", which appeared in 1948.

Confusion and diffusion
In cryptography, confusion and diffusion are two properties of the operation of a secure cipher identified by Claude Shannon in his 1945 classified report A Mathematical Theory of Cryptography. These properties, when present, work to thwart the application of statistics and other methods of cryptanalysis.
These concepts are also important in the design of robust hash functions and pseudorandom number generators, where decorrelation of the generated values is of paramount importance.

David Slepian
David S. Slepian (June 30, 1923 – November 29, 2007) was an American mathematician. He is best known for his work with algebraic coding theory, probability theory, and distributed source coding. He was colleagues with Claude Shannon and Richard Hamming at Bell Labs.

Edward O. Thorp
Edward Oakley Thorp (born August 14, 1932) is an American mathematics professor, author, hedge fund manager, and blackjack player. He pioneered the modern applications of probability theory, including the harnessing of very small correlations for reliable financial gain.

Thorp is the author of Beat the Dealer, which mathematically proved that the house advantage in blackjack could be overcome by card counting. He also developed and applied effective hedge fund techniques in the financial markets, and collaborated with Claude Shannon in creating the first wearable computer.

Thorp received his Ph.D. in mathematics from the University of California, Los Angeles in 1958, and worked at the Massachusetts Institute of Technology (MIT) from 1959 to 1961. He was a professor of mathematics from 1961 to 1965 at New Mexico State University, and then joined the University of California, Irvine, where he was a professor of mathematics from 1965 to 1977 and a professor of mathematics and finance from 1977 to 1982.

Gibbs algorithm
In statistical mechanics, the Gibbs algorithm, introduced by J. Willard Gibbs in 1902, is a criterion for choosing a probability distribution for the statistical ensemble of microstates of a thermodynamic system by minimizing the average log probability

$$\langle \ln p_i \rangle = \sum_i p_i \ln p_i$$

subject to the probability distribution $p_i$ satisfying a set of constraints (usually expectation values) corresponding to the known macroscopic quantities. In 1948, Claude Shannon interpreted the negative of this quantity, which he called information entropy, as a measure of the uncertainty in a probability distribution. In 1957, E. T. Jaynes realized that this quantity could be interpreted as missing information about anything, and generalized the Gibbs algorithm to non-equilibrium systems with the principle of maximum entropy and maximum entropy thermodynamics.
Physicists call the result of applying the Gibbs algorithm the Gibbs distribution for the given constraints, most notably Gibbs's grand canonical ensemble for open systems when the average energy and the average number of particles are given. (See also partition function).
This general result of the Gibbs algorithm is then a maximum entropy probability distribution. Statisticians identify such distributions as belonging to exponential families.

Innovation (signal processing)
In time series analysis (or forecasting), as conducted in statistics, signal processing, and many other fields, the innovation is the difference between the observed value of a variable at time t and the optimal forecast of that value based on information available prior to time t. If the forecasting method is working correctly, successive innovations are uncorrelated with each other, i.e., they constitute a white noise time series. Thus it can be said that the innovation time series is obtained from the measurement time series by a process of 'whitening', or removing the predictable component. The use of the term innovation in the sense described here is due to Hendrik Bode and Claude Shannon (1950) in their discussion of the Wiener filter problem, although the notion was already implicit in the work of Kolmogorov.
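In symbols (notation assumed here): writing $\hat{x}_{t\mid t-1}$ for the optimal forecast of $x_t$ given information available before time $t$, the innovation is

$$e_t = x_t - \hat{x}_{t\mid t-1},$$

and checking that the sequence $(e_t)$ is white noise is the test that the predictor has captured all of the predictable structure.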
List of game theorists

This is a list of notable economists, mathematicians, political scientists, and computer scientists whose work has added substantially to the field of game theory. For a list of people in the field of video games rather than game theory, please see the list of ludologists.
Derek Abbott - quantum game theory and Parrondo's games
Susanne Albers - algorithmic game theory and algorithm analysis
Kenneth Arrow - voting theory (Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 1972)
Robert Aumann - equilibrium theory (Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 2005)
Robert Axelrod - repeated Prisoner's Dilemma
Tamer Başar - dynamic game theory and application robust control of systems with uncertainty
Cristina Bicchieri - epistemology of game theory
Olga Bondareva - Bondareva–Shapley theorem
Steven Brams - cake cutting, fair division, theory of moves
Jennifer Tour Chayes - algorithmic game theory and auction algorithms
John Horton Conway - combinatorial game theory
William Hamilton - evolutionary biology
John Harsanyi - equilibrium theory (Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 1994)
Monika Henzinger - algorithmic game theory and information retrieval
Naira Hovakimyan - differential games and adaptive control
Peter L. Hurd - evolution of aggressive behavior
Rufus Isaacs - differential games
Anna Karlin - algorithmic game theory and online algorithms
Michael Kearns - algorithmic game theory and computational social science
Sarit Kraus - non-monotonic reasoning
John Maynard Smith - evolutionary biology
Oskar Morgenstern - social organization
John Forbes Nash - Nash equilibrium (Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 1994)
John von Neumann - Minimax theorem, expected utility, social organization, arms race
J. M. R. Parrondo - games with a reversal of fortune, such as Parrondo's games
Charles E. M. Pearce - games applied to queuing theory
George R. Price - theoretical and evolutionary biology
Anatol Rapoport - Mathematical psychologist, early proponent of tit-for-tat in repeated Prisoner's Dilemma
Julia Robinson - proved that fictitious play dynamics converges to the mixed strategy Nash equilibrium in two-player zero-sum games
Alvin E. Roth - market design (Nobel Memorial Prize in Economic Sciences 2012)
Ariel Rubinstein - bargaining theory, learning and language
Thomas Jerome Schaefer - computational complexity of perfect-information games
Suzanne Scotchmer - patent law incentive models
Reinhard Selten - bounded rationality (Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 1994)
Claude Shannon - studied cryptography and chess; sometimes called "the father of information theory"
Lloyd Shapley - Shapley value and core concept in coalition games (Nobel Memorial Prize in Economic Sciences 2012)
Thomas Schelling - bargaining (Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 2005) and models of segregation
Myrna Wooders - coalition theory

Nyquist–Shannon sampling theorem
In the field of digital signal processing, the sampling theorem is a fundamental bridge between continuous-time signals (often called "analog signals") and discrete-time signals (often called "digital signals"). It establishes a sufficient condition for a sample rate that permits a discrete sequence of samples to capture all the information from a continuous-time signal of finite bandwidth.
Strictly speaking, the theorem only applies to a class of mathematical functions having a Fourier transform that is zero outside of a finite region of frequencies. Intuitively we expect that when one reduces a continuous function to a discrete sequence and interpolates back to a continuous function, the fidelity of the result depends on the density (or sample rate) of the original samples. The sampling theorem introduces the concept of a sample rate that is sufficient for perfect fidelity for the class of functions that are bandlimited to a given bandwidth, such that no actual information is lost in the sampling process. It expresses the sufficient sample rate in terms of the bandwidth for the class of functions. The theorem also leads to a formula for perfectly reconstructing the original continuous-time function from the samples.
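Stated symbolically (the standard formulation; here $B$ is the bandwidth, $f_s$ the sample rate, and $T = 1/f_s$ the sampling interval): if $x(t)$ contains no frequency components above $B$ hertz, then a sample rate $f_s > 2B$ loses nothing, and the signal is recovered exactly by the Whittaker–Shannon interpolation formula

$$x(t) = \sum_{n=-\infty}^{\infty} x(nT)\,\operatorname{sinc}\!\left(\frac{t-nT}{T}\right), \qquad \operatorname{sinc}(u) = \frac{\sin(\pi u)}{\pi u}.$$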
Perfect reconstruction may still be possible when the sample-rate criterion is not satisfied, provided other constraints on the signal are known. (See § Sampling of non-baseband signals below and compressed sensing.) In some cases (when the sample-rate criterion is not satisfied), utilizing additional constraints allows for approximate reconstructions. The fidelity of these reconstructions can be verified and quantified utilizing Bochner's theorem. The name Nyquist–Shannon sampling theorem honors Harry Nyquist and Claude Shannon. The theorem was also discovered independently by E. T. Whittaker, by Vladimir Kotelnikov, and by others. It is thus also known by the names Nyquist–Shannon–Kotelnikov, Whittaker–Shannon–Kotelnikov, Whittaker–Nyquist–Kotelnikov–Shannon, and cardinal theorem of interpolation.

Product cipher
In cryptography, a product cipher combines two or more transformations so that the resulting cipher is more secure than any of its component transformations alone, making it resistant to cryptanalysis. The product cipher combines a sequence of simple transformations such as substitution (S-box), permutation (P-box), and modular arithmetic. The concept of product ciphers is due to Claude Shannon, who presented the idea in his foundational paper, Communication Theory of Secrecy Systems.
For transformations involving a reasonable number of message symbols, both of the foregoing cipher systems (the S-box and P-box) are by themselves wanting. Shannon suggested using a combination of S-box and P-box transformations: a product cipher. The combination could yield a cipher system more powerful than either one alone. This approach of alternately applying substitution and permutation transformations was used by IBM in the Lucifer cipher system, and has become the standard for national data encryption standards such as the Data Encryption Standard and the Advanced Encryption Standard. A product cipher that uses only substitutions and permutations is called an SP-network. Feistel ciphers are an important class of product ciphers.
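A deliberately toy Python sketch of one SP-network round (the S-box, bit permutation, and round keys below are arbitrary illustrative choices, not those of Lucifer, DES, or AES): substitution supplies confusion, the bit permutation supplies diffusion, and the round key binds the result to the secret.

```python
# Toy SP-network on 16-bit blocks: substitute 4-bit nibbles (confusion),
# then permute bit positions (diffusion), then mix in round key material.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]   # arbitrary 4-bit S-box
PERM = [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15]  # bit permutation

def round_fn(block: int, round_key: int) -> int:
    # Substitution: push every 4-bit nibble through the S-box.
    subbed = 0
    for i in range(4):
        nibble = (block >> (4 * i)) & 0xF
        subbed |= SBOX[nibble] << (4 * i)
    # Permutation: move bit i to position PERM[i].
    permuted = 0
    for i in range(16):
        permuted |= ((subbed >> i) & 1) << PERM[i]
    # Key mixing.
    return permuted ^ round_key

state = 0x1234
for rk in (0xA5A5, 0x0F0F, 0x3C3C):   # made-up round keys
    state = round_fn(state, rk)
print(hex(state))
```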
Shannon (unit)

The shannon (symbol: Sh), more commonly known as the bit, is a unit of information and of entropy defined by IEC 80000-13. One shannon is the information content of an event occurring when its probability is 1⁄2. It is also the entropy of a system with two equally probable states. If a message is made of a sequence of a given number of bits, with all possible bit strings being equally likely, the message's information content expressed in shannons is equal to the number of bits in the sequence. For this and historical reasons, the shannon is more commonly known as the bit. The introduction of the term shannon provides an explicit distinction between the amount of information that is expressed and the quantity of data that may be used to represent the information. IEEE Std 260.1-2004 still defines the unit for this meaning as the bit, with no mention of the shannon.
The shannon can be converted to other information units according to 1 Sh = ln 2 nat ≈ 0.693 nat = log10 2 hartley ≈ 0.301 hartley.

The shannon is named after Claude Shannon, the founder of information theory.

Shannon coding
In the field of data compression, Shannon coding, named after its creator, Claude Shannon, is a lossless data compression technique for constructing a prefix code based on a set of symbols and their probabilities (estimated or measured). It is suboptimal in the sense that it does not achieve the lowest possible expected code word length, as Huffman coding does, and is never better than, though sometimes equal to, Shannon–Fano coding.
The method was the first of its type: the technique was used to prove Shannon's noiseless coding theorem in his 1948 article "A Mathematical Theory of Communication", and it is therefore a centerpiece of the information age.

This coding method gave rise to the field of information theory, and without its contribution the world would not have any of its many successors, for example Shannon–Fano coding, Huffman coding, and arithmetic coding. Much of our day-to-day life is significantly influenced by digital data, and this would not be possible without Shannon coding and the ongoing evolution of its successor coding methods.
In Shannon coding, the symbols are arranged in order from most probable to least probable, and symbol $i$ (with probability $p_i$) is assigned a codeword consisting of the first $l_i = \lceil \log_2(1/p_i) \rceil$ bits of the binary expansion of the cumulative probability $F_i = \sum_{j<i} p_j$. Here $\lceil \cdot \rceil$ denotes the ceiling function (which rounds up to the next integer value).
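A compact Python sketch of exactly this procedure (illustrative, with assumed variable names; the probabilities are taken as given):

```python
import math

def shannon_code(probs):
    """Assign Shannon codewords: symbol i gets the first ceil(log2(1/p_i))
    bits of the binary expansion of the cumulative probability F_i."""
    probs = sorted(probs, reverse=True)      # most probable first
    codes, cumulative = [], 0.0
    for p in probs:
        length = math.ceil(-math.log2(p))    # codeword length l_i
        bits, frac = "", cumulative          # first l_i bits of F_i in binary
        for _ in range(length):
            frac *= 2
            bit = int(frac)
            frac -= bit
            bits += str(bit)
        codes.append((p, bits))
        cumulative += p
    return codes

for p, code in shannon_code([0.4, 0.3, 0.2, 0.1]):
    print(f"p={p}: {code}")
# p=0.4 (l=2, F=0.0) -> 00
# p=0.3 (l=2, F=0.4) -> 01
# p=0.2 (l=3, F=0.7) -> 101
# p=0.1 (l=4, F=0.9) -> 1110
```

Note that the resulting codewords (00, 01, 101, 1110) form a prefix code, as the definition guarantees.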
Shannon number

The Shannon number, named after Claude Shannon, is a conservative lower bound (not an estimate) of the game-tree complexity of chess of 10^120, based on an average of about 10^3 possibilities for a pair of moves consisting of a move for White followed by one for Black, and a typical game lasting about 40 such pairs of moves.
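The bound follows from one line of arithmetic: about $10^3$ options per White-Black move pair, compounded over roughly 40 such pairs,

$$\left(10^{3}\right)^{40} = 10^{3 \times 40} = 10^{120}.$$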
Shannon switching game

The Shannon switching game is an abstract strategy game for two players, invented by American mathematician and electrical engineer Claude Shannon, the "father of information theory", some time before 1951. Two players take turns coloring the edges of an arbitrary graph. One player has the goal of connecting two distinguished vertices by a path of edges of their color. The other player aims to prevent this by using their color instead (or, equivalently, by erasing edges). The game is commonly played on a rectangular grid; this special case of the game was independently invented by American mathematician David Gale in the late 1950s and is known as Gale or Bridg-It.

Shannon–Fano coding
In the field of data compression, Shannon–Fano coding, named after Claude Shannon and Robert Fano, is a technique for constructing a prefix code based on a set of symbols and their probabilities (estimated or measured). It is suboptimal in the sense that it does not achieve the lowest possible expected code word length, as Huffman coding does. The technique was proposed in Shannon's "A Mathematical Theory of Communication", his 1948 article introducing the field of information theory. The method was attributed to Fano, who later published it as a technical report.
Shannon–Fano coding should not be confused with Shannon coding, the coding method used to prove Shannon's noiseless coding theorem, or with Shannon–Fano–Elias coding (also known as Elias coding), the precursor to arithmetic coding.

Shannon–Hartley theorem
In information theory, the Shannon–Hartley theorem tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity for such a communication link, a bound on the maximum amount of error-free information per time unit that can be transmitted with a specified bandwidth in the presence of the noise interference, assuming that the signal power is bounded, and that the Gaussian noise process is characterized by a known power or power spectral density. The law is named after Claude Shannon and Ralph Hartley.
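Expressed as a formula (the standard statement, with $C$ the channel capacity in bits per second, $B$ the bandwidth in hertz, and $S/N$ the ratio of average signal power to average noise power over the band):

$$C = B \log_2\!\left(1 + \frac{S}{N}\right).$$

For example, a 3 kHz channel with a signal-to-noise ratio of 30 dB ($S/N = 1000$) has a capacity of about $3000 \times \log_2 1001 \approx 30{,}000$ bits per second.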