The creator of information theory. In the late 1940s, he established a mathematical method for analyzing the quantity, reliability, transformation, and equivalence of information, laying the foundation for the remarkable development of today's telecommunications.
A Mathematical Theory of Communication, 1948.
Communication in the Presence of Noise, 1949.
The Zero Error Capacity of Noisy Channels, 1956.
Automata Studies (with John McCarthy), 1956.
Certain Results in Coding Theory for Noisy Channels, 1956.
Geometrische Deutung einiger Ergebnisse bei der Berechnung der Kanalkapazität (Geometric Interpretation of Some Results in the Calculation of Channel Capacity), 1957.
Probability of Error for Optimal Codes in a Gaussian Channel, 1959.
Coding Theorems for a Discrete Source with a Fidelity Criterion, 1960.
Two-way Communication Channels, 1961.
Communication technology has made some of the most remarkable progress of any field in this century and has affected a wide range of human life. The problems of information and communication are expected to become even more significant in the coming century.
Prof. Claude Elwood Shannon gave the development of communication technology a rigorous mathematical basis. His 1948 paper, "A Mathematical Theory of Communication," gave birth to a new branch of mathematics called information theory. His theory revealed the scientific properties of information transmission and opened the possibility of treating problems of information mathematically, something previously considered impossible. It attracted the interest of many engineers and mathematicians, and has driven the continual, remarkable development of information science.
Prof. Shannon not only established an entirely new field of mathematics and set out clear guidelines for research and development; he also discovered the field's most important concepts and techniques and proved the fundamental theorems on which its present development rests. It is rare in the long history of science to find a mathematical scientist who has made such an epoch-making achievement.
Prof. Shannon analyzed the mathematical structure of information transmission from the following five aspects:
1. The characteristics of the information source that emits the information: the set of messages, the set of letters composing each message, the occurrence probability and transmission output of each message, and so on.
2. The problems of encoding messages as signals for transmission.
3. The characteristics of the channels that carry the signals, and of the noise that hinders communication.
4. The problem of restoring the encoded signals as messages, i.e., decoding.
5. The characteristics of the receiving end of the information.
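The five-part model above (source, encoder, channel, decoder, receiver) can be sketched concretely. A minimal illustration in Python, assuming a binary symmetric channel and a simple repetition code; the names and parameters here are illustrative, not Shannon's:

```python
import random

random.seed(0)

def encode(bits, n=3):
    # Repetition code: transmit each source bit n times.
    return [b for bit in bits for b in [bit] * n]

def channel(bits, p=0.1):
    # Binary symmetric channel: each bit is flipped with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits, n=3):
    # Majority vote over each group of n received bits.
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

message = [1, 0, 1, 1, 0, 0, 1, 0]          # the source's message
received = decode(channel(encode(message)))  # after encoding, noise, decoding
```

The repetition code trades rate for reliability: a single flipped bit in each group of three is corrected by the majority vote, which is the essence of the encoding/decoding problems in aspects 2 and 4.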
Based on this fundamental model of information transmission, Prof. Shannon formulated the fundamental laws that govern the quantity and reliability of information, and the transformation and equivalence of information. Neither dwelling on philosophical arguments nor resorting to methods that depend on technological intuition, he went straight to the heart of mathematical science and built a theory founded on quantitative definitions and mathematical analysis.
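One celebrated instance of these laws, from the 1949 paper "Communication in the Presence of Noise" listed above, is the capacity of a band-limited channel with additive white Gaussian noise:

```latex
C = W \log_2\!\left(1 + \frac{P}{N}\right) \quad \text{bits per second},
```

where $W$ is the channel bandwidth, $P$ the average signal power, and $N$ the noise power. Reliable communication is possible at any rate below $C$, and at no rate above it.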
The concepts that Prof. Shannon formulated, above all that of information entropy, are fundamental to information theory, and Shannon's coding theorems are still the most important results in this field.
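The entropy of a discrete source with symbol probabilities $p_i$ is $H = -\sum_i p_i \log_2 p_i$ bits. A minimal sketch in Python; the function name and the example distributions are illustrative:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Terms with p == 0 contribute 0 by convention."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # about 0.47 bits
```

Entropy measures the average uncertainty of the source, and by Shannon's source coding theorem it is the minimum average number of bits per symbol needed to encode the source's output.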
Prof. Shannon’s theory and ideas on information, whose applications in communication technology are, needless to say, brilliant, have also greatly influenced a wide range of quantitative sciences, including computer science, linguistics, biology, and psychology.