Modern digital communication depends on Information Theory, which was invented in the 1940s by Claude E. Shannon. Shannon first published A Mathematical Theory of Communication in 1948, and jointly published The Mathematical Theory of Communication with Warren Weaver in 1949. That text is still in publication by the University of Illinois Press. Information Theory, sometimes referred to as Classical Information Theory as opposed to Algorithmic Information Theory, provides a mathematical model for communication. Though Shannon was principally concerned with the problem of electronic communications, the theory has much broader applicability. Communication occurs whenever things are copied or moved from one place and/or time to another.
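The heart of that mathematical model is Shannon's measure of information, the entropy of a probability distribution. As a minimal sketch (the example distributions below are hypothetical, not taken from the article), entropy can be computed in a few lines of Python:

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit of information per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each toss carries less
# than one bit.
print(entropy([0.9, 0.1]))
```

The intuition: the less predictable a message source is, the more information each symbol conveys, and entropy quantifies that unpredictability.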
This article briefly describes the main concepts of Shannon's theory. The mathematical proofs are readily available in many sources, including the Internet links on this page. While Shannon's theory covers both digital and analog communication, analog communication will be ignored for simplicity. On the other hand, Information Theory is a fairly technical subject, generally introduced to third-year engineering university students. Really understanding it requires knowledge of statistics and calculus.
For those who wonder how a theory about communication can possibly relate to biological evolution, a visit to Tom Schneider's web site, Molecular Information Theory and the Theory of Molecular Machines, may help. In any case, Creationists are now fond of arguing about information, and this article provides useful background material on the subject.