Three guys walk into a bar: an information theoretic analysis

Pages 193-212 | Received 25 Mar 2017, Accepted 01 Aug 2017, Published online: 21 Aug 2017

ABSTRACT

This study posits a scenario in which three famous information theorists meet in a bar and compare notes. The three are the celebrated information theorist Claude Shannon, the eighteenth-century English statistician Thomas Bayes, and the Harvard statistical linguist George Zipf. Each promoted a foundational equation concerning human communication and the updating of beliefs based on new information. They discover that, with some modest mathematical transformations, they can demonstrate that each of the equations, although grounded in entirely distinct phenomena in physics, statistics, and linguistics, has the same basic structural form. The core of the analysis explores how such distinct phenomena share similar nonlinear structural properties, i.e., non-Gaussian distributions, and why an understanding of these properties is important for communication research and the analysis of advanced information systems.
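
For reference, the three foundational equations presumably at issue – given here in their conventional textbook forms rather than the article's own transformations – are Shannon's measure of average information, Bayes' rule for updating beliefs, and Zipf's rank-frequency law:

H(X) = -\sum_i p_i \log_2 p_i        (Shannon: average information, in bits, of a source with symbol probabilities p_i)

P(H \mid D) = \frac{P(D \mid H)\, P(H)}{P(D)}        (Bayes: updated probability of hypothesis H after observing data D)

f(r) \propto 1 / r^{s}, \quad s \approx 1        (Zipf: frequency of the r-th most frequent word in a corpus)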

Disclosure statement

No potential conflict of interest was reported by the author.

Notes on contributor

W. Russell Neuman is Professor of Media Technology at the interdisciplinary MAGNET unit of New York University and University Provostial Fellow. He was John Evans Professor of Media Technology at the University of Michigan [email: [email protected]].

Notes

1 The fact that each additional bit doubles the number of distinct values a string of bits can represent is an important lesson for the digital era. The longer a digital encryption key, the more secure it is and the less likely it is to be decoded by brute-force methods. An eight-bit key offers 256 possible variations; a 16-bit key, 65,536; a 64-bit key, 18,446,744,073,709,551,616.
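
A minimal numerical sketch of this note's arithmetic, in Python; the key lengths and the guesses-per-second rate are illustrative assumptions rather than figures from the article:

# An n-bit key has 2**n possible values, so each added bit doubles the keyspace.
for bits in (8, 16, 64):
    print(f"{bits:>2}-bit key: {2 ** bits:,} possible keys")

# Rough brute-force time for a 64-bit key, assuming 10**9 guesses per second (hypothetical rate).
guesses_per_second = 10 ** 9
seconds_per_year = 60 * 60 * 24 * 365
years = 2 ** 64 / guesses_per_second / seconds_per_year
print(f"Exhausting a 64-bit keyspace at that rate: roughly {years:,.0f} years")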

2 Actually, the main point communicated by the lanterns in the Old North Church tower was that the Redcoats were coming at that moment; the choice of one versus two lanterns signaled the direction of their march.

3 It is widely acknowledged that many who draw on Shannon’s insights struggle with the intertwined conceptions of information, information transmission, communication, uncertainty reduction, entropy, and equivocation. It turns out that Shannon’s choice of the symbol ‘H’, drawing on a structural similarity to Boltzmann’s famous formula for entropy in physics, is something of a celebrated story in the history of science. Stonier explains:

“Shannon considered his contribution to have been a theory of communication – i.e. a theory of information transport. Earlier, in 1928, R. V. L. Hartley (as reviewed by Cherry (1978)) had suggested a way of quantifying the ‘information’ contained in a message. The suggestion represents one of those events in history where a whole field is switched onto the wrong track. As Colin Cherry has commented: ‘it is a pity that the mathematical concepts stemming from Hartley have been called “information” at all’ (Cherry, 1978, p. 51). Cherry goes on to point out that the formula derived by Shannon for the average information contained in a long series of symbols is really a measure of the statistical rarity or ‘surprise value’ of a source of message signs. This is hardly a true measure of the information content of a message. To confuse matters further, Shannon introduced the concept of entropy into his calculations. The idea that entropy and information are somehow interrelated had been around for two decades. It had been postulated by Szilard (1929). However, Shannon’s basis for doing so stemmed from the fact that his equations looked like the equations that had been developed in statistical mechanics by the Austrian physicist Ludwig Boltzmann. It is said that he was also encouraged to do so by Von Neumann, who told Shannon that since ‘ … no one knows what entropy is, so in a debate you will always have the advantage’ (Tribus & McIrvine, 1971). One could argue that, as a result of Von Neumann’s advice, the communications engineers and information theorists all became the victims of a bad joke: that the potential indeterminacy of a message is the same thing as entropy. The confusion still reigns today” (Stonier, 1997, p. 13).
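
The structural resemblance at the heart of this note is between Shannon’s measure and the Boltzmann–Gibbs entropy of statistical mechanics (standard textbook forms; the parallel is formal rather than physical):

H = -\sum_i p_i \log_2 p_i        (Shannon: average information per symbol, in bits)

S = -k_B \sum_i p_i \ln p_i        (Gibbs entropy; reduces to Boltzmann’s S = k_B \ln W when all W microstates are equally probable)

Apart from the constant k_B and the base of the logarithm, the two expressions are identical in form, which is precisely the similarity that motivated Shannon’s choice of ‘H’.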
