Summary
The information potential of an information store is defined as the logarithm of the number of alternative messages which can be placed in it. A cost is attributed to each message, and situations are examined in which specific cash‐allocations are associated with individual stores. The machinery of elementary thermodynamics is clearly reflected in the subsequent development. In particular, precise analogues of temperature, of the increasing property of entropy, of the Maxwell‐Boltzmann distribution, of ideal efficiencies of heat engines, of the partition function and of free energy arise in a straightforward fashion. Probabilities are not introduced. The interest of the procedure, relative to other work in the field, lies in the complete elimination of subjective notions of information.
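The opening definition can be sketched numerically. The function name `information_potential` and the symbol-store example below are illustrative assumptions, not notation from the paper; only the defining relation (potential = logarithm of the count of alternative messages) comes from the summary above:

```python
import math

def information_potential(num_messages: int, base: float = 2.0) -> float:
    """Information potential of a store: the logarithm of the number
    of alternative messages it can hold, in units fixed by `base`
    (base 2 gives bits)."""
    if num_messages < 1:
        raise ValueError("a store must admit at least one message")
    return math.log(num_messages, base)

# Illustrative assumption: a store of k cells, each holding one of
# a symbols, admits a**k distinct messages, so its potential is
# k * log(a) -- additive over independent stores, mirroring an
# extensive thermodynamic quantity.
k, a = 8, 2
assert math.isclose(information_potential(a ** k), k * math.log2(a))
```

A store admitting exactly one message has potential zero, consistent with a quantity that measures capacity for alternatives rather than any subjective content.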