Abstract
An analysis is made, in terms of information theory, of the behaviour of linear feedback regulators subjected to stationary, Gaussian-distributed, continuous-parameter random-process disturbances. It is found that the state of a signal is best represented by two independent measures, namely the variance and the limiting rate of the generalized entropy. A relationship is obtained which links the variance, or power, of the outputs and their entropy rates with the information rates to the controllers, and an equivalence in behaviour to communication systems is pointed out. At the minimum of the entropy rate for a fixed variance, equations are derived which are analogous to those applying at equilibrium in thermodynamics.
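As a sketch of the entropy measures referred to above (the standard Shannon forms are assumed here; the paper's "generalized entropy" may differ in detail): a stationary Gaussian signal of variance $\sigma^2$ has differential entropy

$$
h = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^2\right),
$$

and a stationary Gaussian process with power spectral density $S(\omega)$ has the limiting entropy rate

$$
\bar{h} = \tfrac{1}{2}\ln(2\pi e) + \frac{1}{4\pi}\int_{-\infty}^{\infty}\ln S(\omega)\,d\omega ,
$$

so that variance and entropy rate are indeed independent descriptors: processes of equal power can have different entropy rates depending on the shape of $S(\omega)$.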