Abstract
This paper discusses the difficulties of interpreting neuronal activity by means of information theory and probability theory: there is still no general agreement about the alphabet used by neurons, so the very first step of an information-theoretic analysis rests on conjecture. Furthermore, electrophysiological single-unit recordings can often be shown to be non-stationary (owing to trends, facilitation, inhibition, transient reactions, or oscillations) and would therefore not appear to be ergodic. But then, how does the system succeed in recognizing certain patterns from only one realization of a stochastic process? These difficulties do not arise if an appropriate multiunit system is considered, the dynamics of which are determined by a set of differential equations. The asymptotic (t → ±∞) solutions of these equations define a number of state-space structures (fixed points, limit cycles, or strange attractors). According to this concept, information is related to the particular shape of a state-space structure and to its probability, and information flow means the filtering of these structures.
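The idea that asymptotic solutions of a set of differential equations define state-space structures can be sketched numerically. The example below is a hypothetical stand-in, not a system from the paper: it integrates the van der Pol oscillator, whose trajectories from very different initial conditions converge onto the same limit cycle, illustrating how an attractor, rather than any single realization, carries the reproducible structure.

```python
def van_der_pol_trajectory(x0, y0, mu=1.0, dt=0.001, steps=200_000):
    """Integrate dx/dt = y, dy/dt = mu*(1 - x^2)*y - x with forward Euler.

    This is an illustrative toy system, not a neuronal model from the paper.
    """
    x, y = x0, y0
    traj = []
    for _ in range(steps):
        # Simultaneous Euler update of both state variables.
        x, y = x + dt * y, y + dt * (mu * (1 - x * x) * y - x)
        traj.append((x, y))
    return traj

# Two very different initial states...
a = van_der_pol_trajectory(0.1, 0.0)
b = van_der_pol_trajectory(3.0, -2.0)

# ...settle onto (nearly) the same closed orbit: compare the late-time
# x-amplitude of each trajectory, which is close to 2 for mu = 1.
amp_a = max(abs(x) for x, _ in a[-20_000:])
amp_b = max(abs(x) for x, _ in b[-20_000:])
print(round(amp_a, 2), round(amp_b, 2))
```

In this picture, the "information" the abstract refers to would reside in the shape of the limit cycle itself (and its probability of being visited), not in the sample path of any single noisy trajectory.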