Abstract
We define a stochastic neuron as an element that increases its internal state with probability $p$ until a threshold is reached, after which the internal state is reset to its initial value. We study the local mutual information between the message arriving from the input neurons and the response of the neuron, and its dependence on the firing probability $\alpha$ of the synaptic inputs in a network of such stochastic neurons. The values of $\alpha$ obtained in the simulations coincide with those obtained theoretically by maximizing the local mutual information. We conclude that the global dynamics maximizes the local mutual information of single units, meaning that the self-selected parameter value of the population dynamics is such that each neuron behaves as an optimal encoder.
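The neuron model defined above can be illustrated with a minimal sketch; the function name, the reset-to-zero initial state, and the discrete-time loop are assumptions for illustration, not part of the paper's formal definition:

```python
import random

def stochastic_neuron(p, threshold, n_steps, seed=0):
    """Minimal sketch of the stochastic neuron described in the abstract:
    each step the internal state increments with probability p; when the
    state reaches the threshold the neuron fires and the state is reset
    to its initial value (taken here to be 0)."""
    rng = random.Random(seed)
    state = 0
    spikes = []  # 1 at steps where the neuron fires, 0 otherwise
    for _ in range(n_steps):
        if rng.random() < p:
            state += 1
        if state >= threshold:
            spikes.append(1)
            state = 0  # reset to the initial value after firing
        else:
            spikes.append(0)
    return spikes
```

For example, with `p = 1` the state grows deterministically, so the neuron fires exactly every `threshold` steps; smaller `p` stretches the mean interspike interval to `threshold / p`.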