
Generalised information and entropy measures in physics

Pages 495-510 | Received 21 Dec 2008, Accepted 13 Feb 2009, Published online: 15 May 2009

Abstract

The formalism of statistical mechanics can be generalised by starting from more general measures of information than the Shannon entropy and maximising those subject to suitable constraints. We discuss some of the most important examples of information measures that are useful for the description of complex systems. Examples treated are the Rényi entropy, Tsallis entropy, Abe entropy, Kaniadakis entropy, Sharma–Mittal entropies, and a few more. Important concepts such as the axiomatic foundations, composability and Lesche stability of information measures are briefly discussed. Potential applications in physics include complex systems with long-range interactions and metastable states, scattering processes in particle physics, hydrodynamic turbulence, defect turbulence, optical lattices, and quite generally driven nonequilibrium systems with fluctuations of temperature.
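To make the relationship between these generalised measures concrete, the sketch below (not from the article; a minimal illustration using standard textbook definitions) computes the Shannon, Rényi and Tsallis entropies of a discrete distribution and shows numerically that both generalised entropies reduce to the Shannon entropy in the limit q → 1:

```python
import numpy as np

def shannon(p):
    """Shannon entropy S = -sum_i p_i ln p_i (natural logarithm)."""
    p = p[p > 0]  # avoid log(0); terms with p_i = 0 contribute nothing
    return -np.sum(p * np.log(p))

def renyi(p, q):
    """Rényi entropy S_q = (1/(1-q)) ln sum_i p_i^q, for q != 1."""
    return np.log(np.sum(p ** q)) / (1.0 - q)

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.25, 0.25])
print(shannon(p))           # about 1.0397
print(renyi(p, 1.001))      # approaches the Shannon value as q -> 1
print(tsallis(p, 1.001))    # likewise
```

The maximisation of such measures subject to constraints (normalisation, mean energy) then yields the corresponding generalised canonical distributions discussed in the article.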

Notes

1. An exception to this rule is the Fisher information, which depends on gradients of the probability density but will not be discussed here.
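As a brief illustration of this gradient dependence (not part of the article; a numerical sketch using the standard definition), the Fisher information of a probability density with respect to a location parameter is I = ∫ (dp/dx)² / p dx, which for a Gaussian of width σ equals 1/σ²:

```python
import numpy as np

# Numerical Fisher information of a Gaussian density with respect to
# its location parameter; the exact value is 1/sigma^2.
sigma = 2.0
x = np.linspace(-20.0, 20.0, 200001)
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
dp = np.gradient(p, x)          # numerical derivative dp/dx
I = np.trapz(dp**2 / p, x)      # integrate (dp/dx)^2 / p
print(I)  # close to 1/sigma^2 = 0.25
```

Unlike the Shannon-type measures above, which depend only on the probabilities themselves, this quantity changes if the density is rearranged in x, which is why it falls outside the class of measures treated here.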
