
Generalised information and entropy measures in physics

Pages 495-510 | Received 21 Dec 2008, Accepted 13 Feb 2009, Published online: 15 May 2009
 

Abstract

The formalism of statistical mechanics can be generalised by starting from more general measures of information than the Shannon entropy and maximising those subject to suitable constraints. We discuss some of the most important examples of information measures that are useful for the description of complex systems. Examples treated are the Rényi entropy, Tsallis entropy, Abe entropy, Kaniadakis entropy, Sharma–Mittal entropies, and a few more. Important concepts such as the axiomatic foundations, composability and Lesche stability of information measures are briefly discussed. Potential applications in physics include complex systems with long-range interactions and metastable states, scattering processes in particle physics, hydrodynamic turbulence, defect turbulence, optical lattices, and quite generally driven nonequilibrium systems with fluctuations of temperature.
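For orientation, the standard textbook forms of two of the measures named above are sketched below; the notation (probabilities p_i of discrete microstates, entropic index q) is assumed here and may differ slightly from the conventions used in the article itself:

% Shannon entropy and two generalised information measures (standard discrete forms, sketch only)
\begin{align}
  S_{\mathrm{Shannon}} &= -\sum_i p_i \ln p_i, \\
  S_q^{(\mathrm{R})}   &= \frac{1}{1-q}\,\ln \sum_i p_i^{\,q} \qquad \text{(R\'enyi)}, \\
  S_q^{(\mathrm{T})}   &= \frac{1}{q-1}\Bigl(1 - \sum_i p_i^{\,q}\Bigr) \qquad \text{(Tsallis)} .
\end{align}

Both generalised measures reduce to the Shannon entropy in the limit q → 1; maximising them subject to suitable constraints yields generalised canonical distributions in place of the ordinary Boltzmann factor.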

Notes

1. An exception to this rule is the Fisher information, which depends on gradients of the probability density but will not be discussed here.
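For completeness, a common one-dimensional form of the Fisher information alluded to in this note is (notation assumed; the article does not spell it out):

% Fisher information of a probability density p(x) (standard form, sketch only)
\begin{equation}
  I[p] = \int \frac{\bigl(p'(x)\bigr)^2}{p(x)}\,dx
       = \int p(x)\left(\frac{d}{dx}\ln p(x)\right)^{2} dx ,
\end{equation}

which depends on the gradient of the probability density rather than on the probabilities alone, unlike the measures discussed in the abstract.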
