Abstract
Rényi entropy is a generalization of Shannon entropy that plays an important role in information theory. Recently, a new concept called extropy, the dual complement of entropy, has been developed. This paper proposes Rényi extropy, maximum Rényi extropy, and conditional Rényi extropy. As the parameter q of Rényi extropy tends to 1, Rényi extropy reduces to extropy. When the probability distribution is uniform, Rényi extropy attains its maximum value, and this maximum equals the maximum extropy.
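As background for the abstract's claim that the maximum is attained at the uniform distribution, the following sketch evaluates plain discrete extropy (the q → 1 limiting case) numerically. It uses the standard discrete extropy J(p) = −Σᵢ (1 − pᵢ) ln(1 − pᵢ) of Lad, Sanfilippo, and Agrò; the paper's Rényi extropy generalizes this, and its exact form is not reproduced here.

```python
import math

def extropy(p):
    """Discrete extropy J(p) = -sum_i (1 - p_i) * ln(1 - p_i).

    Standard definition for a discrete distribution; the paper's
    Renyi extropy is a one-parameter generalization of this quantity.
    """
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

# The uniform distribution maximizes extropy, consistent with the
# abstract's claim for Renyi extropy in the q -> 1 limit.
n = 4
uniform = [1 / n] * n
skewed = [0.7, 0.1, 0.1, 0.1]

print(extropy(uniform))                      # maximum for n = 4
print(extropy(uniform) > extropy(skewed))    # True

# Consistency check: the maximum extropy on n outcomes is
# (n - 1) * ln(n / (n - 1)).
print(abs(extropy(uniform) - (n - 1) * math.log(n / (n - 1))) < 1e-12)
```

The closed-form maximum (n − 1) ln(n / (n − 1)) follows by substituting pᵢ = 1/n into the definition; it approaches 1 as n grows.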
Acknowledgements
The authors greatly appreciate the reviewers’ suggestions and the editor’s encouragement.
Disclosure statement
The authors declare that there are no conflicts of interest.