References
- Shannon CE. A mathematical theory of communication. Bell Syst Tech J. 1948;27(3):379–423; 623–656.
- Bobkov S, Madiman M. Concentration of the information in data with log-concave distributions. Ann Probab. 2011;39(4):1528–1543.
- Kontoyiannis I, Verdu S. Optimal lossless compression: source varentropy and dispersion. In: IEEE International Symposium on Information Theory (ISIT), Istanbul; 2013. p. 1739–1743.
- Kontoyiannis I, Verdu S. Optimal lossless compression: source varentropy and dispersion. IEEE Trans Inform Theory. 2014;60(2):777–795.
- Arikan E. Varentropy decreases under the polar transform. IEEE Trans Inform Theory. 2016;62(6):3390–3400.
- Buono F, Longobardi M. Varentropy of past lifetimes. arXiv preprint arXiv:2008.07423. 2020.
- Di Crescenzo A, Paolillo L, Suarez-Llorens A. Stochastic comparisons, differential entropy and varentropy for distributions induced by probability density functions. arXiv preprint arXiv:2103.11038. 2021.
- Di Crescenzo A, Paolillo L. Analysis and applications of the residual varentropy of random lifetimes. Probab Eng Inf Sci. 2021;35(3):680–698.
- Maadani S, Mohtashami Borzadaran GR, Rezaei Roknabadi AH. Varentropy of order statistics and some stochastic comparisons. Commun Stat Theory Methods. 2021;51(18):6447–6460.
- Fradelizi M, Madiman M, Wang L. Optimal concentration of information content for log-concave densities. In: Houdré C, Mason D, Reynaud-Bouret P, Rosiński J, editors. High dimensional probability VII. Progress in probability, vol. 71. Cham: Springer; 2016. p. 45–60.
- Madiman M, Wang L. An optimal varentropy bound for log-concave distributions. In: International Conference on Signal Processing and Communications (SPCOM), Bangalore: Indian Institute of Science; 2014. p. 1. doi:10.1109/SPCOM.2014.6983953.
- Goodarzi F, Amini M, Mohtashami Borzadaran GR. Characterizations of continuous distributions through inequalities involving the expected values of selected functions. Appl Math. 2017;62(5):493–507.
- Vasicek O. A test for normality based on sample entropy. J R Stat Soc Ser B. 1976;38(1):54–59.
- van Es B. Estimating functionals related to a density by a class of statistics based on spacings. Scand J Stat. 1992;19(1):61–72.
- Ebrahimi N, Pflughoeft K, Soofi E. Two measures of sample entropy. Stat Probab Lett. 1994;20(3):225–234.
- Correa JC. A new estimator of entropy. Commun Stat Theory Methods. 1995;24(10):2439–2449.
- Alizadeh Noughabi H. A new estimator of entropy and its application in testing normality. J Stat Comput Simul. 2010;80(10):1151–1162. doi:10.1080/00949650903005656.
- Arizono I, Ohta H. A test for normality based on Kullback-Leibler information. Am Stat. 1989;43(1):20–22.
- Grzegorzewski P, Wieczorkowski R. Entropy-based goodness-of-fit test for exponentiality. Commun Stat Theory Methods. 1999;28(5):1183–1202.
- Stephens MA. EDF statistics for goodness of fit and some comparisons. J Am Stat Assoc. 1974;69(347):730–737.
- Illowsky B, Dean S. Introductory statistics. Houston: OpenStax; 2018.