Research Article

A link of extropy to entropy for continuous random variables via the generalized ϕ-entropy

Received 19 Aug 2023, Accepted 30 May 2024, Published online: 08 Jul 2024

References

  • Ali, S. M., and S. D. Silvey. 1966. A general class of coefficients of divergence of one distribution from another. Journal of the Royal Statistical Society Series B: Statistical Methodology 28 (1):131–42. doi:10.1111/j.2517-6161.1966.tb00626.x.
  • Amigó, J., S. Balogh, and S. Hernández. 2018. A brief review of generalized entropies. Entropy 20 (11):813. doi:10.3390/e20110813.
  • Arizono, I., and H. Ohta. 1989. A test for normality based on Kullback-Leibler information. The American Statistician 43 (1):20–2. doi:10.1080/00031305.1989.10475600.
  • Balakrishnan, N., F. Buono, C. Calí, and M. Longobardi. 2023. Dispersion indices based on Kerridge inaccuracy measure and Kullback-Leibler divergence. Communications in Statistics - Theory and Methods 53 (15):5574–92. doi:10.1080/03610926.2023.2222926.
  • Balakrishnan, N., A. H. Rad, and N. R. Arghami. 2007. Testing exponentiality based on Kullback-Leibler information with progressively Type-II censored data. IEEE Transactions on Reliability 56:349–56.
  • Block, H. W., T. H. Savits, and H. Singh. 1998. The reversed hazard rate function. Probability in the Engineering and Informational Sciences 12 (1):69–90. doi:10.1017/S0269964800005064.
  • Buono, F., O. Kamari, and M. Longobardi. 2023. Interval extropy and weighted interval extropy. Ricerche di Matematica 72 (1):283–98. doi:10.1007/s11587-021-00678-x.
  • Burbea, J. 1984. The Bose-Einstein entropy of degree α and its Jensen difference. Utilitas Mathematica 25:225–40.
  • Burbea, J., and C. Rao. 1982. Entropy differential metric, distance and divergence measures in probability spaces: A unified approach. Journal of Multivariate Analysis 12 (4):575–96. doi:10.1016/0047-259X(82)90065-3.
  • Cressie, N., and T. R. Read. 1984. Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society Series B: Statistical Methodology 46 (3):440–64. doi:10.1111/j.2517-6161.1984.tb01318.x.
  • Cumming, S. G. 2001. A parametric model of the fire-size distribution. Canadian Journal of Forest Research 31 (8):1297–303. doi:10.1139/x01-032.
  • Devarajan, K., G. Wang, and N. Ebrahimi. 2015. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing. Machine Learning 99 (1):137–63. doi:10.1007/s10994-014-5470-z.
  • Di Crescenzo, A., and M. Longobardi. 2002. Entropy-based measure of uncertainty in past lifetime distributions. Journal of Applied Probability 39 (2):434–40. doi:10.1239/jap/1025131441.
  • Di Crescenzo, A., and M. Longobardi. 2009. On cumulative entropies. Journal of Statistical Planning and Inference 139 (12):4072–87. doi:10.1016/j.jspi.2009.05.038.
  • Di Crescenzo, A., and M. Longobardi. 2015. Some properties and applications of cumulative Kullback–Leibler information. Applied Stochastic Models in Business and Industry 31 (6):875–91. doi:10.1002/asmb.2116.
  • Ebrahimi, N. 1996. How to measure uncertainty in the residual life time distribution. Sankhyā: Series A 58:48–56.
  • Ebrahimi, N., and S. Kirmani. 1996. A measure of discrimination between two residual life-time distributions and its applications. Annals of the Institute of Statistical Mathematics 48 (2):247–65.
  • Espendiller, M., and M. Kateri. 2016. A family of association measures for 2 × 2 contingency tables based on the ϕ-divergence. Statistical Methodology 30:45–61. doi:10.1016/j.stamet.2015.12.002.
  • Hashempour, M., M. R. Kazemi, and S. Tahmasebi. 2022. On weighted cumulative residual extropy: Characterization, estimation and testing. Statistics 56 (3):681–98. doi:10.1080/02331888.2022.2072505.
  • Havrda, J., and F. Charvát. 1967. Quantification method of classification processes: Concept of structural α-entropy. Kybernetika 3 (1):30–5.
  • Iranpour, M., M. A. Hejazi, and M. Shahidehpour. 2020. A unified approach for reliability assessment of critical infrastructures using graph theory and entropy. IEEE Transactions on Smart Grid 11 (6):5184–92. doi:10.1109/TSG.2020.3005862.
  • Kamari, O., and F. Buono. 2021. On extropy of past lifetime distribution. Ricerche di Matematica 70 (2):505–15. doi:10.1007/s11587-020-00488-7.
  • Kang, H. Y., and B. M. Kwak. 2009. Application of maximum entropy principle for reliability-based design optimization. Structural and Multidisciplinary Optimization 38 (4):331–46. doi:10.1007/s00158-008-0299-3.
  • Kapur, J. N. 1972. Measures of uncertainty, mathematical programming and physics. Journal of the Indian Society of Agricultural Statistics 24:47–66.
  • Khinchin, A. I. 1957. Mathematical foundations of information theory. New York: Dover Publications.
  • Klein, I., and M. Doll. 2020. (Generalized) maximum cumulative direct, residual, and paired Φ entropy approach. Entropy 22 (1):91. doi:10.3390/e22010091.
  • Kullback, S., and R. A. Leibler. 1951. On information and sufficiency. The Annals of Mathematical Statistics 22 (1):79–86. doi:10.1214/aoms/1177729694.
  • Lad, F., G. Sanfilippo, and G. Agrò. 2015. Extropy: Complementary dual of entropy. Statistical Science 30:40–58.
  • Liese, F., and I. Vajda. 1987. Convex statistical distances. Leipzig: Teubner.
  • Maasoumi, E., and J. Racine. 2002. Entropy and predictability of stock market returns. Journal of Econometrics 107 (1-2):291–312. doi:10.1016/S0304-4076(01)00125-7.
  • Mathai, A. M., and H. J. Haubold. 2007. Pathway model, superstatistics, Tsallis statistics, and a generalized measure of entropy. Physica A: Statistical Mechanics and Its Applications 375 (1):110–22. doi:10.1016/j.physa.2006.09.002.
  • Melbourne, J., S. Talukdar, S. Bhaban, M. Madiman, and M. V. Salapaka. 2022. The differential entropy of mixtures: New bounds and applications. IEEE Transactions on Information Theory 68 (4):2123–46. doi:10.1109/TIT.2022.3140661.
  • Mohammadi, M., and M. Hashempour. 2022. On interval weighted cumulative residual and past extropies. Statistics 56 (5):1029–47. doi:10.1080/02331888.2022.2111429.
  • Morimoto, T. 1963. Markov processes and the H-theorem. Journal of the Physical Society of Japan 18 (3):328–31. doi:10.1143/JPSJ.18.328.
  • Murphy, P. M., and D. W. Aha. 1994. UCI Repository of machine learning databases. Irvine, CA: University of California, Department of Information and Computer Science. http://www.ics.uci.edu/mlearn/MLRepository.html.
  • Murthy, D. N. P., M. Xie, and R. Jiang. 2004. Weibull models. Hoboken, NJ: John Wiley & Sons.
  • Pape, N. T. 2015. Phi-divergences and dynamic entropies for quantifying uncertainty in lifetime distributions. M.Sc. thesis, Department of Mathematics, RWTH Aachen University, Aachen, Germany.
  • Pardo, L. 2005. Statistical inference based on divergence measures. Boca Raton, FL: Chapman and Hall/CRC.
  • Pearson, K. 1900. X. On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science 50 (302):157–75. doi:10.1080/14786440009463897.
  • Qiu, G., and K. Jia. 2018. The residual extropy of order statistics. Statistics & Probability Letters 133:15–22. doi:10.1016/j.spl.2017.09.014.
  • Rao, M., Y. Chen, B. C. Vemuri, and F. Wang. 2004. Cumulative residual entropy: A new measure of information. IEEE Transactions on Information Theory 50 (6):1220–8. doi:10.1109/TIT.2004.828057.
  • Raschke, M. 2012. Inference for the truncated exponential distribution. Stochastic Environmental Research and Risk Assessment 26 (1):127–38. doi:10.1007/s00477-011-0458-8.
  • Ruggeri, F., M. Sánchez-Sánchez, M. Á. Sordo, and A. Suárez-Llorens. 2021. On a new class of multivariate prior distributions: Theory and application in reliability. Bayesian Analysis 16:31–60.
  • Shannon, C. E. 1948. A mathematical theory of communication. Bell System Technical Journal 27 (3):379–423. doi:10.1002/j.1538-7305.1948.tb01338.x.
  • Sheikh, A. K., J. K. Boah, and M. Younas. 1989. Truncated extreme value model for pipeline reliability. Reliability Engineering & System Safety 25 (1):1–14. doi:10.1016/0951-8320(89)90020-3.
  • Shi, X., A. P. Teixeira, J. Zhang, and C. Guedes Soares. 2014. Structural reliability analysis based on probabilistic response modelling using the Maximum Entropy Method. Engineering Structures 70:106–16. doi:10.1016/j.engstruct.2014.03.033.
  • Singh, V. B., M. Sharma, and H. Pham. 2018. Entropy based software reliability analysis of multi-version open source software. IEEE Transactions on Software Engineering 44 (12):1207–23. doi:10.1109/TSE.2017.2766070.
  • Soofi, E. S. 1994. Capturing the intangible concept of information. Journal of the American Statistical Association 89 (428):1243–54. doi:10.1080/01621459.1994.10476865.
  • Soofi, E. S. 2000. Principal information theoretic approaches. Journal of the American Statistical Association 95 (452):1349–53. doi:10.2307/2669786.
  • Sunoj, S. M., P. G. Sankaran, and S. S. Maya. 2009. Characterizations of life distributions using conditional expectations of doubly (interval) truncated random variables. Communications in Statistics - Theory and Methods 38 (9):1441–52. doi:10.1080/03610920802455001.
  • Tsallis, C. 1988. Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics 52 (1-2):479–87. doi:10.1007/BF01016429.
  • Weijs, S. V., R. van Nooijen, and N. van de Giesen. 2010. Kullback–Leibler divergence as a forecast skill score with classic reliability–resolution–uncertainty decomposition. Monthly Weather Review 138 (9):3387–99. doi:10.1175/2010MWR3229.1.