Cumulative and relative cumulative residual information generating measures and associated properties

Pages 5260-5273 | Received 09 Mar 2021, Accepted 07 Nov 2021, Published online: 29 Nov 2021
