References
- Abbasnejad, M. 2011. Some goodness of fit tests based on Rényi information. Applied Mathematical Sciences 5:1921–34.
- Ahmad, I. A., and I. Alwasel. 1999. A goodness of fit test for exponentiality based on the memoryless property. Journal of the Royal Statistical Society, Series B 61:681–9.
- Alizadeh Noughabi, H. 2010. A new estimator of entropy and its application in testing normality. Journal of Statistical Computation and Simulation 80 (10):1151–62.
- Alwasel, I. 2001. On goodness of fit testing for exponentiality using the memoryless property. Journal of Nonparametric Statistics 13:569–81.
- Amari, S. 1985. Differential-Geometrical Methods in Statistics. Berlin, Germany: Springer-Verlag.
- Amari, S. 2009. Alpha-divergence is unique, belonging to both f-divergence and Bregman divergence classes. IEEE Transactions on Information Theory 55:4925–31.
- Amari, S., and H. Nagaoka. 2000. Methods of Information Geometry. New York, NY: Oxford University Press.
- Andrews, F. C., and A. C. Andrews. 1962. The form of the equilibrium distribution function. Transactions of the Kansas Academy of Science 65:247–56.
- Baratpour, S., and A. Habibirad. 2012. Testing goodness of fit for exponential distribution based on cumulative residual entropy. Communications in Statistics—Theory and Methods 41:1387–96.
- Barnett, V., and T. Lewis. 1994. Outliers in Statistical Data. Chichester, UK: John Wiley.
- Chen, Z. 2000. A new two-parameter lifetime distribution with bathtub shape or increasing failure rate function. Statistics and Probability Letters 49:155–61.
- Chernoff, H. 1952. A measure of asymptotic efficiency for tests of a hypothesis based on a sum of observations. Annals of Mathematical Statistics 23:493–507.
- Choi, B., K. Kim, and S. H. Song. 2004. Goodness of fit test for exponentiality based on Kullback-Leibler information. Communications in Statistics—Simulation and Computation 33 (2):525–36.
- Cichocki, A., and S. Amari. 2010. Families of alpha- beta- and gamma-divergences: flexible and robust measures of similarities. Entropy 12 (6):1532–68.
- Correa, J. C. 1995. A new estimator of entropy. Communications in Statistics—Theory and Methods 24:2439–49.
- Cressie, N., and T. Read. 1988. Goodness of fit Statistics for Discrete Multivariate Data. New York, NY: Springer.
- Dhillon, B. 1981. Lifetime distributions. IEEE Transactions on Reliability 30:457–9.
- Ebrahimi, N., M. Habibullah, and E. S. Soofi. 1992. Testing exponentiality based on Kullback-Leibler information. Journal of the Royal Statistical Society, Series B 54:739–48.
- Ebrahimi, N., K. Pflughoeft, and E. Soofi. 1994. Two measures of sample entropy. Statistics and Probability Letters 20:225–34.
- Eguchi, S., and S. Kato. 2010. Entropy and divergence associated with power function and the statistical application. Entropy 12:262–74.
- Finkelstein, J., and R. E. Schafer. 1971. Improved goodness of fit tests. Biometrika 58:641–5.
- Fujisawa, H., and S. Eguchi. 2008. Robust parameter estimation with a small bias against heavy contamination. Journal of Multivariate Analysis 99:2053–2081.
- Gurevich, G., and A. Davidson. 2008. Standardized form of Kullback-Leibler Information based statistics for normality and exponentiality. Computer Modeling and New Technologies 12:14–25.
- Harris, C. M. 1976. A note on testing for exponentiality. Naval Research Logistics Quarterly 28:169–75.
- Jones, M. C., N. L. Hjort, I. R. Harris, and A. Basu. 2001. A comparison of related density-based minimum divergence estimators. Biometrika 88 (3):865–73.
- Kullback, S. 1959. Information Theory and Statistics. New York, NY: John Wiley.
- Kullback, S., and R. A. Leibler. 1951. On information and sufficiency. Annals of Mathematical Statistics 22:79–86.
- Lilliefors, H. W. 1969. On the Kolmogorov-Smirnov test for the exponential distribution with mean unknown. Journal of the American Statistical Association 64:387–9.
- Minka, T. 2005. Divergence measures and message passing. Microsoft Research Technical Report (MSR-TR-2005).
- Nakamura, T. K. 2009. Relativistic equilibrium distribution by relative entropy maximization. EPL (Europhysics Letters) 88 (4).
- Ooms, G., and K. Moore. 1991. A novel assay for genetic and environmental changes in the architecture of intact root systems of plants grown in vitro. Plant Cell, Tissue and Organ Culture 27 (2):129–39.
- Park, S., D. Choi, and S. Jung. 2014. Kullback-Leibler information of the equilibrium distribution function and its application to goodness of fit test. Communications for Statistical Applications and Methods 21 (2):125–34.
- Rao, M., Y. Chen, B. C. Vemuri, and F. Wang. 2004. Cumulative residual entropy: a new measure of information. IEEE Transactions on Information Theory 50:1220–8.
- Shannon, C. E. 1948. A mathematical theory of communication. The Bell System Technical Journal 27:379–423.
- Stephens, M. A. 1974. EDF statistics for goodness of fit and some comparisons. Journal of the American Statistical Association 69:730–7.
- Taneja, I. 1990. On generalized entropies with applications. Lectures in Applied Mathematics and Informatics 107–69.
- Taneja, I. 1995. New developments in generalized information measures. Advances in Imaging and Electron Physics 91:37–135.
- Van Es, B. 1992. Estimating functionals related to a density by a class of statistics based on spacings. Scandinavian Journal of Statistics 19:61–72.
- Van Soest, J. 1969. Some goodness of fit tests for the exponential distribution. Statistica Neerlandica 23:41–51.
- Vasicek, O. 1976. A test for normality based on sample entropy. Journal of the Royal Statistical Society, Series B 38:54–9.
- Zhang, J., and H. Matsuzoe. 2008. Dualistic differential geometry associated with a convex function. In Advances in Applied Mathematics and Global Optimization, ed. D. Y. Gao and H. D. Sherali, 439–66. New York, NY: Springer.
- Zhu, H., and R. Rohwer. 1995. Bayesian invariant measurements of generalization. Neural Processing Letters 2:28–31.
- Zhu, H., and R. Rohwer. 1997. Measurements of generalisation based on information geometry. In Mathematics of Neural Networks, 394–8. Springer US.