References
- J.M. Amigo (2010), Permutation Complexity in Dynamical Systems, Springer-Verlag, Berlin-Heidelberg.
- H.H. Bauschke, P.L. Combettes (2017), Convex Analysis and Monotone Operator Theory in Hilbert Spaces, 2nd ed., Springer, New York.
- Ch. Corda, M. Fatehi Nia, M.R. Molaei, Y. Sayyari (2018), Entropy of iterated function systems and their relations with black holes and Bohr-like black holes entropies, Entropy, 20, 56.
- S.S. Dragomir (1999-2000), A converse result for Jensen’s discrete inequality via Grüss inequality and applications in information theory, An. Univ. Oradea Fasc. Mat., 7, 178-189.
- S.S. Dragomir (2010), A new refinement of Jensen’s inequality in linear spaces with applications, Mathematical and Computer Modelling, 52 (10), 1497-1505. doi: https://doi.org/10.1016/j.mcm.2010.05.035
- D. Garg, S. Kumar (2019), Exponential Tsallis-Havrda-Charvat directed-divergence convex function of ‘type a’, Journal of Information and Optimization Sciences, 40 (1), 1-11. doi: https://doi.org/10.1080/02522667.2017.1379234
- R.M. Gray (1990), Entropy and Information Theory, Springer-Verlag, New York.
- I. Kontoyiannis, P.H. Algoet, Y.M. Suhov, A.J. Wyner (1998), Nonparametric entropy estimation for stationary processes and random fields, with applications to English text, IEEE Transactions on Information Theory, 44 (3), 1319-1327. doi: https://doi.org/10.1109/18.669425
- G. Lu (2018), A refined upper bound for entropy, University Politehnica of Bucharest Scientific Bulletin, Series A: Applied Mathematics and Physics, 80 (2).
- G. Lu (2018), New refinements of Jensen’s inequality and entropy upper bounds, Journal of Mathematical Inequalities, 12 (2), 403-421.
- A. Mehrpooya, Y. Sayyari, M.R. Molaei (2019), Algebraic and Shannon entropies of commutative hypergroups and their connection with information and permutation entropies and with calculation of entropy for chemical algebras, Soft Computing, 23 (24), 13035-13053. doi: https://doi.org/10.1007/s00500-019-04314-7
- L. Paninski (2003), Estimation of entropy and mutual information, Neural Computation, 15 (6), 1191-1253. doi: https://doi.org/10.1162/089976603321780272
- Y. Sayyari (2020), New bounds for entropy of information sources, Wavelet and Linear Algebra, 7 (2), 1-9.
- Y. Sayyari (2020), New entropy bounds via uniformly convex functions, Chaos, Solitons and Fractals, 141. doi: https://doi.org/10.1016/j.chaos.2020.110360
- Y. Sayyari (2021), An improvement of the upper bound on the entropy of information sources, Journal of Mathematical Extension, 15.
- Y. Sayyari, M.R. Molaei, S.M. Moghayer (2015), Entropy of continuous maps on quasi-metric spaces, Journal of Advanced Research in Dynamical and Control Systems, 7 (4), 1-10.
- S. Simic (2008), On a global bound for Jensen’s inequality, J. Math. Anal. Appl., 343, 414-419. doi: https://doi.org/10.1016/j.jmaa.2008.01.060
- S. Simic (2009), Jensen’s inequality and new entropy bounds, Applied Mathematics Letters, 22 (8), 1262-1265. doi: https://doi.org/10.1016/j.aml.2009.01.040
- D.K. Singh, P. Dass (2018), On a functional equation related to some entropies in information theory, Journal of Discrete Mathematical Sciences and Cryptography, 21 (3), 713-726. doi: https://doi.org/10.1080/09720529.2018.1445809
- N. Tapus, P.G. Popescu (2012), A new entropy upper bound, Applied Mathematics Letters, 25 (11), 1887-1890. doi: https://doi.org/10.1016/j.aml.2012.02.056
- P. Walters (2000), An Introduction to Ergodic Theory, Springer-Verlag, New York.