Abstract
Divergence functions are widely used discrepancy measures. Even though they are not true distances, we can use them to measure how separated two points are. Curiously enough, when they are applied to random variables they lead to a notion of best predictor that coincides with the usual best predictor in Euclidean distance. From a divergence function we can derive a Riemannian metric, which leads to a true distance between random variables, and the best predictors in this metric do not coincide with their Euclidean counterparts. It is the purpose of this paper to explicitly determine the best predictors in the derived metric, compare them to the estimators in divergence, and to obtain the sample estimators of the best predictors. Along the way we obtain results that relate approximations in divergence to approximations in the metric derived from it.
Acknowledgments
I want to thank Frank Nielsen for his comments and suggestions on the first draft of this manuscript. I want to thank as well the reviewer for her/his comments, which led not only to improvements in the manuscript, but also to Section 6.2 and the references cited there.
Notes
1. Thanks to Frank Nielsen for the remark.