Abstract
In this paper, we extend the correspondence between Bayesian estimation and optimal smoothing in a reproducing kernel Hilbert space (RKHS) by adding convex constraints to the problem. Through a sequence of approximating Hilbertian subspaces and a discretised model, we prove that the maximum a posteriori (MAP) estimator of the posterior distribution is exactly the optimal constrained smoothing function in the RKHS. This paper can be read as a generalisation of Kimeldorf and Wahba (1970), where the optimal smoothing solution is shown to be the mean of the posterior distribution. Studies on synthetic and real data confirm the correspondence established in this paper.
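As a minimal numerical sketch of the discretised correspondence described above (not the paper's own construction): assuming a Gaussian observation model y = f + noise, a Gaussian prior with covariance given by an RBF kernel on a grid, and a nonnegativity constraint as the convex constraint, the negative log posterior and the penalised RKHS smoothing criterion are the same quadratic objective up to a constant factor, so their constrained minimisers coincide. All kernel parameters and the constraint choice below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Grid discretisation of the input space and an RBF kernel (illustrative choices).
x = np.linspace(0.0, 1.0, 20)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1 ** 2)
K += 1e-6 * np.eye(len(x))            # jitter for numerical stability
K_inv = np.linalg.inv(K)

# Noisy observations of a nonnegative function.
sigma = 0.1
f_true = np.maximum(np.sin(2 * np.pi * x), 0.0)
y = f_true + sigma * rng.normal(size=len(x))

# Negative log posterior of the discretised Gaussian model (up to constants):
#   1/(2 sigma^2) ||y - f||^2 + 1/2 f^T K^{-1} f
def neg_log_post(f):
    r = y - f
    return r @ r / (2.0 * sigma ** 2) + 0.5 * f @ K_inv @ f

# Penalised RKHS smoothing criterion: ||y - f||^2 + lambda ||f||_H^2
# with lambda = sigma^2; this is 2 sigma^2 times neg_log_post.
def smoothing_obj(f):
    r = y - f
    return r @ r + sigma ** 2 * (f @ K_inv @ f)

# Convex constraint: f >= 0 at every grid point.
bounds = [(0.0, None)] * len(x)
f0 = np.zeros(len(x))
map_est = minimize(neg_log_post, f0, bounds=bounds, method="L-BFGS-B").x
smooth_est = minimize(smoothing_obj, f0, bounds=bounds, method="L-BFGS-B").x

# Both constrained minimisers agree: the MAP is the constrained smoother.
print(np.max(np.abs(map_est - smooth_est)))
```

Since the two objectives differ only by a positive scaling, any constrained minimiser of one minimises the other; the numerical check above merely confirms this on the grid.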
Disclosure statement
No potential conflict of interest was reported by the author(s).