Abstract
In this paper, we adopt a Bayesian point of view, based on Gaussian processes, for classifying high-dimensional data. Since computing the exact marginal likelihood remains difficult, if not impossible, for discrete likelihoods and high-dimensional inputs, we introduce two methods to improve the efficiency and scalability of Gaussian process classification: a scalable Laplace approximation and a scalable expectation propagation scheme, combined with an embedding for dimension reduction. The proposed methods jointly handle multimodal posteriors and improve prediction quality. Numerical experiments on simulated and real data compare them with alternative Bayesian and non-Bayesian predictors. In particular, this comparison confirms that the ability of our methods to estimate the marginal likelihood efficiently yields better predictions, especially for complex tasks.