Abstract
In this article, we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using the multilevel Monte Carlo (MLMC) method. A sequential Monte Carlo version of the approach is developed, and it is shown, under some assumptions, that for a given level of mean square error this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
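As background (a standard MLMC identity, not a statement of this paper's specific construction; the notation η_l for the level-l ABC approximation is assumed here), the multilevel approach rests on telescoping the most accurate approximation into increments that can be estimated independently:

```latex
% Standard MLMC telescoping identity; \eta_l denotes the level-l ABC
% approximation and \varphi a test function (placeholder notation).
\mathbb{E}_{\eta_L}[\varphi]
  = \mathbb{E}_{\eta_0}[\varphi]
  + \sum_{l=1}^{L}\Big(\mathbb{E}_{\eta_l}[\varphi]
  - \mathbb{E}_{\eta_{l-1}}[\varphi]\Big).
```

When the increments have variances that decay in l, most samples can be allocated to the cheap coarse levels, which is the source of the cost reduction relative to i.i.d. sampling from the level-L approximation.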
Acknowledgements
AJ, SJ, DN, and CS were all supported by grant number R-069-000-074-646, Operations research cluster funding, NUS. AJ and RT were additionally supported by KAUST CRG4 Award Ref: 2584.
Appendix A: Proof of Proposition 3.1
Proof.
We give the proof in the case n = 1; the general case is the same, up to minor notational complications. We have
(5)
We will deal with the two expressions on the R.H.S. of Equation (5) separately. Throughout, C is a constant that does not depend on the level index l, but whose value may change upon each appearance.
First Term on the R.H.S. of Equation (5)
We will show that
and that
. We start with the first task. We have
where we have set
. Then elementary calculations yield
Now as
and as
we have
(6)
Now
Now
so that
We will now show that is lower bounded uniformly in l, which will show that
.
Then
as
and
. So we have that
and
(7)
Combining Equations (6) and (7) yields
(8)
Second Term on the R.H.S. of Equation (5)
Clearly
(9)
We first deal with the numerator on the R.H.S.:
As
we have
Therefore one has
Using almost the same calculation as for showing , we have
and so
(10)
Returning to Equation (5) and noting Equation (9), one applies the triangle inequality and combines Equations (7) and (10) to complete the proof. □
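The triangle inequality step invoked at the end of the proof is the standard Minkowski inequality in L2; as a generic sketch with placeholder quantities A and B (not the paper's notation), it combines the bounds on the two error terms as follows:

```latex
% Generic L2 triangle (Minkowski) inequality used to combine the
% two error bounds; A and B are placeholder random quantities.
\mathbb{E}\big[(A+B)^2\big]^{1/2}
  \le \mathbb{E}\big[A^2\big]^{1/2} + \mathbb{E}\big[B^2\big]^{1/2}.
```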
Appendix B: Proof of Proposition 3.2
Proof.
We will show that the expected cost of sampling a given full conditional is . Throughout, C is a constant that does not depend on the level index l or on i, but whose value may change upon each appearance.
It is easily shown that
and thus that the probability of acceptance in the rejection scheme is
The expected number of simulations is then the inverse of this acceptance probability. Clearly
and as
is compact and by assumption
. Note as
the expected number of simulations to sample the full conditional is at most
which completes the proof. □
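The expected-cost bound above rests on a standard fact about rejection sampling (a generic derivation; p denotes the per-trial acceptance probability and is placeholder notation, not the paper's):

```latex
% If each proposal is accepted independently with probability p > 0,
% the number of trials N until first acceptance is geometric:
\mathbb{P}(N = k) = (1-p)^{k-1}p, \qquad
\mathbb{E}[N] = \sum_{k \ge 1} k\,(1-p)^{k-1}p = \frac{1}{p}.
% A lower bound p \ge c > 0, uniform in l and i, therefore gives
% \mathbb{E}[N] \le 1/c, i.e. an O(1) expected cost per sample.
```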