Generalizations

Importance Sampling ABC

Consider again the Soft-ABC algorithm. Recall that we compute unnormalized weights using a kernel and then normalize them, much as in importance sampling. Indeed, it turns out we can generalize Soft-ABC to something called Importance Sampling ABC, or IS-ABC.

First of all, we need a proposal distribution. Usually one chooses $q(\theta, y) = p(y \mid \theta)\, q(\theta)$ for some distribution $q(\theta)$. Then the unnormalized importance sampling weights are given by the ratio of the (unnormalized) augmented ABC posterior to the proposal distribution:
$$\tilde{w}(\theta, y) = \frac{\tilde{p}_\epsilon(\theta, y \mid y_o)}{q(\theta, y)} = \frac{\tilde{p}_\epsilon(y_o \mid y)\, p(y \mid \theta)\, p(\theta)}{p(y \mid \theta)\, q(\theta)} = \frac{\tilde{p}_\epsilon(y_o \mid y)\, p(\theta)}{q(\theta)}.$$
The IS-ABC algorithm is given below. Importantly, notice how we now sample parameters from $q(\theta)$ rather than from the prior $p(\theta)$.

IS-ABC - Importance Sampling ABC
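As a concrete illustration, here is a minimal Python sketch of IS-ABC on a hypothetical one-dimensional toy problem. The Gaussian prior, simulator, proposal, and kernel bandwidth below are all assumptions made for this example, not part of the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (assumptions for illustration):
#   prior:      theta ~ N(0, 2^2)
#   simulator:  y | theta ~ N(theta, 1)
#   proposal:   q(theta) = N(1, 1)
#   kernel:     Gaussian ABC kernel with bandwidth eps
y_obs, eps = 1.5, 0.5

def prior_logpdf(theta):
    return -0.5 * (theta / 2.0) ** 2 - np.log(2.0) - 0.5 * np.log(2 * np.pi)

def proposal_logpdf(theta):
    return -0.5 * (theta - 1.0) ** 2 - 0.5 * np.log(2 * np.pi)

def kernel_logpdf(y):
    # log of the (unnormalized) Gaussian kernel; its normalizing
    # constant cancels when the weights are normalized
    return -0.5 * ((y - y_obs) / eps) ** 2

def is_abc(n):
    theta = 1.0 + rng.standard_normal(n)    # theta_i ~ q(theta)
    y = theta + rng.standard_normal(n)      # y_i ~ p(y | theta_i)
    # log w_i = log kernel(y_i) + log p(theta_i) - log q(theta_i)
    log_w = kernel_logpdf(y) + prior_logpdf(theta) - proposal_logpdf(theta)
    w = np.exp(log_w - log_w.max())         # stabilize before normalizing
    return theta, w / w.sum()

theta, w = is_abc(20_000)
posterior_mean = np.sum(w * theta)          # self-normalized IS estimate
```

In this conjugate setup the ABC posterior is itself Gaussian with mean $1.5 \cdot 4 / (4 + 1 + \epsilon^2) \approx 1.14$, so the weighted estimate should land near that value.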

It is immediately clear that Soft-ABC is just a special case of IS-ABC where the proposal distribution is $q(\theta, y) = p(y \mid \theta)\, p(\theta)$, i.e. where $q(\theta) = p(\theta)$: the prior factor cancels and the weights reduce to the kernel values $\tilde{p}_\epsilon(y_o \mid y)$.

You can play around with IS-ABC applied to the Two Moons example here. Notice that it may take a while to run.

Generalized Rejection ABC

One can generalize Rejection-ABC to a likelihood-free rejection sampler. The target distribution is still the (unnormalized) augmented ABC posterior
$$\tilde{p}_\epsilon(\theta, y \mid y_o) = \tilde{p}_\epsilon(y_o \mid y)\, p(y \mid \theta)\, p(\theta),$$
and, as is typical in rejection sampling, we require a proposal density $q(\theta, y)$ satisfying the bound
$$\tilde{p}_\epsilon(\theta, y \mid y_o) \le M\, q(\theta, y) \quad \text{for some } M > 0.$$
One can then sample from $q(\theta, y)$ and accept draws with probability
$$\frac{\tilde{p}_\epsilon(\theta, y \mid y_o)}{M\, q(\theta, y)}.$$
In particular, here we choose the proposal distribution $q(\theta, y) = p(y \mid \theta)\, q(\theta)$ for some $q(\theta)$ satisfying the bound above. Of course, the reason for this choice of proposal distribution is that the intractable likelihood cancels out in the acceptance probability:
$$\frac{\tilde{p}_\epsilon(\theta, y \mid y_o)}{M\, q(\theta, y)} = \frac{\tilde{p}_\epsilon(y_o \mid y)\, p(y \mid \theta)\, p(\theta)}{M\, p(y \mid \theta)\, q(\theta)} = \frac{\tilde{p}_\epsilon(y_o \mid y)\, p(\theta)}{M\, q(\theta)}.$$
The generalized Rejection-ABC algorithm is given below.

Generalized Rejection ABC
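The following sketch shows the likelihood-free rejection sampler on the same hypothetical toy model as before (Gaussian prior and simulator are assumptions for illustration). For simplicity it takes $q(\theta) = p(\theta)$, so the prior-over-proposal ratio is $1$ and the acceptance probability reduces to the kernel value alone:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy model (assumptions for illustration):
#   prior: theta ~ N(0, 2^2);  simulator: y | theta ~ N(theta, 1)
# Here q(theta) = p(theta), so p(theta)/q(theta) = 1.
y_obs, eps = 1.5, 0.5

def kernel(y):
    # unnormalized Gaussian kernel, maximized (value 1) at y = y_obs
    return np.exp(-0.5 * ((y - y_obs) / eps) ** 2)

# With max_y kernel = 1 and max_theta p(theta)/q(theta) = 1,
# the optimal constant is M = 1.
M = 1.0

def generalized_rejection_abc(n):
    theta = 2.0 * rng.standard_normal(n)   # theta ~ q(theta) = N(0, 2^2)
    y = theta + rng.standard_normal(n)     # y ~ p(y | theta)
    # accept with probability kernel(y) p(theta) / (M q(theta)) = kernel(y)
    accept = rng.random(n) < kernel(y) / M
    return theta[accept]

samples = generalized_rejection_abc(50_000)
```

Only a fraction of the draws survive the accept step, which is the usual price of rejection sampling; the surviving `samples` are exact draws from the ABC posterior of this toy model.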

It is well known in rejection sampling that the optimal (i.e. smallest valid) value for the constant $M$ is
$$M^* = \max_{\theta, y} \frac{\tilde{p}_\epsilon(\theta, y \mid y_o)}{p(y \mid \theta)\, q(\theta)} = \max_{\theta, y} \frac{\tilde{p}_\epsilon(y_o \mid y)\, p(\theta)}{q(\theta)} = \left(\max_{y} \tilde{p}_\epsilon(y_o \mid y)\right) \left(\max_{\theta} \frac{p(\theta)}{q(\theta)}\right),$$
where the last equality holds because the objective factorizes into a function of $y$ alone times a function of $\theta$ alone.
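To make the factorization of the maximum concrete, here is a small numerical check on a hypothetical instance (a Gaussian kernel, an $N(0, 2^2)$ prior, and an $N(0, 3^2)$ proposal are all assumptions chosen for illustration). The joint grid maximum over $(\theta, y)$ coincides with the product of the two per-variable maxima:

```python
import numpy as np

# Hypothetical concrete instance (assumptions for illustration):
#   Gaussian kernel with eps = 0.5 and observed data y_o = 1.5,
#   prior p(theta) = N(0, 2^2), proposal q(theta) = N(0, 3^2).
eps, y_o = 0.5, 1.5

def kernel(y):
    return np.exp(-0.5 * ((y - y_o) / eps) ** 2)

def prior_over_q(theta):
    # ratio of N(0, 2^2) and N(0, 3^2) densities, maximized at theta = 0
    return (3.0 / 2.0) * np.exp(-0.5 * theta**2 * (1 / 4.0 - 1 / 9.0))

ys = np.linspace(-10.0, 10.0, 2001)
thetas = np.linspace(-10.0, 10.0, 2001)

# Brute-force joint maximization over the (y, theta) grid ...
joint_max = (kernel(ys)[:, None] * prior_over_q(thetas)[None, :]).max()
# ... equals the product of the per-variable maxima, since the
# objective is a function of y times a function of theta.
M_opt = kernel(ys).max() * prior_over_q(thetas).max()
```

Here the kernel peaks at $1$ (at $y = y_o$) and the density ratio peaks at $3/2$ (at $\theta = 0$), so both computations give $M^* = 1.5$.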
