Sequential Monte Carlo Samplers
Problem Set-Up
This is all taken from *Sequential Monte Carlo Samplers* (Del Moral, Doucet and Jasra, 2006). We have a collection of target distributions $\{\pi_n\}_{n=1}^{P}$ on a common measurable space $(E, \mathcal{E})$, each known only up to a normalizing constant:
$$\pi_n(x) = \frac{\gamma_n(x)}{Z_n},$$
where $\gamma_n$ can be evaluated pointwise but $Z_n$ is unknown. The aim is to sample from each $\pi_n$ in sequence.
Importance Sampling
We write target expectations using the Importance Sampling (IS) trick for a proposal density $q$ whose support contains that of $\gamma$:
$$\mathbb{E}_{\pi}[f(X)] = \frac{\int f(x)\, w(x)\, q(x)\, dx}{\int w(x)\, q(x)\, dx}, \qquad w(x) = \frac{\gamma(x)}{q(x)},$$
so that both integrals can be estimated with samples from $q$, and the unknown normalizing constant $Z$ cancels in the ratio.
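As a quick numerical sketch of the self-normalized IS estimator (the densities and parameters below are illustrative, not from the notes):

```python
# Self-normalized importance sampling: estimate E_pi[f(X)] using only the
# unnormalized density gamma.  Target: standard normal via gamma(x) =
# exp(-x^2/2), so Z = sqrt(2*pi) is never needed; proposal q = N(0, 2^2).
import numpy as np

rng = np.random.default_rng(0)

def gamma(x):                      # unnormalized target density
    return np.exp(-0.5 * x**2)

def q_pdf(x):                      # proposal density, N(0, 2^2)
    return np.exp(-0.5 * (x / 2.0)**2) / (2.0 * np.sqrt(2 * np.pi))

N = 100_000
x = rng.normal(0.0, 2.0, size=N)   # samples from the proposal q
w = gamma(x) / q_pdf(x)            # unnormalized IS weights w = gamma / q
W = w / w.sum()                    # self-normalization cancels the unknown Z

est = np.sum(W * x**2)             # estimate of E_pi[X^2] = 1 for N(0,1)
print(est)
```

The same two lines (unnormalized weights, then normalization) reappear in every sampler below.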
Sequential Importance Sampling
In importance sampling, for each different target $\pi_n$ we would need to design a new proposal $q_n$ and draw a fresh set of samples. Sequential Importance Sampling (SIS) instead reuses the particles from the previous step:

- At time $n = 1$ our target is $\pi_1$ and we use an IS proposal $\eta_1$ which we choose to approximate $\pi_1$ well (often we choose $\eta_1 = \pi_1$ when it can be sampled directly). This means we sample particles $X_1^{(i)} \sim \eta_1$ and then compute the IS unnormalized weights $w_1(X_1^{(i)}) = \gamma_1(X_1^{(i)}) / \eta_1(X_1^{(i)})$.
- Suppose that at time $n - 1$ we had a set of particles $\{X_{n-1}^{(i)}\}_{i=1}^{N}$ sampled from $\eta_{n-1}$. Our target at time $n$ is $\pi_n$. In order to propose a new set of particles we use a Markov kernel $K_n(x_{n-1}, x_n)$, sampling $X_n^{(i)} \sim K_n(X_{n-1}^{(i)}, \cdot)$. We call the resulting distribution $\eta_n$. Notice that this distribution can be found using the property that a kernel operates on measures on the left:
$$\eta_n(x_n) = (\eta_{n-1} K_n)(x_n) = \int \eta_{n-1}(x_{n-1})\, K_n(x_{n-1}, x_n)\, dx_{n-1}.$$
Once we have sampled from the kernel to move the particles forward, we need to compute the weights $w_n(X_n^{(i)}) = \gamma_n(X_n^{(i)}) / \eta_n(X_n^{(i)})$ to account for the discrepancy of sampling from $\eta_n$ rather than $\pi_n$. However, notice we can only do this if we can evaluate $\eta_n$.

In general, we cannot evaluate $\eta_n$: unrolling the recursion gives
$$\eta_n(x_n) = \int \eta_1(x_1) \prod_{k=2}^{n} K_k(x_{k-1}, x_k)\, dx_{1:n-1},$$
an $(n-1)$-dimensional integral that is intractable for all but very special choices of $\eta_1$ and $K_k$.
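One of the rare tractable cases: if $\eta_1$ is Gaussian and each $K_n$ is a Gaussian random walk, then $\eta_n$ stays Gaussian and can be evaluated in closed form. A small illustrative check (parameters are mine, not from the notes):

```python
# SIS marginal eta_n = eta_{n-1} K_n: with eta_1 = N(0, 1) and random-walk
# kernels K_n(x, .) = N(x, tau^2), each step adds independent Gaussian noise,
# so eta_n = N(0, 1 + (n - 1) tau^2) in closed form.
import numpy as np

rng = np.random.default_rng(1)
tau, N, n_steps = 0.5, 200_000, 5

x = rng.normal(0.0, 1.0, size=N)            # X_1 ~ eta_1
for _ in range(n_steps - 1):                # X_n ~ K_n(X_{n-1}, .)
    x = x + rng.normal(0.0, tau, size=N)

var_empirical = x.var()
var_closed_form = 1.0 + (n_steps - 1) * tau**2   # = 2.0 here
print(var_empirical, var_closed_form)
```

For non-Gaussian kernels no such closed form exists, which is exactly the obstruction SMC samplers remove.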
SMC sampler
Since the problem is integration with respect to the past states $x_{1:n-1}$, the SMC sampler sidesteps it by performing importance sampling on the path space. Introduce backward Markov kernels $L_k(x_{k+1}, x_k)$ and define the extended target
$$\tilde{\pi}_n(x_{1:n}) = \pi_n(x_n) \prod_{k=1}^{n-1} L_k(x_{k+1}, x_k), \qquad \tilde{\gamma}_n(x_{1:n}) = \gamma_n(x_n) \prod_{k=1}^{n-1} L_k(x_{k+1}, x_k),$$
which admits $\pi_n$ as its marginal in $x_n$. The path-space proposal $\eta_n(x_{1:n}) = \eta_1(x_1) \prod_{k=2}^{n} K_k(x_{k-1}, x_k)$ can be evaluated pointwise, and the unnormalized weights factorize as
$$w_n(x_{1:n}) = \frac{\tilde{\gamma}_n(x_{1:n})}{\eta_n(x_{1:n})} = w_{n-1}(x_{1:n-1})\, \tilde{w}_n(x_{n-1}, x_n),$$
where we have defined the incremental weight as
$$\tilde{w}_n(x_{n-1}, x_n) = \frac{\gamma_n(x_n)\, L_{n-1}(x_n, x_{n-1})}{\gamma_{n-1}(x_{n-1})\, K_n(x_{n-1}, x_n)}.$$
To summarize:
- Importance Sampling at time $n$ targets $\pi_n$. It samples particles afresh from a proposal $q_n$ and computes weights afresh as $w_n = \gamma_n / q_n$. For this to work, however, we need to be able to find a good proposal $q_n$ for every target, which is in general very hard.
- Sequential Importance Sampling also targets $\pi_n$ at time $n$. It tries to fix the problem of finding $q_n$ by using a local Markov kernel $K_n$ to sample a new set of particles starting from $\{X_{n-1}^{(i)}\}_{i=1}^{N}$. This, at time $n$, gives rise to the following proposal distribution:
$$\eta_n(x_n) = (\eta_{n-1} K_n)(x_n) = \int \eta_{n-1}(x_{n-1})\, K_n(x_{n-1}, x_n)\, dx_{n-1}.$$
We can now sample from $\eta_n$ but we cannot evaluate it due to the integral with respect to $x_{n-1}$. Evaluating $\eta_n$ is needed to compute the IS weights $w_n = \gamma_n / \eta_n$.
- SMC Samplers overcome the problem of integrating over $x_{1:n-1}$ by working with the integrand directly. The proposal and the target distributions are then $\eta_n(x_{1:n}) = \eta_1(x_1) \prod_{k=2}^{n} K_k(x_{k-1}, x_k)$ and $\tilde{\pi}_n(x_{1:n}) = \pi_n(x_n) \prod_{k=1}^{n-1} L_k(x_{k+1}, x_k)$. Notice the difference with respect to IS and SIS: in IS and SIS we get new particles at each time step, that is at time step $n-1$ we have $\{X_{n-1}^{(i)}\}$ and at time step $n$ we have $\{X_n^{(i)}\}$. In an SMC sampler, instead, we extend the particles at time $n$ by sampling $X_n^{(i)} \sim K_n(X_{n-1}^{(i)}, \cdot)$ and then appending this to the current particles to obtain $X_{1:n}^{(i)} = (X_{1:n-1}^{(i)}, X_n^{(i)})$. Since we have appended to the previous particles, we need to update the weights, and these are updated using an incremental weight:
$$w_n(x_{1:n}) = w_{n-1}(x_{1:n-1})\, \tilde{w}_n(x_{n-1}, x_n), \qquad \tilde{w}_n(x_{n-1}, x_n) = \frac{\gamma_n(x_n)\, L_{n-1}(x_n, x_{n-1})}{\gamma_{n-1}(x_{n-1})\, K_n(x_{n-1}, x_n)}.$$
Importantly, this requires us to introduce backward kernels $L_{n-1}$, which essentially allow us to approach the problem from an auxiliary-variable perspective.
Since the variance of the weights increases as $n$ grows (weight degeneracy), the particles are resampled according to their normalized weights whenever the effective sample size $\mathrm{ESS} = 1 / \sum_{i=1}^{N} (W_n^{(i)})^2$ falls below a threshold such as $N/2$, after which the weights are reset to be uniform.
The algorithm is summarized below.

- At $n = 1$: sample $X_1^{(i)} \sim \eta_1$ and set $w_1^{(i)} = \gamma_1(X_1^{(i)}) / \eta_1(X_1^{(i)})$ for $i = 1, \dots, N$.
- At each $n \geq 2$: if the ESS of the normalized weights falls below the threshold, resample the particles and reset the weights to $1/N$.
- Sample $X_n^{(i)} \sim K_n(X_{n-1}^{(i)}, \cdot)$ and append it to the path $X_{1:n-1}^{(i)}$.
- Update the weights as $w_n^{(i)} = w_{n-1}^{(i)}\, \tilde{w}_n(X_{n-1}^{(i)}, X_n^{(i)})$, then increment $n$ and repeat.
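To make this concrete, here is a minimal Python sketch of an SMC sampler tempering from $N(0,1)$ to $N(3,1)$. It assumes a standard choice not derived in these notes: each $K_n$ is a $\pi_n$-invariant random-walk Metropolis kernel and $L_{n-1}$ is its time reversal, for which the incremental weight simplifies to $\tilde{w}_n = \gamma_n(x_{n-1}) / \gamma_{n-1}(x_{n-1})$ and can therefore be computed before the move. All parameters (schedule, step size, ESS threshold) are illustrative.

```python
# Minimal SMC sampler: temper gamma_beta(x) = N(0,1)^(1-beta) * N(3,1)^beta
# from beta = 0 to beta = 1, with pi_beta-invariant Metropolis moves.
import numpy as np

rng = np.random.default_rng(2)
N = 5_000
betas = np.linspace(0.0, 1.0, 11)          # tempering schedule beta_1..beta_P

def log_gamma(x, beta):                    # log unnormalized tempered target
    return -0.5 * (beta * (x - 3.0)**2 + (1.0 - beta) * x**2)

def rwm_step(x, beta, scale=1.0):          # one random-walk Metropolis step
    prop = x + rng.normal(0.0, scale, size=x.shape)
    log_acc = log_gamma(prop, beta) - log_gamma(x, beta)
    accept = np.log(rng.uniform(size=x.shape)) < log_acc
    return np.where(accept, prop, x)

x = rng.normal(0.0, 1.0, size=N)           # X_1 ~ eta_1 = pi_1 (beta = 0)
logw = np.zeros(N)                         # log-weights, uniform at n = 1

for b_prev, b in zip(betas[:-1], betas[1:]):
    logw += log_gamma(x, b) - log_gamma(x, b_prev)   # incremental weight
    W = np.exp(logw - logw.max()); W /= W.sum()
    if 1.0 / np.sum(W**2) < N / 2:         # resample when ESS drops
        x = x[rng.choice(N, size=N, p=W)]
        logw = np.zeros(N)
    for _ in range(5):                     # move: 5 MCMC steps targeting pi_n
        x = rwm_step(x, b)

W = np.exp(logw - logw.max()); W /= W.sum()
mean = np.sum(W * x)                       # weighted estimate of E_{pi_P}[X]
print(mean)                                # final target N(3,1): mean near 3
```

Working in log-weights avoids numerical underflow when the incremental weights multiply over many steps.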
A few notes:
- The particle estimate of the $n$-th target is
$$\pi_n^N(dx) = \sum_{i=1}^{N} W_n^{(i)}\, \delta_{X_n^{(i)}}(dx), \qquad W_n^{(i)} = \frac{w_n^{(i)}}{\sum_{j=1}^{N} w_n^{(j)}}.$$
- It is helpful to remember the distributions of $X_n$ and $X_{1:n}$ (using sloppy notation):
$$X_n \sim \eta_n = \eta_{n-1} K_n, \qquad X_{1:n} \sim \eta_n(x_{1:n}) = \eta_1(x_1) \prod_{k=2}^{n} K_k(x_{k-1}, x_k).$$
- The optimal backward kernel takes us back to IS on $\eta_n$:
$$L_{n-1}^{\mathrm{opt}}(x_n, x_{n-1}) = \frac{\eta_{n-1}(x_{n-1})\, K_n(x_{n-1}, x_n)}{\eta_n(x_n)},$$
for which the path weight collapses to the marginal IS weight $w_n = \gamma_n(x_n) / \eta_n(x_n)$. It is difficult to use this kernel as it relies on $\eta_{n-1}$ and $\eta_n$, which are intractable (indeed their intractability is the reason why we went from SIS to SMC samplers).
- Sub-optimal kernel: substitute $\pi_{n-1}$ for $\eta_{n-1}$. This is motivated by the fact that, if $\eta_{n-1}$ is a good proposal for $\pi_{n-1}$, then they should be sufficiently close. First, rewrite the optimal kernel:
$$L_{n-1}^{\mathrm{opt}}(x_n, x_{n-1}) = \frac{\eta_{n-1}(x_{n-1})\, K_n(x_{n-1}, x_n)}{\int \eta_{n-1}(x'_{n-1})\, K_n(x'_{n-1}, x_n)\, dx'_{n-1}}.$$
Now substitute $\pi_{n-1}$ for $\eta_{n-1}$ in the numerator and the denominator respectively:
$$L_{n-1}(x_n, x_{n-1}) = \frac{\pi_{n-1}(x_{n-1})\, K_n(x_{n-1}, x_n)}{\int \pi_{n-1}(x'_{n-1})\, K_n(x'_{n-1}, x_n)\, dx'_{n-1}}.$$
The incremental weights become
$$\tilde{w}_n(x_{n-1}, x_n) = \frac{\gamma_n(x_n)}{\int \gamma_{n-1}(x'_{n-1})\, K_n(x'_{n-1}, x_n)\, dx'_{n-1}},$$
where the unknown normalizing constant $Z_{n-1}$ cancels between numerator and denominator of $\pi_{n-1}$.
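The collapse of the path weight under the optimal backward kernel can be checked exactly on a finite state space, where all the integrals become sums (the numbers below are illustrative):

```python
# On a 3-state space, check that the optimal backward kernel
# L_1(x_2, x_1) = eta_1(x_1) K_2(x_1, x_2) / eta_2(x_2) makes the path
# weight w_1 * wtilde_2 equal the marginal IS weight gamma_2 / eta_2
# for every pair (x_1, x_2).
import numpy as np

eta1 = np.array([0.5, 0.3, 0.2])                  # proposal at n = 1
K2 = np.array([[0.6, 0.3, 0.1],                   # forward kernel, rows sum to 1
               [0.2, 0.5, 0.3],
               [0.1, 0.2, 0.7]])
gamma1 = np.array([1.0, 2.0, 0.5])                # unnormalized targets
gamma2 = np.array([0.8, 1.5, 2.0])

eta2 = eta1 @ K2                                  # eta_2 = eta_1 K_2
L1 = (eta1[None, :] * K2.T) / eta2[:, None]       # L1[x2, x1], rows sum to 1

w1 = gamma1 / eta1                                # IS weights at n = 1
# wtilde_2(x1, x2) = gamma2(x2) L1(x2, x1) / (gamma1(x1) K2(x1, x2))
wtilde = gamma2[None, :] * L1.T / (gamma1[:, None] * K2)
w2 = w1[:, None] * wtilde                         # path weight, indexed [x1, x2]
print(w2)                                         # each column is constant,
print(gamma2 / eta2)                              # equal to gamma_2 / eta_2
```

The fact that `w2` does not depend on `x1` is exactly the statement that the optimal kernel reduces SMC to plain IS on $\eta_n$.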
IS Measure Theory
Suppose $\pi$ and $q$ are probability measures on $(E, \mathcal{E})$ with $\pi$ absolutely continuous with respect to $q$ ($\pi \ll q$). Then for any integrable $f$ the Radon–Nikodym theorem gives
$$\mathbb{E}_{\pi}[f] = \int_E f\, d\pi = \int_E f\, \frac{d\pi}{dq}\, dq.$$

Now suppose you have samples $X^{(1)}, \dots, X^{(N)} \sim q$. The IS estimator is
$$\mathbb{E}_{\pi}[f] \approx \frac{1}{N} \sum_{i=1}^{N} f(X^{(i)})\, \frac{d\pi}{dq}(X^{(i)}),$$
where, when both measures admit densities with respect to a common dominating measure, the Radon–Nikodym derivative is just the ratio of those densities.
SIS Proposal
Now let $\eta_{n-1}$ be a measure on $(E, \mathcal{E})$ and $K_n$ a Markov kernel. The SIS proposal is the measure obtained by letting the kernel act on $\eta_{n-1}$ on the left:
$$\eta_n(A) = (\eta_{n-1} K_n)(A) = \int_E \eta_{n-1}(dx_{n-1})\, K_n(x_{n-1}, A), \qquad A \in \mathcal{E}.$$
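On a finite space the action of a kernel on a measure on the left is just a row vector times a stochastic matrix; a quick illustrative check:

```python
# eta_2 = eta_1 K_2 on a 3-state space: the new marginal is the
# vector-matrix product of the old marginal with the transition matrix.
import numpy as np

eta1 = np.array([0.2, 0.5, 0.3])          # measure at n = 1
K2 = np.array([[0.7, 0.2, 0.1],           # Markov kernel, rows sum to 1
               [0.3, 0.4, 0.3],
               [0.1, 0.1, 0.8]])

eta2 = eta1 @ K2                          # kernel acts on the left
print(eta2, eta2.sum())                   # still a probability measure
```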
SMC Proposal
The proposal in the SMC sampler is not given by the marginal $\eta_{n-1} K_n$ but by the joint measure on the product space $(E^n, \mathcal{E}^{\otimes n})$:
$$\eta_n(dx_{1:n}) = \eta_1(dx_1) \prod_{k=2}^{n} K_k(x_{k-1}, dx_k).$$
SMC Steps
- Step $n = 1$: Our target is $\pi_1 = \gamma_1 / Z_1$ and the proposal $\eta_1$ is given. Sample $X_1^{(i)} \sim \eta_1$. Weights are the RN-derivative
$$w_1(x_1) = \frac{d\pi_1}{d\eta_1}(x_1) \propto \frac{\gamma_1(x_1)}{\eta_1(x_1)}.$$
- Step $n = 2$, moving: Move particles forward using the kernel $K_2$. We sample $X_2^{(i)} \sim K_2(X_1^{(i)}, \cdot)$. Marginally, each new particle is distributed as
$$\eta_2(A) = (\eta_1 K_2)(A) = \int \eta_1(dx_1)\, K_2(x_1, A),$$
which can be written as $\eta_2(dx_2) = \left[\int \eta_1(x_1)\, K_2(x_1, x_2)\, dx_1\right] dx_2$, where we can see that the density of $\eta_2$ is $\eta_2(x_2) = \int \eta_1(x_1)\, K_2(x_1, x_2)\, dx_1$. In SMC we then append $X_2^{(i)}$ to $X_1^{(i)}$ to get $X_{1:2}^{(i)} = (X_1^{(i)}, X_2^{(i)})$, so our aim is now to find a measure for it.

Define the joint proposal on $(E^2, \mathcal{E} \otimes \mathcal{E})$ as $\eta_2(dx_{1:2}) = \eta_1(dx_1)\, K_2(x_1, dx_2)$.

- Step $n = 2$, weighting: Target is the extended distribution $\tilde{\pi}_2(dx_{1:2}) = \pi_2(dx_2)\, L_1(x_2, dx_1)$. Perform importance sampling using $\eta_2(dx_{1:2})$, which is a product measure with density given by the product rule
$$\eta_2(x_{1:2}) = \eta_1(x_1)\, K_2(x_1, x_2).$$
Similarly, the extended target is a product measure with density
$$\tilde{\pi}_2(x_{1:2}) = \pi_2(x_2)\, L_1(x_2, x_1) \propto \gamma_2(x_2)\, L_1(x_2, x_1).$$
The weights are then given by
$$w_2(x_{1:2}) = \frac{\gamma_2(x_2)\, L_1(x_2, x_1)}{\eta_1(x_1)\, K_2(x_1, x_2)} = w_1(x_1)\, \tilde{w}_2(x_1, x_2), \qquad \tilde{w}_2(x_1, x_2) = \frac{\gamma_2(x_2)\, L_1(x_2, x_1)}{\gamma_1(x_1)\, K_2(x_1, x_2)}.$$
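Steps 1–2 can be carried out end-to-end with Gaussians, where every density is tractable. In the sketch below (all parameters illustrative), $\pi_1 = N(0,1)$, $\pi_2 = N(1,1)$, $\eta_1 = N(0, 1.5^2)$, $K_2(x, \cdot) = N(x, 1)$, and the sub-optimal backward kernel $L_1(x_2, \cdot) = \pi_1 K_2 / (\pi_1 K_2)(x_2)$ works out to $N(x_2/2,\, 1/2)$ by completing the square:

```python
# Two-step SMC weight w_2 = gamma_2(x_2) L_1(x_2, x_1) / (eta_1(x_1) K_2(x_1, x_2))
# evaluated exactly with Gaussian densities; the self-normalized estimate
# should recover E_{pi_2}[X] = 1 for pi_2 = N(1, 1).
import numpy as np

rng = np.random.default_rng(3)

def npdf(x, mu, var):                     # Gaussian density
    return np.exp(-0.5 * (x - mu)**2 / var) / np.sqrt(2 * np.pi * var)

N = 200_000
x1 = rng.normal(0.0, 1.5, size=N)         # X_1 ~ eta_1 = N(0, 1.5^2)
x2 = rng.normal(x1, 1.0)                  # X_2 ~ K_2(X_1, .) = N(X_1, 1)

gamma2 = np.exp(-0.5 * (x2 - 1.0)**2)     # unnormalized pi_2 = N(1, 1)
L1 = npdf(x1, x2 / 2.0, 0.5)              # sub-optimal backward kernel density
w2 = gamma2 * L1 / (npdf(x1, 0.0, 1.5**2) * npdf(x2, x1, 1.0))

W = w2 / w2.sum()                         # self-normalize the path weights
est = np.sum(W * x2)                      # estimate of E_{pi_2}[X] = 1
print(est)
```

Any valid Markov kernel $L_1$ would give a consistent estimator here; the choice of $L_1$ only affects the variance of the weights.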