Towards SMC: Sequential Importance Sampling
Sequential Importance Sampling tutorial for Sequential Monte Carlo (SMC)
Review of Importance Sampling for Sequential Data
At time $t$:

- Sample $x_{1:t}^{(i)} \sim q_t(x_{1:t})$ from an importance distribution, for $i = 1, \dots, N$.
- Compute the unnormalized importance weights and normalize them to find the normalized importance weights:
$$w_t^{(i)} = \frac{\gamma_t(x_{1:t}^{(i)})}{q_t(x_{1:t}^{(i)})}, \qquad \widetilde{w}_t^{(i)} = \frac{w_t^{(i)}}{\sum_{j=1}^N w_t^{(j)}},$$
where $\gamma_t$ is the unnormalized target density.
- Use the normalized importance weights to approximate the expectation (a code sketch follows this list):
$$\mathbb{E}[\varphi(x_{1:t})] \approx \sum_{i=1}^N \widetilde{w}_t^{(i)} \, \varphi(x_{1:t}^{(i)}).$$
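To make the procedure concrete, here is a minimal sketch in Python for a toy model (the model is an assumption for illustration, not from the original): the unnormalized target $\gamma_t$ is a Gaussian random-walk prior, and the proposal $q_t$ draws each component i.i.d. from $\mathcal{N}(0, \sigma_q^2)$. Note that at every time $t$ the whole trajectory $x_{1:t}$ is re-sampled from scratch.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N, T = 1000, 20      # number of samples, time horizon
sigma_q = 2.0        # proposal std (arbitrary choice for this toy model)

def log_gamma(x):
    """Log unnormalized target: a Gaussian random-walk prior (toy assumption)."""
    lp = norm.logpdf(x[:, 0])                               # N(x_1; 0, 1)
    lp += norm.logpdf(x[:, 1:], loc=x[:, :-1]).sum(axis=1)  # prod_s N(x_s; x_{s-1}, 1)
    return lp

for t in range(1, T + 1):
    # Sample N whole trajectories x_{1:t}^{(i)} ~ q_t (i.i.d. Gaussian proposal)
    x = rng.normal(0.0, sigma_q, size=(N, t))
    # Unnormalized log-weights: log gamma_t - log q_t
    log_w = log_gamma(x) - norm.logpdf(x, scale=sigma_q).sum(axis=1)
    # Normalize the weights (subtract the max for numerical stability)
    w = np.exp(log_w - log_w.max())
    w_norm = w / w.sum()
    # Approximate E[x_t] under the target with the weighted samples
    est = np.sum(w_norm * x[:, -1])
```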
Sequential Importance Sampling
Sequential Importance Sampling differs from Importance Sampling for sequential data in two main ways.
- The importance distribution is autoregressive:
$$q_t(x_{1:t}) = q_{t-1}(x_{1:t-1}) \, q_t(x_t \mid x_{1:t-1}).$$
- Samples at time $t$ are found recursively from the samples at time $t-1$. Previously, at each time $t$ we sampled $x_{1:t}^{(i)} \sim q_t(x_{1:t})$; essentially, when sampling $x_{1:t}^{(i)}$, we were sampling every component from time $1$ to $t$. In Sequential Importance Sampling, instead, at each time step we sample $x_t^{(i)} \sim q_t(x_t \mid x_{1:t-1}^{(i)})$ and append this value to $x_{1:t-1}^{(i)}$. In other words, for each sample we draw only the new component $x_t^{(i)}$ rather than the whole history.
- Importance weights are also computed recursively. Basically, to obtain the next set of weights we multiply the previous weights by an incremental weight (see the sketch after this list):
$$w_t^{(i)} = w_{t-1}^{(i)} \cdot \frac{\gamma_t(x_{1:t}^{(i)})}{\gamma_{t-1}(x_{1:t-1}^{(i)}) \, q_t(x_t^{(i)} \mid x_{1:t-1}^{(i)})}.$$
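Below is a minimal SIS sketch for the same toy model as above (the random-walk target and Gaussian proposal are assumptions, not from the original): only the new component is sampled at each step, and the log-weights are updated recursively instead of being recomputed from the whole trajectory.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N, T = 1000, 20
sigma_q = 2.0   # proposal std (same toy model as above)

# t = 1: sample the first component and initialize the log-weights
x_prev = rng.normal(0.0, sigma_q, size=N)                      # x_1 ~ q_1
log_w = norm.logpdf(x_prev) - norm.logpdf(x_prev, scale=sigma_q)

for t in range(2, T + 1):
    # Sample ONLY the new component x_t ~ q_t(x_t | x_{1:t-1}) and append it
    x_new = rng.normal(x_prev, sigma_q)
    # Recursive weight update: multiply (add in log-space) the incremental
    # weight gamma_t / (gamma_{t-1} * q_t(x_t | x_{1:t-1}))
    log_w += norm.logpdf(x_new, loc=x_prev) \
           - norm.logpdf(x_new, loc=x_prev, scale=sigma_q)
    x_prev = x_new

w_norm = np.exp(log_w - log_w.max())
w_norm /= w_norm.sum()
est = np.sum(w_norm * x_prev)   # approximate E[x_T]
```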
SIS Issue: One issue with Sequential Importance Sampling is that, in practice, as $t$ grows all normalized weights tend to $0$ except for one large weight, which tends to $1$. The approximation then becomes quite poor, because it is essentially computed from a single sample (the sample carrying the non-degenerate weight). This effect is known as weight degeneracy. This issue is solved by Sequential Monte Carlo (SMC).
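A quick way to observe weight degeneracy numerically is to track the effective sample size $\mathrm{ESS} = 1 / \sum_i (\widetilde{w}_t^{(i)})^2$, which falls from $N$ (uniform weights) towards $1$ (a single dominant weight). The snippet below is only a toy illustration: the random log-weight increments are a stand-in for real incremental weights, not an actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 1000, 50

# Stand-in for SIS: log-weights accumulate independent random increments,
# so one trajectory's weight eventually dominates all the others.
log_w = np.zeros(N)
for t in range(1, T + 1):
    log_w += rng.normal(size=N)           # hypothetical log incremental weights
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)            # effective sample size in [1, N]
    if t % 10 == 0:
        print(f"t={t:2d}  max weight={w.max():.3f}  ESS={ess:7.1f}")
```

Running this, the maximum normalized weight climbs towards $1$ and the ESS collapses as $t$ grows, which is exactly the degeneracy that motivates the resampling step in SMC.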