In part the paper is interesting because it provides useful perspective on state-space modeling, filtering and estimation from the early linear/Gaussian days of Kalman filtering to the recent nonlinear/non-Gaussian days of particle filtering. There's also some interesting personal reflection. (Background: the paper is for a forthcoming Andrew Harvey Festschrift, and Neil was Andrew's student.)

But the paper's original contribution is even more interesting. It puts exponential smoothing in fresh and fascinating perspective, by considering it in a stochastic volatility (SV) environment.

As is well known, exponential smoothing (ES) is closely related to state-space models of unobserved components. In particular, ES is the MSE-optimal filter when the data-generating process is a latent random walk signal buried in white noise measurement error (Harvey's "local level" model). The optimal smoothing parameter, moreover, depends only on the signal/noise ratio (that is, the random walk error variance relative to the measurement error variance).
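To make the mapping concrete: the optimal smoothing parameter is the steady-state Kalman gain of the local level model, which solves alpha^2/(1 - alpha) = q for signal/noise ratio q. A minimal sketch (the function names `es_alpha` and `exp_smooth` are mine, purely for illustration):

```python
import numpy as np

def es_alpha(q):
    """MSE-optimal exponential-smoothing parameter for the local level
    model, as a function of the signal/noise ratio
    q = var(random walk shock) / var(measurement noise).
    It is the steady-state Kalman gain, solving alpha^2/(1-alpha) = q."""
    return (np.sqrt(q**2 + 4 * q) - q) / 2

def exp_smooth(y, alpha, m0=0.0):
    """Plain exponential smoothing: m_t = m_{t-1} + alpha*(y_t - m_{t-1})."""
    m, out = m0, []
    for obs in y:
        m = m + alpha * (obs - m)
        out.append(m)
    return np.array(out)

# Larger q (signal moves more relative to noise) => larger alpha,
# i.e. faster discounting of old observations.
print(round(es_alpha(1.0), 3))  # → 0.618, the golden-ratio case (sqrt(5)-1)/2
```

Note the two limits behave as intuition says they should: as q → 0 the optimal alpha → 0 (the signal barely moves, so average heavily), and as q → ∞ the optimal alpha → 1 (the latest observation is nearly all signal).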

Neil endows the errors with SV, in which case the signal/noise ratio and hence the optimal smoothing parameter are time-varying. The particle filter facilitates both optimal parameter estimation and optimal tracking of the time-varying volatility, making for real-time ES with an optimally time-varying smoothing parameter. Very cool, both in principle and practice!
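The idea can be sketched with a bootstrap particle filter for a toy version of such a model, where the random walk shock has its own random-walk log-volatility. (This is my own illustrative specification and tuning constants, not necessarily the paper's; the point is only that the filtered signal/noise ratio q_t implies a time-varying smoothing parameter via alpha_t = (sqrt(q_t^2 + 4 q_t) - q_t)/2, the steady-state mapping from the constant-variance case.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate: local level model whose signal shock has SV
# (random-walk log-volatility h_t); measurement noise held constant.
T, sigma_eps = 200, 1.0
h = np.cumsum(0.1 * rng.standard_normal(T))              # log signal variance
mu = np.cumsum(np.exp(h / 2) * rng.standard_normal(T))   # latent level
y = mu + sigma_eps * rng.standard_normal(T)

# Bootstrap particle filter over (level, log-vol) pairs.
N = 2000
mu_p, h_p = np.zeros(N), np.zeros(N)
level_est = np.empty(T)   # filtered level
alpha_t = np.empty(T)     # implied time-varying smoothing parameter

for t in range(T):
    # Propagate particles through the state equations.
    h_p = h_p + 0.1 * rng.standard_normal(N)
    mu_p = mu_p + np.exp(h_p / 2) * rng.standard_normal(N)
    # Weight by the measurement density N(y_t; mu_t, sigma_eps^2).
    logw = -0.5 * ((y[t] - mu_p) / sigma_eps) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Filtered level, filtered signal/noise ratio, implied alpha_t.
    level_est[t] = np.sum(w * mu_p)
    q_t = np.sum(w * np.exp(h_p)) / sigma_eps**2
    alpha_t[t] = (np.sqrt(q_t**2 + 4 * q_t) - q_t) / 2
    # Multinomial resampling.
    idx = rng.choice(N, size=N, p=w)
    mu_p, h_p = mu_p[idx], h_p[idx]
```

In quiet spells (low filtered volatility) alpha_t shrinks and the smoother averages over a long history; in turbulent spells it rises and the smoother leans on recent data, which is exactly the adaptivity a fixed-parameter ES cannot deliver.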

More generally, it's interesting that ES remains alive and useful and still the focus of important research, some half-century after its introduction. Seemingly naive methods sometimes reveal themselves to be sophisticated and adaptable.