Hidden Markov Models (HMMs) and Linear Dynamical Systems (LDSs) are based on the same assumption: a hidden state variable, of which we can make noisy measurements, evolves with Markovian dynamics. Both have the same independence diagram, and consequently the learning and inference algorithms for both have the same structure. The only difference is that the HMM uses a discrete state variable with arbitrary dynamics and arbitrary measurements, while the LDS uses a continuous state variable with linear-Gaussian dynamics and measurements. We show how the forward-backward equations for the HMM, specialized to linear-Gaussian assumptions, lead directly to Kalman filtering and Rauch-Tung-Striebel smoothing. We also investigate the most general modeling assumptions that lead to efficient recursions in the case of continuous state variables.
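For concreteness, the linear-Gaussian assumptions referred to above can be written in the standard state-space form sketched below; the symbols $A$, $C$, $Q$, and $R$ are illustrative notation and need not match that used in the body of the paper.
\begin{align*}
x_t &= A x_{t-1} + w_t, & w_t &\sim \mathcal{N}(0, Q), \\
y_t &= C x_t + v_t,     & v_t &\sim \mathcal{N}(0, R),
\end{align*}
where $x_t$ is the hidden state and $y_t$ the noisy measurement at time $t$. The HMM instead takes $x_t$ to be discrete, with dynamics given by an arbitrary transition matrix and an arbitrary observation distribution.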
This paper is a companion to "Parameter estimation for linear dynamical systems" by Z. Ghahramani and G. Hinton.