Recursive Least Squares (RLS) Algorithm for Volterra Kernel Estimation
This article employs the Recursive Least Squares (RLS) algorithm to compute the kernels of a Volterra series. As a sequential estimation algorithm, RLS is well suited to adaptive filtering in dynamic system modeling and signal processing. The implementation maintains a covariance matrix P and a weight vector W, both updated recursively with each new data point via the Kalman gain vector K. The standard RLS recursion is:

K(k) = P(k-1)φ(k)[λ + φ'(k)P(k-1)φ(k)]⁻¹
W(k) = W(k-1) + K(k)[d(k) - φ'(k)W(k-1)]
P(k) = λ⁻¹[P(k-1) - K(k)φ'(k)P(k-1)]

where λ is the forgetting factor, φ(k) is the regressor vector at step k, and d(k) is the desired output.

Classical least squares estimates the unknown coefficients of a linear regression model; the recursive variant extends this capability to nonlinear systems. Because a Volterra series is linear in its kernel coefficients, a nonlinear input-output relationship can be rewritten as a linear regression over polynomial terms of the input, and the same RLS machinery applies. When computing Volterra kernels this way, RLS delivers accurate estimates in real time, with the forgetting factor exponentially down-weighting older data points.
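A minimal sketch of this approach for a second-order Volterra model, using NumPy. The function names (`volterra_regressor`, `rls_volterra`) and the default parameters (memory length, forgetting factor λ, initial covariance scale δ) are illustrative assumptions, not taken from the original code; the recursion itself follows the K, W, P update equations above.

```python
import numpy as np

def volterra_regressor(x_window):
    """Stack first-order terms x(k-i) and second-order terms
    x(k-i)x(k-j), i <= j, from one memory window into phi(k)."""
    m = len(x_window)
    quad = np.array([x_window[i] * x_window[j]
                     for i in range(m) for j in range(i, m)])
    return np.concatenate([x_window, quad])

def rls_volterra(x, d, memory=3, lam=0.99, delta=100.0):
    """Estimate second-order Volterra kernel coefficients by RLS.

    x      : input signal
    d      : desired (measured) output signal
    memory : kernel memory length (illustrative choice)
    lam    : forgetting factor lambda
    delta  : scale of the initial covariance P(0) = delta * I
    """
    n_terms = memory + memory * (memory + 1) // 2
    W = np.zeros(n_terms)          # weight vector W(0)
    P = delta * np.eye(n_terms)    # covariance matrix P(0)
    for k in range(memory - 1, len(x)):
        # Window [x(k), x(k-1), ..., x(k-memory+1)]
        phi = volterra_regressor(x[k - memory + 1:k + 1][::-1])
        Pphi = P @ phi
        K = Pphi / (lam + phi @ Pphi)          # Kalman gain K(k)
        e = d[k] - phi @ W                     # a priori error
        W = W + K * e                          # weight update W(k)
        P = (P - np.outer(K, phi @ P)) / lam   # covariance update P(k)
    return W

# Synthetic check on a known system (assumed example, not from the article):
# y(k) = 0.5 x(k) - 0.3 x(k-1) + 0.2 x(k)^2
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
d = np.zeros_like(x)
d[1:] = 0.5 * x[1:] - 0.3 * x[:-1] + 0.2 * x[1:] ** 2
W = rls_volterra(x, d, memory=2, lam=1.0)
# With memory=2, the regressor order is
# [x(k), x(k-1), x(k)^2, x(k)x(k-1), x(k-1)^2],
# so W should approach [0.5, -0.3, 0.2, 0.0, 0.0].
```

Setting lam below 1 trades steady-state accuracy for tracking speed on time-varying systems; lam=1.0 reduces RLS to ordinary growing-window least squares, which suits this stationary test case.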