Abstract
This article points out the close connection between recursive estimation procedures, such as Kalman filter theory, familiar to control engineers, and linear least squares estimators that incorporate prior information in the form of linear restrictions, such as mixed estimators and ridge estimators, familiar to statisticians. The only difference between the two points of view appears to be one of terminology. To demonstrate this, it is shown how the Kalman filter equations can be derived from a standard textbook account of linear least squares theory together with the notion of combining prior information in linear models, that is, the Goldberger-Theil mixed-estimation point of view. The author advocates introducing these ideas early in the teaching of least squares estimation.
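The equivalence the abstract alludes to can be checked numerically. The sketch below (all data and dimensions are illustrative, not from the article) forms the Goldberger-Theil mixed estimator, i.e. generalized least squares combining a prior estimate with new observations, and compares it to a single Kalman measurement update built from the same prior mean, prior covariance, design matrix, and noise covariance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 unknown parameters, 2 new observations.
n, m = 3, 2
b0 = rng.normal(size=n)        # prior estimate of beta
P0 = np.eye(n) * 2.0           # prior covariance
X = rng.normal(size=(m, n))    # design rows for the new data
R = np.eye(m) * 0.5            # measurement-noise covariance
y = rng.normal(size=m)         # new observations

# Mixed (Goldberger-Theil) estimator: GLS on the stacked
# prior restrictions and new data.
P0inv = np.linalg.inv(P0)
Rinv = np.linalg.inv(R)
P1 = np.linalg.inv(P0inv + X.T @ Rinv @ X)
b_mixed = P1 @ (P0inv @ b0 + X.T @ Rinv @ y)

# Kalman measurement update using the same ingredients.
K = P0 @ X.T @ np.linalg.inv(X @ P0 @ X.T + R)  # Kalman gain
b_kalman = b0 + K @ (y - X @ b0)                # updated estimate
P_kalman = (np.eye(n) - K @ X) @ P0             # updated covariance

# The two viewpoints give identical estimates and covariances.
assert np.allclose(b_mixed, b_kalman)
assert np.allclose(P1, P_kalman)
```

The two expressions are related by the standard matrix-inversion identity, so agreement holds for any conformable choice of prior, design, and noise covariance, not just this example.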