Abstract
By establishing a connection between quantile regression and the asymmetric Laplace distribution (ALD), this paper considers maximum likelihood (ML) estimation of the parameters of a quantile autoregression model with Markovian switching (MSQAR), where the error terms follow an ALD whose scale parameter depends on regime shifts. By exploiting the mixture representation of the ALD, we develop an effective ML approach for estimating the parameters of MSQAR models and obtain closed-form estimators of the unknown parameters via the EM algorithm. Consistency and asymptotic normality of the estimators are established by extending techniques of Douc, Moulines, and Rydén (2004). We also extend some of the asymptotic results to the case where the conditional quantile regression model is misspecified. Finally, the proposed approach is illustrated with simulations and empirical data. The simulation results show that the procedure performs well in finite samples, and the empirical analysis not only supports the existence of regime switching in the quantile autoregression model but also fits the data well.
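The connection between quantile regression and the ALD mentioned above rests on a standard fact: the ALD density with location μ, scale σ, and quantile level τ is f(y) = τ(1−τ)/σ · exp(−ρ_τ((y−μ)/σ)), where ρ_τ is the check loss, so maximizing the ALD likelihood in μ is equivalent to minimizing the quantile loss. A minimal sketch of this equivalence (not the authors' model; a fixed-scale, no-regression toy example) is:

```python
import numpy as np

rng = np.random.default_rng(0)
tau = 0.3
y = rng.normal(size=2001)  # odd sample size; toy data, not from the paper

def check_loss(u, tau):
    # quantile (check) loss: rho_tau(u) = u * (tau - 1{u < 0})
    return u * (tau - (u < 0))

def ald_loglik(mu, y, tau, sigma=1.0):
    # ALD(mu, sigma, tau) log-likelihood:
    # f(y) = tau*(1 - tau)/sigma * exp(-rho_tau((y - mu)/sigma))
    n = len(y)
    return n * np.log(tau * (1.0 - tau) / sigma) - check_loss((y - mu) / sigma, tau).sum()

# The ML estimate of mu under a fixed-scale ALD minimizes the check loss,
# so it coincides with an empirical tau-quantile of the sample.  The check
# loss is piecewise linear and convex in mu, hence minimized at an order
# statistic, so a search over the sorted sample finds the global optimum.
grid = np.sort(y)
mu_hat = grid[np.argmax([ald_loglik(m, y, tau) for m in grid])]
```

Here `mu_hat` sits at (approximately) the empirical τ-quantile, which is the location ML estimate under the ALD.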
Disclosure statement
No potential conflict of interest was reported by the authors.
Notes
1 As explained in footnote 4 of Kim, Huo, and Kim (2020), there is a connection between the Laplace density and the tick-exponential density. Because the quasi-likelihood function constructed from the tick-exponential density is non-differentiable, standard optimization methods such as the Newton-Raphson method cannot be applied directly. At the same time, the linear programming method typically used for standard linear quantile regression (LQR) models cannot be applied because of the nonlinearity induced by Markov switching. They solve these problems through a variable transformation and some equivalent conditions that convert the QML estimation into a standard LQR estimation problem, which is then solved by the usual linear programming approach. Unlike their method, we introduce a mixture representation of the ALD (Lemma 2.1) to overcome the non-differentiability of the likelihood function, and we obtain closed-form estimators of the unknown parameters via the EM algorithm.
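The mixture representation that sidesteps the non-differentiability is the well-known normal-exponential decomposition of the ALD (in the style of Kozumi and Kobayashi, 2011): an ALD(0, 1, τ) error can be written as ε = θz + ψ√z·u with z ~ Exp(1), u ~ N(0, 1), θ = (1−2τ)/(τ(1−τ)), and ψ² = 2/(τ(1−τ)). Conditional on z, ε is Gaussian, which is what makes closed-form EM updates possible. A hedged simulation sketch (the exact parameterization in Lemma 2.1 may differ) checking that the mixture reproduces ALD properties:

```python
import numpy as np

rng = np.random.default_rng(1)
tau = 0.3
n = 200_000

# Normal-exponential mixture representation of ALD(0, 1, tau):
#   eps = theta * z + psi * sqrt(z) * u,  z ~ Exp(1),  u ~ N(0, 1)
theta = (1.0 - 2.0 * tau) / (tau * (1.0 - tau))
psi = np.sqrt(2.0 / (tau * (1.0 - tau)))

z = rng.exponential(scale=1.0, size=n)
u = rng.standard_normal(n)
eps = theta * z + psi * np.sqrt(z) * u

# Under ALD(0, 1, tau): P(eps <= 0) = tau, and E[eps] = theta * E[z] = theta.
prop_below_zero = np.mean(eps < 0)
sample_mean = eps.mean()
```

Because ε | z is Gaussian with mean θz and variance ψ²z, the E-step reduces to posterior moments of z and the M-step to weighted least-squares-type updates, which is the source of the closed-form estimators mentioned above.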