Abstract
This article introduces a new method for computing regression quantile functions. The method applies a finite smoothing algorithm that smooths the nondifferentiable quantile regression objective function ρτ. The smoothing can be carried out for any τ ∈ (0, 1), and convergence is finite for any finite number of τi ∈ (0, 1), i = 1, …, N. Numerical comparisons show that the finite smoothing algorithm outperforms the simplex algorithm in computing speed. Compared with the powerful interior point algorithm introduced in an earlier article, it is competitive overall, and it is significantly faster when the design matrix in quantile regression has a large number of covariates. Moreover, the new algorithm attains the same accuracy as the simplex algorithm, whereas the interior point algorithm yields only approximate solutions in theory, so rounding may be necessary to improve the accuracy of those solutions in practice.
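To illustrate the idea of smoothing the check function, the sketch below contrasts the standard nondifferentiable objective ρτ with one simple smooth surrogate (a logistic smoothing of the kink at zero, with bandwidth h). This is only an illustrative choice of smoothing; the article's finite smoothing algorithm uses its own construction, which is not reproduced here.

```python
import numpy as np

def rho(u, tau):
    # standard quantile-regression check function rho_tau(u) = u * (tau - 1{u < 0});
    # nondifferentiable at u = 0
    return u * (tau - (u < 0))

def rho_smooth(u, tau, h=0.1):
    # an illustrative smooth surrogate: tau*u + h*log(1 + exp(-u/h)),
    # which converges to rho(u, tau) as h -> 0
    # (logaddexp is used for numerical stability when |u|/h is large)
    return tau * u + h * np.logaddexp(0.0, -u / h)

u = np.linspace(-1.0, 1.0, 201)
# with a small bandwidth the surrogate is very close to the check function
print(np.max(np.abs(rho_smooth(u, 0.5, h=1e-4) - rho(u, 0.5))))
```

Because the surrogate is differentiable everywhere, standard smooth optimization methods can be applied to it, which is the general motivation for smoothing approaches to quantile regression.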