Layman’s abstract for Canadian Journal of Statistics article on Quantile function regression and variable selection for sparse models

Each week, we publish layman’s abstracts of new articles from our prestigious portfolio of journals in statistics. The aim is to highlight the latest research to a broader audience in an accessible format.

The article featured today is from the Canadian Journal of Statistics, with the full article now available to read here.

Yoshida, T. (2021), Quantile function regression and variable selection for sparse models. Can J Statistics. https://doi.org/10.1002/cjs.11616

This study considers linear quantile regression and variable selection for high-dimensional data. An ordinary quantile regression estimator is typically obtained at a single fixed quantile level, so the estimated coefficients are not continuous in the quantile level; as a result, the estimator and the estimated set of active variables can change abruptly between quantile levels that are arbitrarily close. To obtain a stable estimator at any given quantile level, this study proposes a new method, called quantile function regression, which estimates the coefficients as functions of the quantile level over a region of interest. In quantile function regression, each coefficient function is approximated by a sieve model, so the estimated conditional quantile is continuous in the quantile level. For variable selection, a group lasso-type sparse penalty is used to select the nonzero coefficient functions, so the estimated active set remains unchanged over the quantile region. In this sense, quantile function regression achieves global variable selection.
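The full methodology and theory are developed in the article. Purely as a rough illustration of the idea, and not the author's implementation, the Python sketch below fits coefficient functions of the quantile level on simulated data: each coefficient is expanded in a small polynomial basis over a grid of quantile levels (standing in for the paper's sieve model), the check loss is averaged over that grid, and a group lasso-type penalty is applied to each covariate's whole coefficient function. The simulated data, basis, tuning parameter lam, selection threshold, and the use of a generic optimizer in place of a dedicated solver are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated sparse data: only the first two of five covariates matter (illustrative).
rng = np.random.default_rng(0)
n, p = 150, 5
X = rng.normal(size=(n, p))
y = 1.0 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n)

# Grid of quantile levels covering the region of interest.
taus = np.linspace(0.3, 0.7, 9)

# Basis over the quantile level; the paper uses a sieve model,
# here a low-order polynomial basis stands in purely for illustration.
K = 3
B = np.vander(taus, K, increasing=True)      # shape (len(taus), K)

Xd = np.column_stack([np.ones(n), X])        # add an intercept column
d = Xd.shape[1]

def pinball(u, tau):
    """Check (pinball) loss of residuals u at quantile level tau."""
    return np.maximum(tau * u, (tau - 1.0) * u)

def objective(gamma_flat, lam=0.2):
    gamma = gamma_flat.reshape(d, K)         # one coefficient function per column of Xd
    beta = gamma @ B.T                       # beta[j, t] = beta_j(tau_t)
    # Check loss averaged over the grid of quantile levels.
    loss = np.mean([pinball(y - Xd @ beta[:, t], tau).mean()
                    for t, tau in enumerate(taus)])
    # Group-lasso-type penalty: one group per covariate's whole coefficient
    # function (intercept unpenalized), so a covariate is kept or dropped
    # over the entire quantile region at once.
    penalty = lam * np.sum(np.linalg.norm(gamma[1:], axis=1))
    return loss + penalty

res = minimize(objective, np.zeros(d * K), method="Powell")
gamma_hat = res.x.reshape(d, K)
norms = np.linalg.norm(gamma_hat[1:], axis=1)
print("group norms per covariate:", np.round(norms, 3))
# A generic optimizer does not produce exact zeros; a small threshold mimics selection.
print("selected covariates:", np.where(norms > 0.1)[0] + 1)
```

Because the penalty acts on each covariate's entire coefficient function, a variable is selected or dropped uniformly over the quantile region, which is what the global variable selection described in the abstract refers to.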