Layman’s abstract for paper on a comparison of some conformal quantile regression methods

Every few days, we will be publishing layman’s abstracts of new articles from our prestigious portfolio of journals in statistics. The aim is to highlight the latest research to a broader audience in an accessible format.

The article featured today is from Stat and the full article, published in issue 9.1, is available to read online here.

 

Sesia, M., Candès, E. A comparison of some conformal quantile regression methods. Stat. 2020; 9:e261. https://doi.org/10.1002/sta4.261

Conformal quantile regression is a practical and principled statistical method for constructing adaptive prediction intervals with finite-sample coverage for non-parametric regression problems. In a nutshell, the main idea is to wrap a randomized data-splitting procedure around a black-box machine-learning prediction algorithm; the held-out split prevents overfitting and ensures that the output prediction intervals will be well calibrated for future independent data points. Because this provable finite-sample guarantee requires no model assumptions beyond sample exchangeability, the procedure is, in principle, very widely applicable.
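The data-splitting recipe described above can be sketched in a few lines of Python. Everything below is an illustrative stand-in, not the paper's experiments: the simulated data, the crude binned quantile "model" playing the role of the black box, and the 50/50 split are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated heteroscedastic data (illustrative only, not from the paper).
n_train, n_calib, n_test = 1000, 1000, 2000
x = rng.uniform(0.0, 5.0, n_train + n_calib + n_test)
y = np.sin(x) + (0.2 + 0.2 * x) * rng.normal(size=x.size)
x_tr, x_cal, x_te = np.split(x, [n_train, n_train + n_calib])
y_tr, y_cal, y_te = np.split(y, [n_train, n_train + n_calib])

alpha = 0.1  # target miscoverage: nominal 90% prediction intervals

# Step 1: fit a "black-box" conditional quantile estimator on the training
# set.  A binned empirical-quantile model stands in for any ML regressor.
bins = np.linspace(0.0, 5.0, 11)

def fit_binned_quantiles(x_fit, y_fit):
    q_lo = np.empty(len(bins) - 1)
    q_hi = np.empty(len(bins) - 1)
    for k in range(len(bins) - 1):
        in_bin = (x_fit >= bins[k]) & (x_fit < bins[k + 1])
        q_lo[k] = np.quantile(y_fit[in_bin], alpha / 2)
        q_hi[k] = np.quantile(y_fit[in_bin], 1 - alpha / 2)
    return q_lo, q_hi

def predict(x_new, q_lo, q_hi):
    b = np.clip(np.digitize(x_new, bins) - 1, 0, len(bins) - 2)
    return q_lo[b], q_hi[b]

q_lo, q_hi = fit_binned_quantiles(x_tr, y_tr)

# Step 2: conformalize on the held-out calibration set.  Each score measures
# how far a calibration point falls outside its estimated interval.
lo_cal, hi_cal = predict(x_cal, q_lo, q_hi)
scores = np.maximum(lo_cal - y_cal, y_cal - hi_cal)
# Finite-sample correction: the ceil((1 - alpha)(n + 1))-th smallest score.
k = int(np.ceil((1 - alpha) * (n_calib + 1)))
correction = np.sort(scores)[k - 1]

# Step 3: widened (or narrowed) intervals inherit a marginal coverage
# guarantee of at least 90% for exchangeable test points.
lo_te, hi_te = predict(x_te, q_lo, q_hi)
covered = (y_te >= lo_te - correction) & (y_te <= hi_te + correction)
coverage = covered.mean()
print(f"empirical coverage: {coverage:.3f}")
```

The key point is that the calibration data are never seen by the fitted model, so the conformity scores behave like an exchangeable sample; taking a slightly conservative empirical quantile of those scores is what turns an arbitrary black-box quantile estimate into an interval with a provable finite-sample guarantee.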
This paper compares, theoretically and empirically, two recently proposed alternative versions of conformal quantile regression. First, it proves that, under some additional assumptions, both alternatives are asymptotically efficient in large samples, in the sense that their output approaches that of an omniscient oracle with exact knowledge of the true regression function. Second, it shows that one version typically yields tighter prediction intervals in finite samples, across a wide range of real and simulated data sets, and is therefore preferable in practice. Finally, it discusses how to tune these procedures by choosing the relative proportions of observations used for training the black box and for calibrating it.