ENH: Smoothed quantile regression, smooth approximating functions #9182
Comments
checking some parts:

Nychka's approximation (in Oh et al. 2011) has a continuous first derivative but a discontinuous second derivative (jump at zero) if q is not 0.5. The same should be true for all M-quantile norms. The problem is that quantile regression produces, by construction, observations with exactly zero residuals, so being at the discontinuity is not a zero-probability event. In the smooth approximation case, however, we might not get zero residuals with positive probability.
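As an illustration of that jump (a minimal sketch using a quadratic-core M-quantile norm, not Nychka's exact formula; eps is an assumed smoothing threshold): the first derivative is zero from both sides at u = 0, but the second derivative is q/eps on the right and (1 - q)/eps on the left.

```python
import numpy as np

def rho_mq(u, q=0.75, eps=0.5):
    # quadratic-core M-quantile norm: quadratic for |u| <= eps, linear check function outside,
    # matched in value and first derivative at |u| = eps (illustrative, not Nychka's exact form)
    u = np.asarray(u, dtype=float)
    slope = np.where(u >= 0, q, 1 - q)
    return np.where(np.abs(u) <= eps,
                    slope * u**2 / (2 * eps),
                    slope * (np.abs(u) - eps / 2))

q, eps, h = 0.75, 0.5, 1e-6
# one-sided first derivatives at 0: both ~0, so the first derivative is continuous
print((rho_mq(h, q, eps) - rho_mq(0, q, eps)) / h,
      (rho_mq(0, q, eps) - rho_mq(-h, q, eps)) / h)
# one-sided second derivatives at 0: q/eps = 1.5 vs (1 - q)/eps = 0.5, a jump unless q = 0.5
print((rho_mq(2 * h, q, eps) - 2 * rho_mq(h, q, eps) + rho_mq(0, q, eps)) / h**2,
      (rho_mq(0, q, eps) - 2 * rho_mq(-h, q, eps) + rho_mq(-2 * h, q, eps)) / h**2)
```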
AFAICS, in Nychka's update, resid in this code is used as the inverse weight for the irls update.

update: I'm not committing this change right now. There might be speed differences that need more investigation.

The weights following from the M-norm derivation are w = psi / u, while the current QuantReg fit uses weight = 1 / q / abs(resid) if resid < c. These are equivalent, since 1 / q * (q * (1 - q)) = (1 - q) and WLS weights that differ only by a common scale factor give the same estimates. I don't find a reference for the IRLS computation, so I don't see a problem rewriting this more in the style of the irls in RLM. This means that there should not be a difference between QuantReg irls estimation and RLM with HuberT M-quantiles, except for the choice of threshold.
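A small numeric sketch of that weight equivalence (not the actual QuantReg code; the residual vector is just an arbitrary stand-in): the M-norm weights psi(u)/u and the QuantReg-style irls weights differ only by the constant factor q * (1 - q), so a single WLS step gives identical parameters with either weighting.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
q = 0.25
x = sm.add_constant(rng.normal(size=(50, 2)))
y = x @ [1.0, 2.0, -1.0] + rng.normal(size=50)
resid = y - x @ [1.0, 2.0, -1.0]   # stand-in residuals from some current iterate

# weights from the M-norm derivation: w = psi(u) / u with psi(u) = q for u > 0, -(1 - q) for u < 0
w_mnorm = np.where(resid > 0, q, 1 - q) / np.abs(resid)
# QuantReg-style irls weights: 1 / (q |r|) for r < 0 and 1 / ((1 - q) |r|) for r > 0
w_qreg = 1.0 / (np.where(resid < 0, q, 1 - q) * np.abs(resid))

print(np.allclose(w_mnorm, q * (1 - q) * w_qreg))   # True: only a common scale factor differs
b1 = sm.WLS(y, x, weights=w_mnorm).fit().params
b2 = sm.WLS(y, x, weights=w_qreg).fit().params
print(np.allclose(b1, b2))                           # True: same WLS (one irls step) estimates
```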
Looking at an old notebook again, I don't see why I remembered that M-quantile and quantile regression parameter estimates don't match up.
checking PR, comments: already looked at this
Chen, Colin. 2007. “A Finite Smoothing Algorithm for Quantile Regression.” Journal of Computational and Graphical Statistics 16 (1): 136–64. https://doi.org/10.1198/106186007X180336.

In their eq. (2.3) they use an M-quantile Huber function that has an asymmetric threshold in the positive and negative parts. This does not fit the M-quantiles pattern, which has different slopes at 0+ and 0-. They shrink the threshold towards zero during optimization, with a check for when to stop shrinking.

A similar one:
Muggeo, Vito M.R., Mariangela Sciandra, and Luigi Augugliaro. 2012. “Quantile Regression via Iterative Least Squares Computations.” Journal of Statistical Computation and Simulation 82 (11): 1557–69. https://doi.org/10.1080/00949655.2011.583650.

Those approximations could fit into the M-quantile setup, but the
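A rough sketch of the threshold-shrinking idea (a generic refit loop, not Chen's actual finite smoothing algorithm; the stopping rule here, stopping once estimates stabilize across shrink steps, is only an assumption):

```python
import numpy as np
from scipy.optimize import minimize

def fit_smoothed_quantreg(y, x, q=0.5, eps0=1.0, shrink=0.5, tol=1e-6, max_outer=30):
    # smoothed check function: quadratic core of width eps, linear check function outside
    def objective(beta, eps):
        u = y - x @ beta
        slope = np.where(u >= 0, q, 1 - q)
        return np.sum(np.where(np.abs(u) <= eps,
                               slope * u**2 / (2 * eps),
                               slope * (np.abs(u) - eps / 2)))

    beta = np.linalg.lstsq(x, y, rcond=None)[0]   # OLS start values
    eps = eps0
    for _ in range(max_outer):
        res = minimize(objective, beta, args=(eps,), method="BFGS")
        if np.max(np.abs(res.x - beta)) < tol:    # stop shrinking once the estimates stabilize
            beta = res.x
            break
        beta, eps = res.x, eps * shrink           # shrink the threshold towards zero and refit
    return beta
```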
related issue and application to penalized estimation: #5350 (comment)
There are many smooth approximations in the literature for the quantile regression objective function, the "check" function.
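For reference, the non-smooth check function that these approximations target is rho_q(u) = u * (q - 1[u < 0]); a minimal definition:

```python
import numpy as np

def rho_check(u, q):
    """Quantile regression "check" function: rho_q(u) = u * (q - 1[u < 0])."""
    u = np.asarray(u, dtype=float)
    return u * (q - (u < 0))
```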
I tried using RLM with M-quantiles before, but in my experiments the parameter estimates were not close to those from QuantReg.
It will be more useful to look at the literature that explicitly uses smoothed quantile regression.
The main advantage would be being able to use generic optimization, which makes extensions easier to implement, e.g. penalized estimation or nonlinear mean functions.
There might also be a possible speedup if we can use faster optimization algorithms than the irls (although for the simple linear case we should still get Koenker's interior point algorithm).
possible implementation

QuantRegSmoothed, a new class with generic optimization in fit; several possible objective functions that differ in the approximating function; possibly/optionally return QuantRegResults with kernel-based cov_params as in QuantReg.

It should properly be based on a (non-existing) GenericMEstimator class (version with objective function; GenericMomEstimator #7436 would be separate).

A quick prototype could be implemented as a subclass of GenericLikelihoodModel (or as another RLM norm, just to see whether that works; that would not be immediately useful for penalized estimation).
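A minimal sketch of the GenericLikelihoodModel prototype route, assuming a quadratic-core approximating function and a hypothetical class name (SmoothedQuantReg, with eps as an assumed smoothing threshold); this is not the proposed QuantRegSmoothed API:

```python
import numpy as np
from statsmodels.base.model import GenericLikelihoodModel

class SmoothedQuantReg(GenericLikelihoodModel):
    # hypothetical prototype: minimize a smoothed check function via the generic optimizers
    def __init__(self, endog, exog, q=0.5, eps=0.1, **kwds):
        super().__init__(endog, exog, **kwds)
        self.q = q
        self.eps = eps

    def nloglikeobs(self, params):
        # per-observation objective; GenericLikelihoodModel minimizes its sum
        u = self.endog - self.exog @ params
        slope = np.where(u >= 0, self.q, 1 - self.q)
        return np.where(np.abs(u) <= self.eps,
                        slope * u**2 / (2 * self.eps),
                        slope * (np.abs(u) - self.eps / 2))

# quick check on simulated data
rng = np.random.default_rng(0)
x = np.column_stack([np.ones(200), rng.normal(size=200)])
y = x @ [1.0, 2.0] + rng.standard_t(3, size=200)
res = SmoothedQuantReg(y, x, q=0.75).fit(start_params=np.zeros(2), disp=False)
print(res.params)   # should be close to QuantReg(y, x).fit(q=0.75).params
```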
References, in addition to those in the comment in #5350:
Mkhadri, Abdallah, Mohamed Ouhourane, and Karim Oualkacha. 2017. “A Coordinate Descent Algorithm for Computing Penalized Smooth Quantile Regression.” Statistics and Computing 27 (4): 865–83. https://doi.org/10.1007/s11222-016-9659-9.
Nychka, Doug, Gerry Gray, Perry Haaland, David Martin, and Michael O’Connell. 1995. “A Nonparametric Regression Approach to Syringe Grading for Quality Improvement.” Journal of the American Statistical Association 90 (432): 1171–78. https://doi.org/10.2307/2291509.
Oh, Hee-Seok, Thomas C. M. Lee, and Douglas W. Nychka. 2011. “Fast Nonparametric Quantile Regression With Arbitrary Smoothing Methods.” Journal of Computational and Graphical Statistics 20 (2): 510–26. https://doi.org/10.1198/jcgs.2010.10063.
Ouhourane, Mohamed, Yi Yang, Andréa L. Benedet, and Karim Oualkacha. 2022. “Group Penalized Quantile Regression.” Statistical Methods & Applications 31 (3): 495–529. https://doi.org/10.1007/s10260-021-00580-8.
Yoshida, Takuma. 2023. “Asymptotics for Penalized Spline Estimators in Quantile Regression.” Communications in Statistics - Theory and Methods 52 (14): 4815–34. https://doi.org/10.1080/03610926.2013.765477.