Simulation Study for Variable Selection in Quantile Regression
DOI: https://doi.org/10.31185/wjps.607

Keywords: Lasso, SCAD, MCP, Gamma-Divergence, Regularization

Abstract
This work thoroughly examines six essential techniques for variable selection in quantile regression models: quantile Lasso, gamma-divergence, quantile elastic net, quantile adaptive Lasso, quantile SCAD, and quantile MCP. The main objective is to compare and assess the performance of these approaches, particularly their capacity to reduce the median model error. An important part of the analysis covers situations in which the number of candidate variables exceeds the number of observations. Through a thorough simulation study, the work also aims to identify the best-performing method for each linear regression scenario considered.
References
R. Tibshirani, “Regression shrinkage and selection via the Lasso,” Journal of the Royal Statistical Society, Series B 58, 267–288, (1996).
H. Zou and T. Hastie, "Regularization and variable selection via the elastic net", Journal of the Royal Statistical Society, Series B, 67, 301–320, (2005).
J. Fan and R. Li, “Variable selection via nonconcave penalized likelihood and its oracle properties”, Journal of the American Statistical Association 96(456), 1348–1360, (2001).
Y. Li and J. Zhu, "L1-norm quantile regression", Journal of Computational and Graphical Statistics, 17(1), 163–185, (2008).
Y. Wu and Y. Liu, "Variable selection in quantile regression", Statistica Sinica, 19, 801–817, (2009).
R. Koenker and G. Bassett, "Regression quantiles", Econometrica, 46(1), 33–50, (1978).
L. Wang, Y. Wu, and R. Li, "Quantile regression for analyzing heterogeneity in ultra-high dimension", Journal of the American Statistical Association, 107(497), 214–222, (2012).
M. Slawski, “The structured elastic net for quantile regression and support vector classification”, Statistics and Computing, 22(1), 153–168, (2012).
Y. Tian, S. Shen, G. Lu, M. Tang, and M. Tian, “Bayesian lasso-regularized quantile regression for linear regression models with autoregressive errors”, Communications in Statistics - Simulation and Computation, 48(3), 777–796, (2019).
L. Petrella and V. Raponi, "Joint estimation of conditional quantiles in multivariate linear regression models with an application to financial distress", Journal of Multivariate Analysis, 173, 70–84, (2019).
A. Yan and F. Song, "Adaptive elastic net-penalized quantile regression for variable selection", Communications in Statistics - Theory and Methods, (2019).
R. Koenker, "Quantile regression for longitudinal data", Journal of Multivariate Analysis, 91(1), 74–89, (2004).
A. Aghamohammadi, and S. Mohammadi, “Bayesian analysis of penalized quantile regression for longitudinal data”, Statistical Papers, 58(4), 1035–1053, (2017).
R. Alhamzawi, K. Yu, and D. F. Benoit, "Bayesian adaptive lasso quantile regression", Statistical Modelling, 12(3), 279–297, (2012).
M. Bernardi, M. Bottone and L. Petrella, “Bayesian quantile regression using the skew exponential power distribution”, Computational Statistics and Data Analysis, 126, 92–111, (2018).
M. Kyung, J. Gill, M. Ghosh, and G. Casella, "Penalized regression, standard errors, and Bayesian lassos", Bayesian Analysis, 5(2), 369–411, (2010).
Q. Li, R. Xi,and N. Lin, “Bayesian regularized quantile regression”, Bayesian Analysis, 5(3), 533–556, (2010).
X. Tang, and H. Lian, “Mean and quantile boosting for partially linear additive models”, Statistics and Computing, 26(5), 997–1008, (2016).
S. M. Ajeel and H. A. Hashem, “Comparison Some Robust Regularization Methods in Linear Regression via Simulation Study”, Academic Journal of Nawroz University, 9(2), 244–252, (2020).
A. E. Hoerl and R. W. Kennard, "Ridge Regression: Biased Estimation for Nonorthogonal Problems", Technometrics, 12(1), 55–67, (1970).
H. Zou, “The adaptive Lasso and its oracle properties,” Journal of the American Statistical Association 101, 1418–1429, (2006).
H. Fujisawa and S. Eguchi, “Robust parameter estimation with a small bias against heavy contamination”, Journal of Multivariate Analysis, 99(9), 2053-2081, (2008).
T. Kawashima and H. Fujisawa, "Robust and sparse regression via γ-divergence", Entropy, 19(11), 608, (2017).
C. H. Zhang, “Nearly unbiased variable selection under minimax concave penalty,” Annals of Statistics 38, 894–942, (2010).
License
Copyright (c) 2025 HUSSEIN HASHEM

This work is licensed under a Creative Commons Attribution 4.0 International License.