
L1 Penalized Regression Procedures for Feature Selection

Muthukrishnan R.1, Mahalakshmi P.2

Section: Research Paper, Product Type: ISROSET-Journal
Vol.5, Issue.5, pp.88-91, Oct-2018


CrossRef-DOI: https://doi.org/10.26438/ijsrmss/v5i5.8891


Online published on Oct 31, 2018


Copyright © Muthukrishnan R., Mahalakshmi P. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
 




How to Cite this Paper


IEEE Style Citation: Muthukrishnan R., Mahalakshmi P., “L1 Penalized Regression Procedures for Feature Selection,” International Journal of Scientific Research in Mathematical and Statistical Sciences, Vol.5, Issue.5, pp.88-91, 2018.

MLA Style Citation: Muthukrishnan R., Mahalakshmi P. "L1 Penalized Regression Procedures for Feature Selection." International Journal of Scientific Research in Mathematical and Statistical Sciences 5.5 (2018): 88-91.

APA Style Citation: Muthukrishnan R., Mahalakshmi P. (2018). L1 Penalized Regression Procedures for Feature Selection. International Journal of Scientific Research in Mathematical and Statistical Sciences, 5(5), 88-91.

BibTex Style Citation:
@article{R_2018,
author = {Muthukrishnan R. and Mahalakshmi P.},
title = {L1 Penalized Regression Procedures for Feature Selection},
journal = {International Journal of Scientific Research in Mathematical and Statistical Sciences},
issue_date = {October 2018},
volume = {5},
issue = {5},
month = {10},
year = {2018},
issn = {2347-2693},
pages = {88-91},
url = {https://www.isroset.org/journal/IJSRMSS/full_paper_view.php?paper_id=872},
doi = {10.26438/ijsrmss/v5i5.8891},
publisher = {ISROSET, Indore, INDIA},
}

RIS Style Citation:
TY  - JOUR
DO  - 10.26438/ijsrmss/v5i5.8891
UR  - https://www.isroset.org/journal/IJSRMSS/full_paper_view.php?paper_id=872
TI  - L1 Penalized Regression Procedures for Feature Selection
T2  - International Journal of Scientific Research in Mathematical and Statistical Sciences
AU  - Muthukrishnan, R.
AU  - Mahalakshmi, P.
PY  - 2018
DA  - 2018/10/31
PB  - ISROSET, Indore, INDIA
SP  - 88
EP  - 91
IS  - 5
VL  - 5
SN  - 2347-2693
ER  -


Abstract:
In high-dimensional regression analysis, a large number of independent variables arise in many scientific fields and machine learning applications. To select the predictors that are relevant to the response, statistical feature selection must be performed. In variable selection for regression analysis, particularly when there are many predictor variables, highly correlated variables, or both, traditional methods such as forward, backward, and mixed stepwise selection procedures fail. Alternatives are needed, namely L1 penalized regression procedures, which provide higher prediction accuracy and computational efficiency. This paper demonstrates such procedures, particularly the least absolute shrinkage and selection operator (LASSO), which performs shrinkage and variable selection simultaneously, and its variants. When the data set contains extreme observations, robust regression estimators adopted within LASSO tolerate outliers with comparatively greater accuracy. In this paper, the performance of these procedures is analyzed using the performance measure Median Squared Error (MSE) with numerical illustrations.
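To make the procedure concrete, the following is a minimal sketch in R (the software named in the keywords) of LASSO feature selection on simulated data, assuming the glmnet package; the data, package choice, and settings are illustrative assumptions, not the paper's own experiment. The final line computes the Median Squared Error measure used above.

# LASSO feature selection: a minimal sketch, assuming the glmnet package.
# Simulated data in which only the first three predictors are relevant.
library(glmnet)

set.seed(123)
n <- 100; p <- 20
X <- matrix(rnorm(n * p), n, p)
beta_true <- c(3, -2, 1.5, rep(0, p - 3))      # sparse true coefficients
y <- drop(X %*% beta_true + rnorm(n))

# alpha = 1 selects the pure L1 (lasso) penalty;
# lambda is tuned by cross-validation
cv_fit <- cv.glmnet(X, y, alpha = 1)

# Nonzero coefficients at lambda.min are the selected features
print(coef(cv_fit, s = "lambda.min"))

# Median Squared Error of the fitted values, the measure used above
pred <- predict(cv_fit, newx = X, s = "lambda.min")
median((y - pred)^2)

Because the L1 penalty drives irrelevant coefficients exactly to zero, the lasso performs estimation and variable selection in a single step.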

Key-Words / Index Terms:
Variable selection, LASSO, Huber, outlier, Robust, R Software
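
For the robust variant mentioned in the abstract and keywords, a corresponding sketch replaces the squared-error loss with Huber's loss while keeping the L1 penalty. This assumes the hqreg package, one publicly available R implementation of the Huber-loss lasso; it is not cited by the paper and is used here only for illustration.

# Huber-loss lasso: a sketch of a robust L1 procedure, assuming the
# hqreg package (an illustrative choice, not the paper's own method).
library(hqreg)

set.seed(123)
n <- 100; p <- 20
X <- matrix(rnorm(n * p), n, p)
beta_true <- c(3, -2, 1.5, rep(0, p - 3))
y <- drop(X %*% beta_true + rnorm(n))
y[1:5] <- y[1:5] + 20                  # inject a few extreme observations

# method = "huber" combines Huber's loss with the lasso penalty;
# lambda is chosen by cross-validation
cv_fit <- cv.hqreg(X, y, method = "huber")

# Coefficients at the cross-validated lambda;
# nonzero entries are the selected features
beta_hat <- coef(cv_fit, lambda = cv_fit$lambda.min)
print(beta_hat[beta_hat != 0])

Because Huber's loss grows only linearly beyond a threshold, the injected outliers pull the fit far less than they would under ordinary squared-error loss, which is the tolerance to extreme observations described in the abstract.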

References:
[1] A. Alfons, C. Croux, S. Gelper, “Sparse least trimmed squares regression for analyzing high-dimensional large data sets”, Annals of Applied Statistics, Vol.7, No.1, pp.226-248, 2013.
[2] A. C. Lozano, N. Meinshausen, E. Yang, “Minimum distance lasso for robust high-dimensional regression”, Electronic Journal of Statistics, Vol.10, No.1, pp.1296-1340, 2016.
[3] H. Wang, C. Leng, “A note on adaptive group lasso”, Computational Statistics and Data Analysis, Vol.52, No.12, pp.5277-5286, 2008.
[4] H. Wang, G. Li, G. Jiang, “Robust regression shrinkage and consistent variable selection through the LAD-lasso”, Journal of Business & Economic Statistics, Vol.25, pp.347-355, 2007.
[5] H. Zou, “The adaptive lasso and its oracle properties”, Journal of the American Statistical Association, Vol.101, No.476, pp.1418-1429, 2006.
[6] H. Zou, M. Yuan, “Composite quantile regression and the oracle model selection theory”, Annals of Statistics, Vol.36, No.3, pp.1108-1126, 2008.
[7] J. Fan, Q. Li, Y. Wang, “Estimation of high dimensional mean regression in the absence of symmetry and light tail assumptions”, J.R.Statist.Soc. B (Statistical Methodology), In Press, 2016.
[8] K. Knight, W. Fu, “Asymptotics of lasso-type estimators”, Annals of statistics, Vol.28, pp.1346-1378, 2000.
[9] S. Lambert-Lacroix, L. Zwald, “Robust regression through the Huber's criterion and adaptive lasso penalty”, Electronic Journal of Statistics, Vol.5, pp.1015-1053, 2011.
[10] M. Mohanadevi, V. Vinothini, “Accurate Error Prediction of Sugarcane Yield Using a Regression Model”, International Journal of Computer Science and Engineering, Vol.6, Issue.7, pp.66-71, 2018.
[11] M. Yuan, Y. Lin, “Model selection and estimation in regression with grouped variables”, J.R.Statist.Soc. B, Vol.68, pp.49-67, 2006.
[12] R Core Team, “R: A language and environment for statistical computing”, R Foundation for Statistical Computing, Vienna, Austria, 2018.
[13] R. Tibshirani, “Regression shrinkage and selection via the lasso”, J.R.Statist.Soc. B, Vol.58, pp.267-288, 1996.
[14] R. Tibshirani, M. Saunders, S. Rosset, J. Zhu, K. Knight, “Sparsity and smoothness via the Fused Lasso”, J.R.Statist.Soc. B, Vol.67, pp.91-108, 2005.
[15] S. Li, Y. Qin, “MTE: Maximum Tangent Likelihood and Other Robust Estimators for High-Dimensional Regression”, R package version 1.0.0.
[16] S. N. Negahban, P. Ravikumar, M. J. Wainwright, B. Yu, “A unified framework for high-dimensional analysis of M-estimators with decomposable regularizers”, Statistical Science, Vol.27, No.4, pp.538-557, 2012.
[17] X. Wang, Y. Jiang, M. Huang, H. Zhang, “Robust variable selection with exponential squared loss”, Journal of the American Statistical Association, Vol.108, Issue.502, pp.632-643, 2013.
[18] Y. Qin, S. Li, Y. Li, Y. Yu, “Penalized maximum tangent likelihood estimation and robust variable selection”, Unpublished, 2017.
