Penalized selection criteria like AIC or BIC are among the most popular methods for variable selection. Their theoretical properties have been studied intensively and are well understood, but making use of them in the case of high-dimensional data is difficult due to the non-convex optimization problem induced by L0 penalties. In this paper we introduce an adaptive ridge procedure (AR), in which iteratively weighted ridge problems are solved whose weights are updated in such a way that the procedure converges towards selection with L0 penalties. After introducing AR, its specific shrinkage properties are studied in the particular case of orthogonal linear regression. Based on extensive simulations for the non-orthogonal case as well as for Poisson regression, the performance of AR is studied and compared with SCAD and adaptive LASSO. Furthermore, an efficient implementation of AR in the context of least-squares segmentation is presented. The paper ends with an illustrative example of applying AR to analyze GWAS data.

Citation: Frommlet F, Nuel G (2016) An Adaptive Ridge Procedure for L0 Regularization. PLoS ONE 11(2).

Cleveland Clinic Lerner Research Institute, UNITED STATES

Published: February 5, 2016

Copyright: © 2016 Frommlet, Nuel. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: The data set used for real data analysis in this manuscript was obtained from dbGaP through dbGaP accession number phs000276.v2.p1.

Funding: This research has been funded by WWTF, the Vienna Science and Technology Fund, through project MA09-007a and by ANR, the French National Research Agency, through Project SAMOGWAS. Florian Frommlet received funding from WWTF and Gregory Nuel from ANR. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

Methods for performing variable selection, particularly in a high-dimensional setting, have undergone tremendous development over the last two decades. Of particular importance in this context is penalized maximum likelihood estimation, which can be divided into selection methods based on generalized information criteria and regularization methods. The former use a penalty which depends on the number of estimated parameters, sometimes called the L0 penalty, and include the classical information criteria AIC and BIC. Their asymptotic properties have been thoroughly studied and are well understood when the number of potential regressors is fixed (see for example and citations given there). Specifically, BIC is known to yield a consistent model selection rule, which means that as the sample size goes to infinity the probability of selecting the true model goes to 1. However, this is no longer true in a high-dimensional setting, where under sparsity both AIC and BIC tend to select too large models. As a consequence, a number of different modifications of BIC have been suggested, for example mBIC, which is designed to control the family-wise error rate (FWER), mBIC2, which controls the false discovery rate, or EBIC, for which consistency under certain asymptotic conditions has been shown even when the number of regressors is allowed to be larger than the sample size. Thus from a theoretical perspective it is rather appealing to perform model selection using generalized information criteria. However, the corresponding optimization problem is notoriously difficult due to the non-convexity and discontinuity of the L0 penalty. It is an NP-hard problem to find the model which minimizes a specific information criterion, and in general already for a moderate number of, say, fifty variables it becomes computationally infeasible to guarantee finding the optimal solution.
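To make the combinatorial difficulty of L0-type selection concrete, the following minimal sketch (not taken from the paper) performs exhaustive best-subset selection under BIC for a Gaussian linear model. The function names and the simulated setup are illustrative assumptions; the point is that the search visits all 2^p subsets, which is why this approach breaks down beyond a few dozen variables.

```python
import numpy as np
from itertools import combinations

def bic_linear(y, X_sub, n):
    # RSS-based BIC for a Gaussian linear model:
    #   BIC = n * log(RSS / n) + k * log(n),  k = number of fitted coefficients
    beta, *_ = np.linalg.lstsq(X_sub, y, rcond=None)
    rss = np.sum((y - X_sub @ beta) ** 2)
    k = X_sub.shape[1]
    return n * np.log(rss / n) + k * np.log(n)

def best_subset_bic(y, X):
    # Exhaustive search over all nonempty subsets of the p columns.
    # This loop has 2^p - 1 iterations -- the combinatorial blow-up that
    # makes guaranteed minimization of an information criterion infeasible
    # already for moderate p.
    n, p = X.shape
    best_crit, best_support = np.inf, ()
    for size in range(1, p + 1):
        for idx in combinations(range(p), size):
            crit = bic_linear(y, X[:, idx], n)
            if crit < best_crit:
                best_crit, best_support = crit, idx
    return best_crit, best_support
```

On simulated data with a strong sparse signal, the BIC-optimal subset typically contains exactly the true variables; the search remains tractable here only because p = 8.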
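The iteratively reweighted ridge idea behind such an adaptive ridge procedure can be sketched as follows. This is a simplified illustration assuming a Gaussian linear model: each step solves a weighted ridge problem, and the weight update w_j = 1 / (beta_j^2 + delta^2) drives the effective penalty w_j * beta_j^2 towards the L0 indicator 1{beta_j != 0}. The values of `lam` and `delta` are chosen for demonstration, not taken from the paper's tuning.

```python
import numpy as np

def adaptive_ridge(y, X, lam=1.0, delta=1e-5, n_iter=100):
    # Iteratively reweighted ridge: each iteration solves
    #   min_b ||y - X b||^2 + lam * sum_j w_j * b_j^2
    # in closed form, then updates w_j = 1 / (b_j^2 + delta^2).
    # Small coefficients get ever-larger weights and collapse to zero;
    # large coefficients get weights near 1 / b_j^2 and are barely shrunk,
    # so the penalty mimics an L0 penalty at convergence.
    n, p = X.shape
    G = X.T @ X
    Xty = X.T @ y
    w = np.ones(p)
    for _ in range(n_iter):
        beta = np.linalg.solve(G + lam * np.diag(w), Xty)
        w = 1.0 / (beta ** 2 + delta ** 2)
    # beta_j^2 / (beta_j^2 + delta^2) is close to 1 for surviving
    # coefficients and close to 0 for collapsed ones.
    selected = beta ** 2 / (beta ** 2 + delta ** 2) > 0.5
    return beta, selected
```

On sparse simulated data the procedure keeps the strong coefficients essentially unshrunk while the spurious ones collapse numerically to zero, giving variable selection without any combinatorial search.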