Abstract
High-dimensional predictors in regression analysis are often associated with multicollinearity and other estimation problems. These problems can be mitigated through a constrained optimization method that simultaneously induces dimension reduction and variable selection while maintaining a high level of predictive ability in the fitted model. Simulation studies show that the method may outperform sparse principal component regression, the least absolute shrinkage and selection operator (lasso), and elastic net procedures in terms of predictive ability and optimal selection of inputs. Furthermore, the method yields reduced models with smaller prediction errors than the estimated full models from principal component regression or principal covariance regression.
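To make the baseline concrete, the sketch below illustrates principal component regression, one of the comparator methods named in the abstract, on simulated multicollinear data. It is a minimal numpy-only illustration, not the paper's own procedure; the sample sizes, factor structure, and number of retained components are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 80, 30, 5  # samples, predictors, retained components (assumed sizes)

# Multicollinear design: a few latent factors generate correlated columns
Z = rng.normal(size=(n, 3))
X = Z @ rng.normal(size=(3, p)) + 0.1 * rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:4] = [2.0, -1.5, 1.0, 0.5]  # sparse true coefficients (hypothetical)
y = X @ beta + rng.normal(scale=0.5, size=n)

# Center, then regress y on the scores of the top-k principal components
Xc = X - X.mean(axis=0)
yc = y - y.mean()
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Xc @ Vt[:k].T                       # component scores
gamma, *_ = np.linalg.lstsq(T, yc, rcond=None)
beta_pcr = Vt[:k].T @ gamma             # map back to predictor space

# In-sample mean squared error of the reduced (k-component) model
resid = yc - Xc @ beta_pcr
mse = float(resid @ resid / n)
print(round(mse, 4))
```

Because the components are chosen to explain variance in X alone, PCR handles the multicollinearity but does not by itself select variables; the method in the paper is motivated by combining such dimension reduction with sparsity in one constrained optimization.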
| Original language | English |
|---|---|
| Pages (from-to) | 242-256 |
| Number of pages | 15 |
| Journal | Computational Statistics and Data Analysis |
| Volume | 112 |
| DOIs | |
| Publication status | Published - Aug 2017 |
| Externally published | Yes |
Keywords
- Dimension reduction
- High dimensionality
- Latent factors
- Regression modeling
- Soft thresholding
- Sparse principal component analysis
- Sparsity
- Variable selection