
Modified Method for Choosing Ridge Parameter


Article Information

Title: Modified Method for Choosing Ridge Parameter

Authors: Ayan Ullah, Muhammad Suhail, Maryam Ilyas

Journal: Journal of Statistics

HEC Recognition History

Category   From         To
Y          2024-10-01   2025-12-31
Y          2023-07-01   2024-09-30
Y          1900-01-01   2005-06-30

Publisher: Government College University, Lahore.

Country: Pakistan

Year: 2017

Volume: 24

Issue: 1

Language: English


Abstract

Multicollinearity occurs when two or more predictors are linearly related to each other. In this case, the OLS estimators either do not exist or, if they exist, the associated variances of the estimated regression coefficients are very large, making inferences invalid. Ridge regression counters the effects of multicollinearity by introducing a biasing constant k, called the ridge parameter, into the least squares objective function. The ridge parameter shrinks the estimates and their variances, so selection of the unknown ridge parameter k is of prime importance in ridge regression analysis. Khalaf et al. (2013) proposed modifications of existing ridge estimators by multiplying them with a factor based on the maximum eigenvalue of the X′X matrix, and named the resulting estimators K1M–K16M. This study proposes modifications of existing ridge estimators by multiplying them with a factor based on the arithmetic mean of the eigenvalues of the X′X matrix, denoted K1A–K16A. The comparative performance of the proposed estimators and those of Khalaf et al. (2013) was evaluated by Mean Square Error (MSE) using simulated data sets generated under different levels of collinearity (r), sample sizes (n), numbers of predictors (p), error term variances, and error term distributions. It was observed that the proposed estimators K1A–K16A outperform K1M–K16M when the error terms follow a normal distribution (σ² = 0.1, 1), collinearity levels (r) are high (i.e. 0.80, 0.90, 0.95), and the number of predictors is 2, 4, or 6, and when the error terms follow a non-normal distribution (F(4, 20)), collinearity levels are high (i.e. 0.80, 0.90, 0.95), and the number of predictors is small (i.e. 2, 4).
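To make the setting concrete, here is a minimal sketch of ridge regression with a data-driven ridge parameter on simulated collinear data. It uses the classical Hoerl–Kennard estimator k = pσ̂²/‖β̂_OLS‖² as an illustrative baseline, not the paper's K1A–K16A or Khalaf et al.'s K1M–K16M formulas, whose exact expressions are not given in the abstract; the common-factor scheme for generating collinear predictors is likewise an assumed, standard choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate collinear predictors via a shared common factor; larger r
# means stronger collinearity (an assumed, standard simulation scheme).
n, p, r = 50, 4, 0.90
Z = rng.standard_normal((n, p + 1))
X = np.sqrt(1 - r**2) * Z[:, :p] + r * Z[:, [p]]
beta = np.ones(p)                       # true coefficients
y = X @ beta + rng.standard_normal(n)   # error variance 1

XtX = X.T @ X
eigvals = np.linalg.eigvalsh(XtX)       # eigenvalues of X'X

# OLS fit and residual-based error variance estimate
b_ols = np.linalg.solve(XtX, X.T @ y)
sigma2 = np.sum((y - X @ b_ols) ** 2) / (n - p)

# Hoerl-Kennard ridge parameter (illustrative baseline only)
k_hk = p * sigma2 / np.sum(b_ols**2)

def ridge(k):
    """Ridge estimator: solves (X'X + kI) b = X'y."""
    return np.linalg.solve(XtX + k * np.eye(p), X.T @ y)

b_ridge = ridge(k_hk)
print("condition number of X'X:", eigvals.max() / eigvals.min())
print("k (Hoerl-Kennard):", k_hk)
print("OLS   distance from true beta:", np.linalg.norm(b_ols - beta))
print("ridge distance from true beta:", np.linalg.norm(b_ridge - beta))
```

In a full simulation study like the one described, this fit would be repeated over many generated data sets and the candidate k estimators compared by the average squared distance of their coefficient estimates from the true beta (MSE).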

