
Which one to use to overcome multicollinearity? Ridge regression or principal component regression

Fulltext (705.5Kb)
      Date
      2000
      Author
      Pakpahan, Eduwin
      Nasoetion, Andi Hakim
      Sumantri, Bambang
      Abstract
In multiple linear regression, nonorthogonality of the predictor vectors pulls the least squares estimates of the regression coefficients away from the true coefficients one is trying to estimate. Separating the effect of each regressor is difficult when the independent variables are highly correlated. Although the multicollinearity problem has long been recognized, ordinary least squares remains the main technique in many fields; yet estimating the regression coefficients is problematic when the data vectors for the predictors are not orthogonal, that is, when they are multicollinear. Under ordinary least squares the regression coefficients may be unstable, too large in absolute value, and incorrect in sign. A biased estimator is compared with the unbiased one by its average closeness to the parameter being estimated, as measured by the mean square error (MSE). Ridge regression and principal component regression can, each in its own way, produce better estimates than ordinary least squares, and may be preferred since they have a larger probability of being close to the true parameter value. These two biased regression methods are applied in this research and the properties of each are discussed briefly; each has its particular idiosyncrasy. Both attack the collinearity problem by computationally suppressing the effects of the collinearity. Ridge regression does so directly, modifying the correlation matrix by adding a bias constant to its diagonal so that the matrix is nonsingular. Principal component regression attacks the problem by regressing Y on the important principal components and then parceling out the effect of the principal component variables to the original variables. Finally, translating the obtained coefficients back to the original variables yields the same sign and nearly the same coefficients for both methods.
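The two approaches described in the abstract can be illustrated with a minimal NumPy sketch on simulated collinear data. This is not the thesis's own computation; the data, variable names, and the shrinkage constant `k` are illustrative assumptions. Ridge adds a bias constant to the diagonal of the (scaled) cross-product matrix; principal component regression regresses y on the leading component and translates the coefficient back to the original variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)      # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 2 * x1 + 3 * x2 + rng.normal(scale=0.5, size=n)

# Standardize: both methods operate on the correlation scale
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = y - y.mean()

# Ordinary least squares: unstable when X'X is near-singular
b_ols = np.linalg.lstsq(Xs, ys, rcond=None)[0]

# Ridge regression: add a bias constant k to the diagonal,
# making the (scaled) correlation matrix nonsingular
k = 0.1                                        # illustrative choice
b_ridge = np.linalg.solve(Xs.T @ Xs + k * n * np.eye(2), Xs.T @ ys)

# Principal component regression: regress y on the dominant
# component, then translate the coefficient back
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
m = 1                                          # components retained
gamma = (U[:, :m].T @ ys) / s[:m]              # coefficients on components
b_pcr = Vt[:m].T @ gamma                       # back in original variables

print("OLS:  ", b_ols)
print("Ridge:", b_ridge)
print("PCR:  ", b_pcr)
```

Consistent with the abstract's conclusion, the ridge and PCR coefficients come out with the same sign and nearly the same magnitude, while the OLS coefficients on the two near-duplicate predictors can be wildly different.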
      URI
      http://repository.ipb.ac.id/handle/123456789/131500
      Collections
      • UT - Statistics and Data Sciences [2260]

      Copyright © 2020 Library of IPB University
      All rights reserved
      Contact Us | Send Feedback
      Indonesia DSpace Group 
      IPB University Scientific Repository
      UIN Syarif Hidayatullah Institutional Repository
      Universitas Jember Digital Repository