
dc.contributor.advisor: Nasoetion, Andi Hakim
dc.contributor.advisor: Sumantri, Bambang
dc.contributor.author: Pakpahan, Eduwin
dc.date.accessioned: 2023-11-09T07:49:17Z
dc.date.available: 2023-11-09T07:49:17Z
dc.date.issued: 2000
dc.identifier.uri: http://repository.ipb.ac.id/handle/123456789/131500
dc.description.abstract: In multiple linear regression, nonorthogonality among the predictor vectors pulls the least squares estimates of the regression coefficients away from the true coefficients one is trying to estimate. Separating the effect of each regressor becomes difficult when the independent variables are highly correlated. Although the multicollinearity problem has been addressed in many ways, ordinary least squares remains the main technique in many fields, yet its coefficient estimates become problematic when the predictor vectors are not orthogonal, that is, when they are multicollinear: the estimated coefficients may be unstable, too large in absolute value, and incorrect in sign. Biased estimation is compared with unbiased estimation through the mean square error (MSE), which measures the average closeness of an estimator to the parameter being estimated. Ridge regression and principal component regression can each, in its own way, produce better estimates in this sense than ordinary least squares, and may well be the preferred estimators since they have a higher probability of being close to the true parameter values. These two biased regression methods are applied in this research and the properties of each are discussed briefly; each has its particular idiosyncrasy, but both attack the collinearity problem by computationally suppressing its effects. Ridge regression does so by modifying the correlation matrix, adding a biasing constant to its diagonal so that the matrix becomes nonsingular. Principal component regression regresses Y on the important principal components and then parcels out the effects of those components to the original variables. When the estimates are translated back to the original variables, the two methods yield the same signs and nearly the same coefficients (a numerical sketch of this comparison follows the record).
dc.language.iso: en_US
dc.publisher: IPB University
dc.subject.ddc: Statistics
dc.title: Which one use to overcome multicollinearity? Ridge regression or principal component regression
dc.type: Undergraduate Thesis
dc.subject.keyword: Regression analysis
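
The record itself contains no code; the following Python sketch is only a rough illustration of the comparison the abstract describes. The simulated data, the biasing constant k, and the choice of two retained components are all assumptions made for the example, not values from the thesis. It fits ordinary least squares, ridge regression, and principal component regression to a deliberately collinear design, back-transforming the principal component coefficients to the original variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a collinear design: x2 is nearly a copy of x1 (illustrative data).
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # highly correlated with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 2.0 * x1 + 2.0 * x2 + 1.0 * x3 + rng.normal(size=n)

# Standardize so that X'X is proportional to the correlation matrix,
# the setting the abstract works in.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = y - y.mean()

# Ordinary least squares: solve (X'X) b = X'y directly.
b_ols = np.linalg.solve(Xs.T @ Xs, Xs.T @ ys)

# Ridge regression: add a biasing constant k to the diagonal of X'X,
# driving the near-singular matrix to a well-conditioned, nonsingular one.
k = 10.0  # assumed value for illustration
b_ridge = np.linalg.solve(Xs.T @ Xs + k * np.eye(3), Xs.T @ ys)

# Principal component regression: regress y on the important principal
# components, then parcel the effects back to the original variables.
eigval, eigvec = np.linalg.eigh(Xs.T @ Xs)
order = np.argsort(eigval)[::-1]           # components by decreasing variance
eigvec = eigvec[:, order]
m = 2                                      # keep the two large-eigenvalue components
Z = Xs @ eigvec[:, :m]                     # scores on the retained components
g = np.linalg.solve(Z.T @ Z, Z.T @ ys)     # regression on the components
b_pcr = eigvec[:, :m] @ g                  # translate back to original variables

print("OLS:  ", b_ols)
print("Ridge:", b_ridge)
print("PCR:  ", b_pcr)
```

Under this kind of collinearity, OLS splits the shared effect of x1 and x2 erratically (coefficients that are too large in absolute value, possibly opposite in sign), while the ridge and principal component estimates come out with the same signs and nearly the same magnitudes, mirroring the abstract's concluding claim.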

