Statistics

Permanent URI for this community: https://repository.ui.edu.ng/handle/123456789/360

Search Results

Now showing 1 - 3 of 3
  • Frequentist and Bayesian estimation of parameters of linear regression model with correlated explanatory variables
    (2017) Adepoju, A. A.; Adebajo, E. O; Ogundunmade, P. T.
    This paper addressed the popular issue of collinearity among explanatory variables in the context of multiple linear regression analysis, and the parameter estimation of both the classical and the Bayesian methods. Five sample sizes (10, 25, 50, 100 and 500), each replicated 10,000 times, were simulated using the Monte Carlo method. Four levels of correlation, ρ = 0.0, 0.1, 0.5 and 0.9, representing no correlation, weak correlation, moderate correlation and strong correlation, were considered. The estimation techniques considered were Ordinary Least Squares (OLS), Feasible Generalized Least Squares (FGLS) and Bayesian methods. The performances of the estimators were evaluated using the Absolute Bias (ABIAS) and Mean Square Error (MSE) of the estimates. In all cases considered, the Bayesian estimator had the best performance; it was consistently more efficient than the other estimators, namely OLS and FGLS.
  • Estimators of linear regression model with autocorrelated error terms and prediction using correlated uniform regressors
    (2012-11) Ayinde, K.; Adedayo, D. A.; Adepoju, A. A.
    Performances of estimators of the linear regression model with autocorrelated error terms have been attributed to the nature and specification of the explanatory variables. Violation of the assumption of independence of the explanatory variables is not uncommon, especially in business, economics and the social sciences, and has led to the development of many estimators. Moreover, prediction is one of the main purposes of regression analysis. This work therefore examines the parameter estimates of the Ordinary Least Squares (OLS) estimator, the Cochrane-Orcutt (COR) estimator, the Maximum Likelihood (ML) estimator and estimators based on Principal Component analysis (PC) in prediction with a linear regression model with autocorrelated error terms, under violation of the assumption of independent regressors (multicollinearity), using a Monte Carlo experiment. With uniform variables as regressors, it further identifies the best estimator for prediction purposes by averaging the adjusted coefficient of determination of each estimator over the number of trials. Results reveal that the performances of the COR and ML estimators at each level of multicollinearity over the levels of autocorrelation are convex-like, while those of the OLS and PC estimators are concave; and that as the level of multicollinearity increases, the estimators perform much better at all levels of autocorrelation. Except when the sample size is small (n = 10), the performances of the COR and ML estimators are generally best and asymptotically the same. When the sample size is small, the COR estimator is still best except when the autocorrelation level is low; in these instances, the PC estimator is either best or competes with the best estimator. Moreover, at low levels of autocorrelation in all the sample sizes, the OLS estimator competes with the best estimator at all levels of multicollinearity.
  • The Bayesian Approach to Estimation of Multi-Equation Econometric Models in the Presence of Multicollinearity
    (2014) Okewole, D. M. O
    The Bayesian approach conveys information not available in the data through prior knowledge of the subject matter, which enables one to make probability statements about the parameters of interest, while the classical approaches deal solely with the data. Several studies of the classical approaches have shown them to be sensitive to multicollinearity, a violation of one of the assumptions of multi-equation models which often plagues economic variables. Studies on the performance of the Bayesian method in this context are, however, limited. This study investigated the performance of the Bayesian approach in estimating multi-equation models in the presence of multicollinearity. Purely just-identified and over-identified multi-equation models were considered. In both cases, the normal distribution with zero mean and large variance served as a locally uniform prior for the regression coefficients. Three Bayesian Method Prior Variances (BMPV) were specified as 10, 100 and 1000 in a Monte Carlo prior variance sensitivity analysis. The Wishart distribution with zero degrees of freedom served as the prior distribution for the inverse of the error variance-covariance matrix, being its conjugate. The posterior distributions for the two models were then derived from the prior distributions and the likelihood functions as bivariate Student-t and generalized Student-t distributions, respectively. The estimates were then compared with those from the classical estimators: Ordinary Least Squares (OLS), Two-Stage Least Squares (2SLS), Three-Stage Least Squares (3SLS) and Limited Information Maximum Likelihood (LIML). Samples of sizes T = 20, 40, 60 and 100 in 5,000 replicates were generated based on eight specified research scenarios. The Mean Squared Error (MSE) of the estimates was computed and used as the evaluation criterion.
    BMPV 10 produced the least MSE in the prior variance sensitivity analysis for the over-identified model, whereas for the just-identified model without multicollinearity, BMPV 100 produced the smallest. The Bayesian method was better than the classical estimators in the small sample cases (T ≤ 40) for the coefficient of the exogenous variable in the just-identified model: when T = 20, the MSE for BMPV 10, 100 and 1000 was 0.169, 0.168 and 0.171 respectively, whereas OLS, 2SLS, 3SLS and LIML all yielded the same result, 0.244; when T = 40, BMPV 10, 100 and 1000 gave 0.1220, 0.1272 and 0.1361 respectively, against 0.1262 for the classical methods. The 2SLS and 3SLS estimates of the coefficient of the endogenous explanatory variable, which were the same in the over-identified model, had smaller MSE than the Bayesian method: when T = 20, the MSE for 2SLS/3SLS was 0.0280, whereas BMPV 10 = 0.0286, BMPV 100 = 0.0300 and BMPV 1000 = 0.033. The Bayesian method was less sensitive to multicollinearity in estimating the coefficients of the correlated exogenous variables: the MSE (T = 20) for BMPV 10, 100 and 1000 was 0.4529, 0.5220 and 0.5290 respectively, while it was 0.7492 for the classical estimators. The MSE of LIML (0.0036) was similar to that of BMPV 100 (0.0036) and BMPV 1000 (0.0036) in the large sample case (T = 100). The Bayesian approach was suitable for estimating the parameters of exogenous variables in the small sample cases when the model is purely just-identified, and in the over-identified model in the presence of multicollinearity.
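The Monte Carlo design in the first abstract (regressors correlated at ρ = 0.0, 0.1, 0.5, 0.9; OLS compared against a Bayesian estimator; ABIAS and MSE as criteria) can be sketched as below. This is an illustrative sketch, not the authors' code: the true coefficients, the N(0, τ²I) prior with τ² = 10, the known error variance, and the replication count are all assumptions made for the example, and FGLS is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(rho, n, beta=np.array([1.0, 2.0, 3.0]), sigma=1.0, reps=2000):
    """Monte Carlo at correlation level rho: compare OLS with a Bayesian
    posterior-mean estimator under a N(0, tau2*I) prior (known sigma^2)."""
    k = len(beta) - 1
    cov = np.full((k, k), rho)           # equicorrelated regressors
    np.fill_diagonal(cov, 1.0)
    L = np.linalg.cholesky(cov)
    tau2 = 10.0                          # assumed prior variance
    ols_err, bayes_err = [], []
    for _ in range(reps):
        X = np.column_stack([np.ones(n), rng.standard_normal((n, k)) @ L.T])
        y = X @ beta + sigma * rng.standard_normal(n)
        b_ols = np.linalg.solve(X.T @ X, X.T @ y)
        # posterior mean: (X'X/s^2 + I/tau2)^{-1} X'y/s^2
        A = X.T @ X / sigma**2 + np.eye(k + 1) / tau2
        b_bayes = np.linalg.solve(A, X.T @ y / sigma**2)
        ols_err.append(b_ols - beta)
        bayes_err.append(b_bayes - beta)
    ols_err, bayes_err = np.array(ols_err), np.array(bayes_err)
    return {
        "ABIAS_OLS": np.abs(ols_err.mean(axis=0)).mean(),
        "ABIAS_Bayes": np.abs(bayes_err.mean(axis=0)).mean(),
        "MSE_OLS": (ols_err**2).mean(),
        "MSE_Bayes": (bayes_err**2).mean(),
    }

# the four correlation levels from the abstract, at one sample size
for rho in (0.0, 0.1, 0.5, 0.9):
    print(rho, simulate(rho, n=25))
```

The key mechanism the abstract reports shows up here: as ρ grows, X'X becomes ill-conditioned and the OLS MSE inflates, while the prior precision term stabilises the Bayesian estimate.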
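The second abstract compares estimators for a model with AR(1) autocorrelated errors and uniform regressors. A minimal sketch of the Cochrane-Orcutt (COR) iteration it refers to, with assumed illustrative values for n, ρ and the coefficients (not the authors' experimental settings):

```python
import numpy as np

rng = np.random.default_rng(0)

def cochrane_orcutt(X, y, tol=1e-6, max_iter=100):
    """Iterative Cochrane-Orcutt estimation for a linear model with
    AR(1) errors u_t = rho*u_{t-1} + e_t: estimate rho from OLS
    residuals, quasi-difference the data, re-fit, and repeat."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    rho = 0.0
    for _ in range(max_iter):
        u = y - X @ b
        rho_new = (u[:-1] @ u[1:]) / (u[:-1] @ u[:-1])
        Xs = X[1:] - rho_new * X[:-1]      # quasi-differenced regressors
        ys = y[1:] - rho_new * y[:-1]      # quasi-differenced response
        b = np.linalg.lstsq(Xs, ys, rcond=None)[0]
        if abs(rho_new - rho) < tol:
            break
        rho = rho_new
    return b, rho

# generate data: uniform regressors (as in the abstract), AR(1) errors
n, rho_true = 200, 0.6
beta = np.array([1.0, 0.8, 1.2])
X = np.column_stack([np.ones(n), rng.uniform(size=(n, 2))])
e = rng.standard_normal(n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho_true * u[t - 1] + e[t]
y = X @ beta + u
b_hat, rho_hat = cochrane_orcutt(X, y)
print(b_hat, rho_hat)
```

In the study's full design this fit would be repeated over trials and the adjusted R² averaged to rank COR against OLS, ML and PC; the sketch shows only the core COR transformation.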
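The prior variance sensitivity analysis in the third abstract (BMPV = 10, 100, 1000) can be illustrated with a simplified single-equation analogue: vary the variance of a zero-mean normal prior and compare the Monte Carlo MSE of the posterior mean at a small sample size. The design below (T = 20, two coefficients, known error variance) is an assumption for illustration, not the multi-equation setup of the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def posterior_mean(X, y, prior_var, sigma2=1.0):
    """Posterior mean of beta under a N(0, prior_var * I) prior with
    known error variance sigma2 (normal likelihood)."""
    k = X.shape[1]
    A = X.T @ X / sigma2 + np.eye(k) / prior_var
    return np.linalg.solve(A, X.T @ y / sigma2)

def mc_mse(prior_var, T=20, reps=3000, beta=np.array([0.5, 1.0])):
    """Average per-coefficient MSE of the posterior mean over reps draws."""
    errs = []
    for _ in range(reps):
        X = np.column_stack([np.ones(T), rng.standard_normal(T)])
        y = X @ beta + rng.standard_normal(T)
        errs.append(posterior_mean(X, y, prior_var) - beta)
    return (np.array(errs) ** 2).mean()

# the three prior variances examined in the abstract
for v in (10, 100, 1000):
    print(v, round(mc_mse(v), 4))
```

With priors this diffuse the three MSEs differ only slightly at T = 20, which mirrors the abstract's finding that the BMPV values trade places depending on identification and multicollinearity rather than one dominating everywhere.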