Similar Documents
20 similar documents found (search time: 15 ms)
1.
The relative performances of forecasting models change over time. This empirical observation raises two questions. First, is the relative performance itself predictable? Second, if so, can it be exploited in order to improve the forecast accuracy? We address these questions by evaluating the predictive abilities of a wide range of economic variables for two key US macroeconomic aggregates, namely industrial production and inflation, relative to simple benchmarks. We find that business cycle indicators, financial conditions, uncertainty and measures of past relative performances are generally useful for explaining the models’ relative forecasting performances. In addition, we conduct a pseudo-real-time forecasting exercise, where we use the information about the conditional performance for model selection and model averaging. The newly proposed strategies deliver sizable improvements over competitive benchmark models and commonly used combination schemes. The gains are larger when model selection and averaging are based on both financial conditions and past performances measured at the forecast origin date.
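A simple way to condition combination weights on past relative performance, as the abstract above describes, is to weight each model by the inverse of its discounted past squared errors. This is a generic sketch of that idea, not the paper's exact scheme; the discount factor `delta` and the inverse-MSE rule are illustrative assumptions.

```python
import numpy as np

def performance_weights(forecast_errors, delta=0.95):
    """Combination weights from discounted past squared forecast errors.

    forecast_errors: (T, M) array of past one-step errors for M models.
    delta: discount factor (illustrative choice); recent errors get more weight.
    Returns weights proportional to inverse discounted MSE, summing to one.
    """
    T, M = forecast_errors.shape
    discounts = delta ** np.arange(T - 1, -1, -1)   # largest weight on the most recent period
    dmse = discounts @ (forecast_errors ** 2) / discounts.sum()
    inv = 1.0 / dmse
    return inv / inv.sum()
```

A model whose recent errors are small receives most of the weight, so the combination adapts when relative performance shifts over time.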

2.
Parameter estimation under model uncertainty is a difficult and fundamental issue in econometrics. This paper compares the performance of various model averaging techniques. In particular, it contrasts Bayesian model averaging (BMA) — currently one of the standard methods used in growth empirics — with a new method called weighted-average least squares (WALS). The new method has two major advantages over BMA: its computational burden is trivial and it is based on a transparent definition of prior ignorance. The theory is applied to and sheds new light on growth empirics where a high degree of model uncertainty is typically present.
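The mechanics of BMA can be illustrated with a minimal sketch: enumerate all subsets of regressors, fit each by least squares, and weight models by the BIC approximation to the marginal likelihood. This is an assumption of the sketch, not the priors used by the paper's BMA or WALS estimators.

```python
import numpy as np
from itertools import combinations

def bic_weights_bma(X, y):
    """Approximate BMA over all subsets of the columns of X.

    Uses the BIC approximation to the marginal likelihood (illustrative
    assumption; the paper's BMA and WALS priors differ). Returns a list of
    (subset, weight) pairs with weights summing to one.
    """
    n, k = X.shape
    models, bics = [], []
    for r in range(k + 1):
        for subset in combinations(range(k), r):
            # intercept plus the selected regressors
            Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
            sigma2 = resid @ resid / n
            bics.append(n * np.log(sigma2) + Z.shape[1] * np.log(n))
            models.append(subset)
    bics = np.array(bics)
    w = np.exp(-0.5 * (bics - bics.min()))   # subtract min for numerical stability
    w /= w.sum()
    return list(zip(models, w))
```

Summing the weights of all models that include a given regressor yields its posterior inclusion probability, the quantity typically reported in growth empirics.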

3.
4.
The effect of preferential trade agreements (PTAs) on trade flows is subject to model uncertainty stemming from the diverse and even contradictory effects suggested by the theoretical PTA literature. The existing empirical literature has produced remarkably disparate results and the wide variety of empirical approaches reflects the uncertainty about the ‘correct’ set of explanatory variables that ought to be included in the analysis. To account for the model uncertainty that surrounds the validity of the competing PTA theories, we introduce Bayesian model averaging (BMA) to the PTA literature. Statistical theory shows that BMA successfully incorporates model uncertainty in linear regression analysis by minimizing the mean squared error, and by generating predictive distributions with optimal predictive performance. Once model uncertainty is addressed as part of the empirical strategy, we find strong evidence of trade creation, trade diversion, and open bloc effects. Our results are robust to a range of alternative empirical specifications proposed by the recent PTA literature. Copyright © 2010 John Wiley & Sons, Ltd.

5.
This paper provides empirical evidence on the dynamic effects of uncertainty on firm‐level capital accumulation. A novelty in this paper is that the firm‐level uncertainty indicator is motivated and derived from a theoretical model, the neoclassical investment model with time to build. This model also serves as the base for the empirical work, where an error‐correction approach is employed. I find a negative effect of uncertainty on capital accumulation, both in the short run and the long run. This outcome cannot be explained by the model alone. Instead, the results suggest that the predominant mechanism at work stems from irreversibility constraints.

6.
Dynamic model averaging (DMA) has become a very useful tool with regards to dealing with two important aspects of time-series analysis, namely, parameter instability and model uncertainty. An important component of DMA is the Kalman filter. It is used to filter out the latent time-varying regression coefficients of the predictive regression of interest, and produce the model predictive likelihood, which is needed to construct the probability of each model in the model set. To apply the Kalman filter, one must write the model of interest in linear state–space form. In this study, we demonstrate that the state–space representation has implications for out-of-sample prediction performance and the degree of shrinkage. Using Monte Carlo simulations as well as financial data at different sampling frequencies, we document that the way in which the current literature tends to formulate the candidate time-varying parameter predictive regression in linear state–space form ignores empirical features that are often present in the data at hand, namely, predictor persistence and predictor endogeneity. We suggest a straightforward way to account for these features in the DMA setting. Results using the widely applied Goyal and Welch (2008) dataset document that modifying the DMA framework as we suggest has a bearing on equity premium point prediction performance from a statistical as well as an economic viewpoint.
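The Kalman-filter component described above can be sketched for a single candidate model: a regression with random-walk coefficients, filtered in one pass, returning both the coefficient paths and the one-step predictive log-likelihoods that DMA uses to update model probabilities. The noise variances `V` and `W` and the initial state are illustrative inputs; the paper's modified state–space form is not reproduced here.

```python
import numpy as np

def tvp_kalman_predictive_loglik(y, X, V, W, theta0, P0):
    """Kalman filter for y_t = x_t' theta_t + e_t, theta_t = theta_{t-1} + eta_t.

    V: observation noise variance, W: (k, k) state noise covariance,
    theta0/P0: initial state mean and covariance (all illustrative choices).
    Returns the filtered coefficient paths and the one-step predictive
    log-likelihoods that DMA feeds into each model's probability update.
    """
    T, k = X.shape
    theta, P = theta0.copy(), P0.copy()
    thetas, loglik = np.zeros((T, k)), np.zeros(T)
    for t in range(T):
        x = X[t]
        P = P + W                       # predict: random walk leaves the mean unchanged
        f = x @ theta                   # one-step point forecast of y_t
        S = x @ P @ x + V               # predictive variance of y_t
        loglik[t] = -0.5 * (np.log(2 * np.pi * S) + (y[t] - f) ** 2 / S)
        K = P @ x / S                   # Kalman gain (scalar observation)
        theta = theta + K * (y[t] - f)  # update state mean
        P = P - np.outer(K, x @ P)      # update state covariance
        thetas[t] = theta
    return thetas, loglik
```

Running this filter for every model in the set and combining the predictive likelihoods with a forgetting factor yields the DMA model probabilities.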

7.
In this paper, we review and unite the literatures on returns to schooling and Bayesian model averaging. We observe that most studies seeking to estimate the returns to education have done so using particular (and often different across researchers) model specifications. Given this, we review Bayesian methods which formally account for uncertainty in the specification of the model itself, and apply these techniques to estimate the economic return to a college education. The approach described in this paper enables us to determine those model specifications which are most favored by the given data, and also enables us to use the predictions obtained from all of the competing regression models to estimate the returns to schooling. The reported precision of such estimates also accounts for the uncertainty inherent in the model specification. Using U.S. data from the National Longitudinal Survey of Youth (NLSY), we also revisit several 'stylized facts' in the returns to education literature and examine if they continue to hold after formally accounting for model uncertainty.

8.
Multinomial and ordered Logit models are quantitative techniques used across a range of disciplines. When applying these techniques, practitioners usually select a single model using either information-based criteria or pretesting. In this paper, we consider the alternative strategy of combining models rather than selecting a single model. Our strategy of weight choice for the candidate models is based on the minimization of a plug-in estimator of the asymptotic squared error risk of the model average estimator. Theoretical justifications of this model averaging strategy are provided, and a Monte Carlo study shows that the forecasts produced by the proposed strategy are often more accurate than those produced by other common model selection and model averaging strategies, especially when the regressors are only mildly to moderately correlated and the true model contains few zero coefficients. An empirical example based on credit rating data is used to illustrate the proposed method. To reduce the computational burden, we also consider a model screening step that eliminates some of the very poor models before averaging.

9.
Motivated by the requirement of dynamic model stability, this paper improves the selection method for semi-nonparametric (SNP) models, so that the selected SNP model both fits the observed sample well and simulates time series whose statistical properties are close to those of the observed sample. Second, we derive theoretical results for the Hermite expansion terms in the case of a two-dimensional stochastic process. Third, the empirical results show that for both the original and the interpolated samples of the Shanghai Stock Exchange's old pledged repo rate, the optimal SNP model is the Semiparametric AR(1)-GARCH(1,1) (i.e., 11118000) model, although the coefficient estimates for the two samples differ. Finally, the empirical results demonstrate the soundness and stability of the proposed improvement to the SNP model selection method.

10.
Existing empirical studies on the sacrifice ratio (measuring the output cost of disinflation) consider a large number of potential explanatory variables including the length of disinflation, various institutional settings, economic conditions, and the political climate. Some results are robust across different studies, while others are not. We address the presence of model uncertainty by using the Bayesian model averaging method to identify the important determinants of the sacrifice ratio, without relying on ad hoc model selection. Our results show that the length of disinflation is the most important variable. This supports the ‘cold turkey’ argument for faster disinflation.

11.
This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. It presents two alternative approaches that can be implemented using Gibbs sampling methods in a straightforward way and which allow one to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. A simulation study shows that the variable selection approaches tend to outperform existing Bayesian model averaging techniques in terms of both in-sample predictive performance and computational efficiency. The alternative approaches are compared in an empirical application using data on economic growth for European NUTS-2 regions.

12.
The abundant literature on the competing motives for holding international reserves stresses different factors, giving rise to a problem called model uncertainty. In this paper we search for the most important determinants of reserve holdings using data for 104 countries over the period 1999–2010 and evaluate their importance using Bayesian model averaging (BMA). We enrich the ongoing empirical discussion by examining the role of financial globalization and monetary policy and by introducing new variables and searching for alternatives to the traditional ones. The results confirm that trade openness and the broad-money-to-GDP ratio are the key determinants with a positive link to the level of reserves. On the other hand, financial development seems to lower the need for reserves.

13.
The identification of structural parameters in the linear instrumental variables (IV) model is typically achieved by imposing the prior identifying assumption that the error term in the structural equation of interest is orthogonal to the instruments. Since this exclusion restriction is fundamentally untestable, there are often legitimate doubts about the extent to which the exclusion restriction holds. In this paper I illustrate the effects of such prior uncertainty about the validity of the exclusion restriction on inferences based on linear IV models. Using a Bayesian approach, I provide a mapping from prior uncertainty about the exclusion restriction into increased uncertainty about parameters of interest. Moderate prior uncertainty about exclusion restrictions can lead to a substantial loss of precision in estimates of structural parameters. This loss of precision is relatively more important in situations where IV estimates appear to be more precise, for example in larger samples or with stronger instruments. I illustrate these points using several prominent recent empirical papers that use linear IV models. An accompanying electronic table allows users to readily explore the robustness of inferences to uncertainty about the exclusion restriction in their particular applications. Copyright © 2010 John Wiley & Sons, Ltd.

14.
Predicting the evolution of mortality rates plays a central role for life insurance and pension funds. Various stochastic frameworks have been developed to model mortality patterns by taking into account the main stylized facts driving these patterns. However, relying on the prediction of one specific model can be too restrictive and can lead to some well-documented drawbacks, including model misspecification, parameter uncertainty, and overfitting. To address these issues we first consider mortality modeling in a Bayesian negative-binomial framework to account for overdispersion and the uncertainty about the parameter estimates in a natural and coherent way. Model averaging techniques are then considered as a response to model misspecifications. In this paper, we propose two methods based on leave-future-out validation and compare them to standard Bayesian model averaging (BMA) based on marginal likelihood. An intensive numerical study is carried out over a large range of simulation setups to compare the performances of the proposed methodologies. An illustration is then proposed on real-life mortality datasets, along with a sensitivity analysis to a Covid-type scenario. Overall, we found that both methods based on an out-of-sample criterion outperform the standard BMA approach in terms of prediction performance and robustness.
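The general idea of weighting models by out-of-sample rather than in-sample fit can be sketched in a few lines: score each model by its leave-future-out log predictive densities on held-out periods, then exponentiate and normalize (a pseudo-BMA-style rule; the paper's two proposed methods are not reproduced here, and the input array is an assumed pre-computed quantity).

```python
import numpy as np

def lfo_pseudo_bma_weights(lfo_logliks):
    """Model weights from leave-future-out log predictive densities.

    lfo_logliks: (M, H) array of one-step-ahead log predictive densities for
    M models over H held-out future periods (assumed already computed).
    Returns pseudo-BMA-style weights summing to one.
    """
    elpd = lfo_logliks.sum(axis=1)      # total out-of-sample log score per model
    w = np.exp(elpd - elpd.max())       # subtract max before exponentiating for stability
    return w / w.sum()
```

Unlike marginal-likelihood BMA, these weights reward predictive performance on data the model never saw, which is the property the abstract credits for the improved robustness.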

15.
Sustainable economic development in the future is driven by public policy on regional, national and global levels. Therefore, a comprehensive policy analysis is needed that provides consistent and effective policy support. However, a general problem facing classical policy analysis is model uncertainty. All actors, those involved in the policy choice and those in the policy analysis, are fundamentally uncertain which of the different models corresponds to the true generative mechanism that represents the natural, economic, or social phenomena on which policy analysis is focused. In this paper, we propose a general framework that explicitly incorporates model uncertainty into the derivation of a policy choice. In practice, incorporating model uncertainty into the analysis is limited by the very high computational effort required. In this regard, we apply metamodeling techniques as a way to reduce computational complexity. We demonstrate the effect of different metamodel types using a reduced model for the case of CAADP in Senegal. Furthermore, we explicitly show that ignoring model uncertainty leads to inefficient policy choices and results in a large waste of public resources.

16.
The Dual Structure and Regional Differences in Chinese Households' Consumption Behavior under Liquidity Constraints   Cited by: 4 (self-citations: 0; citations by others: 4)
This paper builds a state-space model and a cross-province panel data model to compare how uncertainty and liquidity constraints affect the consumption behavior of Chinese households. The results show that, since 1978, income uncertainty has had a larger negative effect on the consumption behavior of urban households than on that of rural households; urban households face relatively weaker liquidity constraints than rural households; and for urban households the substitution effect of interest rates outweighs the income effect, whereas the opposite holds for rural households. Since 1995, income and expenditure uncertainty have had no significant effect on aggregate household consumption, but they are positively correlated with changes in household consumption in the eastern and central regions; overall, the liquidity constraints facing household consumption in China are not severe, although households in the central region face relatively stronger liquidity constraints.

17.
A common problem in applied regression analysis is that covariate values may be missing for some observations but imputed values may be available. This situation generates a trade-off between bias and precision: the complete cases are often disarmingly few, but replacing the missing observations with the imputed values to gain precision may lead to bias. In this paper, we formalize this trade-off by showing that one can augment the regression model with a set of auxiliary variables so as to obtain, under weak assumptions about the imputations, the same unbiased estimator of the parameters of interest as complete-case analysis. Given this augmented model, the bias-precision trade-off may then be tackled by either model reduction procedures or model averaging methods. We illustrate our approach by considering the problem of estimating the relation between income and the body mass index (BMI) using survey data affected by item non-response, where the missing values on the main covariates are filled in by imputations.

18.
This paper investigates the replicability of three important studies on growth theory uncertainty that employed Bayesian model averaging tools. We compare these results with estimates obtained using alternative, recently developed model averaging techniques. Overall, we successfully replicate all three studies, find that the sign and magnitude of these new estimates are reasonably close to those produced via traditional Bayesian methods and deploy a novel strategy to implement one of the new averaging estimators. Copyright © 2012 John Wiley & Sons, Ltd.

19.
This paper explores ways to integrate model uncertainty into policy evaluation. We describe a general framework that includes both model averaging methods as well as some measures that describe whether policies and their consequences are model dependent. These general ideas are then applied to assess simple monetary policy rules for some standard New Keynesian specifications. We conclude that the original Taylor rule has good robustness properties, but may reasonably be challenged in overall quality with respect to stabilization by alternative simple rules, even when these rules employ parameters that are set without accounting for model uncertainty.

20.
In this paper I describe the effect of parameter uncertainty on the way conditional forecast variances grow as the forecast horizon increases. Without parameter uncertainty, forecast variances for the unit root model grow linearly with the forecast horizon while with the trend stationary model they are bounded. With parameter uncertainty, however, I find that for both the unit root and the trend stationary models, forecast variances grow with the square of the forecast horizon so that uncertainty grows at a much faster rate than without parameter uncertainty.
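The quadratic growth described above can be illustrated by Monte Carlo in an assumed toy setup: the point forecast of a unit-root AR(1) at horizon h is rho_hat**h * y_T, so if rho is known to be 1 the estimated coefficient contributes no forecast variance, while sampling noise in rho_hat makes the variance of the point forecast grow roughly with h**2 (by the delta method). The numbers below are illustrative inputs, not values from the paper.

```python
import numpy as np

def forecast_variance_growth(yT=10.0, rho_sd=0.05, horizons=(5, 10, 20),
                             n_draws=100_000, seed=0):
    """Variance of the h-step point forecast rho_hat**h * yT across draws of
    rho_hat ~ N(1, rho_sd**2) (an assumed sampling distribution of the AR(1)
    coefficient estimate). Returns {horizon: variance}."""
    rng = np.random.default_rng(seed)
    rho_hat = rng.normal(1.0, rho_sd, n_draws)
    return {h: np.var(rho_hat ** h * yT) for h in horizons}
```

Doubling the horizon roughly quadruples (or more, given the nonlinearity in rho_hat**h) the variance contributed by parameter uncertainty, in contrast to the linear growth of the innovation variance alone.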


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号