Similar Articles (20 results)
1.
In this paper we try to clarify whether the use of Box-Jenkins methods would have improved the forecasting performance in Austria during the recession of 1975. For this purpose we estimate ARIMA models for gross national product, private consumption, investment in plant and equipment, and inventory investment. We then compare the forecasts derived from these models with the results of more conventional forecasting techniques. Box-Jenkins methods cannot be expected to predict a business cycle turning point, but as soon as the recession was under way they adapted to the new situation faster than conventional techniques did. We found that the accuracy of Box-Jenkins predictions depends to a large extent on the length of the forecasting horizon; our results suggest that it should not exceed one year. All in all, Box-Jenkins methods applied together with the forecasting techniques already in use could further improve forecasting performance.
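The Box-Jenkins workflow the abstract describes (estimate a low-order model, forecast a short horizon ahead) can be sketched on a synthetic AR(1) series; none of the Austrian national-accounts data from the paper is used here, and the coefficient 0.6 is an illustrative assumption.

```python
import numpy as np

# Synthetic AR(1) series standing in for a quarterly growth series.
rng = np.random.default_rng(0)
T = 200
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + rng.normal()

# Estimate the AR(1) coefficient by conditional least squares.
phi = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])

# Iterated h-step-ahead forecasts phi**h * y[T-1] decay toward the
# unconditional mean, one reason accuracy deteriorates with the
# horizon, echoing the one-year-horizon advice above.
horizons = np.arange(1, 5)
forecasts = phi ** horizons * y[-1]
```

The shrinking forecasts illustrate why such models adapt quickly once a downturn is under way but cannot anticipate the turning point itself.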

2.
We employ a 10-variable dynamic stochastic general equilibrium (DSGE) model, estimated using Bayesian methods, to forecast the US real house price index as well as its downturn in 2006:Q2. We also examine various Bayesian and classical time-series models in our forecasting exercise for comparison with the DSGE model. In addition to standard vector autoregressive (VAR) and Bayesian VAR models, we also include the information content of either 10 or 120 quarterly series in some models to capture the influence of fundamentals. We consider two approaches for including information from large data sets: extracting common factors (principal components) in factor-augmented VAR or Bayesian factor-augmented VAR models, and Bayesian shrinkage in a large-scale Bayesian VAR model. We compare the out-of-sample forecast performance of the alternative models using the average root mean squared error (RMSE) of the forecasts. We find that the small-scale Bayesian-shrinkage model (10 variables) outperforms the other models, including the large-scale Bayesian-shrinkage model (120 variables). In addition, when we use simple average forecast combinations, the combination forecast using the 10 best atheoretical models produces the minimum RMSEs compared to each of the individual models, followed closely by the combination forecast using the 10 atheoretical models and the DSGE model. Finally, we use each model to forecast the downturn point in 2006:Q2, using the model estimated through 2005:Q2. Only the DSGE model actually forecasts a downturn with any accuracy, suggesting that forward-looking microfounded DSGE models of the housing market may prove crucial in forecasting turning points.

3.
This paper considers methods for forecasting macroeconomic time series in a framework where the number of predictors, N, is too large to apply traditional regression models but not sufficiently large to resort to statistical inference based on double asymptotics. Our interest is motivated by a body of empirical research suggesting that popular data-rich prediction methods perform best when N ranges from 20 to 40. In order to accomplish our goal, we resort to partial least squares and principal component regression to consistently estimate a stable dynamic regression model with many predictors as only the number of observations, T, diverges. We show both by simulations and empirical applications that the considered methods, especially partial least squares, compare well to models that are widely used in macroeconomic forecasting.
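Principal component regression of the general kind discussed above (partial least squares is analogous) can be sketched on a synthetic two-factor design; the sample sizes, factor structure, and coefficients below are made-up assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 200, 30                         # many predictors, fixed N, growing T

# Two latent factors drive both the predictors and the target.
F = rng.normal(size=(T, 2))
X = F @ rng.normal(size=(2, N)) + 0.1 * rng.normal(size=(T, N))
y = F @ np.array([1.0, -0.5]) + 0.1 * rng.normal(size=T)

# Principal component regression: regress y on the first k PCs of X.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
pcs = Xc @ Vt[:k].T                    # estimated factors
Z = np.column_stack([np.ones(T), pcs])
beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
fitted = Z @ beta
r2 = 1.0 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
```

Because the predictors share a low-dimensional factor structure, a handful of components captures nearly all of the predictable variation in y.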

4.
Global vector autoregressions (GVARs) have several attractive features: multiple potential channels for the international transmission of macroeconomic and financial shocks, a standardized economically appealing choice of variables for each country or region examined, systematic treatment of long-run properties through cointegration analysis, and flexible dynamic specification through vector error correction modeling. Pesaran et al. (2009) generate and evaluate forecasts from a paradigm GVAR with 26 countries, based on Dées, di Mauro et al. (2007). The current paper empirically assesses the GVAR in Dées, di Mauro et al. (2007) with impulse indicator saturation (IIS), a new generic procedure for evaluating parameter constancy, which is a central element in model-based forecasting. The empirical results indicate substantial room for an improved, more robust specification of that GVAR. Some tests are suggestive of how to achieve such improvements.

5.
The purpose of this study is to determine, with a dynamic simultaneous equations model, the relative importance of the most significant socioeconomic forces behind the large-scale labor migration from the South to the North of Italy from 1952 to 1976, and to analyze its implications for the past and prospective development of the South. The model is estimated by Full Information Maximum Likelihood and validated by dynamic simulation, with emphasis on dynamic policy simulations; some forecasting results are also presented.

6.
Although there have been many evaluations of the Federal Reserve's Greenbook forecasts, we analyze them along a different dimension. We examine the revisions of these forecasts in the context of fixed-event predictions to determine how new information is incorporated in the forecasting process. This analysis permits us to determine whether there was an inefficient use of information, in the sense that the forecast revision has predictive power for the forecast error. Research on forecast smoothing suggests that we might find a positive relationship between the forecast error and the forecast revision. Although we do find that for some variables and horizons the Fed's forecast errors are predictable from its forecast revisions, there is no evidence of forecast smoothing. Instead, the revisions sometimes have a negative relationship with the forecast error, suggesting that in these cases the Fed may be over-responsive to new information.

7.
In a recent study, Chowdhry et al. (American Economic Review 95: 255–276, 2005) suggest that the empirical failure of relative purchasing power parity (PPP) may arise because official inflation data are too sticky. After extracting unobservable pure price inflation from equity markets, they find strong evidence supporting relative PPP in the short run. As a replication study, this paper first replicates their original findings successfully. We then investigate whether long-run relative PPP holds using the pure price inflation data constructed by Chowdhry et al. (2005). After constructing pure real exchange rate series from their pure price inflation data, we apply both unit root and cointegration tests to the pure real exchange rates. The test results suggest that relative PPP does not hold in the long run; it may therefore be too early to declare the PPP puzzle resolved.

8.
Research in Economics, 2014, 68(1): 70–83
Consumers become indecisive when facing too many choices. Economic analysis suggests that when a decision involves an uncertain outcome, can be delayed, and is irreversible, there is a real option in the cost–benefit analysis. For example, the option to keep a consumer's purchasing decision alive has a significant value: it allows the consumer to take advantage of any future advantageous deals while avoiding bad choices, and this renders the consumer more hesitant. When a consumer decides to exercise his buying decision, he demands compensation for the loss of this option. Hence, the benefits of a purchase must exceed its costs by a wide margin (the option value). Data from a survey at a Turkish university on hypothetical purchase decisions confirmed the existence of this real option. We conclude with marketing policy recommendations and future research directions. The connection to Prospect Theory is briefly explored. Note: although the third-person singular pronoun he/his is used throughout to describe the consumer, it is intended to be gender-neutral.

9.
We employ a multiple testing technique to identify the countries for which purchasing power parity (PPP) held over the last century. The approach controls the multiplicity problem inherent in simultaneously testing for PPP on several time series, thereby avoiding spurious rejections. It has higher power than traditional multiple testing techniques because it exploits the dependence structure between the countries with a bootstrap approach. Our results show that, plausibly, controlling for multiplicity in this way leads to a number of rejections of the null that is intermediate between that of traditional multiple testing techniques and that obtained by testing the null on each single time series at some level α. Research supported by Ruhr Graduate School in Economics and DFG under Sonderforschungsbereich 475.

10.
Expanding the work of Marchetti and Modis on Lotka-Volterra competition systems, a general model of Interaction Systems (IS) is introduced to describe the dynamics of multiple-member interactions among different populations, covering not only biological systems but other types of systems as well. The new IS model provides a general framework for analysis and forecasting in which all parameters, variables, and interactions have real meaning, using basic knowledge of each system. The proposed model can be applied to many different fields, including economic, business, social, and physical phenomena, yielding both numerical estimates and qualitative insights into the system's dynamics. This is illustrated in two case studies. In the first, the IS model is applied to elementary chemical reactions in order to quantify the reactions' kinetics; the result is the well-known rate law of chemical reaction kinetics, providing evidence of the proposed model's validity. In the second, the IS model is applied to the global economy. The resulting model is tested against real global GDP data. The new IS model gave reliable estimates and proved considerably more accurate than a comparable forecast of global GDP based on the logistic growth model. Furthermore, the new model offers a basic framework for understanding the nature of major economic shifts, including the global recession of 2009, by studying the dynamic relationship between demand and supply.
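A minimal two-population Lotka-Volterra competition system, the building block that the IS model generalises, can be simulated by Euler integration; the growth rates, carrying capacities, and interaction coefficients below are illustrative assumptions, not values estimated from any of the paper's data.

```python
import numpy as np

# Two competing populations x[0], x[1]:
#   dx_i/dt = r_i * x_i * (1 - (sum_j a[i, j] * x_j) / K_i)
r = np.array([0.8, 0.6])          # intrinsic growth rates
K = np.array([100.0, 80.0])       # carrying capacities
a = np.array([[1.0, 0.5],         # a[i, j]: effect of population j on i
              [0.4, 1.0]])

x = np.array([5.0, 5.0])          # small initial populations
dt = 0.01
for _ in range(20000):            # integrate to t = 200
    x = x + dt * r * x * (1.0 - (a @ x) / K)
```

With weak cross-competition (0.5 and 0.4) the system settles at the stable coexistence equilibrium solving a @ x = K, here x = (75, 50); stronger competition would instead drive one population out, the kind of qualitative regime shift the IS framework is used to study.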

11.
Using PIRLS (Progress in International Reading Literacy Study) data, we investigate which countries' schools can be classified as significantly better or weaker than Germany's as regards the reading literacy of primary school children. The ‘standard’ approach is to conduct separate tests for each country relative to the reference country (Germany) and to reject the null of equally good schools for all those countries whose p-value satisfies pi ≤ 0.05. We demonstrate that this approach ignores the multiple testing nature of the problem and thus overstates differences between schooling systems by producing unwarranted rejections of the null. We employ various multiple testing techniques to remedy this problem. The results suggest that the ‘standard’ approach may overstate the number of significantly different countries by up to 30%.
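The multiplicity problem described above can be illustrated with the Holm step-down procedure, one standard family-wise correction (the paper's own bootstrap-based techniques are more powerful); the p-values below are made-up numbers, not PIRLS results.

```python
import numpy as np

def holm_rejections(pvals, alpha=0.05):
    """Holm step-down procedure: compare the ordered p-values to
    alpha/m, alpha/(m-1), ... and stop at the first failure. This
    controls the family-wise error rate at level alpha."""
    order = np.argsort(pvals)
    m = len(pvals)
    rejected = np.zeros(m, dtype=bool)
    for rank, idx in enumerate(order):
        if pvals[idx] <= alpha / (m - rank):
            rejected[idx] = True
        else:
            break
    return rejected

# Illustrative p-values from m = 8 country-vs-reference comparisons.
p = np.array([0.001, 0.004, 0.012, 0.030, 0.045, 0.200, 0.500, 0.900])
naive = int((p <= 0.05).sum())         # per-test 'standard' approach
holm = int(holm_rejections(p).sum())   # multiplicity-corrected
```

Here the per-test rule rejects 5 nulls while Holm rejects only 2, exactly the kind of overstatement the abstract quantifies.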

12.
The Box-Jenkins approach to time series analysis, which is an efficient way of analyzing stationary time series, recommends differencing as a general method for transforming a nonstationary time series into a stationary one. This paper gives a methodological discussion of some other ways of transforming a nonstationary series, in particular removing linear trends. It is argued that in many cases removing trends is superior to differencing in several respects. For example, when the process generating the time series is an ARMA(p,q) process added to a linear trend, differencing will produce an ARMA(p,q + 1) process that violates the invertibility conditions and is therefore difficult to estimate. The discussion is extended to time series with seasonal patterns.

13.
Monthly retail unit sales of clothes washers and dryers in eastern Washington state were regressed on average employment during the current and preceding two months, the average advertised price of these appliances relative to the Consumer Price Index, the expected change in consumer stocks of such appliances for each year on the basis of a hypothetical rate of accumulation, and the unit volume of newspaper advertising, in order to determine the viability of alternative models for analysing and forecasting monthly sales in a localized retail market during periods of growth, maturity, and decline in market demand.

Employment and advertising, the two most useful variables in the functions tested, were able to account for 70–75 per cent of the monthly sales variance for automatic washers and dryers. Employment elasticity coefficients, though not linearly related to income elasticity coefficients for these products, appear to be a useful measure of purchasing power for localized demand and forecasting functions. The average advertising elasticity for the products studied was 0.095, which was approximately equal to advertising expenditures as a ratio of the gross margin of furniture and house furnishings corporations and kitchen appliance departments of department stores for the period studied, in keeping with the Rasmussen hypothesis. Correction of the models for the presence of autocorrelation altered the explanatory power of some of the variables substantially and resulted in regression coefficients for some variables that were more nearly consistent with theoretical expectations.

14.
We argue that risk aversion driven by exchange-rate uncertainty causes a wedge between the domestic and foreign prices of a homogeneous good. We test our hypothesis using a unique micro-data set from a market with minimum imperfections. The empirical findings validate our hypothesis, as accounting for exchange-rate uncertainty we are able to explain a significant proportion of deviations from the law of one price. Overall, our analysis suggests the possibility of a new solution to the purchasing power parity puzzles.

15.
This article addresses the issue of inference in time-varying parameter regression models in the presence of many predictors and develops a novel dynamic variable selection strategy. The proposed variational Bayes dynamic variable selection algorithm allows for assessing at each time period in the sample which predictors are relevant (or not) for forecasting the dependent variable. The algorithm is used to forecast inflation using over 400 macroeconomic, financial, and global predictors, many of which are potentially irrelevant or short-lived. The new methodology is able to ensure parsimonious solutions to this high-dimensional estimation problem, which translate into excellent forecast performance.

16.
Application of a Combination Forecasting Model to Regional Logistics Demand Forecasting
朱帮助 (Zhu Bangzhu), 《经济地理》 (Economic Geography), 2008, 28(6): 952–954
To address the shortcomings of single forecasting methods for regional logistics demand, this paper proposes a combination forecasting model based on forecasting effectiveness: by combining the forecasts of several individual models, it exploits the strengths of each and improves forecast accuracy. Taking Jiangmen City in Guangdong Province as an example, logistics demand is forecast with a linear regression model, a grey GM(1,1) model, and the combination model; the empirical results show that the combination model for regional logistics demand achieves higher forecast accuracy.
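One simple version of such forecast combination weights each individual model inversely by its historical mean squared error; the actual series and the two individual forecasts below are made-up illustrative numbers, not the Jiangmen data.

```python
import numpy as np

# Hypothetical actuals and two individual model forecasts
# (e.g. a regression model and a grey model).
actual = np.array([102.0, 108.0, 115.0, 121.0, 130.0])
f_regression = np.array([100.0, 105.0, 112.0, 118.0, 126.0])
f_grey = np.array([105.0, 110.0, 118.0, 125.0, 133.0])

# Weights inversely proportional to each model's MSE, normalised to 1.
mse = np.array([np.mean((actual - f_regression) ** 2),
                np.mean((actual - f_grey) ** 2)])
w = (1.0 / mse) / np.sum(1.0 / mse)

combined = w[0] * f_regression + w[1] * f_grey
mse_combined = np.mean((actual - combined) ** 2)
```

Because one model under-predicts and the other over-predicts, their weighted combination cancels much of the error: the combined MSE (0.2 here) is far below either individual MSE (9.4), the effect the abstract reports.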

17.
Recent developments in investment research have highlighted the importance of non-convexities and irreversibilities in firms’ adjustment of quasi-fixed inputs. Aggregation across capital goods may smooth out the discontinuities associated with the adjustment of individual assets. Lack of suitable data is one of the reasons why empirical work has typically relied on the assumption of capital homogeneity. In this paper we exploit a data set of 1539 Italian firms which allows us to disaggregate capital into equipment and structures, and purchases and sales of assets. We construct measures of fundamental Q to capture investment opportunities associated with each asset. We uncover the pattern of dynamic adjustment by using non-parametric techniques to relate each individual investment to its own fundamental Q.

18.
In this paper we propose ridge regression estimators for probit models, since the commonly applied maximum likelihood (ML) method is sensitive to multicollinearity. An extensive Monte Carlo study is conducted in which the performance of the ML method and the probit ridge regression (PRR) estimator is investigated when the data are collinear. In the simulation study we evaluate a number of methods of estimating the ridge parameter k that have recently been developed for use in linear regression analysis. The results show that there is at least one group of estimators of k that regularly has a lower mean squared error than the ML method in all of the situations evaluated. Finally, we demonstrate the benefit of the new method using the classical Dehejia and Wahba dataset, which is based on a labour market experiment.
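A ridge-penalized probit of the general kind described above can be sketched by adding a quadratic penalty to the probit log-likelihood; the synthetic collinear design and the fixed ridge parameter k = 5 are illustrative assumptions, not the paper's data-driven estimators of k.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 400
# Two highly collinear regressors: the setting where plain ML is unstable.
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
true_beta = np.array([0.2, 0.5, 0.5])
y = (X @ true_beta + rng.normal(size=n) > 0).astype(float)

def penalized_negloglik(beta, k):
    """Negative probit log-likelihood plus a ridge penalty
    k * ||slopes||^2 (the intercept is left unpenalized)."""
    p = norm.cdf(X @ beta)
    eps = 1e-9
    ll = np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return -ll + k * np.sum(beta[1:] ** 2)

ml = minimize(penalized_negloglik, np.zeros(3), args=(0.0,)).x     # k = 0
ridge = minimize(penalized_negloglik, np.zeros(3), args=(5.0,)).x  # k = 5
```

Because the penalized objective trades a little likelihood for a smaller coefficient norm, the ridge slopes are never larger (in squared norm) than the ML slopes, which is what stabilises the estimator under collinearity.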

19.
Rangan Gupta, Applied Economics, 2013, 45(33): 4677–4697
This article considers the ability of large-scale (involving 145 fundamental variables) time-series models, estimated by dynamic factor analysis and Bayesian shrinkage, to forecast real house price growth rates of the four US census regions and the aggregate US economy. Besides the standard Minnesota prior, we also use additional priors that constrain the sum of coefficients of the VAR models. We compare 1- to 24-months-ahead forecasts of the large-scale models over an out-of-sample horizon of 1995:01–2009:03, based on an in-sample period of 1968:02–1994:12, relative to a random walk model, a small-scale VAR model comprising just the five real house price growth rates, and a medium-scale VAR model containing 36 of the 145 fundamental variables in addition to the five real house price growth rates. Beyond the forecast comparison across small-, medium- and large-scale models, we also examine the ability of the ‘optimal’ model (the model producing the minimum average mean squared forecast error) for a specific region to predict ex ante real house prices (in levels) over the period 2009:04 until 2012:02. Factor-based models (classical or Bayesian) perform best for the North East, Mid-West and West census regions and the aggregate US economy, and perform as well as a small-scale VAR for the South region. The ‘optimal’ factor models also tend to predict the downward trend in the data in the ex ante forecasting exercise. Our results highlight the importance of the information content of a large number of fundamentals in predicting house prices accurately.

20.
Does Benford’s Law hold in economic research and forecasting?
First and higher-order digits in data sets from natural and socio-economic processes often follow a distribution called Benford's law. This phenomenon has been used in business and scientific applications, especially in fraud detection for financial data. In this paper, we analyse whether Benford's law holds in economic research and forecasting. First, we examine the distribution of regression coefficients and standard errors in research papers published in Empirica and Applied Economics Letters. Second, we analyse forecasts of GDP growth and CPI inflation in Germany published in Consensus Forecasts. There are two main findings: the relative frequencies of the first and second digits in economic research are broadly consistent with Benford's law; in sharp contrast, the second digits of Consensus Forecasts exhibit a massive excess of zeros and fives, raising doubts about their information content.
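The first-digit version of Benford's law assigns digit d the probability log10(1 + 1/d); the check below uses synthetic scale-free data (magnitudes uniform on a log scale), not the regression coefficients or Consensus Forecasts examined in the paper.

```python
import numpy as np

# Expected Benford frequencies for the first significant digit 1..9.
digits = np.arange(1, 10)
benford = np.log10(1.0 + 1.0 / digits)

# Data whose log10 magnitude is uniform over several orders of
# magnitude follow Benford's law exactly in expectation.
rng = np.random.default_rng(5)
data = 10.0 ** rng.uniform(0.0, 6.0, size=100_000)

# First significant digit: divide by the largest power of 10 below x.
first = (data // 10.0 ** np.floor(np.log10(data))).astype(int)
freq = np.bincount(first, minlength=10)[1:10] / len(data)
```

The observed frequencies track the Benford probabilities closely, with digit 1 appearing about 30% of the time and digit 9 under 5%, which is the benchmark against which the paper's digit counts are compared.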


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号