Similar Literature
20 similar documents found.
1.
This paper considers the problem of forecasting under continuous and discrete structural breaks and proposes weighting observations to obtain optimal forecasts in the mean squared forecast error (MSFE) sense. We derive optimal weights for one-step-ahead forecasts. Under continuous breaks, our approach largely recovers exponential smoothing weights. Under discrete breaks, we provide analytical expressions for optimal weights in models with a single regressor, and asymptotically valid weights for models with more than one regressor. It is shown that in these cases the optimal weight is the same across observations within a given regime and differs only across regimes. In practice, where information on structural breaks is uncertain, a forecasting procedure based on robust optimal weights is proposed. The relative performance of our proposed approach is investigated using Monte Carlo experiments and an empirical application to forecasting real GDP using the yield curve across nine industrial economies.
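For intuition, here is a minimal sketch (not the authors' derivation) of the exponential-smoothing-type weighting that the paper recovers under continuous breaks: a one-step-ahead forecast formed as a weighted average with geometrically declining weights. The decay parameter `lam` and the simulated single break are hypothetical choices for illustration only.

```python
import numpy as np

def exp_weighted_forecast(y, lam=0.95):
    """Forecast y_{T+1} as a weighted average with weights proportional to lam**(T-t)."""
    T = len(y)
    w = lam ** np.arange(T - 1, -1, -1)   # oldest observation gets the smallest weight
    w /= w.sum()
    return float(np.dot(w, y))

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 1.0, 80),    # pre-break regime
                    rng.normal(1.5, 1.0, 40)])   # post-break regime
print(exp_weighted_forecast(y, lam=0.95))        # pulled towards the post-break mean
```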

2.
We develop an easy-to-implement method for forecasting a stationary autoregressive fractionally integrated moving average (ARFIMA) process subject to structural breaks with unknown break dates. We show that an ARFIMA process subject to a mean shift and a change in the long memory parameter can be well approximated by an autoregressive (AR) model, and suggest using an information criterion (AIC or Mallows' Cp) to choose the order of the approximate AR model. Our method avoids the issue of estimation inaccuracy of the long memory parameter and the issue of spurious breaks in finite samples. Insights from our theoretical analysis are confirmed by Monte Carlo experiments, through which we also find that our method provides a substantial improvement over existing prediction methods. An empirical application to the realized volatility of three exchange rates illustrates the usefulness of our forecasting procedure. The empirical success of the HAR-RV model can be explained, from an econometric perspective, by our theoretical and simulation results.
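A minimal sketch of the approximation idea under stated assumptions: simulate a crudely truncated ARFIMA(0, d, 0) series with a mid-sample mean shift, let AIC pick the order of an approximating AR model via statsmodels, and produce a one-step forecast. The simulation design and the maximum lag are hypothetical and only illustrative.

```python
import numpy as np
from statsmodels.tsa.ar_model import ar_select_order

rng = np.random.default_rng(1)
n, d = 500, 0.3
j = np.arange(1, n)
psi = np.concatenate(([1.0], np.cumprod((j - 1 + d) / j)))   # MA weights of (1-L)^(-d), truncated
eps = rng.standard_normal(2 * n)
y = np.array([psi @ eps[t : t - n : -1] for t in range(n, 2 * n)])
y[n // 2 :] += 1.0                                           # mean shift halfway through

sel = ar_select_order(y, maxlag=20, ic="aic")                # AIC-chosen AR order
res = sel.model.fit()
print("selected AR lags:", sel.ar_lags)
print("one-step-ahead forecast:", res.predict(start=len(y), end=len(y)))
```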

3.
We propose new methods for evaluating predictive densities. The methods include Kolmogorov–Smirnov and Cramér–von Mises-type tests for the correct specification of predictive densities, robust to dynamic mis-specification. The novelty is that the tests can detect mis-specification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities, even when it is time-varying. An application to density forecasts of the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies.
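The standard building block behind such evaluations (not the authors' robust statistics) is a Kolmogorov–Smirnov test applied to the probability integral transforms (PITs) of the realisations under the predictive density, which should be uniform if the density is correctly specified. The Gaussian predictive density and the mis-specified variance below are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
T = 300
mu_fcst = np.zeros(T)                 # hypothetical one-step-ahead predictive means
sig_fcst = np.ones(T)                 # hypothetical predictive standard deviations
y = rng.normal(0.0, 1.3, T)           # realisations: the variance is mis-specified

pit = stats.norm.cdf(y, loc=mu_fcst, scale=sig_fcst)   # probability integral transforms
ks = stats.kstest(pit, "uniform")                      # H0: PITs are U(0, 1)
print(ks.statistic, ks.pvalue)
```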

4.
This paper conducts a broad-based comparison of iterated and direct multi-period forecasting approaches applied to both univariate and multivariate models in the form of parsimonious factor-augmented vector autoregressions. To account for serial correlation in the residuals of the multi-period direct forecasting models, we propose a new SURE-based estimation method and modified Akaike information criteria for model selection. Empirical analysis of the 170 variables studied by Marcellino, Stock and Watson (2006) shows that information in factors helps improve forecasting performance for most types of economic variables, although it can also lead to larger biases. It also shows that SURE estimation and finite-sample modifications to the Akaike information criterion can improve the performance of the direct multi-period forecasts.
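To fix ideas, a toy contrast between the two approaches for an AR(1), leaving aside the paper's factor-augmented models and SURE estimation: the iterated forecast powers up the one-step coefficient h times, while the direct forecast regresses y_{t+h} on y_t and forecasts in one shot.

```python
import numpy as np

rng = np.random.default_rng(3)
T, h, phi = 400, 4, 0.7
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.standard_normal()

b1 = np.polyfit(y[:-1], y[1:], 1)[0]   # one-step AR(1) slope by OLS
iterated = (b1 ** h) * y[-1]           # iterate the one-step model h times

bh = np.polyfit(y[:-h], y[h:], 1)[0]   # direct h-step regression of y_{t+h} on y_t
direct = bh * y[-1]
print(iterated, direct)
```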

5.
Economic sentiment surveys are carried out by all European Union member states and are often seen as early indicators of future economic developments. Based on these surveys, the European Commission constructs an aggregate European Economic Sentiment Indicator (ESI). This paper compares the ESI with more sophisticated aggregation schemes based on statistical methods: dynamic factor analysis and partial least squares. The indicator based on partial least squares clearly outperforms the other two indicators in terms of comovement with economic activity. In terms of forecasting ability, the ESI, though constructed in a rather ad hoc way, can compete with the other indicators.
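A sketch of the partial-least-squares aggregation idea with scikit-learn on simulated survey balances; the dimensions, variable names and data-generating process are hypothetical stand-ins for the Commission's survey data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
T, k = 200, 15
factor = rng.standard_normal(T)                                   # latent business cycle
surveys = np.outer(factor, rng.uniform(0.5, 1.5, k)) + rng.standard_normal((T, k))
gdp_growth = 0.8 * factor + 0.3 * rng.standard_normal(T)

pls = PLSRegression(n_components=1)
pls.fit(surveys, gdp_growth)                                      # supervised aggregation
sentiment_indicator = pls.transform(surveys).ravel()              # first PLS component
print(np.corrcoef(sentiment_indicator, gdp_growth)[0, 1])
```

Unlike an equal-weight aggregate, the PLS weights are chosen with the target series in mind, which is the intuition behind the comovement gains discussed above.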

6.
We consider model selection facing uncertainty over the choice of variables and over the occurrence and timing of multiple location shifts. General-to-simple selection is extended by adding an impulse indicator for every observation to the set of candidate regressors: see Johansen and Nielsen (2009). We apply that approach to a fat-tailed distribution and to processes with breaks: Monte Carlo experiments show its capability of detecting up to 20 shifts in 100 observations while jointly selecting variables. An illustration using US real interest rates compares impulse-indicator saturation with the procedure of Bai and Perron (1998).
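A rough split-half illustration of the impulse-indicator saturation (IIS) device, not the Autometrics selection algorithm used in the paper: add an impulse dummy for every observation in one half of the sample, retain the significant ones, and repeat for the other half. The regression design and the retention rule below are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
T = 100
x = rng.standard_normal(T)
y = 1.0 + 0.5 * x + rng.standard_normal(T)
y[40:] += 2.0                                        # location shift at t = 40

def significant_impulses(idx, y, x, crit=2.58):
    """Add an impulse dummy for each observation in idx and keep the significant ones."""
    dummies = np.zeros((len(y), len(idx)))
    dummies[idx, np.arange(len(idx))] = 1.0
    X = sm.add_constant(np.column_stack([x, dummies]))
    t_stats = sm.OLS(y, X).fit().tvalues[2:]         # t-statistics on the dummies
    return idx[np.abs(t_stats) > crit]

half1, half2 = np.arange(0, T // 2), np.arange(T // 2, T)
keep = np.concatenate([significant_impulses(half1, y, x),
                       significant_impulses(half2, y, x)])
print("retained impulse dates:", keep)
```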

7.
In this paper we consider tests, inter alia those of Kim [2000. Detection of change in persistence of a linear time series. Journal of Econometrics 95, 97–116], for the null of (trend-) stationarity against the alternative of a change in persistence at some (known or unknown) point in the observed sample, either from I(0) to I(1) behaviour or vice versa. We show that in circumstances where the innovation process displays non-stationary unconditional volatility of a very general form, which includes single and multiple volatility breaks as special cases, the ratio-based statistics used to test for persistence change do not have pivotal limiting null distributions. Numerical evidence suggests that this can cause severe over-sizing in the tests. In practice it may therefore be hard to discriminate between persistence change processes and processes with constant persistence but which display time-varying unconditional volatility. We solve the identified inference problem by proposing wild bootstrap-based implementations of the tests. Monte Carlo evidence suggests that the bootstrap tests perform well in finite samples. An empirical illustration using US price inflation data is provided.
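The corrective device is the wild bootstrap, which rebuilds the series from residuals multiplied by external random signs so that any pattern of unconditional volatility is carried into the bootstrap samples. The sketch below illustrates the mechanism with a placeholder variance-ratio statistic, not Kim's persistence-change statistic.

```python
import numpy as np

def wild_bootstrap_pvalue(resid, statistic, stat_obs, n_boot=999, seed=0):
    """p-value from wild-bootstrap resampling with Rademacher weights."""
    rng = np.random.default_rng(seed)
    count = 0
    for _ in range(n_boot):
        signs = rng.choice([-1.0, 1.0], size=len(resid))    # Rademacher draws
        boot_series = np.cumsum(resid * signs)               # series rebuilt under the null
        count += statistic(boot_series) >= stat_obs
    return (1 + count) / (1 + n_boot)

rng = np.random.default_rng(6)
e = np.concatenate([rng.normal(0, 1, 100), rng.normal(0, 3, 100)])     # volatility break
stat = lambda s: np.var(np.diff(s)[100:]) / np.var(np.diff(s)[:100])   # placeholder statistic
print(wild_bootstrap_pvalue(e, stat, stat(np.cumsum(e))))
```

Because the bootstrap samples inherit the volatility break, the large observed variance ratio is no longer judged against a homoskedastic reference distribution.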

8.
When location shifts occur, cointegration-based equilibrium-correction models (EqCMs) face forecasting problems. We consider alleviating such forecast failure by updating, intercept corrections, differencing, and estimating the future progress of an 'internal' break. Updating leads to a loss of cointegration when an EqCM suffers an equilibrium-mean shift, but helps when collinearities are changed by an 'external' break with the EqCM staying constant. Both mechanistic corrections help compared to retaining a pre-break estimated model, but an estimated model of the break process could outperform both. We apply the approaches to EqCMs for UK M1, compared with updating a learning function as the break evolves.
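A minimal sketch of a mechanistic intercept correction, assuming some already-fitted one-step forecast rule `forecast`: add the most recent in-sample forecast error back onto the next forecast so that part of an equilibrium-mean shift is carried forward. The AR(1)-style rule and the simulated shift are hypothetical.

```python
import numpy as np

def intercept_corrected_forecast(y, forecast):
    """One-step forecast plus the most recent observed one-step forecast error."""
    last_error = y[-1] - forecast(y[-2])
    return forecast(y[-1]) + last_error

rng = np.random.default_rng(7)
y = np.concatenate([rng.normal(0, 1, 80), rng.normal(3, 1, 20)])   # mean shift near the end
pre_break_rule = lambda y_t: 0.5 * y_t                             # hypothetical pre-break model
print(intercept_corrected_forecast(y, pre_break_rule), pre_break_rule(y[-1]))
```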

9.
It is well known that for continuous time models with a linear drift, standard estimation methods yield biased estimators of the mean reversion parameter both in finite discrete samples and in large in-fill samples. In this paper, we obtain two expressions to approximate the bias of the least squares/maximum likelihood estimator of the mean reversion parameter in the Ornstein–Uhlenbeck process with a known long run mean when discretely sampled data are available. The first expression mimics the bias formula of Marriott and Pope (1954) for the discrete time model. Simulations show that this expression does not work satisfactorily when the speed of mean reversion is slow. Slow mean reversion corresponds to the near unit root situation and is empirically realistic for financial time series. An improvement is made in the second expression, where a nonlinear correction term is included in the bias formula. It is shown that the nonlinear term is important in the near unit root situation. Simulations indicate that the second expression captures the magnitude, the curvature and the non-monotonicity of the actual bias better than the first expression.
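A simulation sketch of the bias being approximated (not the paper's analytical expressions): estimate the mean-reversion speed of a zero-mean Ornstein–Uhlenbeck process from discretely sampled data via the exact AR(1) regression and compare the average estimate with the truth. The quantity -2*phi/T is quoted only as a rough Marriott–Pope-style benchmark for the AR(1) coefficient bias when the mean is known.

```python
import numpy as np

rng = np.random.default_rng(8)
kappa, sigma, delta, T, n_rep = 0.1, 1.0, 1.0, 200, 2000
phi = np.exp(-kappa * delta)                          # exact discretisation coefficient
sd_e = sigma * np.sqrt((1 - phi**2) / (2 * kappa))    # innovation standard deviation

phi_hat = np.empty(n_rep)
for r in range(n_rep):
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + sd_e * rng.standard_normal()
    phi_hat[r] = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])  # LS with known zero mean

kappa_hat = -np.log(phi_hat) / delta
print("mean bias of kappa_hat:", kappa_hat.mean() - kappa)
print("rough AR(1) benchmark -2*phi/T:", -2 * phi / T)
```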

10.
We consider issues related to the order of an autoregression selected using information criteria. We study the sensitivity of the estimated order to (i) whether the effective number of observations is held fixed when estimating models of different order, (ii) whether the estimate of the variance is adjusted for degrees of freedom, and (iii) how the penalty for overfitting is defined in relation to the total sample size. Simulations show that the lag lengths selected by both the Akaike and the Schwarz information criteria are sensitive to these choices in finite samples. The methods that give the most precise estimates are those that hold the effective sample size fixed across models to be compared. Theoretical considerations reveal that this is indeed necessary for valid model comparisons. Guides to robust model selection are provided.
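A sketch of the "hold the effective sample fixed" recommendation: when comparing AR(p) models by AIC, condition every candidate on the same pmax initial observations so that all fits use exactly the same data points. The data-generating process and pmax are hypothetical.

```python
import numpy as np

def aic_fixed_sample(y, p, pmax):
    """AIC of an AR(p) with intercept, estimated on the last len(y) - pmax observations."""
    Teff = len(y) - pmax                               # identical for every candidate p
    Y = y[pmax:]
    X = np.column_stack([np.ones(Teff)] +
                        [y[pmax - j: len(y) - j] for j in range(1, p + 1)])
    resid = Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]
    sigma2 = resid @ resid / Teff
    return np.log(sigma2) + 2 * (p + 1) / Teff

rng = np.random.default_rng(9)
y = np.zeros(200)
for t in range(2, 200):                                # true order is 2
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

pmax = 8
print(min(range(1, pmax + 1), key=lambda p: aic_fixed_sample(y, p, pmax)))
```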

11.
In this paper we provide a joint treatment of two major problems that surround testing for a unit root in practice: uncertainty as to whether or not a linear deterministic trend is present in the data, and uncertainty as to whether the initial condition of the process is (asymptotically) negligible or not. We suggest decision rules based on the union of rejections of four standard unit root tests (ADF tests based on OLS and quasi-differenced demeaning and detrending), along with information regarding the magnitude of the trend and initial condition, to allow simultaneously for both trend and initial condition uncertainty.
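A deliberately simplified sketch of a union-of-rejections rule using only the two OLS-based ADF tests available in statsmodels; the paper's procedure also uses the quasi-differenced (GLS-type) variants and size-corrected critical values, which are omitted here.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(10)
y = np.cumsum(rng.standard_normal(250))            # a driftless random walk

stat_c, p_c = adfuller(y, regression="c", autolag="AIC")[:2]     # demeaned ADF
stat_ct, p_ct = adfuller(y, regression="ct", autolag="AIC")[:2]  # detrended ADF
alpha = 0.05
reject_union = (p_c < alpha) or (p_ct < alpha)     # reject the unit root if either test rejects
print(stat_c, stat_ct, reject_union)
```

Note that taking the union of rejections at nominal level alpha over-rejects unless the individual critical values are rescaled, which the paper's decision rules account for.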

12.
Many predictors employed in forecasting macroeconomic and financial variables display a great deal of persistence. Tests for determining the usefulness of these predictors are typically oversized, overstating their importance. Similarly, hypothesis tests on cointegrating vectors will typically be oversized if there is not an exact unit root. This paper uses a control variable approach in which adding stationary covariates with certain properties to the model can result in asymptotically normal inference for prediction regressions and cointegration vector estimates in the presence of possibly non-unit root trending covariates. The properties required for this result are derived and discussed.

13.
14.
We provide an extensive evaluation of the predictive performance of the US yield curve for US gross domestic product growth by using new tests for forecast breakdown, in addition to a variety of in-sample and out-of-sample evaluation procedures. Empirical research over the past decades has uncovered a strong predictive relationship between the yield curve and output growth, whose stability has recently been questioned. We document the existence of a forecast breakdown during the Burns–Miller and Volcker monetary policy regimes, whereas during the early part of the Greenspan era the yield curve emerged as a more reliable predictor of future economic activity.

15.
We analyze optimality properties of maximum likelihood (ML) and other estimators when the problem does not necessarily fall within the locally asymptotically normal (LAN) class, therefore covering cases that are excluded from conventional LAN theory, such as unit root nonstationary time series. The classical Hájek–Le Cam optimality theory is adapted to cover this situation. We show that the expectations of certain monotone "bowl-shaped" functions of the squared estimation error are minimized by the ML estimator in locally asymptotically quadratic situations, which often occur in nonstationary time series analysis when the LAN property fails. Moreover, we demonstrate a direct connection between the (Bayesian property of) asymptotic normality of the posterior and the classical optimality properties of ML estimators.

16.
The first two influential books on economic forecasting are by Henri Theil [1961, second edition 1965. Economic Forecasts and Policy. North-Holland, Amsterdam] and by George Box and Gwilym Jenkins [1970. Time Series Analysis, Forecasting and Control. Holden-Day, San Francisco]. Theil introduced advanced mathematical statistical techniques and considered a variety of types of data. Box and Jenkins introduced ARIMA models and showed how to use them to forecast. With these foundations, the field of economic forecasting has considered a wide range of techniques and models, wider and deeper information sets, longer horizons, and deeper questions, including how to better evaluate all forecasts and how to disentangle a forecast, a policy, and the outcomes. Originally, forecasts were just for means (or expectations), then moved to variances, and now consider predictive distributions. Eventually, multivariate distributions will have to be considered, but their evaluation will be difficult.

17.
In this paper we consider the problem of semiparametric efficient estimation in conditional quantile models with time series data. We construct an M-estimator which achieves the semiparametric efficiency bound recently derived by Komunjer and Vuong (forthcoming). Our efficient M-estimator is obtained by minimizing an objective function which depends on a nonparametric estimator of the conditional distribution of the variable of interest rather than its density. The estimator is new to the literature. We illustrate its performance through a Monte Carlo experiment.
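For orientation only, the standard (non-efficient) conditional-quantile M-estimator that such efficiency results improve upon: minimise the check loss, here via statsmodels' QuantReg on simulated heteroskedastic data. The efficient estimator of the paper additionally uses a nonparametric estimate of the conditional distribution, which is not implemented here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
T = 500
x = rng.standard_normal(T)
y = 1.0 + 0.5 * x + (1 + 0.5 * np.abs(x)) * rng.standard_normal(T)   # heteroskedastic errors

res = sm.QuantReg(y, sm.add_constant(x)).fit(q=0.25)   # 25% conditional quantile by check loss
print(res.params)
```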

18.
This paper introduces the concept of a risk parameter in conditional volatility models of the form εt = σt(θ0)ηt and develops statistical procedures to estimate this parameter. For a given risk measure r, the risk parameter is expressed as a function of the volatility coefficients θ0 and the risk, r(ηt), of the innovation process. A two-step method is proposed to successively estimate these quantities. An alternative one-step approach, relying on a reparameterization of the model and the use of a non-Gaussian QML, is proposed. Asymptotic results are established for smooth risk measures, as well as for the Value-at-Risk (VaR). Asymptotic comparisons of the two approaches for VaR estimation suggest a superiority of the one-step method when the innovations are heavy-tailed. For standard GARCH models, the comparison depends only on characteristics of the innovations distribution, not on the volatility parameters. Monte Carlo experiments and an empirical study illustrate the superiority of the one-step approach for financial series.
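A sketch of the two-step scheme (volatility coefficients first, innovation risk second) using the `arch` package and an empirical Value-at-Risk of the standardised residuals; the one-step reparameterised non-Gaussian QML studied in the paper is not implemented here, and the toy return series is hypothetical.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(12)
returns = rng.standard_t(df=5, size=1500) * 0.01              # heavy-tailed toy returns

res = arch_model(100 * returns, vol="GARCH", p=1, q=1).fit(disp="off")   # step 1: Gaussian QML
std_resid = res.resid / res.conditional_volatility            # estimated innovations eta_t
var_eta = -np.quantile(std_resid, 0.01)                       # step 2: empirical 1% VaR of eta

sigma_next = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
print("one-day 1% VaR (return units):", var_eta * sigma_next / 100)
```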

19.
Tests of ARCH are a routine diagnostic in empirical econometric and financial analysis. However, it is well known that misspecification of the conditional mean may lead to spurious rejection of the null hypothesis of no ARCH. Nonlinearity is a prime example of this phenomenon. There is little work on the extent of the effect of neglected nonlinearity on the properties of ARCH tests. We investigate this using new ARCH testing procedures that are robust to the presence of neglected nonlinearity. Monte Carlo evidence shows that the problem is serious and that the new methods alleviate this problem to a very large extent. We apply the new tests to exchange rate data and find substantial evidence of spurious rejection of the null hypothesis of no ARCH.
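An illustration of the spurious-rejection problem (not the robust tests proposed here): generate a threshold-autoregressive series with i.i.d. homoskedastic errors, fit a linear AR(1), and apply the standard ARCH-LM test to its residuals; the neglected nonlinearity can show up as apparent ARCH. The data-generating process is a hypothetical example.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(13)
T = 1000
y = np.zeros(T)
for t in range(1, T):                              # SETAR-type mean nonlinearity, iid errors
    y[t] = (0.9 if y[t - 1] < 0 else -0.3) * y[t - 1] + rng.standard_normal()

resid = sm.OLS(y[1:], sm.add_constant(y[:-1])).fit().resid   # residuals of a linear AR(1)
lm_stat, lm_pval, _, _ = het_arch(resid, nlags=4)            # standard ARCH-LM test
print(lm_stat, lm_pval)
```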

20.