Similar articles
20 similar articles found (search time: 0 ms)
1.
This paper addresses the issue of testing the ‘hybrid’ New Keynesian Phillips curve (NKPC) through vector autoregressive (VAR) systems and likelihood methods, giving special emphasis to the case where the variables are non‐stationary. The idea is to use a VAR for both the inflation rate and the explanatory variable(s) to approximate the dynamics of the system and derive testable restrictions. Attention is focused on the ‘inexact’ formulation of the NKPC. Empirical results over the period 1971–98 show that the NKPC is far from providing a ‘good first approximation’ of inflation dynamics in the Euro area.

2.
We propose an iterative decomposition that tests and accounts for multiple structural breaks in the mean, seasonality, dynamics and conditional volatility, while also accounting for outliers. Considering each component separately within each iteration leads to greater flexibility compared with joint procedures. Monte Carlo analysis shows the procedure performs well. Applied to monthly CPI inflation in G7 countries and the Euro area, we uncover mean and seasonality breaks for all countries and, allowing for these, changes in persistence are generally also indicated. Further, volatility reductions are widespread in the early to mid 1980s, with some countries exhibiting increases from 1999 onwards.

3.
Business cycle analyses have proved to be helpful to practitioners in assessing current economic conditions and anticipating upcoming fluctuations. In this article, we focus on the acceleration cycle in the euro area, namely the peaks and troughs of the growth rate which delimit the slowdown and acceleration phases of the economy. Our aim is twofold: first, we put forward a reference turning point chronology of this cycle on a monthly basis, based on gross domestic product and industrial production indices. We consider both euro area aggregate level and country‐specific cycles for the six main countries of the zone. Second, we come up with a new turning point indicator, based on business surveys carefully watched by central banks and short‐term analysts, to follow in real‐time the fluctuations of the acceleration cycle.

4.
We detect a new stylized fact that is common to the dynamics of all macroeconomic series, including financial aggregates. Their Auto-Correlation Functions (ACFs) share a common four-parameter functional form that arises from the dynamics of a general equilibrium model with heterogeneous firms. We find that, not only does our formula fit the data better than the ACFs that arise from auto-regressive and fractionally-integrated models, but it also yields the correct shape of the ACF, thus explaining the lags with which macroeconomic variables evolve and the onset of seemingly-sudden turning points. This finding puts a premium on quick and decisive macroeconomic policy interventions at the first signs of a turning point, in contrast to gradualist approaches.

5.
Nonlinear time series models have become fashionable tools to describe and forecast a variety of economic time series. A closer look at reported empirical studies, however, reveals that these models apparently fit well in‐sample, but rarely show a substantial improvement in out‐of‐sample forecasts, at least over linear models. One of the many possible reasons for this finding is the use of inappropriate model selection criteria and forecast evaluation criteria. In this paper we therefore propose a novel criterion, which we believe does more justice to the very nature of nonlinear models. Simulations show that this criterion outperforms those criteria currently in use, in the sense that the true nonlinear model is more often found to perform better in out‐of‐sample forecasting than a benchmark linear model. An empirical illustration for US GDP emphasizes its relevance.
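The conventional out-of-sample comparison that this abstract takes as its point of departure can be sketched as follows. Everything here is illustrative rather than drawn from the paper: the threshold data-generating process, the fixed threshold at zero, and the expanding-window scheme are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-regime threshold AR(1): persistence depends on sign of y[t-1].
T = 400
y = np.zeros(T)
for t in range(1, T):
    rho = 0.8 if y[t - 1] < 0 else 0.2
    y[t] = rho * y[t - 1] + rng.standard_normal()

def ols_slope(x, z):
    """Slope of a no-intercept OLS regression of z on x."""
    return float(x @ z / (x @ x))

# Expanding-window one-step-ahead forecasts over the last half of the sample.
err_lin, err_thr = [], []
for t in range(200, T):
    x, z = y[: t - 1], y[1:t]          # lagged / current pairs up to time t
    # Benchmark linear AR(1).
    f_lin = ols_slope(x, z) * y[t - 1]
    # Threshold AR(1) with a (hypothetical) known threshold at zero.
    lo, hi = x < 0, x >= 0
    rho_lo, rho_hi = ols_slope(x[lo], z[lo]), ols_slope(x[hi], z[hi])
    f_thr = (rho_lo if y[t - 1] < 0 else rho_hi) * y[t - 1]
    err_lin.append(y[t] - f_lin)
    err_thr.append(y[t] - f_thr)

rmse_lin = float(np.sqrt(np.mean(np.square(err_lin))))
rmse_thr = float(np.sqrt(np.mean(np.square(err_thr))))
print(f"linear RMSE={rmse_lin:.3f}  threshold RMSE={rmse_thr:.3f}")
```

In long samples like this one the true nonlinear model typically achieves the lower RMSE, but in short samples the ranking often flips, which is exactly the weakness of standard criteria that motivates the paper's proposal.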

6.
This paper proposes a testing strategy for the null hypothesis that a multivariate linear rational expectations (LRE) model has a unique stable solution (determinacy) against the alternative of multiple stable solutions (indeterminacy). The testing problem is addressed by a misspecification-type approach in which the overidentifying restrictions test obtained from the estimation of the system of Euler equations of the LRE model through the generalized method of moments is combined with a likelihood-based test for the cross-equation restrictions that the model places on its reduced form solution under determinacy. The resulting test has no power against a particular class of indeterminate equilibria, hence the non-rejection of the null hypothesis cannot be interpreted conclusively as evidence of determinacy. On the other hand, this test (i) circumvents the nonstandard inferential problem generated by the presence of the auxiliary parameters that appear under indeterminacy and that are not identifiable under determinacy, (ii) does not involve inequality parametric restrictions and hence the use of nonstandard inference, (iii) is consistent against the dynamic misspecification of the LRE model, and (iv) is computationally simple. Monte Carlo simulations show that the suggested testing strategy delivers reasonable size coverage and power against dynamic misspecification in finite samples. An empirical illustration focuses on the determinacy/indeterminacy of a New Keynesian monetary business cycle model of the US economy.

7.
Models for the 12‐month‐ahead US rate of inflation, measured by the chain‐weighted consumer expenditure deflator, are estimated for 1974–98 and subsequent pseudo out‐of‐sample forecasting performance is examined. Alternative forecasting approaches for different information sets are compared with benchmark univariate autoregressive models, and substantial out‐performance is demonstrated, including against Stock and Watson's unobserved components‐stochastic volatility model. Three key ingredients to the out‐performance are: including equilibrium correction component terms in relative prices; introducing nonlinearities to proxy state‐dependence in the inflation process; and replacing the information criterion, commonly used in VARs to select lag length, with a ‘parsimonious longer lags’ parameterization. Forecast pooling or averaging also improves forecast performance.

8.
We decompose the squared VIX index, derived from US S&P500 options prices, into the conditional variance of stock returns and the equity variance premium. We evaluate a plethora of state-of-the-art volatility forecasting models to produce an accurate measure of the conditional variance. We then examine the predictive power of the VIX and its two components for stock market returns, economic activity and financial instability. The variance premium predicts stock returns while the conditional stock market variance predicts economic activity and has a relatively higher predictive power for financial instability than does the variance premium.
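The accounting identity behind this decomposition can be sketched on synthetic data. A fitted AR(1) forecast of realized variance stands in here for the paper's battery of state-of-the-art volatility models, and all series, parameter values, and the size of the built-in premium are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly realized variance (persistent, strictly positive).
n = 240
rv = np.empty(n)
rv[0] = 20.0
for t in range(1, n):
    rv[t] = 5.0 + 0.75 * rv[t - 1] + rng.gamma(2.0, 1.0)

# Synthetic squared VIX at t: risk-neutral expectation of next period's
# variance = true conditional mean of rv[t+1] plus an assumed premium of 8.
true_cond_mean = 5.0 + 0.75 * rv[:-1] + 2.0   # gamma(2, 1) has mean 2
vix2 = true_cond_mean + 8.0 + rng.normal(0.0, 1.0, n - 1)

# Empirical conditional-variance proxy: fitted AR(1) forecast of rv.
x = np.column_stack([np.ones(n - 1), rv[:-1]])
beta, *_ = np.linalg.lstsq(x, rv[1:], rcond=None)
cond_var = x @ beta                            # estimate of E_t[rv_{t+1}]

# Equity variance premium: squared VIX minus the conditional variance.
premium = vix2 - cond_var
print(f"mean conditional variance={cond_var.mean():.2f}  "
      f"mean premium={premium.mean():.2f}")
```

With a well-specified variance forecast, the recovered premium averages close to the value built into the simulation; with real data the quality of the conditional-variance model is what the paper's horse race is about.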

9.
This article analyzes how monetary policy has responded to exchange rate movements in six open economies, paying particular attention to the two‐way interaction between monetary policy and the exchange rate. We address this issue using a structural VAR model that is identified using a combination of sign and short‐term (zero) restrictions. Doing so we find that, while there is an instantaneous reaction in the exchange rate following a monetary policy shock in all countries, monetary policy responds significantly on impact to an exchange rate shock in only four of the six countries.

10.
We investigate the economic significance of trading off empirical validity of models against other desirable model properties. Our investigation is based on three alternative econometric systems of the supply side, in a model that can be used to discuss optimal monetary policy in Norway. Our results caution against compromising empirical validity when selecting a model for policy analysis. We also find large costs from basing policies on the robust model, or on a suite of models, even when it contains the valid model. This confirms an important role for econometric modelling and evaluation in model choice for policy analysis.

11.
This paper studies the estimation and testing of Euler equation models in the framework of the classical two-step minimum-distance method. The time-varying reduced-form model in the first step reflects the adaptation of private agents’ beliefs to the changing economic environment. The presumed ability of Euler conditions to deliver stable parameters indexing tastes and technology is interpreted as a time-invariant second-step model. This paper shows that, complementary to and independent of one another, both the standard specification test and the stability test are required for the evaluation of an Euler equation. As an empirical application, a widely used investment Euler equation is submitted to examination. The empirical outcomes appear to suggest that the standard investment model has not been a success for aggregate investment.

12.
This paper discusses summary measures for the speed of adjustment in possibly cointegrated Vector Autoregressive Processes (VARs). In particular we propose long-run half-lives, based on interim and total multipliers. We discuss their relation with Granger-noncausality and other types of half-life, which are shown to convey different information, except in the univariate AR(1) case. We present likelihood-based inference on long-run half-lives, regarded as discrete functions of parameters in the VAR model. It is shown how asymptotic confidence regions can be defined. An empirical illustration concerning speed of adjustment to purchasing-power parity is provided.
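In the univariate AR(1) case that the abstract singles out as the one where the various half-life notions coincide, the half-life has a closed form, ln(0.5)/ln(ρ). The sketch below, with an illustrative ρ, computes it both from the formula and by direct iteration of the impulse response:

```python
import numpy as np

def ar1_half_life(rho: float) -> float:
    """Half-life of deviations in y_t = rho * y_{t-1} + e_t, for 0 < rho < 1."""
    return np.log(0.5) / np.log(rho)

def half_life_by_iteration(rho: float, max_periods: int = 10_000) -> int:
    """First integer horizon at which the impulse response falls below 1/2."""
    response = 1.0
    for h in range(1, max_periods + 1):
        response *= rho
        if response < 0.5:
            return h
    raise ValueError("impulse response did not halve within max_periods")

rho = 0.9
print(f"closed form: {ar1_half_life(rho):.2f} periods")   # ln(0.5)/ln(0.9) ~ 6.58
print(f"by iteration: {half_life_by_iteration(rho)} periods")
```

In the cointegrated VAR setting of the paper no such scalar formula applies, which is why half-lives there are built from interim and total multipliers instead.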

13.
Relationships between the Federal funds rate, unemployment, inflation and the long‐term bond rate are investigated with cointegration techniques. We find a stable long‐term relationship between the Federal funds rate, unemployment and the bond rate. This relationship is interpretable as a policy target because deviations are corrected via the Federal funds rate. Deviations of the actual Federal funds rate from the estimated target give simple indications of discretionary monetary policy, and the larger deviations relate to special episodes outside the current information set. A more traditional Taylor‐type target, where inflation appears instead of the bond rate, does not seem congruent with the data.

14.
We present a new approach to trend/cycle decomposition of time series that follow regime-switching processes. The proposed approach, which we label the “regime-dependent steady-state” (RDSS) decomposition, is motivated as the appropriate generalization of the Beveridge and Nelson decomposition [Beveridge, S., Nelson, C.R., 1981. A new approach to decomposition of economic time series into permanent and transitory components with particular attention to measurement of the business cycle. Journal of Monetary Economics 7, 151–174] to the setting where the reduced-form dynamics of a given series can be captured by a regime-switching forecasting model. For processes in which the underlying trend component follows a random walk with possibly regime-switching drift, the RDSS decomposition is optimal in a minimum mean-squared-error sense and is more broadly applicable than directly employing an Unobserved Components model.

15.
We study estimation of the date of change in persistence, from I(0) to I(1) or vice versa. Contrary to statements in the original papers, our analytical results establish that the ratio-based break point estimators of Kim [Kim, J.Y., 2000. Detection of change in persistence of a linear time series. Journal of Econometrics 95, 97–116], Kim et al. [Kim, J.Y., Belaire-Franch, J., Badillo Amador, R., 2002. Corrigendum to “Detection of change in persistence of a linear time series”. Journal of Econometrics 109, 389–392] and Busetti and Taylor [Busetti, F., Taylor, A.M.R., 2004. Tests of stationarity against a change in persistence. Journal of Econometrics 123, 33–66] are inconsistent when a mean (or other deterministic component) is estimated for the process. In such cases, the estimators converge to random variables with upper bound given by the true break date when persistence changes from I(0) to I(1). A Monte Carlo study confirms the large sample downward bias and also finds substantial biases in moderate sized samples, partly due to properties at the end points of the search interval.
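A stripped-down version of a ratio-based break-date estimator of this kind can be sketched as follows. The subsample demeaning, trimming fraction, and exact form of the statistic follow the generic shape of such estimators and are illustrative, not a faithful reproduction of any of the cited papers:

```python
import numpy as np

def ratio_break_estimator(y: np.ndarray, trim: float = 0.2) -> int:
    """Estimate the date of a change from I(0) to I(1) by maximizing the
    ratio of scaled squared partial sums of demeaned observations after
    versus before each candidate break point."""
    T = len(y)
    best_k, best_stat = -1, -np.inf
    for k in range(int(trim * T), int((1 - trim) * T)):
        s1 = np.cumsum(y[:k] - y[:k].mean())      # pre-break partial sums
        s2 = np.cumsum(y[k:] - y[k:].mean())      # post-break partial sums
        stat = (np.sum(s2**2) / (T - k) ** 2) / (np.sum(s1**2) / k**2)
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k

rng = np.random.default_rng(2)
T, k0 = 500, 250
e = rng.standard_normal(T)
y = np.concatenate([e[:k0], np.cumsum(e[k0:])])   # I(0) segment, then I(1)
k_hat = ratio_break_estimator(y)
print("estimated break date:", k_hat, "(true:", k0, ")")
```

Note that, per the abstract, demeaning each subsample, as done here, is precisely the step that makes such estimators inconsistent: the estimate converges to a random variable bounded above by the true break date, so realized estimates tend to fall at or below k0.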

16.
Recently, single‐equation estimation by the generalized method of moments (GMM) has become popular in the monetary economics literature, for estimating forward‐looking models with rational expectations. We discuss a method for analysing the empirical identification of such models that exploits their dynamic structure and the assumption of rational expectations. This allows us to judge the reliability of the resulting GMM estimation and inference and reveals the potential sources of weak identification. With reference to the New Keynesian Phillips curve of Galí and Gertler [Journal of Monetary Economics (1999) Vol. 44, 195] and the forward‐looking Taylor rules of Clarida, Galí and Gertler [Quarterly Journal of Economics (2000) Vol. 115, 147], we demonstrate that the usual ‘weak instruments’ problem can arise naturally, when the predictable variation in inflation is small relative to unpredictable future shocks (news). Hence, we conclude that those models are less reliably estimated over periods when inflation has been under effective policy control.
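A standard diagnostic in this setting, not specific to the paper's own method, is the first-stage F-statistic on the instruments. The sketch below computes it for simulated data in which the instruments are deliberately weak; the data-generating process and coefficient values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated data: endogenous regressor x, instruments z with weak relevance.
n, k = 500, 3
z = rng.standard_normal((n, k))
x = z @ np.array([0.05, 0.03, 0.02]) + rng.standard_normal(n)

def first_stage_F(x: np.ndarray, z: np.ndarray) -> float:
    """F-statistic for joint significance of the instruments z in the
    first-stage regression of x on a constant and z."""
    n, k = z.shape
    Z = np.column_stack([np.ones(n), z])
    beta, *_ = np.linalg.lstsq(Z, x, rcond=None)
    rss = np.sum((x - Z @ beta) ** 2)        # unrestricted residual SS
    rss0 = np.sum((x - x.mean()) ** 2)       # restricted: intercept only
    return float(((rss0 - rss) / k) / (rss / (n - k - 1)))

F = first_stage_F(x, z)
print(f"first-stage F = {F:.2f}")
```

By the usual rule of thumb, values well below 10 flag weak instruments; the abstract's point is that inflation under effective policy control behaves like the weakly predicted x above, so forward-looking GMM estimates become unreliable precisely then.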

17.
The Beveridge–Nelson (BN) decomposition is a model-based method for decomposing time series into permanent and transitory components. When constructed from an ARIMA model, it is closely related to decompositions based on unobserved components (UC) models with random walk trends and covariance stationary cycles. The decomposition when extended to I(2) models can also be related to non-model-based signal extraction filters such as the HP filter. We show that the BN decomposition provides information on the correlation between the permanent and transitory shocks in a certain class of UC models. The correlation between components is known to determine the smoothed estimates of components from UC models. The BN decomposition can also be used to evaluate the efficacy of alternative methods. We also demonstrate, contrary to popular belief, that the BN decomposition can produce smooth cycles if the reduced form forecasting model is appropriately specified.
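For the simplest forecasting model the BN literature builds on, an ARIMA(1,1,0), the BN trend has the closed form trend_t = y_t + (φ/(1−φ))(Δy_t − μ): the current level plus all expected future changes beyond drift. The sketch below applies it to simulated data; the parameter values are illustrative:

```python
import numpy as np

def bn_decompose(y: np.ndarray, phi: float, mu: float):
    """Beveridge-Nelson decomposition when dy_t - mu = phi*(dy_{t-1} - mu) + e_t.
    The trend is the long-run forecast of the level (net of drift), which for
    this model reduces to y_t + (phi / (1 - phi)) * (dy_t - mu)."""
    dy = np.diff(y)
    trend = y[1:] + (phi / (1.0 - phi)) * (dy - mu)
    cycle = y[1:] - trend
    return trend, cycle

# Simulate: growth rate follows an AR(1) around drift mu.
rng = np.random.default_rng(4)
T, phi, mu = 300, 0.4, 0.5
dy = np.empty(T)
dy[0] = mu
for t in range(1, T):
    dy[t] = mu + phi * (dy[t - 1] - mu) + rng.standard_normal()
y = np.cumsum(dy)

trend, cycle = bn_decompose(y, phi, mu)
print(f"var(trend changes)={np.var(np.diff(trend)):.2f}  "
      f"var(cycle)={np.var(cycle):.2f}")
```

In this reduced form the cycle equals −(φ/(1−φ))(Δy_t − μ), so trend and cycle innovations are perfectly correlated, which is one concrete instance of the abstract's point that the BN decomposition carries information about the correlation between permanent and transitory shocks in the corresponding UC model.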

18.
This paper conducts a broad-based comparison of iterated and direct multi-period forecasting approaches applied to both univariate and multivariate models in the form of parsimonious factor-augmented vector autoregressions. To account for serial correlation in the residuals of the multi-period direct forecasting models we propose a new SURE-based estimation method and modified Akaike information criteria for model selection. Empirical analysis of the 170 variables studied by Marcellino, Stock and Watson (2006) shows that information in factors helps improve forecasting performance for most types of economic variables although it can also lead to larger biases. It also shows that SURE estimation and finite-sample modifications to the Akaike information criterion can improve the performance of the direct multi-period forecasts.
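The two competing approaches can be illustrated in the simplest univariate case, an AR(1) at horizon h. The data-generating process and the choice h = 4 are illustrative assumptions for the sketch, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated AR(1): y_t = 0.7 * y_{t-1} + e_t.
T, h = 500, 4
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.7 * y[t - 1] + rng.standard_normal()

def slope(x, z):
    """Slope of a no-intercept OLS regression of z on x."""
    return float(x @ z / (x @ x))

# Iterated: fit the one-step model, then iterate it h times (power up phi).
phi1 = slope(y[:-1], y[1:])
iterated = phi1 ** h * y[-1]

# Direct: regress y_{t+h} on y_t and forecast in a single step.
phih = slope(y[:-h], y[h:])
direct = phih * y[-1]

print(f"iterated h-step forecast: {iterated:.3f}")
print(f"direct   h-step forecast: {direct:.3f}")
```

Under correct specification the iterated forecast is the more efficient of the two, while the direct forecast is robust to dynamic misspecification; but the direct regression's residuals are serially correlated by construction (an MA(h−1) process), which is the feature that motivates the paper's SURE estimation and modified information criteria.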

19.
Beveridge and Nelson [Beveridge, Stephen, Nelson, Charles R., 1981. A new approach to decomposition of economic time series into permanent and transitory components with particular attention to measurement of the ‘business cycle’. Journal of Monetary Economics 7, 151–174] proposed that the long-run forecast is a measure of trend for time series such as GDP that do not follow a deterministic path in the long run. They showed that if the series is stationary in first differences, then the estimated trend is a random walk with drift that accounts for growth, and the cycle is stationary. In contrast to linear de-trending, the smoother of Hodrick and Prescott (1981) [published as Hodrick, Robert, Prescott, Edward C., 1997. Post-war US business cycles: An empirical investigation. Journal of Money, Credit and Banking 29 (1), 1–16] and the unobserved components models of Harvey [Harvey, A.C., 1985. Trends and cycles in macroeconomic time series. Journal of Business and Economic Statistics 3, 216–227], Watson [Watson, Mark W., 1986. Univariate detrending methods with stochastic trends. Journal of Monetary Economics 18, 49–75] and Clark [Clark, Peter K., 1987. The cyclical component of US economic activity. The Quarterly Journal of Economics 102 (4), 797–814], the BN decomposition attributes most variation in GDP to trend shocks, while its cycles are short-lived. Since each is an estimate of the transitory part of GDP that will die out, it seems natural to compare cycle measures by their ability to forecast future growth. The results presented here suggest that cycle measures contain little if any information beyond the short-term momentum captured by BN.

20.
Inference for multiple-equation Markov-chain models raises a number of difficulties that are unlikely to appear in smaller models. Our framework allows for many regimes in the transition matrix, without letting the number of free parameters grow with the square of the number of regimes, but also without losing a convenient form for the posterior distribution. Calculation of marginal data densities is difficult in these high-dimensional models. This paper gives methods to overcome these difficulties, and explains why existing methods are unreliable. It makes suggestions for maximizing posterior density and initiating MCMC simulations that provide robustness against the complex likelihood shape.
