Similar Literature
20 similar documents retrieved (search time: 125 ms).
1.
What does a monetary policy shock do? We answer this question by estimating a new-Keynesian dynamic stochastic general equilibrium model of monetary policy for a number of economies, using a variety of empirical proxies of the business cycle. The effects of two different policy shocks are scrutinized: an unexpected interest rate hike conditional on a constant inflation target, and an unpredicted drift in the inflation target. Filter-specific Bayesian impulse responses are contrasted with those obtained by combining multiple business cycle indicators. Our results document the substantial uncertainty surrounding the estimated effects of these two policy shocks across countries.

2.
Estimated policy rules are reduced-form equations that are silent on many important policy questions. However, a structural understanding of monetary policy can be obtained by estimating a policymaker's objective function. The paper derives conditions under which the parameters of a policymaker's objective function can be identified and estimated. We apply these conditions to a New Keynesian sticky-price model of the US economy. The results show that both the implicit inflation target and the relative weight placed on interest rate smoothing declined when Paul Volcker was appointed Federal Reserve chairman.

3.
In this paper, we assess the possibility of producing unbiased forecasts for fiscal variables in the euro area by comparing a set of procedures that rely on different information sets and econometric techniques. In particular, we consider autoregressive moving average models, vector autoregressions, small-scale semi-structural models at the national and euro-area level, institutional forecasts (Organization for Economic Co-operation and Development), and pooling. Our small-scale models jointly model fiscal and monetary policy using simple rules, combined with equations for the evolution of all the fundamentals relevant for the Maastricht Treaty and the Stability and Growth Pact. We rank models on the basis of their forecasting performance, using the mean square and mean absolute error criteria at different horizons. Overall, simple time-series methods and pooling work well and are able to deliver unbiased forecasts, or slightly upward-biased forecasts for the debt–GDP dynamics. This result is mostly due to the short sample available, to the robustness of simple methods to structural breaks, and to the difficulty of modelling the joint behaviour of several variables in a period of substantial institutional and economic change. A bootstrap experiment highlights that, even when the data are generated using the estimated small-scale multi-country model, simple time-series models can produce more accurate forecasts because of their parsimonious specification.
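
As a hedged illustration of the kind of ranking exercise described in this abstract (not the paper's own code or data), the sketch below compares two hypothetical forecast series and their equal-weight pool by mean squared error, mean absolute error and bias; all series and numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical realised fiscal outcomes and two competing forecasts
actual = rng.normal(0.5, 1.0, size=40)
f_arma = actual + rng.normal(0.2, 0.8, size=40)   # slightly biased "ARMA" forecast
f_var  = actual + rng.normal(0.0, 1.2, size=40)   # unbiased but noisier "VAR" forecast
f_pool = 0.5 * (f_arma + f_var)                   # simple equal-weight pooling

def mse(f, a): return np.mean((f - a) ** 2)
def mae(f, a): return np.mean(np.abs(f - a))

for name, f in [("ARMA", f_arma), ("VAR", f_var), ("Pool", f_pool)]:
    print(f"{name}: MSE={mse(f, actual):.3f}  MAE={mae(f, actual):.3f}  "
          f"bias={np.mean(f - actual):+.3f}")
```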

4.
General equilibrium models that include policy rules for government spending, lump-sum transfers, and distortionary taxation of labor and capital income and of consumption expenditures are fit to US data under rich specifications of fiscal policy rules, yielding several results. First, the best-fitting model allows many fiscal instruments to respond to debt. Second, responses of aggregates to fiscal policy shocks under rich rules vary considerably from responses where only non-distortionary fiscal instruments finance debt. Third, in the short run, all fiscal instruments except labor taxes react strongly to debt, but long-run intertemporal financing comes from all components of the government's budget constraint. Fourth, debt-financed fiscal shocks trigger long-lasting dynamics; short-run and long-run multipliers can differ markedly.
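
A minimal sketch of the kind of debt-responsive fiscal rules the abstract refers to, with purely hypothetical steady states and feedback coefficients (not the paper's estimates): each instrument reacts to deviations of the debt-to-output ratio from a target, and spending also reacts to output.

```python
import numpy as np

# Hypothetical steady-state values and feedback coefficients (not estimates)
DEBT_TARGET, Y_BAR = 0.60, 1.00
GAMMA  = {"tau_l": 0.05, "tau_k": 0.10, "tau_c": 0.02, "g": -0.08, "transfers": -0.06}
STEADY = {"tau_l": 0.23, "tau_k": 0.18, "tau_c": 0.05, "g": 0.20, "transfers": 0.10}

def fiscal_rules(debt_ratio, output):
    """Each instrument responds to the debt gap; spending also responds to output."""
    debt_gap = debt_ratio - DEBT_TARGET
    setting = {k: STEADY[k] + GAMMA[k] * debt_gap for k in GAMMA}
    setting["g"] += -0.10 * (output - Y_BAR)   # countercyclical spending term
    return setting

print(fiscal_rules(debt_ratio=0.75, output=0.98))
```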

5.
Monetary targets have come to be regarded as inadequate for the conduct of short-term monetary policy, both among theoreticians and practitioners of policy. In this paper two approaches to improving the performance of monetary targets are put forward, analysed and evaluated. According to the first approach, simple rules for monetary targets are derived within an optimisation framework. These rules, related to ultimate targets, are simple so that they can be announced, and flexible so that they are subject to revision when the economy drifts away from its course due to unexpected shocks. The second approach is based on indicators and complements monetary targets with exchange rate targets through a simple feedback law for determining interest rate policy. The advantage of this feedback law is that it provides the mechanism through which policy is to be revised in response to shocks. If such a feedback law is announced, private economic agents have the means of distinguishing discretionary and arbitrary changes of policy from those which are needed to bring the economy back to the announced and committed course. This approach is used to analyse and extend the suggestions in the House of Commons Report on International Monetary Arrangements. The common ground between the two approaches is an optimisation framework with respect to the parameters of either the fixed simple rules or the simple feedback laws; this is discussed in section 1. The approach of deriving simple fixed rules is illustrated in a monetarist model in which there is a link between private sector expectations and the credible announcement of monetary targets. The model is explained in section 2 and simple fixed rules are discussed in section 3. The performance of simple rules for monetary targets is evaluated in terms of a minimax strategy with model uncertainty between the monetarist model and a Keynesian model without the assumption of announcement effects; this is discussed in section 4. Optimal feedback laws are derived and analysed in section 5. The parameter sensitivity of these feedback laws with respect to the model and the objective function, as well as their behaviour under shocks, is also examined.
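
The second approach can be illustrated with a stylised feedback law of the sort described above; the coefficients and target values below are hypothetical placeholders, not the paper's optimised values.

```python
def interest_rate_feedback(i_base, money_growth, money_target,
                           exch_rate, exch_target,
                           alpha=0.5, beta=0.3):
    """Announced feedback law: raise the policy rate when money growth
    overshoots its target or the exchange rate moves past its target."""
    return (i_base
            + alpha * (money_growth - money_target)
            + beta * (exch_rate - exch_target))

# Example: money growth 2 points above target, exchange rate 5% weaker than target
print(interest_rate_feedback(i_base=8.0, money_growth=10.0, money_target=8.0,
                             exch_rate=1.05, exch_target=1.00))
```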

6.
In this paper, we study the implications of macroprudential policies in a monetary union for macroeconomic and financial stability. For this purpose, we develop a two-country monetary-union new Keynesian general equilibrium model with housing and collateral constraints, calibrated to Lithuania and the rest of the euro area. We consider two different scenarios for macroprudential policy: one in which the ECB extends its goals to also include financial stability, and a second in which a national macroprudential authority uses the loan-to-value (LTV) ratio as an instrument. The results show that both rules are effective in making the financial system more stable in both countries, and especially in Lithuania, because the financial sector in this country is more sensitive to shocks. We find that an extended Taylor rule is indeed effective in reducing the volatility of credit, but comes at a cost in terms of higher inflation volatility. The simple LTV rule, on the other hand, does not compromise the objective of monetary policy. This reinforces the “Tinbergen principle”, which argues that there should be two different instruments when there are two different policy goals.
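
A hedged sketch of the two arrangements compared in the abstract, with invented coefficients: an extended Taylor rule that also reacts to credit growth, and a separate LTV rule operated by a national authority.

```python
def extended_taylor_rule(r_star, inflation, output_gap, credit_growth,
                         phi_pi=1.5, phi_y=0.5, phi_b=0.2):
    """Union-wide Taylor rule extended with a financial-stability (credit growth) term."""
    return r_star + phi_pi * inflation + phi_y * output_gap + phi_b * credit_growth

def ltv_rule(ltv_steady, credit_growth, chi=0.02):
    """National macroprudential rule: tighten the loan-to-value cap when
    credit grows above trend (credit_growth in percentage points)."""
    return ltv_steady - chi * credit_growth

print(extended_taylor_rule(r_star=2.0, inflation=1.0, output_gap=0.5, credit_growth=2.0))
print(ltv_rule(ltv_steady=0.85, credit_growth=2.0))
```

The separation of instruments in the second function is what the Tinbergen-principle argument in the abstract refers to: monetary policy keeps its inflation objective, while the LTV cap targets credit.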

7.
We use a bivariate generalized autoregressive conditional heteroskedasticity (GARCH) model of inflation and output growth to examine the causal relationships among nominal uncertainty, real uncertainty and macroeconomic performance, measured by the inflation and output growth rates. The application of the constant conditional correlation GARCH(1,1) model leads to a number of interesting conclusions. First, inflation does cause negative welfare effects, both directly and indirectly, i.e. via the inflation uncertainty channel. Second, in some countries, more inflation uncertainty provides an incentive to central banks to surprise the public by raising inflation unexpectedly. Third, in contrast to the assumptions of some macroeconomic models, business cycle variability and the rate of economic growth are related: more variability in the business cycle leads to more output growth.
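
The conditional-variance recursion behind a constant-conditional-correlation bivariate GARCH(1,1) can be written down directly. The sketch below filters hypothetical inflation and output-growth residuals with made-up parameter values; it is an illustration of the variance mechanics, not the paper's estimation code.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 300
eps = rng.normal(size=(T, 2))          # stand-in residuals for inflation, output growth

# Hypothetical CCC-GARCH(1,1) parameters: h_{i,t} = w_i + a_i*e_{i,t-1}^2 + b_i*h_{i,t-1}
omega = np.array([0.05, 0.10])
alpha = np.array([0.10, 0.08])
beta  = np.array([0.85, 0.88])
rho   = 0.2                            # constant conditional correlation

h = np.zeros((T, 2))
h[0] = omega / (1.0 - alpha - beta)    # start at the unconditional variance
for t in range(1, T):
    h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]

cov_12 = rho * np.sqrt(h[:, 0] * h[:, 1])   # time-varying conditional covariance
print(h[-1].round(3), round(cov_12[-1], 3))
```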

8.
This paper derives limit distributions of empirical likelihood estimators for models in which inequality moment conditions provide overidentifying information. We show that the use of this information leads to a reduction of the asymptotic mean-squared estimation error and propose asymptotically uniformly valid tests and confidence sets for the parameters of interest. While inequality moment conditions arise in many important economic models, we use a dynamic macroeconomic model as a data generating process and illustrate our methods with instrumental variable estimators of monetary policy rules. The results obtained in this paper extend to conventional GMM estimators.

9.
We develop a general framework for analyzing the usefulness of imposing parameter restrictions on a forecasting model. We propose a measure of the usefulness of the restrictions that depends on the forecaster’s loss function and that could be time varying. We show how to conduct inference about this measure. The application of our methodology to analyzing the usefulness of no-arbitrage restrictions for forecasting the term structure of interest rates reveals that: (1) the restrictions have become less useful over time; (2) when using a statistical measure of accuracy, the restrictions are a useful way to reduce parameter estimation uncertainty, but are dominated by restrictions that do the same without using any theory; (3) when using an economic measure of accuracy, the no-arbitrage restrictions are no longer dominated by atheoretical restrictions, but for this to be true it is important that the restrictions incorporate a time-varying risk premium.
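
A stylised stand-in for such a usefulness measure (not the authors' statistic or inference procedure): the average loss differential between a restricted and an unrestricted forecast, computed over a rolling window so that it can vary over time; all series below are simulated placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 200
actual = rng.normal(size=T)
f_restricted   = actual + rng.normal(0, 0.9, T)   # e.g. theory-restricted model (hypothetical)
f_unrestricted = actual + rng.normal(0, 1.0, T)   # unrestricted benchmark (hypothetical)

def usefulness(loss_r, loss_u, window=40):
    """Rolling average of (unrestricted loss - restricted loss):
    positive values mean the restrictions help under this loss function."""
    diff = loss_u - loss_r
    return np.convolve(diff, np.ones(window) / window, mode="valid")

sq_loss_r = (f_restricted - actual) ** 2
sq_loss_u = (f_unrestricted - actual) ** 2
u_t = usefulness(sq_loss_r, sq_loss_u)
print(u_t[:3].round(3), u_t[-3:].round(3))
```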

10.
External financial frictions might increase the severity of economic uncertainty shocks. We analyze the impact of aggregate uncertainty and financial condition shocks using a threshold vector autoregressive (TVAR) model with stochastic volatility, distinguishing between distinct US financial stress regimes. We further examine the international spillover of US financial shocks. Our results show that the peak contraction in euro area industrial production due to uncertainty shocks during a financial crisis is nearly four times larger than the peak contraction during normal times. US financial shocks have a pronounced asymmetric spillover effect on the euro area. Furthermore, the estimates reveal that the European Central Bank (ECB) is more cautious in responding to uncertainty shocks while adopting hawkish monetary policies against financial shocks. In contrast, the Fed adopts a more hawkish monetary policy during heightened uncertainty, whereas it acts more steadily when financial stress rises in the economy.
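
The core of a threshold VAR can be illustrated by splitting the sample with a financial stress indicator and estimating separate VAR(1) coefficient matrices by OLS in each regime. The data and threshold below are simulated placeholders, and the paper's stochastic-volatility and Bayesian machinery is omitted.

```python
import numpy as np

rng = np.random.default_rng(4)
T, k = 400, 3
Y = rng.normal(size=(T, k))             # stand-in for [uncertainty, financial stress, output]
stress = Y[:, 1]                        # threshold variable (used with a lag below)
c = np.quantile(stress, 0.8)            # high-stress regime = top quintile (assumption)

# Build (Y_{t-1}, Y_t) pairs and assign each observation to a regime by lagged stress
X = np.column_stack([np.ones(T - 1), Y[:-1]])    # constant + lagged values
Z = Y[1:]
regime_high = stress[:-1] > c

def ols(Xs, Zs):
    B, *_ = np.linalg.lstsq(Xs, Zs, rcond=None)
    return B                             # (k+1) x k coefficient matrix

B_low  = ols(X[~regime_high], Z[~regime_high])
B_high = ols(X[regime_high],  Z[regime_high])
print("low-stress VAR(1) coefficients:\n", B_low.round(2))
print("high-stress VAR(1) coefficients:\n", B_high.round(2))
```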

11.
This paper examines the trade-offs that a central bank faces when the exchange rate can experience sustained deviations from fundamentals and occasionally collapse. The economy is modelled as switching randomly between different regimes according to time-invariant transition probabilities. We compute both the optimal regime-switching control rule for this economy and optimised linear Taylor rules, in the two cases where the transition probabilities are known with certainty and where they are uncertain. The simple algorithms used in the computation are also of independent interest as tools for the study of monetary policy under general forms of (asymmetric) additive and multiplicative uncertainty. An interesting finding is that policies based on robust (minmax) values of the transition probabilities are usually more conservative.

12.
We propose a Bayesian combination approach for multivariate predictive densities which relies upon a distributional state space representation of the combination weights. Several specifications of multivariate time-varying weights are introduced, with a particular focus on weight dynamics driven by the past performance of the predictive densities and on the use of learning mechanisms. In the proposed approach the model set can be incomplete, meaning that all models can be individually misspecified. A Sequential Monte Carlo method is proposed to approximate the filtering and predictive densities. The combination approach is assessed using statistical and utility-based performance measures for evaluating density forecasts of simulated data, US macroeconomic time series and surveys of stock market prices. Simulation results indicate that, for a set of linear autoregressive models, the combination strategy is successful in selecting, with probability close to one, the true model when the model set is complete, and is able to detect parameter instability when the model set includes the true model that has generated subsamples of the data. Also, substantial uncertainty appears in the weights when predictors are similar; residual uncertainty is reduced when the model set is complete; and learning reduces this uncertainty further. For the macro series we find that incompleteness of the model set is relatively large in the 1970s, at the beginning of the 1980s and during the recent financial crisis, and lower during the Great Moderation; the predicted probabilities of recession compare accurately with the NBER business cycle dating; and the model weights carry substantial uncertainty. With respect to returns on the S&P 500 series, we find that an investment strategy using a combination of predictions from professional forecasters and from a white noise model puts more weight on the white noise model at the beginning of the 1990s and switches to giving more weight to the professional forecasts over time. Information on the complete predictive distribution, and not just on some of its moments, turns out to be very important, above all during turbulent times such as the recent financial crisis. More generally, the proposed distributional state space representation offers great flexibility in combining densities.
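
A heavily simplified stand-in for this combination idea (two Gaussian predictive densities whose weights are updated from discounted past log scores, rather than the paper's full state-space / Sequential Monte Carlo machinery); every number is illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
T = 200
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + rng.normal(0, 0.5)      # data generated by an AR(1)

w = np.array([0.5, 0.5])                            # initial combination weights
log_scores = np.zeros(2)
combined_score = 0.0
delta = 0.95                                        # discount factor (assumption)

for t in range(1, T):
    # Two predictive densities: an "AR" forecaster and a white-noise forecaster
    dens = np.array([norm.pdf(y[t], loc=0.8 * y[t - 1], scale=0.5),
                     norm.pdf(y[t], loc=0.0, scale=1.0)])
    combined_score += np.log(w @ dens)              # log score of the combined density
    log_scores = delta * log_scores + np.log(dens)  # discounted cumulative log scores
    w = np.exp(log_scores - log_scores.max())
    w /= w.sum()                                    # performance-based weights

print("combined log predictive score:", round(combined_score, 2))
print("final weights (AR model, white noise):", np.round(w, 3))
```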

13.
Sustainable economic development in the future is driven by public policy at the regional, national and global levels. A comprehensive policy analysis is therefore needed that provides consistent and effective policy support. However, a general problem facing classical policy analysis is model uncertainty. All actors, those involved in the policy choice and those in the policy analysis, are fundamentally uncertain about which of the different models corresponds to the true generative mechanism representing the natural, economic, or social phenomena on which policy analysis is focused. In this paper, we propose a general framework that explicitly incorporates model uncertainty into the derivation of a policy choice. Incorporating model uncertainty into the analysis is constrained by the very high computational effort required. In this regard, we apply metamodeling techniques as a way to reduce computational complexity. We demonstrate the effect of different metamodel types using a reduced model for the case of CAADP in Senegal. Furthermore, we explicitly show that ignoring model uncertainty leads to inefficient policy choices and results in a large waste of public resources.
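
A toy illustration of the metamodeling idea: evaluate an "expensive" policy model at a few design points, fit a cheap polynomial surrogate, and use the surrogate to screen many candidate policy levels. The objective function below is invented purely for demonstration and has nothing to do with the paper's CAADP application.

```python
import numpy as np

def expensive_policy_model(subsidy):
    """Stand-in for a costly simulation run: returns a welfare score for a
    given policy (e.g. subsidy rate). Entirely hypothetical."""
    return -(subsidy - 0.35) ** 2 + 0.05 * np.sin(20 * subsidy)

design = np.linspace(0.0, 1.0, 8)                 # few expensive evaluations
responses = np.array([expensive_policy_model(s) for s in design])

coeffs = np.polyfit(design, responses, deg=3)     # cubic polynomial metamodel
surrogate = np.poly1d(coeffs)

candidates = np.linspace(0.0, 1.0, 1001)          # cheap screening on the surrogate
best = candidates[np.argmax(surrogate(candidates))]
print(f"surrogate-optimal policy ≈ {best:.3f}, "
      f"true model value there = {expensive_policy_model(best):.4f}")
```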

14.
This paper proposes a model of the US unemployment rate which accounts for both its asymmetry and its long memory. Our approach introduces fractional integration and nonlinearities simultaneously into the same framework, using a Lagrange multiplier procedure with a standard null‐limit distribution. The empirical results suggest that the US unemployment rate can be specified in terms of a fractionally integrated process, which interacts with some nonlinear functions of labour‐demand variables such as real oil prices and real interest rates. We also find evidence of a long‐memory component. Our results are consistent with a hysteresis model with path dependency rather than a non‐accelerating inflation rate of unemployment (NAIRU) model with an underlying unemployment equilibrium rate, thereby giving support to more activist stabilization policies. However, any suitable model should also include business cycle asymmetries, with implications for both forecasting and policy‐making.
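
The long-memory component can be made concrete with the fractional-difference filter (1−L)^d, whose expansion coefficients follow a simple recursion. Below is a minimal sketch applied to a simulated series, with the memory parameter d chosen arbitrarily; it is not the paper's estimation or testing procedure.

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional difference operator (1-L)^d using the
    binomial-expansion recursion pi_k = pi_{k-1} * (k - 1 - d) / k."""
    n = len(x)
    pi = np.ones(n)
    for k in range(1, n):
        pi[k] = pi[k - 1] * (k - 1 - d) / k
    return np.array([pi[: t + 1][::-1] @ x[: t + 1] for t in range(n)])

rng = np.random.default_rng(6)
u = np.cumsum(rng.normal(size=500)) * 0.01 + 6.0   # stand-in "unemployment rate"
u_d = frac_diff(u, d=0.4)                          # d = 0.4: long memory, but mean reverting
print(u_d[:5].round(3))
```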

15.
This paper proposes SupWald tests from a threshold autoregressive model computed with an adaptive set of thresholds. Simple examples of adaptive threshold sets are given. A second contribution of the paper is a general asymptotic null limit theory when the threshold variable is a level variable. We obtain a pivotal null limiting distribution under some simple conditions for bounded or asymptotically unbounded thresholds. Our general approach is flexible enough to allow a choice of the auxiliary threshold model, or of the threshold set involved in the test, specifically designed for nonlinear stationary alternatives relevant to macroeconomic and financial topics involving arbitrage in the presence of transaction costs. A Monte Carlo study and an application to interest rate spreads for French, German, New Zealand and US post-1980 monthly data illustrate the ability of the adaptive SupWald tests to reject a unit root when the ADF test does not.
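
To fix ideas (this is not the authors' statistic, threshold set, or critical values), the sketch below computes a Wald statistic for equality of the two regimes of a simple threshold AR(1) at each threshold in a quantile grid and takes the supremum.

```python
import numpy as np

rng = np.random.default_rng(7)
T = 400
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + rng.normal()           # linear AR(1) under the null

x, z = y[:-1], y[1:]                               # lagged level is the threshold variable
grid = np.quantile(x, np.linspace(0.15, 0.85, 29)) # simple (non-adaptive) threshold set

def wald_at(c):
    fits = []
    for mask in (x <= c, x > c):
        X = np.column_stack([np.ones(mask.sum()), x[mask]])
        b, *_ = np.linalg.lstsq(X, z[mask], rcond=None)
        e = z[mask] - X @ b
        V = (e @ e / (mask.sum() - 2)) * np.linalg.inv(X.T @ X)
        fits.append((b, V))
    (b1, V1), (b2, V2) = fits
    d = b1 - b2                                     # test equality of regime coefficients
    return d @ np.linalg.solve(V1 + V2, d)

sup_wald = max(wald_at(c) for c in grid)
print(f"SupWald = {sup_wald:.2f}")                 # to be compared with simulated critical values
```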

16.
In this paper we ask how uncertainty about fiscal policy affects the impact of fiscal policy changes on the economy when the government tries to counteract a deep recession. The agents in our model are uncertain about the conduct of fiscal policy and act as econometricians by estimating fiscal policy rules that might change over time. We find that assuming that agents are not instantaneously aware of the new fiscal policy regime in place leads to substantially more volatility in the short run and to persistent differences in average outcomes. We highlight issues that can arise when a policymaker wants to announce a policy change. From a methodological perspective, we introduce a novel way to model learning in the face of discrete policy changes.
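
The "agents as econometricians" idea can be sketched as agents running a constant-gain recursive least-squares regression of the tax instrument on lagged debt, so that their estimated fiscal rule only drifts gradually after an unannounced regime change. Every number below is illustrative and the learning scheme is a generic one, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(8)
T, gain = 300, 0.05
debt = np.zeros(T)
tax  = np.zeros(T)

# True fiscal rule switches at t = 150 (an unannounced regime change)
gamma_true = np.where(np.arange(T) < 150, 0.02, 0.10)
for t in range(1, T):
    tax[t]  = 0.20 + gamma_true[t] * debt[t - 1] + rng.normal(0, 0.005)
    debt[t] = 0.95 * debt[t - 1] + 0.5 - tax[t] + rng.normal(0, 0.05)

# Agents as econometricians: constant-gain recursive least squares
theta = np.array([0.20, 0.02])                  # prior beliefs [intercept, debt response]
R = np.eye(2)
beliefs = []
for t in range(1, T):
    x = np.array([1.0, debt[t - 1]])
    R = R + gain * (np.outer(x, x) - R)
    theta = theta + gain * np.linalg.solve(R, x) * (tax[t] - x @ theta)
    beliefs.append(theta.copy())

print("believed debt response before / after the regime change:",
      round(beliefs[140][1], 3), round(beliefs[-1][1], 3))
```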

17.
We study two kinds of unconventional monetary policy: announcements about the future path of the short-term rate, and long-term nominal interest rates as operating instruments of monetary policy. We do so in a model where the risk premium on long-term debt is, in part, endogenously determined. We find that both policies are consistent with unique equilibria; that, at the zero lower bound, announcements about the future path of the short-term rate can lower long-term interest rates through their impact on both expectations and the risk premium; and that long-term interest rate rules perform as well as, and at times better than, conventional Taylor rules. With simulations, we show that long-term interest rate rules generate sensible dynamics both when in operation and when expected to be applied.

18.
A fundamental shift in monetary policy occurred around 1980: the Fed went from a “passive” policy to an “active” policy. We study a model in which government bonds provide transactions services. We present two calibrations of our model, using pre- and post-1980 data. We show that estimates of pre- and post-1980 policy rules all lie within our determinacy regions. Nevertheless, the pre-1980 policy was a very bad monetary policy, even though it avoided sunspot equilibria: model simulations suggest that household welfare would have increased by 3.3 percent of permanent consumption in this period under an active policy.
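
Determinacy regions of the kind referred to above are commonly checked by counting explosive eigenvalues. The sketch below does this for a textbook three-equation New Keynesian model with a Taylor rule, not for the paper's model with bonds providing transactions services; the parameter values are standard textbook choices.

```python
import numpy as np

def determinate(phi_pi, phi_x, beta=0.99, kappa=0.1, sigma=1.0):
    """Textbook NK model: pi_t = beta*E[pi_{t+1}] + kappa*x_t,
    x_t = E[x_{t+1}] - sigma*(i_t - E[pi_{t+1}]), i_t = phi_pi*pi_t + phi_x*x_t.
    Writing E[z_{t+1}] = A z_t for z = (pi, x), determinacy requires both
    eigenvalues of A outside the unit circle (both variables are jump variables)."""
    A = np.array([[1 / beta, -kappa / beta],
                  [sigma * (phi_pi - 1 / beta),
                   1 + sigma * phi_x + sigma * kappa / beta]])
    return bool(np.all(np.abs(np.linalg.eigvals(A)) > 1))

print(determinate(phi_pi=1.5, phi_x=0.5))   # "active" rule  -> True
print(determinate(phi_pi=0.8, phi_x=0.0))   # "passive" rule -> False
```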

19.
This article combines a Structural Vector Autoregression with a no‐arbitrage approach to build a multifactor Affine Term Structure Model (ATSM). The resulting No‐Arbitrage Structural Vector Autoregressive (NASVAR) model implies that expected excess returns are driven by structural macroeconomic shocks. This is in contrast with a standard ATSM, in which agents are concerned with non‐structural risks. As a simple application, we study the effects of supply, demand and monetary policy shocks on the UK yield curve. We show that all structural shocks affect the slope of the yield curve, with demand and supply shocks accounting for a large part of the time variation in bond yields.
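
A standard Gaussian affine bond-pricing recursion is the generic building block behind any ATSM of this kind (the sketch is not the paper's NASVAR specification, and all parameter values are placeholders): given affine state dynamics, a short-rate equation, and affine prices of risk, log bond prices are affine in the state, with loadings computed recursively.

```python
import numpy as np

k, N = 3, 40                                # 3 state factors, maturities up to 40 quarters

# Hypothetical placeholder parameters (the paper identifies structural shocks instead)
mu      = np.zeros(k)
Phi     = np.diag([0.95, 0.90, 0.80])       # state VAR(1) dynamics
Sigma   = 0.01 * np.eye(k)
delta0  = 0.01
delta1  = np.array([0.01, 0.005, 0.002])    # short rate: r_t = delta0 + delta1 @ X_t
lam0    = np.zeros(k)
Lam1    = 0.1 * np.eye(k)                   # affine prices of risk

mu_Q  = mu - Sigma @ lam0                   # risk-neutral drift
Phi_Q = Phi - Sigma @ Lam1                  # risk-neutral persistence

A = np.zeros(N + 1)                         # log price: p_t^(n) = A[n] + B[n] @ X_t
B = np.zeros((N + 1, k))
for n in range(1, N + 1):
    A[n] = A[n - 1] + B[n - 1] @ mu_Q + 0.5 * B[n - 1] @ Sigma @ Sigma.T @ B[n - 1] - delta0
    B[n] = Phi_Q.T @ B[n - 1] - delta1

X_t = np.array([0.02, -0.01, 0.005])        # current state (hypothetical)
yields = -(A[1:] + B[1:] @ X_t) / np.arange(1, N + 1)
print(yields[[0, 3, 19, 39]].round(4))      # 1-quarter, 1-, 5- and 10-year yields
```

The article's contribution lies in making the state dynamics and risk prices functions of structurally identified shocks; the pricing recursion itself is the standard one sketched here.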

20.
What types of monetary and fiscal policy rules produce self-fulfilling deflationary paths that are monotonic and empirically relevant? This paper presents simple theoretical conditions that guarantee the existence of these paths in a general equilibrium model with sticky prices. These sufficient conditions are weak enough to be satisfied by most monetary and fiscal policy rules. A quantification of the model, which combines a real shock à la Hayashi and Prescott (2002) with a simultaneous sunspot that de-anchors inflation expectations, matches the main empirical features of the Japanese deflationary process during the “lost decade”. The results also highlight the key role of the assumption about the anchoring of inflation expectations for the size of fiscal multipliers and, in general, for any policy analysis.
