Similar Documents
20 similar documents found.
1.
This paper addresses the issue of testing the ‘hybrid’ New Keynesian Phillips curve (NKPC) through vector autoregressive (VAR) systems and likelihood methods, giving special emphasis to the case where the variables are non‐stationary. The idea is to use a VAR for both the inflation rate and the explanatory variable(s) to approximate the dynamics of the system and derive testable restrictions. Attention is focused on the ‘inexact’ formulation of the NKPC. Empirical results over the period 1971–98 show that the NKPC is far from providing a ‘good first approximation’ of inflation dynamics in the Euro area.
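For reference, the ‘hybrid’ NKPC discussed in this literature is usually written with both forward‐ and backward‐looking inflation terms; a standard textbook formulation (not necessarily the exact specification tested in this paper) is

\[ \pi_t = \gamma_f \, E_t[\pi_{t+1}] + \gamma_b \, \pi_{t-1} + \lambda x_t + \varepsilon_t , \]

where \(\pi_t\) is inflation, \(x_t\) the driving variable (for example real marginal cost or the output gap), and \(\varepsilon_t\) a disturbance. The ‘inexact’ formulation mentioned above refers to retaining the error term \(\varepsilon_t\) in the structural equation rather than treating the relation as exact.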

2.
Using euro‐area data, we re‐examine the empirical success of New‐Keynesian Phillips curves (NKPCs). We re‐estimate with a suitably specified optimizing supply side (which attempts to treat non‐stationarity in factor income shares and mark‐ups) that allows us to derive estimates of technology parameters, marginal costs and ‘price gaps’. Our resulting estimates of the euro‐area NKPCs are robust, provide reasonable estimates for fixed‐price durations and discount rates and embody plausible dynamic properties. Our method for identifying the underlying determinants of NKPCs has general applicability to a wide set of countries as well as being of use for sectoral studies.

3.
We study optimal monetary policy in a New Keynesian model in which the interest rate is at the zero lower bound and households use cash alongside house equity borrowing to conduct transactions. The amount of borrowing is limited by a collateral constraint. When either the loan‐to‐value ratio declines or house prices fall, we observe a decrease in the money multiplier. We argue that the central bank should respond to the fall in the money multiplier and therefore to the reduction in house prices or the loan‐to‐collateral‐value ratio. We also find that optimal monetary policy generates a large and persistent fall in the money multiplier in response to the drop in the loan‐to‐collateral‐value ratio.

4.
We give an appraisal of the New Keynesian Phillips curve model (NPCM) as an empirical model of European inflation. The favourable evidence for NPCMs on euro‐area data reported in earlier studies is shown to depend on specific choices made about estimation methodology. The NPCM can be re‐interpreted as a highly restricted equilibrium correction model. We also report the outcome of tests based on variable addition and encompassing of existing models. The results show that economists should not accept the NPCM too readily.

5.
This paper investigates business cycle relations among different economies in the Euro area. Cyclical dynamics are explicitly modelled as part of a time series model. We introduce mechanisms that allow for increasing or diminishing phase shifts and for time‐varying association patterns in different cycles. Standard Kalman filter techniques are used to estimate the parameters simultaneously by maximum likelihood. The empirical illustrations are based on gross domestic product (GDP) series of seven European countries that are compared with the GDP series of the Euro area and that of the US. The original integrated time series are band‐pass filtered. We find that there is an increasing resemblance between the business cycle fluctuations of the European countries analysed and those of the Euro area, although with varying patterns.
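As a rough illustration of the machinery described above (band‐pass filtering plus Kalman‐filter maximum‐likelihood estimation of a stochastic cycle), a minimal sketch using statsmodels is shown below; the synthetic series and filter settings are placeholders, not the authors' actual data or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical quarterly log-GDP series (placeholder data, not actual Euro-area GDP).
rng = np.random.default_rng(0)
log_gdp = pd.Series(np.cumsum(0.005 + 0.01 * rng.standard_normal(160)))

# Baxter-King band-pass filter: keep cycles of 6-32 quarters (a typical business-cycle band).
cycle_bp = sm.tsa.filters.bkfilter(log_gdp, low=6, high=32, K=12)

# Unobserved-components model with a stochastic, damped cycle,
# estimated by maximum likelihood via the Kalman filter.
uc = sm.tsa.UnobservedComponents(
    log_gdp, level="local linear trend",
    cycle=True, stochastic_cycle=True, damped_cycle=True,
)
res = uc.fit(disp=False)
print(res.summary())
```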

6.
In this paper, we assess the possibility of producing unbiased forecasts for fiscal variables in the Euro area by comparing a set of procedures that rely on different information sets and econometric techniques. In particular, we consider autoregressive moving average models, vector autoregressions, small‐scale semistructural models at the national and Euro area level, institutional forecasts (Organization for Economic Co‐operation and Development), and pooling. Our small‐scale models are characterized by the joint modelling of fiscal and monetary policy using simple rules, combined with equations for the evolution of all the relevant fundamentals for the Maastricht Treaty and the Stability and Growth Pact. We rank models on the basis of their forecasting performance using the mean square and mean absolute error criteria at different horizons. Overall, simple time‐series methods and pooling work well and are able to deliver unbiased forecasts, or slightly upward‐biased forecasts for the debt–GDP dynamics. This result is mostly due to the short sample available, to the robustness of simple methods to structural breaks, and to the difficulty of modelling the joint behaviour of several variables in a period of substantial institutional and economic changes. A bootstrap experiment highlights that, even when the data are generated using the estimated small‐scale multi‐country model, simple time‐series models can produce more accurate forecasts, because of their parsimonious specification.
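A stripped‐down version of the forecast‐evaluation exercise described above (recursive pseudo‐out‐of‐sample forecasts ranked by mean squared and mean absolute error) might look like the sketch below; the AR(1) benchmark and synthetic data are illustrative assumptions, not the models or series used in the paper.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
y = np.cumsum(rng.standard_normal(120))        # placeholder fiscal series

h, start = 4, 60                               # forecast horizon and first estimation window
errors = []
for t in range(start, len(y) - h):
    fit = ARIMA(y[:t], order=(1, 0, 0)).fit()  # simple AR(1) benchmark, re-estimated each period
    fcst = fit.forecast(steps=h)
    errors.append(y[t + h - 1] - fcst[h - 1])  # h-step-ahead forecast error

errors = np.asarray(errors)
print("MSE:", np.mean(errors**2), "MAE:", np.mean(np.abs(errors)))
```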

7.
We propose a novel identification‐robust test for the null hypothesis that an estimated New Keynesian model has a reduced form consistent with the unique stable solution against the alternative of sunspot‐driven multiple equilibria. Our strategy is designed to handle identification failures as well as the misspecification of the relevant propagation mechanisms. We invert a likelihood ratio test for the cross‐equation restrictions (CER) that the New Keynesian system places on its reduced‐form solution under determinacy. If the CER are not rejected, sunspot‐driven expectations can be ruled out from the model equilibrium and we accept the structural model. Otherwise, we move to a second step and invert an Anderson and Rubin‐type test for the orthogonality restrictions (OR) implied by the system of structural Euler equations. The hypothesis of indeterminacy and the structural model are accepted if the OR are not rejected. We investigate the finite‐sample performance of the suggested identification‐robust two‐step testing strategy by some Monte Carlo experiments and then apply it to a New Keynesian AD/AS model estimated with actual US data. In spite of some evidence of weak identification for the ‘Great Moderation’ period, our results offer formal support to the hypothesis of a switch from indeterminacy to a scenario consistent with uniqueness occurring in the late 1970s. Our identification‐robust full‐information confidence set for the structural parameters computed on the ‘Great Moderation’ regime turns out to be more precise than the intervals previously reported in the literature through ‘limited‐information’ methods.
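The core mechanics of ‘inverting’ a test, as used in the first step of the strategy above, can be shown in a much simpler setting: collect every parameter value whose likelihood‐ratio statistic falls below the chi‐squared critical value. The Gaussian‐mean example below is only a toy illustration of that principle, not the authors' procedure.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)
x = rng.normal(loc=0.5, scale=1.0, size=100)    # toy data

def lr_stat(mu0):
    # LR statistic for H0: mean = mu0 in a normal model with known unit variance.
    return len(x) * (x.mean() - mu0) ** 2

crit = chi2.ppf(0.95, df=1)
grid = np.linspace(-1, 2, 601)
conf_set = [mu for mu in grid if lr_stat(mu) <= crit]   # inverted-test confidence set
print("95% confidence set: [%.3f, %.3f]" % (min(conf_set), max(conf_set)))
```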

8.
This paper explores ways to integrate model uncertainty into policy evaluation. We describe a general framework that includes both model averaging methods and measures that describe whether policies and their consequences are model dependent. These general ideas are then applied to assess simple monetary policy rules for some standard New Keynesian specifications. We conclude that the original Taylor rule has good robustness properties, but may reasonably be challenged on overall stabilization performance by alternative simple rules, even when these rules employ parameters that are set without accounting for model uncertainty.
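For context, the ‘original Taylor rule’ referred to above is Taylor's (1993) specification, which sets the nominal policy rate as

\[ i_t = r^* + \pi_t + 0.5\,(\pi_t - \pi^*) + 0.5\, y_t , \]

with an equilibrium real rate \(r^* = 2\%\), an inflation target \(\pi^* = 2\%\), and \(y_t\) the output gap; the ‘simple rules’ compared in this literature are typically variants of this equation with different coefficients on inflation and the gap.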

9.
Estimated policy rules are reduced‐form equations that are silent on many important policy questions. However, a structural understanding of monetary policy can be obtained by estimating a policymaker’s objective function. The paper derives conditions under which the parameters in a policymaker’s objective function can be identified and estimated. We apply these conditions to a New Keynesian sticky‐price model of the US economy. The results show that the implicit inflation target and the relative weight placed on interest rate smoothing both declined when Paul Volcker was appointed Federal Reserve chairman.
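The policymaker’s objective function in exercises of this kind is typically a discounted quadratic loss; a common form (the exact arguments and weights being what such papers estimate) is

\[ \mathcal{L} = E_0 \sum_{t=0}^{\infty} \beta^t \Big[ (\pi_t - \pi^*)^2 + \lambda\, y_t^2 + \nu\, (i_t - i_{t-1})^2 \Big], \]

where \(\pi^*\) is the implicit inflation target and \(\nu\) the relative weight on interest‐rate smoothing, the two quantities reported above to have declined under Volcker.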

10.
This paper focuses on the dynamic misspecification that characterizes the class of small‐scale New Keynesian models currently used in monetary and business cycle analysis, and provides a remedy for the typical difficulties these models have in accounting for the rich contemporaneous and dynamic correlation structure of the data. We suggest using a statistical model for the data as a device through which it is possible to adapt the econometric specification of the New Keynesian model such that the risk of omitting important propagation mechanisms is kept under control. A pseudo‐structural form is built from the baseline system of Euler equations by forcing the state vector of the system to have the same dimension as the state vector characterizing the statistical model. The pseudo‐structural form gives rise to a set of cross‐equation restrictions that do not penalize the autocorrelation structure and persistence of the data. Standard estimation and evaluation methods can be used. We provide an empirical illustration based on US quarterly data and a small‐scale monetary New Keynesian model.

11.
Steady‐state restrictions are commonly imposed on highly persistent variables to achieve stationarity prior to confronting rational expectations models with data. However, the resulting steady‐state deviations are often surprisingly persistent, indicating that some aspects of the underlying theory may be empirically problematic. This paper discusses how to formulate steady‐state restrictions in rational expectations models with latent forcing variables and test their validity using cointegration techniques. The approach is illustrated by testing steady‐state restrictions for alternative specifications of the New Keynesian model and shown to be able to discriminate between different assumptions on the sources of the permanent shocks.
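As an illustration of the cointegration machinery invoked above, a Johansen trace test can be run on a small system as in the sketch below; the two‐variable example and lag choice are placeholders, not the paper's specification.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(3)
common = np.cumsum(rng.standard_normal(200))          # shared stochastic trend
y1 = common + rng.standard_normal(200)                # two I(1) series that cointegrate
y2 = 0.8 * common + rng.standard_normal(200)
data = np.column_stack([y1, y2])

res = coint_johansen(data, det_order=0, k_ar_diff=1)  # constant term, one lagged difference
print("trace statistics:", res.lr1)                   # test stats for rank <= 0, rank <= 1
print("5% critical values:", res.cvt[:, 1])           # critical-value columns: 90%, 95%, 99%
```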

12.
We suggest using a factor‐model‐based backdating procedure to construct historical Euro‐area macroeconomic time series data for the pre‐Euro period. We argue that this is a useful alternative to standard contemporaneous aggregation methods. The article investigates, for a number of Euro‐area variables, whether forecasts based on the factor‐backdated data are more precise than those obtained with standard area‐wide data. A recursive pseudo‐out‐of‐sample forecasting experiment using quarterly data is conducted. Our results suggest that some key variables (e.g. real GDP, inflation and the long‐term interest rate) can indeed be forecast more precisely with the factor‐backdated data.
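The factor extraction at the heart of such a backdating procedure can be sketched with plain principal components, as below; the synthetic panel and the choice of a single factor are assumptions for illustration only, not the authors' dataset.

```python
import numpy as np

rng = np.random.default_rng(4)
T, N = 120, 30
factor = rng.standard_normal(T)                       # single common factor
loadings = rng.standard_normal(N)
panel = np.outer(factor, loadings) + rng.standard_normal((T, N))

# Standardize the panel, then take the first principal component as the factor estimate.
z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
factor_hat = u[:, 0] * s[0]                           # estimated common factor (up to sign and scale)

print("correlation with true factor:", np.corrcoef(factor_hat, factor)[0, 1])
```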

13.
In this paper, we evaluate the role of a set of variables as leading indicators for Euro‐area inflation and GDP growth. Our leading indicators are taken from the variables in the European Central Bank's (ECB) Euro‐area‐wide model database, plus a set of similar variables for the US. We compare the forecasting performance of each indicator ex post with that of purely autoregressive models. We also analyse three different approaches to combining the information from several indicators. First, ex post, we consider using as indicators the estimated factors from a dynamic factor model fitted to all the indicators. Secondly, within an ex ante framework, an automated model selection procedure is applied to models with a large set of indicators. No future information is used, future values of the regressors are forecast, and the choice of the indicators is based on their past forecasting records. Finally, we consider the forecasting performance of groups of indicators and factors and methods of pooling the ex ante single‐indicator or factor‐based forecasts. Some sensitivity analyses are also undertaken for different forecasting horizons and weighting schemes of forecasts to assess the robustness of the results.

14.
We suggest a strategy to evaluate members of a class of New‐Keynesian models of a small open economy. As an example, we estimate a modified version of the model in Svensson [Journal of International Economics (2000) Vol. 50, pp. 155–183] and compare its impulse response and variance decomposition functions with those of a structural vector autoregression (VAR) model. The focus is on responses to foreign rather than to domestic shocks, which facilitates identification. Some results are that US shocks account for large shares of the variance of Canadian variables, that little of this influence is due to real exchange rate movements, and that Canadian monetary policy is not adequately described by a Taylor rule.
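A bare‐bones version of the impulse‐response and variance‐decomposition comparison described above, using a reduced‐form VAR in statsmodels, could look like this sketch; the variable names and lag length are placeholders, and the paper's identification of foreign shocks is not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
# Placeholder data standing in for US and Canadian variables.
df = pd.DataFrame(rng.standard_normal((200, 3)),
                  columns=["us_rate", "ca_output", "ca_inflation"])

res = VAR(df).fit(maxlags=4, ic="aic")
irf = res.irf(12)                  # impulse-response analysis over 12 periods
print(irf.orth_irfs.shape)         # orthogonalized responses, Cholesky ordering as listed
fevd = res.fevd(12)                # forecast-error variance decomposition
fevd.summary()                     # prints variance shares attributed to each shock
```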

15.
We survey the literature comparing inflation targeting (IT) and price‐level targeting (PT) as macroeconomic stabilisation policies. Our focus is on New Keynesian models and areas that have seen significant developments since Ambler's (2009, Price‐level targeting and stabilisation policy: a survey. Journal of Economic Surveys 23(5): 974–997) survey: optimal monetary policy; the zero lower bound; financial frictions and transition costs of adopting a PT regime. Ambler's conclusion that PT improves social welfare in New Keynesian models is fairly robust, but we note an interesting split in the literature: PT consistently outperforms IT in models where policymakers commit to simple Taylor‐type rules, but results favouring PT when policymakers minimise loss functions are overturned by small deviations from the baseline model. Since the beneficial effects of PT appear to hang on the joint assumption that agents are rational and the economy is New Keynesian, we discuss survey and experimental evidence on rational expectations and the applied macro literature on the empirical performance of New Keynesian models. Overall, the evidence is not clear‐cut, but we note that New Keynesian models can pass formal statistical tests against macro data and that models with rational expectations outperform those with behavioural expectations (i.e. heuristics) in direct statistical tests. We therefore argue that policymakers should continue to pay attention to PT.

16.
This paper extends a New Keynesian model to include roles for currency and deposits as competing sources of liquidity services demanded by households. It shows that, both qualitatively and quantitatively, the Barnett critique applies: while a Divisia aggregate of monetary services tracks the true monetary aggregate almost perfectly, a simple-sum measure often behaves quite differently. The model also shows that movements in both quantity and price indexes for monetary services correlate strongly with movements in output following a variety of shocks. Finally, the analysis characterizes the optimal monetary policy response to disturbances that originate in the financial sector.
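For context, the Divisia (Törnqvist–Theil) monetary aggregate contrasted with the simple sum above grows according to the share‐weighted growth of its components:

\[ \Delta \ln D_t = \sum_i \bar{s}_{i,t}\, \Delta \ln m_{i,t}, \qquad \bar{s}_{i,t} = \tfrac{1}{2}(s_{i,t} + s_{i,t-1}), \qquad s_{i,t} = \frac{\pi_{i,t}\, m_{i,t}}{\sum_j \pi_{j,t}\, m_{j,t}}, \]

where \(m_{i,t}\) is the quantity of monetary asset \(i\) and \(\pi_{i,t} = (R_t - r_{i,t})/(1 + R_t)\) its user cost, with \(R_t\) a benchmark rate and \(r_{i,t}\) the asset's own rate of return. A simple‐sum aggregate instead weights all components equally, which is why the two measures can behave quite differently.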

17.
Jim Lee, Economic Systems, 2009, 33(4): 325–343
This paper empirically investigates the optimal monetary policy conduct for the euro area in the presence of heterogeneous economic conditions across member states. Based on the New Keynesian monetary framework, we compare welfare losses under the assumption that the central bank conducts monetary policy using area-wide aggregate data against the alternative assumption that the central bank exploits country-specific as opposed to area-wide data. Empirical results reveal a sizable gain in stabilization performance if the European Central Bank formulates monetary policy by explicitly taking into account cross-country heterogeneity within the euro area. The estimated gain is more pronounced in a hybrid variant than in the purely forward-looking version of the New Keynesian model.

18.
This paper builds a quarterly Divisia monetary aggregate for the euro area using area‐wide data over the sample period from 1980 to 2000, finding two main results. First, it is found that the demand for this monetary aggregate has been well behaved and relatively stable over the last two decades. Secondly, the Divisia‐weighted monetary aggregate is found to have interesting information content from a forward‐looking perspective. This lends support to the view that money and – in a broader sense – liquidity services should be assigned an important role in shaping monetary policy in the euro area, although the policy maker is not interested in monetary aggregates per se.

19.
Beyond Twin Deficits
Abstract: This paper outlines new developments in the sociology of money. It highlights certain aspects of Post Keynesian monetarism and explores Keynesian concepts of emotions relative to economics and economic sociology. Gunnar Myrdal's work on time and money contributes to the discussion. Underdeveloped areas of discourse in both sociology and economics are identified, and the resulting superficiality of references to money is examined. Sociology, for example, has historically neglected concepts of future time and money, while economics has paid little attention to emotions and organizations. Removing these orthodox barriers allows economics to be informed by concepts previously relegated to sociology, such as emotions of trust and confidence. This process may induce the disaffected from both disciplines to draw from each other, creating an alternative, and ultimately more satisfactory, understanding of money.

20.
This paper investigates the empirical success of the New Keynesian Phillips Curve (NKPC) in explaining US inflation when observed measures of inflation expectations are used in conjunction with the output gap. The paper contributes to the literature by addressing the important problem of serial correlation in the stylized NKPC and developing an extended model to account for this serial correlation. Contrary to recent results indicating no role for the output gap, we find it to be a statistically significant driving variable for inflation, with this finding robust to whether the inflation expectations series used relates to individual consumers, professional forecasters or the US Fed.
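A minimal sketch of the kind of regression underlying the stylized NKPC above, using observed inflation expectations and the output gap with HAC (Newey–West) standard errors to guard against residual serial correlation, is shown below; the data and lag choices are placeholders rather than the paper's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
T = 160
gap = rng.standard_normal(T)                            # placeholder output gap
exp_infl = 2.0 + 0.5 * rng.standard_normal(T)           # placeholder observed inflation expectations
infl = 0.6 * exp_infl + 0.1 * gap + rng.standard_normal(T)

# NKPC-style regression of inflation on expectations and the gap,
# with Newey-West (HAC) standard errors robust to serial correlation.
X = sm.add_constant(pd.DataFrame({"exp_infl": exp_infl, "gap": gap}))
res = sm.OLS(infl, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print(res.summary())
```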

