Similar Literature
20 similar records found
1.
Macroeconomic policy makers are typically concerned with several indicators of economic performance. We therefore propose to tackle the design of macroeconomic policy using Multicriteria Decision Making (MCDM) techniques. More specifically, we employ Multi-objective Programming (MP) to seek so-called efficient policies. The MP approach is combined with a computable general equilibrium (CGE) model, chosen because it has the dual advantage of being consistent with standard economic theory while allowing the effects of a specific policy to be measured with real data. Applying the proposed methodology to Spain (via the 1995 Social Accounting Matrix), we first quantified the trade-offs between two specific policy objectives, growth and inflation, when designing fiscal policy. We then constructed a frontier of efficient policies involving real growth and inflation. In doing so, we found that policy in 1995 Spain displayed some degree of inefficiency with respect to these two objectives. We then offer two sets of policy recommendations that, ostensibly, could have helped Spain at the time. The first deals with efficiency independently of the importance policy makers attach to growth and inflation (we label this set general policy recommendations). The second depends on which policy objective policy makers regard as more important, increasing growth or controlling inflation (we label this set objective-specific recommendations).
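
As a toy illustration of how a weighted-sum scalarization can trace out an efficient growth-inflation frontier, the sketch below scans policy weights over a hypothetical two-outcome policy model. The quadratic `outcomes` function and the tax-rate instrument are invented stand-ins, not the paper's CGE model or its Spanish data.

```python
import numpy as np

# Minimal sketch of tracing a growth-inflation efficiency frontier by
# weighted-sum scalarization.  The "policy model" below is a hypothetical
# stand-in for the CGE model used in the paper.
def outcomes(tax_rate):
    """Map a fiscal instrument to (real growth, inflation) -- illustrative only."""
    growth = 3.0 - 4.0 * (tax_rate - 0.30) ** 2      # hump-shaped growth
    inflation = 1.5 + 5.0 * tax_rate                  # rising in the instrument
    return growth, inflation

grid = np.linspace(0.0, 0.6, 601)                     # feasible instrument values
frontier = []
for w in np.linspace(0.01, 0.99, 25):                 # weight on growth vs. inflation
    scores = [w * outcomes(t)[0] - (1 - w) * outcomes(t)[1] for t in grid]
    t_star = grid[int(np.argmax(scores))]
    frontier.append(outcomes(t_star))

for g, pi in sorted(set(frontier)):
    print(f"growth {g:5.2f}%  inflation {pi:5.2f}%")
```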

2.
This paper explores ways to integrate model uncertainty into policy evaluation. We describe a general framework that includes both model averaging methods and measures that describe whether policies and their consequences are model dependent. These general ideas are then applied to assess simple monetary policy rules for some standard New Keynesian specifications. We conclude that the original Taylor rule has good robustness properties, but that its overall stabilization performance may reasonably be challenged by alternative simple rules, even when those rules employ parameters set without accounting for model uncertainty.
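
As a schematic of model-averaged rule evaluation, the sketch below computes a quadratic stabilization loss for two simple interest rate rules under three candidate laws of motion and averages the losses with equal model weights. The AR(1) "models" and all parameter values are illustrative assumptions, not the New Keynesian specifications examined in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical evaluation of simple rules under model uncertainty: average a
# quadratic loss across several assumed laws of motion for inflation and the
# output gap, then compare rules on the model-averaged (and worst-case) loss.
def simulate_loss(phi_pi, phi_y, persistence, T=2000):
    """Average quadratic loss in inflation and the output gap under one model."""
    pi, y, loss = 0.0, 0.0, 0.0
    for _ in range(T):
        i = phi_pi * pi + phi_y * y                    # simple Taylor-type rule
        pi = persistence * pi - 0.1 * i + rng.normal(scale=0.5)
        y = persistence * y - 0.2 * i + rng.normal(scale=0.5)
        loss += pi ** 2 + 0.5 * y ** 2
    return loss / T

rules = {"Taylor (1.5, 0.5)": (1.5, 0.5), "aggressive (3.0, 1.0)": (3.0, 1.0)}
models = [0.5, 0.7, 0.9]                               # candidate persistence parameters
weights = np.ones(len(models)) / len(models)           # equal model weights

for name, (phi_pi, phi_y) in rules.items():
    losses = np.array([simulate_loss(phi_pi, phi_y, rho) for rho in models])
    print(name, "averaged loss:", float(weights @ losses), "max loss:", float(losses.max()))
```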

3.
Taking Shanghai as the study area, this paper constructs a DFSR model comprising driving-force, state and response indicators to evaluate how housing prices respond to real-estate tax policy. Principal component analysis is used to derive a composite score for the driving-force indicators, and the quantified indicators are then used to establish the role of real-estate tax policy in shaping housing prices. The results show that adjustments to real-estate tax policy follow a clearly cyclical pattern, and that when policy intervention is strong its effect on housing prices is pronounced. Real-estate tax policy also operates with a lag: the intensity of policy adjustment and changes in housing prices are not fully synchronized. Finally, corresponding reform measures are proposed as a reference for optimizing real-estate tax policy.
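
A minimal sketch of the principal-component scoring step described above: standardize a block of driving-force indicators, extract components by singular value decomposition, and form a variance-weighted composite score. The indicator matrix is simulated and the 85% variance cutoff is an assumption, not a value taken from the study.

```python
import numpy as np

# Composite driving-force score via PCA on a simulated indicator panel
# (20 years x 5 hypothetical indicators).
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 5))

Z = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each indicator
U, s, Vt = np.linalg.svd(Z, full_matrices=False)  # PCA via singular value decomposition
explained = s ** 2 / np.sum(s ** 2)               # variance share of each component

k = int(np.searchsorted(np.cumsum(explained), 0.85)) + 1   # keep ~85% of variance
scores = Z @ Vt[:k].T                             # component scores
composite = scores @ explained[:k] / explained[:k].sum()   # variance-weighted composite

print("components kept:", k, "explained:", np.round(explained[:k], 3))
print("composite driving-force score by year:", np.round(composite, 2))
```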

4.
This paper analyzes data from a majoritarian bargaining experiment. A learning model is proposed to account for the evolution of play in this experiment. It is also suggested that an adjustment must be made to account for the panel structure of the data. Such adjustments have been used in other fields and are known to be important, as unadjusted standard errors may be severely biased downward. These results indicate that the adjustment also has an important effect in this application. Furthermore, an efficient estimator that takes into account heterogeneity across players is proposed. A single learning model accounting for the paths of play under two different amendment rules cannot be rejected with the standard estimator and adjusted standard errors; it can, however, be rejected using the efficient estimator. The data and the estimated learning model suggest that after proposing “fair” divisions, subjects adapt and their proposals change rapidly in the treatment where uneven proposals are almost always accepted. Their beliefs in the estimated learning model are influenced by more than just the most recent outcomes.
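
As an illustration of the kind of panel adjustment discussed above, the sketch below compares naive OLS standard errors with standard errors clustered by subject on simulated repeated-play data. It is a generic variance correction under assumed data, not the paper's learning model or its efficient estimator.

```python
import numpy as np

# Cluster-robust standard errors, clustering by subject to respect the panel
# structure of repeated play.  Data are simulated with subject-level shocks,
# so naive OLS standard errors are biased downward.
rng = np.random.default_rng(2)
n_subjects, periods = 30, 10
subject = np.repeat(np.arange(n_subjects), periods)
u_subject = rng.normal(scale=1.0, size=n_subjects)          # subject-level shocks
x = rng.normal(size=n_subjects * periods)
y = 1.0 + 0.5 * x + u_subject[subject] + rng.normal(size=x.size)

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta

XtX_inv = np.linalg.inv(X.T @ X)
meat = np.zeros((2, 2))
for g in range(n_subjects):                                  # sum outer products by cluster
    Xg, ug = X[subject == g], resid[subject == g]
    sg = Xg.T @ ug
    meat += np.outer(sg, sg)
cluster_vcov = XtX_inv @ meat @ XtX_inv

naive_se = np.sqrt(np.diag(XtX_inv) * resid.var())
cluster_se = np.sqrt(np.diag(cluster_vcov))
print("naive SE:", np.round(naive_se, 3), "cluster-robust SE:", np.round(cluster_se, 3))
```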

5.
We estimate versions of the Nelson–Siegel model of the yield curve of US government bonds using a Markov switching latent variable model that allows for discrete changes in the stochastic process followed by the interest rates. Our modeling approach is motivated by evidence suggesting the existence of breaks in the behavior of the US yield curve that depend, for example, on whether the economy is in a recession or a boom, or on the stance of monetary policy. Our model is parsimonious, relatively easy to estimate and flexible enough to match the changing shapes of the yield curve over time. We also derive the discrete-time no-arbitrage restrictions for the Markov switching model. We compare the forecasting performance of these models with that of the standard dynamic Nelson–Siegel model and an extension that allows the decay-rate parameter to be time varying. We show that some parametrizations of our model with regime shifts outperform the single-regime Nelson–Siegel model and other standard empirical models of the yield curve.
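
The sketch below shows the single-regime Nelson–Siegel building block: given a decay parameter, a cross-section of yields is regressed on the level, slope and curvature loadings. The maturities, yields and decay value are illustrative assumptions, and the Markov-switching and no-arbitrage layers of the paper are omitted.

```python
import numpy as np

# Cross-sectional Nelson-Siegel fit: yields regressed on the three factor
# loadings implied by a fixed decay parameter lambda.
def ns_loadings(maturities, lam):
    m = np.asarray(maturities, dtype=float)
    slope = (1 - np.exp(-lam * m)) / (lam * m)
    curvature = slope - np.exp(-lam * m)
    return np.column_stack([np.ones_like(m), slope, curvature])

maturities = np.array([0.25, 1, 2, 3, 5, 7, 10])                  # years
observed_yields = np.array([4.9, 5.0, 5.2, 5.3, 5.5, 5.6, 5.7])   # illustrative data, in %

X = ns_loadings(maturities, lam=0.6)
level, slope, curv = np.linalg.lstsq(X, observed_yields, rcond=None)[0]
fitted = X @ np.array([level, slope, curv])

print("level %.2f  slope %.2f  curvature %.2f" % (level, slope, curv))
print("fitted yields:", np.round(fitted, 2))
```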

6.
We introduce a modified conditional logit model that takes account of uncertainty associated with mis-reporting in revealed preference experiments estimating willingness-to-pay (WTP). Like Hausman et al. [Journal of Econometrics (1998) Vol. 87, pp. 239–269], our model captures the extent and direction of uncertainty by respondents. Using a Bayesian methodology, we apply our model to a choice modelling (CM) data set examining UK consumer preferences for non-pesticide food. We compare the results of our model with those of the Hausman model. WTP estimates are produced for different groups of consumers, and we find that modified estimates of WTP, which take account of mis-reporting, are substantially revised downwards. We find a significant proportion of respondents mis-reporting in favour of the non-pesticide option. Finally, with this data set, Bayes factors suggest that our model is preferred to the Hausman model.
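
The sketch below shows the standard conditional logit building block behind choice-based WTP: choice probabilities from attribute utilities, and WTP read off as the ratio of an attribute coefficient to the (negative) price coefficient. The attributes and coefficient values are invented for illustration, and the misreporting adjustment and Bayesian estimation of the paper are omitted.

```python
import numpy as np

# Conditional logit choice probabilities for one choice set and the implied
# willingness-to-pay for a product attribute.
def choice_probs(attributes, beta):
    """Conditional logit probabilities over the alternatives in one choice set."""
    v = attributes @ beta
    ev = np.exp(v - v.max())                       # numerically stable softmax
    return ev / ev.sum()

# Columns: price, non-pesticide indicator (illustrative attributes).
choice_set = np.array([[2.0, 0.0],                 # conventional product
                       [2.6, 1.0]])                # non-pesticide alternative
beta = np.array([-1.2, 1.0])                       # price and attribute coefficients

print("choice probabilities:", np.round(choice_probs(choice_set, beta), 3))
print("implied WTP for the non-pesticide attribute:", round(-beta[1] / beta[0], 2))
```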

7.
This paper describes a dynamic stochastic general equilibrium model augmented with labour frictions, namely indivisible labour, predetermined employment and adjustment costs. These frictions improve the fit to the data, as shown by a higher log marginal likelihood and a closer match to key business cycle statistics. The labour frictions introduced are relevant for model dynamics and economic policy: the effect of total factor productivity shocks on most macroeconomic variables is substantially mitigated; fiscal policy leads to a greater crowding out of private sector activity; and monetary policy has a lower impact on output. Labour frictions also provide a better match to impulse response functions from vector autoregressive models.

8.
Accounting for the uncertainty in real-time perceptions of the state of the economy is believed to be critical for monetary policy analysis. We investigate this claim through the lens of a New Keynesian model with optimal discretionary policy and partial information. Structural parameters are estimated using a data set that includes real-time and ex post revised observations spanning 1965–2010. In comparison to a standard complete information model, our estimates reveal that under partial information: (i) the Federal Reserve demonstrates a significant concern for stabilizing the output gap after 1979, (ii) the model's fit with revised data improves, and (iii) the tension between optimal and observed policy is smaller.

9.
Our paper estimates the effect of US internal migration on wage growth for young men between their first and second job. Our analysis of migration extends previous research by: (i) exploiting the distance-based measures of migration in the National Longitudinal Surveys of Youth 1979 (NLSY79); (ii) allowing the effect of migration to differ by schooling level; (iii) using propensity score matching to estimate the average treatment effect on the treated (ATET) for movers; and (iv) using local average treatment effect (LATE) estimators with covariates to estimate the average treatment effect (ATE) and the ATET for compliers. We believe the Conditional Independence Assumption (CIA) is reasonable for our matching estimators since the NLSY79 provides a relatively rich array of variables on which to match. Our matching methods are based on local linear, local cubic, and local linear ridge regressions. Local linear and local ridge regression matching produce relatively similar point estimates and standard errors, while local cubic regression matching badly over-fits the data and provides very noisy estimates. We use the bootstrap to calculate standard errors. Since the validity of the bootstrap has not been investigated for the matching estimators we use, and has been shown to be invalid for nearest neighbor matching estimators, we conduct a Monte Carlo study on the appropriateness of using the bootstrap to calculate standard errors for local linear regression matching. The data-generating processes in our Monte Carlo study are relatively rich and calibrated to match our empirical models or to test the sensitivity of our results to the choice of parameter values. The estimated standard errors from the bootstrap are very close to those from the Monte Carlo experiments, which lends support to our using the bootstrap to calculate standard errors in our setting. From the matching estimators we find a significant positive effect of migration on the wage growth of college graduates, and a marginally significant negative effect for high school dropouts. We do not find any significant effects for other educational groups or for the overall sample. Our results are generally robust to changes in the model specification and changes in our distance-based measure of migration. We find that better data matters: if we use a measure of migration based on moving across county lines, we overstate the number of moves, while if we use a measure based on moving across state lines, we understate the number of moves. Further, using either the county or state measures leads to much less precise estimates. We also consider semi-parametric LATE estimators with covariates (Frölich, 2007), using two sets of instrumental variables. We precisely estimate the proportion of compliers in our data, but because we have a small number of compliers, we cannot obtain precise LATE estimates.
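
The sketch below illustrates local-linear-regression matching with bootstrapped standard errors on simulated data. It matches on a single scalar covariate rather than an estimated propensity score, and the bandwidth, sample design and treatment effect are assumptions for illustration, not the NLSY79 setup.

```python
import numpy as np

# Local linear matching estimate of the ATT, with a nonparametric bootstrap
# for the standard error.  Simulated data with true ATT = 1.0.
rng = np.random.default_rng(3)

def local_linear(x0, x, y, h):
    """Local linear prediction of y at x0 with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]

def att(x, d, y, h=0.5):
    """Average treatment effect on the treated via local linear matching."""
    x1, y1 = x[d == 1], y[d == 1]
    x0, y0 = x[d == 0], y[d == 0]
    matched = np.array([local_linear(xi, x0, y0, h) for xi in x1])
    return np.mean(y1 - matched)

n = 400
x = rng.normal(size=n)
d = (x + rng.normal(size=n) > 0).astype(int)               # selection on x
y = 1.0 + 0.5 * x + 1.0 * d + rng.normal(size=n)           # true ATT = 1.0

point = att(x, d, y)
boot = []
for _ in range(200):
    idx = rng.integers(0, n, n)                            # resample with replacement
    boot.append(att(x[idx], d[idx], y[idx]))
print("ATT estimate: %.2f  bootstrap SE: %.2f" % (point, np.std(boot)))
```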

10.
A formal test on the Lyapunov exponent is developed to distinguish a random walk model from a chaotic system, based on the Nadaraya–Watson kernel estimator of the Lyapunov exponent. The asymptotic null distribution of our test statistic is free of nuisance parameters and is simply given by the range of a standard Brownian motion on the unit interval. The test is consistent against chaotic alternatives. A simulation study shows that the test performs reasonably well in finite samples. We apply our test to several standard macroeconomic and financial time series and find no significant empirical evidence of chaos.
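
The sketch below conveys the kernel idea behind such a test: fit the one-step map by Nadaraya–Watson regression, differentiate it numerically, and average the log absolute slope along the sample as a Lyapunov exponent estimate. The bandwidths and series are illustrative, and the formal test statistic and its Brownian-range null distribution from the paper are not reproduced.

```python
import numpy as np

# Kernel-based Lyapunov exponent estimate: positive for a chaotic map,
# roughly zero for a random walk (whose one-step map has unit slope).
rng = np.random.default_rng(4)

def nw(x0, x, y, h):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def lyapunov(series, h, eps=1e-3):
    x, y = series[:-1], series[1:]
    slopes = [(nw(xi + eps, x, y, h) - nw(xi - eps, x, y, h)) / (2 * eps) for xi in x]
    return np.mean(np.log(np.maximum(np.abs(slopes), 1e-12)))

# Logistic map (chaotic, exponent near log 2) versus a random walk.
z = np.empty(500)
z[0] = 0.3
for t in range(499):
    z[t + 1] = 4 * z[t] * (1 - z[t])
print("logistic map :", round(lyapunov(z, h=0.05), 2))

rw = np.cumsum(rng.normal(size=500))
print("random walk  :", round(lyapunov(rw / rw.std(), h=0.3), 2))
```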

11.
We investigate the economic significance of trading off the empirical validity of models against other desirable model properties. Our investigation is based on three alternative econometric systems of the supply side, in a model that can be used to discuss optimal monetary policy in Norway. Our results caution against compromising empirical validity when selecting a model for policy analysis. We also find large costs from basing policies on the robust model, or on a suite of models, even when the suite contains the valid model. This confirms an important role for econometric modelling and evaluation in model choice for policy analysis.

12.
This paper proposes a model of the US unemployment rate that accounts for both its asymmetry and its long memory. Our approach introduces fractional integration and nonlinearities simultaneously within a single framework, using a Lagrange multiplier procedure with a standard limiting null distribution. The empirical results suggest that the US unemployment rate can be specified as a fractionally integrated process that interacts with nonlinear functions of labour-demand variables such as real oil prices and real interest rates. We also find evidence of a long-memory component. Our results are consistent with a hysteresis model with path dependency rather than a non-accelerating inflation rate of unemployment (NAIRU) model with an underlying equilibrium unemployment rate, thereby giving support to more activist stabilization policies. However, any suitable model should also include business cycle asymmetries, with implications for both forecasting and policy-making.
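
The sketch below shows the fractional-differencing filter (1 - L)^d that underlies a fractionally integrated (long-memory) process: the binomial weights are computed recursively and used to build an ARFIMA(0, 0.4, 0) series whose autocorrelations decay slowly. The value d = 0.4 is illustrative, and the LM test and nonlinear labour-demand terms of the paper are not reproduced.

```python
import numpy as np

# Fractional differencing/integration weights and a simulated long-memory series.
def frac_diff_weights(d, n):
    """First n coefficients of the binomial expansion of (1 - L)^d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

rng = np.random.default_rng(9)
eps = rng.normal(size=500)
w = frac_diff_weights(-0.4, eps.size)                 # inverse filter: integrate with d = 0.4
x = np.array([w[:t + 1][::-1] @ eps[:t + 1] for t in range(eps.size)])

# Long memory shows up as slowly decaying autocorrelations relative to white noise.
acf = [np.corrcoef(x[:-k], x[k:])[0, 1] for k in (1, 5, 10, 20)]
print("autocorrelations at lags 1, 5, 10, 20:", np.round(acf, 2))
```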

13.
Central banks regularly make forecasts, such as the Fed's Greenbook forecast, that are conditioned on hypothetical paths for the policy interest rate. While there are good public policy reasons to evaluate the quality of such forecasts, until now the most common approach has been to ignore their conditional nature and apply standard forecast efficiency tests. In this paper we derive tests for the efficiency of conditional forecasts. Intuitively, these tests involve implicit estimates of the degree to which the conditioning path is counterfactual and of the magnitude of the policy feedback over the forecast horizon. We apply the tests to the Greenbook forecast and the Bank of England's inflation report forecast, finding some evidence of forecast inefficiency. Nonetheless, we argue that the conditional nature of the forecasts made by central banks represents a substantial impediment to the analysis of their quality: stronger assumptions are needed, and forecast inefficiency may go undetected for longer than would be the case if central banks were instead to report unconditional forecasts.
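
For orientation, the sketch below runs the standard (unconditional) efficiency check that the paper takes as its starting point: regress forecast errors on information available at forecast time and test whether the coefficients are zero. The data are simulated to be deliberately inefficient, and the corrections for conditioning paths derived in the paper are not reproduced.

```python
import numpy as np

# Standard forecast-efficiency regression: under efficiency, forecast errors
# should be unpredictable from the forecaster's information set.
rng = np.random.default_rng(5)
T = 120
info = rng.normal(size=T)                       # variable in the information set
error = 0.2 + 0.4 * info + rng.normal(scale=0.8, size=T)   # inefficient by construction

X = np.column_stack([np.ones(T), info])
beta = np.linalg.solve(X.T @ X, X.T @ error)
resid = error - X @ beta
vcov = np.linalg.inv(X.T @ X) * (resid @ resid) / (T - 2)
t_stats = beta / np.sqrt(np.diag(vcov))

print("intercept t-stat: %.2f  slope t-stat: %.2f" % tuple(t_stats))
print("efficiency rejected:", bool(np.any(np.abs(t_stats) > 1.96)))
```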

14.
We propose a general two-step estimator for a popular Markov discrete choice model that includes a class of Markovian games with a continuous observable state space. Our estimation procedure generalizes the computationally attractive methodology of Pesendorfer and Schmidt-Dengler (2008), which assumed finitely many observable states. This extension is non-trivial, as the policy value functions are solutions to type II integral equations. We show that the inverse problem is well-posed. We provide a set of primitive conditions that ensure root-T-consistent estimation of the finite-dimensional structural parameters, and we derive the distribution theory for the value functions in a time series framework.

15.
US monetary policy is investigated using a regime-switching no-arbitrage term structure model that relies on inflation, output and the short interest rate as factors. The model is complemented with a set of assumptions that allow the dynamics of the private sector to be separated from monetary policy. The monetary policy regimes cannot be estimated if the yield curve is ignored during estimation. A counterfactual analysis evaluates the importance of policy regimes and shock regimes for the Great Moderation. The low-volatility regime of exogenous shocks plays an important role. Monetary policy contributes by trading off asymmetric responses of output and inflation under different regimes.

16.
The paper starts from a discussion of the challenges posed by the crisis to standard macroeconomics and the solutions adopted within the DSGE community. Although several recent improvements have enhanced the realism of standard models, we argue that major drawbacks still undermine their reliability. In particular, DSGE models still fail to recognize the complex adaptive nature of economic systems and the implications of money endogeneity. The paper argues that a coherent and exhaustive representation of the inter-linkages between the real and financial sides of the economy should be a pivotal feature of every macroeconomic model, and it proposes a macroeconomic framework based on the combination of the Agent Based and Stock Flow Consistent approaches. The paper aims to contribute to the nascent AB-SFC literature in two fundamental respects. First, we develop a fully decentralized AB-SFC model with several innovative features and thoroughly validate it in order to check whether the model is a good candidate for policy analysis applications. Results suggest that the properties of the model match many empirical regularities, ranking among the best performers in the related literature, and that these properties are robust across different parameterizations. Second, the paper also has a methodological purpose, in that we try to provide a set of rules and tools to build, calibrate, validate and display AB-SFC models.

17.
Traditional vector autoregressions derive impulse responses using iterative techniques that may compound specification errors. Local projection techniques are more robust to this problem, and Monte Carlo evidence suggests they provide reliable estimates of the true impulse responses. We use local linear projections to investigate the dynamic properties of a model for a small open economy, New Zealand. We compare impulse responses from projections with those from standard techniques, and consider the implications for monetary policy. We pay careful attention to the dimensionality of the model, and focus on the effects of policy on gross domestic product, interest rates, prices and exchange rates.
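
The sketch below shows the basic local-projection idea: for each horizon h, regress the outcome h periods ahead directly on the current shock (plus a lag as a control), so impulse responses are not built up by iterating a fitted VAR. The data are simulated from a known AR(1), an assumption made so the estimates can be checked against the true response 0.8^h; this is not the paper's New Zealand model.

```python
import numpy as np

# Local-projection impulse responses: one direct regression per horizon.
rng = np.random.default_rng(6)
T = 500
shock = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + shock[t]            # true impulse response: 0.8**h

H = 6
for h in range(H + 1):
    yh = y[h + 1:]                              # y_{t+h}
    s = shock[1:T - h]                          # shock_t
    ylag = y[:T - h - 1]                        # y_{t-1} as a control
    X = np.column_stack([np.ones_like(s), s, ylag])
    beta = np.linalg.lstsq(X, yh, rcond=None)[0]
    print(f"h={h}: estimated {beta[1]:5.2f}   true {0.8 ** h:5.2f}")
```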

18.
Policy makers must base their decisions on preliminary and partially revised data of varying reliability. Realistic modeling of data revisions is required to guide decision makers in their assessment of current and future conditions. This paper provides a new framework with which to model data revisions. Recent empirical work suggests that measurement errors typically have much more complex dynamics than existing models of data revisions allow. This paper describes a state-space model that allows for richer dynamics in these measurement errors, including the noise, news and spillover effects documented in this literature. We also show how to relax the common assumption that “true” values are observed after a few revisions. The result is a unified and flexible framework that allows for more realistic data revision properties, and allows the use of standard methods for optimal real-time estimation of trends and cycles. We illustrate the application of this framework with real-time data on US real output growth.
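
The sketch below illustrates the simplest state-space treatment of data revisions: the preliminary release is a noisy measurement of true growth, and a scalar Kalman filter extracts the real-time estimate. All parameters and the simulated data are assumptions; the richer news and spillover dynamics the paper models are omitted.

```python
import numpy as np

# Real-time signal extraction with a classical measurement-error ("noise")
# revision: AR(1) true growth, noisy first release, scalar Kalman filter.
rng = np.random.default_rng(7)
T, rho, q, r = 200, 0.7, 0.5, 0.4                    # length, persistence, state var., noise var.

true = np.zeros(T)
for t in range(1, T):
    true[t] = rho * true[t - 1] + rng.normal(scale=np.sqrt(q))
preliminary = true + rng.normal(scale=np.sqrt(r), size=T)   # first-release data

est, P = 0.0, 1.0
filtered = np.empty(T)
for t in range(T):
    est, P = rho * est, rho ** 2 * P + q                    # predict
    K = P / (P + r)                                         # Kalman gain
    est, P = est + K * (preliminary[t] - est), (1 - K) * P  # update on the release
    filtered[t] = est

print("RMSE preliminary vs. true:", round(float(np.sqrt(np.mean((preliminary - true) ** 2))), 3))
print("RMSE filtered    vs. true:", round(float(np.sqrt(np.mean((filtered - true) ** 2))), 3))
```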

19.
For eight major national currencies, this study estimates a t-distribution GARCH model of daily changes in spot nominal exchange rates and uses it to test several hypotheses. The sample period covered is June 1, 1982 through September 30, 1992. Using likelihood ratio and parameter stability tests, the study finds that for most of the currencies considered, both the conditional means and the conditional variances of unexpected exchange rate changes experienced statistically significant structural breaks across the five subperiods associated with four episodes of international foreign-exchange policy coordination. The study also finds that the orderings of the GARCH-estimated unconditional standard deviations roughly match the orderings of the sample standard deviations across the five subperiods. An explanation is provided of the underlying factors that contributed to these structural shifts.
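
The sketch below writes out the GARCH(1,1) variance recursion and the Student-t log-likelihood it implies for a series of exchange-rate changes. The returns are simulated and the parameters are fixed rather than estimated by maximum likelihood, and the likelihood-ratio and stability tests of the study are not reproduced.

```python
import numpy as np
from math import lgamma

# GARCH(1,1) conditional variances and the standardized Student-t log-likelihood.
def garch_t_loglik(returns, omega, alpha, beta, nu):
    T = returns.size
    h = np.empty(T)
    h[0] = returns.var()                               # initialize at the sample variance
    for t in range(1, T):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    z2 = returns ** 2 / h
    const = lgamma((nu + 1) / 2) - lgamma(nu / 2) - 0.5 * np.log(np.pi * (nu - 2))
    return np.sum(const - 0.5 * np.log(h) - (nu + 1) / 2 * np.log1p(z2 / (nu - 2)))

rng = np.random.default_rng(8)
returns = rng.standard_t(df=6, size=1000) * 0.6        # illustrative daily % changes

for nu in (4, 6, 30):                                  # compare tail-thickness assumptions
    ll = garch_t_loglik(returns, omega=0.02, alpha=0.05, beta=0.90, nu=nu)
    print(f"nu = {nu:2d}: log-likelihood {ll:9.2f}")
```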

20.
Modeling the joint term structure of interest rates in the United States and the European Union, the two largest economies in the world, is extremely important in international finance. In this article, we provide both theoretical and empirical analysis of multi-factor joint affine term structure models (ATSM) for dollar and euro interest rates. In particular, we provide a systematic classification of multi-factor joint ATSM similar to that of Dai and Singleton (2000). A principal component analysis of daily dollar and euro interest rates reveals four factors in the data. We estimate four-factor joint ATSM using the approximate maximum likelihood method of Aït-Sahalia (2002, forthcoming) and compare the in-sample and out-of-sample performance of these models using some of the latest nonparametric methods. We find that a new four-factor model with two common and two local factors captures the joint term structure dynamics in the US and the EU reasonably well.
