Similar literature
 20 similar articles retrieved (search time: 31 ms)
1.
This paper focuses on the relationship between political instability, policymaking and macroeconomic outcomes. The theoretical section explores various models that explain the effect of instability (and political uncertainty) on growth, budget formation, inflation and monetary policy. The empirical section discusses the evidence on the predictions generated by theoretical models. Preliminary to this discussion, however, is the analysis of a few general issues concerning the specification and estimation of econometric models with political variables. Some new results are then produced on the empirical relevance of theories of the strategic use of fiscal deficits.

2.
This paper derives limit distributions of empirical likelihood estimators for models in which inequality moment conditions provide overidentifying information. We show that the use of this information leads to a reduction of the asymptotic mean-squared estimation error and propose asymptotically uniformly valid tests and confidence sets for the parameters of interest. While inequality moment conditions arise in many important economic models, we use a dynamic macroeconomic model as a data generating process and illustrate our methods with instrumental variable estimators of monetary policy rules. The results obtained in this paper extend to conventional GMM estimators.

3.
4.
The empirical analysis of monetary policy requires the construction of instruments for future expected inflation. Dynamic factor models have been applied rather successfully to inflation forecasting. In fact, two competing methods have recently been developed to estimate large‐scale dynamic factor models based, respectively, on static and dynamic principal components. This paper combines the econometric literature on dynamic principal components and the empirical analysis of monetary policy. We assess the two competing methods for extracting factors on the basis of their success in instrumenting future expected inflation in the empirical analysis of monetary policy. We use two large data sets of macroeconomic variables for the USA and for the Euro area. Our results show that estimated factors do provide a useful parsimonious summary of the information used in designing monetary policy. Copyright © 2005 John Wiley & Sons, Ltd.
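As a minimal sketch of the static principal-components idea behind such factor extraction: the first factor can be recovered by power iteration on the cross-product matrix of the panel. This is illustrative only (simulated one-factor data; the paper's large-scale estimators are far more elaborate).

```python
import random

def first_pc(X, iters=200):
    """First principal component of data matrix X (rows = time periods) via
    power iteration on X'X; returns the estimated factor series X @ v."""
    n, k = len(X), len(X[0])
    # cross-product matrix S = X'X
    S = [[sum(X[t][i] * X[t][j] for t in range(n)) for j in range(k)]
         for i in range(k)]
    v = [1.0] * k
    for _ in range(iters):
        w = [sum(S[i][j] * v[j] for j in range(k)) for i in range(k)]
        norm = sum(a * a for a in w) ** 0.5
        v = [a / norm for a in w]
    return [sum(X[t][j] * v[j] for j in range(k)) for t in range(n)]

random.seed(4)
n, k = 300, 10
f = [random.gauss(0, 1) for _ in range(n)]              # common factor
X = [[f[t] + 0.5 * random.gauss(0, 1) for _ in range(k)] for t in range(n)]
fhat = first_pc(X)
# fhat should track f closely (up to an arbitrary sign and scale)
```

The estimated factor is identified only up to sign and scale, which is why applications typically normalize it before using it as an instrument.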

5.
Abstract. The assumptions and conclusions of New Classical Macroeconomics (NCM) are critically examined. NCM grew out of the alleged failure of the Keynesian school to deal with the stagflation of the 1970s. The two fundamental ideas of NCM are the rational expectations hypothesis and the theory of instantaneous market clearing. According to NCM, fiscal and monetary policies will achieve desired results only if they are unanticipated. Business cycles are thought to result from imperfect information on the part of rational agents. NCM has been severely criticized by such prominent economists as Arrow, Tobin and Thurow.

6.
In a recent paper, Mercenier and Sekkat (1988) use a linear-quadratic model to examine the willingness of a monetary authority in a small open economy to target its exchange rate. Based on their empirical results, the authors conclude that the Bank of Canada has displayed a willingness to use the money supply to target the Canada–US exchange rate. We re-examine their empirical results using a different estimation approach and different assumptions about the forcing process of the exogenous variables. We also extend the sample period to include more recent observations. While we find some weak evidence to support their conclusion, the results in general suggest that a linear-quadratic model may not be a particularly useful representation of the assumed exchange rate targeting by a monetary authority.

7.
The mismatch between the timescale of DSGE (dynamic stochastic general equilibrium) models and the data used in their estimation translates into identification problems, estimation bias, and distortions in policy analysis. We propose an estimation strategy based on mixed‐frequency data to alleviate these shortcomings. The virtues of our approach are explored for two monetary policy models. Copyright © 2014 John Wiley & Sons, Ltd.

8.
Recently, single‐equation estimation by the generalized method of moments (GMM) has become popular in the monetary economics literature, for estimating forward‐looking models with rational expectations. We discuss a method for analysing the empirical identification of such models that exploits their dynamic structure and the assumption of rational expectations. This allows us to judge the reliability of the resulting GMM estimation and inference and reveals the potential sources of weak identification. With reference to the New Keynesian Phillips curve of Galí and Gertler [Journal of Monetary Economics (1999) Vol. 44, 195] and the forward‐looking Taylor rules of Clarida, Galí and Gertler [Quarterly Journal of Economics (2000) Vol. 115, 147], we demonstrate that the usual ‘weak instruments’ problem can arise naturally, when the predictable variation in inflation is small relative to unpredictable future shocks (news). Hence, we conclude that those models are less reliably estimated over periods when inflation has been under effective policy control.
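The weak-instruments problem the abstract describes is commonly diagnosed with the first-stage F statistic: when the instrument explains little of the endogenous regressor, F is small. A stylized sketch with simulated data and hypothetical coefficient values (not the authors' specification):

```python
import random

def first_stage_F(z, x):
    """F statistic from regressing x on a single instrument z (with intercept)."""
    n = len(z)
    mz, mx = sum(z) / n, sum(x) / n
    szz = sum((a - mz) ** 2 for a in z)
    szx = sum((a - mz) * (b - mx) for a, b in zip(z, x))
    beta = szx / szz
    ssr = sum((b - mx - beta * (a - mz)) ** 2 for a, b in zip(z, x))
    r2 = 1 - ssr / sum((b - mx) ** 2 for b in x)
    return (n - 2) * r2 / (1 - r2)

random.seed(3)
n = 500
z = [random.gauss(0, 1) for _ in range(n)]
# endogenous regressor with a tiny predictable component (weak instrument)
weak = [0.05 * a + random.gauss(0, 1) for a in z]
# versus a strongly predictable one
strong = [0.80 * a + random.gauss(0, 1) for a in z]
print(round(first_stage_F(z, weak), 1), round(first_stage_F(z, strong), 1))
```

In the abstract's terms, "weak" corresponds to periods when predictable inflation variation is small relative to news; a common rule of thumb flags F below roughly 10 as problematic.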

9.
We consider improved estimation strategies for a two-parameter inverse Gaussian distribution and use a shrinkage technique for the estimation of the mean parameter. In this context, two new shrinkage estimators are suggested and demonstrated to dominate the classical estimator under the quadratic risk with realistic conditions. Furthermore, based on our shrinkage strategy, a new estimator is proposed for the common mean of several inverse Gaussian distributions, which uniformly dominates the Graybill–Deal type unbiased estimator. The performance of the suggested estimators is examined by using simulated data and our shrinkage strategies are shown to work well. The estimation methods and results are illustrated by two empirical examples.
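The shrinkage idea can be sketched in a few lines: pull the sample mean toward a fixed target and compare quadratic risk by Monte Carlo. This is a simplified illustration with normal data and a fixed (hypothetical) target, not the paper's inverse Gaussian estimators, whose shrinkage weights are data-driven.

```python
import random
import statistics

def shrink_mean(sample, target, c=0.1):
    """Pull the sample mean toward a fixed target by weight c."""
    return (1 - c) * statistics.fmean(sample) + c * target

random.seed(0)
mu, n, reps = 2.0, 25, 2000          # true mean, sample size, replications
mse_mle = mse_shrink = 0.0
for _ in range(reps):
    s = [random.gauss(mu, 1.0) for _ in range(n)]
    mse_mle += (statistics.fmean(s) - mu) ** 2
    mse_shrink += (shrink_mean(s, 1.5) - mu) ** 2   # target 1.5, truth 2.0
print(round(mse_mle / reps, 4), round(mse_shrink / reps, 4))
```

With a target reasonably close to the truth, the small bias introduced by shrinking is more than offset by the variance reduction, which is the trade-off the dominance results formalize.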

10.
DISEQUILIBRIUM BUFFER STOCK MODELS: A SURVEY
Abstract. This paper surveys the growing literature on buffer stock models of the monetary transmission process. The first part sets out the basic issue of how (assumed) exogenous changes in the money supply work their way through the economic system via disequilibrium in the money market. After a brief historical overview, buffer stock models are divided into four types, depending on whether the disequilibrium arises in stocks and/or flows, and on whether changes in the money stock are purely exogenous. The empirical implications, including how buffer stock models explain the recent behaviour of money demand functions, are given. Theoretical and empirical criticisms of the models are presented in terms of this classification system. The survey concludes that whilst the buffer stock notion is an interesting idea, the current models do not lend themselves readily to empirical testing, and those models that do have performed poorly.

11.
Journal of Econometrics (1986), 32(3), 385–397
When explanatory variable data in a regression model are drawn from a population with grouped structure, the regression errors are often correlated within groups. Error component and random coefficient regression models are considered as models of the intraclass correlation. This paper analyzes several empirical examples to investigate the applicability of random effects models and the consequences of inappropriately using ordinary least squares (OLS) estimation in the presence of random group effects. The principal findings are that the assumption of independent errors is usually incorrect and the unadjusted OLS standard errors often have a substantial downward bias, suggesting a considerable danger of spurious regression.
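The size of the downward bias described above is often summarized by the Moulton design effect: with equal-sized clusters of size m, intraclass correlation rho, and a regressor fixed within groups, the OLS coefficient variance is understated by the factor 1 + (m - 1)·rho. A sketch with hypothetical numbers:

```python
import math

def moulton_factor(m, rho):
    """Variance inflation of the OLS coefficient variance with equal-sized
    clusters of size m and intraclass correlation rho (regressor constant
    within groups)."""
    return 1 + (m - 1) * rho

naive_se = 0.05        # hypothetical unadjusted OLS standard error
m, rho = 30, 0.1       # hypothetical cluster size and intraclass correlation
corrected_se = naive_se * math.sqrt(moulton_factor(m, rho))
print(round(corrected_se, 4))  # 0.0987 -- roughly double the naive SE
```

Even a modest intraclass correlation of 0.1 roughly doubles the standard error here, which is why t statistics computed from unadjusted OLS output can be badly overstated.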

12.
This paper focuses on the dynamic misspecification that characterizes the class of small‐scale New Keynesian models currently used in monetary and business cycle analysis, and provides a remedy for the typical difficulties these models have in accounting for the rich contemporaneous and dynamic correlation structure of the data. We suggest using a statistical model for the data as a device through which it is possible to adapt the econometric specification of the New Keynesian model such that the risk of omitting important propagation mechanisms is kept under control. A pseudo‐structural form is built from the baseline system of Euler equations by forcing the state vector of the system to have the same dimension as the state vector characterizing the statistical model. The pseudo‐structural form gives rise to a set of cross‐equation restrictions that do not penalize the autocorrelation structure and persistence of the data. Standard estimation and evaluation methods can be used. We provide an empirical illustration based on US quarterly data and a small‐scale monetary New Keynesian model.

13.
This paper proposes new approximate long-memory VaR models that incorporate intra-day price ranges. These models use the lagged intra-day range, with the feature of considering different range components calculated over different time horizons. We also investigate the impact of the market overnight return on the VaR forecasts, which has not yet been considered together with the range in VaR estimation. Model estimation is performed using linear quantile regression. An empirical analysis is conducted on 18 market indices. In spite of the simplicity of the proposed methods, the empirical results show that they successfully capture the main features of financial returns and are competitive with established benchmark methods. The empirical results also show that several of the proposed range-based VaR models, utilizing both the intra-day range and the overnight returns, are able to outperform GARCH-based methods and CAViaR models.
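The quantile-regression machinery behind such VaR models rests on the pinball (check) loss: the theta-quantile is the value minimizing average check loss. A minimal sketch with simulated returns (hypothetical data; the paper's models add range and overnight-return regressors):

```python
import random

def pinball_loss(theta, returns, q):
    """Average check loss of candidate VaR level q at probability theta."""
    return sum((theta - (r < q)) * (r - q) for r in returns) / len(returns)

random.seed(1)
returns = [random.gauss(0, 0.01) for _ in range(5000)]   # simulated returns
theta = 0.05

# the empirical theta-quantile minimizes the average pinball loss,
# so it serves as the (unconditional) VaR estimate
var_hat = sorted(returns)[int(theta * len(returns))]
print(round(var_hat, 4))
```

Linear quantile regression generalizes this by letting q be a linear function of regressors (lagged ranges, overnight returns) and minimizing the same loss over the coefficients.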

14.
ABSTRACT. This paper surveys equilibrium business cycle (EBC) theory, which has dominated the business cycle literature since the mid-1970s. It focuses primarily on the real business cycle (RBC) literature, the origin of which is traced to the monetary equilibrium business cycle (MBC) model developed by Lucas (1975). RBC and MBC models are themselves related to a wider class of linear stochastic business cycle models which, following Frisch (1933), view the cycle as the result of the propagation, by the economic system, of a series of random shocks. The MBC approach highlighted the importance of monetary shocks, but its failure to adequately explain observed fluctuations provided the impetus for the development of the RBC approach, which emphasises the importance of real shocks. This paper also appraises the empirical support for the RBC approach and finds it less than compelling. Given the failure of Keynesian and equilibrium linear stochastic business cycle models to fully explain economic fluctuations, the Frischian approach to business cycle modelling is called into question. Developments to existing models, which may help to clarify our understanding of business cycle behaviour, are discussed with a view to setting out a research agenda for the 1990s and beyond.

15.
This paper studies the empirical performance of stochastic volatility models for twenty years of weekly exchange rate data for four major currencies. We concentrate on the effects of the distribution of the exchange rate innovations, both on parameter estimates and on estimates of the latent volatility series. The density of the log of squared exchange rate innovations is modelled as a flexible mixture of normals. We use three different estimation techniques: quasi-maximum likelihood, simulated EM, and a Bayesian procedure. The estimated models are applied for pricing currency options. The major findings of the paper are that: (1) explicitly incorporating fat-tailed innovations increases the estimates of the persistence of volatility dynamics; (2) the estimation error of the volatility time series is very large; (3) this in turn causes standard errors on calculated option prices to be so large that these prices are rarely significantly different from a model with constant volatility. © 1998 John Wiley & Sons, Ltd.

16.
Recently, there has been considerable work on stochastic time-varying coefficient models as vehicles for modelling structural change in the macroeconomy, with a focus on the estimation of the unobserved paths of random coefficient processes. The dominant estimation methods in this context are based on various filters, such as the Kalman filter, that are applicable when the models are cast in state space representations. This paper introduces a new class of autoregressive bounded processes that decompose a time series into a persistent random attractor, a time-varying autoregressive component, and martingale difference errors. The paper rigorously examines alternative kernel-based nonparametric estimation approaches for such models and derives their basic properties. These estimators have long been studied in the context of deterministic structural change, but their use in the presence of stochastic time variation is novel. The proposed inference methods have desirable properties such as consistency and asymptotic normality and allow a tractable studentization. In extensive Monte Carlo and empirical studies, we find that the methods exhibit very good small-sample properties and can shed light on important empirical issues such as the evolution of inflation persistence and the purchasing power parity (PPP) hypothesis.
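The kernel-based approach can be illustrated with the simplest case: a time-varying AR(1) coefficient estimated by kernel-weighted least squares around each point in rescaled time. This is a hedged sketch under simplified assumptions (Gaussian kernel, hand-picked bandwidth, linearly drifting coefficient), not the authors' exact estimator for bounded autoregressive processes.

```python
import math
import random

def local_ar1(y, r, h=0.1):
    """Gaussian-kernel-weighted least-squares estimate of the AR(1)
    coefficient at rescaled time r in (0, 1), bandwidth h."""
    T = len(y)
    num = den = 0.0
    for t in range(1, T):
        w = math.exp(-0.5 * ((t / T - r) / h) ** 2)
        num += w * y[t] * y[t - 1]
        den += w * y[t - 1] ** 2
    return num / den

random.seed(2)
T = 2000
beta = lambda r: 0.2 + 0.5 * r          # smoothly drifting AR coefficient
y = [0.0]
for t in range(1, T):
    y.append(beta(t / T) * y[-1] + random.gauss(0, 1))
print(round(local_ar1(y, 0.5), 2))      # should sit near beta(0.5) = 0.45
```

The bandwidth h controls the usual bias-variance trade-off: wider windows smooth over genuine time variation, narrower ones raise sampling noise; the paper's studentization results are what make confidence bands for such estimates tractable.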

17.
This paper reviews recent research on the relationship between central bank policies and inequality. A new paradigm which integrates sticky prices, incomplete markets, and heterogeneity among households is emerging, allowing for the joint study of how inequality shapes macroeconomic aggregates and how macroeconomic shocks and policies affect inequality. The new paradigm features multiple distributional channels of monetary policy. Most empirical studies, however, analyze each potential channel of redistribution in isolation. Our review suggests that empirical research on the effects of conventional monetary policy on income and wealth inequality yields mixed findings, although there seems to be a consensus that higher inflation, at least above some threshold, increases inequality. In contrast to common wisdom, conclusions concerning the impact of unconventional monetary policies on inequality are also not clear cut. To better understand policy effects on inequality, future research should focus on the estimation of general equilibrium models with heterogeneous agents.

18.
Structural vector autoregressive (SVAR) models have emerged as a dominant research strategy in empirical macroeconomics, but suffer from the large number of parameters employed and the resulting estimation uncertainty associated with their impulse responses. In this paper, we propose general‐to‐specific (Gets) model selection procedures to overcome these limitations. It is shown that single‐equation procedures are generally efficient for the reduction of recursive SVAR models. The small‐sample properties of the proposed reduction procedure (as implemented using PcGets) are evaluated in a realistic Monte Carlo experiment. The impulse responses generated by the selected SVAR are found to be more precise and accurate than those of the unrestricted VAR. The proposed reduction strategy is then applied to the US monetary system considered by Christiano, Eichenbaum and Evans (Review of Economics and Statistics, Vol. 78, pp. 16–34, 1996). The results are consistent with the Monte Carlo evidence and question the validity of the impulse responses generated by the full system.

19.
Maximum Likelihood (ML) estimation of probit models with correlated errors typically requires high-dimensional truncated integration. Prominent examples of such models are multinomial probit models and binomial panel probit models with serially correlated errors. In this paper we propose to use a generic procedure known as Efficient Importance Sampling (EIS) for the evaluation of likelihood functions for probit models with correlated errors. Our proposed EIS algorithm covers the standard GHK probability simulator as a special case. We perform a set of Monte Carlo experiments in order to illustrate the relative performance of both procedures for the estimation of a multinomial multiperiod probit model. Our results indicate substantial numerical efficiency gains for ML estimates based on the GHK–EIS procedure relative to those obtained by using the GHK procedure.
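The GHK simulator mentioned above estimates rectangle probabilities of a correlated normal vector by sampling each component from its truncated conditional distribution and multiplying the conditioning probabilities. A minimal bivariate sketch (illustrative only; production GHK implementations use a proper inverse-normal routine rather than bisection):

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p):
    """Inverse normal CDF by bisection -- crude but adequate for a sketch."""
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def ghk_bivariate(a1, a2, rho, draws=20000, seed=0):
    """GHK estimate of P(X1 < a1, X2 < a2) for standard bivariate normals
    with correlation rho, via the Cholesky factor of the covariance."""
    rng = random.Random(seed)
    l21, l22 = rho, math.sqrt(1.0 - rho * rho)
    p1 = norm_cdf(a1)
    total = 0.0
    for _ in range(draws):
        # draw e1 from N(0,1) truncated to (-inf, a1)
        e1 = norm_ppf(rng.random() * p1)
        total += p1 * norm_cdf((a2 - l21 * e1) / l22)
    return total / draws

est = ghk_bivariate(0.0, 0.0, 0.5)
print(est)  # close to the closed form 1/4 + asin(0.5)/(2*pi) = 1/3
```

Because every draw contributes a product of probabilities rather than a 0/1 indicator, the GHK estimate has far lower variance than a crude frequency simulator; EIS refines the importance density further, which is the source of the efficiency gains the abstract reports.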

20.
The time-varying empirical spectral measure plays a major role in the treatment of inference problems for locally stationary processes. The properties of the empirical spectral measure and related statistics are studied, both when the index function is fixed and when it depends on the sample size. In particular, we prove a general central limit theorem. Several applications and examples are given, including semiparametric Whittle estimation, local least squares estimation and spectral density estimation.
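The basic building block of such spectral statistics is the periodogram ordinate. A self-contained sketch (direct DFT, stationary toy signal; local versions in the locally stationary setting compute the same quantity over tapered time windows):

```python
import cmath
import math

def periodogram(x, freq):
    """Periodogram ordinate I(freq) = |sum_t x_t e^{-i*freq*t}|^2 / (2*pi*T)."""
    T = len(x)
    s = sum(x[t] * cmath.exp(-1j * freq * t) for t in range(T))
    return abs(s) ** 2 / (2 * math.pi * T)

T = 256
lam = 2 * math.pi * 16 / T                 # a Fourier frequency
x = [math.cos(lam * t) for t in range(T)]  # pure cosine at frequency lam
peak = periodogram(x, lam)                 # large: signal concentrates here
off = periodogram(x, 2 * math.pi * 40 / T) # essentially zero off-frequency
print(round(peak, 2))  # 32/pi, approximately 10.19
```

Integrating such ordinates against index functions gives the empirical spectral measure; the central limit theorem in the abstract governs exactly these weighted integrals.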
