Similar Documents
20 similar documents found (search time: 0 ms)
1.
We propose methods for testing hypotheses of non-causality at various horizons, as defined in Dufour and Renault (1998, Econometrica 66, 1099–1125). We study in detail the case of VAR models and propose linear methods based on running vector autoregressions at different horizons. While the hypotheses considered are nonlinear, the proposed methods only require linear regression techniques as well as standard Gaussian asymptotic distributional theory. Bootstrap procedures are also considered. For the case of integrated processes, we propose extended regression methods that avoid nonstandard asymptotics. The methods are applied to a VAR model of the US economy.
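The core procedure lends itself to a compact illustration. The sketch below is only a schematic reading of the approach, not the authors' code: it runs a direct horizon-h autoregression and applies a HAC-robust Wald test to the lags of the candidate causal variable. The function name, lag order, and data names are illustrative assumptions.

```python
# Hedged sketch: testing non-causality from x to y at horizon h by running a
# direct ("horizon-h") autoregression and applying a Wald test to the x lags.
import numpy as np
import statsmodels.api as sm

def noncausality_test(y, x, h=2, p=4):
    """Regress y_{t+h} on p lags of (y, x) dated t, ..., t-p+1 and test
    whether the coefficients on the x lags are jointly zero (HAC errors)."""
    T = len(y)
    rows, targets = [], []
    for t in range(p - 1, T - h):
        lags = []
        for j in range(p):
            lags += [y[t - j], x[t - j]]
        rows.append(lags)
        targets.append(y[t + h])
    X = sm.add_constant(np.array(rows))
    res = sm.OLS(np.array(targets), X).fit(cov_type="HAC", cov_kwds={"maxlags": h})
    # Columns 2, 4, ..., 2p of X hold the x lags (column 0 is the constant).
    R = np.zeros((p, X.shape[1]))
    for j in range(p):
        R[j, 2 + 2 * j] = 1.0
    return res.wald_test(R)

# Example usage: print(noncausality_test(gdp_growth, spread, h=4, p=4))
```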

2.
Inference for multiple-equation Markov-chain models raises a number of difficulties that are unlikely to appear in smaller models. Our framework allows for many regimes in the transition matrix, without letting the number of free parameters grow as the square of the number of regimes, but also without losing a convenient form for the posterior distribution. Calculation of marginal data densities is difficult in these high-dimensional models. This paper gives methods to overcome these difficulties, and explains why existing methods are unreliable. It makes suggestions for maximizing posterior density and initiating MCMC simulations that provide robustness against the complex likelihood shape.

3.
When Japanese short-term bond yields were near their zero bound, yields on long-term bonds showed substantial fluctuation, and there was a strong positive relationship between the level of interest rates and yield volatilities/risk premiums. We explore whether several families of dynamic term structure models that enforce a zero lower bound on short rates imply conditional distributions of Japanese bond yields consistent with these patterns. Multi-factor “shadow-rate” and quadratic-Gaussian models, evaluated at their maximum likelihood estimates, capture many features of the data. Furthermore, model-implied risk premiums track realized excess returns during extended periods of near-zero short rates. In contrast, the conditional distributions implied by non-negative affine models do not match their sample counterparts, and standard Gaussian affine models generate implausibly large negative risk premiums.

4.
A random sample drawn from a population would appear to offer an ideal opportunity to use the bootstrap in order to perform accurate inference, since the observations of the sample are IID. In this paper, Monte Carlo results suggest that bootstrapping a commonly used index of inequality leads to inference that is not accurate even in very large samples, although inference with poverty indices is satisfactory. We find that the major cause is the extreme sensitivity of many inequality indices to the exact nature of the upper tail of the income distribution. This leads us to study two non-standard bootstraps, the m out of n bootstrap, which is valid in some situations where the standard bootstrap fails, and a bootstrap in which the upper tail is modelled parametrically. Monte Carlo results suggest that accurate inference can be achieved with this last method in moderately large samples.
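As a concrete illustration of the m out of n idea, the following sketch resamples m < n observations when bootstrapping a Gini coefficient. The subsample-size rule m = n^0.7 and the use of the Gini index are assumptions made for the example, not details taken from the paper.

```python
# Hedged sketch of the "m out of n" bootstrap for an inequality index.
import numpy as np

def gini(incomes):
    """Gini coefficient computed from the sorted sample."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = len(x)
    ranks = np.arange(1, n + 1)
    return (2.0 * np.sum(ranks * x) / (n * np.sum(x))) - (n + 1.0) / n

def m_out_of_n_bootstrap(incomes, n_boot=999, seed=0):
    """Resample m < n observations with replacement and return the
    bootstrap distribution of the Gini index."""
    rng = np.random.default_rng(seed)
    x = np.asarray(incomes, dtype=float)
    n = len(x)
    m = int(n ** 0.7)                      # subsample size: an assumed rule
    draws = rng.choice(x, size=(n_boot, m), replace=True)
    return np.array([gini(row) for row in draws])

# Example usage:
# dist = m_out_of_n_bootstrap(sample)
# ci = np.percentile(dist, [2.5, 97.5])
```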

5.
This paper conducts a broad-based comparison of iterated and direct multi-period forecasting approaches applied to both univariate and multivariate models in the form of parsimonious factor-augmented vector autoregressions. To account for serial correlation in the residuals of the multi-period direct forecasting models, we propose a new SURE-based estimation method and modified Akaike information criteria for model selection. Empirical analysis of the 170 variables studied by Marcellino, Stock and Watson (2006) shows that information in factors helps improve forecasting performance for most types of economic variables, although it can also lead to larger biases. It also shows that SURE estimation and finite-sample modifications to the Akaike information criterion can improve the performance of the direct multi-period forecasts.
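The distinction between the two approaches can be seen in a stripped-down univariate example. The sketch below ignores the factor augmentation, SURE estimation, and modified information criteria that are the paper's contribution; it simply contrasts an iterated AR(p) forecast with a direct horizon-h regression, and the AR order and function names are illustrative.

```python
# Hedged sketch: iterated versus direct h-step forecasts from a simple AR(p).
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def iterated_forecast(y, h, p=2):
    """Fit a one-step AR(p) and iterate it forward to horizon h."""
    res = AutoReg(np.asarray(y, dtype=float), lags=p).fit()
    return res.forecast(steps=h)[-1]

def direct_forecast(y, h, p=2):
    """Regress y_{t+h} directly on (1, y_t, ..., y_{t-p+1}) and forecast."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    X = np.column_stack(
        [np.ones(T - h - p + 1)] + [y[p - 1 - j:T - h - j] for j in range(p)]
    )
    target = y[p - 1 + h:]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    last = np.concatenate(([1.0], y[T - p:][::-1]))  # (1, y_{T-1}, ..., y_{T-p})
    return float(last @ beta)
```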

6.
Structural vector autoregressive (SVAR) models have emerged as a dominant research strategy in empirical macroeconomics, but suffer from the large number of parameters employed and the resulting estimation uncertainty associated with their impulse responses. In this paper, we propose general-to-specific (Gets) model selection procedures to overcome these limitations. It is shown that single-equation procedures are generally efficient for the reduction of recursive SVAR models. The small-sample properties of the proposed reduction procedure (as implemented using PcGets) are evaluated in a realistic Monte Carlo experiment. The impulse responses generated by the selected SVAR are found to be more precise and accurate than those of the unrestricted VAR. The proposed reduction strategy is then applied to the US monetary system considered by Christiano, Eichenbaum and Evans (Review of Economics and Statistics, Vol. 78, pp. 16–34, 1996). The results are consistent with the Monte Carlo evidence and question the validity of the impulse responses generated by the full system.

7.
General-to-Specific (GETS) modelling has witnessed major advances thanks to the automation of multi-path GETS specification search. However, the estimation complexity associated with financial models constitutes an obstacle to automated multi-path GETS modelling in finance. Making use of a recent result we provide and study simple but general and flexible methods that automate financial multi-path GETS modelling. Starting from a general model where the mean specification can contain autoregressive terms and explanatory variables, and where the exponential volatility specification can include log-ARCH terms, asymmetry terms, volatility proxies and other explanatory variables, the algorithm we propose returns parsimonious mean and volatility specifications.

8.
Steady-state restrictions are commonly imposed on highly persistent variables to achieve stationarity prior to confronting rational expectations models with data. However, the resulting steady-state deviations are often surprisingly persistent, indicating that some aspects of the underlying theory may be empirically problematic. This paper discusses how to formulate steady-state restrictions in rational expectations models with latent forcing variables and test their validity using cointegration techniques. The approach is illustrated by testing steady-state restrictions for alternative specifications of the New Keynesian model and is shown to be able to discriminate between different assumptions on the sources of the permanent shocks.

9.
A commonly used defining property of long memory time series is the power law decay of the autocovariance function. Some alternative methods of deriving this property are considered, working from the alternate definition in terms of a fractional pole in the spectrum at the origin. The methods considered involve the use of (i) Fourier transforms of generalized functions, (ii) asymptotic expansions of Fourier integrals with singularities, (iii) direct evaluation using hypergeometric function algebra, and (iv) conversion to a simple gamma integral. The paper is largely pedagogical but some novel methods and results involving complete asymptotic series representations are presented. The formulae are useful in many ways, including the calculation of long run variation matrices for multivariate time series with long memory and the econometric estimation of such models.
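For reference, the two characterizations referred to in the abstract can be written compactly. This is the standard textbook statement with memory parameter 0 < d < 1/2, not a formula reproduced from the paper; the constants c_f and c_γ are linked only under additional regularity conditions.

```latex
% Spectral (fractional pole at the origin) and time-domain (power-law
% autocovariance decay) characterizations of long memory:
\[
  f(\lambda) \sim c_f\, \lambda^{-2d} \quad \text{as } \lambda \to 0^{+}
  \qquad\text{and}\qquad
  \gamma(k) \sim c_\gamma\, k^{2d-1} \quad \text{as } k \to \infty .
\]
```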

10.
The technique of Monte Carlo (MC) tests [Dwass (1957, Annals of Mathematical Statistics 28, 181–187); Barnard (1963, Journal of the Royal Statistical Society, Series B 25, 294)] provides a simple method for building exact tests from statistics whose finite sample distribution is intractable but can be simulated (when no nuisance parameter is involved). We extend this method in two ways: first, by allowing for MC tests based on exchangeable possibly discrete test statistics; second, by generalizing it to statistics whose null distribution involves nuisance parameters [maximized MC (MMC) tests]. Simplified asymptotically justified versions of the MMC method are also proposed: these provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics.
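The basic MC test (no nuisance parameters) is simple enough to sketch directly; the exchangeability, discreteness, and MMC extensions of the paper are not reproduced here, and the simulate_statistic callable is a placeholder the user must supply.

```python
# Hedged sketch of the basic Monte Carlo test of Dwass/Barnard: simulate N
# draws of the statistic under the null and use the rank-based p-value.
import numpy as np

def mc_test_pvalue(s_obs, simulate_statistic, n_rep=99, seed=0):
    """Monte Carlo p-value for a right-tailed test.

    s_obs               observed value of the test statistic
    simulate_statistic  function rng -> one draw of the statistic under H0
    n_rep               number of simulated statistics (99, 999, ...)
    """
    rng = np.random.default_rng(seed)
    sims = np.array([simulate_statistic(rng) for _ in range(n_rep)])
    return (1 + np.sum(sims >= s_obs)) / (n_rep + 1)

# With continuous statistics this p-value is exact at levels of the form
# k / (n_rep + 1); ties from discrete statistics call for the randomized
# tie-breaking extension discussed in the paper, which is not sketched here.
```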

11.
Structural vector-autoregressive models are potentially very useful tools for guiding both macro- and microeconomic policy. In this study, we present a recently developed method for estimating such models, which uses non-normality to recover the causal structure underlying the observations. We show how the method can be applied to both microeconomic data (to study the processes of firm growth and firm performance) and macroeconomic data (to analyse the effects of monetary policy).

12.
Cointegration ideas as introduced by Granger in 1981 are commonly embodied in empirical macroeconomic modelling through the vector error correction model (VECM). It has become common practice in these models to treat some variables as weakly exogenous, resulting in conditional VECMs. This paper studies the consequences of different approaches to weak exogeneity for the dynamic properties of such models, in the context of two models of the UK economy, one a national-economy model, the other the UK submodel of a global model. Impulse response and common trend analyses are shown to be sensitive to these assumptions and other specification choices.

13.
This special issue of the Journal of Econometrics honors William A. Barnett’s exceptional contributions to unifying economic theory with rigorous statistical inference to interpret economic data and inform public policy. It is devoted to papers that advance microeconometrics, macroeconometrics, and financial econometrics to build models to interpret evidence.

14.
This paper proposes a testing strategy for the null hypothesis that a multivariate linear rational expectations (LRE) model has a unique stable solution (determinacy) against the alternative of multiple stable solutions (indeterminacy). The testing problem is addressed by a misspecification-type approach in which the overidentifying restrictions test obtained from the estimation of the system of Euler equations of the LRE model through the generalized method of moments is combined with a likelihood-based test for the cross-equation restrictions that the model places on its reduced form solution under determinacy. The resulting test has no power against a particular class of indeterminate equilibria; hence non-rejection of the null hypothesis cannot be interpreted conclusively as evidence of determinacy. On the other hand, this test (i) circumvents the nonstandard inferential problem generated by the presence of the auxiliary parameters that appear under indeterminacy and that are not identifiable under determinacy, (ii) does not involve inequality parametric restrictions and hence the use of nonstandard inference, (iii) is consistent against the dynamic misspecification of the LRE model, and (iv) is computationally simple. Monte Carlo simulations show that the suggested testing strategy delivers reasonable size coverage and power against dynamic misspecification in finite samples. An empirical illustration focuses on the determinacy/indeterminacy of a New Keynesian monetary business cycle model of the US economy.

15.
We introduce quasi-likelihood ratio tests for one-sided multivariate hypotheses to evaluate the null that a parsimonious model performs as well as a small number of models which nest the benchmark. The limiting distributions of the test statistics are non-standard. For critical values we consider: (i) bootstrapping and (ii) simulations assuming normality of the mean square prediction error difference. The proposed tests have good size and power properties compared with existing equal and superior predictive ability tests for multiple model comparison. We apply our tests to study the predictive ability of a Phillips curve-type model for US core inflation.

16.
The framework of Geweke (1982, Journal of the American Statistical Association 77, 304–324) and Hosoya (1991, Probability Theory and Related Fields 88, 429–444) is adopted to construct a simple test for causality in the frequency domain. This test can also be applied to cointegrated systems. To study the large sample properties of the test, we analyze the power against a sequence of local alternatives. The finite sample properties are investigated by means of Monte Carlo simulations. Our methodology is applied to investigate the predictive content of the yield spread for future output growth. Using quarterly US data we observe reasonable leading indicator properties at frequencies around one year and typical business cycle frequencies.
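For orientation, a common way of turning non-causality at a single frequency ω into testable restrictions in a bivariate VAR(p) is shown below. This is a standard formulation in this literature and is not claimed to be the exact statistic used in the paper; α, β, and p are generic VAR coefficients and lag order.

```latex
% Equation for y in a bivariate VAR(p); x does not cause y at frequency omega
% if two linear restrictions on the x lag coefficients hold, which can be
% tested with a standard Wald/F statistic:
\[
  y_t = \sum_{k=1}^{p} \alpha_k\, y_{t-k} + \sum_{k=1}^{p} \beta_k\, x_{t-k} + \varepsilon_t,
  \qquad
  H_0(\omega):\;
  \sum_{k=1}^{p} \beta_k \cos(k\omega) = 0,
  \quad
  \sum_{k=1}^{p} \beta_k \sin(k\omega) = 0 .
\]
```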

17.
Nonlinear time series models have become fashionable tools to describe and forecast a variety of economic time series. A closer look at reported empirical studies, however, reveals that these models apparently fit well in-sample, but rarely show a substantial improvement in out-of-sample forecasts, at least over linear models. One of the many possible reasons for this finding is the use of inappropriate model selection criteria and forecast evaluation criteria. In this paper we therefore propose a novel criterion, which we believe does more justice to the very nature of nonlinear models. Simulations show that this criterion outperforms those criteria currently in use, in the sense that the true nonlinear model is more often found to perform better in out-of-sample forecasting than a benchmark linear model. An empirical illustration for US GDP emphasizes its relevance.

18.
We propose new information criteria for impulse response function matching estimators (IRFMEs). These estimators yield sampling distributions of the structural parameters of dynamic stochastic general equilibrium (DSGE) models by minimizing the distance between sample and theoretical impulse responses. First, we propose an information criterion to select only the responses that produce consistent estimates of the true but unknown structural parameters: the Valid Impulse Response Selection Criterion (VIRSC). The criterion is especially useful for mis-specified models. Second, we propose a criterion to select the impulse responses that are most informative about DSGE model parameters: the Relevant Impulse Response Selection Criterion (RIRSC). These criteria can be used in combination to select the subset of valid impulse response functions with minimal dimension that yields asymptotically efficient estimators. The criteria are general enough to apply to impulse responses estimated by VARs, local projections, and simulation methods. We show that the use of our criteria significantly affects estimates and inference about key parameters of two well-known New Keynesian DSGE models. Monte Carlo evidence indicates that the criteria yield gains in terms of finite sample bias, as well as offering test statistics whose behavior is better approximated by first-order asymptotic theory. Thus, our criteria improve existing methods used to implement IRFMEs.
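Schematically, an impulse response function matching estimator minimizes a weighted distance between stacked sample impulse responses and their model-implied counterparts; the criteria proposed in the paper govern which responses enter the stacked vector. The generic objective below (with sample responses γ̂_T, model-implied responses g(θ), and weighting matrix W_T) is a standard minimum-distance form, not a formula copied from the paper.

```latex
\[
  \hat{\theta}_T \;=\; \arg\min_{\theta}\;
  \bigl[\hat{\gamma}_T - g(\theta)\bigr]'\, W_T\, \bigl[\hat{\gamma}_T - g(\theta)\bigr].
\]
```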

19.
We decompose the squared VIX index, derived from US S&P500 options prices, into the conditional variance of stock returns and the equity variance premium. We evaluate a plethora of state-of-the-art volatility forecasting models to produce an accurate measure of the conditional variance. We then examine the predictive power of the VIX and its two components for stock market returns, economic activity and financial instability. The variance premium predicts stock returns while the conditional stock market variance predicts economic activity and has a relatively higher predictive power for financial instability than does the variance premium.
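The decomposition used as the starting point can be written schematically as follows, where E_t[σ²] denotes the physical-measure conditional variance of returns over the (one-month) VIX horizon produced by the forecasting models; the notation is illustrative, not the paper's own.

```latex
% Squared VIX = conditional variance + equity variance premium:
\[
  \mathrm{VIX}_t^{2} \;=\; E_t\!\bigl[\sigma^{2}_{t,\,t+1\mathrm{m}}\bigr] \;+\; \mathrm{VP}_t ,
  \qquad
  \mathrm{VP}_t \;=\; \mathrm{VIX}_t^{2} - E_t\!\bigl[\sigma^{2}_{t,\,t+1\mathrm{m}}\bigr].
\]
```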

20.
Models for the 12-month-ahead US rate of inflation, measured by the chain-weighted consumer expenditure deflator, are estimated for 1974–98, and subsequent pseudo out-of-sample forecasting performance is examined. Alternative forecasting approaches for different information sets are compared with benchmark univariate autoregressive models, and substantial out-performance is demonstrated, including against Stock and Watson's unobserved components-stochastic volatility model. Three key ingredients of the out-performance are: including equilibrium correction component terms in relative prices; introducing nonlinearities to proxy state-dependence in the inflation process; and replacing the information criterion, commonly used in VARs to select lag length, with a ‘parsimonious longer lags’ parameterization. Forecast pooling or averaging also improves forecast performance.

