Similar articles
20 similar articles found.
1.
Decomposing Granger causality over the spectrum allows us to disentangle potentially different Granger causality relationships over different frequencies. This may yield new and complementary insights compared to traditional versions of Granger causality. In this paper, we compare two existing approaches in the frequency domain, proposed originally by Pierce [Pierce, D. A. (1979). R-squared measures for time series. Journal of the American Statistical Association, 74, 901–910] and Geweke [Geweke, J. (1982). Measurement of linear dependence and feedback between multiple time series. Journal of the American Statistical Association, 77, 304–324], and introduce a new testing procedure for the Pierce spectral Granger causality measure. To provide insights into the relative performance of this test, we study its power properties by means of Monte Carlo simulations. In addition, we apply the methodology in the context of the predictive value of the European production expectation surveys. This predictive content is found to vary widely with the frequency considered, illustrating the usefulness of not restricting oneself to a single overall test statistic.

2.
This paper presents analytical, Monte Carlo, and empirical evidence on the effects of structural breaks on tests for equal forecast accuracy and encompassing. We show that out-of-sample predictive content can be hard to find because out-of-sample tests are highly dependent on the timing of the predictive ability. Moreover, predictive content is harder to find with some tests than others: in power, F-type tests of equal forecast accuracy and encompassing often dominate t-type alternatives. Based on these results and evidence from an empirical application, we conclude that structural breaks under the alternative may explain why researchers often find evidence of in-sample, but not out-of-sample, predictive content.

3.
This article presents a formal explanation of the forecast combination puzzle, that simple combinations of point forecasts are repeatedly found to outperform sophisticated weighted combinations in empirical applications. The explanation lies in the effect of finite‐sample error in estimating the combining weights. A small Monte Carlo study and a reappraisal of an empirical study by Stock and Watson [Federal Reserve Bank of Richmond Economic Quarterly (2003) Vol. 89/3, pp. 71–90] support this explanation. The Monte Carlo evidence, together with a large‐sample approximation to the variance of the combining weight, also supports the popular recommendation to ignore forecast error covariances in estimating the weight.
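The finite-sample mechanism behind the puzzle can be illustrated with a short Monte Carlo sketch (the data-generating process, window sizes, and seed below are illustrative assumptions, not the article's design): when two forecasts are equally good, the population-optimal combining weight is 0.5, so estimating it from a short sample can only add noise out of sample.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative DGP: a target series plus two independent, unbiased forecast errors.
T_est, T_oos = 20, 10_000            # short weight-estimation window, long evaluation
y = rng.normal(size=T_est + T_oos)
f1 = y + rng.normal(size=T_est + T_oos)
f2 = y + rng.normal(size=T_est + T_oos)

# Estimate the combining weight on the short window by least squares,
# constraining the weights to sum to one: y - f2 = w * (f1 - f2) + error.
d = f1[:T_est] - f2[:T_est]
w_hat = np.dot(y[:T_est] - f2[:T_est], d) / np.dot(d, d)

mse = lambda f: float(np.mean((y[T_est:] - f) ** 2))
mse_equal = mse(0.5 * (f1[T_est:] + f2[T_est:]))     # simple average
mse_estim = mse(w_hat * f1[T_est:] + (1 - w_hat) * f2[T_est:])
print(f"equal-weight MSE {mse_equal:.3f} vs estimated-weight MSE {mse_estim:.3f}")
```

Because the true optimal weight here is 0.5, any sampling error in `w_hat` pushes the out-of-sample MSE of the estimated combination above that of the simple average, which is the finite-sample effect the article formalizes.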

4.
Many forecasts are conditional in nature. For example, a number of central banks routinely report forecasts conditional on particular paths of policy instruments. Even though conditional forecasting is common, there has been little work on methods for evaluating conditional forecasts. This paper provides analytical, Monte Carlo and empirical evidence on tests of predictive ability for conditional forecasts from estimated models. In the empirical analysis, we examine conditional forecasts obtained with a VAR in the variables included in the DSGE model of Smets and Wouters (American Economic Review 2007; 97: 586–606). Throughout the analysis, we focus on tests of bias, efficiency and equal accuracy applied to conditional forecasts from VAR models.

5.
In cointegration testing for the nonlinear smooth transition error correction model (ST-ECM), the presence of unidentified parameters makes the test statistic difficult to construct, and the Taylor-expansion approximation widely used in the literature cannot exactly replace the original nonlinear model, which lowers the power of the cointegration test. This paper first constructs a supF statistic for cointegration testing in the ST-ECM by searching over the parameter space of the unidentified parameters, derives the limiting distribution of the supF statistic, and establishes its convergence properties. Monte Carlo simulations then show that the supF statistic has good size and power in ST-ECM cointegration testing, and that its power clearly dominates the EG, F*NEC and inft statistics. Finally, the expectations hypothesis of the term structure of interest rates is tested for six Asian countries; the results indicate that the hypothesis holds, with nonlinear adjustment effects, for China, Singapore and Thailand, and that the supF statistic has higher power than the competing statistics.

6.
We contrast the forecasting performance of alternative panel estimators, divided into three main groups: homogeneous, heterogeneous and shrinkage/Bayesian. Via a series of Monte Carlo simulations, the comparison is performed using different levels of heterogeneity and cross-sectional dependence, alternative panel structures in terms of T and N, and the specification of the dynamics of the error term. To assess the predictive performance, we use traditional measures of forecast accuracy (Theil’s U statistic, RMSE and MAE), the Diebold–Mariano test, and Pesaran and Timmermann’s statistic on the capability of forecasting turning points. The main finding of our analysis is that when the level of heterogeneity is high, shrinkage/Bayesian estimators are preferred, whilst when there is low or mild heterogeneity, homogeneous estimators have the best forecast accuracy.
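The Diebold–Mariano test used above compares two forecasts through their loss differential; a minimal sketch under squared-error loss (the choice of h-1 Newey–West lags is a common convention assumed here, not necessarily the authors' implementation):

```python
import math
import numpy as np

def diebold_mariano(e1, e2, h=1):
    """DM test of equal forecast accuracy under squared-error loss.
    e1, e2: forecast-error series; h: forecast horizon (h-1 autocovariance lags)."""
    d = np.asarray(e1) ** 2 - np.asarray(e2) ** 2    # loss differential
    T = d.size
    dbar = d.mean()
    # long-run variance of d: variance plus autocovariances up to lag h-1
    lrv = np.sum((d - dbar) ** 2) / T
    for k in range(1, h):
        lrv += 2 * np.sum((d[k:] - dbar) * (d[:-k] - dbar)) / T
    stat = dbar / math.sqrt(lrv / T)
    pval = math.erfc(abs(stat) / math.sqrt(2))       # two-sided N(0,1) p-value
    return stat, pval
```

Under the null of equal expected loss the statistic is asymptotically standard normal; a forecast with genuinely smaller errors in the first argument drives the statistic negative.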

7.
An article by Chan et al. (2013) published in the Journal of Business and Economic Statistics introduces a new model for trend inflation. They allow the trend inflation to evolve according to a bounded random walk. In order to draw the latent states from their respective conditional posteriors, they use accept–reject Metropolis–Hastings procedures. We reproduce their results using particle Markov chain Monte Carlo (PMCMC), which approaches drawing the latent states from a different technical point of view by relying on combining Markov chain Monte Carlo and sequential Monte Carlo methods. To conclude: we are able to reproduce the results of Chan et al. (2013).

8.
Modeling tourism: A fully identified VECM approach
System-based cointegration methods have become popular tools for economic analysis and forecasting. However, the identification of structural relationships is often problematic. Using a theory-directed sequential reduction method suggested by Hall, Henry and Greenslade [Hall, S. G., Henry, S., & Greenslade, J. (2002). On the identification of cointegrated systems in small samples: A modelling strategy with an application to UK wages and prices. Journal of Economic Dynamics and Control, 26, 1517–1537], we estimate a vector error correction model of Hawaii tourism, where both demand and supply-side influences are important. We identify reasonable long-run equilibrium relationships, and Diebold–Mariano tests for forecast accuracy demonstrate satisfactory forecasting performance.

9.
We study the problem of testing hypotheses on the parameters of one- and two-factor stochastic volatility models (SV), allowing for the possible presence of non-regularities such as singular moment conditions and unidentified parameters, which can lead to non-standard asymptotic distributions. We focus on the development of simulation-based exact procedures, whose level can be controlled in finite samples, as well as on large-sample procedures which remain valid under non-regular conditions. We consider Wald-type, score-type and likelihood-ratio-type tests based on a simple moment estimator, which can be easily simulated. We also propose a C(α)-type test which is very easy to implement and exhibits relatively good size and power properties. Besides usual linear restrictions on the SV model coefficients, the problems studied include testing homoskedasticity against a SV alternative (which involves singular moment conditions under the null hypothesis) and testing the null hypothesis of one factor driving the dynamics of the volatility process against two factors (which raises identification difficulties). Three ways of implementing the tests based on alternative statistics are compared: asymptotic critical values (when available), a local Monte Carlo (or parametric bootstrap) test procedure, and a maximized Monte Carlo (MMC) procedure. The size and power properties of the proposed tests are examined in a simulation experiment. The results indicate that the C(α)-based tests (built upon the simple moment estimator available in closed form) have good size and power properties for regular hypotheses, while Monte Carlo tests are much more reliable than those based on asymptotic critical values. Further, in cases where the parametric bootstrap appears to fail (for example, in the presence of identification problems), the MMC procedure easily controls the level of the tests. Moreover, MMC-based tests exhibit relatively good power performance despite the conservative feature of the procedure. Finally, we present an application to a time series of returns on the Standard and Poor’s Composite Price Index.

10.
Observations recorded on ‘locations’ usually exhibit spatial dependence. In an effort to take into account both the spatial dependence and the possible underlying non-linear relationship, a partially linear single-index spatial regression model is proposed. This paper establishes the estimators of the unknowns. Moreover, it builds a generalized F-test to determine whether or not the data provide evidence on using linear settings in empirical studies. Their asymptotic properties are derived. Monte Carlo simulations indicate that the estimators and test statistic perform well. The analysis of Chinese house price data shows the existence of both spatial dependence and a non-linear relationship.

11.
Graph‐theoretic methods of causal search based on the ideas of Pearl (2000), Spirtes et al. (2000), and others have been applied by a number of researchers to economic data, particularly by Swanson and Granger (1997) to the problem of finding a data‐based contemporaneous causal order for the structural vector autoregression, rather than, as is typically done, assuming a weakly justified Choleski order. Demiralp and Hoover (2003) provided Monte Carlo evidence that such methods were effective, provided that signal strengths were sufficiently high. Unfortunately, in applications to actual data, such Monte Carlo simulations are of limited value, as the causal structure of the true data‐generating process is necessarily unknown. In this paper, we present a bootstrap procedure that can be applied to actual data (i.e. without knowledge of the true causal structure). We show with an applied example and a simulation study that the procedure is an effective tool for assessing our confidence in causal orders identified by graph‐theoretic search algorithms.

12.
We propose in this article a two‐step testing procedure of fractional cointegration in macroeconomic time series. It is based on Robinson's (Journal of the American Statistical Association, Vol. 89, p. 1420) univariate tests and is similar in spirit to the one proposed by Engle & Granger (Econometrica, Vol. 55, p. 251), testing initially the order of integration of the individual series and then, testing the degree of integration of the residuals from the cointegrating relationship. Finite‐sample critical values of the new tests are computed and Monte Carlo experiments are conducted to examine the size and the power properties of the tests in finite samples. An empirical application, using the same datasets as in Engle & Granger (Econometrica, Vol. 55, p. 251) and Campbell & Shiller (Journal of Political Economy, Vol. 95, p. 1062), is also carried out at the end of the article.
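Each univariate step of such a procedure needs an estimate of the order of integration. One standard estimator of the memory parameter d is the Geweke–Porter-Hudak log-periodogram regression, sketched below (the bandwidth m = √n and the demeaning are illustrative defaults, not choices made in the article, whose tests are Robinson-type):

```python
import numpy as np

def gph_d(x, m=None):
    """Log-periodogram (GPH) estimate of the memory parameter d of a series x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    m = m or int(n ** 0.5)                     # illustrative bandwidth choice
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    # periodogram at the first m Fourier frequencies
    fft = np.fft.fft(x - x.mean())
    I = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)
    # regress log I on -2*log(2*sin(freq/2)); the slope is d
    X = -2 * np.log(2 * np.sin(freqs / 2))
    Xc = X - X.mean()
    return float(np.sum(Xc * np.log(I)) / np.sum(Xc ** 2))
```

Applied first to each series and then to the cointegrating residuals, a drop in the estimated d from the series to the residuals is the informal signature of fractional cointegration that the formal two-step tests make precise.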

13.
This paper extends the cross‐sectionally augmented panel unit‐root test (CIPS) developed by Pesaran et al. (2013, Journal of Econometrics, Vol. 175, pp. 94–115) to allow for smoothing structural changes in deterministic terms modelled by a Fourier function. The proposed statistic is called the break augmented CIPS (BCIPS) statistic. We show that the non‐standard limiting distribution of the (truncated) BCIPS statistic exists and tabulate its critical values. Monte‐Carlo experiments point out that the sizes and powers of the BCIPS statistic are generally satisfactory as long as the number of time periods, T, is not less than fifty. The BCIPS test is then applied to examine the validity of long‐run purchasing power parity.
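The Fourier device captures smooth structural change with low-frequency trigonometric terms instead of discrete break dummies; a sketch of the deterministic design matrix (a single frequency k, which is an illustrative simplification of the paper's specification):

```python
import numpy as np

def fourier_deterministics(T, k=1):
    """Design matrix: constant plus one-frequency Fourier terms, modelling a
    smooth structural change in the deterministic part of each series."""
    t = np.arange(1, T + 1)
    return np.column_stack([
        np.ones(T),
        np.sin(2 * np.pi * k * t / T),
        np.cos(2 * np.pi * k * t / T),
    ])
```

Detrending each panel member on such regressors before computing the cross-sectionally augmented statistic conveys the spirit of the BCIPS construction, though the paper's exact statistic and critical values differ.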

14.
Hotelling's T² statistic is an important tool for inference about the center of a multivariate normal population. However, hypothesis tests and confidence intervals based on this statistic can be adversely affected by outliers. Therefore, we construct an alternative inference technique based on a statistic which uses the highly robust MCD estimator [9] instead of the classical mean and covariance matrix. Recently, a fast algorithm was constructed to compute the MCD [10]. In our test statistic we use the reweighted MCD, which has a higher efficiency. The distribution of this new statistic differs from the classical one. Therefore, the key problem is to find a good approximation for this distribution. Similarly to the classical T² distribution, we obtain a multiple of a certain F-distribution. A Monte Carlo study shows that this distribution is an accurate approximation of the true distribution. Finally, the power and the robustness of the one-sample test based on our robust T² are investigated through simulation.
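For context, a minimal sketch of the classical one-sample T² statistic that the robust version modifies (the article replaces the sample mean and covariance used below with reweighted MCD estimates, which also changes the null distribution):

```python
import numpy as np

def hotelling_t2(X, mu0):
    """Classical one-sample Hotelling T^2 and its exact F transform.
    X: (n, p) data matrix; mu0: hypothesized mean vector."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    diff = X.mean(axis=0) - np.asarray(mu0, dtype=float)
    S = np.cov(X, rowvar=False)              # unbiased sample covariance
    t2 = n * diff @ np.linalg.solve(S, diff)
    f_stat = (n - p) / (p * (n - 1)) * t2    # ~ F(p, n - p) under H0 for normal data
    return float(t2), float(f_stat)
```

Substituting a robust location and scatter estimate breaks the exact F(p, n-p) result, which is why the article must approximate the new null distribution by a simulation-calibrated multiple of an F distribution.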

15.
This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah [1989. The dynamic effects of aggregate demand and supply disturbances. The American Economic Review 79, 655–673], and shows that structural equations with known permanent shocks cannot contain error correction terms, thereby freeing up the latter to be used as instruments in estimating their parameters. The approach is illustrated by a re-examination of the identification schemes used by Wickens and Motto [2001. Estimating shocks and impulse response functions. Journal of Applied Econometrics 16, 371–387], Shapiro and Watson [1988. Sources of business cycle fluctuations. NBER Macroeconomics Annual 3, 111–148], King et al. [1991. Stochastic trends and economic fluctuations. American Economic Review 81, 819–840], Gali [1992. How well does the ISLM model fit postwar US data? Quarterly Journal of Economics 107, 709–735; 1999. Technology, employment, and the business cycle: Do technology shocks explain aggregate fluctuations? American Economic Review 89, 249–271] and Fisher [2006. The dynamic effects of neutral and investment-specific technology shocks. Journal of Political Economy 114, 413–451].

16.
We examine the performance of a metric entropy statistic as a robust test for time-reversibility (TR), symmetry, and serial dependence. It also serves as a measure of goodness-of-fit. The statistic provides a consistent and unified basis in model search, and is a powerful diagnostic measure with surprising ability to pinpoint areas of model failure. We provide empirical evidence comparing the performance of the proposed procedure with some of the modern competitors in nonlinear time-series analysis, such as robust implementations of the BDS and characteristic function-based tests of TR, along with correlation-based competitors such as the Ljung–Box Q-statistic. Unlike our procedure, each of its competitors is motivated for a different, specific, context and hypothesis. Our evidence is based on Monte Carlo simulations along with an application to several stock indices for the US equity market.

17.
Gabriela Ciuperca, Metrika (2018) 81(6): 689–720
This article proposes a test statistic based on the adaptive LASSO quantile method to detect, in real time, a change in a linear model. The model can have a large number of explanatory variables, and the errors need not satisfy the classical assumptions of a statistical model. For the proposed test statistic, the asymptotic distribution under H0 is obtained and the divergence under H1 is shown. Monte Carlo simulations show, in terms of empirical size, empirical power and stopping-time detection, that the proposed test statistic outperforms other test statistics in the literature. Two applications, to air pollution data and to health data, are also considered.

18.
The main objective of this paper is to propose a feasible, model free estimator of the predictive density of integrated volatility. In this sense, we extend recent papers by Andersen et al. [Andersen, T.G., Bollerslev, T., Diebold, F.X., Labys, P., 2003. Modelling and forecasting realized volatility. Econometrica 71, 579–626], and by Andersen et al. [Andersen, T.G., Bollerslev, T., Meddahi, N., 2004. Analytic evaluation of volatility forecasts. International Economic Review 45, 1079–1110; Andersen, T.G., Bollerslev, T., Meddahi, N., 2005. Correcting the errors: Volatility forecast evaluation using high frequency data and realized volatilities. Econometrica 73, 279–296], who address the issue of pointwise prediction of volatility via ARMA models, based on the use of realized volatility. Our approach is to use a realized volatility measure to construct a non-parametric (kernel) estimator of the predictive density of daily volatility. We show that, by choosing an appropriate realized measure, one can achieve consistent estimation, even in the presence of jumps and microstructure noise in prices. More precisely, we establish that four well known realized measures, i.e. realized volatility, bipower variation, and two measures robust to microstructure noise, satisfy the conditions required for the uniform consistency of our estimator. Furthermore, we outline an alternative simulation based approach to predictive density construction. Finally, we carry out a simulation experiment in order to assess the accuracy of our estimators, and provide an empirical illustration that underscores the importance of using microstructure robust measures when using high frequency data.
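The core construction can be sketched as a conditional kernel density estimate: the density of tomorrow's realized volatility given today's, built from the realized-measure series itself (Gaussian kernels, fixed bandwidths, and the one-lag conditioning below are illustrative simplifications, not the authors' estimator):

```python
import numpy as np

def predictive_density(rv, x_today, grid, hx=0.1, hy=0.1):
    """Kernel estimate of the density of RV_{t+1} given RV_t = x_today,
    evaluated on a grid, from a realized-volatility series rv."""
    phi = lambda u: np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    w = phi((rv[:-1] - x_today) / hx)          # weight days that resemble today
    w = w / w.sum()
    # weighted kernel smooth of the corresponding next-day observations
    return np.array([np.sum(w * phi((g - rv[1:]) / hy)) / hy for g in grid])
```

The consistency argument in the paper amounts to showing that the realized measure plugged in for the latent volatility converges fast enough, which is exactly where the choice among realized volatility, bipower variation and the noise-robust measures matters.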

19.
This note presents some properties of the stochastic unit‐root processes developed in Granger and Swanson [Journal of Econometrics (1997) Vol. 80, pp. 35–62] and Leybourne, McCabe and Tremayne [Journal of Business & Economic Statistics (1996) Vol. 14, pp. 435–446] that have not been or only implicitly discussed in the literature.

20.
Using the measure of risk aversion suggested by Kihlstrom and Mirman [Kihlstrom, R., Mirman, L., 1974. Risk aversion with many commodities. Journal of Economic Theory 8, 361–388; Kihlstrom, R., Mirman, L., 1981. Constant, increasing and decreasing risk aversion with many commodities. Review of Economic Studies 48, 271–280], we propose a dynamic consumption-savings-portfolio choice model in which the consumer-investor maximizes the expected value of a non-additively separable utility function of current and future consumption. Preferences for consumption streams are CES and the elasticity of substitution can be chosen independently of the risk aversion measure. The additively separable case is a special case. Because choices are not dynamically consistent, we follow the “consistent planning” approach of Strotz [Strotz, R., 1956. Myopia and inconsistency in dynamic utility maximization. Review of Economic Studies 23, 165–180] and also interpret our analysis from the game theoretic perspective taken by Peleg and Yaari [Peleg, B., Yaari, M., 1973. On the existence of a consistent course of action when tastes are changing. Review of Economic Studies 40, 391–401]. The equilibrium of the Lucas asset pricing model with i.i.d. consumption growth is obtained and the equity premium is shown to depend on the elasticity of substitution as well as the risk aversion measure. The nature of the dependence is examined. Our results are contrasted with those of the non-expected utility recursive approach of Epstein–Zin and Weil.
