Similar Documents
20 similar documents found.
1.
This paper provides detailed responses to the following eight discussants of my paper ‘To criticize the critics: an objective Bayesian analysis of stochastic trends’: Gary Koop and Mark Steel; Edward Leamer; In-Moo Kim and G. S. Maddala; Dale J. Poirier; Peter C. Schotman and Herman K. van Dijk; James H. Stock; David DeJong and Charles H. Whiteman; and Christopher Sims. This reply puts new emphasis on the call made in the earlier paper for objective Bayesian analysis in time series; it underlines the need for a new approach, especially with regard to posterior odds testing; and it draws attention to a new methodology of Bayesian analysis developed in a recent paper by Phillips and Ploberger (1991a). Some new simulations that shed light on certain comments of the discussants are provided; new empirical evidence is reported with the extended Nelson-Plosser data supplied by Schotman and van Dijk; and the new Phillips-Ploberger posterior odds test is given a brief empirical illustration.

2.
This paper is a comment on P. C. B. Phillips, ‘To criticise the critics: an objective Bayesian analysis of stochastic trends’ [Phillips (1991)]. Starting from the likelihood of a univariate autoregressive model, different routes that lead to a posterior odds analysis of the unit root hypothesis are explored, where the differences between routes are due to different choices of prior. Improper priors such as the uniform and the Jeffreys prior are less suited to Bayesian inference on a sharp null hypothesis such as the unit root. A proper normal prior on the mean of the process is analysed and empirical results using extended Nelson-Plosser data are presented.
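To make the role of a proper prior concrete, here is a minimal sketch (not the paper's own procedure) that computes posterior odds for the unit root null against a stationary AR(1) alternative with a normal prior on the autoregressive coefficient, using the Savage-Dickey density ratio. The innovation variance is plugged in rather than integrated out, and the prior mean, prior standard deviation, and prior odds are illustrative placeholders.

```python
import numpy as np
from scipy import stats

def unit_root_posterior_odds(y, prior_mean=0.9, prior_sd=0.1, prior_odds=1.0):
    """Posterior odds of H0: rho = 1 versus a stationary AR(1) alternative
    with rho ~ N(prior_mean, prior_sd^2), via the Savage-Dickey density ratio.
    Simplification: conditional Gaussian likelihood with the innovation
    variance replaced by a plug-in estimate."""
    y = np.asarray(y, dtype=float)
    x, z = y[:-1], y[1:]                      # lagged and current observations
    sigma2 = np.var(z - x)                    # crude plug-in innovation variance
    like_prec = np.sum(x**2) / sigma2         # likelihood precision for rho
    rho_ols = np.sum(x * z) / np.sum(x**2)
    post_prec = like_prec + 1.0 / prior_sd**2
    post_mean = (like_prec * rho_ols + prior_mean / prior_sd**2) / post_prec
    post_sd = np.sqrt(1.0 / post_prec)
    # Bayes factor in favour of the unit root: posterior over prior density at rho = 1
    bf_01 = stats.norm.pdf(1.0, post_mean, post_sd) / stats.norm.pdf(1.0, prior_mean, prior_sd)
    return prior_odds * bf_01

# Illustration on simulated random-walk data
rng = np.random.default_rng(0)
print(unit_root_posterior_odds(np.cumsum(rng.normal(size=200))))
```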

3.
In DeJong and Whiteman (1991a), we concluded that 11 of the 14 macroeconomic time series originally studied by Nelson and Plosser (1982) supported trend-stationarity. Phillips (1991) criticizes this inference, claiming that our procedure is biased against integration and that our results are sensitive to model and prior specification. However, Phillips' alternative models and priors bias his results in favour of integration; despite these biases, Phillips' own findings indicate that the data provide the greatest relative support for trend-stationarity. This result is in line with our own (1989, 1990, 1991b) findings concerning the sensitivity of our results: the trend-stationarity inference is remarkably robust.

4.
Large Bayesian VARs with stochastic volatility are increasingly used in empirical macroeconomics. The key to making these highly parameterized VARs useful is the use of shrinkage priors. We develop a family of priors that captures the best features of two prominent classes of shrinkage priors: adaptive hierarchical priors and Minnesota priors. Like adaptive hierarchical priors, these new priors ensure that only ‘small’ coefficients are strongly shrunk to zero, while ‘large’ coefficients remain intact. At the same time, these new priors can also incorporate many useful features of the Minnesota priors such as cross-variable shrinkage and shrinking coefficients on higher lags more aggressively. We introduce a fast posterior sampler to estimate BVARs with this family of priors—for a BVAR with 25 variables and 4 lags, obtaining 10,000 posterior draws takes about 3 min on a standard desktop computer. In a forecasting exercise, we show that these new priors outperform both adaptive hierarchical priors and Minnesota priors.
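To make the cross-variable and lag-decay shrinkage concrete, here is a minimal sketch of how Minnesota-style prior variances for VAR coefficients are typically constructed; the hyperparameter names (lam1, lam2, lam3) and the exact scaling are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def minnesota_prior_variances(sigma, n_lags, lam1=0.2, lam2=0.5, lam3=2.0):
    """Prior variances V[i, j, l-1] for the coefficient on lag l of variable j
    in equation i. Own lags get (lam1 / l**lam3)**2; cross-variable lags are
    shrunk harder via lam2 and rescaled by the ratio of residual standard
    deviations sigma[i] / sigma[j] (taken from univariate AR fits)."""
    n = len(sigma)
    V = np.empty((n, n, n_lags))
    for lag in range(1, n_lags + 1):
        for i in range(n):
            for j in range(n):
                if i == j:
                    V[i, j, lag - 1] = (lam1 / lag**lam3) ** 2
                else:
                    V[i, j, lag - 1] = (lam1 * lam2 * sigma[i] / (lag**lam3 * sigma[j])) ** 2
    return V

# Example: prior variances for a 3-variable VAR(4)
print(minnesota_prior_variances(sigma=np.array([1.0, 0.5, 2.0]), n_lags=4).shape)
```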

5.
Many structural break and regime-switching models have been used with macroeconomic and financial data. In this paper, we develop an extremely flexible modeling approach which can accommodate virtually any of these specifications. We build on earlier work showing the relationship between flexible functional forms and random variation in parameters. Our contribution is based on priors on the time variation developed by considering a hypothetical reordering of the data and the distance between neighboring (reordered) observations. The range of priors produced in this way can accommodate a wide variety of nonlinear time series models, including those with regime-switching and structural breaks. By allowing the amount of random variation in parameters to depend on the distance between (reordered) observations, the parameters can evolve in a wide variety of ways, allowing for everything from models exhibiting abrupt change (e.g. threshold autoregressive models or standard structural break models) to those which allow for a gradual evolution of parameters (e.g. smooth transition autoregressive models or time-varying parameter models). Bayesian econometric methods for inference are developed for estimating the distance function and types of hypothetical reordering. Conditional on a hypothetical reordering and distance function, a simple reordering of the actual data allows us to estimate our models with standard state space methods by a simple adjustment to the measurement equation. We use artificial data to show the advantages of our approach, before providing two empirical illustrations involving the modeling of real GDP growth.

6.
Bayesian model selection with posterior probabilities and no subjective prior information is generally not possible because the Bayes factors are ill-defined. Using careful consideration of the parameter of interest in cointegration analysis and a re-specification of the triangular model of Phillips (Econometrica, Vol. 59, pp. 283–306, 1991), this paper presents an approach that allows for Bayesian comparison of models of cointegration with ‘ignorance’ priors. Using the concept of Stiefel and Grassmann manifolds, diffuse priors are specified on the dimension and direction of the cointegrating space. The approach is illustrated using a simple term structure of interest rates model.

7.
Due to weaknesses in traditional tests, a Bayesian approach is developed to investigate whether unit roots exist in macroeconomic time series. Bayesian posterior odds comparing unit root models to stationary and trend-stationary alternatives are calculated using informative priors. Two classes of reference priors which are informative but require minimal subjective prior input are used. In this sense the Bayesian unit root tests developed here are objective. Bayesian procedures are carried out on the Nelson–Plosser and Shiller data sets as well as on generated data. The conclusion is that the failure of classical procedures to reject the unit root hypothesis is not necessarily proof that a unit root is present with high probability.

8.
Many recent papers in macroeconomics have used large vector autoregressions (VARs) involving 100 or more dependent variables. With so many parameters to estimate, Bayesian prior shrinkage is vital to achieve reasonable results. Computational concerns currently limit the range of priors used and render difficult the addition of empirically important features such as stochastic volatility to the large VAR. In this paper, we develop variational Bayesian methods for large VARs that overcome the computational hurdle and allow for Bayesian inference in large VARs with a range of hierarchical shrinkage priors and with time-varying volatilities. We demonstrate the computational feasibility and good forecast performance of our methods in an empirical application involving a large quarterly US macroeconomic data set.

9.
Computationally efficient methods for the Bayesian analysis of seemingly unrelated regression (SUR) models are described and applied; these involve a direct Monte Carlo (DMC) approach to calculating Bayesian estimation and prediction results using diffuse or informative priors. This DMC approach is employed to compute Bayesian marginal posterior densities, moments, intervals and other quantities, using data simulated from known models and also using data from an empirical example involving firms’ sales. The results obtained by the DMC approach are compared to those yielded by a Markov chain Monte Carlo (MCMC) approach. It is concluded from these comparisons that the DMC approach is worthwhile and applicable to many SUR and other problems.

10.
In Bayesian analysis of vector autoregressive models, and especially in forecasting applications, the Minnesota prior of Litterman is frequently used. In many cases other prior distributions provide better forecasts and are preferable from a theoretical standpoint. Several of these priors require numerical methods in order to evaluate the posterior distribution. Different ways of implementing Monte Carlo integration are considered. It is found that Gibbs sampling performs as well as, or better than, importance sampling, and that the Gibbs sampling algorithms are less adversely affected by model size. We also report on the forecasting performance of the different prior distributions. © 1997 by John Wiley & Sons, Ltd.
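For readers unfamiliar with how Gibbs sampling is applied in this setting, the sketch below alternates between the two standard conditional posteriors of a VAR with an independent normal prior on the coefficients and an inverse-Wishart prior on the error covariance. It is a generic textbook scheme, assumed here for illustration rather than the exact algorithms compared in the paper; prior settings and starting values are placeholders.

```python
import numpy as np
from scipy.stats import invwishart

def gibbs_var(Y, X, b0, V0, S0, nu0, n_draws=1000, seed=0):
    """Gibbs sampler for Y = X B + E, rows of E ~ N(0, Sigma), with an
    independent normal prior vec(B) ~ N(b0, V0) and Sigma ~ IW(S0, nu0)."""
    rng = np.random.default_rng(seed)
    T, n = Y.shape
    k = X.shape[1]
    XtX = X.T @ X
    vecY = Y.reshape(-1, order="F")            # stack Y column by column
    V0_inv = np.linalg.inv(V0)
    Sigma = np.cov(Y.T)                        # crude starting value
    B_draws, Sigma_draws = [], []
    for _ in range(n_draws):
        # 1. Draw vec(B) | Sigma, data (conjugate normal update)
        Sig_inv = np.linalg.inv(Sigma)
        Vbar = np.linalg.inv(V0_inv + np.kron(Sig_inv, XtX))
        bbar = Vbar @ (V0_inv @ b0 + np.kron(Sig_inv, X.T) @ vecY)
        B = rng.multivariate_normal(bbar, Vbar).reshape(k, n, order="F")
        # 2. Draw Sigma | B, data (inverse-Wishart update)
        E = Y - X @ B
        Sigma = invwishart.rvs(df=nu0 + T, scale=S0 + E.T @ E, random_state=rng)
        B_draws.append(B)
        Sigma_draws.append(Sigma)
    return np.array(B_draws), np.array(Sigma_draws)
```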

11.
This paper compares alternative models of time-varying volatility on the basis of the accuracy of real-time point and density forecasts of key macroeconomic time series for the USA. We consider Bayesian autoregressive and vector autoregressive models that incorporate some form of time-varying volatility, namely random walk stochastic volatility, stochastic volatility following a stationary AR process, stochastic volatility coupled with fat tails, GARCH, and mixture of innovation models. The results show that the AR and VAR specifications with conventional stochastic volatility dominate the other volatility specifications, in terms of point forecasting to some degree and density forecasting to a greater degree. Copyright © 2014 John Wiley & Sons, Ltd.

12.
In this paper, we introduce a threshold stochastic volatility model with explanatory variables. The Bayesian method is considered in estimating the parameters of the proposed model via the Markov chain Monte Carlo (MCMC) algorithm. Gibbs sampling and Metropolis–Hastings sampling methods are used for drawing the posterior samples of the parameters and the latent variables. In the simulation study, the accuracy of the MCMC algorithm, the sensitivity of the algorithm to model assumptions, and the robustness of the posterior distribution under different priors are considered. Simulation results indicate that our MCMC algorithm converges fast and that the posterior distribution is robust under different priors and model assumptions. A real data example is analyzed to explain the asymmetric behavior of stock markets.
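One common way to write a threshold stochastic volatility model with explanatory variables is sketched below; the notation and the choice of threshold variable are assumptions for illustration and need not match the paper's exact specification.

$$
y_t = x_t'\beta + \exp(h_t/2)\,\varepsilon_t, \qquad
h_t = \alpha_{s_t} + \phi_{s_t}\, h_{t-1} + \sigma_\eta \eta_t, \qquad
s_t = \mathbf{1}\{y_{t-1} \ge r\},
$$

where $\varepsilon_t$ and $\eta_t$ are independent standard normal shocks, $x_t$ collects the explanatory variables, $h_t$ is the latent log-volatility, and the regime indicator $s_t$ switches the level and persistence of log-volatility according to the threshold $r$.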

13.
We develop a Bayesian median autoregressive (BayesMAR) model for time series forecasting. The proposed method utilizes time-varying quantile regression at the median, favorably inheriting the robustness of median regression in contrast to the widely used mean-based methods. Motivated by a working Laplace likelihood approach in Bayesian quantile regression, BayesMAR adopts a parametric model bearing the same structure as autoregressive models by altering the Gaussian error to Laplace, leading to a simple, robust, and interpretable modeling strategy for time series forecasting. We estimate model parameters by Markov chain Monte Carlo. Bayesian model averaging is used to account for model uncertainty, including the uncertainty in the autoregressive order, in addition to a Bayesian model selection approach. The proposed methods are illustrated using simulations and real data applications. An application to U.S. macroeconomic data forecasting shows that BayesMAR leads to favorable and often superior predictive performance compared to the selected mean-based alternatives under various loss functions that encompass both point and probabilistic forecasts. The proposed methods are generic and can be used to complement a rich class of methods that build on autoregressive models.
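To give a sense of how the working Laplace likelihood replaces the Gaussian one while keeping the autoregressive structure, here is a minimal sketch with a random-walk Metropolis sampler under flat priors; the parameterisation and the sampler are illustrative assumptions, not the paper's implementation (which also handles averaging over the AR order).

```python
import numpy as np

def laplace_ar_loglik(params, y, p):
    """Working Laplace log-likelihood for an AR(p) model at the median:
    params = (intercept, phi_1..phi_p, log_scale), conditional on the first
    p observations."""
    c, phi, log_b = params[0], params[1:1 + p], params[-1]
    b = np.exp(log_b)
    T = len(y)
    X = np.column_stack([y[p - j - 1:T - j - 1] for j in range(p)])  # lag matrix
    resid = y[p:] - c - X @ phi
    return -len(resid) * np.log(2 * b) - np.abs(resid).sum() / b

def rw_metropolis(y, p, n_draws=5000, step=0.02, seed=0):
    """Random-walk Metropolis over the AR parameters under flat priors --
    a minimal stand-in for the paper's MCMC scheme."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(p + 2)
    cur = laplace_ar_loglik(theta, y, p)
    draws = []
    for _ in range(n_draws):
        prop = theta + step * rng.normal(size=theta.size)
        new = laplace_ar_loglik(prop, y, p)
        if np.log(rng.uniform()) < new - cur:
            theta, cur = prop, new
        draws.append(theta.copy())
    return np.array(draws)

# Illustration: AR(2) fit to simulated data with Laplace errors
rng = np.random.default_rng(1)
e = rng.laplace(size=300)
y = np.zeros(300)
for t in range(2, 300):
    y[t] = 0.5 * y[t - 1] + 0.2 * y[t - 2] + e[t]
draws = rw_metropolis(y, p=2)
print(draws[2500:].mean(axis=0))               # posterior means after burn-in
```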

14.
This paper extends the existing fully parametric Bayesian literature on stochastic volatility to allow for more general return distributions. Instead of specifying a particular distribution for the return innovation, nonparametric Bayesian methods are used to flexibly model the skewness and kurtosis of the distribution while the dynamics of volatility continue to be modeled with a parametric structure. Our semiparametric Bayesian approach provides a full characterization of parametric and distributional uncertainty. A Markov chain Monte Carlo sampling approach to estimation is presented with theoretical and computational issues for simulation from the posterior predictive distributions. An empirical example compares the new model to standard parametric stochastic volatility models.

15.
Bayesian priors are often used to restrain the otherwise highly over-parametrized vector autoregressive (VAR) models. The currently available Bayesian VAR methodology does not allow the user to specify prior beliefs about the unconditional mean, or steady state, of the system. This is unfortunate as the steady state is something that economists usually claim to know relatively well. This paper develops easily implemented methods for analyzing both stationary and cointegrated VARs, in reduced or structural form, with an informative prior on the steady state. We document that prior information on the steady state leads to substantial gains in forecasting accuracy on Swedish macro data. A second example illustrates the use of informative steady-state priors in a cointegration model of the consumption-wealth relationship in the USA. Copyright © 2009 John Wiley & Sons, Ltd.
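A brief sketch of the idea (in my notation, which may differ from the paper's): the VAR is written in mean-adjusted form so that the prior can be placed directly on the steady state,

$$
\Pi(L)\,(y_t - \Psi d_t) = \varepsilon_t, \qquad \Pi(L) = I - \Pi_1 L - \cdots - \Pi_p L^p,
$$

so that, when $\Pi(1)$ is invertible, the unconditional mean of $y_t$ given the deterministic terms $d_t$ is $\Psi d_t$, and an informative normal prior on $\Psi$ directly encodes beliefs about the steady state.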

16.
Recently, there has been considerable work on stochastic time-varying coefficient models as vehicles for modelling structural change in the macroeconomy, with a focus on the estimation of the unobserved paths of random coefficient processes. The dominant estimation methods, in this context, are based on various filters, such as the Kalman filter, that are applicable when the models are cast in state space representations. This paper introduces a new class of autoregressive bounded processes that decompose a time series into a persistent random attractor, a time-varying autoregressive component, and martingale difference errors. The paper examines, rigorously, alternative kernel-based nonparametric estimation approaches for such models and derives their basic properties. These estimators have long been studied in the context of deterministic structural change, but their use in the presence of stochastic time variation is novel. The proposed inference methods have desirable properties such as consistency and asymptotic normality and allow a tractable studentization. In extensive Monte Carlo and empirical studies, we find that the methods exhibit very good small sample properties and can shed light on important empirical issues such as the evolution of inflation persistence and the purchasing power parity (PPP) hypothesis.

17.
The paper compares, by a Monte Carlo study based on an AR(1) model, the performance of the flat prior and the ignorance prior suggested by Phillips. It argues that the ignorance prior gives heavy weight to values of the autoregressive parameter ρ greater than 1, and hence distorts the sample evidence as summarized in the likelihood function. It yields bimodal posterior distributions, with the second mode at ρ greater than 1, even when the true value of ρ is substantially less than 1.
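To illustrate the mechanism behind this finding, here is a minimal sketch (my own construction; the exact prior form is an assumption rather than taken from the paper) that computes the AR(1) posterior for ρ on a grid under a flat prior and under a Jeffreys-type ignorance prior. Because the expected information grows rapidly for ρ > 1, the ignorance prior pushes posterior mass toward the explosive region.

```python
import numpy as np

def ar1_posteriors(y, rho_grid=None):
    """Posterior of rho in y_t = rho*y_{t-1} + e_t under (i) a flat prior and
    (ii) a Jeffreys-type 'ignorance' prior proportional to the square root of
    the expected information about rho (taking y_0 = 0). Conditional Gaussian
    likelihood with sigma^2 integrated out under p(sigma^2) proportional to
    1/sigma^2."""
    if rho_grid is None:
        rho_grid = np.linspace(0.5, 1.2, 701)
    x, z = y[:-1], y[1:]
    T = len(z)
    sse = np.array([np.sum((z - r * x) ** 2) for r in rho_grid])
    loglik = -0.5 * T * np.log(sse)
    flat = np.exp(loglik - loglik.max())
    flat /= np.trapz(flat, rho_grid)
    r2 = rho_grid ** 2
    with np.errstate(divide="ignore", invalid="ignore"):
        info = np.where(np.isclose(r2, 1.0), T * (T - 1) / 2.0,
                        (T - (1.0 - r2 ** T) / (1.0 - r2)) / (1.0 - r2))
    jeffreys = flat * np.sqrt(np.maximum(info, 0.0))
    jeffreys /= np.trapz(jeffreys, rho_grid)
    return rho_grid, flat, jeffreys

# Simulated AR(1) with rho = 0.85: the ignorance-prior posterior puts more
# mass on rho > 1 than the flat-prior posterior does.
rng = np.random.default_rng(2)
y = np.zeros(150)
for t in range(1, 150):
    y[t] = 0.85 * y[t - 1] + rng.normal()
grid, flat, jeff = ar1_posteriors(y)
print(np.trapz(flat[grid > 1], grid[grid > 1]), np.trapz(jeff[grid > 1], grid[grid > 1]))
```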

18.
Bayesian stochastic search for VAR model restrictions
We propose a Bayesian stochastic search approach to selecting restrictions for vector autoregressive (VAR) models. For this purpose, we develop a Markov chain Monte Carlo (MCMC) algorithm that visits high posterior probability restrictions on the elements of both the VAR regression coefficients and the error variance matrix. Numerical simulations show that stochastic search based on this algorithm can be effective at both selecting a satisfactory model and improving forecasting performance. To illustrate the potential of our approach, we apply our stochastic search to VAR modeling of inflation transmission from producer price index (PPI) components to the consumer price index (CPI).
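As background, the core of a stochastic search step is the conditional draw of inclusion indicators under a spike-and-slab prior. The sketch below shows this generic step (the prior scales tau0 and tau1 and the inclusion probability p_incl are placeholders); the paper's algorithm applies the same idea to both the VAR coefficients and the elements of the error variance matrix.

```python
import numpy as np

def draw_inclusion_indicators(beta, tau0, tau1, p_incl, rng):
    """One Gibbs step of a spike-and-slab (stochastic search) prior: given the
    current coefficients beta, draw each inclusion indicator gamma_j from its
    conditional Bernoulli posterior. tau0 is the tight 'spike' scale (coefficient
    effectively restricted to zero), tau1 the loose 'slab' scale."""
    dens_spike = np.exp(-0.5 * (beta / tau0) ** 2) / tau0
    dens_slab = np.exp(-0.5 * (beta / tau1) ** 2) / tau1
    prob_slab = p_incl * dens_slab / (p_incl * dens_slab + (1 - p_incl) * dens_spike)
    return (rng.uniform(size=beta.size) < prob_slab).astype(int)

# Example: large coefficients are typically retained, small ones dropped
rng = np.random.default_rng(3)
beta = np.array([0.001, -0.4, 0.02, 1.3])
print(draw_inclusion_indicators(beta, tau0=0.01, tau1=1.0, p_incl=0.5, rng=rng))
```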

19.
Hedonic price models are widely employed to estimate implicit prices for bundled attributes. Residential property value studies dominate these applications. Using a representative cross-sectional property value data set, we employ Bayesian methods to translate a range of priors in covariate selection typical of hedonic property value studies into a range of posterior estimates. We also formulate priors regarding measurement error in individual covariates and compute the ranges of resulting posterior means. Finally, we empirically demonstrate that a greater and more systematic use of prior information, drawn both from one's own data and from other studies, can break the collinearity deadlock in these data.

20.
In this paper, we propose a Bayesian estimation and forecasting procedure for noncausal autoregressive (AR) models. Specifically, we derive the joint posterior density of the past and future errors and the parameters, yielding predictive densities as a by-product. We show that the posterior model probabilities provide a convenient model selection criterion in discriminating between alternative causal and noncausal specifications. As an empirical application, we consider US inflation. The posterior probability of noncausality is found to be high—over 98%. Furthermore, the purely noncausal specifications yield more accurate inflation forecasts than alternative causal and noncausal AR models. Copyright © 2010 John Wiley & Sons, Ltd.
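For readers unfamiliar with noncausal autoregressions, one standard way to write a mixed causal-noncausal AR(r, s) model (my notation, following the general literature rather than this paper's exact setup) is

$$
\phi(L)\,\varphi(L^{-1})\, y_t = \varepsilon_t, \qquad
\phi(L) = 1 - \phi_1 L - \cdots - \phi_r L^r, \qquad
\varphi(L^{-1}) = 1 - \varphi_1 L^{-1} - \cdots - \varphi_s L^{-s},
$$

where $L$ is the lag operator and $L^{-1}$ the lead operator, so the purely noncausal case ($r = 0$) makes $y_t$ depend on future errors; identification of the noncausal component requires a non-Gaussian error distribution.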
