Similar documents (20 results)
1.
Markov chain Monte Carlo (MCMC) methods have become a ubiquitous tool in Bayesian analysis. This paper implements MCMC methods for Bayesian analysis of stochastic frontier models using WinBUGS, a freely available software package. General code for cross-sectional and panel data is presented, and various ways of summarizing posterior inference are discussed. Several examples illustrate that analyses with models of genuine practical interest can be performed straightforwardly and that model changes are easily implemented. Although WinBUGS may not be efficient for more complicated models, it makes Bayesian inference with stochastic frontier models easily accessible to applied researchers, and its generic structure allows considerable flexibility in model specification.

2.
This paper analyzes the drivers of financial distress experienced by small Italian cooperative banks during the latest deep recession, focusing mainly on the importance of bank capital as a predictor of bankruptcy for Italian nonprofit banks. The analysis aims to build an early-warning model suitable for this type of bank. The results reveal non-monotonic effects of bank capital on the probability of failure. In contrast to distress models for for-profit banks, non-performing loans, profitability, liquidity, and management quality have negligible predictive value. The findings also show that unreserved impaired loans have an important impact on the probability of bank distress. Moreover, the loan-loss provision ratio on substandard loans acts as an effective safeguard against bank distress. Overall, the results are robust to both the methodology (frequentist and Bayesian approaches) and the sample used (cooperative banks in Italy and in euro-area countries).

3.
In contrast to a posterior analysis given a particular sampling model, posterior model probabilities in the context of model uncertainty are typically rather sensitive to the specification of the prior. In particular, ‘diffuse’ priors on model-specific parameters can lead to quite unexpected consequences. Here we focus on the practically relevant situation where we need to entertain a (large) number of sampling models and we have (or wish to use) little or no subjective prior information. We aim at providing an ‘automatic’ or ‘benchmark’ prior structure that can be used in such cases. We focus on the normal linear regression model with uncertainty in the choice of regressors. We propose a partly non-informative prior structure related to a natural conjugate g-prior specification, where the amount of subjective information requested from the user is limited to the choice of a single scalar hyperparameter g0j. The consequences of different choices for g0j are examined. We investigate theoretical properties, such as consistency of the implied Bayesian procedure. Links with classical information criteria are provided. More importantly, we examine the finite sample implications of several choices of g0j in a simulation study. The use of the MC3 algorithm of Madigan and York (Int. Stat. Rev. 63 (1995) 215), combined with efficient coding in Fortran, makes it feasible to conduct large simulations. In addition to posterior criteria, we shall also compare the predictive performance of different priors. A classic example concerning the economics of crime will also be provided and contrasted with results in the literature. The main findings of the paper will lead us to propose a ‘benchmark’ prior specification in a linear regression context with model uncertainty.
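A key appeal of the natural-conjugate g-prior structure described in this abstract is that each model's marginal likelihood is available in closed form, which is what makes large simulation studies with MC3 feasible. As a hedged sketch (using the common parameterization beta | sigma ~ N(0, g * sigma^2 * (X'X)^{-1}), which relates to the paper's g0j by a reparameterization; the toy data and variable names are ours, not the paper's):

```python
import numpy as np

def log_bf_gprior(y, X, g):
    """Log Bayes factor of {intercept + X} against the intercept-only null
    under Zellner's g-prior: (1+g)^((n-1-k)/2) * (1 + g*(1-R^2))^(-(n-1)/2)."""
    n, k = X.shape
    Xc = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    resid = y - Xc @ beta
    r2 = 1.0 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
    return 0.5 * (n - 1 - k) * np.log1p(g) - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2))

rng = np.random.default_rng(0)
n = 100
x1, x2 = rng.normal(size=(2, n))
y = 1.0 + 2.0 * x1 + rng.normal(size=n)        # only x1 truly matters

models = {"x1": x1[:, None], "x2": x2[:, None],
          "x1+x2": np.column_stack([x1, x2])}
g = float(n)                                   # unit-information choice g = n
logbf = {m: log_bf_gprior(y, X, g) for m, X in models.items()}

# uniform prior over models -> posterior model probabilities
mx = max(logbf.values())
w = {m: np.exp(v - mx) for m, v in logbf.items()}
post = {m: w[m] / sum(w.values()) for m in w}
```

The posterior probabilities concentrate on models containing the true regressor, illustrating the consistency property the paper studies.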

4.

We attempt to clarify a number of points regarding use of spatial regression models for regional growth analysis. We show that, as in the case of non-spatial growth regressions, the effect of initial regional income levels wears off over time. Unlike the non-spatial case, long-run regional income levels depend on: own region as well as neighbouring region characteristics, the spatial connectivity structure of the regions, and the strength of spatial dependence. Given this, the search for regional characteristics that exert important influences on income levels or growth rates should take place using spatial econometric methods that account for spatial dependence as well as own and neighbouring region characteristics, the type of spatial regression model specification, and the weight matrix. The framework adopted here illustrates a unified approach for dealing with these issues.

5.
The goal of this article is to develop a flexible Bayesian analysis of regression models for continuous and categorical outcomes. In the models we study, covariate (or regression) effects are modeled additively by cubic splines, and the error distribution (that of the latent outcomes in the case of categorical data) is modeled as a Dirichlet process mixture. We employ a relatively unexplored but attractive basis in which the spline coefficients are the unknown function ordinates at the knots. We exploit this feature to develop a proper prior distribution on the coefficients that involves the first and second differences of the ordinates, quantities about which one may have prior knowledge. We also discuss the problem of comparing models with different numbers of knots or different error distributions through marginal likelihoods and Bayes factors which are computed within the framework of Chib (1995) as extended to DPM models by Basu and Chib (2003). The techniques are illustrated with simulated and real data.

6.
We develop a Bayesian median autoregressive (BayesMAR) model for time series forecasting. The proposed method utilizes time-varying quantile regression at the median, favorably inheriting the robustness of median regression in contrast to the widely used mean-based methods. Motivated by a working Laplace likelihood approach in Bayesian quantile regression, BayesMAR adopts a parametric model bearing the same structure as autoregressive models by altering the Gaussian error to Laplace, leading to a simple, robust, and interpretable modeling strategy for time series forecasting. We estimate model parameters by Markov chain Monte Carlo. Bayesian model averaging is used to account for model uncertainty, including the uncertainty in the autoregressive order, in addition to a Bayesian model selection approach. The proposed methods are illustrated using simulations and real data applications. An application to U.S. macroeconomic data forecasting shows that BayesMAR leads to favorable and often superior predictive performance compared to the selected mean-based alternatives under various loss functions that encompass both point and probabilistic forecasts. The proposed methods are generic and can be used to complement a rich class of methods that build on autoregressive models.
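The working-Laplace idea behind BayesMAR can be sketched in a few lines. The following is a minimal illustration, not the paper's full implementation (we fix the order at 1, use flat priors and an untuned random-walk step, and omit the model-averaging layer; all tuning constants are our assumptions): an autoregression whose Gaussian error is replaced by a Laplace error, sampled by random-walk Metropolis.

```python
import numpy as np

def laplace_ar1_loglik(theta, y):
    """Working Laplace log-likelihood of a median AR(1):
    y_t = c + phi * y_{t-1} + e_t,  e_t ~ Laplace(0, b)."""
    c, phi, log_b = theta
    b = np.exp(log_b)
    e = y[1:] - c - phi * y[:-1]
    return -e.size * np.log(2.0 * b) - np.abs(e).sum() / b

rng = np.random.default_rng(1)

# Simulate an AR(1) with heavy-tailed Laplace shocks.
T, c_true, phi_true = 400, 0.5, 0.7
y = np.zeros(T)
for t in range(1, T):
    y[t] = c_true + phi_true * y[t - 1] + rng.laplace(scale=0.5)

# Random-walk Metropolis on (c, phi, log b) under flat priors.
theta = np.zeros(3)
ll = laplace_ar1_loglik(theta, y)
draws = []
for i in range(20000):
    prop = theta + rng.normal(scale=0.05, size=3)
    ll_prop = laplace_ar1_loglik(prop, y)
    if np.log(rng.uniform()) < ll_prop - ll:   # Metropolis accept step
        theta, ll = prop, ll_prop
    if i >= 5000:                              # discard burn-in
        draws.append(theta.copy())
phi_hat = np.mean([d[1] for d in draws])
```

The posterior mean of phi recovers the median-autoregressive coefficient even under heavy-tailed shocks, which is the robustness property the abstract emphasizes.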

7.
In many applications involving time-varying parameter VARs, it is desirable to restrict the VAR coefficients at each point in time to be non-explosive. This is an example of a problem where inequality restrictions are imposed on states in a state space model. In this paper, we describe how existing MCMC algorithms for imposing such inequality restrictions can work poorly (or not at all) and suggest alternative algorithms which exhibit better performance. Furthermore, we show that previous algorithms involve an approximation relating to a key prior integrating constant. Our algorithms are exact, not involving this approximation. In an application involving a commonly used U.S. data set, we present evidence that the algorithms proposed in this paper work well.
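A toy version of the inequality restriction involved is easy to sketch. The example below (ours, not the paper's) imposes stationarity on a single AR(1) coefficient with known unit error variance and a flat prior truncated to (-1, 1); here accept-reject sampling from the unrestricted conditional is exact, whereas the complications the paper addresses arise when the restriction applies to a whole sequence of time-varying states.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a stationary AR(1): y_t = phi*y_{t-1} + e_t, e_t ~ N(0, 1).
T, phi_true = 300, 0.95
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal()

# With known unit variance and a flat prior truncated to (-1, 1), the
# conditional posterior of phi is N(m, v) truncated to (-1, 1).
x, z = y[:-1], y[1:]
v = 1.0 / (x @ x)
m = v * (x @ z)

# Accept-reject: propose from the unrestricted normal, keep stationary draws.
draws = []
while len(draws) < 5000:
    d = rng.normal(m, np.sqrt(v))
    if abs(d) < 1.0:
        draws.append(d)
phi_hat = np.mean(draws)
```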

8.
This article reviews the application of some advanced Monte Carlo techniques in the context of multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from coupled pairs in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider some Markov chain Monte Carlo and sequential Monte Carlo methods, which have been introduced in the literature, and we describe different strategies that facilitate the application of MLMC within these methods.
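The telescoping representation can be demonstrated on a problem where exact coupling of adjacent levels is easy. The sketch below (our illustrative choices: Euler-discretised geometric Brownian motion, payoff f(S) = S_T, and ad hoc per-level sample sizes; none of it is from the article) couples fine and coarse paths by sharing Brownian increments.

```python
import numpy as np

def mlmc_level(rng, level, n_samples, T=1.0, s0=1.0, mu=0.05, sigma=0.2):
    """One term of the MLMC telescoping sum for Euler-discretised GBM
    dS = mu*S dt + sigma*S dW with payoff f(S) = S_T: level 0 returns the
    plain mean, level l >= 1 returns E[f_l - f_{l-1}] estimated with the
    SAME Brownian increments on the fine (2^l) and coarse (2^(l-1)) grids."""
    nf = 2 ** level
    hf = T / nf
    dW = rng.normal(scale=np.sqrt(hf), size=(n_samples, nf))
    sf = np.full(n_samples, s0)
    for i in range(nf):                      # fine Euler path
        sf = sf * (1.0 + mu * hf + sigma * dW[:, i])
    if level == 0:
        return sf.mean()
    hc, sc = 2.0 * hf, np.full(n_samples, s0)
    for i in range(nf // 2):                 # coarse path, summed increments
        sc = sc * (1.0 + mu * hc + sigma * (dW[:, 2 * i] + dW[:, 2 * i + 1]))
    return (sf - sc).mean()

rng = np.random.default_rng(2)
# E[f_L] = E[f_0] + sum_l E[f_l - f_{l-1}], with fewer samples per level as
# the correction variance shrinks.
estimate = sum(mlmc_level(rng, l, 200000 // 4 ** l + 100) for l in range(5))
# exact value for GBM: E[S_T] = s0 * exp(mu * T)
```

Because the coupled correction terms have shrinking variance, most of the sampling effort sits on the cheap coarse levels, which is the source of MLMC's cost reduction.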

9.
In the Bayesian approach to model selection and hypothesis testing, the Bayes factor plays a central role. However, the Bayes factor is very sensitive to prior distributions of parameters. This is a problem especially in the presence of weak prior information on the parameters of the models. The most radical consequence of this fact is that the Bayes factor is undetermined when improper priors are used. Nonetheless, extending the non-informative approach of Bayesian analysis to model selection/testing procedures is important both from a theoretical and an applied viewpoint. The need to develop automatic and robust methods for model comparison has led to the introduction of several alternative Bayes factors. In this paper we review one of these methods: the fractional Bayes factor (O'Hagan, 1995). We discuss general properties of the method, such as consistency and coherence. Furthermore, in addition to the original, essentially asymptotic justifications of the fractional Bayes factor, we provide further finite-sample motivations for its use. Connections and comparisons to other automatic methods are discussed and several issues of robustness with respect to priors and data are considered. Finally, we focus on some open problems in the fractional Bayes factor approach, and outline some possible answers and directions for future research.
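The fractional Bayes factor sidesteps the improper-prior indeterminacy by using a fraction b of the full likelihood to "train" the prior: each model's marginal likelihood is divided by the integral of prior times likelihood^b. For the toy problem of testing mu = 0 against mu free with y_i ~ N(mu, 1) and a flat prior (our example, not the paper's), one can derive the closed form FBF = sqrt(b) * exp((1-b) * n * ybar^2 / 2); the sketch below checks that derivation against direct numerical integration.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10
y = rng.normal(loc=0.3, size=n)
b = 1.0 / n                         # minimal training fraction b = m0/n

def frac_marginal(frac, mu_grid):
    """Numerical integral over mu of likelihood^frac under a flat prior,
    for the model y_i ~ N(mu, 1)."""
    dmu = mu_grid[1] - mu_grid[0]
    vals = [(2.0 * np.pi) ** (-0.5 * n * frac)
            * np.exp(-0.5 * frac * np.sum((y - mu) ** 2)) for mu in mu_grid]
    return np.sum(vals) * dmu

grid = np.linspace(-10.0, 10.0, 20001)
# M0 (mu = 0) has no free parameter, so its "fractional marginal" is just
# the likelihood (or likelihood^b) evaluated at mu = 0.
m0_full = (2.0 * np.pi) ** (-0.5 * n) * np.exp(-0.5 * np.sum(y ** 2))
m0_frac = (2.0 * np.pi) ** (-0.5 * n * b) * np.exp(-0.5 * b * np.sum(y ** 2))

fbf_numeric = (frac_marginal(1.0, grid) / frac_marginal(b, grid)) \
              / (m0_full / m0_frac)
fbf_closed = np.sqrt(b) * np.exp(0.5 * (1.0 - b) * n * y.mean() ** 2)
```

The arbitrary constant of the improper flat prior cancels in the ratio of full to fractional marginals, which is exactly the indeterminacy the FBF is designed to remove.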

10.
We develop a minimal theory of Markov chains, at as low a level of abstraction as possible, in order to prove two fundamental probability laws for standard Markov chain Monte Carlo algorithms:
1. The law of large numbers explains why the algorithm works: it states that the empirical means calculated from the samples converge towards their "true" expected values, i.e., expectations with respect to the invariant distribution of the associated Markov chain (the target distribution of the simulation).
2. The central limit theorem expresses the deviations of the empirical means from their expected values in terms of asymptotically normally distributed random variables. We also present a formula and an estimator for the associated variance.
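Both limit theorems can be demonstrated numerically. In the sketch below (a random-walk Metropolis chain targeting N(0, 1); the batch-means estimator stands in for the article's variance estimator, whose exact form we do not reproduce), the ergodic average illustrates the law of large numbers, and the batch-means estimate of the asymptotic variance feeds the CLT-based interval.

```python
import numpy as np

rng = np.random.default_rng(3)

# Random-walk Metropolis chain targeting the standard normal.
n, x = 100000, 0.0
chain = np.empty(n)
for i in range(n):
    prop = x + rng.normal(scale=2.0)
    if np.log(rng.uniform()) < 0.5 * (x * x - prop * prop):
        x = prop
    chain[i] = x

# Law of large numbers: the ergodic average approaches E[X] = 0.
mean = chain.mean()

# CLT: sqrt(n)*(mean - mu) is asymptotically N(0, sigma^2), where sigma^2
# exceeds the i.i.d. variance because of autocorrelation.  Batch means is
# one standard estimator of sigma^2.
n_batches = 100
batch_means = chain.reshape(n_batches, -1).mean(axis=1)
sigma2_hat = (n // n_batches) * batch_means.var(ddof=1)
half_width = 1.96 * np.sqrt(sigma2_hat / n)   # approx. 95% CI half-width
```

Note that sigma2_hat comes out well above 1 (the i.i.d. variance of the target): the positive autocorrelation of the chain inflates the asymptotic variance, which is why a naive i.i.d. standard error would be too optimistic.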

11.
In this paper we study the Candy model, a marked point process introduced by Stoica et al. (2000). We prove Ruelle and local stability, investigate its Markov properties, and discuss how the model may be sampled. Finally, we consider estimation of the model parameters and present a simulation study.

12.
This survey presents the set of methods available in the literature on selection bias correction when selection is specified as a multinomial logit model. It contrasts the underlying assumptions made by the different methods and reports results from a set of Monte Carlo experiments. We find that, in many cases, the approach initiated by Dubin and McFadden (1984), as well as the semi-parametric alternative recently proposed by Dahl (2002), is to be preferred to the most commonly used Lee (1983) method. We also find that a restriction imposed in the original Dubin and McFadden paper can be waived to achieve more robust estimators. The Monte Carlo experiments also show that selection bias correction based on the multinomial logit model can provide fairly good correction for the outcome equation, even when the IIA hypothesis is violated.

13.
Bayesian modification indices are presented that provide information for the process of model evaluation and model modification. These indices can be used to investigate the improvement in a model if fixed parameters are re-specified as free parameters. The indices can be seen as a Bayesian analogue to the modification indices commonly used in a frequentist framework. The aim is to provide diagnostic information for multi-parameter models where the number of possible model violations, and the related number of alternative models, is too large to render estimation of each alternative practical. As an example, the method is applied to an item response theory (IRT) model, namely the two-parameter model. The method is used to investigate differential item functioning and violations of the assumption of local independence.

14.
This paper develops Bayesian methodology for estimating long-term trends in the daily maxima of tropospheric ozone. The methods are then applied to study long-term trends in ozone at six monitoring sites in the state of Texas. The methodology controls for the effects of meteorological variables because it is known that variables such as temperature, wind speed and humidity substantially affect the formation of tropospheric ozone. A semiparametric regression model is estimated in which a nonparametric trivariate surface is used to model the relationship between ozone and these meteorological variables because, while it is known that the relationship is a complex nonlinear one, its functional form is unknown. The model also allows for the effects of wind direction and seasonality. The errors are modeled as an autoregression, which is methodologically challenging because the observations are unequally spaced over time. Each function in the model is represented as a linear combination of basis functions located at all of the design points. We also estimate an appropriate data transformation simultaneously with the functions. The functions are estimated nonparametrically by a Bayesian hierarchical model that uses indicator variables to allow a non-zero probability that the coefficient of each basis term is zero. The entire model, including the nonparametric surfaces, data transformation and autoregression for the unequally spaced errors, is estimated using a Markov chain Monte Carlo sampling scheme with a computationally efficient transition kernel for generating the indicator variables. The empirical results indicate that key meteorological variables explain most of the variation in daily ozone maxima through a nonlinear interaction and that their effects are consistent across the six sites. However, the estimated trends vary considerably from site to site, even within the same city.

15.
A new approach to Markov chain Monte Carlo simulation was recently proposed by Propp and Wilson. This approach, unlike traditional ones, yields samples which have exactly the desired distribution. The Propp–Wilson algorithm requires this distribution to have a certain structure called monotonicity. In this paper an idea of Kendall is applied to show how the algorithm can be extended to the case where monotonicity is replaced by anti-monotonicity. As illustrating examples, simulations of the hard-core model and the random-cluster model are presented.
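The anti-monotone extension aside, the original monotone Propp–Wilson algorithm is easy to sketch. In the illustrative example below (our own: a lazy symmetric random walk on {0, ..., 10} whose update rule preserves the partial order), chains started from the minimal and maximal states are driven by the same randomness over a window repeatedly doubled into the past; when they coalesce, the common value is an exact draw from the stationary distribution (uniform here, by symmetry).

```python
import numpy as np

K = 10  # state space {0, ..., K}

def update(x, u):
    """Monotone update of a lazy random walk: up if u < 0.4, down if
    u > 0.6, else stay.  If x <= y then update(x, u) <= update(y, u)."""
    if u < 0.4:
        return min(x + 1, K)
    if u > 0.6:
        return max(x - 1, 0)
    return x

def cftp(rng):
    """Propp-Wilson coupling from the past.  Monotonicity lets us track
    only the chains from the minimal (0) and maximal (K) states; the same
    stored randomness must be reused each time the window is extended."""
    t_new, noise = 1, []
    while True:
        noise = list(rng.uniform(size=t_new)) + noise  # extend into the past
        lo, hi = 0, K
        for u in noise:
            lo, hi = update(lo, u), update(hi, u)
        if lo == hi:
            return lo            # exact stationary draw
        t_new = len(noise)       # double the window

rng = np.random.default_rng(4)
draws = [cftp(rng) for _ in range(500)]
```

Reusing the stored randomness on each extension is essential: redrawing it would bias the output towards fast-coalescing paths.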

16.
In application areas which involve digitised speech and audio signals, such as coding, digital remastering of old recordings and recognition of speech, it is often desirable to reduce the effects of noise with the aim of enhancing intelligibility and perceived sound quality. We consider the case where noise sources contain non-Gaussian, impulsive elements superimposed upon a continuous Gaussian background. Such a situation arises in areas such as communications channels, telephony and gramophone recordings, where impulsive effects might be caused by electromagnetic interference (lightning strikes), electrical switching noise or defects in recording media, while electrical circuit noise or the combined effect of many distant atmospheric events leads to a continuous Gaussian component.
In this paper we discuss the background to this type of noise degradation and describe briefly some existing statistical techniques for noise reduction. We propose new methods for enhancement based upon Markov chain Monte Carlo (MCMC) simulation. Signals are modelled as autoregressive moving-average (ARMA) processes, while noise sources are treated as discrete and continuous mixtures of Gaussian distributions. Results are presented for both real and artificially corrupted data sequences, illustrating the potential of the new methods.

17.
There is a common belief among researchers and regional policy makers that the current centralized system of Aeropuertos Españoles y Navegación Aérea (AENA) should be replaced by a more decentralized one in which airport managers have more autonomy. The main objective of this article is to evaluate the efficiency of Spanish airports using Markov chain Monte Carlo (MCMC) simulation to estimate a stochastic frontier analysis (SFA) model. Our results show a significant level of inefficiency in airport operations. Additionally, we provide efficient marginal cost estimates for each airport, which also cast some doubt on current pricing practices.

18.
Road safety has become a major concern in most modern societies. In this respect, identifying road locations that are more dangerous than others (black spots, also called sites with promise) can help in better scheduling road safety policies. The present paper proposes a multivariate model to identify and rank sites according to their total expected cost to society. Bayesian estimation of the model via a Markov chain Monte Carlo approach is discussed. To illustrate the proposed model, accident data from 23,184 accident locations in Flanders (Belgium) are used, and a cost function proposed by the European Transport Safety Council is adopted. It is shown that the model produces insightful results that can help policy makers prioritize road infrastructure investments.

19.
This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. It presents two alternative approaches that can be implemented using Gibbs sampling methods in a straightforward way and which allow one to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. A simulation study shows that the variable selection approaches tend to outperform existing Bayesian model averaging techniques in terms of both in-sample predictive performance and computational efficiency. The alternative approaches are compared in an empirical application using data on economic growth for European NUTS-2 regions.

20.
This paper presents a full Bayesian analysis of a nonparametric spatial lag model, covering both parameter estimation and the fitting of the unknown link function with free-knot splines. The proposed Bayesian method is implemented via a reversible-jump Markov chain Monte Carlo (RJMCMC) algorithm. In the Bayesian analysis, conjugate normal-inverse-gamma priors are chosen for the spline coefficients and the error variance, from which the marginal posterior distributions of the other unknowns are obtained. In addition, a simple but general random-walk Metropolis sampler is designed to facilitate sampling from the conditional posterior distribution of the spatial weight parameter. Finally, the proposed method is illustrated by numerical simulations.
