Similar Articles
20 similar articles found
1.
Markov Chain Monte Carlo (MCMC) methods are used to sample from complicated multivariate distributions with normalizing constants that may not be computable in practice and from which direct sampling is not feasible. A fundamental problem is to determine convergence of the chains. Propp & Wilson (1996) devised a Markov chain algorithm called Coupling From The Past (CFTP) that solves this problem: it produces exact samples from the target distribution and determines automatically how long it needs to run. Exact sampling by CFTP and other methods is currently a thriving research topic. This paper reviews some of these ideas, with emphasis on the CFTP algorithm. The concepts of coupling and monotone CFTP are introduced, and results on the running time of the algorithm are presented. The interruptible method of Fill (1998) and the method of Murdoch & Green (1998) for exact sampling from continuous distributions are presented. Novel simulation experiments are reported for exact sampling from the Ising model in the setting of Bayesian image restoration, and the results are compared to standard MCMC. The results show that CFTP works at least as well as standard MCMC, with convergence monitored by the method of Raftery & Lewis (1992, 1996).
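A minimal sketch of monotone CFTP, for intuition rather than the paper's Ising experiment: coupled chains are started from the minimal and maximal states of a small monotone chain (here a reflecting random walk on {0, ..., n}, a choice of ours), the same innovations are reused each time the starting time is pushed further into the past, and coalescence by time 0 yields an exact stationary draw. All names and parameters are illustrative.

```python
import numpy as np

def update(x, u, n, p):
    # One step of a reflecting random walk on {0, ..., n}: move up with
    # probability p, otherwise down. Feeding the SAME uniform u to every
    # start state makes x -> update(x, u, n, p) monotone in x, which is
    # the property monotone CFTP relies on.
    return min(n, x + 1) if u < p else max(0, x - 1)

def monotone_cftp(n=10, p=0.3, rng=None):
    """Exact stationary draw via Propp-Wilson coupling from the past:
    run the chain from the minimal (0) and maximal (n) states from time -t;
    if they coalesce by time 0, the common value is an exact sample."""
    rng = np.random.default_rng() if rng is None else rng
    us = []                 # us[k] is the innovation used at time -(k+1)
    t = 1
    while True:
        while len(us) < t:  # extend the past, REUSING old innovations
            us.append(rng.random())
        lo, hi = 0, n
        for k in range(t - 1, -1, -1):   # apply times -t, ..., -1
            lo = update(lo, us[k], n, p)
            hi = update(hi, us[k], n, p)
        if lo == hi:
            return lo       # exact draw from the stationary distribution
        t *= 2              # no coalescence: look further into the past

samples = [monotone_cftp() for _ in range(1000)]
```

Reusing the stored innovations when the horizon is doubled is essential; drawing fresh ones would break the exactness guarantee.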

2.
The purpose of this paper is to develop an efficiency measurement model by enhancing a CCR (Charnes-Cooper-Rhodes) model and then to prove that the enhanced model satisfies five desirable properties: indication, strict monotonicity, homogeneity, continuity and units invariance. In order for our model to be empirically tractable, we also provide an algorithm aimed at estimating efficiency scores.
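For reference, a sketch of the plain input-oriented CCR model (envelopment form) that the paper enhances, written as a linear program; this is the textbook model, not the enhanced one, and the data and function name are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: (m, n) inputs, Y: (s, n) outputs; columns are DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta
    # inputs:  sum_j lam_j x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # outputs: -sum_j lam_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

X = np.array([[2., 4., 3.], [3., 1., 2.]])   # 2 inputs, 3 DMUs (toy data)
Y = np.array([[1., 1., 1.]])                 # 1 output
scores = [ccr_efficiency(X, Y, o) for o in range(3)]
```

The optimal value theta lies in (0, 1], with theta = 1 for units on the efficient frontier.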

3.
A family of direct utility functions is constructed which exhibits Giffenity while satisfying the axioms of convexity and monotonicity. The approach starts by specifying a price offer curve, C0, with a required backward-bending segment. Then a set of convex indifference curves is constructed having a price offer curve arbitrarily close to C0.

4.
In order to better understand the monotonicity properties which characterize the gradients of pseudoconvex and quasiconvex functions, pseudomonotonicity and quasimonotonicity can be introduced. A quite different approach is proposed in this paper: new order relations are defined whose preservation leads precisely to several kinds of pseudoconvexity and quasiconvexity. Some general properties of order preservation are proved; they are useful for stating necessary and sufficient conditions of monotonicity for particular orders related to generalized convexity.
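For orientation, the standard definitions (in our notation, for a map F on a convex set K) that this literature builds on:

```latex
\text{pseudomonotone: } \langle F(y),\, x - y \rangle \ge 0
  \;\Longrightarrow\; \langle F(x),\, x - y \rangle \ge 0
  \quad \forall\, x, y \in K, \\
\text{quasimonotone: } \langle F(y),\, x - y \rangle > 0
  \;\Longrightarrow\; \langle F(x),\, x - y \rangle \ge 0
  \quad \forall\, x, y \in K.
```

With F = ∇f for differentiable f, these properties characterize pseudoconvexity and quasiconvexity of f, which is the gradient connection the abstract refers to.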

5.
In epidemiology and clinical research, there is often a proportion of unexposed individuals, resulting in zero values of exposure: some individuals are not exposed, and the exposures of those exposed follow some continuous distribution. Examples are smoking or alcohol consumption. We call these variables with a spike at zero (SAZ). In this paper, we perform a systematic investigation of how to model covariates with a SAZ and derive theoretical odds ratio functions for selected bivariate distributions. We consider the bivariate normal and bivariate log-normal distribution with a SAZ. Both confounding and effect modification can be elegantly described by formalizing the covariance matrix given the binary outcome variable Y. To model the effect of these variables, we use a procedure based on fractional polynomials, first introduced by Royston and Altman (1994, Applied Statistics 43: 429–467) and modified for the SAZ situation (Royston and Sauerbrei, 2008, Multivariable model-building: a pragmatic approach to regression analysis based on fractional polynomials for modelling continuous variables, Wiley; Becher et al., 2012, Biometrical Journal 54: 686–700). We aim to contribute to theory, practical procedures and application in epidemiology and clinical research for deriving multivariable models for variables with a SAZ. As an example, we use data from a case–control study on lung cancer.
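A minimal sketch of the indicator-plus-FP1 device for a SAZ covariate in a logistic model: a dummy for being exposed plus a fractional polynomial transform of the positive exposures. The standard FP power set is assumed; selection by AIC, the simulated data, and all names are our illustrative choices, not the paper's exact procedure.

```python
import numpy as np
import statsmodels.api as sm

POWERS = [-2, -1, -0.5, 0, 0.5, 1, 2, 3]   # conventional FP1 power set

def fp1(x, p):
    # FP1 transform; by convention p == 0 means log(x)
    return np.log(x) if p == 0 else x ** p

def fit_saz_logistic(y, x):
    pos = x > 0
    best = None
    for p in POWERS:
        t = np.zeros_like(x, dtype=float)
        t[pos] = fp1(x[pos], p)          # transform only the positive part
        design = sm.add_constant(np.column_stack([pos.astype(float), t]))
        fit = sm.Logit(y, design).fit(disp=0)
        if best is None or fit.aic < best[1]:
            best = (p, fit.aic, fit)
    return best   # chosen power, its AIC, fitted model

rng = np.random.default_rng(0)
x = np.where(rng.random(500) < 0.4, 0.0, rng.lognormal(size=500))
eta = -1 + 0.8 * (x > 0) + 0.5 * np.where(x > 0, np.log(np.maximum(x, 1e-12)), 0)
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))
p, aic, model = fit_saz_logistic(y, x)
```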

6.
This paper estimates a class of models which satisfy a monotonicity condition on the conditional quantile function of the response variable. This class includes as a special case the monotonic transformation model with the error term satisfying a conditional quantile restriction, thus allowing for very general forms of conditional heteroscedasticity. A two-stage approach is adopted to estimate the relevant parameters. In the first stage the conditional quantile function is estimated nonparametrically by the local polynomial estimator discussed in Chaudhuri (Journal of Multivariate Analysis 39 (1991a) 246–269; Annals of Statistics 19 (1991b) 760–777) and Cavanagh (1996, Preprint). In the second stage, the monotonicity of the quantile function is exploited to estimate the parameters of interest by maximizing a rank-based objective function. The proposed estimator is shown to have desirable asymptotic properties and can also be used for dimensionality reduction or to estimate the unknown structural function in the context of a transformation model.
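A schematic of the second stage only, assuming first-stage quantile estimates are already in hand: the rank-based objective counts pairwise agreements between the estimated quantiles and the index x'b, here maximized by a crude grid search over a coefficient normalized so its first component is one.

```python
import numpy as np

def rank_objective(b, x, qhat):
    # count ordered pairs on which the estimated quantiles and the
    # index x'b agree in ranking
    idx = x @ b
    agree = (qhat[:, None] > qhat[None, :]) & (idx[:, None] > idx[None, :])
    return agree.sum()

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=(n, 2))
beta0 = np.array([1.0, -0.5])
qhat = x @ beta0 + 0.1 * rng.normal(size=n)   # stand-in for stage-one output

grid = np.linspace(-2, 2, 81)                 # search over the free coefficient
scores = [rank_objective(np.array([1.0, b2]), x, qhat) for b2 in grid]
b2_hat = grid[int(np.argmax(scores))]
```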

7.
In this study, a stochastic multi-objective mixed-integer mathematical programming model is proposed for logistic distribution and evacuation planning during an earthquake. Decisions about the pre- and post-disaster phases are treated in an integrated way. The decisions of the pre-disaster phase concern the location of permanent relief distribution centers and the quantities of commodities to be stored. The decisions of the second phase determine the optimal locations for establishing temporary care centers, to increase the speed of treating injured people, and the distribution of the commodities to the affected areas. Humanitarian and cost issues are considered in the proposed model through three objective functions. Several sets of constraints are also included to make the model flexible enough to handle real issues. Demands for food, blood, water, blankets, and tents are assumed to be probabilistic; they depend on several complicating factors and are modeled using a network. A simulation is set up to generate the probabilistic distribution of demands through several scenarios. The stochastic demands serve as inputs for the proposed stochastic multi-objective mixed-integer mathematical programming model. The model is transformed to its deterministic equivalent using a chance constraint programming approach. The equivalent deterministic model is solved using an efficient epsilon-constraint approach and an evolutionary algorithm, the non-dominated sorting genetic algorithm (NSGA-II). First, several illustrative numerical examples are solved using both solution procedures. The performance of the solution procedures is compared, and the more efficient one, NSGA-II, is used to handle a case study of a Tehran earthquake. The results are promising and show that the proposed model and solution approach can handle the real case study efficiently.
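The chance-constraint transformation mentioned above reduces, for a one-sided demand-coverage constraint, to replacing the random demand with its quantile; in our notation (x_k for pre-positioned stock of commodity k, d_k for its random demand with CDF F_k):

```latex
P\left( x_k \ge d_k \right) \ge \alpha
\quad\Longleftrightarrow\quad
x_k \ge F_k^{-1}(\alpha).
```

Each probabilistic constraint thus becomes deterministic, and the resulting model is what the epsilon-constraint method and NSGA-II are applied to.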

8.
We propose a novel methodology for identification of first-price auctions when bidders' private valuations are independent conditional on one-dimensional unobserved heterogeneity. We extend the existing literature by allowing the unobserved heterogeneity to be non-separable from bidders' valuations. Our central identifying assumption is that the distribution of bidder values is increasing in the state. When the state-space is finite, such monotonicity implies the full-rank condition needed for identification. Further, we extend our approach to the conditionally independent private values model of Li et al. (2000), as well as to unobserved heterogeneity settings in which the implicit reserve price or the cost of bidding varies across auctions.

9.
The proportional odds model is the most widely used model when the response has ordered categories. In the case of a high-dimensional predictor structure, the common maximum likelihood approach typically fails when all predictors are included. A boosting technique, pomBoost, is proposed to fit the model by implicitly selecting the influential predictors. The approach distinguishes between metric and categorical predictors. In the case of categorical predictors, where each predictor relates to a set of parameters, the objective is to select all the associated parameters simultaneously. In addition, the approach distinguishes between nominal and ordinal predictors. For ordinal predictors, the proposed technique exploits their ordering by penalizing the difference between the parameters of adjacent categories. The technique also makes provision for mandatory predictors (if any) that must be part of the final sparse model. The performance of the proposed boosting algorithm is evaluated in a simulation study and in applications, with respect to mean squared error and prediction error. Hit rates and false alarm rates are used to judge the performance of pomBoost in selecting the relevant predictors.
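For context, the proportional odds model being fitted and the style of ordinal penalty described above, in our notation:

```latex
\log \frac{P(Y \le k \mid x)}{P(Y > k \mid x)} \;=\; \theta_k - x^{\top}\beta,
\qquad k = 1, \dots, K - 1.
```

For an ordinal predictor with dummy coefficients β_1, ..., β_C, a penalty proportional to Σ_{c=2}^{C} (β_c − β_{c−1})² shrinks the effects of adjacent categories toward each other, which is the device the abstract describes.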

10.
Discrete choice experiments are widely used to learn about the distribution of individual preferences for product attributes. Such experiments are often conducted expressly for the purpose of designing new products. There is a long-standing literature on nonparametric and Bayesian modelling of preferences for the study of consumer choice when there is a market for each product, but this work does not apply when such markets fail to exist, as is the case with most product attributes. This paper takes up the common case in which attributes can be quantified and preferences over these attributes are monotone. It shows that monotonicity is the only shape constraint appropriate for a utility function in these circumstances. The paper models components of utility using a Dirichlet prior distribution and demonstrates that all monotone nondecreasing utility functions are supported by the prior. It develops a Markov chain Monte Carlo algorithm for posterior simulation that is reliable and practical given the number of attributes, choices and sample sizes characteristic of discrete choice experiments. The paper uses the algorithm to demonstrate the flexibility of the model in capturing heterogeneous preferences and applies it to a discrete choice experiment that elicits preferences for different auto insurance policies.
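One simple way to see how a Dirichlet prior can support every monotone nondecreasing utility: draw nonnegative increments over a grid of attribute levels and accumulate them. The grid, concentration parameter, and names below are our illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(2)
grid = np.linspace(0.0, 1.0, 21)          # attribute levels
alpha = np.full(len(grid) - 1, 0.5)       # Dirichlet concentration
increments = rng.dirichlet(alpha)         # nonnegative, sums to one
utility = np.concatenate([[0.0], np.cumsum(increments)])  # u(0)=0, u(1)=1

def u(x):
    # evaluate the monotone step utility at attribute level x in [0, 1]
    return utility[np.searchsorted(grid, x, side="right") - 1]
```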

11.
In this paper we formulate a family of conditions called 'Bk-monotonicity' that are necessary for Nash implementation, where k is a natural number that indexes a particular condition, and where the condition only becomes more restrictive as k increases. Bk-monotonicity is in general a stricter condition than Maskin monotonicity, and can be used to show that certain social choice correspondences that satisfy Maskin monotonicity cannot be Nash implemented.

12.
This paper makes two important contributions to the literature on prediction intervals for firm-specific inefficiency estimates in cross-sectional SFA models. Firstly, the existing intervals in the literature are not the minimum-width intervals; we discuss how to compute such intervals and how they either include or exclude zero as a lower bound, depending on where the probability mass of the distribution of u_i | ε_i resides. This has useful implications for practitioners and policy makers, with the greatest reductions in interval width for the most efficient firms. Secondly, we propose an 'asymptotic' approach to incorporating parameter uncertainty into prediction intervals for firm-specific inefficiency (given that in practice model parameters have to be estimated) as an alternative to the 'bagging' procedure suggested in Simar and Wilson (Econom Rev 29(1):62-98, 2010). The approach is computationally much simpler than the bagging approach.
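A sketch of the minimum-width interval computation, under the standard assumption that u_i | ε_i is normal truncated to [0, ∞); the parameter values and the use of scipy are ours. For a unimodal truncated normal the shortest interval is the highest-density one, found here by a one-dimensional search over the lower endpoint:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import truncnorm

def min_width_interval(mu, s, level=0.95):
    # u|eps ~ N(mu, s^2) truncated to [0, inf); a, b are standardized bounds
    dist = truncnorm(a=-mu / s, b=np.inf, loc=mu, scale=s)

    def width(l):
        # width of the interval with coverage `level` starting at l
        target = dist.cdf(l) + level
        return (dist.ppf(target) - l) if target < 1 else np.inf

    res = minimize_scalar(width, bounds=(0.0, dist.ppf(1 - level)),
                          method="bounded")
    lo = res.x
    return lo, lo + width(lo)

print(min_width_interval(mu=0.2, s=0.3))   # toy conditional-mean and sd values
```

When the conditional mode sits at zero, the search drives the lower endpoint to zero, which is the include-zero case described above.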

13.
This paper develops the structure of a parsimonious Portfolio Index (PI) GARCH model. Unlike the conventional approach to Portfolio Index returns, which employs the univariate ARCH class, the PI-GARCH approach incorporates the effects of news on individual assets, leading to a better understanding of portfolio risk management and greater accuracy in forecasting Value-at-Risk (VaR) thresholds. For various asymmetric GARCH models, a Portfolio Index Composite News Impact Surface (PI-CNIS) is developed to measure the effects of news on the conditional variances. The paper also investigates the finite sample properties of the PI-GARCH model. The empirical example shows that the asymmetric PI-GARCH-t model outperforms the GJR-t model and the filtered historical simulation with a t distribution in forecasting VaR thresholds.
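As a hedged illustration of the benchmark side of the comparison (the paper's PI-GARCH model has no public reference implementation we can assume), a one-step 99% VaR threshold from a GJR-GARCH-t model using the arch package on toy data:

```python
import numpy as np
from arch import arch_model
from scipy.stats import t

rng = np.random.default_rng(3)
returns = rng.standard_t(df=8, size=2000)        # toy percent-return series

am = arch_model(returns, vol="GARCH", p=1, o=1, q=1, dist="t")  # o=1 => GJR
res = am.fit(disp="off")
fcast = res.forecast(horizon=1)
sigma = np.sqrt(fcast.variance.values[-1, 0])    # one-step-ahead volatility

nu = res.params["nu"]
# left-tail quantile of a standardized (unit-variance) Student-t
q = t.ppf(0.01, nu) * np.sqrt((nu - 2) / nu)
var_99 = res.params["mu"] + sigma * q            # 99% VaR threshold
```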

14.
Journal of Econometrics, 2005, 126(2): 493-523
The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when the regularity restrictions are imposed.
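The regularity-imposing step can be pictured as Metropolis-Hastings sampling from a truncated conditional pdf: proposals violating the constraints receive zero posterior mass and are rejected outright. The sketch below is schematic, with log_post and satisfies_regularity standing in for the translog posterior and the monotonicity/convexity checks at the data points.

```python
import numpy as np

def mh_step(theta, log_post, satisfies_regularity, step, rng):
    # Random-walk Metropolis-Hastings step on a truncated target:
    # draws outside the regularity region have zero density, so they
    # are rejected immediately and the chain stays put.
    prop = theta + step * rng.normal(size=theta.shape)
    if not satisfies_regularity(prop):
        return theta
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        return prop
    return theta
```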

15.
For contingency tables with extensive missing data, the unrestricted MLE under the saturated model, computed by the EM algorithm, is generally unsatisfactory. In this case, it may be better to fit a simpler model by imposing some restrictions on the parameter space. Perlman and Wu (1999) propose lattice conditional independence (LCI) models for contingency tables with arbitrary missing data patterns. When the LCI model fits well, the restricted MLE under the LCI model is more accurate than the unrestricted MLE under the saturated model, but this need not hold in general. Here we propose certain empirical Bayes (EB) estimators that adaptively combine the best features of the restricted and unrestricted MLEs. These EB estimators appear to be especially useful when the observed data are sparse, even in cases where the suitability of the LCI model is uncertain. We also study a restricted EM algorithm (called the ER algorithm) with similar desirable features.

16.
It is shown how to implement an EM algorithm for maximum likelihood estimation of hierarchical nonlinear models for data sets consisting of more than two levels of nesting. This upward-downward algorithm makes use of the conditional independence assumptions implied by the hierarchical model. It can be used not only to estimate models with a parametric specification of the random effects, but also to extend the two-level nonparametric approach, sometimes referred to as latent class regression, to three or more levels. The proposed approach is illustrated with an empirical application.
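The conditional independence structure that the upward-downward algorithm exploits is visible in the three-level nonparametric likelihood (notation ours: top-level units k, second-level units j, observations i, latent classes m and c at the two upper levels):

```latex
L \;=\; \prod_{k} \sum_{m=1}^{M} \pi_m
  \prod_{j \in k} \sum_{c=1}^{C} \pi_{c \mid m}
  \prod_{i \in j} f\!\left( y_{ijk} \mid c, m \right).
```

The upward pass accumulates the inner sums and products; the downward pass propagates posterior class probabilities back down, yielding the E-step quantities.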

17.
The surplus approach to value and distribution represents an older and, in this author's opinion, much more relevant way to model capitalistic socio-economic systems than the mainstream neoclassical approach. This essay develops the analytical structure of the surplus approach by tracing its origins to the groundbreaking circular flow concept of the Physiocrats and Quesnay's Tableau Economique, the Classical economists, Marx, and Piero Sraffa. Original archival material from the Sraffa Papers at the Wren Library, Trinity College, University of Cambridge is also presented in support of the thesis advanced. Finally, the surplus approach is developed in terms of the distribution and growth nexus.

18.
This paper extends Kurz's (1968) growth model to a stochastic growth framework with social-status concern and unbounded production shocks. Using the stochastic monotonicity of a stochastic dynamic system and the methods adopted in Zhang (2007), the existence, uniqueness, and stability of the invariant distribution are investigated. In contrast to the existence of multiple steady states under certainty, it is shown here that there exists a unique stable invariant distribution under uncertainty.

19.
In a regression context, consider the difference in expected outcome associated with a particular difference in one of the input variables. If the true regression relationship involves interactions, then this predictive comparison can depend on the values of the other input variables. Therefore, one may wish to consider an average predictive comparison as a target of inference, where the averaging is with respect to the population distribution of the input variables. We consider inferences about such targets, with emphasis on inferential performance when the regression model is misspecified. In particular, in light of the difficulties in dealing with interaction terms in regression models, we examine inferences about average predictive comparisons when additive models are fitted to relationships truly involving pairwise interaction terms. We identify some circumstances where such inferences are consistent despite the model misspecification, notably when the input variables are independent or have a multivariate normal distribution.
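In symbols (ours, in the spirit of this line of work), the average predictive comparison for an input of interest u moving from u^{(1)} to u^{(2)}, with the remaining inputs v averaged over their population distribution, is

```latex
\Delta_u \;=\;
\frac{\mathbb{E}_{v}\!\left[\, E\big( y \mid u^{(2)}, v \big)
  - E\big( y \mid u^{(1)}, v \big) \,\right]}{u^{(2)} - u^{(1)}}.
```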

20.
With the aid of the Bank's banknote sorting system, the issue and subsequent withdrawal of f 25 banknotes on three varieties of paper have been recorded for two-and-a-half years. The aim was to measure the durability of the three paper varieties in circulation. The results of this second trial, with f 25 banknotes, confirm the statistical model developed previously for the first trial with f 100 banknotes. Gresham's Law is likewise not applicable, either to f 25 banknotes or to f 100 banknotes. A two-parameter gamma distribution fits the cumulative fraction of banknotes withdrawn reasonably well.
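A minimal sketch of the final modelling step, fitting a two-parameter gamma distribution to circulation times (issue to withdrawal); the data are synthetic, and pinning the location at zero so that scipy estimates only shape and scale is our illustrative choice.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(4)
lifetimes = rng.gamma(shape=2.0, scale=9.0, size=5000)   # months, toy data

# floc=0 fixes the location, so only shape and scale are estimated,
# i.e., the two-parameter gamma of the abstract
shape, loc, scale = gamma.fit(lifetimes, floc=0)

# cumulative fraction of banknotes withdrawn by time t = fitted gamma CDF
frac_by_24_months = gamma.cdf(24.0, shape, loc=loc, scale=scale)
```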
