Similar Literature
Found 20 similar documents (search time: 15 ms)
1.
Analysis of the behavior of technical inefficiency with respect to the parameters and variables of a stochastic frontier model is a neglected area of research in the frontier literature. An attempt in this direction, however, has recently been made: it has been shown that in a “standard” stochastic frontier model both firm-level technical inefficiency and production uncertainty are monotonically decreasing with observational error. In this paper we show, for a stochastic frontier model whose error components are jointly distributed as truncated bivariate normal, that this property holds if and only if the distribution of observational error is negatively skewed. We also derive a necessary and sufficient condition under which both firm-level technical inefficiency and production uncertainty are monotonically increasing with the noise-inefficiency correlation. We next propose a new measure of industry-level production uncertainty and establish the necessary and sufficient condition for firm-level technical inefficiency and production uncertainty to be monotonically increasing with industry-level production uncertainty. We also study the limiting probabilistic behavior of these conditions under different parametric configurations of our model. Finally, we carry out Monte Carlo simulations to study the sample behavior of the population monotonicity properties of firm-level technical inefficiency and production uncertainty in our model.
Arabinda Das
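The monotonicity at issue can be seen in the simpler, independent normal/half-normal ("standard") frontier model, where the JLMS estimator of firm-level inefficiency, E[u | ε], is decreasing in the composed error. The sketch below is illustrative only — it is the textbook normal/half-normal case, not the paper's truncated bivariate normal specification, and the function names are our own.

```python
import math

def npdf(x):
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def ncdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def jlms(eps, su, sv):
    """JLMS estimator E[u | eps] in the normal/half-normal frontier model,
    with independent noise (sd sv) and half-normal inefficiency (scale su)."""
    s2 = su * su + sv * sv
    mu_star = -eps * su * su / s2
    sigma_star = su * sv / math.sqrt(s2)
    z = mu_star / sigma_star
    return sigma_star * (npdf(z) / ncdf(z) + z)

# E[u | eps] falls as the composed error eps rises (a more favourable draw)
te = [jlms(e / 10, su=1.0, sv=0.5) for e in range(-20, 21)]
```

A quick scan of `te` over the grid confirms the classical result that estimated inefficiency is strictly decreasing in the composed error for this specification.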

2.
This paper presents a general result on the random selection of an element from an ordered sequence of risks and uses this result to derive additive and cross risk apportionment. Preferences favoring an improvement of the sampling distribution in univariate or bivariate first-order stochastic dominance are those exhibiting additive or cross risk apportionment. The univariate additive and multiplicative risk apportionment concepts are then related to the notion of bivariate cross risk apportionment by viewing the single-attribute utility function of an aggregate position (sum or product of attributes) as a 2-attribute utility function. The results derived in the present paper allow one to further explore the connections between the different concepts of risk apportionment proposed so far in the literature.

3.
An ambiguous return pattern for the PEGR strategy (the ratio of the stock’s price/earnings to its estimated earnings growth rate) has been documented in the literature on US stock markets. While stock prices and earnings per share (EPS) are objective data, the earnings growth rate is estimated by analysts, and the estimation method partially explains the vague PEGR return pattern. The purpose of this study is not to deny or substitute analysts’ estimation, but rather to provide a simple and popular method, a log-linear regression model, to forecast the earnings growth rate (G), and to examine whether a typical PEGR effect, analogous to the PER (price/earnings ratio) or PBR (price/book ratio) effects, exists under our alternative estimation method. Our evidence indeed shows that returns on the lowest-PEGR portfolio not only dominate those of all higher-PEGR portfolios but also beat the market under stochastic dominance (SD) analysis, which is consistent with our prediction. Our results imply, at least, that using the log-linear regression model to construct PEGR-sorted portfolios can benefit investors, and that the model is also a good choice for analysts in their forecasting.
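A log-linear forecast of G can be sketched as an OLS fit of ln(EPS) on a time trend, with the compound growth rate recovered as exp(slope) − 1. This is a generic sketch of such a regression, not the paper's exact specification, and the EPS figures are made up for illustration.

```python
import numpy as np

def loglinear_growth(eps_series):
    """Estimate a constant earnings growth rate G by regressing
    ln(EPS) on a time trend and compounding the fitted slope."""
    y = np.log(np.asarray(eps_series, dtype=float))
    t = np.arange(len(y))
    slope, intercept = np.polyfit(t, y, 1)  # highest degree first
    return np.exp(slope) - 1.0  # per-period compound growth rate

# EPS growing at roughly 8% a year (illustrative numbers)
eps = [1.00, 1.08, 1.17, 1.26, 1.36]
g = loglinear_growth(eps)
```

The estimated G can then be combined with the observed P/E to sort stocks into PEGR portfolios.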

4.
We aim to calibrate stochastic volatility models from option prices. We develop a Tikhonov regularization approach with an efficient numerical algorithm to recover the risk-neutral drift term of the volatility (or variance) process. In contrast to most existing literature, we do not assume that the drift term has any special structure; as such, our algorithm applies to the calibration of general stochastic volatility models. An extensive numerical analysis is presented to demonstrate the efficiency of our approach. Interestingly, our empirical study reveals that the risk-neutral variance processes recovered from market prices of options on the S&P 500 index and the EUR/USD exchange rate are indeed linearly mean-reverting.
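In spirit, Tikhonov regularization replaces an ill-posed fitting problem with a penalized one, min ||Ax − b||² + λ||x||². A minimal linear-algebra sketch — not the paper's PDE-constrained calibration, and with illustrative data — is:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve argmin_x ||A x - b||^2 + lam * ||x||^2 via the
    regularized normal equations (A'A + lam I) x = A'b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# well-conditioned toy problem: a tiny lam leaves the exact solution intact;
# for ill-posed A, lam trades data fit for stability of the recovered x
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true
x_hat = tikhonov_solve(A, b, lam=1e-10)
```

Choosing λ (e.g. by the discrepancy principle or cross-validation) is the practical crux of any such calibration.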

5.
Manzini and Mariotti (2014) define the menu-independent random consideration set rule, where the decision maker considers each alternative with a menu-independent probability known as the attention function. We relax the assumption of menu independence and allow any restriction to be imposed on the attention function. We show that there is an equivalence between the attention function and the hazard rate. This equivalence is used to characterize the menu-dependent random consideration set rules that correspond to (i) specific conditions on the probability rule, and (ii) different stochastic choice models from the literature.

6.
This paper provides a formal justification for the existence of subjective random components intrinsic to the outcome evaluation process of decision makers and explicitly assumed in the stochastic choice literature. We introduce the concepts of admissible error function and generalized certainty equivalent, which allow us to analyze two different criteria, a cardinal and an ordinal one, when defining suitable approximations to expected utility values. Contrary to the standard literature, which requires irrational preferences, adjustment errors arise in a natural way within our setting, their existence following directly from the disconnectedness of the range of the utility functions. Conditions for the existence of minimal errors are also studied. Our results imply that neither the cardinal nor the ordinal criterion necessarily provides the same evaluation for two or more different prospects with the same expected utility value. As a consequence, a rational decision maker may define two different generalized certainty equivalents when presented with the same prospect on two different occasions.

7.
Stochastic dominance techniques have mainly been employed in poverty analyses to overcome what is called the multiplicity-of-poverty-indices problem. Moreover, in the multidimensional context, stochastic dominance techniques capture the possible relationships between the dimensions of poverty because they rely upon their joint distribution, unlike most multidimensional poverty indices, which are based only on marginal distributions. In this paper, we first review the general definition of unidimensional stochastic dominance and its relationship with poverty orderings. Then we focus on the conditions for multivariate stochastic dominance and their relationship with multidimensional poverty orderings, highlighting the additional difficulties that the multivariate setting involves. In both cases, we focus our discussion on first- and second-order dominance, though some guidelines on higher-order dominance are also mentioned. We also present an overview of some relevant empirical applications of these methods found in the literature in both univariate and multivariate contexts.

8.
This paper extends the existing fully parametric Bayesian literature on stochastic volatility to allow for more general return distributions. Instead of specifying a particular distribution for the return innovation, nonparametric Bayesian methods are used to flexibly model the skewness and kurtosis of the distribution while the dynamics of volatility continue to be modeled with a parametric structure. Our semiparametric Bayesian approach provides a full characterization of parametric and distributional uncertainty. A Markov chain Monte Carlo sampling approach to estimation is presented with theoretical and computational issues for simulation from the posterior predictive distributions. An empirical example compares the new model to standard parametric stochastic volatility models.

9.
In this study, a stochastic multi-objective mixed-integer mathematical programming model is proposed for logistic distribution and evacuation planning during an earthquake. Decisions about the pre- and post-disaster phases are treated seamlessly. The pre-disaster decisions concern the location of permanent relief distribution centers and the quantities of commodities to be stored. The post-disaster decisions determine the optimal locations for temporary care centers, which increase the speed of treating injured people, and the distribution of commodities to the affected areas. Humanitarian and cost issues are captured through three objective functions, and several sets of constraints make the model flexible enough to handle real issues. Demands for food, blood, water, blankets, and tents are assumed to be probabilistic; they depend on several complicated factors and are modeled using a network in this study. A simulation is set up to generate the probability distribution of demands across several scenarios, and the resulting stochastic demands serve as inputs to the proposed model. The model is transformed into its deterministic equivalent using a chance-constrained programming approach. The equivalent deterministic model is solved using an efficient epsilon-constraint approach and an evolutionary algorithm, the non-dominated sorting genetic algorithm (NSGA-II). First, several illustrative numerical examples are solved using both solution procedures. Their performance is compared, and the more efficient procedure, NSGA-II, is used to handle a case study of a Tehran earthquake. The results are promising and show that the proposed model and solution approach can handle the real case study efficiently.
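The chance-constraint-to-deterministic transformation has a textbook scalar form: for normally distributed demand, P(demand ≤ stock) ≥ α is equivalent to stock ≥ μ + z_α·σ. A minimal sketch with made-up numbers — not the paper's multi-objective model:

```python
from statistics import NormalDist

def stock_for_service_level(mu, sigma, alpha):
    """Deterministic equivalent of the chance constraint
    P(demand <= stock) >= alpha when demand ~ Normal(mu, sigma):
    the smallest feasible stock is the alpha-quantile of demand."""
    return NormalDist(mu, sigma).inv_cdf(alpha)

# e.g. mean demand 100 units, sd 20, 95% service level (illustrative)
s = stock_for_service_level(100.0, 20.0, 0.95)
```

Each probabilistic demand constraint in such a model can be replaced by a linear constraint of this quantile form, which is what makes the deterministic equivalent tractable.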

10.
This paper presents a comparison of two different models (Land et al. (1993) and Olesen and Petersen (1995)), both designed to extend DEA to the case of stochastic inputs and outputs. The two models constitute two approaches within this area that share certain characteristics. However, the two models behave very differently, and the choice between them can be confusing. This paper presents a systematic attempt to point out differences as well as similarities. It is demonstrated that, under some assumptions, the two models have Lagrangian duals expressed in closed form. Similarities and differences are discussed based on a comparison of these dual structures. Weaknesses of each of the two models are discussed, and a merged model that combines attractive features of each is proposed.
O. B. Olesen

11.
Poverty Orderings
This paper reviews the literature of partial poverty orderings. Partial poverty orderings require unanimous poverty rankings for a class of poverty measures or a set of poverty lines. The need to consider multiple poverty measures and multiple poverty lines arises inevitably from the arbitrariness inherent in poverty comparisons. In the paper, we first survey the ordering conditions of various individual poverty measures for a range of poverty lines; for some measures necessary and sufficient conditions are identified while for others only some easily verifiable sufficient conditions are established. These ordering conditions are shown to have a close link with the stochastic dominance relations which are based on the comparisons of cumulative distribution functions. We then survey the ordering conditions for various classes of poverty measures with a single or a set of poverty lines; in all cases necessary and sufficient conditions are established. These conditions again rely on the stochastic dominance relations or their transformations. We also extend the relationship between poverty orderings and stochastic dominance to higher orders and explore the possibility and the conditions of increasing the power of poverty orderings beyond the second degree dominance condition.
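The link between poverty orderings and stochastic dominance rests on comparing cumulative distribution functions. A minimal empirical first-order dominance check over two income samples (illustrative data, not from any survey used in the literature above) might look like:

```python
import numpy as np

def first_order_dominates(x, y):
    """Check first-order stochastic dominance of sample x over sample y:
    the empirical CDF of x lies weakly below that of y at every point
    (for incomes, a lower CDF means a 'richer' distribution)."""
    x, y = np.sort(x), np.sort(y)
    grid = np.union1d(x, y)
    Fx = np.searchsorted(x, grid, side="right") / len(x)
    Fy = np.searchsorted(y, grid, side="right") / len(y)
    return bool(np.all(Fx <= Fy))

rich = [2, 3, 4, 5]  # every income shifted up by one unit
poor = [1, 2, 3, 4]
```

First-order dominance of `rich` over `poor` implies a unanimous poverty ranking for every monotone poverty measure and every poverty line, which is exactly the robustness these orderings deliver.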

12.
In this paper, we prove the existence of a stationary equilibrium in an intergenerational stochastic game with non-paternalistic altruism as defined by Ray (1987). Our approach is based on the assumption that the transition probabilities are non-atomic. The utility function of each generation has a very general form including many special cases studied in the literature.

13.
Two veins of literature, namely the production risk literature and stochastic frontier analysis, are examined. Both fields are concerned with output variation: the former attributes it to exogenous shocks, the latter to inefficiency. Covering the literature from both fields, this review suggests that the concept of heteroscedasticity can be used to build a synthesis between these largely separate branches of literature. However, the synthetic approach raises the challenge of how to differentiate between the different sources of output variation; this challenge is identified as the main obstacle to meaningfully combining the two approaches.

14.
The paper is concerned with several kinds of stochastic frontier models whose likelihood function is not available in closed form. First, with output-oriented stochastic frontier models whose one-sided errors have a distribution other than the standard ones (exponential or half-normal); the gamma and beta distributions are leading examples. Second, with input-oriented stochastic frontier models, which are common in theoretical discussions but not in econometric applications. Third, with two-tiered stochastic frontier models when the one-sided error components follow gamma distributions. Fourth, with latent class models with gamma distributed one-sided error terms. Fifth, with models whose two-sided error component is distributed as stable Paretian and the one-sided error is gamma. The principal aim is to propose approximations to the density of the composed error based on the inversion of the characteristic function (which turns out to be manageable) using the Fourier transform. Procedures that are based on the asymptotic normal form of the log-likelihood function and have arbitrary degrees of asymptotic efficiency are also proposed, implemented and evaluated in connection with output-oriented stochastic frontiers. The new methods are illustrated using data for US commercial banks, electric utilities, and a sample from the National Youth Longitudinal Survey.
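The inversion idea is the classical Fourier relation f(x) = (1/2π) ∫ e^(−itx) φ(t) dt. A minimal numerical sketch, checked against the standard normal (whose characteristic function is exp(−t²/2)) rather than the paper's composed-error characteristic functions:

```python
import numpy as np

def density_from_cf(cf, xs, t_max=30.0, n=4001):
    """Approximate f(x) = (1/2pi) * integral of exp(-i t x) * cf(t) dt
    by a trapezoid rule on a truncated, symmetric t-grid."""
    t = np.linspace(-t_max, t_max, n)
    dt = t[1] - t[0]
    w = np.ones(n)
    w[0] = w[-1] = 0.5  # trapezoid end-point weights
    return np.array(
        [((np.exp(-1j * t * x) * cf(t) * w).sum() * dt).real / (2 * np.pi)
         for x in xs]
    )

# sanity check target: the standard normal density at 0 and 1
f0, f1 = density_from_cf(lambda t: np.exp(-t * t / 2), [0.0, 1.0])
```

For a composed frontier error, `cf` would be replaced by the product of the two components' characteristic functions; the truncation `t_max` must be chosen so the cf has effectively decayed.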

15.
16.
This paper prices and hedges a quanto floating range accrual note (QFRAN) using an affine term structure model with affine-jump processes. We first generalize the affine transform proposed by Duffie et al. (2000) under both the domestic and foreign risk-neutral measures with a change of measure, which provides a flexible structure for valuing quanto derivatives. We then provide semi-analytic pricing and hedging solutions for QFRAN under a four-factor affine-jump model with stochastic mean, stochastic volatility, and jumps. The numerical results demonstrate that both the common and local factors significantly affect the value and hedging strategy of QFRAN; notably, the stochastic-mean factor plays the most important role in both valuation and hedging. This study suggests that ignoring these factors in a term-structure model will result in significant pricing and hedging errors for QFRAN. In summary, this study provides flexible and easily implementable solutions for valuing quanto derivatives.

17.
Pareto-Koopmans efficiency in Data Envelopment Analysis (DEA) is extended to stochastic inputs and outputs via probabilistic input-output vector comparisons in a given empirical production (possibility) set. In contrast to other approaches which have used Chance Constrained Programming formulations in DEA, the emphasis here is on joint chance constraints. An assumption of arbitrary but known probability distributions leads to the P-Model of chance constrained programming. A necessary condition for a DMU to be stochastically efficient and a sufficient condition for a DMU to be non-stochastically efficient are provided. Deterministic equivalents using the zero order decision rules of chance constrained programming and multivariate normal distributions take the form of an extended version of the additive model of DEA. Contacts are also maintained with all of the other presently available deterministic DEA models in the form of easily identified extensions which can be used to formalize the treatment of efficiency when stochastic elements are present.

18.
Projects, private or public, that share input factors or output requirements are best construed as members of a portfolio. In the presence of risk, the capital asset pricing model may facilitate valuation of each member. Chief results of that model are derived and generalized here as core solutions to a transferable-utility production game. Shadow prices define stochastic discount factors that determine the values of individual projects. Variance aversion largely affects such prices, and hence the optimal allocations.

19.
We develop a model of labor productivity as a combination of the capital-labour ratio, the vintage of the capital stock, regional externalities, and total factor productivity (TFP). The skewness of the TFP distribution is related to different growth theories. While negative skewness is consistent with the neo-Schumpeterian idea of catching up with leaders, zero skewness supports the neoclassical view that deviations from the frontier reflect only idiosyncratic productivity shocks. We argue that positive skewness is consistent with an economy where exogenous technology is combined with non-transferable knowledge accumulated in specific sectors and regions. This argument provides the framework for an empirical model based on stochastic frontier analysis. The model is used to analyse regional and sectoral inequalities in Denmark.
Arnab Bhattacharjee

20.
This paper studies the effect of spatial interdependence among nearby municipalities on the efficiency of public services. An empirical analysis of the waste disposal service in 4250 Italian municipalities was carried out to evaluate the efficiency of waste management expenditure once the impact on efficiency levels of the positive/negative externalities of neighbouring local governments is isolated. From a methodological point of view, our study extends the spatial stochastic frontier methodology proposed in Fusco & Vidoli (2013), which is usable only for production analysis, by admitting spatial autocorrelation among the residuals into a cost frontier. Ignoring spatial autocorrelation leads to inferential problems, violating one of the most important assumptions of classical regression models: the non-correlation of the residuals. We find significant spatial interdependence among neighbouring municipalities in terms of cost efficiency which, thanks to the proposed methodology, has been isolated, allowing a discussion of the specific efficiency of each municipality. These results suggest the need to consider proximity effects in future investigations of the efficiency of waste management and, more generally, of public services.
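A standard first diagnostic for spatial autocorrelation in regression residuals is Moran's I under a spatial weight matrix W. This sketch is generic, with made-up residuals on a toy chain of six units, and is not the estimator of Fusco & Vidoli:

```python
import numpy as np

def morans_i(resid, W):
    """Moran's I for residuals under a spatial weight matrix W.
    Values near zero suggest no spatial autocorrelation; positive values
    mean neighbouring units tend to have similar residuals."""
    z = np.asarray(resid, dtype=float)
    z = z - z.mean()
    n = len(z)
    return float((n / W.sum()) * (z @ W @ z) / (z @ z))

# six municipalities on a line, symmetric 0/1 adjacency (illustrative)
W = np.zeros((6, 6))
for i in range(5):
    W[i, i + 1] = W[i + 1, i] = 1.0

clustered = [1, 1, 1, -1, -1, -1]    # spatially clustered residuals
alternating = [1, -1, 1, -1, 1, -1]  # neighbours systematically differ
```

A strongly positive I on frontier residuals is the kind of symptom that motivates moving to a spatially autocorrelated error structure rather than a classical one.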


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号