Similar Documents
20 similar documents found (search time: 31 ms)
1.
This paper examines the widespread practice in which data envelopment analysis (DEA) efficiency estimates are regressed on environmental variables in a second-stage analysis. In the literature, only two statistical models have been proposed in which second-stage regressions are well-defined and meaningful. In the model considered by Simar and Wilson (J Prod Anal 13:49–78, 2007), truncated regression provides consistent estimation in the second stage, whereas in the model proposed by Banker and Natarajan (Oper Res 56:48–58, 2008a), ordinary least squares (OLS) provides consistent estimation. This paper examines, compares, and contrasts the very different assumptions underlying these two models, and makes clear that second-stage OLS estimation is consistent only under peculiar and unusual assumptions on the data-generating process that limit its applicability. In addition, we show that in either case, bootstrap methods provide the only feasible means for inference in the second stage. We also comment on ad hoc specifications of second-stage regression equations that ignore the part of the data-generating process that yields the data used to obtain the initial DEA estimates.
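A bare-bones sketch of the second-stage setup the abstract discusses, with bootstrap inference for the regression coefficients (the function name and pairs-bootstrap scheme are illustrative, not the paper's procedure; the paper favors Simar–Wilson truncated regression over plain OLS):

```python
import numpy as np

def second_stage_bootstrap(scores, z, B=1000, seed=0):
    """OLS regression of pre-computed efficiency scores on one
    environmental variable z, with a pairs bootstrap for percentile
    confidence intervals on the coefficients (intercept, slope)."""
    rng = np.random.default_rng(seed)
    scores, z = np.asarray(scores, float), np.asarray(z, float)
    Z = np.column_stack([np.ones(len(z)), z])     # design matrix
    beta = np.linalg.lstsq(Z, scores, rcond=None)[0]
    draws = np.empty((B, 2))
    for b in range(B):
        idx = rng.integers(0, len(scores), len(scores))   # resample pairs
        draws[b] = np.linalg.lstsq(Z[idx], scores[idx], rcond=None)[0]
    lo, hi = np.quantile(draws, [0.025, 0.975], axis=0)
    return beta, lo, hi
```

The point estimate is ordinary OLS; only the interval comes from resampling, in line with the abstract's claim that bootstrap methods are the only feasible route to second-stage inference.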

2.
In frontier analysis, most nonparametric approaches (DEA, FDH) are based on envelopment ideas which assume that, with probability one, all observed units belong to the attainable set. In these “deterministic” frontier models, statistical inference is now possible by using bootstrap procedures. In the presence of noise, however, envelopment estimators can deteriorate dramatically, since they are very sensitive to extreme observations that may result only from noise; DEA/FDH techniques then provide estimators with an error of the order of the standard deviation of the noise. This paper adapts recent results on detecting change points [Hall P, Simar L (2002) J Am Stat Assoc 97:523–534] to improve the performance of the classical DEA/FDH estimators in the presence of noise. We show by simulated examples that the procedure works well, and better than the standard DEA/FDH estimators, when the noise is of moderate size in terms of the signal-to-noise ratio. It turns out that the procedure is also robust to outliers. The paper can be seen as a first attempt to formalize stochastic DEA/FDH estimators.
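As a concrete illustration of the envelopment idea (function name and interface are ours, not the paper's): an input-oriented FDH score for each unit is the best proportional input contraction achievable relative to observed units that weakly dominate it on every output.

```python
import numpy as np

def fdh_input_efficiency(X, Y):
    """Input-oriented FDH efficiency scores.

    X: (n, p) inputs, Y: (n, q) outputs.  A score of 1 means the unit
    lies on the estimated (free-disposal-hull) frontier; smaller scores
    indicate how much all inputs could be scaled down."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n = X.shape[0]
    scores = np.empty(n)
    for i in range(n):
        # units that produce at least as much of every output as unit i
        dominators = np.all(Y >= Y[i], axis=1)
        # for each dominator j: max_k X[j,k]/X[i,k]; minimize over j
        ratios = np.max(X[dominators] / X[i], axis=1)
        scores[i] = ratios.min()
    return scores
```

Because each unit dominates itself, every score is at most 1, which makes the estimator's sensitivity to a single extreme (noisy) observation easy to see: one spuriously efficient unit can pull down the scores of all units it dominates.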

3.
The optimality of designs obtained by adding p runs to an orthogonal array is studied for experiments involving m factors, each at s levels. The optimality criterion used here is the Type 1 criterion due to Cheng (1978), which is an extension of Kiefer's (1975) universal optimality criterion. Unlike what happens with orthogonal-array-plus-one-run designs, the behavior of designs obtained by augmenting an orthogonal array with p runs depends on the particular runs added.

4.
Stochastic FDH/DEA estimators for frontier analysis (total citations: 2; self-citations: 2; citations by others: 0)
In this paper we extend the work of Simar (J Prod Anal 28:183–201, 2007) introducing noise in nonparametric frontier models. We develop an approach that synthesizes the best features of the two main methods for estimating production efficiency. Specifically, our approach first allows for statistical noise, similar to stochastic frontier analysis (and even in a more flexible way), and second, it allows modelling multiple-input, multiple-output technologies without imposing parametric assumptions on the production relationship, as is done in nonparametric methods such as Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH). The methodology is based on the theory of local maximum likelihood estimation and extends recent work of Kumbhakar et al. (J Econom 137(1):1–27, 2007) and Park et al. (J Econom 146:185–198, 2008). Our method is suitable for modelling and estimating marginal effects on the inefficiency level jointly with marginal effects of the inputs. The approach is robust to heteroskedasticity and to various (unknown) distributions of statistical noise and inefficiency, despite assuming simple anchorage models. The method also improves DEA/FDH estimators by making them quite robust to statistical noise and especially to outliers, which were the main problems of the original DEA/FDH estimators. The procedure shows great performance in various simulated cases and is also illustrated on some real data sets. Even in the single-output case, our simulated examples show that our stochastic DEA/FDH improves on the Kumbhakar et al. (J Econom 137(1):1–27, 2007) method by making the resulting frontier smoother, monotonic and, if we wish, concave.

5.
The t regression models provide a useful extension of normal regression models for datasets involving errors with longer-than-normal tails. Homogeneity of variances (when they exist) is a standard assumption in t regression models; however, this assumption is not always appropriate. This paper is devoted to tests for heteroscedasticity in general t linear regression models. The asymptotic properties of the score tests, including their asymptotic chi-square distributions and approximate powers under local alternatives, are studied. Based on the modified profile likelihood (Cox and Reid in J R Stat Soc Ser B 49(1):1–39, 1987), an adjusted score test for heteroscedasticity is developed. The properties of the score test and its adjustment are investigated through Monte Carlo simulations. The test methods are illustrated with land rent data (Weisberg in Applied linear regression. Wiley, New York, 1985). The project was supported by NSFC 10671032, China, and a grant (HKBU2030/07P) from the Grants Council of Hong Kong, China.

6.
The process capability index C_pm, which considers the process variance and the departure of the process mean from the target value, is important in the manufacturing industry for measuring process potential and performance. This paper extends its application to the calculation of the process capability index C̃_pm of fuzzy numbers. The α-cuts of the fuzzy observations are first derived for various values of α. The membership function of the fuzzy process capability index C̃_pm is then constructed from these α-cuts. An example is presented to demonstrate how the fuzzy process capability index C̃_pm is interpreted. When the quality characteristic cannot be precisely determined, the proposed method provides the most possible value and spread of the fuzzy process capability index C̃_pm. With crisp data, the proposed method reduces to the classical process capability index C_pm.
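The crisp index is C_pm = (USL − LSL) / (6·√(s² + (x̄ − T)²)). A minimal sketch follows, assuming triangular fuzzy observations and using a Monte Carlo search over the box of observation α-cuts as a rough stand-in for the paper's exact membership-function construction (both helper names are ours):

```python
import numpy as np

def cpm(x, lsl, usl, target):
    """Crisp process capability index C_pm from a sample x."""
    x = np.asarray(x, float)
    tau = np.sqrt(np.var(x, ddof=1) + (np.mean(x) - target) ** 2)
    return (usl - lsl) / (6.0 * tau)

def cpm_alpha_cut(lows, peaks, highs, alpha, lsl, usl, target,
                  draws=2000, seed=0):
    """Approximate alpha-cut of the fuzzy index C~_pm when observation i
    is the triangular fuzzy number (lows[i], peaks[i], highs[i]):
    random search over the box of observation alpha-cut intervals."""
    rng = np.random.default_rng(seed)
    lows, peaks, highs = (np.asarray(a, float) for a in (lows, peaks, highs))
    lo = lows + alpha * (peaks - lows)      # left endpoints of alpha-cuts
    hi = highs - alpha * (highs - peaks)    # right endpoints of alpha-cuts
    vals = [cpm(rng.uniform(lo, hi), lsl, usl, target) for _ in range(draws)]
    vals.append(cpm(peaks, lsl, usl, target))
    return min(vals), max(vals)
```

With crisp data (low = peak = high) the α-cut collapses to a point and the sketch reduces to the classical C_pm, mirroring the abstract's last sentence.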

7.
The link between martingales and arbitrage is well known in financial theory: arbitrage is not available if and only if there exists an equivalent measure such that the discounted prices are martingales with respect to this measure (an equivalent martingale measure, MME). As a consequence, under an MME no previsible (non-anticipative) strategy can yield a secure (riskless) profit. Moreover, a careful reading of a bootstrap proof of the first fundamental theorem of asset pricing (see Schachermayer (1992)) underlines the fact that, if there is no possibility of arbitrage during any unit interval, then no arbitrage is allowed with any finite-horizon strategy. Mathematics Subject Classification (2000): Primary: 60G48; Secondary: 60G40, 60G07. Journal of Economic Literature Classification: C50, C72, D84

8.
We consider the problem of estimating R = P(X < Y), where X and Y have independent exponential distributions with different scale parameters and a common location parameter. Assuming that there is a prior guess or estimate R0, we develop various shrinkage estimators of R that incorporate this prior information. The performance of the new estimators is investigated and compared with the maximum likelihood estimator using Monte Carlo methods. It is found that some of these estimators are very successful in taking advantage of the prior estimate available. Acknowledgments: The authors are grateful to the editor and to the referees for their constructive comments, which resulted in a substantial improvement of the paper.
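A sketch of the setup (the estimator forms and function names are illustrative; the paper's shrinkage estimators are more refined): estimate the common location by the overall sample minimum, the rates from the shifted means, and shrink the plug-in estimate of R toward the prior guess R0.

```python
import numpy as np

def estimate_R(x, y):
    """Plug-in estimate of R = P(X < Y) for shifted exponentials with a
    common location: location ~ overall minimum; rates from shifted means.
    For rates lam, mu:  P(X < Y) = lam / (lam + mu)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    theta = min(x.min(), y.min())            # common location estimate
    lam = 1.0 / np.mean(x - theta)
    mu = 1.0 / np.mean(y - theta)
    return lam / (lam + mu)

def shrinkage_R(x, y, R0, w=0.5):
    """Simple convex-combination shrinkage toward the prior guess R0."""
    return w * R0 + (1.0 - w) * estimate_R(x, y)
```

When the prior guess R0 is close to the truth, pulling the data-based estimate toward it reduces variance at little cost in bias, which is the effect the abstract's Monte Carlo study quantifies.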

9.
Subsampling and the m out of n bootstrap have been suggested in the literature as methods for carrying out inference based on post-model-selection estimators and shrinkage estimators. In this paper we consider a subsampling confidence interval (CI) that is based on an estimator that can be viewed either as a post-model-selection estimator that employs a consistent model selection procedure or as a super-efficient estimator. We show that the subsampling CI (of nominal level 1 − α for any α ∈ (0,1)) has asymptotic confidence size (defined to be the limit of finite-sample size) equal to zero in a very simple regular model. The same result holds for the m out of n bootstrap provided m²/n → 0 and the observations are i.i.d. Similar zero-asymptotic-confidence-size results hold in more complicated models that are covered by the general results given in the paper, and for super-efficient and shrinkage estimators that are not post-model-selection estimators. Based on these results, subsampling and the m out of n bootstrap are not recommended for obtaining inference based on post-consistent-model-selection or shrinkage estimators.
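For reference, the generic subsampling CI that the abstract critiques looks like this for a simple smooth estimator (a sketch with our own names; the paper's negative result concerns non-regular, post-selection settings, not this benign case): approximate the law of √n(θ̂ − θ) by √m(θ̂_m − θ̂) over size-m subsamples drawn without replacement.

```python
import numpy as np

def subsampling_ci(x, estimator, m, alpha=0.05, n_sub=1000, seed=0):
    """Subsampling confidence interval for estimator(x):
    the roots sqrt(m)*(theta_hat_m - theta_hat) over subsamples of size m
    stand in for the unknown law of sqrt(n)*(theta_hat - theta)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    n = len(x)
    th = estimator(x)
    roots = np.array([
        np.sqrt(m) * (estimator(rng.choice(x, m, replace=False)) - th)
        for _ in range(n_sub)])
    q_lo, q_hi = np.quantile(roots, [alpha / 2, 1 - alpha / 2])
    return th - q_hi / np.sqrt(n), th - q_lo / np.sqrt(n)
```

The abstract's point is that this recipe, sound for regular estimators like the mean, has asymptotic confidence size zero when the estimator involves consistent model selection or shrinkage.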

10.
Understanding the effects of operational conditions and practices on productive efficiency can provide valuable economic and managerial insights. The conventional approach is a two-stage method in which the efficiency estimates are regressed on contextual variables representing the operational conditions. The main problem of the two-stage approach is that it ignores the correlations between the inputs and the contextual variables. To address this shortcoming, we build on the recently developed regression interpretation of data envelopment analysis (DEA) to develop a new one-stage semi-nonparametric estimator that combines a nonparametric DEA-style frontier with a regression model of the contextual variables. The new method is referred to as stochastic semi-nonparametric envelopment of z variables data (StoNEZD). The StoNEZD estimator for the contextual variables is shown to be statistically consistent under less restrictive assumptions than those required by the two-stage DEA estimator. Further, the StoNEZD estimator is shown to be unbiased, asymptotically efficient, asymptotically normally distributed, and to converge at the standard parametric rate of order n^(−1/2). Therefore, the conventional methods of statistical testing and confidence intervals apply for asymptotic inference. Finite-sample performance of the proposed estimators is examined through Monte Carlo simulations.

11.
Statistical Inference in Nonparametric Frontier Models: The State of the Art (total citations: 14; self-citations: 8; citations by others: 6)
Efficiency scores of firms are measured by their distance to an estimated production frontier. The economic literature proposes several nonparametric frontier estimators based on the idea of enveloping the data (FDH and DEA-type estimators). Many have claimed that FDH and DEA techniques are non-statistical, as opposed to econometric approaches where particular parametric expressions are posited to model the frontier. We can now define a statistical model allowing determination of the statistical properties of the nonparametric estimators in the multi-output and multi-input case. New results provide the asymptotic sampling distribution of the FDH estimator in a multivariate setting and of the DEA estimator in the bivariate case. Sampling distributions may also be approximated by bootstrap distributions in very general situations. Consequently, statistical inference based on DEA/FDH-type estimators is now possible. These techniques allow correction for the bias of the efficiency estimators and estimation of confidence intervals for the efficiency measures. This paper summarizes the results which are now available, and provides a brief guide to the existing literature. Emphasizing the role of hypotheses and inference, we show how the results can be used or adapted for practical purposes.

12.
Top-k-lists are introduced as sequences of k-dimensional random vectors whose ordered components are the k largest observations from a sequence of independent identically distributed random variables. Such lists changing in time are natural stochastic models of the ranking tables which appear in many real-life situations where one wants to keep track of the several best results in a given field. Here we study basic properties of top-k-lists: joint distributions, conditional structures, representations, examples of top-k-lists from exponential and uniform distributions, asymptotics, and a relation to generalized order statistics.
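The evolving ranking table is easy to simulate (a sketch; the `top_k_lists` helper is ours, not from the paper): keep a size-k min-heap of the current record holders and snapshot it, sorted decreasingly, after each arrival.

```python
import heapq

def top_k_lists(stream, k):
    """Successive top-k lists (the k largest observations so far, in
    decreasing order) as the stream is read one value at a time."""
    heap, tables = [], []
    for v in stream:
        if len(heap) < k:
            heapq.heappush(heap, v)          # table not yet full
        elif v > heap[0]:
            heapq.heapreplace(heap, v)       # v displaces the k-th best
        tables.append(sorted(heap, reverse=True))
    return tables
```

Each arrival either leaves the table unchanged or displaces the current k-th best value, which is exactly the update dynamic of the ranking tables the abstract models.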

13.
In reliability studies, k-out-of-n systems play an important role. In this paper, we consider sharp bounds for the mean residual life function of a k-out-of-n system consisting of n identical components with independent lifetimes having a common distribution function F, measured in location and scale units of the residual life random variable X_t = (X − t | X > t). We characterize the probability distributions for which the bounds are attained. We also evaluate the resulting bounds numerically for various choices of k and n.
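For intuition, a Monte Carlo sketch (our own function names, not the paper's analytic bounds): a k-out-of-n system works while at least k of its n components work, so its lifetime is the (n − k + 1)-th smallest component lifetime, and its mean residual life at t is E[T − t | T > t].

```python
import numpy as np

def mrl_k_out_of_n(k, n, t, sampler, reps=200_000, seed=0):
    """Monte Carlo estimate of the mean residual life E[T - t | T > t] of
    a k-out-of-n system with i.i.d. component lifetimes drawn by
    `sampler(rng, shape)`."""
    rng = np.random.default_rng(seed)
    comp = sampler(rng, (reps, n))
    # system fails when the (n - k + 1)-th component failure occurs
    T = np.sort(comp, axis=1)[:, n - k]
    alive = T > t
    return float(np.mean(T[alive] - t))
```

A quick sanity check is the memoryless case: a 1-out-of-1 system with a unit-rate exponential component has mean residual life 1 at every t.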

14.
Supersaturated designs (SSDs) constitute an important class of fractional factorial designs that can be extremely useful in factor-screening experiments. Most existing studies have focused on balanced designs. This paper provides a new lower bound for the E(f_NOD)-optimality measure of SSDs with general run sizes. This bound generalizes existing bounds, since it is applicable to both balanced and unbalanced designs. Optimal multi- and mixed-level, balanced and nearly balanced SSDs are constructed by applying a k-circulant-type methodology. Necessary and sufficient conditions on the generator vectors are introduced in order to pre-ensure the optimality of the constructed k-circulant SSDs. The provided lower bounds are used to measure the efficiency of the generated designs. The presented methodology leads to a number of new families of improved SSDs, providing tools for directly constructing optimal or nearly optimal k-circulant designs by just checking the corresponding generator vector.

15.
Tashiro (Ann Inst Stat Math 29:295–300, 1977) studied methods for generating uniform points on the surface of the regular unit sphere. The L_p-norm unit sphere is a generalization of the regular unit sphere. In this paper we propose a method, together with an algorithm, for generating uniformly scattered points on the L_p-norm unit sphere, and discuss its applications in statistical simulation, representative points of a wide class of multivariate probability distributions, and optimization problems. Some examples are given to illustrate these applications. This research was supported by a University of Hong Kong Research Grant and University of New Haven 2005 and 2006 Summer Faculty Fellowships.
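One known route, sketched here under the assumption that "uniform" refers to the cone measure on the sphere (the paper's own algorithm may differ): draw components from the p-generalized normal density ∝ exp(−|t|^p), whose magnitudes satisfy |g|^p ~ Gamma(1/p, 1), and normalize by the L_p norm.

```python
import numpy as np

def uniform_lp_sphere(n, d, p, seed=0):
    """n points on the unit L_p sphere in R^d via the p-generalized
    normal: |g_i|^p ~ Gamma(1/p, 1) with random signs, then divide by
    the L_p norm of each row."""
    rng = np.random.default_rng(seed)
    g = rng.gamma(1.0 / p, 1.0, (n, d)) ** (1.0 / p)   # magnitudes
    g *= rng.choice([-1.0, 1.0], size=(n, d))          # random signs
    norms = np.sum(np.abs(g) ** p, axis=1) ** (1.0 / p)
    return g / norms[:, None]
```

For p = 2 this reduces to the classical normalize-a-Gaussian construction, recovering the regular unit sphere as a special case.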

16.
Supersaturated designs are an important class of factorial designs in which the number of factors is larger than the number of runs. These designs supply an economical method to perform and analyze industrial experiments. In this paper, we consider generalized Legendre pairs and their corresponding matrices to construct E(s²)-optimal two-level supersaturated designs suitable for screening experiments. We also provide general theorems which supply several infinite families of E(s²)-optimal two-level supersaturated designs of various sizes.
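The criterion itself is simple to compute (a sketch; the helper name is ours): for a ±1 design matrix, E(s²) averages the squared inner products s_ij over all pairs of distinct columns, so orthogonal columns contribute zero.

```python
import numpy as np
from itertools import combinations

def e_s2(D):
    """E(s^2) criterion of a two-level (+/-1) design matrix: the average
    squared inner product over all pairs of distinct columns."""
    D = np.asarray(D, float)
    S = D.T @ D                      # s_ij sit in the off-diagonal entries
    m = D.shape[1]
    pairs = list(combinations(range(m), 2))
    return sum(S[i, j] ** 2 for i, j in pairs) / len(pairs)
```

In a supersaturated design E(s²) cannot reach zero (there are more columns than runs, so full orthogonality is impossible), which is why the literature seeks designs attaining the known lower bounds.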

17.
Since the landmark decision in Burlington Northern & Santa Fe Railway Co. v. White, numerous federal district courts and circuit courts of appeals in the United States have considered employees’ retaliation claims. This paper reviews several post-Burlington cases and provides employers with a roadmap as to what has been held to be retaliation under the law and what has not. Our contribution is an up-to-date analysis of retaliation cases, on a specific employment action basis, to provide guidance to employers of the types of activity that could support a claim of retaliation and to employees to alert them as to the types of activity that they should not have to endure in the workplace.

18.
In continuous time, we study a financial market which is free of arbitrage opportunities but incomplete under the physical probability measure P. Thus one has several choices of equivalent martingale measures. In the present paper, we study the (unique) martingale measure P* defined by the concept of the numeraire portfolio. The choice of P* can be justified by a change of numeraire in place of a change of measure. Mathematics Subject Classification (2000): 90A09, 91B28, 91B62, 93E20, 62P05. Journal of Economic Literature Classification: G10, G12, G13

19.
Mike Jacroux, Metrika (2007) 65(2):235–242
Two-level regular fractional factorial designs are often used in industry as screening designs to help identify, early in an experimental process, those experimental or system variables which have significant effects on the process being studied. In a recent paper, Li and Lin (2003) suggested a strategy for constructing optimal follow-up designs using the well-known foldover technique and the minimum aberration criterion. In this paper, we extend the results of Li and Lin (2003) by giving an alternative technique for constructing optimal follow-up designs, using the foldover technique in conjunction with the maximal rank–minimum aberration criterion suggested in Jacroux (2003).
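The basic operation is mechanical (a sketch; the optimality criteria in the paper govern *which* columns to reverse): a foldover appends the original runs with signs reversed in a chosen subset of columns, and the full foldover reverses every column.

```python
import numpy as np

def foldover(D, cols=None):
    """Foldover of a two-level (+/-1) design: append the runs with signs
    reversed in `cols` (all columns by default, the full foldover)."""
    D = np.asarray(D)
    F = D.copy()
    F[:, cols if cols is not None else slice(None)] *= -1
    return np.vstack([D, F])
```

The combined design has twice the runs; under the full foldover every column is balanced, which is what de-aliases main effects from two-factor interactions in a resolution-III fraction.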

20.
The nineteenth century was a century of reform for the Ottoman state. The Tanzimat reforms hold a unique place in the Ottoman history of modernization. During the Tanzimat period (1839–1878), the state underwent a restructuring process in almost all of its institutions in order to establish a centralized modern state, and many new institutions were founded. The Ottomans paid special attention to education in order to train the new generation required for the continuity of modernization and the centralized bureaucratic structure. While they opened modern high schools and higher-education institutions, they also attempted to reform the existing sıbyan schools, which were the primary education institutions. This process of restructuring education was also carried out in Cyprus, which had been an Ottoman island since 1571. These attempts remained restricted to efforts to increase the number of sıbyan schools in Cyprus; the religious education given at the schools was never replaced with a secular program, curriculum or modern education system based on educational management. This situation also adversely affected the quality of education at the Rüştiye School, the first and only modern secondary school of the period, which was opened in Nicosia in 1864.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号