Similar Documents
20 similar documents found.
1.
In data envelopment analysis (DEA), there are two principal methods for identifying and measuring congestion: those of Färe et al. [Färe R, Grosskopf S. When can slacks be used to identify congestion? An answer to W. W. Cooper, L. Seiford, J. Zhu. Socio-Economic Planning Sciences 2001;35:1–10] and Cooper et al. [Cooper WW, Deng H, Huang ZM, Li SX. A one-model approach to congestion in data envelopment analysis. Socio-Economic Planning Sciences 2002;36:231–8]. In the present paper, we focus on the latter work, proposing a new method that requires considerably less computation. We then prove a theorem showing that our proposed methodology is equivalent to that of Cooper et al.

2.
Stochastic FDH/DEA estimators for frontier analysis
In this paper we extend the work of Simar (J Prod Anal 28:183–201, 2007) introducing noise in nonparametric frontier models. We develop an approach that synthesizes the best features of the two main methods for estimating production efficiency. Specifically, our approach first allows for statistical noise, as in stochastic frontier analysis (in an even more flexible way), and second, it allows modelling multiple-input-multiple-output technologies without imposing parametric assumptions on the production relationship, as in nonparametric methods such as data envelopment analysis (DEA) and free disposal hull (FDH). The methodology is based on the theory of local maximum likelihood estimation and extends recent work of Kumbhakar et al. (J Econom 137(1):1–27, 2007) and Park et al. (J Econom 146:185–198, 2008). Our method is suitable for modelling and estimating marginal effects on the inefficiency level jointly with marginal effects of inputs. The approach is robust to heteroskedasticity and to various (unknown) distributions of statistical noise and inefficiency, despite assuming simple anchorage models. The method also improves DEA/FDH estimators by making them quite robust to statistical noise and especially to outliers, the main weaknesses of the original DEA/FDH estimators. The procedure performs well in various simulated cases and is also illustrated on several real data sets. Even in the single-output case, our simulated examples show that our stochastic DEA/FDH improves on the Kumbhakar et al. (J Econom 137(1):1–27, 2007) method by making the resulting frontier smoother, monotonic and, if desired, concave.
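The FDH estimator that this entry builds on is easy to state in the single-input, single-output case. The following is a minimal sketch with hypothetical data (output-oriented, free disposability assumed), not the authors' stochastic method: it computes the factor by which a unit could expand its output while remaining dominated by some observed unit.

```python
def fdh_output_efficiency(x0, y0, data):
    """Output-oriented FDH efficiency score (single input, single output).

    data: list of (input, output) observations. Returns theta >= 1, the
    largest factor by which y0 could be scaled up while the point stays
    dominated by an observed unit using no more input than x0.
    """
    # Under free disposability, any unit with x <= x0 is a feasible comparator.
    feasible = [y for x, y in data if x <= x0]
    if not feasible:
        return 1.0  # no comparator dominates: the unit is on the FDH frontier
    return max(max(feasible) / y0, 1.0)

obs = [(2.0, 4.0), (3.0, 9.0), (5.0, 10.0)]
# Unit (3, 6): the best comparator with x <= 3 produces y = 9, so theta = 1.5.
print(fdh_output_efficiency(3.0, 6.0, obs))  # 1.5
```

A score of 1.0 marks an FDH-efficient unit; scores above 1.0 measure the proportional output shortfall relative to the best dominating observation.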

3.
Baumol, Panzar, and Willig [1982] introduced the idea that there may be an efficiency gain from the merger of two or more firms. This paper shows that the Debreu [1951]–Farrell [1957] type index suggested by Färe [1986] to calculate the gain in efficiency from a merger of firms can be related to a concentration formula. We then introduce a dominance relation consistent with the ordering generated by the Färe gain function. Next, it is shown that a normalized form of the gain function satisfying a homogeneity condition becomes a subclass of the Hannah–Kay [1977] concentration indices. A limiting case of this subclass is the entropy-based index of concentration. The editor of this paper was Rolf Färe. I wish to express sincere gratitude to Shmuel Nitzan for bringing my attention to this problem. Comments and suggestions from Rolf Färe, Alexis Jacquemin, and two anonymous referees are acknowledged with thanks.

4.
Junming Liu, Kaoru Tone. Socio 2008;42(2):75–91
When measuring technical efficiency with existing data envelopment analysis (DEA) techniques, mean efficiency scores generally exhibit volatile patterns over time. This appears to be at odds with the general perception of learning-by-doing management, due to Arrow [The economic implications of learning by doing. Review of Economic Studies 1962;29:155–73]. This phenomenon is largely attributable to the fundamental assumption of deterministic data maintained in DEA models, and to the difficulty such models have in incorporating environmental influences. This paper proposes a three-stage method to measure DEA efficiency while controlling for the impacts of both statistical noise and environmental factors. Using panel data on Japanese banking over the period 1997–2001, we demonstrate that the proposed approach greatly mitigates these weaknesses of DEA models. We find a stable upward trend in mean measured efficiency, indicating that, on average, the bankers were learning over the sample period. We therefore conclude that this new method is a significant improvement over the DEA models currently used by researchers, corporate management, and industrial regulatory bodies to evaluate performance.

5.
Organized Industrial Zones (OIZs) are production areas for goods and services established to provide planned industrialization and urbanization by siting industry in suitable areas, to prevent environmental problems, and to promote efficient use of resources. The aim of this study is to propose a decision-making model based on data envelopment analysis (DEA) to evaluate the relative efficiencies of OIZs located in the Eastern Black Sea Region of Turkey. First, the efficiency score of each alternative is determined by employing a DEA formulation with interval data. Second, a common-weight DEA-based formulation is applied to obtain a common set of weights and provide rankings with improved discriminating power under imprecise data. The best-performing OIZ alternative in the Eastern Black Sea region of Turkey is identified based on the approach proposed by Salahi et al. [1], whose DEA-based model is improved here by modifying a constraint. To demonstrate the robustness of the application, two numerical illustrations are given: the first compares the results obtained by the formulation of Salahi et al. [1] with those of the improved model, while the second provides a case study conducted in the Eastern Black Sea region of Turkey.

6.
Data envelopment analysis (DEA) is extended to the case of stochastic inputs and outputs through the use of chance-constrained programming. The chance-constrained frontier envelops a given set of observations ‘most of the time’. As an empirical illustration, we re-examine the pioneering 1981 study of Program Follow Through by Charnes, Cooper and Rhodes.

7.
This paper proposes a logistics model for delivery of prioritized items in disaster relief operations. It considers multi-items, multi-vehicles, multi-periods, soft time windows, and a split delivery strategy scenario, and is formulated as a multi-objective integer programming model. To solve this model effectively we limit the number of available tours, introducing two heuristic approaches for this purpose. The first approach is based on a genetic algorithm, while the second is developed by decomposing the original problem. We compare these two approaches via a computational study. The multi-objective problem is converted to a single-objective problem by the weighted sum method. A case study is presented to illustrate the potential applicability of our model. We also compare our model with that proposed in a recent paper by Balcik et al. [6]. The results show that our proposed model outperforms theirs in terms of delivering prioritized items over several time periods.

8.
In this paper, we implement the conditional difference asymmetry model (CDAS) for square tables with nominal categories proposed by Tomizawa et al. (J. Appl. Stat. 31(3):271–277, 2004) using the non-standard log-linear model formulation approach. The implementation is carried out by refitting the model to the 3 × 3 table in Tomizawa et al. (J. Appl. Stat. 31(3):271–277, 2004). We extend this approach to a larger 4 × 4 table of religious affiliation. We further calculate the measure of asymmetry along with its asymptotic standard error and confidence bounds. The procedure is implemented with SAS PROC GENMOD but can also be implemented in SPSS by following the discussion in Lawal (J. Appl. Stat. 31(3):279–303, 2004; Qual. Quant. 38(3):259–289, 2004).

9.
The question of compositional effects (that is, the effect of collective properties of a pupil body on its individual members), or Aggregated Group-Level Effects (AGLEs) as the author prefers to call them, has been the subject of considerable controversy. Some authors, e.g. Rutter et al. [Fifteen Thousand Hours: Secondary Schools and Their Effects on Children. London: Open Books], Willms [Oxford Review of Education 11(1): 33–41; American Sociological Review 1986, 51, 224–241] and Bondi [British Educational Research Journal, 17(3), 203–218], have claimed to find such effects, while Mortimore et al. [School Matters: The Junior Years. Wells: Open Books] and Thomas and Mortimore [Oxford Review of Education 16(2): 137–158] did not. Others, for example Hauser [1970], have implied that many apparent AGLEs may be spurious, while Gray et al. [Review of Research in Education, 8, 158–193] have suggested that, at least in certain circumstances, such apparent effects may arise from inadequate allowance for pre-existing differences. A possible statistical mechanism for this is outlined in the work of Burstein [In R. Dreeben & J. A. Thomas (Eds.), The Analysis of Educational Productivity. Volume 1: Issues in Microanalysis. Cambridge, MA: Ballinger, pp. 119–190] on the effect of aggregating the data when a variable is omitted from the model used. This paper suggests another way in which spurious AGLEs can arise. It shows mathematically that even if there are no omitted variables, measurement error in an explanatory variable can give rise to apparent, but spurious, AGLEs when analysed using a multilevel modelling procedure. Using simulation methods, it investigates the likely practical effects and shows that statistically significant spurious effects occur systematically under fairly standard conditions.

10.
It is well known that the naive bootstrap yields inconsistent inference in the context of data envelopment analysis (DEA) or free disposal hull (FDH) estimators in nonparametric frontier models. For inference about the efficiency of a single, fixed point, drawing bootstrap pseudo-samples of size m < n provides consistent inference, although coverages are quite sensitive to the choice of subsample size m. We provide a probabilistic framework in which these methods are shown to be valid for statistics comprised of functions of DEA or FDH estimators. We examine a simple, data-based rule for selecting m suggested by Politis et al. (Stat Sin 11:1105–1124, 2001), and provide Monte Carlo evidence on the size and power of our tests. Our methods (i) allow for heterogeneity in the inefficiency process, (ii) unlike previous methods, do not require multivariate kernel smoothing, and (iii) avoid the need to solve intermediate linear programs.
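The m-out-of-n idea can be illustrated with a toy subsampling bootstrap for a single-input, single-output FDH score. This is a simplified sketch on hypothetical data: pseudo-samples of size m < n are drawn with replacement and a plain percentile interval is reported, without the rate correction a rigorous treatment would apply.

```python
import random

def fdh_eff(x0, y0, sample):
    """Output-oriented FDH efficiency (single input, single output)."""
    feasible = [y for x, y in sample if x <= x0]
    return max(max(feasible, default=y0) / y0, 1.0)

def subsample_ci(x0, y0, data, m, reps=2000, alpha=0.05, seed=0):
    """Naive m-out-of-n bootstrap percentile interval for an FDH score.

    Pseudo-samples of size m are drawn with replacement from the n
    observations; the interval is formed from the empirical quantiles
    of the re-estimated scores.
    """
    rng = random.Random(seed)
    stats = sorted(
        fdh_eff(x0, y0, [rng.choice(data) for _ in range(m)])
        for _ in range(reps)
    )
    lo = stats[int(reps * alpha / 2)]
    hi = stats[int(reps * (1 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical frontier y = 2 * sqrt(x), observed at 40 input levels.
data = [(float(i), 2.0 * float(i) ** 0.5) for i in range(1, 41)]
lo, hi = subsample_ci(16.0, 6.0, data, m=20)
```

Because each pseudo-sample of size m < n tends to omit the most extreme dominating units, the spread of the re-estimated scores is informative about the sampling variability of the (biased, boundary) FDH estimator; coverage is sensitive to m, as the abstract notes.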

11.
Nonparametric methods for measuring productivity indexes based on bounds for the underlying production technology are presented. Following Banker and Maindiratta, the lower bound is obtained from a primal approach while the upper bound corresponds to a dual approach to nonparametric production analysis. These nonparametric bounds are then used to estimate input-based and output-based distance functions. These radial measures provide the basis for measuring productivity indexes. Application to time series data on U.S. agriculture indicates a large gap between the primal lower bound and the dual upper bound. This generates striking differences between the primal and dual nonparametric productivity indexes. The authors are, respectively, professor and associate professor of Agricultural Economics, University of Wisconsin-Madison. Seniority of authorship is equally shared. We would like to thank Rolf Färe and an anonymous reviewer for useful comments on an earlier draft of the paper. This research was supported in part by a Hatch grant from the College of Agriculture and Life Sciences, University of Wisconsin-Madison.

12.
This paper investigates productivity growth, technical progress, and efficiency change for a group of the 56 largest CPA firms in the US from the period 1996–1999 through the period 2003–2006, where the former preceded, and the latter followed, enactment of the Sarbanes–Oxley Act (SOX). Data envelopment analysis (DEA) is used to calculate Malmquist indices of three measures of interest: productivity growth, technical progress, and efficiency change. Results indicate that CPA firms, on average, experienced productivity growth of approximately 17% from the pre- to the post-SOX period. Consistent with the finding of Banker et al. [Banker RD, Chang H, Natarajan R. Productivity change, technical progress and relative efficiency change in the public accounting industry. Management Science 2005;51:291–304], this productivity gain can be attributed primarily to technical progress rather than a change in relative efficiency. In addition, results indicate that the “Big 4” firms underperformed their non-Big 4 counterparts in both productivity growth and technical progress.
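A Malmquist index of the kind used here can be computed directly in the one-input, one-output constant-returns case, where the output distance function reduces to a productivity ratio. The sketch below uses hypothetical data and is not the paper's multi-dimensional calculation; it also shows the standard decomposition into efficiency change and technical change.

```python
from math import sqrt

def crs_distance(x, y, tech):
    """Output distance function under CRS with one input and one output:
    the unit's productivity y/x divided by the best observed productivity."""
    best = max(yy / xx for xx, yy in tech)
    return (y / x) / best

def malmquist(unit_t0, unit_t1, tech_t0, tech_t1):
    """Malmquist index with its efficiency-change / technical-change parts."""
    d00 = crs_distance(*unit_t0, tech_t0)  # period-0 unit vs period-0 frontier
    d01 = crs_distance(*unit_t1, tech_t0)  # period-1 unit vs period-0 frontier
    d10 = crs_distance(*unit_t0, tech_t1)  # period-0 unit vs period-1 frontier
    d11 = crs_distance(*unit_t1, tech_t1)  # period-1 unit vs period-1 frontier
    m = sqrt((d01 / d00) * (d11 / d10))    # geometric mean of the two ratios
    ec = d11 / d00                         # catching up to each period's frontier
    tc = m / ec                            # shift of the frontier itself
    return m, ec, tc

tech0 = [(1.0, 1.0), (2.0, 2.0)]   # best productivity ratio 1.0 in period 0
tech1 = [(1.0, 1.5), (2.0, 3.0)]   # frontier shifts up to ratio 1.5
m, ec, tc = malmquist((2.0, 1.0), (2.0, 1.8), tech0, tech1)
```

Here the unit's productivity rises from 0.5 to 0.9 while the frontier rises by 50%, so the index decomposes into efficiency change 1.2 and technical change 1.5, with overall Malmquist index 1.8 (values above 1 indicate improvement).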

13.
Quanling Wei, Hong Yan. Socio 2009;43(1):40–54
Our earlier work [Wei QL, Yan H. Congestion and returns to scale in data envelopment analysis. European Journal of Operational Research 2004;153:641–60] discussed necessary and sufficient conditions for the existence of congestion together with aspects of returns to scale under an output-oriented DEA framework. Building on that work, the current paper investigates the issue of “weak congestion”, wherein congestion occurs when the reduction of selected inputs causes some, rather than all, outputs to increase, without worsening any others. We define output efficiency for decision making units under a series of typical DEA output additive models. Based on this definition, we offer necessary and sufficient conditions for the existence of weak congestion. Numerical examples are provided for purposes of illustration.

14.
Theory and Application of Directional Distance Functions
In 1957 Farrell demonstrated how cost inefficiency could be decomposed into two mutually exclusive and exhaustive components: technical and allocative inefficiency. This result is a consequence of the fact that, as shown by Shephard, the cost function and the input distance function (the reciprocal of Farrell's technical efficiency measure) are dual to each other. Similarly, the revenue function and the output distance function are dual, providing the basis for the decomposition of revenue inefficiency into technical and allocative components (see, for example, Färe, Grosskopf and Lovell (1994)). Here we extend those results to include the directional distance function and its dual, the profit function. This provides the basis for defining and decomposing profit efficiency. As we show, the output and input distance functions (reciprocals of Farrell efficiency measures) are special cases of the directional distance function. We also show how to use the directional distance function as a tool for measuring capacity utilization using DEA-type techniques.
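On a free-disposal (FDH) technology with one input and one output, the directional distance function has a closed form; the usual DEA version instead solves a linear program over a convex hull. The following is a simplified sketch with hypothetical data, assuming a strictly positive direction vector (gx, gy).

```python
def directional_distance(x0, y0, gx, gy, tech):
    """Directional distance function on a free-disposal (FDH) technology,
    single input and single output: the largest beta such that the point
    (x0 - beta*gx, y0 + beta*gy) is dominated by some observed unit.
    Requires gx > 0 and gy > 0."""
    # Each observed unit (x, y) permits beta up to the tighter of the
    # input-contraction and output-expansion margins it dominates.
    return max(min((x0 - x) / gx, (y - y0) / gy) for x, y in tech)

tech = [(2.0, 4.0), (4.0, 8.0), (6.0, 9.0)]
# Unit (5, 5) with direction g = (1, 1): moving to (4, 6) stays dominated
# by the observed unit (4, 8), so beta = 1; a frontier unit gets beta = 0.
beta = directional_distance(5.0, 5.0, 1.0, 1.0, tech)
```

The abstract's duality point shows up here as a special case: taking gx → 0 recovers an output-oriented (Farrell-type) expansion, and gy → 0 an input-oriented contraction.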

15.
Congestion is an economic phenomenon of overinvestment that occurs when excessive inputs decrease the maximum possible outputs. Although decision-makers are unlikely to decrease outputs by increasing inputs, congestion is widespread in reality. Identifying and measuring congestion can help decision-makers detect the problem of overinvestment. This paper reviews the development of the concept of congestion in the framework of data envelopment analysis (DEA), which is a widely accepted method for identifying and measuring congestion. Six main congestion identification and measurement methods are analysed through several numerical examples. We investigate the ideas behind these methods, their contributions relative to earlier methods, and their remaining shortcomings. Based on our analysis, we conclude that existing congestion identification and measurement methods are still inadequate. Three problems are anticipated for further study: maintaining consistency between congestion and overinvestment, considering the joint weak disposability assumption between desirable and undesirable outputs, and quantifying the degree of congestion.

16.
In a production technology, the type of returns to scale (RTS) associated with an efficient decision making unit (DMU) is indicative of the direction of marginal rescaling that the DMU should undertake in order to improve its productivity. In this paper a concept of global returns to scale (GRS) is developed as an indicator of the direction in which the most productive scale size (MPSS) of an efficient DMU is achieved. The GRS classes are useful in assisting strategic decisions like those involving mergers of units or splitting into smaller firms. The two characterisations, RTS and GRS, are the same in a convex technology but generally different in a non-convex one. It is shown that, in a non-convex technology, the well-known method of testing RTS proposed by Färe et al. is in fact testing for GRS and not RTS. Further, while there are three types of RTS: constant, decreasing and increasing (CRS, DRS and IRS, respectively), the classification according to GRS includes the fourth type of sub-constant GRS, which describes a DMU able to achieve its MPSS by both reducing and increasing the scale of operations. The notion of GRS is applicable to a wide range of technologies, including the free disposal hull (FDH) and all polyhedral technologies used in data envelopment analysis (DEA).

17.
This paper examines the technical efficiency of US Federal Reserve check processing offices over 1980–2003. We extend results from Park et al. [Park, B., Simar, L., Weiner, C., 2000. FDH efficiency scores from a stochastic point of view. Econometric Theory 16, 855–877] and Daouia and Simar [Daouia, A., Simar, L., 2007. Nonparametric efficiency analysis: a multivariate conditional quantile approach. Journal of Econometrics 140, 375–400] to develop an unconditional, hyperbolic, α-quantile estimator of efficiency. Our new estimator is fully non-parametric and robust with respect to outliers; when used to estimate distance to quantiles lying close to the full frontier, it is strongly consistent and converges at rate root-n, thus avoiding the curse of dimensionality that plagues data envelopment analysis (DEA) estimators. Our methods could be used by policymakers to compare inefficiency levels across offices or by managers of individual offices to identify peer offices.

18.
This paper develops a new methodology to measure and decompose global DMU efficiency into efficiencies of inputs (or outputs). The basic idea rests on the fact that a global DMU efficiency score may be misleading when managers proceed to reallocate their inputs or redefine their outputs. The literature provides a basic measure for the global DMU efficiency score. A revised model was developed for measuring the efficiencies of global DMUs and their input (or output) efficiency components, based on a hypothesis of virtual DMUs. The present paper suggests a method for measuring global DMU efficiency simultaneously with the efficiencies of its input components, which we call the input decomposition DEA model (ID-DEA), and the efficiencies of its output components, which we call the output decomposition DEA model (OD-DEA). These twin models differ from the super-efficiency model (SE-DEA) and the common set of weights model (CSW-DEA). The twin models (ID-DEA, OD-DEA) were applied to agricultural farms; the results gave different efficiency scores for individual inputs (or outputs), while the global DMU efficiency score was given by the Charnes, Cooper and Rhodes (CCR) model (Charnes et al., 1978) [1]. The rationale for our new hypothesis and model is that managers do not have the same level of information about all inputs and outputs, which constrains them to manage resources by (global) efficiency scores; each input/output thus has a different reality depending on the manager's decision and the information available at the time of the decision. This paper decomposes global DMU efficiency into the efficiencies of input (or output) components, so that each component has its own score instead of a single global DMU score. These findings would improve management decision-making about reallocating inputs and redefining outputs.
Concerning policy implications, the DEA twin models help policy makers assess, improve, and reorient their strategies and execute programs toward enhancing best practices and minimising losses.
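In the special case of one input and one output, the CCR model referred to above reduces to comparing productivity ratios, which makes for a compact illustration. This is a sketch on hypothetical data; the general multi-input, multi-output case requires solving a linear program per DMU.

```python
def ccr_scores(data):
    """CCR (constant returns to scale) efficiency scores for
    single-input, single-output DMUs.

    With one input and one output, the CCR linear program collapses to
    each DMU's productivity y/x divided by the best observed productivity.
    """
    best = max(y / x for x, y in data)
    return [(y / x) / best for x, y in data]

dmus = [(2.0, 6.0), (4.0, 8.0), (5.0, 5.0)]
scores = ccr_scores(dmus)  # productivities 3, 2, 1 against best of 3
```

The first DMU is CCR-efficient (score 1.0); the others score 2/3 and 1/3, the proportional shrinkage of inputs needed to match the best observed productivity.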

19.
The aim of this study is to confirm the factorial structure of the Identification-Commitment Inventory (ICI) developed within the frame of the Human System Audit (HSA) (Quijano et al. in Revist Psicol Soc Apl 10(2):27–61, 2000; Pap Psicól Revist Col Of Psicó 29:92–106, 2008). Commitment and identification are understood by the HSA at an individual level as part of the quality of human processes and resources in an organization, and therefore as antecedents of important organizational outcomes such as personnel turnover intentions and organizational citizenship behavior (Meyer et al. in J Org Behav 27:665–683, 2006). The theoretical integrative model underlying the ICI (Quijano et al. 2000) was tested in a sample (N = 625) of workers in a Spanish public hospital. Confirmatory factor analysis through structural equation modeling was performed. The elliptical least squares solution was chosen as the estimation procedure on account of the non-normal distribution of the variables. The results confirm the goodness of fit of an integrative model underlying the relation between commitment and identification, although each is operatively different.

20.
The “global game with strategic substitutes and complements” of Karp et al. (2007) is used to model the decision of where to fish. A complete information game is assumed, but the model is generalized to S > 1 sites. In this game, a fisherman’s payoff depends on fish density in each site and the actions of other fishermen, which can lead to congestion or agglomeration effects. Stable and unstable equilibria are characterized, as well as notions of equilibrium dominance. The model is applied to the Alaskan flatfish fishery by specifying a strategic interaction function (response to congestion) that is a non-linear function of the degree of congestion present in a given site. Results suggest that the interaction function may be non-monotonic in congestion.
