Similar Documents
A total of 20 similar documents were retrieved.
1.
Data Envelopment Analysis (DEA) assumes, in most cases, that all inputs and outputs are controlled by the Decision Making Unit (DMU). Inputs and/or outputs that do not conform to this assumption are denoted in DEA as non-discretionary (ND) factors. Banker and Morey [1986] formulated several variants of DEA models that incorporate ND factors alongside ordinary factors. This article extends the Banker and Morey approach to treating non-discretionary factors in two ways. First, the model is extended to allow for the simultaneous presence of ND factors in both the input and the output sets. Second, a generalization is offered which, for the first time, enables a quantitative evaluation of partially controlled factors. A numerical example is given to illustrate the different models. The editor for this paper was Wade D. Cook.
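
As a point of reference, a minimal sketch of an input-oriented envelopment model in the Banker-Morey spirit, in which only the discretionary input subset D is scaled by the contraction factor θ while the non-discretionary subset ND enters at its observed level (the notation is ours, not taken from the article):

```latex
\begin{align*}
\min_{\theta,\;\lambda}\quad & \theta \\
\text{s.t.}\quad
& \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io}, && i \in D \ \text{(discretionary inputs)}\\
& \sum_{j=1}^{n} \lambda_j x_{ij} \le x_{io},          && i \in ND \ \text{(non-discretionary inputs)}\\
& \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro},          && r = 1,\dots,s\\
& \lambda_j \ge 0,                                     && j = 1,\dots,n.
\end{align*}
```

Only the discretionary inputs enter the objective through θ, so a unit is not penalized for inputs it cannot adjust; the article's extensions add analogous treatment on the output side and for partially controlled factors.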

2.
3.
Studies that apply data envelopment analysis often neglect testing the stability of the efficient frontier to data perturbations, and, to a lesser extent, the ability of efficiency scores to correctly discriminate between units on performance (integrity). Our primary motivation is to demonstrate methods that can help reduce the number of managerial decisions based on results that may be unreliable. To this end, we illustrate multiple tests of stability and integrity in an environment of fully units-invariant efficiency measurement. This application of tests of stability and integrity using a slacks-based measure of efficiency is the first in a peer-reviewed journal.
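
For context, the units-invariant slacks-based measure (SBM) that the abstract refers to is commonly written in Tone's standard form (our notation; the authors' exact variant may differ):

```latex
\rho^{*} \;=\; \min_{\lambda,\,s^{-},\,s^{+}}\;
\frac{1 - \frac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{io}}
     {1 + \frac{1}{s}\sum_{r=1}^{s} s_r^{+}/y_{ro}}
\quad\text{s.t.}\quad
x_{io} = \sum_{j=1}^{n}\lambda_j x_{ij} + s_i^{-},\qquad
y_{ro} = \sum_{j=1}^{n}\lambda_j y_{rj} - s_r^{+},\qquad
\lambda,\, s^{-},\, s^{+} \ge 0.
```

Because each slack is normalized by the corresponding observed input or output, the score is invariant to the units of measurement, which keeps scores comparable across data perturbations.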

4.
This paper provides diagnostic tools for examining the role of influential observations in Data Envelopment Analysis (DEA) applications. Observations may be prioritized for further scrutiny to see if they are contaminated by data errors; this prioritization is important in situations where data-checking is costly and resources are limited. Several empirical examples are provided using data from previously published studies. This research was performed while under contract with the Management Science Group, U.S. Department of Veterans Affairs, Bedford, MA 01730. Shawna Grosskopf and Richard Grabowski graciously provided data used in two of the empirical examples.

5.
In some contexts data envelopment analysis (DEA) gives poor discrimination on the performance of units. While this may reflect genuine uniformity of performance between units, it may also reflect a lack of sufficient observations or other factors limiting discrimination between units. In this paper, we present an overview of the main approaches that can be used to improve the discrimination of DEA. These include simple methods, such as the aggregation of inputs or outputs and the use of longitudinal data; more advanced methods, such as weight restrictions, production trade-offs, and unobserved units; and a relatively new method based on selective proportionality between inputs and outputs.
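
As a small illustration of one approach listed above, weight restrictions are typically imposed on the multiplier form of the CCR model; the assurance-region bounds below on the ratio of two input weights are a hypothetical example, not taken from the paper:

```latex
\max_{u,\,v}\ \sum_{r=1}^{s} u_r y_{ro}
\quad\text{s.t.}\quad
\sum_{i=1}^{m} v_i x_{io} = 1,\qquad
\sum_{r=1}^{s} u_r y_{rj} - \sum_{i=1}^{m} v_i x_{ij} \le 0 \ \ (j=1,\dots,n),\qquad
\alpha \le \frac{v_1}{v_2} \le \beta,\qquad u,\,v \ge 0.
```

Restricting the weight ratio to a plausible range rules out efficiency scores that rely on extreme weighting of a single input, which tightens discrimination among units.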

6.
In data envelopment analysis with a variable returns-to-scale production technology, we apply Banker's [2] approach to determine the relationship between technically efficient and cost-efficient industry structures, featuring reallocation of outputs and a variable number of firms. The interpretation based on the notions of most productive scale size and optimal scale size allows us both to establish an inequality relationship between the corresponding industry-efficiency measures and to provide adequate information on optimal solutions. At the applied level, we introduce an exact algorithm to solve the related non-linear programming problems, thus providing the decision maker with an accurate method for computing and comparing the input and output mixes and the optimal number of units obtained in the two allocations. An empirical illustration, based on the Italian local public transit sector and employing a multiple-input, multiple-output technology, reveals striking differences in the managerial and regulatory implications of the two centralized allocations.

7.
Decisions in Economics and Finance - In this research, inverse data envelopment analysis (IDEA) approaches are proposed to measure input changes for output perturbations made while the convexity...

8.
Quanling Wei, Hong Yan. 《Socio》, 2009, 43(1): 40–54
Our earlier work [Wei QL, Yan H. Congestion and returns to scale in data envelopment analysis. European Journal of Operational Research 2004;153:641–60] discussed necessary and sufficient conditions for the existence of congestion, together with aspects of returns to scale, under an output-oriented DEA framework. In line with this work, the current paper investigates the issue of “weak congestion”, wherein congestion occurs when the reduction of selected inputs causes some, rather than all, outputs to increase, without a worsening of others. We define output efficiency for decision-making units under a series of typical DEA output additive models. Based on this definition, we offer necessary and sufficient conditions for the existence of weak congestion. Numerical examples are provided for purposes of illustration.

9.
The mathematical-programming-based technique of data envelopment analysis (DEA) has often treated data as deterministic. In response to the criticism that most applications involve error and random noise in the data, a number of mathematically elegant approaches to incorporating stochastic variations in data have been proposed. In this paper, we propose a chance-constrained formulation of DEA that allows random variations in the data. We study properties of the ensuing efficiency measure using a small sample in which multiple inputs and a single output are correlated and are the result of a stochastic process. We replicate the analysis using Monte Carlo simulations and conclude that simulation provides a more flexible and computationally less cumbersome approach to studying the effects of noise in the data. We suggest that, in keeping with the tradition of DEA, the simulation approach allows users to explicitly consider different data-generating processes and allows for greater flexibility in implementing DEA under stochastic variations in data.
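
A minimal Python sketch of the Monte Carlo idea described above: perturb the input data, solve an input-oriented CCR model for every unit with scipy, and inspect the resulting distribution of efficiency scores. The data-generating process, dimensions, and noise level are illustrative assumptions, not those used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: inputs (m x n), Y: outputs (s x n); columns are units."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                              # minimize theta
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, o]                  # sum_j lambda_j x_ij <= theta * x_io
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y                       # sum_j lambda_j y_rj >= y_ro
    b_ub[m:] = -Y[:, o]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
    return res.x[0]

rng = np.random.default_rng(0)
n_units, n_inputs, n_reps = 15, 3, 200
true_X = rng.uniform(1.0, 10.0, size=(n_inputs, n_units))
# Single output correlated with the inputs, as in the abstract's setting.
Y = true_X.sum(axis=0, keepdims=True) * rng.uniform(0.8, 1.0, size=(1, n_units))

scores = np.empty((n_reps, n_units))
for rep in range(n_reps):
    noisy_X = true_X * rng.lognormal(0.0, 0.05, size=true_X.shape)  # multiplicative noise
    scores[rep] = [ccr_input_efficiency(noisy_X, Y, o) for o in range(n_units)]

print("mean efficiency per unit:", scores.mean(axis=0).round(3))
print("std. dev. per unit:      ", scores.std(axis=0).round(3))
```

Repeating the LP solves across replications makes it easy to swap in other noise models, which is the flexibility argument the abstract makes for simulation over closed-form chance-constrained formulations.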

10.
This study first proposes a definition of directional congestion in certain input and output directions within the framework of data envelopment analysis. Second, two methods, from different viewpoints, are proposed to estimate directional congestion. Third, we address the relationship between directional congestion and classic (strong or weak) congestion. Finally, we present a case study of the research institutes of the Chinese Academy of Sciences to demonstrate the applicability and usefulness of the methods developed in this study.

11.
This study examines the Natural Gas Policy Act of 1978 and the Federal Energy Regulatory Commission (FERC) policies that culminated in Order 636 in 1992. These policies altered the regulatory environment in which natural gas distribution utilities operate, forcing local gas distribution utilities into an increasingly competitive environment. Restructuring of the industry may affect economic efficiency. Data Envelopment Analysis is used to examine the economic efficiency of gas distributors during 1975–94. Federal policy appears to have led to a reduction in scale owing to restructuring and greater competition, but the reduced scale economies have not altered the economic efficiency of the utilities.

12.
In 2014 the Brazilian Electricity Regulator (ANEEL) evaluated the efficiency of power distribution utilities using Data Envelopment Analysis (DEA). Estimated efficiencies range from 22.46% to 100%. Although environmental information is available in the data set, corrected efficiencies were not investigated. Different second-stage models can be applied to adjust for environmental heterogeneity. Although the statistical correlation between efficiencies and environmental variables can be easily estimated, corrected efficiencies depend on the underlying structure of the second-stage model, so different second-stage models may yield different corrected efficiencies. We provide a detailed statistical analysis of the Tobit model and of compound-error models for second-stage analysis. Their limitations are described, and the corrected efficiencies obtained from these models are evaluated. Brazilian power distribution utilities may see substantial changes in estimated efficiencies if a second-stage analysis is used.
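
A minimal sketch of a censored (Tobit) second stage of the kind compared above, fitted by maximum likelihood: DEA scores capped at 1 are regressed on an environmental variable. The simulated data, the single regressor, and the censoring point at 1 are illustrative assumptions; the paper's compound-error alternative is not shown.

```python
import numpy as np
from scipy import optimize, stats

def tobit_loglik(params, y, X, limit=1.0):
    """Log-likelihood of a Tobit model with right-censoring at `limit`
    (efficiency scores piled up at 1)."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    censored = y >= limit
    ll = np.empty_like(y, dtype=float)
    ll[~censored] = stats.norm.logpdf(y[~censored], loc=mu[~censored], scale=sigma)
    ll[censored] = stats.norm.logsf(limit, loc=mu[censored], scale=sigma)
    return ll.sum()

def fit_tobit(y, X):
    """Maximize the Tobit likelihood, using OLS coefficients as starting values."""
    ols = np.linalg.lstsq(X, y, rcond=None)[0]
    start = np.append(ols, np.log(y.std()))
    res = optimize.minimize(lambda p: -tobit_loglik(p, y, X), start, method="BFGS")
    return res.x[:-1], np.exp(res.x[-1])

# Illustrative data: DEA efficiency scores regressed on one environmental variable.
rng = np.random.default_rng(1)
n = 60
env = rng.normal(size=n)
latent = 0.8 + 0.1 * env + rng.normal(scale=0.08, size=n)
scores = np.minimum(latent, 1.0)            # efficiency cannot exceed 1
X = np.column_stack([np.ones(n), env])
beta_hat, sigma_hat = fit_tobit(scores, X)
print("intercept, slope:", beta_hat.round(3), " sigma:", round(sigma_hat, 3))
```

A compound-error second stage would replace the symmetric normal error with a normal component plus a one-sided inefficiency term, changing only the likelihood function in this sketch.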

13.
This paper applies dynamic network slacks-based data envelopment analysis to measure financial performance based on the interrelationships among investment, financing, and dividend decisions. The empirical results show that financial performance is determined simultaneously by the efficiency of these decisions; the sample firms perform well in the investment stage but need to improve their financing and dividend policies. The proposed financial performance measure reflects the multiple criteria of decision-making rather than relying on single financial ratios. In addition, the research contributes novel evidence on the significant relationship between a firm's ownership structure, in the form of managerial, government, and foreign shareholdings, and its financial performance.

14.
《Socio》, 2004, 38(2–3): 123–140
Periodically, professionals in a given field must reflect and assess where the field has been, where it is heading, and what, if anything, should be done to change that field's course. This article discusses statistical trends within the data envelopment analysis (DEA) literature. The number of articles published per year in refereed journals over the entire lifespan of the field, together with statistics on authorship and publishing outlets of choice, is used to indicate DEA's vitality, relevance, diffusion to other disciplines and professions, and worldwide acceptance. Lastly, based on published meta-reviews, comparisons are made with other OR/MS sub-disciplines.

15.
Data envelopment analysis (DEA) is used to evaluate relative technical efficiency and to assist in the management of a chain of nursing homes. As with any DEA model, the choice of variables is particularly important. The study looks at two possibly critical issues. The first is the appropriateness of models that include only financial and economic measures to evaluate administrators when quality care is an expected output. The second is the appropriateness of using noncontrollable variables, in this case operating income, to evaluate administrators. We show how efficiency scores differ when quality variables and/or operating income are included. We also demonstrate the usefulness of DEA information to both the home administrator and chain managers for improving operating efficiency.

16.
Measuring the performance of Non-Profit Organizations (NPOs) is a complicated issue, and data envelopment analysis (DEA) has been a popular quantitative tool in the literature. However, the subjective opinions of NPOs can obscure their actual performance, a problem that is seldom considered. In this study, we use qualitative DEA as a tool to identify the inputs and outputs emphasized by these NPOs. Most DEA models are built on quantitative data and are therefore ill-suited to describing the qualitative performance of NPOs. This paper proposes a new perspective for computing the efficiency of a Decision Making Unit based on qualitative data using the affinity set. DEA models for qualitative data can be traced back to the work of Cook et al. in 1993. Our contribution avoids the identical efficiency scores produced by the model of Cook et al., and a combinatorial optimization technique is used to solve the new problem. Finally, we find that most NPOs would like to obtain more resources from outside but, interestingly, do not like to be officially monitored. Quantitative DEA should therefore be applied to NPOs with great care.

17.
Environmental issues are becoming more and more important in our everyday life. Data Envelopment Analysis (DEA) is a tool developed for measuring relative operational efficiency. DEA can also be employed to estimate environmental efficiency where undesirable outputs, such as greenhouse gases, exist. The classical DEA method identifies best practices within a given empirical data set. In many situations, however, it is advantageous to determine the worst practices and to perform efficiency evaluation by comparing DMUs with the full-inefficient frontier. This strategy requires that the conventional production possibility set be defined from a reverse perspective. In this paper, the presence of both desirable and undesirable outputs is assumed, and a methodological framework for performing an unbiased efficiency analysis is proposed. The reverse production possibility set is defined and new models are presented for the full-inefficient frontier. The operational, environmental, and overall reverse efficiencies are studied. The important notion of weak disposability is discussed, and the effects of this assumption on the proposed models are investigated. The capability of the proposed method is examined using data from a real-world application in paper production.

18.
In aggregation for data envelopment analysis (DEA), a jointly determined aggregate measure of output and input efficiency is desired that is consistent with the individual decision-making unit measures. An impasse has been reached in the current state of the literature, however, where only separate measures of input and output efficiency have resulted from attempts to aggregate technical efficiency with the radial measure models commonly employed in DEA. The latter measures are “incomplete” in that they omit the non-zero input and output slacks, and thus fail to account for all inefficiencies that the model can identify. The Russell measure eliminates the latter deficiency but is difficult to solve in standard formulations. A new approach has become available, however, which utilizes a ratio measure in place of the standard formulations. Referred to as an enhanced Russell graph measure (ERM), the resulting model is in the form of a fractional program. Hence, it can be transformed into an ordinary linear programming structure that can generate an optimal solution for the corresponding ERM model. As shown in this paper, an aggregate ERM can then be formed with all the properties considered to be desirable in an aggregate measure, including jointly determined input and output efficiency measures that represent separate estimates of input and output efficiency. Much of this paper is concerned with technical efficiency in both individual and system-wide efficiency measures. Weighting systems are introduced that extend to efficiency-based measures of cost, revenue, and profit, as well as derivatives such as rates of return over cost. The penultimate section shows how the solution to one model also generates optimal solutions to models with other objectives that include rates of return over cost and total profit. This is accomplished in the form of efficiency-adjusted versions of these commonly used measures of performance.
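
For orientation, the enhanced Russell graph measure discussed above is commonly stated as the following fractional program (our notation; the paper's aggregate extension is not reproduced here), which a Charnes-Cooper-type change of variables converts into an ordinary linear program:

```latex
\min_{\theta,\,\varphi,\,\lambda}\quad
\frac{\frac{1}{m}\sum_{i=1}^{m}\theta_i}{\frac{1}{s}\sum_{r=1}^{s}\varphi_r}
\quad\text{s.t.}\quad
\sum_{j=1}^{n}\lambda_j x_{ij} \le \theta_i x_{io}\ (i=1,\dots,m),\quad
\sum_{j=1}^{n}\lambda_j y_{rj} \ge \varphi_r y_{ro}\ (r=1,\dots,s),\quad
\theta_i \le 1,\ \varphi_r \ge 1,\ \lambda_j \ge 0.
```

Because every input contraction θ_i and output expansion φ_r appears in the objective, all non-zero slacks are accounted for, which is the "completeness" property the abstract contrasts with radial measures.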

19.
In data envelopment analysis (DEA), there are two principal methods for identifying and measuring congestion: those of Färe et al. [Färe R, Grosskopf S. When can slacks be used to identify congestion. An answer to W. W. Cooper, L. Seiford, J. Zhu. Socio-Economic Planning Sciences 2001;35:1–10] and Cooper et al. [Cooper WW, Deng H, Huang ZM, Li SX. A one-model approach to congestion in data envelopment analysis. Socio-Economic Planning Sciences 2002;36:231–8]. In the present paper, we focus on the latter work and propose a new method that requires considerably less computation. Then, by proving a selected theorem, we show that the proposed methodology is indeed equivalent to that of Cooper et al.

20.
This paper employs a three stage procedure to investigate labor productivity growth and convergence in the Kansas farm sector for a balanced panel of 564 farms for the period 1993–2007. In the first stage, Data Envelopment Analysis is used to compute technical efficiency indices. In the second stage, labor productivity growth is decomposed into components attributable to efficiency change, technical change, and factor intensity. The third stage employs both parametric and semiparametric regression analyses to investigate convergence in labor productivity growth and the contribution of each of the three components to the convergence process. Factor intensity and efficiency change are found to be sources of labor productivity convergence while technical change is found to be a source of divergence. Policies that encourage investment in capital goods may help to mitigate disparities in labor productivity across the farm sector.
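
As a reference point, a tripartite decomposition of the kind described in the second stage typically factors the growth of output per worker into efficiency change, technical change, and factor intensity (capital deepening); the notation below is ours and the paper's exact construction may differ. Writing y_t = e_t f_t(k_t), with y output per worker, k capital per worker, e technical efficiency, and f_t the period-t frontier:

```latex
\frac{y_{t+1}}{y_{t}}
= \underbrace{\frac{e_{t+1}}{e_{t}}}_{\text{efficiency change}}
\times \underbrace{\frac{f_{t+1}(k_{t+1})}{f_{t}(k_{t+1})}}_{\text{technical change}}
\times \underbrace{\frac{f_{t}(k_{t+1})}{f_{t}(k_{t})}}_{\text{factor intensity}}
```

Convergence can then be assessed component by component, which is how the abstract attributes convergence to factor intensity and efficiency change and divergence to technical change.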
