Similar Documents (20 results)
1.
In this paper we propose a target-efficiency DEA model that allows environmental variables to be included in a one-stage model while maintaining a high degree of discriminatory power. The model estimates the impact of managerial and environmental factors on efficiency simultaneously, and a decomposition of overall technical efficiency into two components, target efficiency and environmental efficiency, is derived. Estimating the target-efficiency scores requires the solution of a single large non-linear optimization problem and yields both a joint estimate of the target-efficiency scores for all DMUs and an estimate of a common scalar, one per environmental factor, expressing that factor's impact on efficiency. We argue that if the indices of environmental conditions are constructed as the percentage of output with certain attributes present, then it is reasonable to let all reference DMUs whose composed fraction is lower than the evaluated DMU's fraction of output possessing the attribute enter as potential dominators. It is shown that this requirement transforms the cone-ratio constraints on the intensity variables in the BM model (Banker and Morey 1986) into endogenous handicap functions on outputs. Furthermore, a priori information or general agreements on allowable handicap values can be incorporated into the model along the same lines as assurance-region specifications in standard DEA.
O. B. Olesen
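The paper's one-stage model is a non-linear extension of standard DEA. For orientation, the sketch below solves the plain input-oriented CCR envelopment LP (Charnes et al., 1978), the radial baseline that such environmental extensions build on; it is not the paper's non-linear program, and the data are illustrative.

```python
# Minimal sketch of the standard input-oriented CCR envelopment LP,
# the radial baseline underlying one-stage environmental DEA models.
# Data and dimensions are illustrative, not from the paper.
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Radial input efficiency of DMU o. X: (m, n) inputs, Y: (s, n) outputs."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision vector z = [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input rows:  X @ lam - theta * x_o <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # Output rows: -Y @ lam <= -y_o   (i.e. Y @ lam >= y_o)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # theta* in (0, 1]

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(2, 8))   # 2 inputs, 8 DMUs
Y = rng.uniform(1, 10, size=(1, 8))   # 1 output
print([round(ccr_input_efficiency(X, Y, o), 3) for o in range(8)])
```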

2.
Several Data Envelopment Analysis (DEA) models use a radial distance measure based on the Debreu–Farrell notion of (in)efficiency. While this measure has an attractive interpretation, its use may be problematic if slacks or zeros are present in the data. The additive DEA model can deal with these problems, but the meaning of its associated scores is less intuitive than that attached to the radial measures. We introduce an alternative efficiency measure, based on the results of the additive model, that can be decomposed into a Debreu–Farrell component and a factor that captures differences in input–output mixes with respect to those of the best-practice reference observation. On an aggregate level, this second component can be considered an indicator of the dispersion between radial efficiency measurement and results based on the Pareto–Koopmans efficiency notion. On the individual level, the measure allows us to regard relative inefficiency as resulting from (i) a divergence of implicit cost-price vectors, and (ii) a cost level that is too high even after adjustment for the implicit cost prices.
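A minimal sketch of the additive DEA model underlying the proposed measure: for each DMU it maximises the total input and output slacks against the best-practice frontier, and zero optimal slack is equivalent to Pareto–Koopmans efficiency. The decomposition itself is the paper's contribution and is not reproduced here; data are illustrative.

```python
# Additive DEA model: max total slack s.t. X lam + s_minus = x_o,
# Y lam - s_plus = y_o, all variables nonnegative. Data are illustrative.
import numpy as np
from scipy.optimize import linprog

def additive_slacks(X, Y, o):
    """Optimal input/output slacks of DMU o in the additive model."""
    m, n = X.shape
    s = Y.shape[0]
    # z = [lambda (n), s_minus (m), s_plus (s)]; maximise total slack.
    c = np.r_[np.zeros(n), -np.ones(m + s)]
    A_eq = np.vstack([
        np.hstack([X, np.eye(m), np.zeros((m, s))]),   # X lam + s_minus = x_o
        np.hstack([Y, np.zeros((s, m)), -np.eye(s)]),  # Y lam - s_plus  = y_o
    ])
    b_eq = np.r_[X[:, o], Y[:, o]]
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    z = res.x
    return z[n:n + m], z[n + m:]  # (s_minus, s_plus)

rng = np.random.default_rng(1)
X = rng.uniform(1, 10, size=(2, 8))
Y = rng.uniform(1, 10, size=(1, 8))
s_minus, s_plus = additive_slacks(X, Y, 0)
print(s_minus, s_plus)  # zero slacks <=> Pareto-Koopmans efficient
```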

3.
Recent studies have stressed the importance of privatization and openness to foreign competition for bank efficiency and economic growth. We study bank efficiency in Turkey, an emerging economy with great heterogeneity in bank types and ownership structures. Earlier studies of Turkish banking had three limitations: (i) excessive reliance on cost‐function frontier analyses, wherein volume of loans is a measure of banking output; (ii) pooling all banks or imposing ad hoc heterogeneity assumptions; and (iii) lack of a comprehensive panel data set for proper analysis of productivity and heterogeneity. We use an estimation–classification procedure to find likelihood‐driven classification of bank technologies in an 11‐year panel. In addition, we augment traditional cost‐frontier analysis with a labour‐efficiency analysis. We conclude that state banks are not particularly inefficient overall, but that they do utilize labour inefficiently. This partially supports recent calls for privatization. We also conclude that special finance houses (or Islamic banks) utilize the same technology as conventional domestic banks, and do so relatively efficiently. This suggests that they do not cause harm to the financial system. Finally, we conclude that foreign banks utilize a different technology from domestic ones. This suggests that one should not overstate their value to the financial sector.

4.
This paper develops a new methodology to measure and decompose global DMU efficiency into efficiencies of inputs (or outputs). The basic idea rests on the fact that a global DMU efficiency score might be misleading when managers proceed to reallocate their inputs or redefine their outputs. The literature provides a basic measure for the global DMU efficiency score; a revised model was developed for measuring the efficiencies of global DMUs and their input (or output) efficiency components, based on a hypothesis of virtual DMUs. The paper suggests a method for measuring global DMU efficiency simultaneously with the efficiencies of its input components, which we call the input decomposition DEA model (ID-DEA), and the efficiencies of its output components, which we call the output decomposition DEA model (OD-DEA). These twin models differ from the super-efficiency model (SE-DEA) and the common set of weights model (CSW-DEA). The twin models (ID-DEA, OD-DEA) were applied to agricultural farms; the results gave different efficiency scores for inputs (or outputs), while the global DMU efficiency score was given by the Charnes, Cooper and Rhodes (CCR) model (Charnes et al., 1978). The rationale for the new hypothesis and models is that managers do not have the same level of information about all inputs and outputs, which constrains them to manage resources by the (global) efficiency scores; each input or output thus has a different reality depending on the manager's decision and the information available at the time of the decision. This paper decomposes global DMU efficiency into the efficiencies of its input (or output) components, so that each component has its own score instead of a single global DMU score. These findings would improve management decision-making about reallocating inputs and redefining outputs. Concerning policy implications, the twin DEA models help policy makers assess, improve and reorient their strategies and execute programmes that promote best practices and minimise losses.

5.
This paper fills the one remaining lacuna in (multiple-output) duality theory by proving joint continuity (in input and output vectors) of cost, benefit, and (input and output) distance functions. Continuity is an important property where measurement error exists, for it provides assurance that small errors of measurement (of quantities or prices) result only in small errors in concepts like minimal cost. We consider continuity not only in prices and quantities, but also in technologies. Continuity in technologies might be more important than continuity in prices or quantities, because production technologies are almost certainly measured (or estimated) with error.

6.
Mann–Whitney‐type causal effects are generally applicable to outcome variables with a natural ordering, have been recommended for clinical trials because of their clinical relevance and interpretability and are particularly useful in analysing an ordinal composite outcome that combines an original primary outcome with death and possibly treatment discontinuation. In this article, we consider robust and efficient estimation of such causal effects in observational studies and clinical trials. For observational studies, we propose and compare several estimators: regression estimators based on an outcome regression (OR) model or a generalised probabilistic index (GPI) model, an inverse probability weighted estimator based on a propensity score model and two doubly robust (DR), locally efficient estimators. One of the DR estimators involves a propensity score model and an OR model, is consistent and asymptotically normal under the union of the two models and attains the semiparametric information bound when both models are correct. The other DR estimator has the same properties with the OR model replaced by a GPI model. For clinical trials, we extend an existing augmented estimator based on a GPI model and propose a new one based on an OR model. The methods are evaluated and compared in simulation experiments and applied to a clinical trial in cardiology and an observational study in obstetrics.
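A hedged sketch of one of the estimator classes the abstract compares: the inverse-probability-weighted (IPW) estimator of a Mann–Whitney-type effect theta = P(Y1 > Y0) + 0.5 P(Y1 = Y0) under selection on observables. The logistic propensity model and the simulated data are assumptions for illustration only.

```python
# IPW estimator of a Mann-Whitney-type causal effect under selection on
# observables. The logistic propensity fit and data are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

def ipw_mann_whitney(Y, T, X):
    e = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]  # propensity
    w1 = T / e                   # weights recovering the treated-outcome law
    w0 = (1 - T) / (1 - e)       # weights recovering the control-outcome law
    h = (Y[:, None] > Y[None, :]) + 0.5 * (Y[:, None] == Y[None, :])
    # Weighted two-sample U-statistic over all (treated i, control j) pairs.
    return (w1[:, None] * w0[None, :] * h).sum() / (w1.sum() * w0.sum())

rng = np.random.default_rng(2)
n = 500
X = rng.normal(size=(n, 2))
T = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
Y = np.rint(X.sum(axis=1) + T + rng.normal(size=n))  # ordinal-ish outcome
print(ipw_mann_whitney(Y, T, X))  # ~ > 0.5 under a beneficial treatment
```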

7.
This paper presents estimators of distributional impacts of interventions when selection into the program is based on observable characteristics. Distributional impacts are calculated as differences in inequality measures of the marginal distributions of potential outcomes of receiving and not receiving the treatment. The estimation procedure involves a first non‐parametric estimation of the propensity score. In the second step, weighted versions of inequality measures are computed using weights based on the estimated propensity score. Consistency, semi‐parametric efficiency and validity of inference based on the percentile bootstrap are shown for the estimators. Results from Monte Carlo exercises show their good performance in small samples.
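A hedged sketch of the two-step procedure described above: estimate the propensity score (here, for simplicity, a logistic fit, an assumption in place of the paper's non-parametric first step), then compute a weighted inequality measure, the Gini coefficient, of each marginal potential-outcome distribution. The percentile bootstrap recommended for inference would resample this whole two-step procedure.

```python
# Distributional impact as the difference in propensity-weighted Gini
# coefficients of the two potential-outcome distributions. Illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

def weighted_gini(y, w):
    mu = np.average(y, weights=w)
    diff = np.abs(y[:, None] - y[None, :])
    return (w[:, None] * w[None, :] * diff).sum() / (2 * w.sum() ** 2 * mu)

def gini_treatment_effect(Y, T, X):
    e = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
    g1 = weighted_gini(Y, T / e)              # Gini of Y(1)
    g0 = weighted_gini(Y, (1 - T) / (1 - e))  # Gini of Y(0)
    return g1 - g0  # inequality impact of the treatment

rng = np.random.default_rng(3)
n = 800
X = rng.normal(size=(n, 2))
T = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
Y = np.exp(0.5 * X.sum(axis=1) + 0.3 * T * rng.normal(size=n)
           + 0.2 * rng.normal(size=n))       # positive outcome
print(gini_treatment_effect(Y, T, X))
```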

8.
We construct an evaluation index system for total-factor energy efficiency based on a DEA model and apply a super-efficiency SBM model that accounts for undesirable outputs to measure the energy efficiency of 30 Chinese provinces (municipalities and autonomous regions) over 2007–2016. The results are analysed for spatio-temporal differences across the eastern, central and western regions, and convergence tests are applied to the trend in total-factor energy efficiency in each region. The results show that China's total-factor energy efficiency declined overall during 2007–2016. The comparison across the three regions reveals clear regional differences: energy efficiency is highest in the east, lower in the centre and lowest in the west, consistent with the gradient of regional economic development. The convergence tests show that total-factor energy efficiency diverged over 2007–2016, i.e., the gaps between regions gradually widened.

9.
This paper suggests a novel inhomogeneous Markov switching approach for the probabilistic forecasting of industrial companies’ electricity loads, for which the load switches at random times between production and standby regimes. The model that we propose describes the transitions between the regimes using a hidden Markov chain with time-varying transition probabilities that depend on calendar variables. We model the demand during the production regime using an autoregressive moving-average (ARMA) process with seasonal patterns, whereas we use a much simpler model for the standby regime in order to reduce the complexity. The maximum likelihood estimation of the parameters is implemented using a differential evolution algorithm. Using the continuous ranked probability score (CRPS) to evaluate the goodness-of-fit of our model for probabilistic forecasting, it is shown that this model often outperforms classical additive time series models, as well as homogeneous Markov switching models. We also propose a simple procedure for classifying load profiles into those with and without regime-switching behaviors.
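For reference, the CRPS used above can be computed from a sample (ensemble) drawn from the predictive distribution via the identity CRPS(F, y) = E|X - y| - 0.5 E|X - X'| with X, X' independent draws from F. A minimal sketch, with toy data:

```python
# Sample-based CRPS: E|X - y| - 0.5 * E|X - X'|, X, X' ~ F independent.
import numpy as np

def crps_from_sample(sample, y):
    sample = np.asarray(sample, dtype=float)
    term1 = np.abs(sample - y).mean()
    term2 = 0.5 * np.abs(sample[:, None] - sample[None, :]).mean()
    return term1 - term2  # lower is better

# Toy check: a predictive sample centred on the realised load scores well.
rng = np.random.default_rng(4)
print(crps_from_sample(rng.normal(100.0, 5.0, 2000), y=101.0))
print(crps_from_sample(rng.normal(120.0, 5.0, 2000), y=101.0))  # worse
```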

10.
The new "dual-circulation" development pattern places higher demands on innovation in China. Innovation-led development requires not only scientific and technological innovation but also science popularisation, and the central government has stressed that science popularisation is of equal importance to scientific and technological innovation. Based on a three-stage DEA model, we study the input-output efficiency of science popularisation in 31 provinces, municipalities and autonomous regions over 2008–2015, dividing the country into three regions according to their level of economic development. We find that the rise in the technical efficiency of China's science-popularisation inputs and outputs and the fluctuation of scale efficiency affect...

11.
In the early 1980s, Kopp and Diewert proposed a popular method for decomposing cost efficiency into allocative and technical efficiency for parametric functional forms, based on the radial approach initiated by Farrell. We show that, relying on recently proposed homogeneity and duality results, their approach is unnecessary for self-dual homothetic production functions, while it is inconsistent in the non-homothetic case. Stressing that for homothetic technologies the radial distance function can be correctly interpreted as a technical efficiency measure, since allocative efficiency is independent of the output level and radial input reductions leave it unchanged, we contend that for non-homothetic technologies this is not the case, because optimal input demands depend on the output targeted by the firm, as does the inequality between marginal rates of substitution and market prices, i.e., allocative inefficiency. We demonstrate that a correct definition of technical efficiency corresponds to the directional distance function, because its flexibility ensures that allocative efficiency is kept unchanged through movements in the input production possibility set when solving technical inefficiency, so the associated cost reductions can be solely, and rightly, ascribed to technical (engineering) improvements. The new methodology, which allows a consistent decomposition of cost inefficiency, is illustrated with simple examples of non-homothetic production functions.
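For reference, the two distance functions contrasted above can be written in their standard textbook form (this is the conventional notation, not necessarily the paper's):

```latex
% Radial (Shephard/Farrell) input distance function and the directional
% distance function with direction g = (g_x, g_y) >= 0.
\[
  D_I(y, x) \;=\; \max\{\lambda > 0 : x/\lambda \in L(y)\},
\]
\[
  \vec{D}_T(x, y; g_x, g_y) \;=\; \max\{\beta : (x - \beta g_x,\; y + \beta g_y) \in T\},
\]
% where L(y) is the input requirement set and T the technology set; Farrell
% technical efficiency is 1/D_I(y,x), while beta* measures the simultaneous
% input contraction and output expansion feasible along the direction g.
```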

12.
In this paper, data envelopment analysis (DEA) techniques are applied to the French nursing home industry in order to address two policy issues. The first involves nursing home size and returns to scale, while the second deals with the potential effects of a change in nursing home reimbursement from a flat rate to one based on the severity of case-mix. To accomplish this, our analysis expands on the existing nursing home literature to analyze technical and allocative efficiency along with budget-constrained models rather than the more common direct input-based distance function. Technical efficiency is evaluated via an indirect output distance function while allocative output efficiency is computed with a cost indirect revenue function. The findings suggest that system-wide efficiency and equity may result from coming reforms since payments would more accurately reflect resource use.

13.
This paper provides a probabilistic and statistical comparison of the log-GARCH and EGARCH models, which both rely on multiplicative volatility dynamics without positivity constraints. We compare the main probabilistic properties (strict stationarity, existence of moments, tails) of the EGARCH model, which are already known, with those of an asymmetric version of the log-GARCH. The quasi-maximum likelihood estimation of the log-GARCH parameters is shown to be strongly consistent and asymptotically normal. Similar estimation results are only available for the EGARCH(1,1) model, and under much stronger assumptions. The comparison is pursued via simulation experiments and estimation on real data.
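A hedged simulation sketch of an asymmetric log-GARCH(1,1) of the kind discussed above: log-volatility is linear in the lagged log squared return, with a coefficient that depends on the sign of the lagged return, so no positivity constraints are needed. Parameter values are illustrative, not estimates from the paper.

```python
# Asymmetric log-GARCH(1,1) simulation:
#   log s2_t = omega + (a_plus*1{e>0} + a_minus*1{e<0}) * log e_{t-1}^2
#              + beta * log s2_{t-1},   e_t = s_t * eta_t,  eta_t ~ N(0,1).
import numpy as np

def simulate_log_garch(n, omega=0.05, a_plus=0.02, a_minus=0.06,
                       beta=0.9, seed=5):
    rng = np.random.default_rng(seed)
    eta = rng.standard_normal(n)
    eps = np.empty(n)
    log_s2 = 0.0
    eps[0] = np.exp(0.5 * log_s2) * eta[0]
    for t in range(1, n):
        a = a_plus if eps[t - 1] > 0 else a_minus  # sign-dependent impact
        log_s2 = omega + a * np.log(eps[t - 1] ** 2) + beta * log_s2
        eps[t] = np.exp(0.5 * log_s2) * eta[t]
    return eps

x = simulate_log_garch(5000)
# Volatility clustering shows up as autocorrelation in log squared returns.
print(x.std(), np.corrcoef(np.log(x[:-1] ** 2), np.log(x[1:] ** 2))[0, 1])
```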

14.
Journal of Productivity Analysis - The paper applies some of the latest advances in the probabilistic approach to account directly for unobserved heterogeneity in the estimation of efficiency measures...

15.
This article studies the quality assurance system for mineral resources survey projects, with the aim of exploring a scientifically sound quality assurance system that mineral exploration projects can draw on. Through summary and comparative analysis, the paper proposes a TQM-ISO-based quality assurance system for mineral resources survey projects that overcomes the shortcomings of a single quality system and raises the level of quality management in such projects. It is of positive significance for improving the efficiency of mineral exploration work and guaranteeing the quality of exploration results.

16.
In this study, we develop and implement an output index approach to the estimation of hospital cost functions that reflects the differentiated nature of hospital care. The approach combines the estimation of an output index within a flexible functional form. We find, in an application to California hospitals, evidence of scope economies across specialties within primary care, and diseconomies of scope within secondary and tertiary care. Minimum efficient scale is reached at larger levels of output than would be estimated by conventional techniques. These results indicate the importance of accounting for firm output heterogeneity when estimating cost functions.

17.
This paper describes an approach used in the Canadian input–output (IO) accounts which seeks to enhance the timeliness of the tables. It combines traditional updating methods, balancing techniques and the most recent data. To assess the performance of this approach, aggregate estimates from the synthetic accounts are presented for two years and compared with estimates from benchmarks and with estimates obtained from a mechanical estimation technique. The results show that most IO components can be estimated with a relatively small estimation error and that substantial accuracy is gained from using the synthetic approach rather than a mechanical technique. Results based on data that are two years away from the IO benchmarks come at the cost of large errors. Synthetic estimates of the IO accounts improve timeliness by at least a full year.
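The abstract does not name its balancing technique; a classic choice for updating an IO table to new row and column margins is RAS (biproportional) balancing, sketched below purely for illustration with toy data.

```python
# RAS (biproportional) balancing: iteratively rescale rows and columns of
# an outdated IO flow matrix until its margins match the target totals.
import numpy as np

def ras(A, row_targets, col_targets, tol=1e-10, max_iter=1000):
    """Scale matrix A so its row/column sums match the target margins."""
    X = A.astype(float).copy()
    for _ in range(max_iter):
        X *= (row_targets / X.sum(axis=1))[:, None]   # fit row sums
        X *= (col_targets / X.sum(axis=0))[None, :]   # fit column sums
        if np.abs(X.sum(axis=1) - row_targets).max() < tol:
            return X
    return X

A = np.array([[10.0, 5.0], [3.0, 12.0]])      # outdated IO flows
X = ras(A, row_targets=np.array([18.0, 17.0]),
        col_targets=np.array([16.0, 19.0]))   # margins from recent data
print(X, X.sum(axis=1), X.sum(axis=0))
```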

18.
We propose a methodology for gauging the uncertainty in output gap nowcasts across a large number of commonly deployed vector autoregressive (VAR) specifications for inflation and the output gap. Our approach utilises many output gap measures to construct ensemble nowcasts for inflation using a linear opinion pool. The predictive densities for the latent output gap utilise weights based on the ability of each specification to provide accurate probabilistic forecasts of inflation. In an application based on US real-time data, nowcasting over the out-of-sample evaluation period from 1991q2 to 2010q1, we demonstrate that a system of bivariate VARs produces well-calibrated ensemble densities for inflation, in contrast to univariate autoregressive benchmarks. The implied nowcast densities for the output gap are multimodal and indicate a considerable degree of uncertainty; for example, we assess the probability of a negative output gap at around 45% between 2004 and 2007. Despite the Greenspan policy regime, there remained a substantial risk in real time that the nowcast for output was below potential. We extend our methodology to include distinct output gap measures based on alternative filters and show that, in our application, the nowcast density for the output gap is sensitive to the detrending method.
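A minimal sketch of a linear opinion pool of the kind described above: component predictive densities are combined with weights based on each component's past log scores. Toy Gaussians stand in for the bivariate-VAR predictives, and the exponential log-score weighting rule is a common choice assumed here, not necessarily the paper's exact scheme.

```python
# Linear opinion pool with weights derived from past predictive log scores.
import numpy as np
from scipy.stats import norm

# Component predictive densities for the next observation (toy Gaussians).
means, sds = np.array([1.8, 2.2, 3.0]), np.array([0.4, 0.6, 0.5])

# Past realisations used to score each component's predictive density.
past_y = np.array([2.0, 2.1, 1.9, 2.3])
log_scores = norm.logpdf(past_y[:, None], means[None, :], sds[None, :]).sum(axis=0)
w = np.exp(log_scores - log_scores.max())
w /= w.sum()                      # pool weights, sum to one

grid = np.linspace(0.0, 4.0, 401)
pool = (w[None, :] * norm.pdf(grid[:, None], means[None, :], sds[None, :])).sum(axis=1)
print(w, pool.sum() * (grid[1] - grid[0]))  # weights and total mass (~1)
```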

19.
This paper discusses a competing-destinations formulation of the gravity model for flows of patients from their residential areas to health-supplier regions. This approach explicitly acknowledges the interdependence of patient flows across a set of alternative health-supplier regions. The competing-destinations approach may be implemented as a probabilistic demand function or a conditional logit model with a Poisson outcome. A Texas-based case study of residential areas and State Mental Hospitals (SMHs) is presented. The estimation results do not support the presence of scale effects in SMHs due to population size. This result, combined with the negative effect of average length of stay and the positive effect of the provision of forensic services on patient flows, highlights the problem of caseload growth in SMHs.
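A hedged sketch of a competing-destinations gravity model estimated as a Poisson regression: origin-destination patient flows depend on origin demand, destination size, distance, and an accessibility term summarising the attraction of competing destinations. Variable names and simulated data are illustrative, not the paper's.

```python
# Competing-destinations gravity model as a Poisson regression on
# simulated origin-destination patient flows. Data are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_o, n_d = 30, 5
pop = rng.uniform(1e4, 1e6, n_o)          # origin populations
beds = rng.uniform(50, 500, n_d)          # hospital size
dist = rng.uniform(5, 300, (n_o, n_d))    # origin-destination distance

# Accessibility of j from i: attraction of competing destinations k != j.
acc = np.stack([(beds / dist[i]).sum() - beds / dist[i] for i in range(n_o)])

flows = rng.poisson(np.exp(0.5 * np.log(pop)[:, None]
                           + 0.8 * np.log(beds)[None, :]
                           - 1.2 * np.log(dist) - 0.3 * np.log(acc) - 2.0))

X = sm.add_constant(np.column_stack([
    np.repeat(np.log(pop), n_d), np.tile(np.log(beds), n_o),
    np.log(dist).ravel(), np.log(acc).ravel()]))
fit = sm.GLM(flows.ravel(), X, family=sm.families.Poisson()).fit()
print(fit.params)  # const, log pop, log beds, log dist, log accessibility
```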

20.
Qiang Chen, Lu Lin, Lixing Zhu, Metrika (2010) 71(1): 45-58
In this paper we investigate confidence regions based on smoothed score functions for the parameters of single-index models. Because plugging in an estimator of the nonparametric link function makes the bias of the smoothed score function non-negligible, owing to the slow convergence rate of nonparametric estimation, the score function has an asymptotically normal limit with a non-zero mean. A bias-corrected smoothed score function is recommended to achieve a centred normal limit without under-smoothing or higher-order kernels, and the confidence region can then be constructed from the chi-square distribution. Simulation studies are carried out to assess the performance of the bias-corrected local likelihood and to compare it with the normal approximation approach.
