Similar literature
Found 20 similar records (search time: 31 ms)
1.
This paper re-examines a problem of congested inputs in the Chinese automobile and textile industries, which was identified by Cooper et al. [Cooper WW, Deng H, Gu B, Li S, Thrall RM. Using DEA to improve the management of congestion in Chinese industries (1981-1997). Socio-Economic Planning Sciences 2001;35:227-242]. Since these authors employed a single approach in measuring congestion, it is worth exploring whether alternative procedures would yield very different outcomes. Indeed, the measurement of congestion is an area where there has been much theoretical debate but relatively little empirical work. After examining the theoretical properties of the two main approaches currently available, those of Färe et al. [Färe R, Grosskopf S, Lovell CAK. The measurement of efficiency of production. Boston: Kluwer-Nijhoff; 1985] and Cooper et al., we use the data set assembled by Cooper et al. for the period 1981-1997 to compare and contrast the measurements of congestion generated by these alternative approaches. We find that the results are strikingly different, especially in terms of the amount of congestion identified. Finally, we discuss the new approach to measuring congestion proposed by Tone and Sahoo [Tone K, Sahoo BK. Degree of scale economies and congestion: a unified DEA approach. European Journal of Operational Research 2004;158:755-772].
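As background, the congestion approaches compared here all build on standard DEA efficiency programs. The sketch below is my own illustrative formulation of a plain input-oriented CCR (constant returns to scale) model via linear programming, not the congestion measures themselves; the toy data and function names are assumptions.

```python
# Minimal input-oriented CCR DEA model (illustrative sketch, not the
# congestion measures discussed in the paper).
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o under constant returns to scale.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0                       # minimise the contraction factor theta
    A_ub, b_ub = [], []
    for i in range(m):               # sum_j lambda_j * x_ji <= theta * x_oi
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):               # sum_j lambda_j * y_jr >= y_or
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.x[0]

X = np.array([[2.0], [4.0], [8.0]])  # toy inputs
Y = np.array([[2.0], [4.0], [4.0]])  # toy outputs
scores = [round(ccr_efficiency(X, Y, o), 3) for o in range(3)]
print(scores)  # the third DMU uses twice the input it needs, so theta = 0.5
```

Congestion analysis then modifies the input constraints of such programs (e.g. replacing input inequalities with equalities), which is where the Färe et al. and Cooper et al. treatments diverge.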

2.
This paper investigates productivity growth, technical progress, and efficiency change for a group of the 56 largest CPA firms in the US from the period 1996–1999 through the period 2003–2006, where the former preceded, and the latter followed, enactment of the Sarbanes–Oxley Act (SOX). Data envelopment analysis (DEA) is used to calculate Malmquist indices of three measures of interest: productivity growth, technical progress, and efficiency change. Results indicate that CPA firms, on average, experienced a productivity growth of approximately 17% from the pre- to post-SOX period. Consistent with the finding of Banker et al. [Banker RD, Chang H, Natarajan R. Productivity change, technical progress and relative efficiency change in the public accounting industry. Management Science 2005;51:291–304], this productivity gain can be attributed primarily to technical progress rather than a change in relative efficiency. In addition, results indicate that the “Big 4” firms underperformed their non-Big 4 counterparts in both productivity growth and technical progress.
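For orientation, the conventional Malmquist index decomposition into efficiency change and technical change (stated here in its standard textbook form, not reproduced from the paper) is, in terms of period-t distance functions $D^t$:

```latex
M(x^t, y^t, x^{t+1}, y^{t+1})
  = \underbrace{\frac{D^{t+1}(x^{t+1}, y^{t+1})}{D^{t}(x^{t}, y^{t})}}_{\text{efficiency change}}
    \times
    \underbrace{\left[
      \frac{D^{t}(x^{t+1}, y^{t+1})}{D^{t+1}(x^{t+1}, y^{t+1})}
      \cdot
      \frac{D^{t}(x^{t}, y^{t})}{D^{t+1}(x^{t}, y^{t})}
    \right]^{1/2}}_{\text{technical change}}
```

A value of $M > 1$ indicates productivity growth between the two periods; the decomposition is what allows a gain such as the 17% reported above to be attributed to technical progress rather than efficiency change.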

3.
The aim of the present work is to demonstrate the relevance of certain socio-economic features, taken together as a complex factor, for sustainable development. In particular, we consider the absolute Ecological Footprint, rather than the per-capita one, as a better proxy for investigating Socio-Economic Patterns of sustainable development. We also set out a methodology of non-linear analysis, based on Self-Organizing Map neural networks, in order to find relationships between the different kinds of Socio-Economic Patterns. The classic economic approach, indeed, cannot explain these patterns, because they depend on their social and geographical context in a complex and non-linear way (Carlei et al., Scienze Regionali, 2008).
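A Self-Organizing Map can be implemented in a few lines; the sketch below is a generic minimal SOM (my own toy configuration and data, not the paper's setup) showing how distinct input patterns are mapped to different grid units.

```python
# Minimal Self-Organizing Map sketch (illustrative only).
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=1.5, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # grid distances
            nbh = np.exp(-d2 / (2 * sigma ** 2))               # neighbourhood kernel
            weights += lr * nbh[:, None] * (x - weights)
    return weights

# Two well-separated clusters should land on different map units.
data = np.vstack([np.zeros((20, 3)), np.ones((20, 3))])
w = train_som(data)
bmu = lambda x: np.argmin(((w - x) ** 2).sum(axis=1))
print(bmu(np.zeros(3)), bmu(np.ones(3)))
```

In the study's context, the grid units would correspond to clusters of countries or regions with similar socio-economic patterns.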

4.
Quality function deployment (QFD) is a proven tool for process and product development, which translates the voice of the customer (VoC) into engineering characteristics (ECs) and prioritizes the ECs in terms of the customer's requirements. Traditionally, QFD rates the design requirements (DRs) with respect to customer needs, and aggregates the ratings to obtain relative importance scores of the DRs. An increasing number of studies stress the need to incorporate additional factors, such as cost and environmental impact, when calculating the relative importance of the DRs. However, there is a paucity of methodologies for deriving the relative importance of DRs when several additional factors are considered. Ramanathan and Yunfeng [43] proved that the relative importance values computed by data envelopment analysis (DEA) coincide with traditional QFD calculations when only the ratings of DRs with respect to customer needs are considered, together with a single additional factor, namely cost. Also, Kamvysi et al. [27] discussed the combination of QFD with analytic hierarchy process–analytic network process (AHP–ANP) and DEAHP–DEANP methodologies to prioritize selection criteria in a service context. The objective of this paper is to propose a QFD–imprecise enhanced Russell graph measure (QFD–IERGM) for incorporating criteria such as cost of services and ease of implementation into QFD. The proposed model is applied in an Iranian hospital.

5.
This study first proposes a definition for directional congestion in certain input and output directions within the framework of data envelopment analysis. Second, two methods from different viewpoints are proposed to estimate the directional congestion. Third, we address the relationship between directional congestion and classic (strong or weak) congestion. Finally, we present a case study investigating the analysis performed by the research institutes of the Chinese Academy of Sciences to demonstrate the applicability and usefulness of the methods developed in this study.

6.
We reanalyze data from the observational study by Connors et al. (1996) on the impact of Swan–Ganz catheterization on mortality outcomes. The study by Connors et al. (1996) assumes that there are no unobserved differences between patients who are catheterized and patients who are not catheterized and finds that catheterization increases patient mortality. We instead allow for such differences between patients by implementing both the instrumental variable bounds of Manski (1990), which only exploit an instrumental variable, and the bounds of Shaikh and Vytlacil (2011), which exploit mild nonparametric, structural assumptions in addition to an instrumental variable. We propose and justify the use of indicators of weekday admission as an instrument for catheterization in this context. We find that in our application, the Manski (1990) bounds do not indicate whether catheterization increases or decreases mortality, whereas the Shaikh and Vytlacil (2011) bounds reveal that at least for some diagnoses, Swan–Ganz catheterization reduces mortality at 7 days after catheterization. We show that the bounds of Shaikh and Vytlacil (2011) remain valid under even weaker assumptions than those described in Shaikh and Vytlacil (2011). We also extend the analysis to exploit a further nonparametric, structural assumption (that doctors catheterize individuals with systematically worse latent health) and find that this assumption further narrows these bounds and strengthens our conclusions. In our analysis, we construct confidence regions using the methodology developed in Romano and Shaikh (2008). We show in particular that the confidence regions are uniformly consistent in level over a large class of possible distributions for the observed data that include distributions where the instrument is arbitrarily “weak”.
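The worst-case (no-assumption) bounds that the Manski-style analysis starts from are simple arithmetic on observed probabilities. The sketch below is a generic illustration for a binary outcome with made-up numbers, not the Connors et al. data or the instrumented version of the bounds.

```python
# Worst-case bounds on an average treatment effect E[Y(1)] - E[Y(0)] for a
# binary outcome, with no assumptions on the missing potential outcomes.
# Input probabilities are invented for illustration.
def worst_case_bounds(p_d, p_y1_d1, p_y1_d0):
    """p_d = P(D=1); p_y1_d1 = P(Y=1 | D=1); p_y1_d0 = P(Y=1 | D=0)."""
    lo_y1 = p_y1_d1 * p_d              # unobserved Y(1) set to 0
    hi_y1 = p_y1_d1 * p_d + (1 - p_d)  # unobserved Y(1) set to 1
    lo_y0 = p_y1_d0 * (1 - p_d)
    hi_y0 = p_y1_d0 * (1 - p_d) + p_d
    return lo_y1 - hi_y0, hi_y1 - lo_y0

lo, hi = worst_case_bounds(p_d=0.4, p_y1_d1=0.3, p_y1_d0=0.2)
print(round(lo, 3), round(hi, 3))  # prints -0.4 0.6
```

Note the interval always has width one and so always contains zero; this is why an instrument (here, weekday admission) is needed to narrow the bounds enough to sign the effect.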

7.
Baumol, Panzar, and Willig [1982] introduced the idea that there may be efficiency gain from merger of two or more firms. This paper shows that the Debreu [1951]-Farrell [1957] type index suggested by Färe [1986] to calculate the gain in efficiency from merger of firms can be related to a concentration formula. We then introduce a dominance relation consistent with the ordering generated by the Färe gain function. Next, it is shown that a normalized form of the gain function satisfying a homogeneity condition becomes a subclass of the Hannah-Kay [1977] concentration indices. A limiting case of this subclass becomes the entropy-based index of concentration. The editor of this paper was Rolf Färe. I wish to express sincere gratitude to Shmuel Nitzan for bringing my attention to this problem. Comments and suggestions from Rolf Färe, Alexis Jacquemin, and two anonymous referees are acknowledged with thanks.

8.
We report a surprising link between optimal portfolios generated by a special type of variational preferences called divergence preferences (see Maccheroni et al., 2006) and optimal portfolios generated by classical expected utility. As a special case, we connect optimization of truncated quadratic utility (see Černý, 2003) to the optimal monotone mean–variance portfolios (see Maccheroni et al., 2009), thus simplifying the computation of the latter.

9.
It is well-known that the naive bootstrap yields inconsistent inference in the context of data envelopment analysis (DEA) or free disposal hull (FDH) estimators in nonparametric frontier models. For inference about the efficiency of a single, fixed point, drawing bootstrap pseudo-samples of size m < n provides consistent inference, although coverages are quite sensitive to the choice of subsample size m. We provide a probabilistic framework in which these methods are shown to be valid for statistics that are functions of DEA or FDH estimators. We examine a simple, data-based rule for selecting m suggested by Politis et al. (Stat Sin 11:1105–1124, 2001), and provide Monte Carlo evidence on the size and power of our tests. Our methods (i) allow for heterogeneity in the inefficiency process, and unlike previous methods, (ii) do not require multivariate kernel smoothing, and (iii) avoid the need for solutions of intermediate linear programs.
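The m-out-of-n idea can be sketched generically. Below, the frontier-type statistic is replaced by a simple boundary (maximum) statistic, for which the naive bootstrap also fails; the function names are mine and the data-driven Politis et al. rule for choosing m is not implemented.

```python
# m-out-of-n subsampling confidence interval for a boundary-type statistic.
import numpy as np

def subsample_ci(data, statistic, m, n_boot=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = len(data)
    theta_n = statistic(data)
    # Draw pseudo-samples of size m < n without replacement.
    reps = np.array([
        statistic(rng.choice(data, size=m, replace=False))
        for _ in range(n_boot)
    ])
    # Rescale deviations by the subsample rate (rate m for a max statistic),
    # then invert at the full-sample rate n.
    dev = m * (reps - theta_n)
    q_lo, q_hi = np.quantile(dev, [alpha / 2, 1 - alpha / 2])
    return theta_n - q_hi / n, theta_n - q_lo / n

data = np.random.default_rng(1).uniform(0, 1, 500)
lo, hi = subsample_ci(data, np.max, m=50)
print(lo, hi)  # an interval extending above the sample max, toward the boundary
```

The choice of m matters in exactly the way the abstract describes: too small and the interval is noisy, too close to n and the subsampling distribution degenerates.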

10.
This paper examines the technical efficiency of US Federal Reserve check processing offices over 1980–2003. We extend results from Park et al. [Park, B., Simar, L., Weiner, C., 2000. FDH efficiency scores from a stochastic point of view. Econometric Theory 16, 855–877] and Daouia and Simar [Daouia, A., Simar, L., 2007. Nonparametric efficiency analysis: a multivariate conditional quantile approach. Journal of Econometrics 140, 375–400] to develop an unconditional, hyperbolic, α-quantile estimator of efficiency. Our new estimator is fully non-parametric and robust with respect to outliers; when used to estimate distance to quantiles lying close to the full frontier, it is strongly consistent and converges at rate root-n, thus avoiding the curse of dimensionality that plagues data envelopment analysis (DEA) estimators. Our methods could be used by policymakers to compare inefficiency levels across offices or by managers of individual offices to identify peer offices.

11.
The question of compositional effects (that is, the effect of collective properties of a pupil body on the individual members), or Aggregated Group-Level Effects (AGLEs) as the author prefers to call them, has been the subject of considerable controversy. Some authors, e.g. Rutter et al. [Fifteen thousand hours: Secondary Schools and Their Effects on Children. London: Open Books.], Willms [Oxford Review of Education 11(1): 33–41; (1986). American Sociological Review, 51, 224–241.], Bondi [British Educational Research Journal, 17(3), 203-218.], have claimed to find such effects, while on the other hand Mortimore et al. [School Matters: the Junior Years. Wells: Open Books.] and Thomas and Mortimore [Oxford Review of Education 16(2): 137–158.] did not. Others, for example Hauser [1970], have implied that many apparent AGLEs may be spurious, while Gray et al. [Review of Research in Education, 8, 158–193.] have suggested that at least in certain circumstances such apparent effects may arise as a result of inadequate allowance for pre-existing differences. A possible statistical mechanism for this is outlined in the work of Burstein [In R. Dreeben, & J. A. Thomas (Eds.), The Analysis of Educational Productivity. Volume 1: Issues in Microanalysis, Cambridge, MASS: Ballinger, pp. 119–190] on the effect of aggregating the data when a variable is omitted from the model used. This paper suggests another way in which spurious AGLEs can arise. It shows mathematically that even if there are no omitted variables, measurement error in an explanatory variable could give rise to apparent, but spurious, AGLEs, when analysed using a multilevel modelling procedure. Using simulation methods, it investigates what the practical effects of this are likely to be, and shows that statistically significant spurious effects occur systematically under fairly standard conditions.
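The mechanism can be illustrated with a small simulation (my own toy setup using plain OLS rather than the paper's multilevel models): the true data-generating process has no contextual effect at all, yet measurement error in the pupil-level covariate makes the group mean of the observed covariate appear strongly predictive.

```python
# Spurious aggregated group-level effect induced purely by measurement error.
import numpy as np

rng = np.random.default_rng(42)
G, n_per = 100, 30                        # groups (schools) and pupils per group
mu = rng.normal(0, 1, G)                  # group means of the true covariate
x_true = mu.repeat(n_per) + rng.normal(0, 1, G * n_per)
y = 1.0 * x_true + rng.normal(0, 1, G * n_per)   # NO group-level effect in truth
x_obs = x_true + rng.normal(0, 1, G * n_per)     # covariate measured with error
group = np.arange(G).repeat(n_per)
xbar = np.bincount(group, weights=x_obs) / n_per # observed group means

# OLS of y on [1, x_obs, group-mean(x_obs)]: the individual slope is
# attenuated toward ~0.5, and the group mean soaks up the difference,
# mimicking a "compositional effect" that does not exist.
X = np.column_stack([np.ones_like(y), x_obs, xbar[group]])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta.round(2))
```

Because group means average out the measurement error, the between-group slope is less attenuated than the within-group slope, and their gap surfaces as a spurious AGLE.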

12.
Quanling Wei  Hong Yan 《Socio》2009,43(1):40-54
Our earlier work [Wei QL, Yan H. Congestion and returns to scale in data envelopment analysis. European Journal of Operational Research 2004;153:641–60] discussed necessary and sufficient conditions for the existence of congestion together with aspects of returns to scale under an output-oriented DEA framework. In line with this work, the current paper investigates the issue of “weak congestion”, wherein congestion occurs when the reduction of selected inputs causes some, rather than all, outputs to increase, without a worsening of others. We define output efficiency for decision making units under a series of typical DEA output additive models. Based on this definition, we offer necessary and sufficient conditions for the existence of weak congestion. Numerical examples are provided for purposes of illustration.

13.
In a production technology, the type of returns to scale (RTS) associated with an efficient decision making unit (DMU) is indicative of the direction of marginal rescaling that the DMU should undertake in order to improve its productivity. In this paper a concept of global returns to scale (GRS) is developed as an indicator of the direction in which the most productive scale size (MPSS) of an efficient DMU is achieved. The GRS classes are useful in assisting strategic decisions like those involving mergers of units or splitting into smaller firms. The two characterisations, RTS and GRS, are the same in a convex technology but generally different in a non-convex one. It is shown that, in a non-convex technology, the well-known method of testing RTS proposed by Färe et al. is in fact testing for GRS and not RTS. Further, while there are three types of RTS: constant, decreasing and increasing (CRS, DRS and IRS, respectively), the classification according to GRS includes the fourth type of sub-constant GRS, which describes a DMU able to achieve its MPSS by both reducing and increasing the scale of operations. The notion of GRS is applicable to a wide range of technologies, including the free disposal hull (FDH) and all polyhedral technologies used in data envelopment analysis (DEA).

14.
Stochastic FDH/DEA estimators for frontier analysis (total citations: 2; self-citations: 2; citations by others: 0)
In this paper we extend the work of Simar (J Prod Anal 28:183–201, 2007) introducing noise in nonparametric frontier models. We develop an approach that synthesizes the best features of the two main methods in the estimation of production efficiency. Specifically, our approach first allows for statistical noise, similar to stochastic frontier analysis (and even in a more flexible way), and second, it allows modelling multiple-input multiple-output technologies without imposing parametric assumptions on the production relationship, similar to what is done in non-parametric methods like data envelopment analysis (DEA) and free disposal hull (FDH). The methodology is based on the theory of local maximum likelihood estimation and extends recent works of Kumbhakar et al. (J Econom 137(1):1–27, 2007) and Park et al. (J Econom 146:185–198, 2008). Our method is suitable for modelling and estimation of the marginal effects on the inefficiency level jointly with estimation of the marginal effects of the inputs. The approach is robust to heteroskedastic cases and to various (unknown) distributions of statistical noise and inefficiency, despite assuming simple anchorage models. The method also improves on DEA/FDH estimators by making them quite robust to statistical noise and especially to outliers, which were the main weaknesses of the original DEA/FDH estimators. The procedure shows great performance for various simulated cases and is also illustrated for some real data sets. Even in the single-output case, our simulated examples show that our stochastic DEA/FDH improves on the Kumbhakar et al. (J Econom 137(1):1–27, 2007) method by making the resulting frontier smoother, monotonic and, if desired, concave.

15.
We present a Bayesian approach for analyzing aggregate level sales data in a market with differentiated products. We consider the aggregate share model proposed by Berry et al. [Berry, Steven, Levinsohn, James, Pakes, Ariel, 1995. Automobile prices in market equilibrium. Econometrica. 63 (4), 841–890], which introduces a common demand shock into an aggregated random coefficient logit model. A full likelihood approach is possible with a specification of the distribution of the common demand shock. We introduce a reparameterization of the covariance matrix to improve the performance of the random walk Metropolis for covariance parameters. We illustrate the usefulness of our approach with both actual and simulated data. Sampling experiments show that our approach performs well relative to the GMM estimator even in the presence of a mis-specified shock distribution. We view our approach as useful for those who are willing to trade off one additional distributional assumption for increased efficiency in estimation.
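The random-walk Metropolis sampler that the reparameterization is designed to help can be sketched generically; the paper's covariance reparameterization itself is not reproduced here, and the target below is a toy standard normal rather than a demand-model posterior.

```python
# Generic random-walk Metropolis sampler (illustrative sketch).
import numpy as np

def rw_metropolis(log_post, x0, step, n_iter, seed=0):
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    draws = np.empty(n_iter)
    accepted = 0
    for t in range(n_iter):
        prop = x + step * rng.normal()            # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            x, lp = prop, lp_prop
            accepted += 1
        draws[t] = x
    return draws, accepted / n_iter

# Toy target: N(0,1); discard the first 5000 draws as burn-in.
draws, acc = rw_metropolis(lambda z: -0.5 * z * z, 0.0, step=2.4, n_iter=20000)
print(round(draws[5000:].mean(), 2), round(draws[5000:].var(), 2))  # close to 0 and 1
```

The practical point of a reparameterization in this setting is exactly what tuning `step` illustrates in one dimension: the sampler mixes well only when the proposal geometry matches the posterior's, which is hard to achieve for raw covariance parameters.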

16.
The aim of this study is to confirm the factorial structure of the Identification-Commitment Inventory (ICI) developed within the frame of the Human System Audit (HSA) (Quijano et al. in Revist Psicol Soc Apl 10(2):27–61, 2000; Pap Psicól Revist Col Of Psicó 29:92–106, 2008). Commitment and identification are understood by the HSA at an individual level as part of the quality of human processes and resources in an organization, and therefore as antecedents of important organizational outcomes, such as personnel turnover intentions, organizational citizenship behavior, etc. (Meyer et al. in J Org Behav 27:665–683, 2006). The theoretical integrative model which underlies the ICI (Quijano et al. 2000) was tested in a sample (N = 625) of workers in a Spanish public hospital. Confirmatory factor analysis through structural equation modeling was performed. An elliptical least squares solution was chosen as the estimation procedure on account of the non-normal distribution of the variables. The results confirm the goodness of fit of an integrative model, which underlies the relation between Commitment and Identification, although each one is operatively different.

17.
We prove that, by the method of construction of a coalition production economy due to Sun et al. [Sun, N., Trockel, W., Yang, Z., 2008. Competitive outcomes and endogenous coalition formation in an n-person game. Journal of Mathematical Economics 44, 853–860], every transferable utility (TU) game can be generated by a coalition production economy. Namely, for every TU game, we can construct a coalition production economy that generates the given game. We briefly discuss the relationship between the core of a given TU game and the set of Walrasian payoff vectors for the induced coalition production economy.

18.
We study estimation and inference in cointegrated regression models with multiple structural changes allowing both stationary and integrated regressors. Both pure and partial structural change models are analyzed. We derive the consistency, rate of convergence and the limit distribution of the estimated break fractions. Our technical conditions are considerably less restrictive than those in Bai et al. [Bai, J., Lumsdaine, R.L., Stock, J.H., 1998. Testing for and dating breaks in multivariate time series. Review of Economic Studies 65, 395–432] who considered the single break case in a multi-equations system, and permit a wide class of practically relevant models. Our analysis is, however, restricted to a single equation framework. We show that if the coefficients of the integrated regressors are allowed to change, the estimated break fractions are asymptotically dependent so that confidence intervals need to be constructed jointly. If, however, only the intercept and/or the coefficients of the stationary regressors are allowed to change, the estimates of the break dates are asymptotically independent as in the stationary case analyzed by Bai and Perron [Bai, J., Perron, P., 1998. Estimating and testing linear models with multiple structural changes. Econometrica 66, 47–78]. We also show that our results remain valid, under very weak conditions, when the potential endogeneity of the non-stationary regressors is accounted for via an increasing sequence of leads and lags of their first-differences as additional regressors. Simulation evidence is presented to assess the adequacy of the asymptotic approximations in finite samples.

19.
We consider the ability to detect interaction structure from data in a regression context. We derive an asymptotic power function for a likelihood-based test for interaction in a regression model, with possibly misspecified alternative distribution. This allows a general investigation of different types of interactions which are poorly or well detected via data. Principally we contrast pairwise-interaction models with ‘diffuse interaction models’ as introduced in Gustafson et al. (Stat Med 24:2089–2104, 2005).

20.
In this paper, we implement the conditional difference asymmetry model (CDAS) for square tables with nominal categories, proposed by Tomizawa et al. (J. Appl. Stat. 31(3): 271–277, 2004), using the non-standard log-linear model formulation approach. The implementation is carried out by refitting the model to the 3 × 3 table in Tomizawa et al. (J. Appl. Stat. 31(3): 271–277, 2004). We extend this approach to a larger 4 × 4 table of religious affiliation. We further calculate the measure of asymmetry along with its asymptotic standard error and confidence bounds. The procedure is implemented with SAS PROC GENMOD but can also be implemented in SPSS by following the discussion in Lawal (J. Appl. Stat. 31(3): 279–303, 2004) and Lawal (Qual. Quant. 38(3): 259–289, 2004).


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)