Similar Documents
20 similar documents retrieved.
1.
In response to a question raised by Knox Lovell, we develop a method for estimating directional output distance functions with endogenously determined direction vectors based on exogenous normalization constraints. This is reminiscent of the Russell measure proposed by Färe and Lovell (J Econ Theory 19:150–162, 1978). Moreover it is related to the slacks-based directional distance function introduced by Färe and Grosskopf (Eur J Oper Res 200:320–322, 2010a, Eur J Oper Res 206:702, 2010b). Here we show how to use the slacks-based function to estimate the optimal directions.
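As a rough illustration of the object being estimated, the sketch below computes a standard directional output distance function for one decision-making unit by linear programming under constant returns to scale. The data and the direction vector g are hypothetical, and the slacks-based variant discussed above would replace the single expansion factor beta with one slack per output dimension; the endogenous choice of direction is the paper's contribution and is not attempted here.

```python
# A minimal sketch (not the authors' estimator): the standard directional output
# distance function for one evaluated unit, computed by linear programming under
# constant returns to scale.  X, Y and g are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [3.0], [5.0], [4.0]])   # K firms x J inputs (hypothetical)
Y = np.array([[1.0], [2.5], [4.0], [2.0]])   # K firms x M outputs (hypothetical)
g = np.array([1.0])                          # direction vector for outputs

def directional_output_distance(k0, X, Y, g):
    K, J = X.shape
    M = Y.shape[1]
    # variables: [beta, z_1, ..., z_K]; maximize beta
    c = np.zeros(1 + K)
    c[0] = -1.0
    A_ub, b_ub = [], []
    for m in range(M):   # y_0m + beta*g_m <= sum_k z_k * y_km
        row = np.zeros(1 + K)
        row[0] = g[m]
        row[1:] = -Y[:, m]
        A_ub.append(row); b_ub.append(-Y[k0, m])
    for j in range(J):   # sum_k z_k * x_kj <= x_0j
        row = np.zeros(1 + K)
        row[1:] = X[:, j]
        A_ub.append(row); b_ub.append(X[k0, j])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + K), method="highs")
    return res.x[0]

for k in range(X.shape[0]):
    print(f"firm {k}: beta = {directional_output_distance(k, X, Y, g):.3f}")
```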

2.
Despite the pervasiveness of Six Sigma programs, there is rising concern about how many of them fail. One explanation for many Six Sigma failures could be escalation of commitment. Escalation of commitment refers to the propensity of decision-makers to continue investing in a failing course of action. Many researchers have applied escalation of commitment to explain the behavior of individuals, groups, companies, and nations. Using the escalation of commitment model (Staw and Ross 1987a; Ross and Staw, Acad Manag J 36:701–732, 1993) as a basis, this research describes a Six Sigma failure in an electrical components company. In documenting this failure, this research contributes to both the practice and the theory of Six Sigma. First, in examining the Six Sigma failure, it uncovers important factors for successful implementation, which should improve the practice of Six Sigma. Second, academic research (e.g., Schroeder et al., J Oper Manag 26:536–554, 2008; Zu et al., J Oper Manag 26:630–650, 2008) is engaged in pinning down the definition of Six Sigma and its differences from other improvement programs; this research provides a new direction for that work and has the potential to shape the theory of Six Sigma.

3.
In this paper we consider parametric deterministic frontier models. For example, the production frontier may be linear in the inputs, and the error is purely one-sided, with a known distribution such as exponential or half-normal. The literature contains many negative results for this model. Schmidt (Rev Econ Stat 58:238–239, 1976) showed that the Aigner and Chu (Am Econ Rev 58:826–839, 1968) linear programming estimator was the exponential MLE, but that this was a non-regular problem in which the statistical properties of the MLE were uncertain. Richmond (Int Econ Rev 15:515–521, 1974) and Greene (J Econom 13:27–56, 1980) showed how the model could be estimated by two different versions of corrected OLS, but this did not lead to methods of inference for the inefficiencies. Greene (J Econom 13:27–56, 1980) considered conditions on the distribution of inefficiency that make this a regular estimation problem, but many distributions that would be assumed do not satisfy these conditions. In this paper we show that exact (finite sample) inference is possible when the frontier and the distribution of the one-sided error are known up to the values of some parameters. We give a number of analytical results for the case of intercept only with exponential errors. In other cases that include regressors or error distributions other than exponential, exact inference is still possible but simulation is needed to calculate the critical values. We also discuss the case that the distribution of the error is unknown. In this case asymptotically valid inference is possible using subsampling methods.
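To make the flavor of the exact-inference result concrete, the sketch below treats the simplest case mentioned above, intercept only with exponential errors, and additionally assumes the exponential rate is known, which the paper does not require; with unknown parameters or regressors, simulated critical values would be needed.

```python
# A minimal sketch of exact finite-sample inference in the intercept-only
# deterministic frontier model y_i = alpha - u_i with u_i ~ Exp(rate theta).
# theta is treated as known here purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
alpha, theta, n, level = 5.0, 2.0, 50, 0.95

def exact_ci(y, theta, level):
    # alpha_hat = max(y) is the exponential MLE (the Aigner-Chu LP estimator);
    # alpha - alpha_hat = min_i u_i ~ Exp(n * theta) exactly, which is pivotal.
    a_hat = y.max()
    q = -np.log(1 - level) / (len(y) * theta)
    return a_hat, a_hat + q

# Monte Carlo check of the exact coverage
cover = 0
for _ in range(5000):
    y = alpha - rng.exponential(1 / theta, size=n)
    lo, hi = exact_ci(y, theta, level)
    cover += (lo <= alpha <= hi)
print("empirical coverage:", cover / 5000)   # should be close to 0.95
```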

4.
In this paper, we discuss in a general framework the design-based estimation of population parameters when sensitive data are collected by randomized response techniques. We show in close detail the procedure for estimating the distribution function of a sensitive quantitative variable and how to estimate simultaneously the population prevalence of individuals bearing a stigmatizing attribute and the distribution function for the members belonging to the hidden group. The randomized response devices by Greenberg et al. (J Am Stat Assoc 66:243–250, 1971), Franklin (Commun Stat Theory Methods 18:489–505, 1989), and Singh et al. (Aust NZ J Stat 40:291–297, 1998) are here considered as data-gathering tools.
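The sketch below simulates the unrelated-question device of Greenberg et al. and the corresponding design-based prevalence estimator; all numerical values are hypothetical, and the paper's framework for estimating full distribution functions is more general than this single-proportion example.

```python
# A minimal sketch of the unrelated-question randomized response design:
# with probability p a respondent answers the sensitive question, otherwise an
# innocuous question with known prevalence pi_y.  All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n, p, pi_a, pi_y = 2000, 0.7, 0.15, 0.5   # true sensitive prevalence pi_a

sensitive = rng.random(n) < pi_a          # true (unobserved) sensitive status
innocuous = rng.random(n) < pi_y          # answer to the unrelated question
asked_sensitive = rng.random(n) < p       # outcome of the randomizing device
answer = np.where(asked_sensitive, sensitive, innocuous)

lam_hat = answer.mean()                       # observed proportion of "yes"
pi_a_hat = (lam_hat - (1 - p) * pi_y) / p     # design-based (moment) estimator
var_hat = lam_hat * (1 - lam_hat) / (n * p**2)
print(f"estimate {pi_a_hat:.3f}  (true {pi_a}),  s.e. {np.sqrt(var_hat):.3f}")
```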

5.
Based on the seminal paper of Farrell (J R Stat Soc Ser A (General) 120(3):253–290, 1957), researchers have developed several methods for measuring efficiency. Nowadays, the most prominent representatives are nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA), both introduced in the late 1970s. Researchers have been attempting to develop a method which combines the virtues—both nonparametric and stochastic—of these “oldies”. The recently introduced stochastic non-smooth envelopment of data (StoNED) of Kuosmanen and Kortelainen (J Prod Anal 38(1):11–28, 2012) is one such promising method. This paper compares the StoNED method with the two “oldies” DEA and SFA and extends the initial Monte Carlo simulation of Kuosmanen and Kortelainen (J Prod Anal 38(1):11–28, 2012) in several directions. We show, among other things, that in scenarios without noise the rivalry is still between the “oldies”, while in noisy scenarios the nonparametric StoNED PL estimator constitutes a promising alternative to the SFA ML estimator.
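The sketch below shows the kind of data-generating process such Monte Carlo comparisons rely on, a log-linear frontier with one-sided inefficiency and an optional noise term, and uses plain corrected OLS purely as a stand-in estimator; it is none of DEA, SFA ML, or StoNED, whose implementations are considerably more involved.

```python
# A minimal sketch of a Monte Carlo scenario of this kind: log-linear frontier,
# half-normal inefficiency, optional noise.  The estimator is plain corrected
# OLS (COLS), used only as a stand-in for the methods compared in the paper.
import numpy as np

rng = np.random.default_rng(2)

def simulate_and_cols(n=200, sigma_u=0.3, sigma_v=0.0):
    x = rng.uniform(1, 10, n)
    u = np.abs(rng.normal(0, sigma_u, n))          # one-sided inefficiency
    v = rng.normal(0, sigma_v, n)                  # noise (0 => noiseless scenario)
    ln_y = 1.0 + 0.5 * np.log(x) - u + v
    # COLS: OLS fit, then shift the intercept up to the largest residual
    Xmat = np.column_stack([np.ones(n), np.log(x)])
    beta = np.linalg.lstsq(Xmat, ln_y, rcond=None)[0]
    resid = ln_y - Xmat @ beta
    u_hat = resid.max() - resid
    return np.mean((u_hat - u) ** 2)               # MSE of inefficiency estimates

print("noiseless scenario, MSE:", simulate_and_cols(sigma_v=0.0))
print("noisy scenario,     MSE:", simulate_and_cols(sigma_v=0.2))
```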

6.
This study examines the influence of several ex-ante factors on three-year market-adjusted returns of two-stage carve-out combinations from 1988 to 2006. We observe that several factors maintain their significance over a three-year period after equity carve-out ex-dates. Also, we report that, contrary to Vijh (J Bus 75(1):153–190, 1999), negative three-year carve-out returns are statistically significant. In addition, we note that negative combination carve-out/spin-off three-year returns are higher than those of carve-outs acquired by third parties or reacquired by their parents. Moreover, we observe that our independent variables explain 14.56 % of the variation in three-year carve-out returns in the multiple regression. Also, our negative correlation of three-year returns with initial period returns supports the “leaning against the wind” hypothesis of Loughran and Ritter (Rev Financ Stud 15(2):413–443, 2002). In addition, our results for the post-bubble period (2001–2006) provide an extension of the changing issuer objective function noted by Loughran and Ritter (Financ Manage 35(3):23–51, 2004) for IPOs and Hogan and Olson (J Financ Res 27(4):521–537, 2004) for equity carve-outs.
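For readers unfamiliar with the dependent variable, the following sketch computes a three-year market-adjusted (buy-and-hold abnormal) return from hypothetical monthly return series; the study's actual return construction may differ in detail.

```python
# A minimal sketch of a three-year market-adjusted (buy-and-hold abnormal)
# return; the monthly return series are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
months = 36
firm_ret = rng.normal(0.01, 0.08, months)   # monthly carve-out returns
mkt_ret = rng.normal(0.008, 0.05, months)   # monthly market index returns

bhar = np.prod(1 + firm_ret) - np.prod(1 + mkt_ret)   # market-adjusted 3-year return
print(f"three-year market-adjusted return: {bhar:.3f}")
```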

7.
This paper applies the probabilistic approach developed by Daraio and Simar (J Prod Anal 24:93–121, 2005, Advanced robust and nonparametric methods in efficiency analysis. Springer Science, New York, 2007a, J Prod Anal 28:13–32, 2007b) in order to develop conditional and unconditional data envelopment analysis (DEA) models for the measurement of countries' environmental efficiency levels for a sample of 110 countries in 2007. In order to capture the effect of countries' compliance with the Kyoto Protocol agreement (KPA) policies, we condition first on the number of years between a country's signing of the KPA and 2007, and second on the country's obliged percentage level of emission reductions. In particular, various DEA models are applied alongside bootstrap techniques in order to determine the effect of the KPA on countries' environmental efficiencies. The study illustrates how the recent developments in efficiency analysis and statistical inference can be applied when evaluating environmental performance issues. The results indicate a nonlinear relationship between countries' obliged percentage levels of emission reductions and their environmental efficiency levels. Finally, a similar nonlinear relationship is also recorded between the time since a country signed the KPA and its environmental efficiency levels.

8.
We consider multiple-principal multiple-agent models of moral hazard: principals compete through mechanisms in the presence of agents who take unobservable actions. In this context, we provide a rationale for restricting principals to make use of simple mechanisms, which correspond to direct mechanisms in the standard framework of Myerson (J Math Econ 10:67–81, 1982). Our results complement those of Han (J Econ Theory 137(1):610–626, 2007) who analyzes a complete information setting where agents' actions are fully contractible.

9.
This paper shows how scale efficiency can be measured from an arbitrary parametric hyperbolic distance function with multiple outputs and multiple inputs. It extends the methods introduced by Ray (J Product Anal 11:183–194, 1998), Balk (J Product Anal 15:159–183, 2001), and Ray (2003), which measure scale efficiency from a single-output, multi-input distance function and from a multi-output, multi-input distance function, respectively. The method developed in the present paper differs from Ray’s and Balk’s in that it allows for simultaneous contraction of inputs and expansion of outputs. Theorems applicable to an arbitrary parametric hyperbolic distance function are introduced first, and then their use in measuring scale efficiency is illustrated with the translog functional form.
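For orientation, the hyperbolic (graph) distance function behind the simultaneous contraction and expansion described above can be written as

\[ D_H(x,y) \;=\; \inf\{\lambda>0 : (\lambda x,\, y/\lambda)\in T\}, \]

and one common way to read off scale efficiency from such a measure (not necessarily the paper's exact construction) is the ratio of the value computed under a constant-returns-to-scale reference technology to the value computed under the technology itself, \(SE(x,y)=D_H^{CRS}(x,y)/D_H(x,y)\).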

10.
While the high prevalence of mental illness in workplaces is more readily documented in the literature than it was ten or so years ago, that literature remains largely confined to the medical and health sciences. This may account for the lack of awareness of mental illness in workplaces among operational managers and human resource departments (Dewa et al. Healthcare Papers 5:12–25, 2004), even though such illnesses affect on average 17 % to 20 % of employees in any 12-month period (MHCC 2012; SAMHSA 2010; ABS 2007). As symptoms of mental illness can impact negatively on employee work performance and/or attendance, the ramifications for employee performance management systems can be significant, particularly when employees choose to deliberately conceal their illness, so that any work concerns appear to derive from issues other than illness (Dewa et al. Healthcare Papers 5:12–25, 2004; De Lorenzo 2003). When employee non-disclosure of a mental illness impacts negatively on the workplace, it presents a very challenging performance-management issue for both operational managers and human resource staff. Without documented medical evidence to show that impaired work performance and/or attendance is attributable to a mental illness, performance management policies are typically brought into place to improve employee performance and/or attendance by establishing achievable employee targets. Yet, given that in any 12-month period at least a fifth of the workforce sustains a mental illness (MHCC 2012; SAMHSA 2010; ABS 2007), and that non-disclosure is significant (Barney et al. BMC Public Health 9:1–11, 2009; Munir et al. Social Science & Medicine 60:1397–1407, 2005), such targets may be unachievable for employees with a hidden mental illness. It is for these reasons that this paper reviews the incidence of mental illness in western economies, its costs, and the reasons why it is often concealed, and proposes the adoption of what are termed ‘Buffer Stage’ policies as an added tool that organisations may wish to utilise in the management of hidden medical illnesses such as mental illness.

11.
Bing Guo, Qi Zhou, Runchu Zhang. Metrika (2014) 77(6):721–732
Zhang et al. (Stat Sinica 18:1689–1705, 2008) introduced an aliased effect-number pattern for two-level regular designs and proposed a general minimum lower-order confounding (GMC) criterion for choosing optimal designs. All the GMC \(2^{n-m}\) designs with \(N/4+1\le n\le N-1\) were constructed by Li et al. (Stat Sinica 21:1571–1589, 2011), Zhang and Cheng (J Stat Plan Inference 140:1719–1730, 2010) and Cheng and Zhang (J Stat Plan Inference 140:2384–2394, 2010), where \(N=2^{n-m}\) is the run number and \(n\) is the factor number. In this paper, we first study some further properties of GMC designs, and then construct all the GMC \(2^{n-m}\) designs with \(n\le N-1\) in each of the three parameter cases: (i) \(m\le 4\); (ii) \(m\ge 5\) and \(n=(2^m-1)u+r\) for \(u>0\) and \(r=0,1,2\); and (iii) \(m\ge 5\) and \(n=(2^m-1)u+r\) for \(u\ge 0\) and \(r=2^m-3,2^m-2\).
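As a concrete reminder of what a regular \(2^{n-m}\) design is, the sketch below builds the textbook \(2^{5-2}\) design from the generators D = AB and E = AC; selecting generators that are GMC-optimal is precisely what the constructions in the paper settle and is not attempted here.

```python
# A minimal sketch of building a regular 2^(n-m) design from its generators:
# here the textbook 2^(5-2) design with defining relation I = ABD = ACE.
import itertools
import numpy as np

k = 3                                   # number of basic factors (n - m)
basic = np.array(list(itertools.product([-1, 1], repeat=k)))   # 2^k runs: A, B, C
D = basic[:, 0] * basic[:, 1]           # generator D = AB
E = basic[:, 0] * basic[:, 2]           # generator E = AC
design = np.column_stack([basic, D, E]) # the 8-run, 5-factor design matrix
print(design)
```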

12.
The purpose of this note is twofold. First, we survey the study of the percolation phase transition on the Hamming hypercube $\{0,1\}^{m}$ obtained in the series of papers (Borgs et al. in Random Struct Algorithms 27:137–184, 2005; Borgs et al. in Ann Probab 33:1886–1944, 2005; Borgs et al. in Combinatorica 26:395–410, 2006; van der Hofstad and Nachmias in Hypercube percolation, Preprint 2012). Second, we explain how this study can be performed without the use of the so-called “lace expansion” technique. To that aim, we provide a novel simple proof that the triangle condition holds at the critical probability.
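For reference, the triangle condition referred to above is usually stated through the open triangle diagram

\[ \nabla_p \;=\; \sum_{x,y\in\{0,1\}^m} \mathbb{P}_p(0\leftrightarrow x)\,\mathbb{P}_p(x\leftrightarrow y)\,\mathbb{P}_p(y\leftrightarrow 0), \]

and asks that this quantity stay suitably bounded at the critical probability (on a finite graph such as the hypercube, the appropriate finite-size analogue of boundedness is used); the note's contribution is a lace-expansion-free proof that this condition holds.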

13.
This paper develops a parametric decomposition framework of labor productivity growth that relaxes the assumption of labor-specific efficiency. The decomposition analysis is applied to a sample of 121 developed and developing countries over the 1970–2007 period, drawn from the recently updated Penn World Tables and the Barro and Lee (A new data set of educational attainment in the world 1950–2010. NBER Working Paper No. 15902, 2010) educational databases. A generalized Cobb–Douglas functional specification is used, taking into account differences in technological structures across groups of countries, to approximate aggregate production technology using the Jorgenson and Nishimizu (Econ J 88:707–726, 1978) bilateral model of production. The measurement of labor efficiency is based on Kopp’s (Quart J Econ 96:477–503, 1981) orthogonal non-radial index of factor-specific efficiency, modified in a parametric frontier framework. The empirical results indicate that the weighted average annual rate of labor productivity growth was 1.239 % over the period analyzed. Technical change was found to be the driving force of labor productivity growth, while improvements in human capital and factor intensities account for 19.5 % and 12.4 % of that growth, respectively. Finally, labor efficiency improvements contributed 9.8 % to measured labor productivity growth.
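As a stylized illustration of the components named in these results (and not the paper's exact bilateral parametric specification), a Cobb–Douglas style decomposition of labor productivity growth reads

\[ \Delta\ln(Y/L) \;\approx\; \underbrace{\Delta\ln A}_{\text{technical change}} \;+\; \underbrace{\alpha\,\Delta\ln(K/L)}_{\text{factor intensity}} \;+\; \underbrace{\beta\,\Delta\ln h}_{\text{human capital}} \;+\; \underbrace{\Delta\ln TE_L}_{\text{labor efficiency}}, \]

so the percentages reported above are the shares of measured labor productivity growth attributed to each such term.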

14.
Hitchcock (Synthese 97:335–364, 1993) argues that the ternary probabilistic theory of causality meets two problems due to the problem of disjunctive factors, while arguing that the unanimity probabilistic theory of causality, which is founded on the binary contrast, does not meet them. Hitchcock also argues that only the ternary theory conveys the information about complex relations of causal relevance. In this paper, I show that Eells’ solution (Probabilistic causality, Cambridge University Press, Cambridge, 1991), which is founded on the unanimity theory, meets the two problems. I also show that the unanimity theory, too, reveals complex relations of causal relevance. I conclude that the two probabilistic theories of causality carve up the same causal structure in two formally different and conceptually consistent ways. Hitchcock’s ternary theory has inspired several major philosophers (Maslen, Causation and counterfactuals, pp. 341–357. MIT Press, Cambridge, 2004; Schaffer, Philos Rev 114:297–328, 2005; Northcott, Phil Stud 139:111–123, 2007; Hausman, The place of probability in science: In honor of Ellery Eells (1953–2006), pp. 47–64, Springer, Dordrecht, 2010) who have recently developed the ternary theory or the quaternary theory. This paper leads them to reconsider the relation between the ternary theory and the binary theory.
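For readers who want the two contrasts side by side: on the unanimity (binary) theory, C is a positive cause of E just in case

\[ P(E \mid C \wedge K_i) \;>\; P(E \mid \neg C \wedge K_i) \quad \text{for every causal background context } K_i, \]

whereas the ternary theory relativizes causal relevance to a specific alternative \(C'\), comparing \(P(E \mid C \wedge K_i)\) with \(P(E \mid C' \wedge K_i)\). This is the standard textbook formulation rather than a quotation from either author.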

15.
Sangun Park. Metrika (2014) 77(5):609–616
The representation of the entropy in terms of the hazard function, and its extensions, have been studied by many authors, including Teitler et al. (IEEE Trans Reliab 35:391–395, 1986). In this paper, we consider a representation of the Kullback–Leibler information of the first \(r\) order statistics in terms of the relative risk (Park and Shin in Statistics, 2012), the ratio of hazard functions, and extend it to progressively Type II censored data. We then study how the Kullback–Leibler information of the first \(r\) order statistics changes with \(r\) and discuss its relation to the Fisher information in order statistics.

16.
Social scientists often consider multiple empirical models of the same process. When these models are parametric and non-nested, the null hypothesis that two models fit the data equally well is commonly tested using methods introduced by Vuong (Econometrica 57(2):307–333, 1989) and Clarke (Am J Political Sci 45(3):724–744, 2001; J Confl Resolut 47(1):72–93, 2003; Political Anal 15(3):347–363, 2007). The objective of each is to compare the Kullback–Leibler divergence (KLD) of the two models from the true model that generated the data. Here we show that both of these tests are based upon a biased estimator of the KLD, the individual log-likelihood contributions, and that the Clarke test is not proven to be consistent for the difference in KLDs. As a solution, we derive a test based upon cross-validated log-likelihood contributions, which represent an unbiased KLD estimate. We demonstrate the superior performance of this cross-validated test (CVDM) via simulation, then apply it to two empirical examples from political science. We find that the test's selections can diverge from those of the Vuong and Clarke tests and that this can ultimately lead to differences in substantive conclusions.
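The sketch below illustrates the idea in the simplest possible setting: two non-nested parametric models are compared through leave-one-out cross-validated log-likelihood contributions and a paired test on their difference. The exact construction and bias correction of the CVDM test in the paper may differ, and the models and data here are hypothetical.

```python
# A minimal sketch in the spirit of the proposal above: leave-one-out
# cross-validated log-likelihood contributions for two non-nested models,
# followed by a paired z-type test on their difference.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
y = rng.lognormal(mean=0.0, sigma=0.8, size=150)   # data actually log-normal

def loo_contributions(y, fit, logpdf):
    out = np.empty(len(y))
    for i in range(len(y)):
        train = np.delete(y, i)
        params = fit(train)                 # fit on the training fold only
        out[i] = logpdf(y[i], params)       # evaluate on the held-out point
    return out

# Model A: exponential;  Model B: log-normal (both fit by ML on each fold)
ll_a = loo_contributions(y, fit=lambda t: t.mean(),
                         logpdf=lambda yi, m: stats.expon.logpdf(yi, scale=m))
ll_b = loo_contributions(y, fit=lambda t: (np.log(t).mean(), np.log(t).std()),
                         logpdf=lambda yi, p: stats.lognorm.logpdf(yi, s=p[1],
                                                                   scale=np.exp(p[0])))

d = ll_b - ll_a
z = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))   # paired test on the KLD difference
print(f"mean CV log-likelihood difference {d.mean():.3f}, z = {z:.2f}")
```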

17.
Qiqing Yu, Yuting Hsu, Kai Yu. Metrika (2014) 77(8):995–1011
The non-parametric likelihood L(F) for censored data, including univariate or multivariate right-censored, doubly-censored, interval-censored, or masked competing risks data, was proposed by Peto (Appl Stat 22:86–91, 1973). It does not involve the censoring distributions. In the literature, several noninformative conditions have been proposed to justify L(F) so that the GMLE can be consistent (see, for example, Self and Grossman in Biometrics 42:521–530, 1986, or Oller et al. in Can J Stat 32:315–326, 2004). We present the necessary and sufficient (N&S) condition under which \(L(F)\) is equivalent to the full likelihood under the non-parametric set-up. The statement is false under the parametric set-up. Our condition is slightly different from the noninformative conditions in the literature. We present two applications to our cancer research data that satisfy the N&S condition but have dependent censoring.
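The sketch below specializes Peto's likelihood to the plain right-censored case, where events contribute the mass F({t_i}) and censored observations contribute the survival probability 1 − F(c_i); its maximizer is the familiar Kaplan–Meier estimator. The paper's setting (interval censoring, masked competing risks, the N&S condition) is substantially more general, and the data here are toy values.

```python
# A minimal sketch for the right-censored special case of L(F):
# L(F) = prod over events of F({t_i}) * prod over censored of (1 - F(c_i)),
# whose maximizer is the Kaplan-Meier product-limit estimator.
import numpy as np

t = np.array([2.0, 3.0, 3.0, 5.0, 8.0, 9.0])   # observed times (hypothetical)
d = np.array([1,   1,   0,   1,   0,   1  ])   # 1 = event, 0 = right-censored

order = np.argsort(t, kind="stable")
t, d = t[order], d[order]
n = len(t)
surv, s = [], 1.0
for i in range(n):
    at_risk = n - i
    if d[i] == 1:
        s *= 1.0 - 1.0 / at_risk          # Kaplan-Meier product-limit step
    surv.append((t[i], s))                # estimated S(t) just after time t[i]
print(surv)
```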

18.
This paper investigates the existence of heterogeneous technologies in the US commercial banking industry using the nondynamic panel threshold effects estimation technique proposed by Hansen (Econometrica 64:413–430, 1999; Econometrica 68:575–603, 2000a). We employ total assets as the threshold variable, which is typically considered a proxy for bank size in the banking literature. We modify the threshold effects model to allow for time-varying effects, which are modeled by a time polynomial of degree two as in the Cornwell et al. (J Econom 46:185–200, 1990) model. Threshold effects estimation allows us to sort banks into discrete groups based on their size in a structural and consistent manner. We identify seven such distinct technology groups within which banks are allowed to share the same technology parameters. We provide estimates of individual and group efficiency scores, as well as of returns to scale and measures of technological change for each group. The presence of the threshold(s) is tested via the bootstrap procedure outlined in Hansen (Econometrica 64:413–430, 1999), and the relationship between bank size and efficiency ratios is investigated.
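A minimal cross-sectional illustration of threshold-effects estimation in the spirit of Hansen is sketched below: the threshold is chosen by grid search to minimize the sum of squared residuals of a two-regime regression. The paper's panel version with bank fixed effects, a time polynomial, multiple thresholds, and bootstrap inference is considerably richer; all data here are simulated.

```python
# A minimal sketch of threshold estimation by grid search over the threshold
# variable q (e.g. total assets), with different slopes in the two regimes.
import numpy as np

rng = np.random.default_rng(5)
n = 400
q = rng.uniform(0, 10, n)                   # threshold variable
x = rng.normal(0, 1, n)
y = np.where(q <= 4.0, 0.5 * x, 2.0 * x) + rng.normal(0, 0.5, n)

def ssr_at(gamma):
    ssr = 0.0
    for mask in (q <= gamma, q > gamma):
        if mask.sum() < 10:                  # trimming: keep enough obs per regime
            return np.inf
        X = np.column_stack([np.ones(mask.sum()), x[mask]])
        beta = np.linalg.lstsq(X, y[mask], rcond=None)[0]
        ssr += np.sum((y[mask] - X @ beta) ** 2)
    return ssr

grid = np.quantile(q, np.linspace(0.05, 0.95, 91))
gamma_hat = grid[np.argmin([ssr_at(g) for g in grid])]
print("estimated threshold:", round(gamma_hat, 2))   # should be near 4.0
```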

19.
We study a keyword auction model where bidders have constrained budgets. In the absence of budget constraints, Edelman et al. (Am Econ Rev 97(1):242–259, 2007) and Varian (Int J Ind Organ 25(6):1163–1178, 2007) analyze “locally envy-free equilibrium” or “symmetric Nash equilibrium” bidding strategies in generalized second-price auctions. However, bidders often have to set their daily budgets when they participate in an auction; once a bidder’s payment reaches his budget, he drops out of the auction. This raises an important strategic issue that has been overlooked in the previous literature: bidders may change their bids to inflict higher prices on their competitors because, under the generalized second-price rule, the per-click price paid by a bidder is the next highest bid. We provide budget thresholds under which the equilibria analyzed in Edelman et al. (Am Econ Rev 97(1):242–259, 2007) and Varian (Int J Ind Organ 25(6):1163–1178, 2007) are sustained as “equilibria with budget constraints” in our setting. We then consider a simple environment with one position and two bidders and show that a search engine’s revenue with budget constraints may be larger than its revenue without budget constraints.
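The sketch below works through generalized second-price pricing with daily budgets in a toy setting: bidders are ranked by bid, each winner's per-click price is the next highest bid, and a bidder whose cumulative payment reaches the budget drops out before the next period. Bids, budgets, and click-through rates are hypothetical, and within-period budget overshoot is ignored for simplicity.

```python
# A minimal sketch of generalized second-price pricing with daily budgets.
# Drop-out is checked between periods, so a bidder can overshoot the budget
# within a period in this toy model.
bids = {"a": 3.0, "b": 2.0, "c": 1.0}          # per-click bids (hypothetical)
budgets = {"a": 250.0, "b": 150.0, "c": 100.0}  # daily budgets (hypothetical)
ctr = [100, 60]                                 # expected clicks per position
spent = {k: 0.0 for k in bids}

def run_period(active):
    ranking = sorted(active, key=lambda k: bids[k], reverse=True)
    for pos, bidder in enumerate(ranking[:len(ctr)]):
        # GSP rule: price per click = next highest bid (0 if no one below)
        price = bids[ranking[pos + 1]] if pos + 1 < len(ranking) else 0.0
        spent[bidder] += price * ctr[pos]

for period in range(3):
    active = [k for k in bids if spent[k] < budgets[k]]   # exhausted bidders drop out
    run_period(active)
    print(f"after period {period + 1}: {spent}")
```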

20.
Xin Liu, Rong-Xian Yue. Metrika (2013) 76(4):483–493
This paper considers the optimal design problem for multiresponse regression models. The $R$-optimality introduced by Dette (J R Stat Soc B 59:97–110, 1997) for single-response experiments is extended to the case of multiresponse parameter estimation. A general equivalence theorem for $R$-optimality is provided for multiresponse models. Illustrative examples of $R$-optimal designs for two multiresponse models are presented based on the general equivalence theorem.
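One common statement of the single-response $R$-criterion, which the paper extends to the multiresponse case, is that the $R$-optimal design minimizes the product of the diagonal elements of the inverse information matrix,

\[ \xi_R^{*} \;=\; \arg\min_{\xi}\ \prod_{i=1}^{p}\bigl(M^{-1}(\xi)\bigr)_{ii}, \]

which corresponds to minimizing the volume of the rectangular Bonferroni confidence region for the parameters; the precise multiresponse formulation is the paper's own contribution and is not reproduced here.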
