Similar Documents
1.
Despite the pervasiveness of Six Sigma programs, there is rising concern regarding the failure of many of them. One explanation for many Six Sigma failures could be escalation of commitment: the propensity of decision-makers to continue investing in a failing course of action. Many researchers have applied escalation of commitment to explain the behavior of individuals, groups, companies, and nations. Using the escalation of commitment model (Staw and Ross 1987a; Ross and Staw Acad. Manag. J. 36:701–732, 1993) as a basis, this research describes a Six Sigma failure in an electrical components company. In documenting this failure, this research contributes to both the practice and the theory of Six Sigma. First, in examining the Six Sigma failure, it uncovers factors important for successful implementation, which should improve the practice of Six Sigma. Second, academic research (e.g., Schroeder et al. J. Oper. Manag. 26:536–554, 2008; Zu et al. J. Oper. Manag. 26:630–650, 2008) is engaged in pinning down the definition of Six Sigma and its differences from other improvement programs. This research provides a new direction for that work and has the potential to shape the theory of Six Sigma.

2.
This study follows the structure of Grifell-Tatjé and Lovell (Manag Sci 45:1177–1193, 1999) and uses a non-parametric approach to decompose the change in profit of Taiwanese banks into various drivers. Risk, however, has not been considered in previous papers on profit decomposition, and ignoring it biases the empirical decomposition of the change in profit. In fact, risk is a joint but undesirable output which, under various regulations, cannot be freely disposed of. Non-performing loans (NPLs) are employed as the risk indicator for decomposing the change in profit in this study. This study also performs a three-way comparison among (1) the original Grifell-Tatjé and Lovell (1999) analysis (OGLA) model, which ignores NPLs, (2) the extended Grifell-Tatjé and Lovell analysis (EGLA) model, which builds on the OGLA model and incorporates NPLs, and (3) the directional distance function (DDF) model, which is based on Juo et al. (Omega 40:550–561, 2012) and incorporates NPLs, to see whether incorporating the undesirable output matters. The decomposition of the change in profit under the three models is then illustrated using Taiwanese banks over the period 2006–2010.

3.
This paper investigates the existence of heterogeneous technologies in the US commercial banking industry through the nondynamic panel threshold effects estimation technique proposed by Hansen (Econometrica 64:413–430, 1999; Econometrica 68:575–603, 2000a). We employ total assets as the threshold variable, typically considered a proxy for bank size in the banking literature. We modify the threshold effects model to allow for time-varying effects, modeled by a time polynomial of degree two as in the Cornwell et al. (J Econom 46:185–200, 1990) model. Threshold effects estimation allows us to sort banks into discrete groups based on their size in a structural and consistent manner. We identify seven distinct technology groups within which banks are allowed to share the same technology parameters. We provide estimates of individual and group efficiency scores, as well as estimates of returns to scale and measures of technological change for each group. The presence of the threshold(s) is tested via the bootstrap procedure outlined in Hansen (Econometrica 64:413–430, 1999), and the relationship between bank size and efficiency ratios is investigated.
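As a rough illustration of the threshold idea (a sketch, not the paper's actual panel estimator), a single threshold in a size variable can be found by grid search: split the sample at each candidate value and keep the split that minimizes the pooled sum of squared residuals of regime-wise OLS fits. The function name and the toy data below are hypothetical.

```python
import numpy as np

def threshold_split(x, q, y):
    """Grid-search a single threshold in q: fit a separate OLS regression
    of y on x on each side of every candidate split and keep the split
    that minimises the pooled sum of squared residuals."""
    def ssr(mask):
        if mask.sum() < 3:               # require a minimum regime size
            return np.inf
        X = np.column_stack([np.ones(mask.sum()), x[mask]])
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        return float(((y[mask] - X @ beta) ** 2).sum())
    candidates = np.unique(q)[1:-1]      # interior candidate thresholds
    return min(candidates, key=lambda g: ssr(q <= g) + ssr(q > g))

# toy data: the slope changes when the "size" variable q crosses 5
rng = np.random.default_rng(0)
q = rng.uniform(0, 10, 200)
x = rng.normal(size=200)
y = np.where(q <= 5, 1.0 * x, 3.0 * x) + 0.1 * rng.normal(size=200)
gamma_hat = threshold_split(x, q, y)     # should land near the true threshold 5
```

Hansen's actual procedure adds a bootstrap test for the significance of the threshold; the grid search above only locates it.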

4.
This paper first examines three sets of bivariate cointegrations between any two of current accounts, stock markets, and currency exchange markets in 10 Asian countries. It then examines the effect of country characteristics on these bivariate cointegrations. Our findings suggest that, for each of the three sets of cointegration tests, every sample country exhibits at least one cointegrating relationship. India consistently exhibited a bi-directional causal relationship between any two of the three indicators. Unlike Pan et al. (Int Rev Econ Financ 16:503–520, 2007) and Phylaktis and Ravazzolo (J Int Money Financ 24:1031–1053, 2005), this study found that such cointegration is influenced by three characteristics: capital controls; flexibility in foreign exchange rates; and the ratio of trade to GDP. These characteristics are the result of liberalization in each Asian country, which implies that liberalization policies are effective in improving the cointegration between any two of the financial markets and the current account in the 10 Asian countries.

5.
The article reports the results of a Mokken Scale Procedure (MSP) developing a hierarchical cross-national scale to measure xenophobia, and a qualitative validation of this scale. A pool of 30 xenophobic scale items was collected from several sources and edited according to established unidimensional criteria. The survey was administered to 608 undergraduate students in the USA, 193 in the Netherlands, and 303 in Norway. Fourteen scale statements measuring perceived threat or fear and meeting the criteria of the Stereotype Content Model (e.g., Fiske et al. in Trends Cogn Sci 11:77–83, 2006) were selected for further analysis. A separate item analysis and a subsequent MSP analysis yielded a cumulative scale with the same five items for each of the three samples, meeting the criterion for homogeneity in all samples with H > .40. The result, a cross-national 5-item scale measuring fear-based xenophobia, was tested by means of the Three-Step Test-Interview (Hak et al. in Surv Res Methods 2:143–150, 2008) with 10 students in the Netherlands and 10 students in Norway. The analysis of these qualitative interviews shows that individual respondents' criteria for ranking the scale items strongly depend on the way immigrants are framed. Ranking according to different levels of fear turned out to be only one of several criteria used by individual respondents.

6.
In this paper we explore properties of different orders of one-sided scale elasticities in multi-input multi-output production using the theoretical framework developed by Hadjicostas and Soteriou (Eur J Oper Res 168:425–449, 2006), Krivonozhko et al. (J Oper Res Soc 55:1049–1058, 2004), and others. That framework includes as a special case the well-known operations research method of data envelopment analysis (DEA). A special case of the theory in this paper is the Banker-Morey (Oper Res 34:513–521, 1986a) DEA model for data that include both discretionary and non-discretionary inputs and outputs. Several inequalities among different orders of one-sided scale elasticities are presented. An example is used to illustrate many of the results and ideas of the paper. Finally, we show how the theory and results of this paper can be used to shed some light on implicit Hicks input technical change.

7.
The aim of the present study is to define quality entropy and to illustrate some of its properties. The simulation of the mathematical model for quality entropy is performed by means of software suited to mathematical problem simulation, such as Microsoft Excel, which we have employed for this particular study. Our aim is to prove that quality entropy may be extended to the notions of a Markov source of quality and a Bernoulli source of quality, by analogy with the Markov and Bernoulli sources employed in information theory. Likewise, the present study delineates some aspects regarding tolerance to quality entropy. The subject of entropy and its application to the management of quality has been approached by other authors as well (Dinu and Vodă, Revista Calitatea-acces la succes, anul 8(4): 60–61, 2007; Dinu, Revista Calitatea-acces la succes, anul 8(5): 62–63, 2007; Georgescu-Roegen, Legea Entropiei și Procesul Economic, 1979; Stamatiu, Proceedings of the 7th International Conference on Quality, Reliability and Maintainability, 2000; Stamatiu, Proceedings of the 18th International Conference on Quality, Reliability and Maintainability, 2002). Through our transdisciplinary approach, we would like to contribute to the development of this subject.
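To illustrate the analogy with information theory (a minimal sketch, not the study's Excel model), the Shannon entropy of a Bernoulli "source of quality" can be computed directly; the function name and defect-rate inputs are hypothetical.

```python
import math

def quality_entropy(p_defect):
    """Shannon entropy (in bits) of a Bernoulli 'source of quality':
    each item is conforming with probability 1 - p_defect and
    non-conforming with probability p_defect."""
    p = p_defect
    if p in (0.0, 1.0):
        return 0.0  # a perfectly predictable process carries no entropy
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# entropy peaks at 1 bit when conforming and non-conforming are equally
# likely, and shrinks toward 0 as the process becomes more predictable
h_balanced = quality_entropy(0.5)   # 1.0 bit
h_stable = quality_entropy(0.01)    # close to 0
```

A Markov source of quality would replace the single probability with transition probabilities between quality states, exactly as in information theory.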

8.
While the high prevalence of mental illness in workplaces is more readily documented in the literature than it was ten or so years ago, the subject remains largely confined to the medical and health sciences fields. This may account for the lack of information about mental illness in workplaces (Dewa et al. Healthcare Papers 5:12–25, 2004) among operational managers and human resource departments, even though such illnesses affect on average 17 % to 20 % of employees in any 12-month period (MHCC 2012; SAMHSA 2010; ABS 2007). As symptoms of mental illness have the capacity to impact negatively on employee work performance and/or attendance, the ramifications for employee performance management systems can be significant, particularly when employees choose to deliberately conceal their illness, such that any work concerns appear to derive from issues other than illness (Dewa et al. Healthcare Papers 5:12–25, 2004; De Lorenzo 2003). When employee non-disclosure of a mental illness impacts negatively in the workplace, it presents a very challenging performance-management issue for both operational managers and human resource staff. Without documented medical evidence to show that impaired work performance and/or attendance is attributable to a mental illness, the issue of performance management arises. Currently, when there is no documented medical illness, performance management policies are often brought into place to improve employee performance and/or attendance by establishing achievable employee targets. Yet, given that in any twelve-month period at least a fifth of the workforce sustains a mental illness (MHCC 2012; SAMHSA 2010; ABS 2007), and that non-disclosure is significant (Barney et al. BMC Public Health 9:1–11, 2009; Munir et al. Social Science & Medicine 60:1397–1407, 2005), such targets may be unachievable for employees with a hidden mental illness.
It is for these reasons that this paper reviews the incidence of mental illness in western economies, its costs, and the reasons why it is often concealed, and proposes the adoption of what are termed 'Buffer Stage' policies as an added tool that organisations may wish to utilise in the management of hidden medical illnesses such as mental illness.

9.
The purpose of this note is twofold. First, we survey the study of the percolation phase transition on the Hamming hypercube $\{0,1\}^{m}$ obtained in the series of papers (Borgs et al. in Random Struct Algorithms 27:137–184, 2005; Borgs et al. in Ann Probab 33:1886–1944, 2005; Borgs et al. in Combinatorica 26:395–410, 2006; van der Hofstad and Nachmias in Hypercube percolation, Preprint 2012). Secondly, we explain how this study can be performed without the use of the so-called “lace expansion” technique. To that aim, we provide a novel simple proof that the triangle condition holds at the critical probability.

10.
Hitchcock (Synthese 97:335–364, 1993) argues that the ternary probabilistic theory of causality meets two problems arising from the problem of disjunctive factors, while the unanimity probabilistic theory of causality, which is founded on the binary contrast, does not. Hitchcock also argues that only the ternary theory conveys information about complex relations of causal relevance. In this paper, I show that Eells' solution (Probabilistic Causality, Cambridge University Press, Cambridge, 1991), which is founded on the unanimity theory, meets the two problems. I also show that the unanimity theory too reveals complex relations of causal relevance. I conclude that the two probabilistic theories of causality carve up the same causal structure in two formally different and conceptually consistent ways. Hitchcock's ternary theory has inspired several major philosophers (Maslen, Causation and Counterfactuals, pp. 341–357, MIT Press, Cambridge, 2004; Schaffer, Philos Rev 114:297–328, 2005; Northcott, Phil Stud 139:111–123, 2007; Hausman, The Place of Probability in Science: In Honor of Ellery Eells (1953–2006), pp. 47–64, Springer, Dordrecht, 2010) who have recently developed the ternary theory or the quaternary theory. This paper leads them to reconsider the relation between the ternary theory and the binary theory.

11.
The recombining binomial tree approach, initiated by Cox et al. (J Financ Econ 7:229–263, 1979) and extended to arbitrary diffusion models by Nelson and Ramaswamy (Rev Financ Stud 3(3):393–430, 1990) and Hull and White (J Financ Quant Anal 25:87–100, 1990a), is applied to the simultaneous evaluation of price and Greeks for the amortized fixed and variable rate mortgage prepayment option. We consider the simplified binomial tree approximation to arbitrary diffusion processes by Costabile and Massabo (J Deriv 17(3):65–85, 2010) and analyze its numerical applicability to the mortgage valuation problem for some Vasicek and CIR-like interest rate models. For fixed rates and binomial trees with about a thousand steps, we obtain very good results. For the Vasicek model, we also compare the closed-form analytical approximation of the callable fixed rate mortgage price by Xie (IAENG Int J Appl Math 39(1):9, 2009) with its binomial tree counterpart. Relative to the binomial tree values, one observes a systematic underestimation (overestimation) of the callable mortgage price (prepayment option price) by the analytical approximation. This numerical discrepancy increases at longer maturities, making the approximation impractical for a reliable estimation of the prepayment option price.
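For readers unfamiliar with recombining trees, the sketch below prices a European call on a standard Cox–Ross–Rubinstein tree. It illustrates why thousand-step trees stay tractable (the tree has n+1 terminal nodes, not 2**n), but it is a generic equity example, not the paper's mortgage or Vasicek/CIR setup.

```python
import math

def crr_call(S0, K, r, sigma, T, n):
    """Price a European call on a recombining CRR binomial tree with
    n steps, by terminal payoffs followed by backward induction."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))  # up factor
    d = 1 / u                            # down factor (recombining: ud = 1)
    p = (math.exp(r * dt) - d) / (u - d) # risk-neutral up-probability
    disc = math.exp(-r * dt)
    # n+1 terminal nodes: j up-moves and n-j down-moves
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for _ in range(n):  # discounted expectation, one step back at a time
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# converges to the Black-Scholes value (about 10.45 for these parameters)
price = crr_call(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0, n=1000)
```

For the mortgage problem, the same backward induction would carry the amortization schedule and the prepayment decision at each node.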

12.
Classical optimal strategies are notorious for producing remarkably volatile portfolio weights over time when applied with parameters estimated from data. This is predominantly explained by the difficulty of estimating expected returns accurately. In Lindberg (Bernoulli 15:464–474, 2009), a new parameterization of the drift rates was proposed with the aim of circumventing this difficulty, and a continuous-time mean–variance optimal portfolio problem was solved. This approach was further developed in Alp and Korn (Decis Econ Finance 34:21–40, 2011a) to a jump-diffusion setting. In the present paper, we solve a different portfolio problem under the market parameterization in Lindberg (2009). Here, the admissible investment strategies are given as the amounts of money to be held in each stock and are allowed to be adapted stochastic processes; in the references above, the admissible strategies are the deterministic and bounded fractions of the total wealth. The optimal strategy we derive is not the same as in Lindberg (2009), but it can still be viewed as investing equally in each of the n Brownian motions in the model. As a consequence of the problem assumptions, the optimal final wealth can become non-negative. The present portfolio problem is solved also in Alp and Korn (Submitted, 2011b), using the L2-projection approach of Schweizer (Ann Probab 22:1536–1575, 1995). However, our method of proof is direct and much more accessible.

13.
14.
This paper considers three ratio estimators of the population mean that use the known correlation coefficient between the study and auxiliary variables in simple random sampling when some sample observations are missing. The suggested estimators are compared with the estimators of Singh and Horn (Metrika 51:267–276, 2000), Singh and Deo (Stat Pap 44:555–579, 2003) and Kadilar and Cingi (Commun Stat Theory Methods 37:2226–2236, 2008), as well as with other imputation estimators based on the mean or a ratio. It is found that the suggested estimators are approximately unbiased for the population mean and perform well when compared with the other estimators considered in this study.
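As background (a textbook sketch, not the paper's suggested estimators), the classical ratio estimator scales the sample mean of the study variable by the ratio of the known population mean of the auxiliary variable to its sample mean; it gains precision when the two variables are strongly correlated. The toy data below are hypothetical.

```python
def ratio_estimate(y_sample, x_sample, X_bar):
    """Classical ratio estimator of the population mean of y:
    y_bar * (X_bar / x_bar), where X_bar is the known population
    mean of the auxiliary variable x."""
    y_bar = sum(y_sample) / len(y_sample)
    x_bar = sum(x_sample) / len(x_sample)
    return y_bar * (X_bar / x_bar)

# toy example: y is roughly 2*x, and the population mean of x is known to be 10
y = [19.8, 40.4, 59.7, 81.0]
x = [10.0, 20.0, 30.0, 40.0]
est = ratio_estimate(y, x, X_bar=10.0)   # close to 2 * 10 = 20
```

Imputation variants of this idea replace each missing y-observation with a ratio-based prediction from its observed x-value before averaging.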

15.
This paper shows how scale efficiency can be measured from an arbitrary parametric hyperbolic distance function with multiple outputs and multiple inputs. It extends the methods introduced by Ray (J Product Anal 11:183–194, 1998), and Balk (J Product Anal 15:159–183, 2001) and Ray (2003) that measure scale efficiency from a single-output multi-input distance function and from a multi-output and multi-input distance function, respectively. The method developed in the present paper is different from Ray’s and Balk’s in that it allows for simultaneous contraction of inputs and expansion of outputs. Theorems applicable to an arbitrary parametric hyperbolic distance function are introduced first, and then their uses in measuring scale efficiency are illustrated with the translog functional form.

16.
Sangun Park. Metrika 77(5):609–616, 2014
The representation of the entropy in terms of the hazard function and its extensions have been studied by many authors, including Teitler et al. (IEEE Trans Reliab 35:391–395, 1986). In this paper, we consider a representation of the Kullback–Leibler information of the first \(r\) order statistics in terms of the relative risk (Park and Shin in Statistics, 2012), the ratio of hazard functions, and extend it to progressively Type II censored data. We then study how the Kullback–Leibler information of the first \(r\) order statistics changes with \(r\) and discuss its relation to the Fisher information in order statistics.

17.
In this paper, we employ non-standard log-linear models to fit the double symmetry model and some of its decompositions to square contingency tables having ordered categories. SAS PROC GENMOD was employed to fit these models, although we could similarly have used GENLOG in SPSS or GLM in STATA. A SAS macro generates the factor or scalar variables required to fit these models. Two sets of \(4 \times 4\) unaided distance vision data previously analyzed in (Tahata and Tomizawa, Journal of the Japan Statistical Society 36:91–106, 2006) were employed for verification of results. We also extend the approach to the Danish \(5 \times 5\) mobility data as well as to the \(3 \times 3\) Danish longitudinal study data on subjective health, first reported in (Andersen, The Statistical Analysis of Categorical Data, Springer, Berlin, 1994) and analyzed in (Tahata and Tomizawa, Statistical Methods and Applications 19:307–318, 2010). The results obtained agree with those published in previous literature on the subject. The approaches suggested here eliminate any programming that might otherwise be required to apply this class of models to square contingency tables.

18.
Social scientists often consider multiple empirical models of the same process. When these models are parametric and non-nested, the null hypothesis that two models fit the data equally well is commonly tested using methods introduced by Vuong (Econometrica 57(2):307–333, 1989) and Clarke (Am J Political Sci 45(3):724–744, 2001; J Confl Resolut 47(1):72–93, 2003; Political Anal 15(3):347–363, 2007). The objective of each is to compare the Kullback–Leibler Divergence (KLD) of the two models from the true model that generated the data. Here we show that both of these tests are based upon a biased estimator of the KLD, the individual log-likelihood contributions, and that the Clarke test is not proven to be consistent for the difference in KLDs. As a solution, we derive the cross-validated difference-in-means (CVDM) test, based upon cross-validated log-likelihood contributions, which represent an unbiased KLD estimate. We demonstrate the CVDM test's superior performance via simulation, then apply it to two empirical examples from political science. We find that the test's selections can diverge from those of the Vuong and Clarke tests and that this can ultimately lead to differences in substantive conclusions.
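The cross-validation idea can be sketched as follows: score each observation under parameters fitted without it, so the summed contributions estimate out-of-sample fit rather than in-sample fit. This is an illustrative leave-one-out comparison of a normal model against a log-normal rival on hypothetical data, not the paper's CVDM implementation.

```python
import math

def loo_loglik_normal(data):
    """Leave-one-out log-likelihood of a normal model: for each held-out
    point, fit the mean and variance on the remaining points and score
    the held-out point under that fitted normal density."""
    total = 0.0
    for i in range(len(data)):
        rest = data[:i] + data[i + 1:]
        mu = sum(rest) / len(rest)
        var = sum((z - mu) ** 2 for z in rest) / len(rest)
        var = max(var, 1e-12)            # guard against a degenerate fit
        z = data[i]
        total += -0.5 * math.log(2 * math.pi * var) - (z - mu) ** 2 / (2 * var)
    return total

data = [1.1, 0.9, 1.3, 3.0, 0.7, 1.0, 1.2, 2.5, 0.8, 1.4]
logs = [math.log(z) for z in data]
score_normal = loo_loglik_normal(data)
# log-normal density is phi(log z)/z, hence the Jacobian term -sum(log z)
score_lognormal = loo_loglik_normal(logs) - sum(logs)
delta = score_lognormal - score_normal   # positive favours the log-normal model
```

Averaging such held-out contributions, rather than in-sample ones, is what removes the bias the paper identifies in the Vuong and Clarke statistics.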

19.
This paper investigates the relationship between secure implementability (Saijo et al. in Theor Econ 2:203–229, 2007) and full implementability in truthful strategies (Nicolò in Rev Econ Des 8:373–382, 2004). Although secure implementability is in general stronger than full implementability in truthful strategies, this paper shows that both properties are equivalent under social choice functions that satisfy non-wastefulness (Li and Xue in Econ Theory, doi:10.1007/s00199-012-0724-0) in pure exchange economies with Leontief utility functions.

20.
We study a keyword auction model where bidders have constrained budgets. In the absence of budget constraints, Edelman et al. (Am Econ Rev 97(1):242–259, 2007) and Varian (Int J Ind Organ 25(6):1163–1178, 2007) analyze “locally envy-free equilibrium” or “symmetric Nash equilibrium” bidding strategies in generalized second-price auctions. However, bidders often have to set their daily budgets when they participate in an auction; once a bidder’s payment reaches his budget, he drops out of the auction. This raises an important strategic issue that has been overlooked in the previous literature: Bidders may change their bids to inflict higher prices on their competitors because under generalized second-price, the per-click price paid by a bidder is the next highest bid. We provide budget thresholds under which equilibria analyzed in Edelman et al. (Am Econ Rev 97(1):242–259, 2007) and Varian (Int J Ind Organ 25(6):1163–1178, 2007) are sustained as “equilibria with budget constraints” in our setting. We then consider a simple environment with one position and two bidders and show that a search engine’s revenue with budget constraints may be larger than its revenue without budget constraints.
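The generalized second-price rule mentioned above (each slot winner pays the next-highest bid per click) can be sketched as follows; bidder names and bids are hypothetical, and budgets are ignored in this minimal version.

```python
def gsp_allocate(bids, num_slots):
    """Generalized second-price auction: rank bidders by bid; the bidder
    awarded slot k pays the (k+1)-th highest bid per click."""
    order = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    ranked_bids = [b for _, b in order] + [0.0]  # zero reserve past the last bid
    return {name: ranked_bids[k + 1]
            for k, (name, _) in enumerate(order[:num_slots])}

prices = gsp_allocate({"a": 5.0, "b": 3.0, "c": 2.0}, num_slots=2)
# a wins slot 1 and pays b's bid; b wins slot 2 and pays c's bid
```

The strategic issue the paper raises is visible here: by raising his bid just below a's, bidder b can drive up a's per-click price and exhaust a's budget faster without changing the allocation.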
