Similar Articles
20 similar articles found.
1.
It is generally believed that companies applying performance management practices outperform those that do not measure and manage their performance. Studies examining the link between performance management and performance improvement implicitly assume that performance management affects the behavior of individuals in an organization, which in turn facilitates the achievement of organizational goals. This study takes a step towards understanding this implicit assumption. We investigate how performance management practices relate to performance improvement by influencing the behavior of individuals. We focus on operational performance management, i.e. the definition and use of performance measures on the shopfloor in production and distribution. We use a survey among 102 companies to identify the relations between performance management practices, shopfloor behavior and performance improvement. We identified three independent clusters of operator behavior that correlate positively with performance improvement: “Understanding”, “Motivation” and “Focus on Improvement”. We show that 17 of the 20 performance management practices found in the literature have a significant, positive relation with one or more clusters of operator behavior. We also found a positive correlation between the number of performance management practices applied and performance improvement, suggesting that it matters not only which practices are applied but also how many. The recommendations emerging from this study enable managers to identify which behavioral changes are desired to improve performance and to select those performance management practices that positively influence the desired behavior.

2.
We examine the econometric implications of the decision problem faced by a profit/utility-maximizing lender operating in a simple “double-binary” environment, where the two actions available are “approve” or “reject”, and the two states of the world are “pay back” or “default”. In practice, such decisions are often made by applying a fixed cutoff to the maximum likelihood estimate of a parametric model of the default probability. Following Elliott and Lieli (2007), we argue that this practice might contradict the lender’s economic objective and, using German loan data, we illustrate the use of “context-specific” cutoffs and an estimation method derived directly from the lender’s problem. We also provide a brief discussion of how to incorporate legal constraints, such as the prohibition of disparate treatment of potential borrowers, into the lender’s problem.
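To make the cutoff logic concrete, here is a minimal sketch of how a context-specific cutoff can fall out of the lender's payoffs in the simplest risk-neutral case. The payoff numbers and estimated default probabilities are hypothetical, and this is only an illustration of the idea, not the estimation method the paper derives.

```python
import numpy as np

# Hypothetical per-loan payoffs (not from the paper's German data):
profit_if_repaid = 0.15   # net return on a good loan
loss_if_default = 1.00    # fraction of principal lost on default

# A risk-neutral lender approves when expected profit is positive:
#   (1 - p) * profit_if_repaid - p * loss_if_default > 0
# which yields the context-specific cutoff on the default probability p:
cutoff = profit_if_repaid / (profit_if_repaid + loss_if_default)
print(f"context-specific cutoff: {cutoff:.3f}")   # ~0.130, far from 0.5

# Compare decisions under a fixed 0.5 cutoff vs. the economic cutoff
p_hat = np.array([0.05, 0.12, 0.20, 0.45])  # estimated default probabilities
print("fixed 0.5 cutoff :", p_hat < 0.5)    # approves all four applicants
print("economic cutoff  :", p_hat < cutoff) # approves only the safest two
```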

3.
In this paper we focus primarily on the dynamic evolution of the world distribution of growth rates in per capita GDP. We propose new concepts and measures of “convergence” or “divergence” that are based on entropy distances and dominance relations between groups of countries over time.
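As one illustration of an entropy-based distance between growth-rate distributions, the sketch below compares two hypothetical cross-country samples with a symmetrized Kullback-Leibler divergence. Both the data and the specific divergence are assumptions for illustration; the paper's own measures and dominance relations are more elaborate.

```python
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(0)
# Hypothetical growth rates (% per year) for two groups of countries
growth_1980s = rng.normal(1.5, 2.0, size=120)
growth_2000s = rng.normal(2.5, 1.2, size=120)

# Bin both samples on a common grid and compare the histograms
bins = np.linspace(-6, 9, 31)
p, _ = np.histogram(growth_1980s, bins=bins, density=True)
q, _ = np.histogram(growth_2000s, bins=bins, density=True)
p, q = p + 1e-12, q + 1e-12           # avoid log(0) in empty bins
p, q = p / p.sum(), q / q.sum()

# Symmetrized Kullback-Leibler divergence as a simple entropy distance;
# a fall toward zero over time would indicate "convergence"
d = 0.5 * (entropy(p, q) + entropy(q, p))
print(f"entropy distance between decades: {d:.3f}")
```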

4.
Selecting the most promising candidates to fill an open position can be a difficult task when there are many applicants. Each applicant achieves certain performance levels in various categories, and the resulting information can be overwhelming. We demonstrate how data envelopment analysis (DEA) can be used as a fair screening and sorting tool to support the candidate selection and decision-making process. Each applicant is viewed as an entity with multiple achievements. Without any a priori preference or information on the multiple achievements, DEA identifies the non-dominated solutions, which, in our case, represent the “best” candidates. We developed a DEA-aided recruiting process that (1) determines the performance levels of the “best” candidates relative to other applicants; (2) evaluates the degree of excellence of the “best” candidates’ performance; (3) forms consistent tradeoff information on multiple recruiting criteria among search committee members; and then (4) clusters the applicants.
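A minimal sketch of the DEA screening step, using an input-oriented CCR envelopment model with a single dummy input (a common device when only achievements, i.e. outputs, are observed). The applicant scores are hypothetical; an efficiency score of 1 marks a non-dominated, i.e. “best”, candidate.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical applicant scores on three criteria (higher is better)
Y = np.array([[8, 6, 9],
              [9, 9, 4],
              [5, 5, 5],
              [7, 9, 8]], dtype=float)
n, m = Y.shape
X = np.ones(n)  # single dummy input: DEA "without explicit inputs"

for o in range(n):
    # Envelopment CCR model: min theta  s.t.
    #   sum_j lam_j * x_j <= theta * x_o   (input constraint)
    #   sum_j lam_j * y_rj >= y_ro         (output constraints)
    c = np.zeros(n + 1); c[0] = 1.0                  # minimize theta
    A_ub = np.zeros((1 + m, n + 1))
    A_ub[0, 0], A_ub[0, 1:] = -X[o], X               # input row
    A_ub[1:, 1:] = -Y.T                              # outputs (as <=)
    b_ub = np.concatenate(([0.0], -Y[o]))
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    status = "efficient (non-dominated)" if res.fun > 1 - 1e-6 else "dominated"
    print(f"applicant {o}: theta = {res.fun:.3f}  ->  {status}")
```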

5.
System builders who plan to acquire information and communication technology (ICT) products must weigh two key risk factors (among many) when planning the acquisition and design of their systems. They must understand the inter-relationships of all assembled products in any newly planned system in terms of its resilience under attack, and they will increasingly need to assess the risks they may inherit from a global, interconnected supply chain. To address these concerns, this paper recommends that providers of Commercial-Off-the-Shelf (COTS) technology products perform a criticality analysis on their own products to gauge resilience, rather than later being confronted by an acquirer attempting to reverse engineer the system on its own as part of supply chain due diligence. The paper illustrates the roles that technology providers and system owners each play in the outlined approach, which highlights key risk factors of the tiered suppliers for the product elements deemed most critical. ICT COTS providers who do not want to divulge sensitive information about their suppliers can use a “representational assurance” approach to convey meaningful information to potential acquirers without undue disclosure. Analytical graphics such as “Treemaps” can help all parties see where to best focus their attention regarding critical operational risk and supply chain risk. The same data that providers track internally to manage product assurance can be leveraged to support meaningful representational assurance to acquirers. This approach improves on the current state, in which acquirers regard full data disclosure by technology providers, however unrealistic, as the best means of gaining confidence in the technology supply chain.
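A toy illustration of the treemap idea: the component names, criticality scores, and the scoring recipe are all hypothetical, and `squarify` is a third-party package assumed to be installed alongside matplotlib.

```python
import matplotlib.pyplot as plt
import squarify  # pip install squarify -- assumed available

# Hypothetical product elements with criticality scores a COTS provider
# might compute internally (e.g., impact x exposure x supplier-tier risk)
elements = {"crypto module": 9.1, "network stack": 7.4, "bootloader": 6.8,
            "update service": 5.2, "UI layer": 2.1, "logging": 1.3}

# Tile area is proportional to criticality, so the biggest rectangles
# show where due-diligence attention pays off most
labels = [f"{name}\n{score}" for name, score in elements.items()]
squarify.plot(sizes=list(elements.values()), label=labels, alpha=0.8)
plt.axis("off")
plt.title("Where to focus supply chain due-diligence (illustrative)")
plt.show()
```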

6.
This paper empirically investigates aspects of risk management in young small enterprises’ efforts to survive and grow. We use a new dataset on several thousand small businesses in their “formative age” (2–8 years old) in 10 European countries and 18 sectors. Firms across all types of sectors use internal risk mitigation strategies to manage technology risk and operational risk. Financial risk is managed by tapping formal and informal networks. Market risk appears less amenable to internal management action. Formal network participation (strategic alliances) is a strategy cutting across all kinds of risk with the exception of operational risk. Firms in knowledge-intensive sectors (high-tech manufacturing and KIBS) engage in risk management activities more extensively. Firms led by more educated entrepreneurs and/or operating in demanding volatile markets tend to network more and to use internal risk mitigation strategies more extensively.

7.
Structural vs. atheoretic approaches to econometrics
In this paper I attempt to lay out the sources of conflict between the so-called “structural” and “experimentalist” camps in econometrics. Critics of the structural approach often assert that it produces results that rely on too many assumptions to be credible, and that the experimentalist approach provides an alternative that relies on fewer assumptions. Here, I argue that this is a false dichotomy. All econometric work relies heavily on a priori assumptions. The main difference between structural and experimental (or “atheoretic”) approaches lies not in the number of assumptions but in the extent to which they are made explicit.

8.
The current competitive environment is characterized by new sources of information, new technologies, new management practices, new competitors, and shorter product life cycles, which highlights the importance of organizational knowledge in manufacturing companies. We integrate several of these knowledge-based approaches, seeking to understand how cross-functional orientation, new technologies, and increasing access to information affect manufacturing strategy. In this paper, “know-what” (where to find the needed information) and “know-how” (how to run operations smoothly) are considered key components of organizational knowledge in the process of manufacturing strategy formulation. Assuming that knowledge accumulation may lead to competitive advantage, we propose a model of the manufacturing strategy process from a resource-based view perspective. We used a survey to collect field data from 104 companies. The results indicate that cross-functional activities integrate manufacturing knowledge and contribute to the creation of valuable and rare product characteristics.

9.
The focus of this paper is twofold: (1) an examination of the factors related to the “anticipation” of potential innovations in any organizational setting, and (2) the identification of strategies for the diffusion and implementation of operations research/management science (OR/MS) techniques in a particular developing region. Based on the methodology used in studying change (innovation) in health care systems, a managerial innovation model incorporating four main components [the executive, the organization, the task environment of the organization, and change agent(s), including the OR/MS manager and outside consultants] is developed and examined in terms of data obtained from top executives and other managers in forty industrial firms in Cali, Colombia. In the model, the process of innovation is decomposed into the levels of: (a) attitudes and motivations of the executive, (b) “readiness” to take action, (c) action characteristics, (d) triggering cues, and (e) actions taken and evaluation (feedback loop). The model was found useful for providing predictions indicating areas to which intervention and “marketing” of OR/MS strategies should be devoted. Overall, the study provides a base for comparative and longitudinal studies.

10.
Recent work finds evidence that the volatility of the U.S. economy fell dramatically around the first quarter of 1984. We trace the timing of this so-called “Great Moderation” across many subsectors of the economy in order to better understand its root cause. We find that the interest-rate-sensitive sectors generally experience a much earlier volatility decline than other large sectors of the economy. The changes in Federal Reserve stabilization policies that occurred during the early 1980s support the view that an improved monetary policy played an important role in stabilizing real economic activity. We find only mild evidence that “good luck” was important and little evidence to support the claim that improved inventory management was important.
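One simple way to date a volatility decline in a sector series is a least-squares break-point search on the squared, demeaned data. The sketch below uses simulated quarterly data with a known break; it is a stand-in for the formal break tests such a study would rely on, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical quarterly growth series with a volatility break at t = 140
n, true_break = 280, 140
y = np.concatenate([rng.normal(0, 2.0, true_break),
                    rng.normal(0, 0.9, n - true_break)])

def variance_break(series, trim=20):
    """Date a one-time variance break by least squares on the squared,
    demeaned series -- a simple stand-in for formal break tests."""
    e2 = (series - series.mean()) ** 2
    best_t, best_ssr = None, np.inf
    for t in range(trim, len(e2) - trim):
        pre, post = e2[:t], e2[t:]
        ssr = ((pre - pre.mean()) ** 2).sum() + ((post - post.mean()) ** 2).sum()
        if ssr < best_ssr:
            best_t, best_ssr = t, ssr
    return best_t

print("estimated break quarter:", variance_break(y))  # should land near 140
```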

11.
We take as a starting point the existence of a joint distribution implied by different dynamic stochastic general equilibrium (DSGE) models, all of which are potentially misspecified. Our objective is to compare “true” joint distributions with ones generated by given DSGEs. This is accomplished via comparison of the empirical joint distributions (or confidence intervals) of historical and simulated time series. The tool draws on recent advances in the theory of the bootstrap, Kolmogorov-type testing, and other work on the evaluation of DSGEs, aimed at comparing the second order properties of historical and simulated time series. We begin by fixing a given model as the “benchmark” model, against which all “alternative” models are to be compared. We then test whether at least one of the alternative models provides a more “accurate” approximation to the true cumulative distribution than does the benchmark model, where accuracy is measured in terms of distributional square error. Bootstrap critical values are discussed, and an illustrative example is given, in which it is shown that alternative versions of a standard DSGE model in which calibrated parameters are allowed to vary slightly perform equally well. On the other hand, there are stark differences between models when the shocks driving the models are assigned non-plausible variances and/or distributional assumptions.
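A stripped-down version of the distributional comparison: evaluate the empirical CDFs of historical and simulated series on a grid and rank models by squared error. The data here are simulated stand-ins, and the bootstrap machinery that supplies critical values is only indicated in a comment.

```python
import numpy as np

def ecdf(sample, grid):
    """Empirical CDF of `sample` evaluated on `grid`."""
    return np.searchsorted(np.sort(sample), grid, side="right") / len(sample)

def sq_error_distance(actual, simulated, grid):
    """Mean squared distance between two empirical CDFs -- the kind of
    'distributional square error' used to rank candidate models."""
    return np.mean((ecdf(actual, grid) - ecdf(simulated, grid)) ** 2)

rng = np.random.default_rng(2)
historical = rng.normal(0.0, 1.0, 400)   # stand-in for observed data
model_a = rng.normal(0.1, 1.0, 400)      # simulated from a "benchmark"
model_b = rng.normal(0.8, 1.6, 400)      # simulated from an "alternative"

grid = np.linspace(-4, 4, 200)
for name, sim in [("benchmark", model_a), ("alternative", model_b)]:
    print(name, f"{sq_error_distance(historical, sim, grid):.5f}")
# Bootstrap resampling of `historical` would supply critical values for
# testing whether a difference in these errors is statistically significant.
```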

12.
Government-supported technological research and development can help the private sector to compete globally. A more accurate evaluation system considering multi-factor performance is highly desirable. This study offers an alternative perspective on and characterization of the performance of Technology Development Programs (TDPs) via a two-stage process that emphasizes research and development (R&D) and technology diffusion. It employs sequential data envelopment analysis (DEA) with a non-parametric statistical analysis to analyze differences in intellectual capital variables among various TDPs. The results reveal that R&D performance is better than technology diffusion performance for the TDPs. In addition, the “Mechanical, Mechatronic, and Transportation field” is more efficient than the other fields in both the R&D and the technology diffusion performance models. The findings point to the importance of intellectual capital in achieving high levels of TDP efficiency. The potential applications and strengths of DEA and intellectual capital in assessing the performance of TDPs are also highlighted.

13.
Open Innovation presses the case for timely and thorough intelligence concerning research and development activities conducted outside one’s organization. To take advantage of this wealth of external R&D, one needs to establish a systematic “tech mining” process. We propose a 5-stage framework that extends literature review into research profiling and pattern recognition to answer posed technology management questions. Ultimately one can even discover new knowledge by screening research databases. Once one determines the value in mining external R&D, tough issues remain to be overcome. Technology management has developed a culture that relies more on intuition than on evidence. Changing that culture and implementing effective technical intelligence capabilities is worth the effort. P&G's reported gains in innovation call attention to the huge payoff potential.
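To give a minimal flavor of the research-profiling stage, the sketch below counts term frequencies by year in a small set of abstract records to spot emerging topics before reading in depth. The records and terms are invented for illustration; real tech mining would add text cleaning, phrase extraction and pattern recognition on top.

```python
from collections import Counter

# Hypothetical abstract records pulled from an external R&D database
records = [
    (2021, "solid-state battery electrolyte interface stability"),
    (2022, "solid-state battery manufacturing scale-up"),
    (2022, "lithium-sulfur cathode degradation"),
    (2023, "solid-state battery electrolyte sintering"),
]

# Research profiling: term frequencies by year reveal which topics
# are heating up, turning a raw database dump into evidence
by_year = {}
for year, text in records:
    by_year.setdefault(year, Counter()).update(text.split())

for year in sorted(by_year):
    print(year, by_year[year].most_common(3))
```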

14.
This paper proposes a test of the null hypothesis of stationarity that is robust to the presence of fat-tailed errors. The test statistic is a modified version of the so-called KPSS statistic. The modified statistic uses the “sign” of the data minus the sample median, whereas the standard KPSS statistic uses deviations from the mean. This “indicator” KPSS statistic has the same limit distribution as the standard KPSS statistic under the null, without relying on assumptions about moments, but a different limit distribution under unit root alternatives. The indicator test has lower power than standard KPSS when tails are thin, but higher power when tails are fat.
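A sketch of both statistics on simulated fat-tailed data. The Bartlett-kernel long-run variance and the automatic lag choice follow the usual KPSS recipe; this illustrates the construction only, and is not a packaged test with critical values.

```python
import numpy as np

def kpss_stat(u, lags=None):
    """Level-stationarity KPSS statistic for a (roughly) mean-zero series u:
    eta = n^-2 * sum_t S_t^2 / lrv, with a Bartlett-kernel long-run
    variance estimate."""
    n = len(u)
    if lags is None:
        lags = int(4 * (n / 100) ** 0.25)
    S = np.cumsum(u)                           # partial sums
    lrv = (u @ u) / n
    for k in range(1, lags + 1):
        w = 1 - k / (lags + 1)                 # Bartlett weights
        lrv += 2 * w * (u[:-k] @ u[k:]) / n
    return (S @ S) / (n ** 2 * lrv)

rng = np.random.default_rng(3)
x = rng.standard_t(df=2, size=500)             # fat-tailed, stationary

standard = kpss_stat(x - x.mean())             # deviations from the mean
indicator = kpss_stat(np.sign(x - np.median(x)))  # the "sign" version
print(f"standard KPSS:  {standard:.3f}")
print(f"indicator KPSS: {indicator:.3f}")      # same limit distribution under the null
```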

15.
Drug use and problems change dramatically over time in ways that are often described as reflecting an “epidemic cycle”. We use simulation of a model of drug epidemics to investigate how the relative effectiveness of different types of prevention varies over the course of such an epidemic. Specifically, we use the so-called LHY model (see Discussion Paper No. 251 of the Institute of Econometrics, OR, and Systems Theory, Vienna University of Technology, Vienna, Austria, 2000), which includes both “contagious” spread of initiation (a positive feedback) and memory of past use (a negative feedback), which dampens initiation and, hence, future use. The analysis confirms the common-sense intuition that prevention is more highly leveraged early in an epidemic, although the extent to which this is true in this model is striking, particularly for campaigns designed to preserve or amplify awareness of the drug's dangers. The findings also suggest that the design of “secondary” prevention programs should change over the course of an epidemic.
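The sketch below simulates a stylized two-state system in the spirit of the LHY model: contagious initiation from light users, dampened by an exponential “reputation” term in the heavy-user share. All parameter values, functional details and the prevention experiment (a 20% initiation cut applied early vs. late) are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Stylized LHY-type dynamics (illustrative parameters, not the paper's):
# light users L recruit new initiates; memory of heavy users H
# dampens initiation via the exponential reputation term.
tau, s, q = 5e4, 0.6, 3.5      # baseline inflow, contagion, deterrence
a, b, g = 0.16, 0.02, 0.06     # quit, escalation, heavy-user exit rates

def simulate(cut_start=None, cut_end=None, years=50, dt=0.1):
    L, H = 1e5, 1e3
    path = []
    for step in range(int(years / dt)):
        t = step * dt
        # a prevention campaign cuts initiation by 20% while active
        factor = 0.8 if (cut_start is not None and cut_start <= t < cut_end) else 1.0
        initiation = factor * (tau + s * L * np.exp(-q * H / L))
        L += dt * (initiation - (a + b) * L)
        H += dt * (b * L - g * H)
        path.append(L + H)
    return np.array(path)

base = simulate()
early = simulate(cut_start=0, cut_end=10)    # prevention early in the epidemic
late = simulate(cut_start=20, cut_end=30)    # same effort, applied later
print("use averted, early campaign:", f"{(base - early).sum() * 0.1:,.0f}")
print("use averted, late campaign :", f"{(base - late).sum() * 0.1:,.0f}")
```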

16.
We study the coevolution of networks and action choices in a Prisoners' Dilemma. Agents in our model learn about both action choices and choices of interaction partners (links) by imitating the successful behavior of others. The resulting dynamics yields outcomes where cooperators and defectors coexist under a wide range of parameters. Two scenarios can arise: either there is “full separation” of defectors and cooperators, i.e. they are found in two different, disconnected components, or there is “marginalization” of defectors, i.e. connected networks emerge with a center of cooperators and a periphery of defectors.
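A compact, stylized version of such imitate-the-successful dynamics (not the paper's exact protocol): agents occasionally copy the action of a more successful agent and rewire one link toward that agent's neighborhood. The payoff values, network density and update rule are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n, rounds = 30, 200
b, c = 3.0, 1.0                      # PD: benefit from a cooperating partner, cost of cooperating
action = rng.integers(0, 2, n)       # 1 = cooperate, 0 = defect
adj = np.triu(rng.random((n, n)) < 0.15, 1)
adj = adj | adj.T                    # symmetric random starting network

def payoffs():
    # Each link pays b if the partner cooperates; cooperating costs c per link
    return adj @ (b * action) - action * (c * adj.sum(1))

for _ in range(rounds):
    pay = payoffs()
    i, j = rng.integers(n), rng.integers(n)
    if j != i and pay[j] > pay[i]:
        action[i] = action[j]        # imitate a more successful agent's action
        # ...and rewire one link toward that agent's neighborhood
        k = rng.choice(np.flatnonzero(adj[j])) if adj[j].any() else None
        if k is not None and k != i:
            old = np.flatnonzero(adj[i])
            if old.size:
                drop = rng.choice(old)
                adj[i, drop] = adj[drop, i] = False
            adj[i, k] = adj[k, i] = True

print("cooperators:", action.sum(), "of", n)
print("cooperator-defector links:",
      int(adj[action == 1][:, action == 0].sum()))
```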

17.
This paper studies the decision-theoretic foundation for the notion of stability in the dynamic context of strategic interaction. We formalize this notion and show that common knowledge of rationality implies a “stable” pattern of behavior in extensive games with perfect information. In the “generic” case, our approach is consistent with Aumann’s [Aumann, R.J., 1995. Backward induction and common knowledge of rationality. Games and Economic Behavior 8, 6–19] result that common knowledge of rationality leads to the backward induction outcome.
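For intuition, here is backward induction on a tiny perfect-information tree: each mover, anticipating rational play downstream, picks the move that maximizes her own payoff. The game and payoffs are invented; the point is the recursion, not the paper's epistemic formalism.

```python
def backward_induction(node, path=()):
    """A leaf is a payoff tuple; an internal node is (player, {move: subtree}).
    Each player, expecting rational play in every subgame, picks the move
    that maximizes her own payoff."""
    if isinstance(node[1], dict):                     # internal node
        player, moves = node
        options = [backward_induction(sub, path + (m,))
                   for m, sub in moves.items()]
        return max(options, key=lambda o: o[0][player])
    return node, path                                 # leaf: (payoffs, moves so far)

# A tiny centipede-style game: player 0 moves first, then player 1
game = (0, {"take": (2, 1),
            "pass": (1, {"take": (1, 4),
                         "pass": (3, 3)})})
payoffs, moves = backward_induction(game)
print("outcome:", payoffs, "via", moves)  # player 1 would take, so player 0 takes first
```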

18.
Based on a tractable “endogenous technology choice” framework, we provide a microfoundation for aggregate normalized constant elasticity of substitution (CES) production functions with non-neutral, factor-augmenting technical change. In this framework, firms choose the unit productivities of capital and labor optimally from a technology menu constructed under the assumption that unit factor productivities (UFPs) are independently Weibull-distributed. The Weibull distribution itself is also microfounded here: based on extreme value theory, it is found to be an accurate and robust approximation of the true UFP distribution if technologies consist of a large number of complementary components.
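A quick Monte Carlo of the extreme-value argument: if a technology is only as productive as the weakest of many complementary components, its unit factor productivity is a minimum of many draws, and such minima are approximately Weibull. Uniform component draws (for which the limiting Weibull shape is 1) are an assumption for illustration, not the paper's setup.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Complementary components: productivity = min over component draws
n_components, n_technologies = 200, 20000
components = rng.uniform(0.0, 1.0, (n_technologies, n_components))
ufp = components.min(axis=1)             # unit factor productivities

shape, loc, scale = stats.weibull_min.fit(ufp, floc=0)
print(f"fitted Weibull shape: {shape:.2f} (theory for uniform parts: 1.0)")

# Goodness of fit of the Weibull approximation
ks = stats.kstest(ufp, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic: {ks.statistic:.4f}")
```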

19.
“The quiet life hypothesis” (QLH) by Hicks (1935) argues that, because of management’s subjective cost of reaching optimal profits, firms use their market power to allow an inefficient allocation of resources. Increasing competitive pressure is therefore likely to force management to work harder to reach optimal profits. Another hypothesis that also relates market power to efficiency is “the efficient structure hypothesis” (ESH) by Demsetz (1973). ESH argues that firms with superior efficiencies or technologies have lower costs and therefore higher profits. These firms are assumed to gain larger market shares, leading to higher concentration. Ignoring the efficiency levels of the firms in a market power model can cause both estimation and interpretation problems; unfortunately, the literature on market power measurement largely ignores this relationship. In a dynamic setting, we estimate the market power of US airlines in two city-pairs, both with and without allowing for firm inefficiencies. Using industry-level cost data, we estimate the cost function parameters and time-varying efficiencies. An instrumental variables version of the square root Kalman filter is used to estimate time-varying conduct parameters.
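As a sketch of the filtering idea, here is a minimal scalar Kalman filter tracking a random-walk, time-varying coefficient on simulated data. It is the plain filter with known variances, not the instrumental-variables square-root variant or the airline application in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulate y_t = x_t * beta_t + eps with a slowly drifting "conduct"
# parameter beta_t (random walk), then recover beta_t by filtering
T = 300
x = rng.normal(1.0, 0.5, T)
beta = np.cumsum(rng.normal(0, 0.03, T)) + 0.5
y = x * beta + rng.normal(0, 0.2, T)

sig_eps, sig_eta = 0.2 ** 2, 0.03 ** 2      # assumed known variances
b, P = 0.0, 10.0                            # diffuse-ish initial state
b_path = np.empty(T)
for t in range(T):
    P += sig_eta                            # predict: random-walk state
    S = x[t] * P * x[t] + sig_eps           # innovation variance
    K = P * x[t] / S                        # Kalman gain
    b += K * (y[t] - x[t] * b)              # update with the observation
    P *= (1 - K * x[t])
    b_path[t] = b

print("final true beta:", round(beta[-1], 3), "filtered:", round(b, 3))
```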

20.
Anitesh Barua, Honghui Deng. 《Socio》, 2004, 38(4): 233-253
This paper applies Data Envelopment Analysis to determine relative efficiencies between internet dot com companies that produce only physical products and those that produce only digital products. To allow for the fact that the latter are relatively inexperienced, a distinction is made between long- and short-run efficiencies and inefficiencies; we find no statistically significant difference in the short run, but digital product companies are significantly more efficient in the long run. A new way of distinguishing between long- and short-run performances is utilized that avoids the need to identify the time periods associated with long-run vs. short-run efficiencies and inefficiencies. In place of “time,” this paper utilizes differences in the “properties” that economic theory associates with long- and short-run performances.
