Similar literature — 20 results.
1.
We propose the notion of multivariate predictability as a measure of goodness-of-fit in data reduction techniques which are useful for visualizing and screening data. For quantitative variables this leads to the usual sums-of-squares and variance-accounted-for criteria. For categorical variables we show how to predict the category-levels of all variables associated with every point (case). The proportion of predictions which agree with the true categories gives the measure of fit. The ideas are very general; as an illustration we use nonlinear principal components analysis (NLPCA) in association with ordered categorical variables. A detailed example using data from the International Social Survey Program (ISSP) will be given in Blasius and Gower (Quality and Quantity, 39, to appear). It will be shown that the predictability criterion suggests that the fits are rather better than is indicated by “percentage of variance accounted for”. This article was written while John Gower was a visiting professor at the ZA-Eurolab, at the Zentralarchiv für Empirische Sozialforschung, University of Cologne, Germany. The ZA is a Large Scale Facility funded by the Training and Mobility of Researchers program of the European Union.
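As a rough illustration of the predictability criterion — not the paper's NLPCA machinery; a plain rank-r least-squares approximation stands in for it, and the data are invented — one can reconstruct coded ordinal data, round each reconstructed cell to the nearest admissible category, and report the proportion of agreements:

```python
import numpy as np

def predictability(X, rank, levels):
    """Proportion of category-levels correctly 'predicted' by a
    rank-`rank` least-squares approximation of the coded data X.
    Simplified stand-in for NLPCA, which would also optimise the
    category quantifications rather than use the raw codes."""
    X = np.asarray(X, dtype=float)
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    Xhat = mu + (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # predict each cell's category as the nearest admissible level
    pred = np.clip(np.rint(Xhat), min(levels), max(levels))
    return (pred == X).mean()

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(100, 4))   # 5-point ordinal items
fit = predictability(scores, rank=2, levels=range(1, 6))
```

Unlike "percentage of variance accounted for", this criterion is read directly on the scale of the data: the share of cells whose category is recovered.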

2.
A fuzzy-QFD approach to supplier selection
This article suggests a new method that transfers the house of quality (HOQ) approach typical of quality function deployment (QFD) problems to the supplier selection process. To test its efficacy, the method is applied to a supplier selection process for a medium-to-large industry that manufactures complete clutch couplings. The study starts by identifying the features that the purchased product should have (internal variables “WHAT”) in order to satisfy the company's needs, then it seeks to establish the relevant supplier assessment criteria (external variables “HOW”) in order to come up with a final ranking based on the fuzzy suitability index (FSI). The whole procedure was implemented using fuzzy numbers; the application of a fuzzy algorithm allowed the company to define by means of linguistic variables the relative importance of the “WHAT”, the “HOW”–“WHAT” correlation scores, the resulting weights of the “HOW” and the impact of each potential supplier. Special attention is paid to the various subjective assessments in the HOQ process, and symmetrical triangular fuzzy numbers are suggested to capture the vagueness in people's verbal assessments.
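A minimal sketch of the fuzzy arithmetic involved, on invented linguistic weights and ratings (the HOQ correlation step of the paper is omitted): triangular fuzzy numbers are combined by a weighted sum and the resulting fuzzy index is defuzzified to rank suppliers.

```python
# A triangular fuzzy number is a triple (low, mode, high).
def tfn_mul(a, b):
    # approximate product of positive TFNs (standard approximation)
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def tfn_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def centroid(a):
    # simple defuzzification: centre of gravity of the triangle
    return sum(a) / 3.0

# Hypothetical linguistic importance of two "HOW" criteria
weights = [(0.5, 0.7, 0.9), (0.2, 0.4, 0.6)]
# Hypothetical fuzzy ratings of two suppliers on those criteria
ratings = {
    "supplier_A": [(0.6, 0.8, 1.0), (0.4, 0.6, 0.8)],
    "supplier_B": [(0.2, 0.4, 0.6), (0.6, 0.8, 1.0)],
}

fsi = {}
for name, rs in ratings.items():
    total = (0.0, 0.0, 0.0)
    for w, r in zip(weights, rs):
        total = tfn_add(total, tfn_mul(w, r))
    fsi[name] = centroid(total)   # defuzzified suitability index

ranking = sorted(fsi, key=fsi.get, reverse=True)
```

The point of working on triples rather than point values is that the vagueness of the verbal assessments is carried through the whole aggregation and only collapsed at the final ranking step.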

3.
Statistical agencies often release a masked or perturbed version of survey data to protect the confidentiality of respondents' information. Ideally, a perturbation procedure should provide confidentiality protection without much loss of data quality, so that the released data may practically be treated as original data for making inferences. One major objective is to control the risk of correctly identifying any respondent's records in released data, by matching the values of some identifying or key variables. For categorical key variables, we propose a new approach to measuring identification risk and setting strict disclosure control goals. The general idea is to ensure that the probability of correctly identifying any respondent or surveyed unit is at most ξ, which is pre‐specified. Then, we develop an unbiased post‐randomisation procedure that achieves this goal for ξ>1/3. The procedure allows substantial control over possible changes to the original data, and the variance it induces is of a lower order of magnitude than sampling variance. We apply the procedure to a real data set, where it performs consistently with the theoretical results and quite importantly, shows very little data quality loss.
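The flavour of such an unbiased post-randomisation (PRAM-style) procedure can be sketched as follows. The transition matrix here is invented rather than derived from the identification-risk bound ξ, and the unbiasedness shown is the standard inverse-matrix frequency correction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Original categorical key variable (3 categories, coded 0..2)
x = rng.integers(0, 3, size=10_000)

# Post-randomisation: true category j is reported as k with
# probability P[j, k].  Diagonal dominance keeps most records
# unchanged (hypothetical values, not the paper's derived matrix).
P = np.array([[0.90, 0.05, 0.05],
              [0.05, 0.90, 0.05],
              [0.05, 0.05, 0.90]])

released = np.array([rng.choice(3, p=P[j]) for j in x])

# Unbiasedness: E[freq(released)] = freq(x) @ P, so inverting P
# recovers an unbiased estimate of the original frequencies.
freq_released = np.bincount(released, minlength=3) / len(x)
freq_est = freq_released @ np.linalg.inv(P)
freq_true = np.bincount(x, minlength=3) / len(x)
```

An intruder matching on the released key variable can no longer be sure any match is correct, yet aggregate inferences remain approximately valid after the correction.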

4.
E-Leadership and Virtual Teams
In this paper we have identified some key challenges for E-leaders of virtual teams. Among the most salient of these are the following:
• The difficulty of keeping tight and loose controls on intermediate progress toward goals
• Promoting close cooperation among teams and team members in order to integrate deliverables
• Encouraging and recognizing emergent leaders in virtual teams
• Establishing explicit processes for archiving important written documentation
• Establishing and maintaining norms and procedures early in a team’s formation and development
• Establishing proper boundaries between home and work
Virtual team environments magnify the differences between good and bad projects, organizations, teams, and leaders. The nature of such projects is that there is little tolerance for ineffective leadership. There are some specific issues and techniques for mitigating the negative effects of more dispersed employees, but these are merely extensions of good leadership—they cannot make up for the lack of it.

SELECTED BIBLIOGRAPHY

An excellent reference for research on teams is M. E. Shaw, R. M. McIntyre, and E. Salas, “Measuring and Managing for Team Performance: Emerging Principles from Complex Environments,” in R. A. Guzzo and E. Salas, eds., Team Effectiveness and Decision Making in Organizations (San Francisco: Jossey-Bass, 1995). For a fuller discussion of teleworking and performance-management issues in virtual teams, see W. F. Cascio, “Managing a Virtual Workplace,” Academy of Management Executive, 2000, 14(3), 81–90, and also C. Joinson, “Managing Virtual Teams,” HRMagazine, June 2002, 69–73. Several sources discuss the issue of trust in virtual teams: D. Coutu, “Trust in Virtual Teams,” Harvard Business Review, May–June 1998, 20–21; S. L. Jarvenpaa, K. Knoll, and D. E. Leidner, “Is Anybody Out There? Antecedents of Trust in Global Virtual Teams,” Journal of Management Information Systems, 1998, 14(4), 29–64. See also Knoll and Jarvenpaa, “Working Together in Global Virtual Teams,” in M. Igbaria and M. Tan, eds., The Virtual Workplace (Hershey, PA: Idea Group Publishing, 1998).

Estimates of the number of teleworkers vary. For examples, see Gartner Group, Report R-06-6639, November 18, 1998, and also Telework America survey, news release, October 23, 2001. We learned about CPP’s approach to managing virtual work arrangements through David Krantz, personal communication, August 20, 2002, Palo Alto, CA.

There are several excellent references on emergent leaders. For example, see G. Lumsden and D. Lumsden, Communicating in Groups and Teams: Sharing Leadership (Belmont, CA: Wadsworth, 1993); Lumsden and Lumsden, Groups: Theory and Experience, 4th ed. (Boston: Houghton, 1993); R. W. Napier and M. K. Gershenfeld, Groups: Theory and Experience, 4th ed. (Boston: Houghton, 1989); and M. E. Shaw, Group Dynamics: The Psychology of Small Group Behavior, 3rd ed. (New York: McGraw-Hill, 1981).

An excellent source for e-mail style is D. Angell and B. Heslop, The Elements of E-mail Style: Communicate Effectively via Electronic Mail (Reading, MA: Addison-Wesley Publishing Company, 1994). To read more on the growing demand for flexible work arrangements, see “The New World of Work: Flexibility is the Watchword,” Business Week, 10 January 2000, 36.

For more on individualism and collectivism, see H. C. Triandis, “Cross-cultural Industrial and Organizational Psychology,” in H. C. Triandis, M. D. Dunnette, and L. M. Hough, eds., Handbook of Industrial and Organizational Psychology, 2nd ed., vol. 4 (Palo Alto, CA: Consulting Psychologists Press, 1994), 103–172.

Executive Summary

As the wired world brings us all closer together, at the same time as we are separated by time and distance, leadership in virtual teams becomes ever more important. Information technology makes it possible to build far-flung networks of organizational contributors, although unique leadership challenges accompany their formation and operation. This paper describes the growth of virtual teams, the various forms they assume, the kinds of information and support they need to function effectively, and the leadership challenges inherent in each form. We then provide workable, practical solutions to each of the leadership challenges identified.

5.
Increasing human and social capital by applying job embeddedness theory
Most modern lives are complicated. When employees feel that their organization values the complexity of their entire lives and tries to do something about making it a little easier for them to balance all the conflicting demands, the employees tend to be more productive and stay with those organizations longer. Job embeddedness captures some of this complexity by measuring both the on-the-job and off-the-job components that most contribute to a person's staying. Research evidence as well as ample anecdotal evidence (discussed here and elsewhere) supports the value of using the job embeddedness framework for developing a world-class retention strategy based on corporate strengths and employee preferences.

To execute their corporate strategy effectively, different organizations require different knowledge, skills and abilities from their people. And because of occupational, geographic, demographic or other differences, these people will have needs that differ from those of people in other organizations. For that reason, the retention program of the week from international consultants won't always work. Instead, organizations need to carefully assess the needs and desires of their unique employee base. Then, these organizations need to determine which of these needs and desires they can address in a cost-effective fashion (i.e., where the benefits conferred exceed the cost of the program). Many times this requires an investment that will pay off over a longer term – not just a quarter or even a year. Put differently, executives will need to carefully understand the fully loaded costs of turnover (loss of tacit knowledge, reduced customer service, slowed production, lost contracts, lack of internal candidates to lead the organization in the future, etc., in addition to the obvious costs like recruiting, selecting and training new people). Then, these executives need to recognize the expected benefits of various retention practices. Only then can leaders make informed decisions about strategic investments in human and social capital.

Selected bibliography

A number of articles have influenced our thinking about the importance of connecting employee retention strategies to business strategies:
• R. W. Beatty, M. A. Huselid, and C. E. Schneier. “New HR Metrics: Scoring on the Business Scorecard,” Organizational Dynamics, 2003, 32 (2), 107–121.
• Bradach. “Organizational Alignment: The 7-S Model,” Harvard Business Review, 1998.
• J. Pfeffer. “Producing Sustainable Competitive Advantage Through the Effective Management of People,” Academy of Management Executive, 1995 (9), 1–13.
• C. J. Collins, and K. D. Clark. “Strategic Human Resources Practices and Top Management Team Social Networks: An Examination of the Role of HR Practices in Creating Organizational Competitive Advantage,” Academy of Management Journal, 2003, 46, 740–752.
The theoretical development and empirical support for the Unfolding Model of turnover are captured in the following articles:
• T. Lee, and T. Mitchell. “An Alternative Approach: The Unfolding Model of Voluntary Employee Turnover,” Academy of Management Review, 1994, 19, 57–89.
• B. Holtom, T. Mitchell, T. Lee, and E. Inderrieden. “Shocks as Causes of Turnover: What They Are and How Organizations Can Manage Them,” Human Resource Management, 2005, 44(3), 337–352.
The development of job embeddedness theory is captured in the following articles:
• T. Mitchell, B. Holtom, T. Lee, C. Sablynski, and M. Erez. “Why People Stay: Using Job Embeddedness to Predict Voluntary Turnover,” Academy of Management Journal, 2001, 44, 1102–1121.
• T. Mitchell, B. Holtom, and T. Lee. “How to Keep Your Best Employees: The Development of an Effective Retention Policy,” Academy of Management Executive, 2001, 15(4), 96–108.
• B. Holtom, and E. Inderrieden. “Integrating the Unfolding Model and Job Embeddedness To Better Understand Voluntary Turnover,” Journal of Managerial Issues, in press.
• D.G. Allen. “Do Organizational Socialization Tactics Influence Newcomer Embeddedness and Turnover?” Journal of Management, 2006, 32, 237–257.
Executive Summary

Employee turnover is costly to organizations. Some of the costs are obvious (e.g., recruiting, selecting, and training expenses) and others are not so obvious (e.g., diminished customer service ability, lack of continuity on key projects, and loss of future leadership talent). Understanding the value inherent in attracting and keeping excellent employees is the first step toward investing systematically to build the human and social capital in an organization. The second step is to identify retention practices that align with the organization's strategy and culture. Through extensive research, we have developed a framework for creating this alignment. We call this theory job embeddedness. Across multiple industries, we have found that job embeddedness is a stronger predictor of important organizational outcomes, such as employee attendance, retention and performance, than the best-known and accepted psychological explanations (e.g., job satisfaction and organizational commitment). The third step is to implement the ideas. Throughout this article we discuss examples from the Fortune 100 Best Companies to Work For and many others to demonstrate how job embeddedness theory can be used to build human and social capital by increasing employee retention.

6.
In this paper, some new indices for ordinal data are introduced. These indices have been developed so as to measure the degree of concentration on the “small” or the “large” values of a variable whose level of measurement is ordinal. Their advantage in relation to other approaches is that they ascribe unequal weights to each class of values. Although they constitute a useful tool in various fields of application, the focus here is on their use in sample surveys, and specifically in situations where one is interested in taking into account the “distance” of the responses from the “neutral” category in a given question. The properties of these indices are examined and methods for constructing confidence intervals for their actual values are discussed. The performance of these methods is evaluated through an extensive simulation study.
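A hedged sketch of the general idea — not the paper's specific indices — using linearly increasing class weights and a bootstrap percentile interval on invented survey responses:

```python
import numpy as np

def large_value_index(counts):
    """Degree of concentration on the 'large' categories of an
    ordinal variable: 0 if all mass is in the smallest category,
    1 if all mass is in the largest.  The linear weights are a
    simplification of the unequal class weights in the paper."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    w = np.arange(len(p)) / (len(p) - 1)   # 0, ..., 1
    return float(w @ p)

def bootstrap_ci(data, n_cat, n_boot=2000, alpha=0.05, seed=0):
    # percentile bootstrap interval for the index
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(n_boot):
        resample = rng.choice(data, size=len(data), replace=True)
        stats.append(large_value_index(np.bincount(resample, minlength=n_cat)))
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# 5-point Likert responses, 0 = strongly disagree ... 4 = strongly agree
responses = np.array([0] * 5 + [1] * 10 + [2] * 30 + [3] * 35 + [4] * 20)
idx = large_value_index(np.bincount(responses, minlength=5))
lo, hi = bootstrap_ci(responses, n_cat=5)
```

Because the weights are monotone in the category order, shifting mass away from the "neutral" middle toward the large end always increases the index, which is the behaviour the abstract asks of such a measure.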

7.
Over twenty years ago, Mitchell [Mitchell, T.R. (1982) Motivation: New directions for theory, research, and practice. Academy of Management Review, 7:80–88.] called for research which integrates and competitively tests the multitude of motivation theories. Yet, with few exceptions, theories of motivation tend to be narrow in focus. Many motivation theories nonetheless incorporate similar predictor variables, such as job satisfaction, perceived equity, and organizational commitment, suggesting that theory integration is warranted. In this paper, several literatures are reviewed which deal with employee effort at different levels (e.g., withholding effort, offering extra effort). “Effort propensity” is offered as an appropriate integrating variable, and an integrative model of effort propensity is proposed which pulls these various literatures together and stimulates the type of research Mitchell called for.

8.
Abstract. In this study some of the causes of the differentials in Black-White economic participation in large U.S. cities are examined. Specific concern is with unemployment and withdrawal from the labor force. The underlying hypothesis is that these differentials are a response to compositional differences between the two populations. The effects of region, economic structure, and segregation are also examined. A causal model is explicated in which these variables are interrelated and their direct and indirect effects upon the dependent variables analyzed. The data suggest that, while having similar patterns of causes, labor force withdrawal and unemployment differentials are distinct phenomena. It also appears that a major portion of the variance in these variables is not accounted for by compositional differences between Blacks and Whites on socio-economic variables. However, educational and occupational differentiation have theoretically significant effects. It is argued that a major portion of the residual variance is due to discrimination, although there is no way of directly testing this hypothesis. An important negative finding is the apparently minor impact of residential segregation, suggesting that the physical isolation of Blacks is not a key factor in their limited economic participation. Finally, the data suggest that it is meaningful to regard assimilation as a multidimensional phenomenon whose dimensions are causally interrelated.

9.
We consider a principal who is keen to induce his agents to work at their maximal effort levels. To this end, he samples n days at random out of the T days on which they work, and awards a prize of B dollars to the most productive agent. The principal’s policy (B, n) induces a strategic game Γ(B, n) between the agents. We show that to implement maximal effort levels weakly (or strongly) as a strategic equilibrium (or as dominant strategies) in Γ(B, n), at the least cost B to himself, the principal must choose a small sample size n. Thus less scrutiny by the principal induces more effort from the agents. The need for reduced scrutiny becomes more pronounced when agents have information of the history of past plays in the game. There is an inverse relation between information and optimal sample size. As agents acquire more information (about each other), the principal, so to speak, must “undo” this by reducing his information (about them) and choosing the sample size n even smaller.

10.
Summary. This paper reviews research situations in medicine, epidemiology and psychiatry, in psychological measurement and testing, and in sample surveys in which the observer (rater or interviewer) can be an important source of measurement error. Moreover, most of the statistical literature on observer variability is surveyed, with attention given to a notational unification of the various models proposed. In the continuous data case, the usual analysis of variance (ANOVA) components-of-variance models are presented with an emphasis on the intraclass correlation coefficient as a measure of reliability. Other modified ANOVA models, response error models in sample surveys, and related multivariate extensions are also discussed. For the categorical data case, special attention is given to measures of agreement and tests of hypotheses when the data consist of dichotomous responses. In addition, similarities between the dichotomous and continuous cases are illustrated in terms of intraclass correlation coefficients. Finally, measures of agreement, such as kappa and weighted kappa, are discussed in the context of nominal and ordinal data. A proposed unifying framework for the categorical data case is given in the form of concluding remarks.
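The kappa and weighted-kappa agreement measures mentioned for categorical data can be computed directly from a square agreement table (the table below is invented):

```python
import numpy as np

def cohens_kappa(table, weights=None):
    """Kappa (weights=None) or weighted kappa for a square
    agreement table of two raters.  `weights` is a disagreement
    matrix, e.g. squared category distance for ordinal data."""
    t = np.asarray(table, dtype=float)
    p_obs = t / t.sum()
    p_exp = np.outer(p_obs.sum(axis=1), p_obs.sum(axis=0))
    if weights is None:                # unweighted: 0/1 disagreement
        weights = 1.0 - np.eye(t.shape[0])
    d_obs = (weights * p_obs).sum()    # observed disagreement
    d_exp = (weights * p_exp).sum()    # chance-expected disagreement
    return 1.0 - d_obs / d_exp

# two raters classifying 100 cases into 3 ordered categories
table = np.array([[20,  5,  0],
                  [ 5, 30,  5],
                  [ 0,  5, 30]])
kappa = cohens_kappa(table)

# quadratic weights, appropriate for ordinal categories
k = table.shape[0]
i, j = np.indices((k, k))
kappa_w = cohens_kappa(table, weights=(i - j) ** 2)
```

The weighted version rewards near-misses on adjacent ordinal categories, so here kappa_w exceeds the unweighted kappa.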

11.
We propose new real‐time monitoring procedures for the emergence of end‐of‐sample predictive regimes using sequential implementations of standard (heteroskedasticity‐robust) regression t‐statistics for predictability applied over relatively short time periods. The procedures we develop can also be used for detecting historical regimes of temporary predictability. Our proposed methods are robust to both the degree of persistence and endogeneity of the regressors in the predictive regression and to certain forms of heteroskedasticity in the shocks. We discuss how the monitoring procedures can be designed such that their false positive rate can be set by the practitioner at the start of the monitoring period using detection rules based on information obtained from the data in a training period. We use these new monitoring procedures to investigate the presence of regime changes in the predictability of the US equity premium at the 1‐month horizon by traditional macroeconomic and financial variables, and by binary technical analysis indicators. Our results suggest that the 1‐month‐ahead equity premium has temporarily been predictable, displaying so‐called “pockets of predictability,” and that these episodes of predictability could have been detected in real time by practitioners using our proposed methodology.
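A simplified sketch of the idea on simulated data (contemporaneous rather than lagged predictor, and a fixed illustrative threshold instead of the paper's calibrated training-period detection rule):

```python
import numpy as np

def robust_t(y, x):
    """White heteroskedasticity-robust t-statistic for the slope
    in a regression of y on a constant and x."""
    X = np.column_stack([np.ones_like(x), x])
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    u = y - X @ beta
    meat = X.T @ (X * u[:, None] ** 2)     # sum of u_i^2 x_i x_i'
    V = XtX_inv @ meat @ XtX_inv           # sandwich covariance
    return beta[1] / np.sqrt(V[1, 1])

rng = np.random.default_rng(2)
T, window = 600, 60
x = rng.standard_normal(T)
y = rng.standard_normal(T)
y[300:420] += 0.8 * x[300:420]   # a temporary "pocket of predictability"

# roll the t-statistic over short windows; 2.5 is an illustrative
# threshold, not the calibrated false-positive rule of the paper
t_stats = np.array([robust_t(y[s:s + window], x[s:s + window])
                    for s in range(T - window)])
flagged = np.flatnonzero(np.abs(t_stats) > 2.5)
```

Windows that lie inside the simulated pocket produce large |t| values and are flagged, while most all-noise windows are not, which is the detection behaviour the abstract describes.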

12.
The paper reviews international literature on corporate governance and firm performance and investigates the relationship in the Indian context, taking into account the endogeneity in the relationship. Governance parameters include board size, directors’ shareholding, institutional and foreign shareholding, while the fragmentation in shareholding is captured by public shareholding. A simultaneous equation regression model for Tobin’s Q, as a measure of firm performance, is attempted using these variables, while controlling for industry effects and other non-governance variables. The data corresponds to a panel of 340 large, listed Indian firms for the period 1997–2001 spread across 24 industry groups.

13.
We study Pareto optimal partitions of a “cake” among n players. Each player uses a countably additive non-atomic probability measure to evaluate the size of pieces of cake. We present two geometric pictures appropriate for this study and consider the connection between these pictures and the maximization of convex combinations of measures, which we studied in Barbanel and Zwicker [Barbanel, J.B., Zwicker, W., 1997. Two applications of a theorem of Dvoretsky, Wald, and Wolfovitz to cake division. Theory and Decision 43, 203–207].

14.
Typical data that arise from surveys, experiments, and observational studies include continuous and discrete variables. In this article, we study the interdependence among a mixed (continuous, count, ordered categorical, and binary) set of variables via graphical models. We propose an ℓ1‐penalized extended rank likelihood with an ascent Monte Carlo expectation maximization approach for the copula Gaussian graphical models and establish near conditional independence relations and zero elements of a precision matrix. In particular, we focus on high‐dimensional inference where the number of observations is of the same order as, or smaller than, the number of variables under consideration. To illustrate how to infer networks for mixed variables through conditional independence, we consider two datasets: one in the area of sports and the other concerning breast cancer.

15.
Due to the complexity of present day supply chains it is important to select the simplest supply chain scheduling decision support system (DSS) which will determine and place orders satisfactorily. We propose to use a generic design framework, termed the explicit filter methodology, to achieve this objective. In doing so we compare the explicit filter approach to the implicit filter approach utilised in previous OR research, the latter focusing on minimising a cost function. Although the eventual results may well be similar with both approaches, it is much clearer to the designer both why and how an ordering system will reduce the Bullwhip effect via the explicit filter approach. The “explicit filter” approach produces a range of DSS designs corresponding to best practice. These may be “mixed and matched” to generate a number of competitive delivery pipelines to suit the specific business scenario.
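The filtering intuition can be illustrated with a standard order-up-to simulation (not the authors' explicit filter methodology): the exponential-smoothing forecast acts as a low-pass filter on the demand signal, and its smoothing constant controls how much demand variability the ordering rule amplifies.

```python
import numpy as np

rng = np.random.default_rng(3)
demand = 100 + 10 * rng.standard_normal(5000)   # i.i.d. demand
LEAD = 2   # replenishment lead time (periods), hypothetical

def bullwhip(alpha):
    """Order-up-to policy with exponential-smoothing forecasts.
    Returns Var(orders)/Var(demand); a ratio above 1 means the
    ordering rule amplifies demand variability (Bullwhip).  A
    smaller alpha filters the demand signal more heavily."""
    f = demand[0]
    orders = np.empty(len(demand))
    for t, d in enumerate(demand):
        f_new = f + alpha * (d - f)
        # change in the order-up-to level plus observed demand
        orders[t] = (LEAD + 1) * (f_new - f) + d
        f = f_new
    return orders.var() / demand.var()
```

Running `bullwhip` for a nervous forecast (alpha = 0.5) versus a smoothed one (alpha = 0.1) shows the amplification ratio falling toward 1 as the filter is made heavier, which is exactly the Bullwhip-reduction behaviour an explicit filter design makes visible up front.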

16.
This paper deals with the issue of arbitrage with differential information and incomplete financial markets, with a focus on information that no-arbitrage asset prices can reveal. Time and uncertainty are represented by two periods and a finite set S of states of nature, one of which will prevail at the second period. Agents may operate limited financial transfers across periods and states via finitely many nominal assets. Each agent i has private information about which state will prevail at the second period; this information is represented by a subset Si of S. Agents receive no wrong information in the sense that the “true state” belongs to the “pooled information” set ∩i Si, which is hence assumed to be non-empty. Our analysis is two-fold. We first extend the classical symmetric information analysis to the asymmetric setting, via a concept of no-arbitrage price. Second, we study how such no-arbitrage prices convey information to agents in a decentralized way. The main difference between the symmetric and the asymmetric settings stems from the fact that a classical no-arbitrage asset price (common to every agent) always exists in the first case, but no longer in the asymmetric one, thus allowing arbitrage opportunities. This is the main reason why agents may need to refine their information up to an information structure which precludes arbitrage.

17.
Two random variables X and Y on a common probability space are mutually completely dependent (m.c.d.) if each one is a function of the other with probability one. For continuous X and Y, a natural approach to constructing a measure of dependence is via the distance between the copula of X and Y and the independence copula. We show that this approach depends crucially on the choice of the distance function. For example, the Lp-distances, suggested by Schweizer and Wolff, cannot generate a measure of (mutual complete) dependence, since every copula is the uniform limit of copulas linking m.c.d. variables. Instead, we propose to use a modified Sobolev norm, with respect to which mutual complete dependence cannot approximate any other kind of dependence. This Sobolev norm yields the first nonparametric measure of dependence which, among other things, captures precisely the two extremes of dependence, i.e., it equals 0 if and only if X and Y are independent, and 1 if and only if X and Y are m.c.d. Examples are given to illustrate the difference from the Schweizer–Wolff measure.
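For concreteness, the Schweizer–Wolff-type L1 distance being critiqued can be estimated from data via the empirical copula (a brute-force sketch on invented data; the paper's Sobolev-norm measure is not reproduced here):

```python
import numpy as np

def schweizer_wolff(x, y):
    """Empirical version of the Schweizer-Wolff sigma: 12 times
    the average absolute distance between the empirical copula
    and the independence copula uv, evaluated on the rank grid."""
    n = len(x)
    u = (np.argsort(np.argsort(x)) + 1) / n   # normalised ranks
    v = (np.argsort(np.argsort(y)) + 1) / n
    total = 0.0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            # empirical copula at the grid point (i/n, j/n)
            C = np.mean((u <= i / n) & (v <= j / n))
            total += abs(C - (i / n) * (j / n))
    return 12.0 * total / n**2

rng = np.random.default_rng(4)
x = rng.standard_normal(100)
s_dep = schweizer_wolff(x, x)                       # m.c.d. pair
s_ind = schweizer_wolff(x, rng.standard_normal(100))  # independent pair
```

On a perfectly monotone (hence m.c.d.) pair the statistic sits near 1 and on an independent pair near 0, but, as the abstract argues, copulas of m.c.d. pairs can also be made uniformly close to any copula at all, which is why this distance cannot characterise mutual complete dependence.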

18.
Dynamic stochastic general equilibrium (DSGE) models have recently become standard tools for policy analysis. Nevertheless, their forecasting properties have still barely been explored. In this article, we address this problem by examining the quality of forecasts of the key U.S. economic variables: the three-month Treasury bill yield, the GDP growth rate and GDP price index inflation, from a small-size DSGE model, trivariate vector autoregression (VAR) models and the Philadelphia Fed Survey of Professional Forecasters (SPF). The ex post forecast errors are evaluated on the basis of the data from the period 1994–2006. We apply the Philadelphia Fed “Real-Time Data Set for Macroeconomists” to ensure that the data used in estimating the DSGE and VAR models were comparable to the information available to the SPF. Overall, the results are mixed. When comparing the root mean squared errors for some forecast horizons, it appears that the DSGE model outperforms the other methods in forecasting the GDP growth rate. However, this advantage turned out to be statistically insignificant. Most of the SPF's forecasts of GDP price index inflation and the short-term interest rate are better than those from the DSGE and VAR models.
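A sketch of the kind of forecast comparison described: RMSE plus a bare-bones Diebold–Mariano-style significance check on invented forecast errors (no HAC correction, unlike a full implementation, so it is strictly suited to 1-step-ahead, serially uncorrelated loss differentials):

```python
import numpy as np

def rmse(errors):
    return np.sqrt(np.mean(np.square(errors)))

def dm_test(e1, e2):
    """Simple Diebold-Mariano statistic for equal squared-error
    accuracy of two forecasts: t-statistic of the mean loss
    differential.  Negative values favour the first forecast."""
    d = np.square(e1) - np.square(e2)
    return np.mean(d) / (np.std(d, ddof=1) / np.sqrt(len(d)))

# hypothetical 1-step forecast errors for GDP growth, 52 quarters
rng = np.random.default_rng(5)
e_dsge = 0.9 * rng.standard_normal(52)
e_var = 1.0 * rng.standard_normal(52)

stat = dm_test(e_dsge, e_var)
# a |stat| below 1.96 would mean the RMSE gap is not significant
# at the 5% level, mirroring the article's finding for GDP growth
```

With a short evaluation sample like 1994–2006, modest RMSE gaps routinely fail this test, which is why a lower RMSE alone does not establish that one model forecasts better.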

19.
In this paper, we review the literature on information technology (IT) and culture. The construct of “culture” has alternately been defined and studied by international scholars as national culture, and by organizational scholars as organizational or corporate culture. We argue that, despite the considerable amount of research activity in these areas, the two research traditions have existed as “stovepipes,” operating in parallel but not communicating effectively with each other. After reviewing how the linkage between IT and culture has been conceptualized in the literatures on national and organizational culture, we identify some gaps in these research streams, and propose a new conceptualization of culture. Grounding our framework in social identity theory (SIT), we argue that it is necessary to advance from the fragmentary perspectives that exist at present to a more holistic view of culture. We believe that this novel perspective will enable scholars to move toward a more multi-faceted view of culture as a richly layered set of forces that shape individuals’ beliefs and actions. We also identify opportunities for mutual learning, areas of challenge, and domains of possible contradiction between the two research streams as one step toward further theoretical advances.

20.
Abstract. Several recent attempts to measure the quality of life directly are examined. The indicators developed for this purpose seem incapable of correctly ranking locations or time periods. Variable aggregation schemes have not summarized data in a way which is useful to policy makers. Failures arise partly because researchers do not agree on the set of variables to be measured. More importantly, the methodology used has been inconsistent with proposed theoretical models and has resulted in indicators which violate accepted laws of economics. For example, neither diminishing returns nor substitutability between quality of life “inputs” is recognized. Priority, it is believed, should be given to the solution of existing problems.
