Similar Documents
20 similar documents found (search time: 468 ms)
1.
This paper covers the main findings of doctoral research concerned with extending aspects of dilemma theory. In professional practice, the Trompenaars Hampden-Turner Dilemma Reconciliation Process™ is a vehicle for delivering dilemma theory in application: it shows a manager or leader how to explore the dilemmas they face, how to reconcile the resulting tensions, and how to structure the action steps for implementing the reconciled solutions. This vehicle forms the professional practice of the author, who seeks to bring more rigor to consulting practice and thereby also contribute to theory development in the domain. A critical review of dilemma theory reveals that previous authors are inconsistent, and at times invalid, in their use of terms such as 'dilemma theory,' 'dilemma methodology,' 'dilemma process,' and 'dilemma reconciliation.' An attempt is therefore made to resolve these inconsistencies by considering whether 'dilemmaism' might be positioned, at the meta-level, as a new paradigm of inquiry for (management) research, one embodying ontological, epistemological, and methodological premises that frame an approach to resolving real-world business problems in (multi)disciplinary, (multi)functional, and (multi)cultural business environments. The research contributes to knowledge, professional practice, and theory development through its exploration of the SPID model as a way to make the elicitation of dilemmas more rigorous and structured, and, more broadly, through its exploration of 'dilemmaism' as a new paradigm of inquiry.

2.
Based on 'endogenous' growth theory, the paper examines the effect of trade liberalization on long-run income per capita and economic growth in Turkey. Although the presumption must be that free trade benefits long-run growth, counterexamples can also be found; this controversy increases the importance of empirical work in the area. Using the most recent data, we employ multivariate cointegration analysis to test the long-run relationship among the variables at hand. In this multivariate context, determinants such as increasing returns to scale and investment in human and physical capital are also included, in both the theoretical and the empirical work. Our causality evidence between long-run growth and a number of indicators of trade liberalization confirms the predictions of the 'new growth theory'. However, the overall effect of possible breaks and/or policy changes and unsustainability in the 1990s looks contradictory and deserves further investigation.

3.
In this study it is argued that the perceived distribution of opinions among others is important for opinion research. Three ways of measuring the perception of opinion distributions in survey research are compared: (a) a question about what most people think about an issue, (b) a question about how many people are perceived to agree with an issue statement, and (c) 'line-production boxes', a special version of magnitude estimation. The results indicate that line-production boxes can improve data quality but also have some drawbacks that will have to be dealt with. Line-production boxes yield a wealth of information about individual differences in the forms of perceived opinion distributions: although the normal distribution is used often, many other distribution forms appear as well. The method of line-production boxes is also compared with the method of estimating percentage points; although high correlations suggest good concurrent validity, some systematic differences do exist. New research directions are suggested.

4.
Postulating a linear regression of a variable of interest on an auxiliary variable whose values are known for all units of a survey population, we consider appropriate ways of choosing a sample and estimating the regression parameters. Recalling Thomsen's (1978) results on the non-existence of 'design-cum-model' based minimum-variance unbiased estimators of regression coefficients, we apply Brewer's (1979) 'asymptotic' analysis to derive 'asymptotic-design-cum-model' based optimal estimators, assuming large population and sample sizes. A variance estimation procedure is also proposed.
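The estimator family discussed in this abstract exploits a known population mean of the auxiliary variable. As a hedged illustration only, the sketch below implements the textbook regression estimator of a population mean (the simple special case, not the paper's asymptotic-optimal estimator); the data are synthetic and all names are invented.

```python
import numpy as np

def regression_estimate(y_sample, x_sample, x_pop_mean, b=None):
    """Regression estimator of the population mean of y, using an
    auxiliary variable x whose population mean is known exactly.
    If the slope b is not supplied, the least-squares slope from
    the sample is used."""
    y_bar = np.mean(y_sample)
    x_bar = np.mean(x_sample)
    if b is None:
        b = np.cov(x_sample, y_sample, ddof=1)[0, 1] / np.var(x_sample, ddof=1)
    # Adjust the sample mean by how far the sampled x values drifted
    # from the known population mean of x.
    return y_bar + b * (x_pop_mean - x_bar)

# Synthetic population: y is linearly related to x plus noise.
rng = np.random.default_rng(0)
x_pop = rng.uniform(10, 50, size=10_000)
y_pop = 2.0 + 0.5 * x_pop + rng.normal(0, 1, size=10_000)

idx = rng.choice(10_000, size=200, replace=False)
est = regression_estimate(y_pop[idx], x_pop[idx], x_pop.mean())
```

With x and y strongly correlated, the adjustment term removes most of the sampling error that the plain sample mean of y would carry; setting `b=0` recovers the unadjusted sample mean.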

5.
The paper is a preliminary research report presenting a method for generating new records using an evolutionary algorithm (close to, but different from, a genetic algorithm). The method, called the Pseudo-Inverse Function (P-I Function for short), was designed and implemented at the Semeion Research Centre (Rome). The P-I Function generates new ('virtual') data from a small set of observed data. It can help when budget constraints limit the number of interviewees, when a population shows some sociologically interesting trait but is so small that the reliability of estimates is seriously affected, or in secondary analysis of small samples. The applicative ground is a research design with one or more dependent variables and a set of independent variables. New cases are estimated by maximizing a fitness function, yielding as many 'virtual' cases as needed, which reproduce the statistical traits of the original population. The algorithm used by the P-I Function is the Genetic Doping Algorithm (GenD), designed and implemented by the Semeion Research Centre; among its features is an innovative crossover procedure that tends to select individuals with average fitness values rather than those showing the best values at each 'generation'. A particularly thorough research design was set up: (1) the observed sample is split in half to obtain a training and a testing set, which are analysed by means of a back-propagation neural network; (2) testing is performed to find out how good the parameter estimates are; (3) a 10% sample is randomly extracted from the training set and used as a reduced training set; (4) on this narrow basis, GenD calculates the pseudo-inverse of the estimated parameter matrix; (5) the 'virtual' data are tested against the testing set (which has never been used for training).
The algorithm was tested on particularly difficult ground, since the data set used as a basis for generating 'virtual' cases contains only 44 respondents, randomly sampled from a broader data set taken from the General Social Survey 2002. The major result is that networks trained on the 'virtual' resample show a model fit as good as that of the observed data, though the 'virtual' and observed data differ on some features. GenD can be seen to 'refill' the joint distribution of the independent variables, conditioned on the dependent one. This paper is the result of close collaboration among all authors: Cinzia Meraviglia wrote §§1, 3, 4, 6, 7 and 8; Giulia Massini wrote §5; Daria Croce performed some elaborations with neural networks and linear regression; Massimo Buscema wrote §2.
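GenD itself is proprietary to Semeion and is not reproduced here. As a hedged sketch of the underlying idea only — evolving candidate 'virtual' datasets under a fitness function that rewards statistical similarity to a small observed sample — the following generic mutation-plus-selection loop uses invented data and illustrative parameters throughout.

```python
import numpy as np

rng = np.random.default_rng(42)

# A small 'observed' sample (44 cases x 3 variables) standing in for survey data.
observed = rng.normal(loc=[0.0, 1.0, -0.5], scale=[1.0, 1.5, 0.5], size=(44, 3))
target_mean, target_std = observed.mean(axis=0), observed.std(axis=0)

def fitness(candidate):
    """Reward candidate datasets whose per-variable means and standard
    deviations match those of the observed sample."""
    return (-np.sum((candidate.mean(axis=0) - target_mean) ** 2)
            - np.sum((candidate.std(axis=0) - target_std) ** 2))

# Evolve a population of candidate 'virtual' datasets: per-column affine
# mutations (random rescale and shift) followed by truncation selection.
population = [rng.normal(size=(100, 3)) for _ in range(20)]
for _ in range(400):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # keep the fittest half
    children = [p * (1 + rng.normal(scale=0.05, size=3))
                + rng.normal(scale=0.05, size=3) for p in parents]
    population = parents + children

virtual = max(population, key=fitness)
# virtual now roughly reproduces the observed means and spreads.
```

A real application would score higher-order statistical traits (and, as in the paper, validate the virtual cases against a held-out testing set), not just means and standard deviations.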

6.
Within the class of weighted averages of ordered measurements on criteria of 'equal importance', we discuss aggregators that help assess the extent to which 'most' criteria are satisfied, or a 'good performance across the board' is attained. All the aggregators discussed are compromises between the simple mean and the very demanding minimum score. The weights of the aggregators are described in terms of the convex polytopes induced by the characteristics of the aggregators, such as concavity. We take the barycentres of the polytopes as representative weights.
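Weighted averages of ordered scores of this kind are ordered-weighted-averaging (OWA) aggregators. A minimal sketch of the mean-to-minimum spectrum follows; the scores and the 'compromise' weight vector are invented for illustration, and the paper's barycentric weights are not reproduced.

```python
def ordered_weighted_average(scores, weights):
    """Aggregate scores with weights applied to the scores in ascending
    order, so weight mass near the front emphasizes the worst scores."""
    if len(weights) != len(scores) or abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must match scores in length and sum to 1")
    return sum(w * s for w, s in zip(weights, sorted(scores)))

scores = [0.9, 0.4, 0.7, 0.8]
n = len(scores)

mean_w     = [1.0 / n] * n          # simple mean: equal weight to every rank
min_w      = [1.0, 0.0, 0.0, 0.0]   # minimum: all weight on the worst score
compromise = [0.4, 0.3, 0.2, 0.1]   # in between: worst-leaning (concave) weights

# ordered_weighted_average(scores, mean_w)     -> 0.7  (the mean)
# ordered_weighted_average(scores, min_w)      -> 0.4  (the minimum)
# ordered_weighted_average(scores, compromise) -> 0.62 (a compromise)
```

Moving weight mass toward the front of the (ascending) order interpolates continuously from the forgiving mean to the demanding minimum, which is exactly the compromise space the abstract describes.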

7.
This paper discusses ways of making efficiency measurement in health care more useful. Options are discussed for the potential introduction of guidelines for undertaking studies in this area, in order to make them more useful to policy makers and those involved in service delivery. The process of introducing such guidelines is examined using the example of how guidelines for the economic evaluation of health technologies were developed. This suggests two alternative ways forward: 'revolution', the establishment of a panel to set initial guidelines, or 'evolution', the more gradual development of such guidelines over time. The third alternative, the 'status quo', representing the current state of play, serves as the base-case scenario. It is concluded that although the field is already well advanced in terms of techniques and publications, revolution followed by evolution is perhaps the way forward.

8.
Batteries of questions with identical response items are commonly used in survey research. This paper suggests that question order has the potential to cause systematic positive or negative bias in responses to all questions in a battery. Whilst question order effects have been studied for many decades, almost no attention has been given to this battery-wide form of bias. The primary aim is to draw attention to the effect, to demonstrate its possible magnitude, and to discuss a range of mechanisms through which it might occur, including satisficing, anchoring and cooperativeness. The effect is apparent in the results of a recent survey of Emergency Department patients presenting to Wollongong Hospital (Australia) with apparently less urgent conditions in 2004. Two samples were taken: question order was fixed in the first sample (n = 104; response rate RR2 = 94%) but randomised in the second (n = 46; response rate RR2 = 96%). Respondents were asked to indicate whether each of 18 reasons for presenting to the ED was a 'very important reason', a 'moderately important reason' or 'not a reason'. The mean number of very important reasons selected was 56% higher in the first sample than in the second.
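Per-respondent randomisation of item order, as used in the second sample, is simple to implement. A hedged sketch follows; the item wordings are invented stand-ins, not the actual 18-item instrument.

```python
import random

reasons = [
    "Could not get a GP appointment",
    "Thought the problem was urgent",
    "Wanted an X-ray or other tests",
    # ... the real battery contained 18 reasons
]

def randomized_battery(items, seed=None):
    """Return a per-respondent random ordering of battery items, so any
    order-driven bias averages out across the sample rather than loading
    systematically onto particular items."""
    order = list(items)          # copy: never mutate the master list
    random.Random(seed).shuffle(order)
    return order

# Each respondent sees their own ordering (seeded here for reproducibility):
for respondent_id in range(3):
    presented = randomized_battery(reasons, seed=respondent_id)
```

Randomisation spreads any positional effect evenly over items; it does not remove a battery-wide bias affecting all items equally, which is why the paper's fixed-vs-randomised comparison between samples is informative.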

9.
This paper analyzes whether respondents' attitudes toward surveys explain their susceptibility to item nonresponse. In contrast to previous studies, the decision to refuse to provide income information, the decision not to answer other questions, and the probability of 'don't know' responses are tested separately; the interviewers' overall judgments of response willingness were included as well. Respondents with a positive and cognitively accessible attitude toward surveys were expected to adopt a cooperative orientation and were thus deemed more likely to answer difficult as well as sensitive questions. Attitudes were measured with a 16-item instrument, and response latencies were used as an indicator of attitude accessibility. We found that respondents with more favorable evaluations of surveys had lower values on all kinds of nonresponse indicators. Except for the strong effect on the prevalence of 'don't knows', survey attitudes were increasingly more predictive of all other aspects of nonresponse when the attitude answers were given faster and were thus cognitively more accessible. This accessibility, and thus the relevance of survey attitudes to nonresponse, was found to increase with subjects' past exposure to surveys.

10.
Chen, Shun-Hsing. Quality and Quantity, 2012, 46(4): 1279–1296
This study draws on the SERVQUAL model and the Plan-Do-Check-Action (P-D-C-A) cycle of TQM to establish a quality management system for higher education. In this system, the 'Plan' and 'Do' dimensions together comprise ten execution factors and complement each other; the 'Check' dimension has four factors and the 'Action' dimension has three. Each execution factor of the quality management system can reduce the occurrence of the five gaps in the SERVQUAL model and help education providers deliver better service quality. Education providers strengthen an education system through careful planning and the implementation of quality auditing and continuous improvement. The desired result of this study is a more explicit framework for the higher education industry and more appropriate services for students.

11.
While International Entrepreneurship has attracted scholars' attention during the last two decades, the impact of cognitive aspects has been studied at only a cursory level. The purpose of this paper is to apply the Theory of Planned Behavior (TPB) to the field of International Entrepreneurship in order to examine whether the theory helps clarify what influences Small and Medium-sized Enterprise (SME) decision-makers' intention (an important cognitive antecedent of behavior) to play an active part in internationalization. In particular, it had to be clarified whether International Entrepreneurship, owing to its contextual specificities, warrants extending the theory with further elements, namely experience and knowledge. Based on more than 100 responses from German SME executives, the study yielded several interesting results. First, the TPB does help explain how intentions to participate actively in international business are formed. Second, an extension of the theory's basic model seems to make sense, probably owing to the specificities of international entrepreneurial behavior; both direct and moderating effects of the extensions were observed. Furthermore, cognitive elements appear to be key entrepreneurial resources that serve as enablers. Several conclusions can be drawn from these results. Cognitive aspects are a promising starting point for understanding decision-making in SMEs, so the intersection of international entrepreneurship and entrepreneurial cognition deserves further attention; several examples of possible future studies are presented. Policies supporting SMEs should be extended, as purely resource-based approaches seem insufficient. Furthermore, entrepreneurship courses and curricula should reflect the relevance of cognitive aspects.

12.
Interpretative qualitative social science has attempted to distinguish itself from quantitative social science by rejecting traditional or 'received' notions of generalization. Traditional concepts of scientific generalization, it is claimed, are based on a misguided objectivism about the mechanisms operating in the social world, and particularly about the ability of statements to capture such mechanisms in any abstract sense. Interpretativists instead propose new versions of the generalizability concept, e.g. 'transferability', which relies on a researcher's context-dependent judgement of 'fit' between two or more case instances. This paper argues that the transferability concept, as outlined and argued by interpretativist methodologists, is thoroughly coextensive with the notions of generalizability formalized for natural science and naturalistic social science by philosophers and methodologists of science. It may therefore be concluded that the interpretativist claim to a break with received scientific traditions is premature, at least with regard to generalization.

13.
Although the Big Five Questionnaire for Children (BFQ-C) (C. Barbaranelli et al., Manuale del BFQ-C. Big Five Questionnaire Children, O.S. Organizzazioni, Firenze, 1998) is an ordinal scale, its dimensionality has often been studied using factor analysis with Pearson correlations. In contrast, the present study takes the ordinal metric into account and examines the dimensionality of the scale using factor analysis based on a matrix of polychoric correlations. The sample comprised 852 subjects (40.90% boys and 59.10% girls). As in previous studies, the results of the exploratory factor analysis revealed a five-factor structure (extraversion, agreeableness, conscientiousness, emotional instability and openness). However, the results of the confirmatory factor analysis were consistent with both a four- and a five-factor structure, the former showing a slightly better fit and an adequate theoretical interpretation. These data confirm the need for further research into whether the factor 'Openness' should be maintained as an independent factor (five-factor model) or whether it would be best to omit it and distribute its items among the factors 'Extraversion' and 'Conscientiousness' (four-factor model).

14.
Coauthorship links actors at the micro-level of individual scientists, and electronic databases now hold enough information to compare entire research disciplines over time. We compare the complete longitudinal coauthorship networks of four research disciplines (biotechnology, mathematics, physics and sociology) for 1986–2005, examining the complete bibliographies of all researchers registered at the national Slovene Research Agency. Known hypotheses were confirmed, as were three new ones. The disciplines have different coauthoring cultures, but these cultures changed over time in Slovenia: the number of coauthored publications grew much faster than solo-authored output, especially after independence in 1991 and the integration of Slovenian science into broader EU systems, and the trajectories of types of coauthorship differed across the disciplines. Using blockmodeling, we show how coauthorship structures changed in all disciplines. The most frequent form was a core-periphery structure with multiple simple cores, a periphery and a semi-periphery; the next most frequent had this structure but with bridging cores, which consolidate the center of a discipline by giving it greater coherence. These consolidated structures appeared at different times in different disciplines, earliest in physics and latest in biotechnology; in 2005, biotechnology had the most consolidated center, followed by physics and sociology. All the coauthorship networks expanded over time, with the vast majority of new recruits entering either the semi-periphery or the periphery in all fields. The two 'lab' fields, biotechnology and physics, have larger semi-peripheries than peripheries; the reverse holds for mathematics and sociology, two 'office' disciplines. Institutional affiliations and shared interests all shape the structure of collaboration in subtle ways.
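Coauthorship networks of the kind analyzed here are built by linking every pair of authors who share a publication, with edge weights counting shared papers. A minimal sketch follows, using invented bibliographic records; the blockmodeling step itself (core/semi-periphery/periphery partitions) is beyond a few lines and is not shown.

```python
from collections import defaultdict
from itertools import combinations

# Invented records: each publication is simply its list of authors.
publications = [
    ["Novak", "Kovac"],
    ["Novak", "Kovac", "Zupan"],
    ["Horvat"],                    # a solo-authored publication: no edges
    ["Zupan", "Horvat"],
]

def coauthorship_network(pubs):
    """Weighted coauthorship graph: an edge for every author pair that
    shares a paper, weighted by the number of shared papers."""
    weights = defaultdict(int)
    for authors in pubs:
        for a, b in combinations(sorted(set(authors)), 2):
            weights[(a, b)] += 1
    return dict(weights)

net = coauthorship_network(publications)
solo_count = sum(1 for authors in publications if len(set(authors)) == 1)
# net[("Kovac", "Novak")] == 2: Novak and Kovac coauthored twice.
```

Tracking `solo_count` against the total edge weight per year is one simple way to observe the coauthored-versus-solo growth pattern the abstract reports.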

15.
We characterize the equilibrium of the all-pay auction with general convex costs of effort and sequential effort choices. We consider a set of n players who are arbitrarily partitioned into a group who choose their efforts 'early' and a group who choose 'late'. Only the player with the lowest cost of effort has a positive payoff in any equilibrium; this payoff depends on his own timing vis-à-vis the timing of the others. We also show that the choice of timing can be endogenized, in which case the strongest player typically chooses 'late', whereas all other players are indifferent with respect to their timing. In the most prominent equilibrium, the player with the lowest cost of effort wins the auction at zero aggregate cost. We thank Dan Kovenock and Luis C. Corchón for discussion and helpful comments; the usual caveat applies. Wolfgang Leininger would like to express his gratitude to the Wissenschaftszentrum Berlin (WZB) for its generous hospitality and financial support.
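The result that only the strongest player earns a positive payoff can be checked numerically in the much simpler textbook case: the two-player complete-information all-pay auction with linear costs and simultaneous bids (the Hillman-Riley equilibrium), not the paper's general convex-cost sequential setting. A hedged Monte Carlo sketch, with illustrative valuations:

```python
import numpy as np

# Two-player complete-information all-pay auction, linear costs,
# valuations v1 > v2. Textbook mixed-strategy equilibrium:
#   player 1 bids Uniform[0, v2];
#   player 2 bids 0 with prob 1 - v2/v1, else Uniform[0, v2].
v1, v2 = 2.0, 1.0
rng = np.random.default_rng(7)
n = 200_000

b1 = rng.uniform(0.0, v2, size=n)
active = rng.random(n) < v2 / v1                       # player 2 participates
b2 = np.where(active, rng.uniform(0.0, v2, size=n), 0.0)

# All-pay: both players forfeit their bid; only the higher bid wins the prize.
payoff1 = np.where(b1 > b2, v1, 0.0) - b1
payoff2 = np.where(b2 > b1, v2, 0.0) - b2

# Empirically, mean(payoff1) is close to v1 - v2 = 1 and mean(payoff2)
# is close to 0: only the strongest player earns a positive payoff.
```

This mirrors the abstract's qualitative claim in the simplest setting; the paper's contribution is extending the characterization to convex costs and sequential timing.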

16.
Ding, Ji-Feng. Quality and Quantity, 2009, 43(4): 553–570
The main purpose of this paper is to apply a fuzzy quality function deployment (QFD) model to identify solutions for the service delivery system (SDS) of the port of Kaohsiung from the viewpoint of customers. First, the 'what' question of customer needs and the 'how' question of the services must be specified; these are the two major components emphasized in the house of quality (HOQ) matrices. Systematic procedures using fuzzy QFD, in conjunction with fuzzy set theory, are then proposed. A case study of the port of Kaohsiung demonstrates the systematic appraisal process for identifying SDS solutions. The empirical results show that (1) ten key factors are identified as priorities for improving the quality of the SDS for the port of Kaohsiung, and (2) eight feasible solutions for improving service quality performance are identified. It is suggested that the Port Authority of Kaohsiung should listen attentively to the voice of its customers, exploit these customer requirements effectively, and then develop the 'how' profiles of solutions, continuously strengthening the customer, internal business process, and learning and growth perspectives.

17.
This paper reports on a corruption experiment conducted in two treatments: one with the possibility of detection and one without. It turns out that monitoring reduces corruption through deterrence but at the same time destroys the intrinsic motivation for honesty, so the net effect on overall corruption is a priori undetermined. We show that the salary level influences corruption through the increased opportunity costs of corruption, but we fail to find evidence for a 'payment satisfaction' effect. Interesting policy conclusions emerge. Acknowledgments: We are indebted to Johann Graf Lambsdorff for calling our attention to Fujimori's gender policy, and to Ernst Fehr, Bruno Frey, Alireza Jay Naghavi, and two anonymous referees for valuable comments.

18.
Customer value from a customer perspective: a comprehensive review
The value concept is one of marketing theory's basic elements. Identifying and creating customer value (CV), understood as value for customers, is regarded as an essential prerequisite for future company success. Nevertheless, CV has only recently received much research attention, and ideas on how to conceptualize the concept and link it to other constructs vary widely; the literature contains a multitude of different definitions, models, and measurement approaches. This article provides a broad overview, analysis, and critical evaluation of the trends and approaches found to date in this research field, encompassing the development of perceived and desired customer value research, the relationships between the CV construct and other central marketing constructs, and the linkage between CV and the company's interpretation of the value of the customer, such as customer lifetime value (CLV). The article concludes by pointing out some of the challenges this field of research will face in the future.

19.
A classic puzzle in the economic theory of the firm concerns the fundamental cause of decreasing returns to scale. If a plant producing product quantity X at cost C can be replicated as often as desired, then the quantity rX need never cost more than rC. Traditionally the firm is imagined to take its identity from a fixed non-replicable input, namely a 'top manager'; as more plants or divisions are added, the communication and computation burden imposed on the top manager (who has information not possessed by the divisions) grows more than proportionately. Decreasing returns are experienced as the top manager hires more variable inputs to cope with the rising burden. Suppose it turns out, however, that when the divisions are assembled and given exactly the same totally independent tasks that they fulfilled when they were autonomous, a saving can be achieved if they adopt a joint procedure for performing those tasks rather than replicating their previous separate procedures. Then the top manager's rising burden must be shown to be particularly onerous; otherwise there may actually be increasing returns. We show that for a certain model of the information-processing procedure used by the separate divisions and by the firm, there may indeed be such an odd unexpected saving. The saving occurs with respect to the size of the language in which members of each division, or of the firm, communicate with one another, provided that language is finite. If instead the language is a continuum, the saving cannot occur, provided the procedures used obey suitable 'smoothness' conditions. We show that the saving in the finite case can be ruled out in two ways: by requiring the procedures used to obey a regularity condition that is a crude analogue of the smoothness conditions imposed on the continuum procedures, or by insisting that the procedure used be a 'deterministic' protocol.
Such a protocol prescribes a conversation among the participants in which a participant has only one choice whenever he or she has to make an announcement to the others. The results suggest that a variety of information-processing models will have to be studied before the traditional explanation for decreasing returns to scale is understood in a rigorous way.

20.
In this paper, the validity of vignette analyses of various forms of deviant behavior in the presence of opportunities is analyzed on the basis of ideas derived from cognitive psychology. Abelson's script theory, together with insights into human memory for visual and verbal information, supports the assumption that vignette analyses using visual stimuli are particularly valid measures of deviant behavior. The study includes an empirical examination of these ideas (n = 450). Nonparticipant observations and vignette analyses with visual and verbal material were carried out for three forms of deviant behavior occurring when opportunities present themselves in everyday life. Observed and self-reported frequencies of deviant behavior or deviant intentions were counted and cross-tabulated, and log-linear analyses with dummy coding, using the observation data as the reference category, were run. The analyses showed that the frequencies of deviant behavior were related to the data-collection technique under consideration. In particular, vignette analyses of the return of 'lost letters' that use both visual and verbal stimuli overestimate 'actual' (i.e. observed) return rates. This result is discussed with regard to the underlying methodological assumptions as well as its implications.
