Similar literature: 20 matching records found.
1.
In this study it will be argued that the perceived distribution of opinions among others is important for opinion research. Three ways of measuring the perception of opinion distributions in survey research are compared: (a) asking what most people think about an issue, (b) asking how many people are perceived to agree with an issue statement, and (c) using ‘line-production boxes’, a special version of magnitude estimation. The results indicate that ‘line-production boxes’ can improve data quality but also have some drawbacks that will have to be dealt with. ‘Line-production boxes’ give a wealth of information about individual differences in the forms of perceived opinion distributions: although the normal distribution is used often, many other distribution forms appear as well. The method of ‘line-production boxes’ is compared with the method of estimating percentage points. Although high correlations suggest good concurrent validity, some systematic differences do exist. New research directions are suggested.

2.
D. G. Kabe 《Metrika》1970,15(1):15-18
Summary  Likes obtains the distributions of Dixon’s statistics for an exponential population and tabulates the upper 100α% points (α = 0.1, 0.05, 0.01) of some of these distributions. The distributions of these statistics can be expressed in terms of finite series of beta functions, so the probabilities of rejecting suspected outliers can easily be calculated on a desk calculator. The difficult task of tabulating the 100α% values of these statistics may thus be avoided.
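As a hedged illustration of the idea in this abstract: the paper works with closed-form beta-series expressions, but the percentage points of a Dixon-type ratio under an exponential null can also be approximated by simulation. The specific ratio (r10), sample size, and rate below are assumptions for the sketch, not taken from the paper.

```python
import random

def dixon_r10(sample):
    """Dixon's r10 ratio for a suspected low outlier: the gap between the
    two smallest observations divided by the full range."""
    s = sorted(sample)
    return (s[1] - s[0]) / (s[-1] - s[0])

def mc_critical_value(n, alpha=0.05, reps=20000, seed=1):
    """Upper 100*alpha% point of r10 under an exponential null, by Monte Carlo."""
    rng = random.Random(seed)
    stats = sorted(
        dixon_r10([rng.expovariate(1.0) for _ in range(n)]) for _ in range(reps)
    )
    return stats[int((1 - alpha) * reps)]

crit = mc_critical_value(n=10, alpha=0.05)
print(round(crit, 3))
```

A sample whose observed r10 exceeds `crit` would have its smallest observation flagged as a suspected outlier at the 5% level.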

3.
This paper uses firm-level data recorded in the Amadeus database to investigate the distribution of labour productivity in different European countries. We find that the upper tail of the empirical productivity distributions follows a decaying power law, whose exponent α is obtained by a semi-parametric estimation technique recently developed by Clementi et al. [Physica A 370(1):49–53, 2006]. The emergence of ‘fat tails’ in the productivity distribution has already been detected in Di Matteo et al. [Eur Phys J B 47(3):459–466, 2005] and explained by means of a social network model. Here we test this model on a broader sample of countries with different social network structures. These differing social attitudes, measured using a social capital indicator, are reflected in the power-law exponent estimates, thereby verifying the linkage between firms’ productivity performance and social networks.
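The paper estimates the tail exponent α with the semi-parametric method of Clementi et al.; as a hedged sketch of tail-index estimation in general (not the authors’ method), here is the classical Hill estimator applied to synthetic Pareto-tailed data with a known exponent. The sample size, cut-off k, and true α = 2 are illustrative assumptions.

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the power-law tail exponent alpha, based on the
    k largest observations relative to the (k+1)-th largest."""
    x = sorted(data, reverse=True)
    logs = [math.log(x[i] / x[k]) for i in range(k)]
    return k / sum(logs)

# Synthetic check: Pareto tail with alpha = 2, sampled by inverse-CDF.
rng = random.Random(0)
sample = [(1 - rng.random()) ** (-1 / 2.0) for _ in range(50000)]
alpha_hat = hill_estimator(sample, k=1000)
print(round(alpha_hat, 2))
```

The estimate should land close to the true exponent of 2; in applied work the choice of k is itself a delicate tuning problem, which is one motivation for semi-parametric alternatives.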

4.
We extend an earlier model of innovation dynamics based on percolation by adding endogenous R&D search by economically motivated firms. The {0, 1} seeding of the technology lattice is now replaced by draws from a lognormal distribution for technology ‘difficulty’. Firms are rewarded for successful innovations by increases in their R&D budget. We compare two regimes. In the first, firms are fixed in a region of technology space. In the second, they can change their location by myopically comparing progress in their local neighborhoods and probabilistically moving to the region with the highest recent progress. We call this the moving or self-organizational regime (SO). The SO regime always outperforms the fixed one, but its performance is a complex function of the ‘rationality’ of firm search (in terms of search radius and speed of movement). The clustering of firms in the SO regime grows rapidly and then fluctuates in a complex way around a high value that increases with the search radius. We also investigate the size distributions of the innovations generated in each regime. In the fixed one, the distribution is approximately lognormal and certainly not fat tailed. In the SO regime, the distributions are radically different: they are much more highly right skewed and show scaling over at least two decades with a slope around one, for a wide range of parameter settings. Thus we argue that firm self-organization leads to self-organized criticality. The online version of the original article can be found under doi:.
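As a loose, hedged toy rendition of the setup described (not the authors’ model; the lattice width, number of firms, reward sizes, and decay rate are all invented for the sketch): each column of a technology lattice holds steps of lognormal ‘difficulty’, firms succeed when their R&D budget covers the next step, and in the moving (SO) regime they hop toward neighbouring columns with the most recent progress.

```python
import random

def run(moving, width=20, n_firms=10, steps=2000, radius=2, seed=42):
    """Toy percolation-style innovation search under a fixed or moving regime."""
    rng = random.Random(seed)
    frontier = [0] * width                          # innovations per column
    next_cost = [rng.lognormvariate(1.0, 1.0) for _ in range(width)]
    budgets = [1.0] * n_firms
    cols = [rng.randrange(width) for _ in range(n_firms)]
    recent = [0.0] * width                          # decaying recent progress
    for _ in range(steps):
        for f in range(n_firms):
            c = cols[f]
            if budgets[f] >= next_cost[c]:          # innovation succeeds
                budgets[f] += 1.0 - next_cost[c]    # spend effort, reap reward
                frontier[c] += 1
                recent[c] += 1.0
                next_cost[c] = rng.lognormvariate(1.0, 1.0)
            else:
                budgets[f] += 0.1                   # steady R&D funding
            if moving:                              # hop toward recent progress
                nbrs = [(c + d) % width for d in range(-radius, radius + 1)]
                cols[f] = max(nbrs, key=lambda j: recent[j])
        recent = [0.9 * r for r in recent]
    return sum(frontier)

fixed_total, so_total = run(moving=False), run(moving=True)
print(fixed_total, so_total)
```

Comparing total innovations across the two regimes (and across search radii) mimics, in miniature, the regime comparison the abstract describes.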

5.
Summary  We consider the transient behaviour of a queuing system in which (i) the input, following a Poisson distribution, arrives in batches of variable size; (ii) the queue discipline is ‘first come, first served’, the batches being assumed pre-ordered for service; and (iii) the service-time distribution is hyper-exponential with n branches. The Laplace transform of the system-size distribution is determined by the method of generating functions, introduced into queuing theory by Bailey [1]. Assuming steady-state conditions obtain, the problem is completely solved, and it is shown that, by suitably defining the traffic intensity factor ρ, the probability p0 of no delay remains the same for batch arrivals as for single arrivals. The Laplace transform of the waiting-time distribution is also calculated in the steady-state case, from which the mean waiting time may be obtained. Some known results are derived as particular cases.
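The claim that a suitably defined ρ gives the same p0 as for single arrivals can be checked numerically: in any work-conserving single-server queue the long-run busy fraction equals ρ = λ·E[batch size]·E[service time], so the idle (no-delay-on-arrival-epoch) fraction is 1 − ρ. A hedged simulation sketch with two hyper-exponential branches and illustrative parameter values:

```python
import random

rng = random.Random(7)
lam = 0.25                               # batch arrival rate (Poisson)
batch_sizes = (1, 2, 3)                  # equally likely batch sizes
p, mu1, mu2 = 0.4, 2.0, 0.5              # two-branch hyper-exponential service

def service_time():
    # branch 1 with probability p, branch 2 otherwise
    return rng.expovariate(mu1 if rng.random() < p else mu2)

E_B = sum(batch_sizes) / len(batch_sizes)        # mean batch size = 2
E_S = p / mu1 + (1 - p) / mu2                    # mean service time = 1.4
rho = lam * E_B * E_S                            # traffic intensity = 0.7

t, horizon, busy, backlog = 0.0, 200000.0, 0.0, 0.0
while t < horizon:
    gap = rng.expovariate(lam)                   # time until next batch
    busy += min(backlog, gap)                    # server works off backlog
    backlog = max(0.0, backlog - gap)
    t += gap
    backlog += sum(service_time() for _ in range(rng.choice(batch_sizes)))

print(round(rho, 3), round(1 - busy / t, 3))     # idle fraction ~ 1 - rho
```

The simulated idle fraction should agree with 1 − ρ, matching the abstract’s statement that p0 is unchanged by batching once ρ is defined appropriately.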

6.
Summary  Murthy’s variance estimator for the estimator he obtained by unordering Des Raj’s estimator is shown to be non-negative.

7.
A ‘trilemma’ procedure is introduced for collecting ‘dominance data’ (i.e. rankings of a set of items along a scale of relevance, preference, etc.). Trilemmas are three-way forced choices in which the three items comprising each trilemma are selected on the basis of a multidimensional scaling (MDS) solution for the item set, ensuring that each choice is as stark and informative as possible. A questionnaire designed on this principle is easily understood and rapidly administered. The data are convenient to record and show less fluctuation among informants than existing techniques. We demonstrate the procedure with a set of 45 short generalisations about behaviour, designed for assessing child attachment. A three-dimensional ‘map’ of these items was obtained by applying MDS to multiple sets of similarity data. The same structure emerged from English-language and Japanese translations of the items. Thirty trilemmas based on this map were used to rank the items by degree of association with the Japanese concept of amae, characterising the concept in terms of its behavioural correlates.
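One way to operationalise ‘each choice as stark as possible’ is to pick triples that are maximally spread out in the MDS map. This is a hedged sketch of that selection step only (the paper does not specify this exact criterion; the 12 random 3-D coordinates stand in for real MDS output):

```python
import itertools
import math
import random

def starkness(triple, coords):
    """Smallest pairwise distance within a triple: larger values mean the
    three items are far apart in the MDS map, so the forced choice is starker."""
    a, b, c = (coords[i] for i in triple)
    return min(math.dist(a, b), math.dist(a, c), math.dist(b, c))

def pick_trilemmas(coords, n_trilemmas):
    """Rank all item triples by starkness and keep the top n_trilemmas."""
    triples = itertools.combinations(range(len(coords)), 3)
    ranked = sorted(triples, key=lambda t: starkness(t, coords), reverse=True)
    return ranked[:n_trilemmas]

# Illustrative 3-D MDS coordinates for 12 items.
rng = random.Random(3)
coords = [(rng.random(), rng.random(), rng.random()) for _ in range(12)]
best = pick_trilemmas(coords, 5)
print(best[0], round(starkness(best[0], coords), 3))
```

In practice one would also balance item coverage across trilemmas (each of the 45 items appearing in the 30 trilemmas), which this minimal sketch ignores.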

8.
Within the class of weighted averages of ordered measurements on criteria of ‘equal importance’, we discuss aggregators that help assess to what extent ‘most’ criteria are satisfied, or a ‘good performance across the board’ is attained. All aggregators discussed are compromises between the simple mean and the very demanding minimum score. The weights of the aggregators are described in terms of the convex polytopes induced by the characteristics of the aggregators, such as concavity. We take the barycentres of the polytopes as representative weights.
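The aggregators in question attach weights to rank positions rather than to particular criteria (ordered weighted averages). A minimal sketch showing the two extremes the abstract names, the simple mean and the minimum, plus one illustrative compromise weight vector (the linearly decreasing weights are an assumption, not the paper’s barycentre weights):

```python
def owa(scores, weights):
    """Ordered weighted average: weights attach to rank positions
    (position 0 = worst score), not to particular criteria."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * s for w, s in zip(weights, sorted(scores)))

scores = [0.9, 0.4, 0.8, 0.7]
n = len(scores)

mean_w = [1 / n] * n                     # simple mean: equal weights
min_w = [1.0] + [0.0] * (n - 1)          # minimum: all weight on the worst score
# An in-between compromise: weight decreasing linearly from worst to best.
comp_w = [w / sum(range(1, n + 1)) for w in range(n, 0, -1)]

print(owa(scores, mean_w), owa(scores, min_w), owa(scores, comp_w))
```

Any such compromise lies between the minimum and the mean, which is exactly the class of aggregators whose weight polytopes the paper studies.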

9.
This paper covers the main findings of doctoral research concerned with extending aspects of dilemma theory. In professional practice, the Trompenaars Hampden-Turner Dilemma Reconciliation ProcessTM is a vehicle for applying dilemma theory: it shows a manager or leader how to explore the dilemmas they face, how to reconcile the resulting tensions, and how to structure the action steps for implementing the reconciled solutions. This vehicle forms the professional practice of the author, who seeks to bring more rigor to consulting practice and thereby also contribute to theory development in the domain. A critical review of dilemma theory reveals that previous authors have been inconsistent, and at times invalid, in their use of the terms ‘dilemma theory’, ‘dilemma methodology’, ‘dilemma process’, ‘dilemma reconciliation’, etc. An attempt is therefore made to resolve these inconsistencies by considering whether ‘dilemmaism’ at the meta-level might be positioned as a new paradigm of inquiry for (management) research, one embodying ontological, epistemological, and methodological premises that frame an approach to resolving real-world business problems in (multi)disciplinary, (multi)functional, and (multi)cultural business environments. The research contributes to knowledge, professional practice, and theory development through its exploration of the SPID model as a way to make the elicitation of dilemmas more rigorous and structured, and, more broadly, through its exploration of ‘dilemmaism’ as a new paradigm of inquiry.

10.
The paper is a preliminary research report presenting a method for generating new records using an evolutionary algorithm (close to, but different from, a genetic algorithm). This method, called the Pseudo-Inverse Function (P-I Function for short), was designed and implemented at the Semeion Research Centre (Rome). The P-I Function generates new (virtual) data from a small set of observed data. It can be of aid when budget constraints limit the number of interviewees, when a population shows some sociologically interesting trait but its small size can seriously affect the reliability of estimates, or in secondary analysis of small samples. The applicative ground is a research design with one or more dependent variables and a set of independent variables. New cases are estimated by maximizing a fitness function, yielding as many ‘virtual’ cases as needed, which reproduce the statistical traits of the original population. The algorithm used by the P-I Function is the Genetic Doping Algorithm (GenD), designed and implemented by the Semeion Research Centre; among its features is an innovative crossover procedure, which tends to select individuals with average fitness values rather than those showing the best values at each ‘generation’. A particularly thorough research design was adopted: (1) the observed sample is split in half to obtain a training and a testing set, which are analysed by means of a back-propagation neural network; (2) testing is performed to find out how good the parameter estimates are; (3) a 10% sample is randomly extracted from the training set and used as a reduced training set; (4) on this narrow basis, GenD calculates the pseudo-inverse of the estimated parameter matrix; (5) ‘virtual’ data are tested against the testing data set (which has never been used for training).
The algorithm was tested on particularly difficult ground, since the data set used as a basis for generating ‘virtual’ cases counts only 44 respondents, randomly sampled from a broader data set taken from the General Social Survey 2002. The major result is that networks trained on the ‘virtual’ resample show a model fit as good as that of the observed data, though ‘virtual’ and observed data differ on some features. GenD can be seen to ‘refill’ the joint distribution of the independent variables, conditioned by the dependent one. This paper is the result of close collaboration among all the authors: Cinzia Meraviglia wrote §§1, 3, 4, 6, 7 and 8; Giulia Massini wrote §5; Daria Croce performed some elaborations with neural networks and linear regression; Massimo Buscema wrote §2.
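The distinctive GenD feature the abstract names is crossover selection biased toward *average*-fitness individuals rather than the fittest. This is a loose, hedged sketch of that selection idea on a toy task (generating virtual samples of 44 values matching an observed mean and spread); it is not Semeion’s GenD, and the fitness function, population size, and mutation scheme are all invented for illustration:

```python
import random
import statistics as st

random.seed(11)
observed = [random.gauss(50, 10) for _ in range(44)]   # stand-in for 44 respondents
target = (st.mean(observed), st.stdev(observed))

def fitness(candidate):
    """How well a virtual sample reproduces the observed mean and spread
    (0 is perfect; more negative is worse)."""
    return -(abs(st.mean(candidate) - target[0]) + abs(st.stdev(candidate) - target[1]))

def avg_fitness_parents(pop):
    """GenD-flavoured selection: keep the half of the population whose
    fitness is closest to the population's *average* fitness."""
    avg = st.mean(fitness(p) for p in pop)
    return sorted(pop, key=lambda p: abs(fitness(p) - avg))[: len(pop) // 2]

def crossover(a, b):
    """One-point crossover with a light Gaussian mutation."""
    cut = random.randrange(1, len(a))
    child = a[:cut] + b[cut:]
    child[random.randrange(len(child))] += random.gauss(0, 1)
    return child

pop = [[random.gauss(50, 20) for _ in range(44)] for _ in range(40)]
for _ in range(200):
    parents = avg_fitness_parents(pop)
    pop = [crossover(random.choice(parents), random.choice(parents)) for _ in range(40)]

best = max(pop, key=fitness)
print(round(st.mean(best), 1), round(st.stdev(best), 1))
```

The average-fitness bias is meant to preserve population diversity, at the cost of slower convergence than best-first selection.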

11.
Summary  Among the concentration measures discussed, the standardized entropy measure E reveals the most favourable properties. Like the Hirschman and Rosenbluth indexes, it reacts ‘as expected’ in terms of the theory of industrial organization; in addition, its axiomatic basis makes disaggregation feasible. For a first look at concentration tendencies, the method of concentration ratios has proved a useful tool despite some conceptual shortcomings. Some findings are shown for the Canadian and West German passenger-car industries.
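For concreteness, one common normalization of the entropy measure divides the share entropy H by ln(n), its maximum for n units (whether this matches the paper’s exact E is an assumption of this sketch). A comparison with the Hirschman-Herfindahl index on illustrative market shares:

```python
import math

def normalized_entropy(shares):
    """Normalized entropy H / ln(n): 1 under perfectly equal shares,
    approaching 0 under complete concentration."""
    h = -sum(s * math.log(s) for s in shares if s > 0)
    return h / math.log(len(shares))

def hirschman_index(shares):
    """Hirschman-Herfindahl index: sum of squared market shares."""
    return sum(s * s for s in shares)

equal = [0.25] * 4                       # four firms of equal size
skewed = [0.85, 0.05, 0.05, 0.05]        # one dominant firm
print(normalized_entropy(equal), normalized_entropy(skewed))
print(hirschman_index(equal), hirschman_index(skewed))
```

Both measures move in the expected direction as concentration rises; the entropy measure’s additive decomposability is what underlies the disaggregation property the summary highlights.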

12.
Postulating a linear regression of a variable of interest on an auxiliary variable whose values are known for all units of a survey population, we consider appropriate ways of choosing a sample and estimating the regression parameters. Recalling Thomsen’s (1978) results on the non-existence of ‘design-cum-model’ based minimum-variance unbiased estimators of regression coefficients, we apply Brewer’s (1979) ‘asymptotic’ analysis to derive ‘asymptotic design-cum-model’ based optimal estimators, assuming large population and sample sizes. A variance estimation procedure is also proposed.
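As a hedged sketch of the basic setting (not the paper’s optimal estimator): when the auxiliary mean is known for the whole population, the classical regression estimator adjusts the sample mean of y by the fitted slope times the gap between the known and sampled auxiliary means. The population size, model coefficients, and noise level below are illustrative.

```python
import random

rng = random.Random(5)
N, n = 10000, 200
x = [rng.uniform(10, 50) for _ in range(N)]
# Population generated from an illustrative linear model y = a + b*x + error.
y = [4.0 + 0.8 * xi + rng.gauss(0, 3) for xi in x]
X_bar = sum(x) / N                       # auxiliary mean, known for all units

s = rng.sample(range(N), n)              # simple random sample without replacement
xs, ys = [x[i] for i in s], [y[i] for i in s]
xm, ym = sum(xs) / n, sum(ys) / n
b = (sum((xi - xm) * (yi - ym) for xi, yi in zip(xs, ys))
     / sum((xi - xm) ** 2 for xi in xs))

y_reg = ym + b * (X_bar - xm)            # regression estimator of the mean
print(round(ym, 2), round(y_reg, 2), round(sum(y) / N, 2))
```

When y and x are strongly related, the regression estimator is typically much closer to the true population mean than the plain sample mean, which is the gain the model-assisted framework formalizes.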

13.
Chen  Shun-Hsing 《Quality and Quantity》2012,46(4):1279-1296
This study draws on the SERVQUAL model and the Plan-Do-Check-Action (P-D-C-A) cycle of TQM to establish a quality management system for higher education. The system’s ‘Plan’ and ‘Do’ dimensions comprise ten execution factors that complement one another; the ‘Check’ dimension has four factors and the ‘Action’ dimension three. Each execution factor of the quality management system can reduce the occurrence of the five gaps in the SERVQUAL model and help education providers deliver better service quality. Education providers strengthen an education system through careful planning and implementation of quality auditing and continuous improvement. The intended result of this study is a more explicit framework for the higher education industry and a proper service for students.

14.
A Data Envelopment Analysis (DEA) cost minimization model is employed to estimate the cost to thrift institutions of achieving a rating of ‘outstanding’ under the anti-redlining Community Reinvestment Act, which is viewed as an act of voluntary Corporate Social Responsibility (CSR). There is no difference in overall cost efficiency between ‘outstanding’ and minimally compliant ‘satisfactory’ thrifts. However, the sources of cost inefficiency do differ, and an ‘outstanding’ rating involves an annual extra cost of $6.547 million, or 1.2% of total costs. This added cost is the shadow price of CSR, since CSR is not an explicit output or input in the DEA cost model. Before- and after-tax rates of return are the same for the ‘outstanding’ and ‘satisfactory’ thrifts, which implies recoupment of the extra cost. The findings are consistent with CSR as a management choice based on balancing marginal cost and marginal revenue. An incidental finding is that larger thrifts are less efficient.

15.
16.
This paper discusses ways of making efficiency measurement in the area of health care more useful. Options are discussed in terms of the potential introduction of guidelines for undertaking studies in this area, in order to make them more useful to policy makers and those involved in service delivery. The process of introducing such guidelines is discussed using the example of the development of guidelines in the economic evaluation of health technologies. This presents two alternative ways forward: ‘revolution’, the establishment of a panel to set initial guidelines, or ‘evolution’, the more gradual development of such guidelines over time. The third alternative, ‘status quo’, representing the current state of play, is treated as the base-case scenario. It is concluded that, although we are quite a way on in terms of techniques and publications, revolution followed by evolution is perhaps the way forward.

17.
We characterize the equilibrium of the all-pay auction with general convex cost of effort and sequential effort choices. We consider a set of n players who are arbitrarily partitioned into a group of players who choose their efforts ‘early’ and a group of players who choose ‘late’. Only the player with the lowest cost of effort has a positive payoff in any equilibrium; this payoff depends on his own timing vis-à-vis the timing of the others. We also show that the choice of timing can be endogenized, in which case the strongest player typically chooses ‘late’, whereas all other players are indifferent with respect to their choice of timing. In the most prominent equilibrium, the player with the lowest cost of effort wins the auction at zero aggregate cost. We thank Dan Kovenock and Luis C. Corchón for discussion and helpful comments. The usual caveat applies. Wolfgang Leininger expresses his gratitude to Wissenschaftszentrum Berlin (WZB) for its generous hospitality and financial support.

18.
Despite the growing recognition that many entrepreneurs conduct some or all of their trade off-the-books, few studies have evaluated whether the rationales of men and women engaging in this shadow enterprise culture differ. Reporting face-to-face interviews with 331 entrepreneurs in Ukraine during 2005–06, of whom 90 per cent operated partially or wholly off-the-books, the finding is that women are largely ‘reluctant’ entrepreneurs while men are more commonly ‘willing’ entrepreneurs, although both motives are normally co-present to differing degrees in entrepreneurs’ explanations, and their relative importance changes over time. The paper concludes by discussing the implications for both further research and public policy.

19.
We consider the ability to detect interaction structure from data in a regression context. We derive an asymptotic power function for a likelihood-based test for interaction in a regression model, with a possibly misspecified alternative distribution. This allows a general investigation of which types of interaction are poorly or well detected from data. Principally, we contrast pairwise-interaction models with the ‘diffuse interaction models’ introduced in Gustafson et al. (Stat Med 24:2089–2104, 2005).
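The paper derives power analytically; as a hedged companion sketch, the power of a likelihood-ratio test for a single pairwise interaction in a Gaussian linear model can also be estimated by simulation. Everything below (sample size, coefficients, noise) is illustrative, and least squares is solved by hand so the sketch stays self-contained.

```python
import math
import random

def ols_rss(Xrows, y):
    """Residual sum of squares from least squares via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    k = len(Xrows[0])
    A = [[sum(r[i] * r[j] for r in Xrows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(Xrows, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return sum((yi - sum(bc * rc for bc, rc in zip(beta, row))) ** 2
               for row, yi in zip(Xrows, y))

def interaction_power(beta3, n=100, reps=400, seed=9):
    """Fraction of simulated datasets where the likelihood-ratio test
    detects the x1*x2 interaction at the nominal 5% level."""
    rng, rejections = random.Random(seed), 0
    for _ in range(reps):
        x1 = [rng.gauss(0, 1) for _ in range(n)]
        x2 = [rng.gauss(0, 1) for _ in range(n)]
        y = [1 + x1[i] + x2[i] + beta3 * x1[i] * x2[i] + rng.gauss(0, 1)
             for i in range(n)]
        null = [[1.0, x1[i], x2[i]] for i in range(n)]
        full = [[1.0, x1[i], x2[i], x1[i] * x2[i]] for i in range(n)]
        lrt = n * math.log(ols_rss(null, y) / ols_rss(full, y))
        rejections += lrt > 3.841                 # chi-square(1) 5% critical value
    return rejections / reps

p_null, p_alt = interaction_power(0.0), interaction_power(0.5)
print(p_null, p_alt)
```

Under the null the rejection rate should sit near the nominal 5%, while a moderate interaction of 0.5 at n = 100 is detected almost always; sweeping `beta3` traces out an empirical power curve of the kind the paper studies asymptotically.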

20.
Interpretative qualitative social science has attempted to distinguish itself from quantitative social science by rejecting traditional or ‘received’ notions of generalization. Traditional concepts of scientific generalization, it is claimed, are based on a misguided objectivism as to the mechanisms operating in the social world, and particularly as to the ability of statements to capture such mechanisms in any abstract sense. Instead, interpretativists propose new versions of the generalizability concept, e.g. ‘transferability’, which relies on a researcher’s context-dependent judgement of the ‘fit’ between two or more case instances. This paper argues that the transferability concept, as outlined and argued by interpretativist methodologists, is thoroughly coextensive with notions of generalizability formalized for natural science and naturalistic social science by philosophers and methodologists of science. It may therefore be concluded that the interpretativist claim to a break with received scientific traditions is premature, at least with regard to the issue of generalization.

