Similar Documents (20 results)
1.
Summary: A natural conjugate prior distribution for the parameters involved in the noncentral chi-square leads to many known distributions. The applications of the distributions thus obtained are briefly pointed out in evaluating the ‘kill’ probability in the analysis of weapon-systems effectiveness. The ‘kill’ probabilities, or the expected coverage, are obtained under a gamma prior distribution and compared with those obtained by McNolty. This paper was read at a symposium on Mathematical Sciences held under the auspices of Delhi University, Delhi, in January 1966.
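The gamma-mixing of a noncentral chi-square ‘kill’ probability can be sketched numerically. A minimal illustration assuming SciPy; the dimension, lethal radius, and prior parameters are all hypothetical, not McNolty's or the paper's values:

```python
# Illustrative sketch (not the paper's exact formulation): the 'kill'
# probability as a noncentral chi-square CDF, averaged over a gamma prior
# on the noncentrality parameter. All parameter values are assumed.
from scipy import stats, integrate

df = 2                 # dimension of the miss-distance vector (assumed)
r2 = 4.0               # squared lethal radius (assumed units)
a, scale = 3.0, 1.5    # gamma prior shape and scale (assumed)

def conditional_kill(nc):
    """P(squared miss distance <= r2 | noncentrality nc)."""
    return stats.ncx2.cdf(r2, df, nc)

def expected_kill():
    """Kill probability averaged over the gamma prior on the noncentrality."""
    integrand = lambda nc: conditional_kill(nc) * stats.gamma.pdf(nc, a, scale=scale)
    value, _ = integrate.quad(integrand, 0, 200)  # upper limit truncates the tail
    return value

print(f"expected 'kill' probability: {expected_kill():.4f}")
```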

2.
This paper covers the main findings of doctoral research concerned with extending aspects of dilemma theory. In professional practice, the Trompenaars Hampden-Turner Dilemma Reconciliation Process™ is a vehicle delivering dilemma theory in application. It informs a manager or leader on how to explore the dilemmas they face, how to reconcile the tensions that result, and how to structure the action steps for implementing the reconciled solutions. This vehicle forms the professional practice of the author, who seeks to bring more rigor to consulting practice and thereby also contribute to theory development in the domain. The critical review of dilemma theory reveals that previous authors are inconsistent, and in various respects invalid, in their use of the terms ‘dilemma theory’, ‘dilemma methodology’, ‘dilemma process’, ‘dilemma reconciliation’, etc.; an attempt is therefore made to resolve these inconsistencies by considering whether ‘dilemmaism’ at the meta-level might be positioned as a new paradigm of inquiry for (management) research, one that embodies ontological, epistemological, and methodological premises framing an approach to the resolution of real-world business problems in (multi)disciplinary, (multi)functional, and (multi)cultural business environments. This research offers contributions to knowledge, professional practice, and theory development from the exploration of the SPID model as a way to make the elicitation of dilemmas more rigorous and structured, and, in the broader context, from exploring ‘dilemmaism’ as a new paradigm of inquiry.

3.
The paper is a preliminary research report and presents a method for generating new records using an evolutionary algorithm (close to, but different from, a genetic algorithm). This method, called the Pseudo-Inverse Function (in short, P-I Function), was designed and implemented at the Semeion Research Centre (Rome). The P-I Function is a method to generate new (virtual) data from a small set of observed data. It can be of aid when budget constraints limit the number of interviewees, when a population shows some sociologically interesting trait but its small size can seriously affect the reliability of estimates, or in secondary analysis of small samples. The applicative ground is given by research designs with one or more dependent variables and a set of independent variables. The estimation of new cases proceeds by maximizing a fitness function and yields as many ‘virtual’ cases as needed, which reproduce the statistical traits of the original population. The algorithm used by the P-I Function is the Genetic Doping Algorithm (GenD), designed and implemented by the Semeion Research Centre; among its features is an innovative crossover procedure, which tends to select individuals with average fitness values rather than those showing the best values at each ‘generation’. A particularly thorough research design was adopted: (1) the observed sample is half-split to obtain a training and a testing set, which are analysed by means of a back-propagation neural network; (2) testing is performed to find out how good the parameter estimates are; (3) a 10% sample is randomly extracted from the training set and used as a reduced training set; (4) on this narrow basis, GenD calculates the pseudo-inverse of the estimated parameter matrix; (5) ‘virtual’ data are tested against the testing data set (which has never been used for training). The algorithm was proved on particularly difficult ground, since the data set used as a basis for generating ‘virtual’ cases counts only 44 respondents, randomly sampled from a broader data set taken from the General Social Survey 2002. The major result is that networks trained on the ‘virtual’ resample show a model fit as good as that of the observed data, though ‘virtual’ and observed data differ on some features. It can be seen that GenD ‘refills’ the joint distribution of the independent variables, conditioned by the dependent one. This paper is the result of close collaboration among all authors. Cinzia Meraviglia wrote §§ 1, 3, 4, 6, 7 and 8; Giulia Massini wrote § 5; Daria Croce performed some elaborations with neural networks and linear regression; Massimo Buscema wrote § 2.
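As a rough illustration of the idea (not Semeion's GenD or the P-I Function itself), the toy evolutionary resampler below evolves ‘virtual’ samples toward the first two moments of a small observed sample. Every design choice, including the average-fitness selection meant to echo GenD's crossover preference, is an assumption:

```python
# Toy sketch of evolutionary generation of 'virtual' records that reproduce
# summary statistics of a small observed sample. NOT the GenD / P-I Function
# algorithm; fitness, operators and rates below are all assumed.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(size=(44, 5))              # stand-in for the 44 respondents
target_mu, target_cov = observed.mean(0), np.cov(observed.T)

def fitness(virtual):
    """Negative distance between virtual and observed first two moments."""
    d_mu = np.linalg.norm(virtual.mean(0) - target_mu)
    d_cov = np.linalg.norm(np.cov(virtual.T) - target_cov)
    return -(d_mu + d_cov)

pop = [rng.normal(size=(200, 5)) for _ in range(30)]   # candidate virtual samples
for gen in range(200):
    scores = np.array([fitness(v) for v in pop])
    # GenD-flavoured selection (assumed): prefer average-fitness parents
    order = np.argsort(np.abs(scores - scores.mean()))
    parents = [pop[i] for i in order[:10]]
    children = []
    for _ in range(len(pop)):
        a, b = rng.choice(10, size=2, replace=False)
        mask = rng.random((200, 5)) < 0.5              # uniform crossover
        child = np.where(mask, parents[a], parents[b])
        child += rng.normal(scale=0.05, size=child.shape)  # mutation
        children.append(child)
    pop = children

best = max(pop, key=fitness)
print("moment distance of best virtual sample:", -fitness(best))
```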

4.
Within the class of weighted averages of ordered measurements on criteria of ‘equal importance’, we discuss aggregators that help assess to what extent ‘most’ criteria are satisfied, or a ‘good performance across the board’ is attained. All aggregators discussed are compromises between the simple mean and the very demanding minimum score. The weights of the aggregators are described in terms of the convex polytopes induced by the characteristics of the aggregators, such as concavity. We take the barycentres of the polytopes as representative weights.
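A minimal sketch of the kind of aggregator meant here, an ordered weighted average that interpolates between the mean and the minimum; the compromise weights are illustrative, not the polytope barycentres derived in the paper:

```python
# Ordered weighted averaging: weights apply to sorted scores, so the same
# weight vector treats all criteria as equally important.
import numpy as np

def owa(scores, weights):
    """Weighted average of scores sorted from worst to best."""
    s = np.sort(scores)           # ascending: weight index 0 hits the minimum
    return float(np.dot(s, weights))

scores = np.array([0.9, 0.4, 0.7, 0.8])
n = len(scores)
mean_w = np.full(n, 1 / n)        # simple mean: equal weights
min_w = np.eye(n)[0]              # minimum: all weight on the worst score
compromise = 0.5 * mean_w + 0.5 * min_w   # an assumed convex compromise

for name, w in [("mean", mean_w), ("min", min_w), ("compromise", compromise)]:
    print(name, owa(scores, w))
```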

5.
A Data Envelopment Analysis (DEA) cost-minimization model is employed to estimate the cost to thrift institutions of achieving a rating of ‘outstanding’ under the anti-redlining Community Reinvestment Act, which is viewed as an act of voluntary Corporate Social Responsibility (CSR). There is no difference in overall cost efficiency between ‘outstanding’ and minimally compliant ‘satisfactory’ thrifts. However, the sources of cost inefficiency do differ, and an ‘outstanding’ rating involves an annual extra cost of $6.547 million, or 1.2% of total costs. This added cost is the shadow price of CSR, since CSR is not an explicit output or input in the DEA cost model. Before- and after-tax rates of return are the same for the ‘outstanding’ and ‘satisfactory’ thrifts, which implies recoupment of the extra cost. The findings are consistent with CSR as a management choice based on balancing marginal cost and marginal revenue. An incidental finding is that larger thrifts are less efficient.
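A minimal sketch of a DEA cost-minimization linear program of this general kind (constant returns to scale, solved per evaluated unit); the data, dimensions, and input prices are invented:

```python
# DEA cost minimization: find the cheapest input bundle x that can still
# produce unit o's outputs inside the envelopment of observed activity.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m, s = 20, 3, 2                 # units, inputs, outputs (assumed sizes)
X = rng.uniform(1, 5, (m, n))      # observed input quantities
Y = rng.uniform(1, 5, (s, n))      # observed output quantities
w = np.array([2.0, 1.0, 3.0])      # input prices of the evaluated unit (assumed)

def min_cost(o):
    """Least cost of producing unit o's outputs; decision vars are (x, lam)."""
    c = np.concatenate([w, np.zeros(n)])
    A1 = np.hstack([-np.eye(m), X])            # X @ lam <= x
    b1 = np.zeros(m)
    A2 = np.hstack([np.zeros((s, m)), -Y])     # Y @ lam >= y_o
    b2 = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A1, A2]), b_ub=np.concatenate([b1, b2]))
    return res.fun                             # vars are >= 0 by default

o = 0
actual = float(w @ X[:, o])
print(f"cost efficiency of unit {o}: {min_cost(o) / actual:.3f}")
```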

6.
In this paper we study the endogenous determination of bureaucratic friction in a bureaucratic contest with (n > 1) and without (n = 1) rent contestability. When n = 1, bureaucratic impediments induce the individual to undertake rent-securing activities at the same level as in the two-player rent-seeking contest. However, under rent contestability the bureaucracy no longer serves as a means of extracting resources from the public. The paper concludes with a study of the effect of ‘net costs’ on bureaucratic friction. It turns out that under contestability the only reason for creating bureaucratic friction is the ‘negative costs’ it incurs, while when n = 1 the effect of the bureaucrat's net costs of generating bureaucratic friction on the optimal degree of such friction is ambiguous. The authors are grateful to two anonymous referees for their valuable comments.
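For background on the benchmark invoked above, the standard two-player Tullock rent-seeking contest (a textbook result, not this paper's bureaucratic contest model) has a simple closed-form equilibrium:

```latex
% Two-player Tullock contest: payoff \pi_i = V \frac{x_i}{x_1 + x_2} - x_i.
% The first-order condition V \frac{x_j}{(x_1 + x_2)^2} = 1, imposed
% symmetrically (x_1 = x_2 = x^*), gives
V \frac{x^*}{(2x^*)^2} = 1
\;\Longrightarrow\;
x_1^* = x_2^* = \frac{V}{4},
% so half of the rent V is dissipated in equilibrium.
```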

7.
In the process of coding open-ended questions, the evaluation of interjudge reliability is a critical issue. In this paper, using real data, the behavior of three coefficients of reliability among coders (Cohen’s κ, Krippendorff’s α and Perreault and Leigh’s I_r) is examined in terms of the number of judges involved and the categories of answer defined. The outcome underlines the importance of both variables in the evaluation of interjudge reliability, as well as the greater adequacy of Perreault and Leigh’s I_r and Krippendorff’s α for marketing and opinion research.
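A minimal sketch of one of the three coefficients compared, Cohen's κ for the two-coder case; the labels are invented:

```python
# Cohen's kappa: agreement between two coders corrected for the agreement
# expected by chance from their marginal label frequencies.
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two coders' category labels."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    p_o = np.mean(a == b)                                       # observed agreement
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in cats)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

coder1 = ["yes", "no", "yes", "maybe", "no", "yes"]
coder2 = ["yes", "no", "maybe", "maybe", "no", "yes"]
print(f"kappa = {cohens_kappa(coder1, coder2):.3f}")
```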

8.
In this paper, the validity of vignette analyses of various forms of deviant behavior in the presence of opportunities is analyzed on the basis of ideas derived from cognitive psychology. Abelson’s Script Theory, together with insights into human memory of visual and verbal information, allows the assumption that vignette analyses using visual stimuli in particular are valid measures of deviant behavior. The study includes an empirical examination of these ideas (n = 450). Nonparticipant observations and vignette analyses with visual and verbal material were carried out with regard to three forms of deviant behavior occurring in the presence of opportunities presenting themselves in everyday life. Observed and self-reported frequencies of deviant behavior or deviant intentions were counted and cross-tabulated. Log-linear analyses with dummy coding, using the observation data as reference category, were run. The analyses yielded the result that frequencies of deviant behavior were related to the techniques of data collection under consideration. In particular, vignette analyses of the return of ‘lost letters’ that use both visual and verbal stimuli overestimate the ‘actual’ (i.e. observed) return rates. This result is discussed with regard to the underlying methodological assumptions as well as its implications.
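A minimal sketch of the log-linear step described, assuming statsmodels: a Poisson model with dummy coding and the observation data as reference category; the counts are invented:

```python
# Log-linear (Poisson) analysis of a method-by-behavior contingency table,
# with 'observation' as the reference category via Treatment coding.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "method": ["observation", "observation", "visual", "visual", "verbal", "verbal"],
    "deviant": ["yes", "no"] * 3,
    "count": [30, 120, 55, 95, 50, 100],   # invented cell counts
})
model = smf.glm(
    "count ~ C(method, Treatment(reference='observation')) * C(deviant)",
    data=df, family=sm.families.Poisson(),
).fit()
print(model.summary())
```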

9.
Batteries of questions with identical response items are commonly used in survey research. This paper suggests that question order has the potential to cause systematic positive or negative bias in responses to all questions in a battery. Whilst question order effects have been studied for many decades, almost no attention has been given to this particular effect. The primary aim is to draw attention to it, to demonstrate its possible magnitude, and to discuss a range of mechanisms through which it might occur, including satisficing, anchoring and cooperativeness. The effect seems apparent in the results of a recent survey of Emergency Department patients presenting to Wollongong Hospital (Australia) with apparently less urgent conditions in 2004. Two samples were taken. Question order was fixed in the first sample (n = 104; response rate RR2 = 94%) but randomised in the second (n = 46; response rate RR2 = 96%). Respondents were asked to indicate whether each of 18 reasons for presenting to the ED was a ‘very important reason’, a ‘moderately important reason’ or ‘not a reason’. The mean number of very important reasons selected was 56% higher in the first sample than in the second.
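A toy simulation of one candidate mechanism (anchoring: endorsing early items raises the propensity to endorse later ones) shows how a fixed order that happens to lead with compelling items can inflate the whole battery; all numbers and the mechanism itself are assumptions, not the paper's data:

```python
# Compare mean endorsement counts under a fixed versus randomised battery
# order when responses anchor on the share of 'yes' answers given so far.
import numpy as np

rng = np.random.default_rng(2)
n_items, n_resp, gamma = 18, 2000, 0.4
# fixed order leads with the most compelling items (assumed)
base = np.sort(rng.uniform(0.2, 0.6, n_items))[::-1]

def mean_endorsements(randomise):
    totals = []
    for _ in range(n_resp):
        order = rng.permutation(n_items) if randomise else np.arange(n_items)
        yes = 0
        for k, j in enumerate(order):
            # anchoring: endorsement probability rises with prior 'yes' share
            p = np.clip(base[j] + gamma * yes / max(k, 1), 0, 1)
            yes += rng.random() < p
        totals.append(yes)
    return np.mean(totals)

print("fixed order:     ", mean_endorsements(False))
print("randomised order:", mean_endorsements(True))
```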

10.
We study the problem of predicting future k-records based on k-record data for a large class of distributions, which includes several well-known distributions such as the exponential, Weibull (one-parameter), Pareto and Burr type XII, among others. With both Bayesian and non-Bayesian approaches investigated here, we pay particular attention to Bayesian predictors under balanced-type loss functions as introduced by Jafari Jozani et al. (Stat Probab Lett 76:773–780, 2006a). The results are presented under the balanced versions of some well-known loss functions, namely squared error loss, Varian’s linear-exponential loss and absolute error (L_1) loss. Some previous results in the literature, such as Ahmadi et al. (Commun Stat Theory Methods 34:795–805, 2005) and Raqab et al. (Statistics 41:105–108, 2007), are obtained as special cases of our results. Partial support from the Ordered and Spatial Data Center of Excellence of Ferdowsi University of Mashhad is acknowledged by J. Ahmadi. M. J. Jozani’s research was supported partially by a grant of the Statistical Research and Training Center. É. Marchand’s research was supported by NSERC of Canada. A. Parsian’s research was supported by a grant of the Research Council of the University of Tehran.
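For the balanced squared-error member of this family the structure is easy to state; the following is the standard form of balanced loss in the cited framework, sketched rather than quoted from the paper:

```latex
% Balanced squared-error loss for predicting a future k-record Y by \delta,
% with target predictor \delta_0 and weight \omega \in [0,1):
L_\omega(Y,\delta) = \omega\,(\delta-\delta_0)^2 + (1-\omega)\,(\delta-Y)^2
% Its Bayes predictor mixes the target with the posterior-predictive mean:
\delta^{B} = \omega\,\delta_0 + (1-\omega)\,E[\,Y \mid \text{data}\,]
```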

11.
Based on ‘endogenous’ growth theory, the paper examines the effect of trade liberalization on long-run income per capita and economic growth in Turkey. Although the presumption must be that free trade has a beneficial effect on long-run growth, counter-examples can also be found. This controversy increases the importance of empirical work in this area. Using the most recent data, we employ multivariate cointegration analysis to test the long-run relationship among the variables at hand. In a multivariate context, determinants such as increasing returns to scale and investment in human and physical capital are also included, in both the theoretical and the empirical work. Our causality evidence between long-run growth and a number of indicators of trade liberalization confirms the predictions of the ‘new growth theory’. However, the overall effect of possible breaks and/or policy changes and unsustainability in the 1990s looks contradictory and deserves further investigation.
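A minimal sketch of the multivariate cointegration step, assuming statsmodels' Johansen test; the series are simulated stand-ins, not the Turkish data:

```python
# Johansen cointegration test on three I(1) series, two of which share a
# common stochastic trend (so one cointegrating relation should be found).
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(3)
T = 200
common = np.cumsum(rng.normal(size=T))       # shared stochastic trend
data = np.column_stack([
    common + rng.normal(size=T),             # toy 'log income per capita'
    common + rng.normal(size=T),             # toy 'trade openness'
    np.cumsum(rng.normal(size=T)),           # an independent I(1) series
])

res = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:    ", res.lr1)
print("95% critical values: ", res.cvt[:, 1])   # columns: 90%, 95%, 99%
```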

12.
Statistical properties of order-driven double-auction markets with a Bid–Ask spread are investigated through dynamical quantities such as the response function. We first attempt to use the so-called Madhavan–Richardson–Roomans model (MRR for short) to simulate the stochastic process of price changes in empirical data sets (EUR/JPY and USD/JPY exchange rates) in which the Bid–Ask spread fluctuates in time. We find that the MRR theory fails to reproduce even the qualitative (‘non-monotonic’) behaviour of the response function R(l) (l denotes the lag between the times at which the response function is evaluated) calculated from the data. In particular, we confirm that the stochastic nature of the Bid–Ask spread causes apparent deviations from a linear relationship between R(l) and the auto-correlation function C(l), namely R(l) ∝ −C(l). To build a microscopic model of double-auction markets with a stochastic Bid–Ask spread, we use the minority game with a finite market-history length and find numerically that an appropriate extension of the game shows behaviour of the response function quite similar to the empirical evidence. We also reveal that minority-game modelling with an adaptive (‘annealed’) look-up table reproduces the non-linear relationship R(l) ∝ −f(C(l)) (where f(x) stands for a non-linear function leading to ‘λ-shapes’) more effectively than the fixed (‘quenched’) look-up table does.
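For reference, the dynamical quantities in question are commonly defined as follows (with ε_t the trade sign and p_t the mid-price); these are the standard market-microstructure definitions, assumed here rather than quoted from the paper:

```latex
% Response function and trade-sign autocorrelation, with
% \epsilon_t = +1 for buyer-initiated and -1 for seller-initiated trades:
R(l) = \bigl\langle (p_{t+l} - p_t)\,\epsilon_t \bigr\rangle,
\qquad
C(l) = \bigl\langle \epsilon_t\,\epsilon_{t+l} \bigr\rangle
% The linear relation R(l) \propto -C(l) is what the fluctuating
% Bid-Ask spread is reported to break.
```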

13.
The cultural theory pioneered by Dame Mary Douglas has been tested with a range of research methods, but it has not yet been subjected to a ‘structured observation’. This method has been developed in psychology and management studies, and is especially useful for testing cultural theory’s prediction that fatalistic, hierarchical, egalitarian, and individualistic ways of perceiving and justifying tend to emerge in group debates about pressing social and environmental issues. We present the results of a structured observation of this prediction. Groups of high school students (aged 17–19) were asked for their opinions concerning three to five ‘wicked’ (i.e., highly complex) problems, and to discuss how to resolve them. Each utterance was coded according to the rationalities proposed by cultural theory. The results confirm cultural theory’s hypothesis that all four specific ways of defining, perceiving and resolving a wicked problem emerge when a number of people debate such an issue. We also discuss how Douglas’ cultural theory can be further developed and tested. Finally, we use our study to outline how the method of structured observation can contribute to political culture research in general.

14.
We characterize the equilibrium of the all-pay auction with general convex cost of effort and sequential effort choices. We consider a set of n players who are arbitrarily partitioned into a group of players who choose their efforts ‘early’ and a group of players who choose ‘late’. Only the player with the lowest cost of effort has a positive payoff in any equilibrium. This payoff depends on his own timing vis-à-vis the timing of others. We also show that the choice of timing can be endogenized, in which case the strongest player typically chooses ‘late’, whereas all other players are indifferent with respect to their choice of timing. In the most prominent equilibrium, the player with the lowest cost of effort wins the auction at zero aggregate cost. We thank Dan Kovenock and Luis C. Corchón for discussion and helpful comments. The usual caveat applies. Wolfgang Leininger wishes to express his gratitude to the Wissenschaftszentrum Berlin (WZB) for its generous hospitality and financial support.

15.
A ‘trilemma’ procedure is introduced for collecting ‘dominance data’ (i.e. rankings of a set of items along a scale of relevance, preference, etc.). Trilemmas are three-way forced choices where the three items comprising each trilemma are selected on the basis of a multidimensional scaling (MDS) solution for the item set, ensuring that each choice is as stark and informative as possible. A questionnaire designed on this principle is easily understood and rapidly administered. The data are convenient to record and show less fluctuation among informants than existing techniques. We demonstrate the procedure with a set of 45 short generalisations about behaviour, designed for assessing child attachment. A three-dimensional ‘map’ of these items was obtained by applying MDS to multiple sets of similarity data. The same structure emerged from English-language and Japanese translations of the items. Thirty trilemmas based on this map were used to rank the items by degree of association with the Japanese concept of amae, characterising the concept in terms of its behavioural correlates.
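A minimal sketch of the item-selection principle, assuming scikit-learn's MDS: embed the items from a dissimilarity matrix and rank candidate triples by how spread out they are, so the forced choices are as stark as possible; the data and sizes are invented:

```python
# Pick 'stark' trilemmas: triples of items that are far apart in the MDS map.
import numpy as np
from itertools import combinations
from sklearn.manifold import MDS

rng = np.random.default_rng(4)
n_items = 12                                    # stand-in for the 45 statements
d = rng.random((n_items, n_items))
d = (d + d.T) / 2                               # symmetric dissimilarities
np.fill_diagonal(d, 0)

coords = MDS(n_components=3, dissimilarity="precomputed",
             random_state=0).fit_transform(d)

def spread(triple):
    """Sum of pairwise distances within a candidate trilemma."""
    a, b, c = (coords[i] for i in triple)
    return (np.linalg.norm(a - b) + np.linalg.norm(b - c)
            + np.linalg.norm(a - c))

trilemmas = sorted(combinations(range(n_items), 3), key=spread, reverse=True)
print("starkest trilemmas:", trilemmas[:5])
```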

16.
This paper uses firm-level data recorded in the Amadeus database to investigate the distribution of labour productivity in different European countries. We find that the upper tail of the empirical productivity distributions follows a decaying power law, whose exponent α is obtained by a semi-parametric estimation technique recently developed by Clementi et al. [Physica A 370(1):49–53, 2006]. The emergence of “fat tails” in the productivity distribution has already been detected in Di Matteo et al. [Eur Phys J B 47(3):459–466, 2005] and explained by means of a model of social networks. Here we test that model on a broader sample of countries with different patterns of social network structure. These different social attitudes, measured using a social capital indicator, are reflected in the power-law exponent estimates, thereby verifying the existence of linkages between firms’ productivity performance and social networks.
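A minimal sketch of estimating a tail exponent from productivity-like data. This uses the plain Hill estimator as an illustrative stand-in, not the semi-parametric technique of Clementi et al. (2006); the data are synthetic:

```python
# Hill estimator of the power-law exponent alpha from the k largest points.
import numpy as np

rng = np.random.default_rng(5)
# synthetic fat-tailed 'productivity': classical Pareto with alpha = 2.5
productivity = rng.pareto(a=2.5, size=50_000) + 1.0

def hill(x, k):
    """Hill estimate using k exceedances above the (k+1)-th largest point."""
    tail = np.sort(x)[-(k + 1):]          # threshold is tail[0]
    return k / np.sum(np.log(tail[1:] / tail[0]))

for k in (100, 500, 2000):
    print(f"k={k:5d}  alpha_hat={hill(productivity, k):.2f}")
```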

17.
We extend an earlier model of innovation dynamics based on percolation by adding endogenous R&D search by economically motivated firms. The {0, 1} seeding of the technology lattice is now replaced by draws from a lognormal distribution for technology ‘difficulty’. Firms are rewarded for successful innovations by increases in their R&D budget. We compare two regimes. In the first, firms are fixed in a region of technology space. In the second, they can change their location by myopically comparing progress in their local neighborhoods and probabilistically moving to the region with the highest recent progress. We call this the moving or self-organizational (SO) regime. The SO regime always outperforms the fixed one, but its performance is a complex function of the ‘rationality’ of firm search (in terms of search radius and speed of movement). The clustering of firms in the SO regime grows rapidly and then fluctuates in a complex way around a high value that increases with the search radius. We also investigate the size distributions of the innovations generated in each regime. In the fixed regime, the distribution is approximately lognormal and certainly not fat-tailed. In the SO regime, the distributions are radically different: they are much more highly right-skewed and show scaling over at least two decades with a slope around one, for a wide range of parameter settings. Thus we argue that firm self-organization leads to self-organized criticality.
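A toy rendering of the setup, with a lognormal-difficulty lattice, budget rewards for success, and an occasional relocation rule standing in for the self-organizational regime. This is not the authors' model; every mechanism and parameter below is assumed:

```python
# Toy percolation-style R&D search on a technology lattice: a cell is
# innovated once the cumulative R&D sunk into it exceeds its lognormal
# 'difficulty', and success enlarges the innovating firm's budget.
import numpy as np

rng = np.random.default_rng(6)
H, W, n_firms = 60, 30, 10
difficulty = rng.lognormal(mean=1.0, sigma=1.0, size=(H, W))
effort = np.zeros((H, W))                       # cumulative R&D per cell
found = np.zeros((H, W), bool)
found[0] = True                                 # row 0: freely available technology
budget = np.ones(n_firms)
col = rng.integers(0, W, n_firms)               # each firm's lattice region

for step in range(2000):
    for f in range(n_firms):
        c = col[f]
        r = np.where(found[:, c])[0].max() + 1  # cell just above the local frontier
        if r >= H:
            continue
        effort[r, c] += 0.1 * budget[f]         # spend a share of the R&D budget
        if effort[r, c] >= difficulty[r, c]:
            found[r, c] = True                  # innovation succeeds
            budget[f] += 1.0                    # reward: a bigger R&D budget
    if step % 50 == 0:                          # crude stand-in for the SO regime:
        best = found.sum(axis=0).argmax()       # move one firm toward the column
        col[rng.integers(0, n_firms)] = best    # with the highest frontier

print("frontier height per column:", found.sum(axis=0))
```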

18.
The 1981 PATCO strike stands out as a symbol of union decline. The penchant to stigmatize PATCO detracts from important aspects of the union’s unorthodox strategy. Preparations for 1981 negotiations were coordinated by rank-and-file activists who referred to themselves as ‘choir boys’. An extensive mobilization network cultivated by the ‘choir boys’ contributed to cohesiveness and in effect democratized PATCO. The union’s effectiveness in building internal solidarity was its most notable accomplishment. Twenty-first-century labor-movement revitalization will require not only strong, creative leadership but also rank-and-file mobilization in the mold of PATCO’s ‘choir boy’ system. It is this type of grassroots activism that has the potential to promote an internal culture of militant action which can serve as the foundation for union growth.

19.
The aim of this paper is to derive methodology for designing ‘time to event’ experiments. In comparison to estimation, the design aspects of ‘time to event’ experiments have received relatively little attention. We show that gains in the efficiency of parameter estimators and in the use of experimental material can be made using optimal design theory. The types of models considered include classical failure data and accelerated testing situations, and frailty models, each involving covariates which influence the outcome. The objective is to construct an optimal design based on the values of the covariates and the associated model, or indeed a candidate set of models. We consider D-optimality and create compound optimality criteria to derive optimal designs for multi-objective situations which, for example, focus on the number of failures as well as on the estimation of parameters. The approach is motivated and demonstrated using common failure/survival models, for example the Weibull distribution, product assessment and frailty models.
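The core criteria can be written compactly in standard optimal-design notation; this is the generic form, assumed rather than taken verbatim from the paper:

```latex
% D-optimality for a model with parameter vector \theta and design \xi
% (a probability measure placing weight w_i on covariate setting x_i):
\xi^* = \arg\max_{\xi}\; \log\det M(\xi;\theta),
\qquad
M(\xi;\theta) = \sum_i w_i\, I(x_i;\theta)
% with I(x;\theta) the single-observation Fisher information. A compound
% criterion can weight this against, e.g., the expected number of failures:
\Psi(\xi) = \kappa \,\log\det M(\xi;\theta) + (1-\kappa)\, E[\,\text{failures}\mid\xi\,]
```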

20.
A classic puzzle in the economic theory of the firm concerns the fundamental cause of decreasing returns to scale. If a plant producing product quantity X at cost C can be replicated as often as desired, then the quantity rX need never cost more than rC. Traditionally the firm is imagined to take its identity from a fixed non-replicable input, namely a ‘top manager’; as more plants or divisions are added, the communication and computation burden imposed on the top manager (who has information not possessed by the divisions) grows more than proportionately. Decreasing returns are experienced as the top manager hires more variable inputs to cope with the rising burden. Suppose it turns out, however, that when the divisions are assembled, and are given exactly the same totally independent tasks that they fulfilled when they were autonomous, then a saving can be achieved if they adopt a joint procedure for performing those tasks rather than replicating their previous separate procedures. Then the top manager's rising burden must be shown to be particularly onerous; otherwise there may actually be increasing returns. We show that for a certain model of the information-processing procedure used by the separate divisions and by the firm, there may indeed be such an odd unexpected saving. The saving occurs with respect to the size of the language in which members of each division, or of the firm, communicate with one another, provided that language is finite. If instead the language is a continuum, then the saving cannot occur, provided that the procedures used obey suitable ‘smoothness’ conditions. We show that the saving for the finite case can be ruled out in two ways: by requiring the procedures used to obey a regularity condition that is a crude analogue of the smoothness conditions we impose on the continuum procedures, or by insisting that the procedure used be a ‘deterministic’ protocol. Such a protocol prescribes a conversation among the participants in which a participant has only one choice whenever that participant has to make an announcement to the others. The results suggest that a variety of information-processing models will have to be studied before the traditional explanation for decreasing returns to scale is understood in a rigorous way.
