Similar documents
20 similar documents found (search time: 328 ms)
1.
Thomson (1995a) proved that the uniform allocation rule is the only allocation rule for allocation economies with single-peaked preferences that satisfies Pareto efficiency, no-envy, one-sided population-monotonicity, and replication-invariance on a restricted domain of single-peaked preferences. We prove that this result also holds on the unrestricted domain of single-peaked preferences. Next, replacing one-sided population-monotonicity by one-sided replacement-domination yields another characterization of the uniform allocation rule (Thomson, 1997a). We show how this result can be extended to the more general framework of reallocation economies with individual endowments and single-peaked preferences. Following Thomson (1995b) we present allocation and reallocation economies in a unified framework of open economies. Received: 20 February 1999 / Accepted: 15 February 2000
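The uniform allocation rule characterized above has a standard closed-form description: when aggregate demand (the sum of preference peaks) exceeds the available amount, each agent receives the minimum of her peak and a common cap λ chosen to exhaust the supply; when aggregate demand falls short, each agent receives the maximum of her peak and a common floor. The sketch below is illustrative only; the function name and the bisection tolerance are my own choices, not taken from the cited papers.

```python
from typing import List

def uniform_rule(peaks: List[float], amount: float, tol: float = 1e-9) -> List[float]:
    """Illustrative sketch of the uniform rule for single-peaked preferences.

    If total demand (sum of peaks) >= amount, agent i gets min(peak_i, lam);
    otherwise agent i gets max(peak_i, lam), with lam found by bisection so
    that the allocations sum exactly to the available amount.
    """
    excess_demand = sum(peaks) >= amount

    def allocated(lam: float) -> float:
        if excess_demand:
            return sum(min(p, lam) for p in peaks)
        return sum(max(p, lam) for p in peaks)

    lo, hi = 0.0, max(max(peaks), amount)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if allocated(mid) < amount:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    return [min(p, lam) if excess_demand else max(p, lam) for p in peaks]

# Example: peaks (4, 6, 10) with only 12 units available -> lam = 4, allocation (4, 4, 4).
```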

2.
Authorities working on economic-crime investigation in Finland are trying to change their form of collaboration from the sequential passing of documents towards parallel, interorganizational collaboration: the on-line investigation of an ongoing crime. The synchronization of events and the outputs of various participants proved to be difficult in this emerging process. This new model of crime investigation also requires a new kind of time management. This article explores how the change is being constructed in everyday practice by examining three economic-crime-investigation cases. It is claimed that individual efforts to manage time allocation suffice only in terms of co-ordination of events. It is suggested that a successful shift to parallel, interorganizational collaboration requires more than the common marking of calendars. The object of the work and the forms of interaction should be taken as subjects of reflective negotiation. New kinds of collective time-management tools are needed in this effort.

3.
Heteroskedasticity and autocorrelation consistent (HAC) estimation commonly involves the use of prewhitening filters based on simple autoregressive models. In such applications, small sample bias in the estimation of autoregressive coefficients is transmitted to the recolouring filter, leading to HAC variance estimates that can be badly biased. The present paper provides an analysis of these issues using asymptotic expansions and simulations. The approach we recommend involves the use of recursive demeaning procedures that mitigate the effects of small-sample autoregressive bias. Moreover, a commonly used restriction rule on the prewhitening estimates (that first-order autoregressive coefficient estimates, or largest eigenvalues, >0.97 be replaced by 0.97) adversely interferes with the power of unit-root and Kwiatkowski, Phillips, Schmidt and Shin [(1992), Journal of Econometrics, Vol. 54, pp. 159–178] (KPSS) tests. We provide a new boundary condition rule that improves the size and power properties of these tests. Some illustrations of the effects of these adjustments on the size and power of KPSS testing are given. Using prewhitened HAC estimates and the new boundary condition rule, the KPSS test is consistent, in contrast to KPSS testing that uses conventional prewhitened HAC estimates [Lee, J. S. (1996), Economics Letters, Vol. 51, pp. 131–137].
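For context, the conventional prewhitened HAC procedure that the abstract criticises (not the paper's recursive-demeaning or new boundary rule) can be sketched as follows for a scalar series: fit an AR(1), cap the coefficient at 0.97, compute a kernel long-run variance of the whitened residuals, then recolour. This is a minimal sketch under those assumptions, with my own function name and a Bartlett kernel chosen for illustration.

```python
import numpy as np

def prewhitened_hac_lrv(u: np.ndarray, bandwidth: int, cap: float = 0.97) -> float:
    """Illustrative scalar prewhitened HAC long-run variance with the
    conventional boundary rule (AR(1) coefficient capped at `cap`)."""
    u = np.asarray(u, dtype=float)
    # Step 1: prewhitening filter from an AR(1) fit (no demeaning, for brevity).
    rho = np.dot(u[1:], u[:-1]) / np.dot(u[:-1], u[:-1])
    rho = np.clip(rho, -cap, cap)          # conventional restriction rule
    e = u[1:] - rho * u[:-1]

    # Step 2: Bartlett-kernel long-run variance of the whitened residuals.
    n = len(e)
    lrv = np.dot(e, e) / n
    for j in range(1, bandwidth + 1):
        w = 1.0 - j / (bandwidth + 1.0)    # Bartlett weights
        lrv += 2.0 * w * np.dot(e[j:], e[:-j]) / n

    # Step 3: recolouring -- undo the AR(1) filter.
    return lrv / (1.0 - rho) ** 2
```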

4.
In this paper, a simple and obvious procedure is presented that allows one to estimate π, the population proportion of a sensitive group, in addition to T, the probability that respondents belonging to the sensitive group tell the truth when questioned directly. Properties of the estimators of π and T, as well as the sample size allocation, are studied, and efficiency comparisons are carried out to investigate the performance of the proposed method. It is found that the proposed strategy is more efficient than Warner's (1965) strategy, and has the additional advantage of helping decide the optimal survey technique in practical situations. Received: October 2000
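For orientation, the Warner (1965) randomized-response benchmark against which the proposal is compared has a simple moment estimator: each respondent is directed to the sensitive statement with known probability p ≠ 1/2, and the observed proportion of "yes" answers identifies π. The paper's own estimators of π and T are not reproduced here.

```latex
% Warner (1965) randomized-response estimator (benchmark only, not the paper's new estimator)
\[
\lambda = p\,\pi + (1-p)(1-\pi), \qquad
\hat{\pi}_{W} = \frac{\hat{\lambda} - (1-p)}{\,2p-1\,}, \qquad
\operatorname{Var}(\hat{\pi}_{W}) = \frac{\lambda(1-\lambda)}{n\,(2p-1)^{2}} .
\]
```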

5.
M. Z. Khan, Metrika, 1976, 23(1): 211–219
The problem of optimally allocating a sample among k strata at the second phase of a two-phase sampling procedure when the sampling is for proportion has been discussed by Newbold [1971] and then by the author [Khan, 1972]. This paper deals with optimum allocation of the sample among k strata at the second phase of a two-phase sampling procedure when the sampling is for m attributes. The problem is to estimate the proportion of each attribute in the population. Here the (m−1)-dimensional Dirichlet distribution is taken as the prior distribution.

6.
Despite certain advances for non-randomized response (NRR) techniques in the past 6 years, the existing non-randomized crosswise and triangular models have several limitations in practice. In this paper, I propose a new NRR model, called the parallel model, with a wider application range. Asymptotic properties of the maximum likelihood estimator (and its modified version) for the proportion of interest are explored. Theoretical comparisons with the crosswise and triangular models show that the parallel model is more efficient than the two existing NRR models for most of the possible parameter ranges. Bayesian methods for analyzing survey data from the parallel model are developed. A case study on college students' premarital sexual behavior in Wuhan and a case study on plagiarism at the University of Hong Kong are conducted and are used to illustrate the proposed methods.

7.
The concept of strict proportional power is introduced, as a means of formalizing a desire to avoid discrepancy between the seat distribution in a voting body and the actual voting power in that body, as measured by power indices in common use. Proportionality is obtained through use of a randomized decision rule (majority rule). Some technical problems which arise are discussed in terms of simplex geometry. Practical implications and problems in connection with randomized decision rules are indicated.

8.
A house allocation rule should be flexible in its response to changes in agents’ preferences. We propose a specific notion of this flexibility. An agent is said to be swap-sovereign over a pair of houses at a profile of preferences if the rule assigns her one of the houses at that profile and assigns her the other house when she instead reports preferences that simply swap the positions of the two houses. A pair of agents is said to be mutually swap-sovereign over their assignments at a profile if the rule exchanges their assignments when they together report such ‘swap preferences’. An allocation rule is individually swap-flexible if any pair of houses has a swap-sovereign agent, and is mutually swap-flexible if any pair of houses has either a swap-sovereign agent or mutually swap-sovereign agents. We show for housing markets that the top-trading-cycles rule is the unique strategy-proof, individually rational and mutually swap-flexible rule. In house allocation problems, we show that queue-based priority rules are uniquely strategy-proof, individually swap-flexible and envy non-bossy. Varying the strength of non-bossiness, we characterise the important subclasses of sequential priority rules (additionally non-bossy) and serial priority rules (additionally pair-non-bossy and pair-sovereign).
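The top-trading-cycles (TTC) rule characterised above has a well-known algorithmic description for housing markets: each remaining agent points to the current owner of her most-preferred remaining house, every cycle that forms trades along its arrows, and the procedure repeats until everyone is assigned. A minimal sketch, with data structures of my own choosing:

```python
def top_trading_cycles(endowment: dict, preferences: dict) -> dict:
    """Illustrative sketch of the TTC rule for a housing market.

    endowment:   agent -> house she initially owns
    preferences: agent -> list of houses, most preferred first
    Returns the final assignment: agent -> house.
    """
    owner = {house: agent for agent, house in endowment.items()}
    remaining = set(endowment)
    assignment = {}

    while remaining:
        # Each remaining agent points to the owner of her best remaining house.
        points_to = {}
        for agent in remaining:
            best = next(h for h in preferences[agent] if owner[h] in remaining)
            points_to[agent] = owner[best]

        # Walk the pointer graph from any agent; a cycle must exist.
        seen, current = [], next(iter(remaining))
        while current not in seen:
            seen.append(current)
            current = points_to[current]
        cycle = seen[seen.index(current):]

        # Execute the trades in the cycle and remove those agents.
        for agent in cycle:
            assignment[agent] = next(h for h in preferences[agent] if owner[h] in remaining)
        remaining -= set(cycle)

    return assignment
```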

9.
Abstract

In this paper, we investigate the impact of accounting standards on the information content of stock prices using a sample of 44 countries from around the world. We find that the adoption of International Financial Reporting Standards or US Generally Accepted Accounting Principles per se does not make stock prices more informative, but that better accounting standards are helpful only in countries having effective legal environments. In particular, we find a significantly negative relationship between stock price synchronicity and the quality of accounting standards in countries with a common-law legal origin and generally better shareholder protection. Our findings are consistent with the theoretical prediction in Zhang [(2013). Accounting standards, cost of capital, resource allocation, and welfare in a large economy. The Accounting Review, 88(4), 1459–1488] that improving accounting standards effectively would increase social welfare in general.

10.
Since college development officers supply recognition to alumni and seek donations in return, a model is devised that incorporates this exercise of market power in the exchange. Data for three years are used from eighteen universities and colleges: public and private, large and small, research- and teaching-oriented. The findings indicate that schools with higher development costs generate substantially more donations. Several demographic characteristics of the student body were tested. Schools with higher participation in fraternities and sororities have higher giving, while schools with a higher proportion of part-time students have lower giving. Surprisingly, having an NCAA Division I athletic program has no significant effect on alumni giving. Likewise, alumni donations seem independent of whether a school is public or private, or is or is not a research institution. Finally, the size of a school's endowment has no predictive value, but the level of annual bequests is strongly positively related to alumni giving.

11.
When a large number of time series are to be forecast on a regular basis, as in large scale inventory management or production control, the appropriate choice of a forecast model is important as it has the potential for large cost savings through improved accuracy. A possible solution to this problem is to select one best forecast model for all the series in the dataset. Alternatively one may develop a rule that will select the best model for each series. Fildes (1989) calls the former an aggregate selection rule and the latter an individual selection rule. In this paper we develop an individual selection rule using discriminant analysis and compare its performance to aggregate selection for the quarterly series of the M-Competition data. A number of forecast accuracy measures are used for the evaluation and confidence intervals for them are constructed using bootstrapping. The results indicate that the individual selection rule based on discriminant scores is more accurate, and sometimes significantly so, than any aggregate selection method.
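An individual selection rule of this kind can be viewed as a classification problem: summary features of each series are used to train a discriminant function whose classes are the candidate forecast models, and each new series is routed to the model its discriminant score favours. The feature set, model labels and library calls below are illustrative assumptions, not the authors' specification.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical training data: one row of summary features per historical series
# (e.g. trend strength, first-lag autocorrelation, seasonal strength), labelled
# with the model that forecast that series best in a hold-out evaluation.
X_train = np.array([[0.90, 0.80, 0.10],
                    [0.80, 0.70, 0.20],
                    [0.20, 0.10, 0.70],
                    [0.15, 0.05, 0.80],
                    [0.50, 0.40, 0.40],
                    [0.48, 0.30, 0.42]])
best_model = np.array(["holt", "holt",
                       "seasonal_naive", "seasonal_naive",
                       "ses", "ses"])

rule = LinearDiscriminantAnalysis().fit(X_train, best_model)

# Individual selection: each new series gets the model its features point to.
X_new = np.array([[0.85, 0.75, 0.15],
                  [0.10, 0.10, 0.75]])
print(rule.predict(X_new))   # e.g. ['holt' 'seasonal_naive']
```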

12.
Abstract. Job search has profound implications for both the extent and duration of unemployment and hence for the efficient allocation of human resources. Yet, little is known about the relative effectiveness of alternative methods of job search. This study uses the National Longitudinal Survey of Youth Labor Market Experience to examine whether different methods influence the duration of job search and job satisfaction. Methods of search do seem to differ significantly in influencing duration of job search but not so in respect of job satisfaction, contrary to a widely held view. Some have held that informal channels of job search convey a particular type of qualitative information which makes for better and more efficient job choice and that this largely explains their extensive usage, but the data do not support this position. Both findings have potentially important implications for job search theory and government intervention in the labor market.

13.
In the face of intractable societal grand challenges, organizations increasingly resort to responsible innovation – that is, they pledge to create value for multiple stakeholders through developing new products or services that avoid doing harm and improve conditions for people and the planet. While the link between responsible innovation and societal improvements has been established, organizations pursuing responsible innovation lack governance mechanisms to guide the allocation of the value created – both economic and social – among heterogeneous stakeholders, in line with their responsible intent. We combine the value-based strategy and stakeholder perspectives and infuse a deliberative process to design a three-stage model of value allocation that rests on three key organizational decisions: i) what value to create and for whom, ii) how to appropriate the value created vis-à-vis unintended value appropriators, and iii) how to distribute the value appropriated among intended stakeholders. We propose a framework of stakeholder governance comprised of four novel mechanisms by which organizations can allocate value among their multiple principal stakeholders as part of participative processes. Our study contributes to responsible innovation and corporate governance research by unpacking how new value is managed to solve societal grand challenges.

14.
This paper introduces a new dispatching rule to job shop scheduling, extending earlier results to a multi-machine environment. This new rule, which uses modified due dates, is compared to other popular dispatching methods over a range of due date tightnesses at two utilization levels. The results for mean tardiness indicate that the modified operation due date (MOD) rule compares very favorably with other prominent dispatching methods. The modified due date is a job's original due date or its early finish time, whichever is larger. For an individual operation, it is the operation's original due date or the operation's early finish time, whichever is larger. A comparison of the job-based version with the operation-based version indicated that the operation-based version tended to be more effective at meeting due dates. The main performance measure was mean job tardiness, although the proportion of tardy jobs was also reported, and the two measures together imply the conditional mean tardiness. The MOD rule was compared to several well-known tardiness-oriented priority rules, such as minimum slack-per-operation (S/OPN), smallest critical ratio (SCR), and COVERT. The MOD rule tended to achieve lower levels of mean tardiness than the other rules, except under conditions where due dates were quite loose. In this situation, very little tardiness occurs for any of the rules. The MOD rule appeared to be more robust than the other rules to changes in the tightness of due dates, and similar results occurred at both high and low utilizations.
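The MOD priority described in the abstract is easy to state operationally: an operation's modified due date is the larger of its original operation due date and its earliest possible finish time (current time plus its processing time), and the queued operation with the smallest modified due date is dispatched next. A minimal sketch under those assumptions, with names of my own choosing:

```python
from dataclasses import dataclass

@dataclass
class Operation:
    job_id: str
    processing_time: float
    operation_due_date: float

def modified_due_date(op: Operation, now: float) -> float:
    """MOD priority: the operation due date or its early finish time, whichever is larger."""
    early_finish = now + op.processing_time
    return max(op.operation_due_date, early_finish)

def next_operation(queue: list, now: float) -> Operation:
    """Dispatch the queued operation with the smallest modified due date."""
    return min(queue, key=lambda op: modified_due_date(op, now))

# Example at time 100: job A is already late, so its MOD is its early finish time (105),
# which beats job B's loose due date (MOD = 120), and A is dispatched first.
queue = [Operation("A", 5.0, 98.0), Operation("B", 3.0, 120.0)]
print(next_operation(queue, now=100.0).job_id)   # 'A'
```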

15.
B. H. Eichhorn, Metrika, 1986, 33(1): 189–195
Summary. Two treatments are being compared as to which has the higher mean response. The response for the two treatments is assumed to be normally distributed, with unknown means. Observations are allocated sequentially to the different treatments; the purpose is to locate the superior treatment, the one with the higher mean response. We seek a treatment for which we have 1 − α confidence of its being superior; we then call it an α-superior treatment. Allocation schemes are discussed for which the procedure converges on an α-superior treatment, while heuristically trying to reduce the number of allocations to the inferior treatment. No stopping rule is applied and there is no point at which the experiment is terminated and use begins; the scheme ensures that the allocation will eventually converge on one treatment.

16.
Abstract. The median voter model is a frequently used tool in analyzing local public sector issues. However, its use in analyzing multidimensional issues has been criticized. This study examines the criticism in the context of the decision by a municipality to establish a local constitution by voting for a home rule charter. A model is developed and tested using a variety of income measures. This model reveals that median income is not a better explanatory variable than other income measures. Based on this evidence, the authors conclude that the median voter model is not appropriate for analyzing multidimensional issues.

17.
This paper provides a solution to Proebsting’s Paradox, an argument that appears to show that the investment rule known as the Kelly criterion can lead a decision maker to invest a higher fraction of his wealth the more unfavorable the odds he faces are and, as a consequence, risk an arbitrarily high proportion of his wealth on the outcome of a single event. The paper shows that a large class of investment criteria, including ‘fractional Kelly’, also suffer from the same shortcoming and adapts ideas from the literature on price discrimination and surplus extraction to explain why this is so. The paper also presents a new criterion, dubbed the doubly conservative criterion, that is immune to the problem identified above. Immunity stems from the investor’s attitudes toward capital preservation and from him becoming rapidly pessimistic about his chances of winning the better odds he is offered.
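For reference, the Kelly criterion at the centre of the paradox prescribes, for a single bet paying b-to-1 with win probability p, the wealth fraction below (this is the standard background formula, not the paper's doubly conservative criterion):

```latex
% Kelly fraction for a single bet at b-to-1 odds with win probability p
\[
f^{*} \;=\; \frac{b\,p - q}{b} \;=\; p - \frac{q}{b}, \qquad q = 1 - p .
\]
```

With p = 0.5 and b = 2 this gives f* = 0.25; the paradox arises when such bets are layered across successive offers on the same event.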

18.
We examine the asymptotic behavior of two strategyproof mechanisms discussed by Moulin for public goods – the conservative equal costs rule (CER) and the serial cost sharing rule (SCSR) – and compare their performance to that of the pivotal mechanism (PM) from the Clarke–Groves family. Allowing the individuals’ valuations for an excludable public project to be random variables, we show under very general assumptions that expected welfare loss generated by the CER, as the size of the population increases, becomes arbitrarily large. However, all moments of the SCSR’s random welfare loss asymptotically converge to zero. The PM does better than the SCSR, with its welfare loss converging even more rapidly to zero.

19.
Empirical Bayes methods of estimating the local false discovery rate (LFDR) by maximum likelihood estimation (MLE), originally developed for large numbers of comparisons, are applied to a single comparison. Specifically, when assuming a lower bound on the mixing proportion of true null hypotheses, the LFDR MLE can yield reliable hypothesis tests and confidence intervals given as few as one comparison. Simulations indicate that constrained LFDR MLEs perform markedly better than conventional methods, both in testing and in confidence intervals, for high values of the mixing proportion, but not for low values. (A decision-theoretic interpretation of the confidence distribution made those comparisons possible.) In conclusion, the constrained LFDR estimators and the resulting effect-size interval estimates are not only effective multiple comparison procedures but also they might replace p-values and confidence intervals more generally. The new methodology is illustrated with the analysis of proteomics data.
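In the two-group mixture setting that underlies these methods, the local false discovery rate of an observed statistic z is the posterior probability that the corresponding null hypothesis is true, and the constrained MLE mentioned above imposes a lower bound on the null proportion π₀. A standard statement of the quantity being estimated (notation mine):

```latex
% Two-group mixture model and the local false discovery rate, with the lower-bound constraint
\[
f(z) = \pi_0 f_0(z) + (1-\pi_0) f_1(z), \qquad
\mathrm{LFDR}(z) = \Pr(H_0 \mid z) = \frac{\pi_0 f_0(z)}{f(z)},
\qquad \pi_0 \ge \underline{\pi}_0 .
\]
```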

20.
New characterizations of a classical bankruptcy rule
Concede-and-divide is a well-known and widely accepted procedure to solve bankruptcy situations involving two agents. In a recent paper, Moreno-Ternero and Villar (2004) characterize it by means of a new property, called securement, that imposes a lower bound on the awards agents might obtain. This property can be naturally decomposed into two more elementary ones. We show that each of these components, together with a suitable version of monotonicity and the standard property of self-duality, also characterizes concede-and-divide. We also show that one of these components characterizes the rule when combined with the standard property of minimal rights first.
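Concede-and-divide itself has a compact closed form. For an estate E and two claims c₁ and c₂ with c₁ + c₂ ≥ E, each agent is first awarded what the other effectively concedes, and the remainder is split equally:

```latex
% Concede-and-divide for a two-claimant bankruptcy problem (E, c_1, c_2) with c_1 + c_2 >= E
\[
x_i \;=\; \max\{E - c_j,\, 0\} \;+\; \frac{E - \max\{E - c_1,\, 0\} - \max\{E - c_2,\, 0\}}{2},
\qquad i \neq j .
\]
```

For example, with E = 100, c₁ = 80 and c₂ = 60, agent 1 is conceded 40 and agent 2 is conceded 20, and the awards are (60, 40).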
