Similar Literature
1.
This paper develops a sequential scheduling algorithm for consultation periods not divided into slots. Patients call a scheduler and request appointments with a specified provider. The scheduler provides the patient with an appointment time before the call terminates. In making the appointment, the scheduler cannot alter the appointments of previously scheduled patients. Service times are random and each scheduled patient has a probability of “no-showing” for the appointment. Each arriving patient generates a given amount of revenue, and costs are incurred from patient waiting and provider overtime. The scheduling method selects the calling patient's appointment time by minimizing the total expected cost. We prove that total expected cost is a convex function of appointment times and that the expected profit of the schedule is unimodal, which provides a unique stopping criterion for the scheduling algorithm. Computational studies compare this approach with no-show based sequential scheduling methods for out-patient clinics where a predefined slot structure is assumed. The new method yields higher expected profit and less overtime than when service periods are pre-divided into slots. Because slot scheduling is ingrained in healthcare, we use the model to design slot structures that account for no-show and service time variation.
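The cost-minimizing step described above can be sketched as a small Monte Carlo search over candidate appointment times. Every numeric choice here (exponential service times, an 8-hour session, the cost weights, the show probability) is an illustrative assumption, not taken from the paper:

```python
import random

def expected_cost(appt_times, session_end=480.0, show_prob=0.8,
                  mean_service=30.0, wait_cost=1.0, ot_cost=2.0, sims=2000):
    """Monte Carlo estimate of expected waiting plus overtime cost for a
    schedule, with exponential service times and independent no-shows.
    All parameter values are illustrative assumptions."""
    total = 0.0
    for _ in range(sims):
        clock, cost = 0.0, 0.0
        for t in sorted(appt_times):
            if random.random() > show_prob:      # patient no-shows
                continue
            start = max(clock, t)                # waits if provider is busy
            cost += wait_cost * (start - t)
            clock = start + random.expovariate(1.0 / mean_service)
        cost += ot_cost * max(0.0, clock - session_end)
        total += cost
    return total / sims

def best_slot(appt_times, candidates):
    """Sequential step: choose the calling patient's appointment time by
    minimizing estimated expected cost, leaving earlier appointments
    unchanged.  (Comparisons are noisy Monte Carlo estimates.)"""
    return min(candidates, key=lambda t: expected_cost(appt_times + [t]))
```

The paper exploits convexity of expected cost in the appointment time; the simulation above simply evaluates a finite candidate grid instead.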

2.
We propose a model studying the random assignments of bundles with no free disposal. The key difference between our model and the one where objects are allocated (see Bogomolnaia and Moulin (2001)) is one of feasibility. The implications of this difference are significant. Firstly, the characterization of sd-efficient random assignments is more complex. Secondly, we are able to identify a preference restriction, called essential monotonicity, under which the random serial dictatorship rule (extended to the setting with bundles) is equivalent to the probabilistic serial rule (extended to the setting with bundles). This equivalence implies the existence of a rule on this restricted domain satisfying sd-efficiency, sd-strategy-proofness, and equal treatment of equals. Moreover, this rule only selects random assignments which can be decomposed as convex combinations of deterministic assignments.

3.
We show that every strategy-proof and unanimous probabilistic rule on a binary restricted domain has binary support, and is a probabilistic mixture of strategy-proof and unanimous deterministic rules. Examples of binary restricted domains are single-dipped domains, which are of interest when considering the location of public bads. We also provide an extension to infinitely many alternatives.

4.
This paper replaces Gibbard's (Econometrica 45:665–681, 1977) assumption of strict ordinal preferences by the more natural assumption of cardinal preferences on the set of pure social alternatives, and we also admit indifferences among the alternatives. Following a line of reasoning similar to the Gibbard-Satterthwaite theorem in the deterministic framework, we first show that if a decision scheme satisfies strategy-proofness and unanimity, then there is an underlying probabilistic neutrality result which generates an additive coalitional power function. This result is then used to prove that a decision scheme which satisfies strategy-proofness and unanimity can be represented as a weak random dictatorship. A weak random dictatorship assigns each individual a chance to be a weak dictator. An individual has weak dictatorial power if the support of the social choice lottery is always a subset of his/her maximal utility set. In contrast to Gibbard's complete characterization of random dictatorship, we also demonstrate with an example that strategy-proofness and unanimity are sufficient but not necessary conditions for a weak random dictatorship.

5.
It is proved that every strategy-proof, peaks-only or unanimous, probabilistic rule defined over a minimally rich domain of single-peaked preferences is a probability mixture of strategy-proof, peaks-only or unanimous, deterministic rules over the same domain. The proof employs Farkas' Lemma and the max-flow min-cut theorem for capacitated networks.

6.
We study a probabilistic assignment problem when agents have multi-unit demands for objects. We first introduce two fairness requirements to accommodate different demands across agents. We show that each of these requirements is incompatible with stochastic dominance efficiency (henceforth, we use the prefix “sd” for stochastic dominance) and weak sd-strategy-proofness, unless all agents have unitary demands. We next introduce a new incentive requirement which we call limited invariance. We explore the implications of these requirements in combination with consistency or converse consistency. Our main result is that the generalized serial rule, which we propose as an adaptation of the serial rule to our setting, is the only rule satisfying sd-efficiency, the sd proportional-division lower bound, limited invariance, and consistency. Uniqueness persists if we replace the sd proportional-division lower bound by sd normalized-no-envy, or consistency by converse consistency, or both. The serial rule in Bogomolnaia and Moulin (2001) is characterized as a special case of our generalized serial rule.
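For reference, the unit-demand serial rule of Bogomolnaia and Moulin (2001), which the generalized serial rule above extends, can be sketched as the simultaneous-eating procedure. This is the classic single-unit version only, not the paper's multi-unit generalization, and it assumes at least as many objects as agents:

```python
from fractions import Fraction

def serial_rule(prefs):
    """Probabilistic serial (simultaneous eating) rule, unit-demand case.
    prefs[i] lists agent i's objects best-first; returns p[i][o], the
    probability that agent i receives object o.  Exact Fraction arithmetic."""
    n = len(prefs)
    objects = {o for pr in prefs for o in pr}
    supply = {o: Fraction(1) for o in objects}
    p = [{o: Fraction(0) for o in objects} for _ in range(n)]
    eaten = [Fraction(0)] * n          # each agent eats one unit in total
    while any(e < 1 for e in eaten):
        active = [i for i in range(n) if eaten[i] < 1]
        # each active agent eats its best object that still has supply
        target = {i: next(o for o in prefs[i] if supply[o] > 0) for i in active}
        demand = {}
        for o in target.values():
            demand[o] = demand.get(o, 0) + 1
        # advance time until an object runs out or an agent is full
        dt = min(min(supply[o] / demand[o] for o in demand),
                 min(Fraction(1) - eaten[i] for i in active))
        for i in active:
            o = target[i]
            p[i][o] += dt
            supply[o] -= dt
            eaten[i] += dt
    return p
```

For example, two agents who both rank `a` over `b` each receive `a` with probability 1/2 and `b` with probability 1/2.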

7.
Future changes in population size, composition, and spatial distribution are key factors in the analysis of climate change, and their future evolution is highly uncertain. In climate change analyses, population uncertainty has traditionally been accounted for by using alternative scenarios spanning a range of outcomes. This paper illustrates how conditional probabilistic projections offer a means of combining probabilistic approaches with the scenario-based approach typically employed in the development of greenhouse gas emissions projections. The illustration combines a set of emissions scenarios developed by the Intergovernmental Panel on Climate Change (IPCC) with existing probabilistic population projections from IIASA. Results demonstrate that conditional probabilistic projections have the potential to account more fully for uncertainty in emissions within conditional storylines about future development patterns, to provide a context for judging the consistency of individual scenarios with a given storyline, and to provide insight into relative likelihoods across storylines, at least from a demographic perspective. They may also serve as a step toward more comprehensive quantification of uncertainty in emissions projections.

8.
This paper introduces a notion of consistency for the probabilistic assignment model, which we call probabilistic consistency. We show that the axioms equal treatment of equals and probabilistic consistency characterize the uniform rule, which is the rule which randomizes uniformly over all possible assignments.
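The uniform rule characterized above is easy to compute by brute force in the equal-numbers case: enumerate every deterministic assignment and average. The sketch below does exactly that (feasible only for small n, since there are n! assignments):

```python
from fractions import Fraction
from itertools import permutations

def uniform_assignment_rule(agents, objects):
    """Randomize uniformly over all deterministic assignments (bijections,
    assuming equally many agents and objects) and return the induced
    marginal probability matrix p[agent][object]."""
    perms = list(permutations(objects))
    p = {a: {o: Fraction(0) for o in objects} for a in agents}
    for perm in perms:
        for a, o in zip(agents, perm):
            p[a][o] += Fraction(1, len(perms))
    return p
```

By symmetry every agent receives every object with probability 1/n, so the rule trivially satisfies equal treatment of equals.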

9.
We characterize the class of dominant-strategy incentive-compatible (or strategy-proof) random social choice functions in the standard multi-dimensional voting model where voter preferences over the various dimensions (or components) are lexicographically separable. We show that these social choice functions (which we call generalized random dictatorships) are induced by probability distributions on voter sequences of length equal to the number of components. They induce a fixed probability distribution on the product set of voter peaks. The marginal probability distribution over every component is a random dictatorship. Our results generalize the classic random dictatorship result in Gibbard (1977) and the decomposability results for strategy-proof deterministic social choice functions for multi-dimensional models with separable preferences obtained in LeBreton and Sen (1999).
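A minimal sketch of the generalized-random-dictatorship construction described above, assuming peaks and a probability distribution over voter sequences are given as plain Python data (names and encoding are illustrative, not from the paper):

```python
from fractions import Fraction

def generalized_random_dictatorship(peaks, seq_dist):
    """seq_dist maps voter sequences (one voter per component) to
    probabilities; each sequence elects, on component k, the peak of the
    k-th voter in the sequence.  peaks[v] is voter v's peak vector.
    Returns the induced distribution over outcome vectors."""
    out = {}
    for seq, pr in seq_dist.items():
        outcome = tuple(peaks[v][k] for k, v in enumerate(seq))
        out[outcome] = out.get(outcome, Fraction(0)) + pr
    return out
```

With a constant sequence for each voter this reduces to the classic Gibbard random dictatorship; mixing voters across components is what the multi-dimensional characterization adds.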

10.
Sprumont (1991) has established that the only allocation rule for the division problem that is strategy-proof, efficient, and anonymous is the uniform rule when the domain is the set of all possible profiles of continuous single-peaked preferences. Sprumont's characterization of the uniform rule is shown to hold on any larger domain of single-peaked preferences. Received: 15 December 1998 / Accepted: 12 April 1999
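Sprumont's uniform rule itself is simple to compute: under excess demand each agent receives min(peak, λ), under excess supply max(peak, λ), with λ chosen so the shares exhaust the amount. The sketch below solves for λ by bisection (the tolerance and data are illustrative):

```python
def uniform_division(peaks, amount, tol=1e-9):
    """Uniform rule for the division problem: clip each peak at a common
    level lam chosen so the shares sum to `amount`.  Numeric sketch."""
    excess_demand = sum(peaks) >= amount
    clip = (lambda p, lam: min(p, lam)) if excess_demand else \
           (lambda p, lam: max(p, lam))
    lo, hi = 0.0, max(amount, max(peaks))
    while hi - lo > tol:                 # total share is increasing in lam
        lam = (lo + hi) / 2
        if sum(clip(p, lam) for p in peaks) < amount:
            lo = lam
        else:
            hi = lam
    return [clip(p, (lo + hi) / 2) for p in peaks]
```

For peaks (0.5, 0.3, 0.4) and one unit to divide, demand exceeds supply and λ = 0.35, giving shares (0.35, 0.3, 0.35): the smallest peak is fully honored and the remainder is split equally.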

11.
We extend the Shapley-Scarf (1974) model - where a finite number of indivisible objects is to be allocated among a finite number of individuals - to the case where the primary endowment set of an individual may contain none, one, or several objects and where property rights may be transferred (objects inherited) as the allocation process unfolds, under the retained assumption that an individual consumes at most one object. In this environment we analyze the core of the economy and characterize the set of strategy-proof and Pareto efficient mechanisms. As an alternative approach, we consider property rights implicitly defined by a strategy-proof and Pareto efficient mechanism and show a core property for the mechanism-induced endowment rule. Received: 19 February 2004, Accepted: 14 April 2005, JEL Classification: C71, C78, D71, D78. We would like to thank two anonymous referees for valuable comments. Financial support from The Swedish Council for Research in the Humanities and Social Sciences is gratefully acknowledged by Lars-Gunnar Svensson. Financial support from The Jan Wallander and Tom Hedelius Foundation is gratefully acknowledged by Bo Larsson.
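The classic one-object-per-agent Shapley-Scarf benchmark that this paper extends is solved by Gale's top-trading-cycles algorithm, sketched below (this is the standard benchmark mechanism, not the paper's multi-object extension):

```python
def top_trading_cycles(endowment, prefs):
    """Gale's top-trading-cycles for the classic Shapley-Scarf model:
    endowment[i] is agent i's object, prefs[i] ranks objects best-first.
    Returns the final assignment {agent: object}."""
    owner = {obj: i for i, obj in enumerate(endowment)}
    assignment = {}
    remaining = set(owner)
    while remaining:
        # each remaining agent points at the owner of its best remaining object
        point = {owner[o]: next(x for x in prefs[owner[o]] if x in remaining)
                 for o in remaining}
        # walk the pointer graph until a cycle repeats
        seen = []
        i = owner[next(iter(remaining))]
        while i not in seen:
            seen.append(i)
            i = owner[point[i]]
        cycle = seen[seen.index(i):]
        # trade along the cycle and retire those agents' objects
        for a in cycle:
            assignment[a] = point[a]
            remaining.discard(point[a])
    return assignment
```

The resulting mechanism is the unique core allocation of the classic model and is strategy-proof and Pareto efficient, which is the baseline the paper's characterization generalizes.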

12.
In this paper we assess the relative contribution of working conditions to wage determination, with an emphasis on differences along the earnings distribution. A 2001 survey of British employees, rich in questions about the job post, enables us to separate the contribution of working conditions, job attributes and individual characteristics to the process of wage determination. Standard wage equations reveal that covariates such as having a “repetitive job” and using generic skills such as “literacy” or “customer handling skills” are associated with significant premiums and penalties. Quantile regressions confirm the presence of penalties to poor working conditions, such as “working to tight deadlines”, that are significant in the middle section of the earnings distribution and robust to the inclusion of a wide range of controls for person, firm and other job characteristics. Counterfactual decompositions at quantiles show that, despite the apparent penalty, there are pecuniary compensations to poor working conditions around the first quartile and the median of the earnings distribution.

13.
Exchangeability and non-self-averaging
To pass from a deterministic dynamics of aggregate quantities to a probabilistic dynamics of a system of microvariables describing the individual strategies of a population of economic agents, the natural route is that of Boltzmann's kinetic theory of the mid-nineteenth century (more suitable than Gibbs' statistical mechanics): introduce n "elements" (molecules, agents, ...), subject them to some microdynamics, and derive the macroscopic behavior from there. The macrovariable is interpreted as a (time) mean of the average (over all elements) of the individual property under study at time t. The micro-derivation is unproblematic if means and averages tend to constant values in the limit n → ∞. If this property, called "self-averaging" in some recent papers by Aoki, holds, it separates a deterministic result from fluctuations, and well-defined deterministic macroeconomic relations prevail. However, it is easy to show that in most economic settings this property does not hold, owing to long-range correlations among economic agents. If individual agents are not independent but exchangeable, then even in the limit n → ∞ the coefficient of variation of the macrovariable remains finite, and the macrovariable tends to a random limit rather than a constant. Finally, the term "indistinguishable agent" is criticized, and the alternative "exchangeable agent" is discussed.
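The failure of self-averaging under exchangeability can be illustrated by a simulation contrasting independent Bernoulli agents with exchangeable but correlated Pólya-urn agents; the latter's empirical mean tends to a random (Beta) limit, so its coefficient of variation does not vanish as n grows. All parameters here are illustrative:

```python
import random

def cv_of_mean(sample_fn, n, reps=2000):
    """Coefficient of variation of the empirical mean over n 'agents',
    estimated from independent replications."""
    means = [sample_fn(n) for _ in range(reps)]
    mu = sum(means) / reps
    var = sum((m - mu) ** 2 for m in means) / reps
    return (var ** 0.5) / mu

def iid_mean(n):
    """Independent fair-coin agents: self-averaging, CV -> 0 as n grows."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

def polya_mean(n):
    """Polya-urn agents: exchangeable but long-range correlated; the mean
    tends to a uniform random limit, so the CV stays bounded away from 0."""
    red, total, hits = 1, 2, 0
    for _ in range(n):
        if random.random() < red / total:
            red += 1
            hits += 1
        total += 1
    return hits / n
```

At n = 400 the iid coefficient of variation is already around 0.05, while the Pólya-urn one stays near 0.58, the value implied by its uniform limit.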

14.
We provide a set of conditions sufficient for consistency of a general class of fixed effects instrumental variables (FE-IV) estimators in the context of a correlated random coefficient panel data model, where one ignores the presence of individual-specific slopes. We discuss cases where the assumptions are met and violated. Monte Carlo simulations verify that the FE-IV estimator of the population averaged effect performs notably better than other standard estimators, provided a full set of period dummies is included. We also propose a simple test of selection bias in unbalanced panels when we suspect the slopes may vary by individual.

15.
Using longitudinal data on individuals from the European Community Household Panel (ECHP) for eleven countries during 1995-2001, I investigate temporary job contract duration and job search effort. The countries are Austria, Belgium, Denmark, Finland, France, Greece, Ireland, Italy, the Netherlands, Portugal and Spain. I construct a search model for workers in temporary jobs which predicts that shorter duration raises search intensity. Calibration of the model to the ECHP data implies that at least 75% of the increase in search intensity over the life of a 2+ year temporary contract occurs in the last six months of the contract. I then estimate regression models for search effort that control for human capital, pay, local unemployment, and individual and time fixed effects. I find that workers on temporary jobs indeed search harder than those on permanent jobs. Moreover, search intensity increases as temporary job duration falls, and roughly 84% of this increase occurs on average in the shortest duration jobs. These results are robust to disaggregation by gender and by country. These empirical results are noteworthy, since it is not necessary to assume myopia or hyperbolic discounting in order to explain them, although the data clearly also do not rule out such explanations.

16.
I study a market where agents with unit demand jointly own heterogeneous goods. In this market, the existence of an efficient, incentive compatible, individually rational, and budget balanced mechanism depends on the shares of the agents. I characterize the set of shares for which having such a mechanism is possible. This set includes the symmetric allocation and excludes the allocation in which every agent owns a separate good.

17.
In Bierens (1981) we derived a uniform weak law of large numbers for stochastically stable processes with respect to a finite-dependent base. In this paper we show that this uniform weak law carries over to stochastically stable processes with respect to a more general φ-mixing base. This generalization is used to relax the conditions for weak consistency and asymptotic normality of nonlinear least squares estimators.

18.
Many hypotheses made by experimental researchers can be formulated as a stochastic labelling of a given image. Some stochastic labelling methods for random closed sets are proposed in this paper. Molchanov (1984, Theor. Probability and Math. Statist. 29, 113–119) provided the probabilistic background for this problem. However, there is a lack of specific labelling models. Ayala and Simó (1995, Advances in Applied Probability 27, 293–305) proposed a method in which, given the whole set of connected components, every component is classified into a certain phase or category in a completely random way. Alternative methods are necessary when the random labelling hypothesis is not reliable. A different kind of labelling method is proposed that takes the environment into account: the type of every connected component is a function of its location. Two different biphase images are studied: a cross section of a nerve from a rat, and a cross section of an optic nerve from a lizard.

19.
If missing observations in a panel data set are not missing at random, many widely applied estimators may be inconsistent. In this paper we examine empirically several ways to reveal the nature and severity of the selectivity problem due to nonresponse, as well as a number of methods to estimate the resulting models. Using a life cycle consumption function and data from the Expenditure Index Panel from the Netherlands, we discuss simple procedures that can be used to assess whether observations are missing at random, and we consider more complicated estimation procedures that can be used to obtain consistent or efficient estimates in the presence of selectivity or attrition bias. Finally, some attention is paid to the differences in identification, consistency, and efficiency between inferences from a single wave of the panel, a balanced sub-panel, and an unbalanced panel.

20.
This paper provides a first attempt at conceptualizing and operationalizing the notion of commitment to customer service (CCS) as part of a broader concern to explore the determinants of key aspects of service quality and of individual-level performance in service organizations. Based on an explicitly behavioral definition of commitment to customer service, we first set out a model of the antecedents of CCS. We then test it using data from a representative sample of 717 employees of a major food-retailing organization in the UK. The results suggest that commitment to customer service is primarily a non-calculative phenomenon driven above all by affective, normative, and altruistic concerns, rather than by overtly instrumental considerations. Additional significant determinants of CCS were job pressure, job routinization, job competence and employees' understanding of customer service requirements. Research and policy implications of the study are discussed.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号