Similar documents
20 similar documents retrieved.
1.
Drug use and problems change dramatically over time in ways that are often described as reflecting an “epidemic cycle”. We use simulation of a model of drug epidemics to investigate how the relative effectiveness of different types of prevention varies over the course of such an epidemic. Specifically, we use the so-called LHY model (see Discussion Paper No. 251, Institute of Econometrics, OR, and Systems Theory, Vienna University of Technology, Vienna, Austria, 2000), which includes both “contagious” spread of initiation (a positive feedback) and memory of past use (a negative feedback) that dampens initiation and, hence, future use. The analysis confirms the common-sense intuition that prevention is more highly leveraged early in an epidemic, although the extent to which this is true in this model is striking, particularly for campaigns designed to preserve or amplify awareness of the drug's dangers. The findings also suggest that the design of “secondary” prevention programs should change over the course of an epidemic.
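The feedback structure just described can be sketched numerically. The following is a toy epidemic model in the spirit of the LHY model, with hypothetical functional forms and parameter values (not the published specification): initiation spreads “contagiously” with light use L, while a memory stock Y of past heavy use H dampens it, and a prevention campaign is modeled as amplifying that dampening feedback.

```python
import math

def simulate(prevention_boost=1.0, years=50, dt=0.1):
    """Euler integration; prevention_boost > 1 mimics a campaign that
    amplifies awareness of the drug's dangers (strengthens the Y feedback)."""
    L, H, Y = 1.0, 0.1, 0.0            # light users, heavy users, memory stock
    a, b, g, d = 0.2, 0.05, 0.1, 0.1   # quit, escalation, desist, memory-decay rates
    tau, s, q = 0.1, 0.6, 2.0          # baseline inflow, contagion, dampening strength
    path = []
    for _ in range(round(years / dt)):
        # Contagious initiation, exponentially dampened by memory per capita
        init = tau + s * L * math.exp(-prevention_boost * q * Y / max(L, 1e-9))
        L, H, Y = (L + dt * (init - (a + b) * L),
                   H + dt * (b * L - g * H),
                   Y + dt * (H - d * Y))
        path.append(L + H)
    return path

base, boosted = simulate(1.0), simulate(2.0)
print(max(base), max(boosted))   # the awareness campaign lowers the epidemic peak
```

Running both trajectories from identical initial conditions shows the leverage claim qualitatively: strengthening the memory feedback suppresses initiation during the growth phase, so the boosted scenario peaks lower.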

2.
C. Yalçın Kaya 《Socio》 2004, 38(1): 57-72
Behrens et al. developed a control model for the cocaine epidemic in the US. They investigated the problem of allocating resources to treatment, where the objective is to minimize social and control costs. The optimal control strategy they found for a particular case of a constrained budget resembles bang-bang control: a treatment-only period follows a prevention-only period. This motivated us to look at the problem of bringing the size of the epidemic down to a target level in the minimum possible time, to find the necessary number of switchings between treatment and prevention, and to determine the instants at which these switchings occur. We carry out a computational analysis for the so-called early- and late-action scenarios, where the controls take effect from the years 1967 and 1990, respectively. We investigate the problem in terms of the minimized process time, the cost of control, and the target ratio of heavy users. The best strategy in either scenario emerges as “first prevention, then treatment”, with a rather surprising additional requirement: to reach the target in the minimum possible time, the ratio of heavy users at the target should be prescribed as small as possible. This also yields the lowest cost. Further comparisons and interpretations are given.

3.
Extreme price dispersion is a hallmark of illegal drug markets, and this apparent contradiction to the law of one price has long puzzled drug market economists. We propose a novel explanation for this dispersion: the coupling of dealers’ unwillingness to hold inventory with dealers’ imperfect foresight concerning future prices and/or random lead times when “ordering” drugs from higher-level suppliers. Unwillingness to hold inventory means drug markets might operate consistent with a cobweb model. The classic cobweb model was inspired by the observation of cyclic (typically annual) fluctuations in commodity prices. However, with minor changes that make the model more realistic the resulting price trajectories can be highly variable or even chaotic, not just periodic. Cobweb dynamics can also amplify the variability created by supply chain disruptions.
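A minimal cobweb sketch with illustrative parameters (not the paper's calibration) makes the mechanism concrete: dealers unwilling to hold inventory choose this period's quantity from last period's price, and price then clears a linear inverse demand curve p = 10 - 2q. A gentle linear supply response damps toward equilibrium, while a steeper S-shaped response keeps prices oscillating instead of settling; still richer supply shapes can produce chaotic paths.

```python
import math
import statistics

def cobweb(supply, p0=1.0, periods=100):
    prices = [p0]
    for _ in range(periods):
        q = supply(prices[-1])          # quantity "ordered" at last period's price
        prices.append(10.0 - 2.0 * q)   # market-clearing price from inverse demand
    return prices

damped = cobweb(lambda p: 0.5 + 0.4 * p)                      # composed slope -0.8
volatile = cobweb(lambda p: 2.5 + 0.75 * math.tanh(p - 5.0))  # slope -1.5 at p = 5

# Convergent vs. persistently fluctuating price paths.
print(statistics.pstdev(damped[-20:]), statistics.pstdev(volatile[-20:]))
```

The damped case settles at p = 5, while the steep-supply case never converges: the fixed point is locally unstable, so prices keep swinging between high and low values, which is the dispersion mechanism the abstract appeals to.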

4.
We provide a preference foundation for decision under risk resulting in a model where probability weighting is linear as long as the corresponding probabilities are not extreme (i.e., 0 or 1). This way, most of the elegance and mathematical tractability of expected utility is maintained, and also much of its normative foundation. Yet, the new model can accommodate the extreme sensitivity towards changes from 0 to almost impossible and from almost certain to 1 that has widely been documented in the experimental literature. The model can be viewed as “expected utility with the best and worst in mind” as suggested by Chateauneuf, Eichberger, and Grant (2007, “Choice under uncertainty with the best and worst in mind: NEO-additive capacities”, Journal of Economic Theory 137, 538–567) or, following our preference foundation, interpreted as “expected utility with consistent optimism and pessimism”.
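One common NEO-additive parameterization of this “best and worst in mind” evaluation can be sketched directly (the weights below are illustrative, and utility is taken as linear for simplicity): put weight `optimism` on the best outcome, `pessimism` on the worst, and the remainder on plain expected utility.

```python
def neo_additive_value(outcomes, probs, optimism=0.1, pessimism=0.1):
    # Linear ("expected utility") part for non-extreme probabilities
    eu = sum(p * x for p, x in zip(probs, outcomes))
    # Extra weight on the best and worst outcomes captures the kinks at 0 and 1
    return (optimism * max(outcomes) + pessimism * min(outcomes)
            + (1.0 - optimism - pessimism) * eu)

# A 1-in-1000 shot at 1000: plain expected utility is 1.0, but a small
# weight on the best outcome inflates the value, mimicking the documented
# oversensitivity to changes away from probability 0.
value = neo_additive_value([1000.0, 0.0], [0.001, 0.999])
print(value)
```

With these weights the lottery is valued at 100.8 rather than 1.0: the jump comes entirely from the fixed weight on the best outcome, while changes among interior probabilities would still be treated linearly.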

5.
Microeconometric treatments of discrete choice under risk are typically homoscedastic latent variable models. Specifically, choice probabilities are given by preference functional differences (given by expected utility, rank-dependent utility, etc.) embedded in cumulative distribution functions. This approach has a problem: Estimated utility function parameters meant to represent agents’ degree of risk aversion in the sense of Pratt (1964) do not imply a suggested “stochastically more risk averse” relation within such models. A new heteroscedastic model called “contextual utility” remedies this, and estimates in one data set suggest it explains (and especially predicts) as well as or better than other stochastic models.

6.
We examine the econometric implications of the decision problem faced by a profit/utility-maximizing lender operating in a simple “double-binary” environment, where the two actions available are “approve” or “reject”, and the two states of the world are “pay back” or “default”. In practice, such decisions are often made by applying a fixed cutoff to the maximum likelihood estimate of a parametric model of the default probability. Following Elliott and Lieli (2007), we argue that this practice might contradict the lender’s economic objective and, using German loan data, we illustrate the use of “context-specific” cutoffs and an estimation method derived directly from the lender’s problem. We also provide a brief discussion of how to incorporate legal constraints, such as the prohibition of disparate treatment of potential borrowers, into the lender’s problem.
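The logic of a context-specific cutoff can be sketched with hypothetical payoffs (this is the textbook expected-profit argument, not the Elliott-Lieli estimator): if a repaid loan earns r and a default loses l, approving yields expected profit (1 - p)·r - p·l, so the profit-maximizing cutoff on the default probability p is r / (r + l), which generally differs from a fixed conventional cutoff such as 0.5.

```python
def approve(p_default, r=0.2, loss=1.0):
    """Approve iff expected profit of lending is positive.

    r and loss are illustrative per-unit payoffs; the implied
    context-specific cutoff is r / (r + loss) = 1/6 here, not 0.5."""
    cutoff = r / (r + loss)
    return p_default < cutoff

print(approve(0.10))  # True:  expected profit 0.9 * 0.2 - 0.1 * 1.0 = 0.08
print(approve(0.30))  # False: expected profit 0.7 * 0.2 - 0.3 * 1.0 < 0
```

An applicant with a 30% estimated default probability would pass a naive 0.5 cutoff but is rejected here, illustrating how the economically derived cutoff can reverse decisions.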

7.
We take as a starting point the existence of a joint distribution implied by different dynamic stochastic general equilibrium (DSGE) models, all of which are potentially misspecified. Our objective is to compare “true” joint distributions with ones generated by given DSGEs. This is accomplished via comparison of the empirical joint distributions (or confidence intervals) of historical and simulated time series. The tool draws on recent advances in the theory of the bootstrap, Kolmogorov type testing, and other work on the evaluation of DSGEs, aimed at comparing the second order properties of historical and simulated time series. We begin by fixing a given model as the “benchmark” model, against which all “alternative” models are to be compared. We then test whether at least one of the alternative models provides a more “accurate” approximation to the true cumulative distribution than does the benchmark model, where accuracy is measured in terms of distributional square error. Bootstrap critical values are discussed, and an illustrative example is given, in which it is shown that alternative versions of a standard DSGE model in which calibrated parameters are allowed to vary slightly perform equally well. On the other hand, there are stark differences between models when the shocks driving the models are assigned non-plausible variances and/or distributional assumptions.
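The core accuracy measure can be sketched in a few lines (a simplified distributional squared-error comparison on a grid, with no bootstrap and toy Gaussian series standing in for historical and model-simulated data): each model is scored by the mean squared distance between its simulated series' empirical CDF and the historical one.

```python
import random

def ecdf(sample, x):
    return sum(1 for s in sample if s <= x) / len(sample)

def msde(historical, simulated, grid):
    """Mean squared distributional error over a grid of evaluation points."""
    return sum((ecdf(historical, x) - ecdf(simulated, x)) ** 2
               for x in grid) / len(grid)

random.seed(0)
hist = [random.gauss(0.0, 1.0) for _ in range(500)]
model_a = [random.gauss(0.0, 1.0) for _ in range(500)]  # plausible shock variance
model_b = [random.gauss(0.0, 2.0) for _ in range(500)]  # implausible shock variance
grid = [-3.0 + 0.1 * i for i in range(61)]

# The model with the implausible shock variance fits the distribution worse.
print(msde(hist, model_a, grid), msde(hist, model_b, grid))
```

This mirrors the abstract's finding in miniature: models differing only slightly from the data-generating process score close to the benchmark, while an implausible shock variance produces a starkly larger distributional error.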

8.
Alcohol consumption is a function of social dynamics, environmental contexts, individuals' preferences and family history. Empirical surveys have focused primarily on identification of risk factors for high-level drinking but have done little to clarify the underlying mechanisms at work. Also, there have been few attempts to apply nonlinear dynamics to the study of these mechanisms and processes at the population level. A simple framework where drinking is modeled as a socially contagious process in low- and high-risk connected environments is introduced. Individuals are classified as light, moderate (assumed mobile), and heavy drinkers. Moderate drinkers provide the link between both environments, that is, they are assumed to be the only individuals drinking in both settings. The focus here is on the effect of moderate drinkers, measured by the proportion of their time spent in “low-” versus “high-” risk drinking environments, on the distribution of drinkers. A simple model within our contact framework predicts that if the relative residence times of moderate drinkers are distributed randomly between low- and high-risk environments then the proportion of heavy drinkers is likely to be higher than expected. However, the full story even in a highly simplified setting is not so simple, because “strong” local social mixing tends to increase high-risk drinking on its own. High levels of social interaction between light and moderate drinkers in low-risk environments can diminish the importance of the distribution of relative drinking times on the prevalence of heavy drinking.

9.
I develop an omnibus specification test for diffusion models based on the infinitesimal operator. The infinitesimal operator based identification of the diffusion process is equivalent to a “martingale hypothesis” for the processes obtained by a transformation of the original diffusion model. My test procedure is then constructed by checking the “martingale hypothesis” via a multivariate generalized spectral derivative based approach that delivers an N(0,1) asymptotic null distribution for the test statistic. The infinitesimal operator of the diffusion process is a closed-form function of drift and diffusion terms. Consequently, my test procedure covers both univariate and multivariate diffusion models in a unified framework and is particularly convenient for the multivariate case. Moreover, different transformed martingale processes contain separate information about the drift and diffusion specifications. This motivates me to propose a separate inferential test procedure to explore the sources of rejection when a parametric form is rejected. Simulation studies show that the proposed tests have reasonable size and excellent power performance. An empirical application of my test procedure using Eurodollar interest rates finds that most popular short-rate models are rejected and the drift misspecification plays an important role in such rejections.

10.
Amy R. Wilson, James G. Kahn 《Socio》 2003, 37(4): 269-288
Injection drug users (IDUs) transmit the human immunodeficiency virus (HIV) via both needle sharing and sex. Available interventions for this population have varying costs and effectiveness and focus on different risk behaviors. In this analysis, we look at two interventions. One is inexpensive, broad-based and provides modest risk reductions (street outreach (SO)); the other is narrowly focused, expensive and relatively effective (methadone maintenance). This analysis explores the effects of population risk behavior, intervention effectiveness, intervention costs, and decision constraints when allocating funds between these two interventions to maximize effectiveness. We develop a model of the spread of HIV, dividing IDUs into susceptibles (uninfected) and infectives, and separately portraying sex and injection risk. We simulate the epidemic in New York City for time periods from the mid-1980s to the early 1990s, and incorporate the behavioral effects of the two interventions performed singly or in combination to find the allocation that maximizes the number of infections averted in the IDUs and their noninjecting sex partners, assuming interventions have increasing marginal costs. We find that the optimal allocation nearly always involves spending the maximum allowable amount on SO. This result is largely insensitive to variations in risk parameters, intervention efficacy, or cost. The model's structure, however, makes clear that many factors contribute to this insensitivity, namely the scope of the interventions, the dual drug/sex nature of HIV risk in the population, the asymmetry of sexual risk for men and women, and the potential benefits to non-IDUs.
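The allocation exercise can be sketched with hypothetical response curves (these are illustrative concave effectiveness functions, not the paper's calibrated epidemic model): split a fixed budget between the cheap, broad street-outreach program and the expensive, focused methadone program, give each diminishing returns (increasing marginal costs), and grid-search for the split that maximizes infections averted.

```python
import math

def averted(so_dollars, mm_dollars):
    """Infections averted under illustrative concave response curves."""
    so = 40.0 * math.log1p(so_dollars / 100_000.0)   # street outreach: cheap, broad
    mm = 25.0 * math.log1p(mm_dollars / 400_000.0)   # methadone: costly, focused
    return so + mm

budget = 1_000_000.0
# Grid search over $10k increments of street-outreach spending.
best = max((averted(b, budget - b), b) for b in range(0, 1_000_001, 10_000))
print(best)  # (infections averted, optimal street-outreach spending)
```

Even with these made-up curves, the cheaper broad program absorbs most of the optimal budget, echoing (but not reproducing) the paper's SO-heavy finding; the qualitative driver is simply that its marginal dollar buys more until deep into the budget.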

11.
The recent general equilibrium theory of trade and multinationals emphasizes the importance of third countries and the complex integration strategies of multinationals. Little has been done to test this theory empirically. This paper attempts to rectify this situation by considering not only bilateral determinants, but also spatially weighted third-country determinants of foreign direct investment (FDI). Since the dependency among host markets is particularly related to multinationals’ trade between them, we use trade costs (distances) as spatial weights. Using panel data on U.S. industries and host countries observed over the 1989–1999 period, we estimate a “complex FDI” version of the knowledge-capital model of U.S. outward FDI by various recently developed spatial panel data generalized moments (GM) estimators. We find that third-country effects are significant, lending support to the existence of various modes of complex FDI.
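The spatially weighted third-country term can be sketched with toy numbers (three hypothetical host countries; the distances and FDI values are made up): weight every other host's FDI by inverse distance and row-normalize, so each host gets a weighted average of the others' FDI as its "third-country" regressor.

```python
def spatial_lag(fdi, dist):
    """Row-normalized inverse-distance spatial lag of the FDI vector."""
    n = len(fdi)
    lagged = []
    for j in range(n):
        w = [0.0 if k == j else 1.0 / dist[j][k] for k in range(n)]
        total = sum(w)                       # row-normalization constant
        lagged.append(sum(w[k] * fdi[k] for k in range(n)) / total)
    return lagged

fdi = [100.0, 40.0, 10.0]                 # FDI into three hypothetical hosts
dist = [[0, 1, 4],                        # symmetric pairwise trade costs
        [1, 0, 4],
        [4, 4, 0]]
print(spatial_lag(fdi, dist))
```

Host 0's third-country term (34.0) leans heavily on nearby host 1, while the remote host 2 averages the two distant markets equally; in the estimation this lagged vector enters the FDI equation alongside the usual bilateral determinants.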

12.
The focus of this paper is twofold: (1) an examination of the factors related to the “anticipation” of potential innovations in any organizational setting and (2) the identification of strategies for the diffusion and implementation of operations research/management science (OR/MS) techniques in a particular developing region. Based on the methodology used in studying change (innovation) in health care systems, a managerial innovation model incorporating four main components [the executive, the organization, the task environment of the organization, and change agent(s), including the OR/MS manager and outside consultants] is developed and examined in terms of data obtained from top executives and other managers in forty industrial firms in Cali, Colombia. In the model developed, the process of innovation is decomposed into the levels of: (a) attitudes and motivations of the executive, (b) “readiness” to take action, (c) action characteristics, (d) triggering cues, and (e) actions taken and evaluation (feedback loop). The model was found useful for providing predictions indicating areas to which intervention and “marketing” of OR/MS strategies should be devoted. Overall, the study provides a base for comparative and longitudinal studies.

13.
We study the coevolution of networks and action choices in a Prisoners' Dilemma. Agents in our model learn about both action choices and choices of interaction partners (links) by imitating successful behavior of others. The resulting dynamics yields outcomes where both cooperators and defectors coexist under a wide range of parameters. Two scenarios can arise. Either there is “full separation” of defectors and cooperators, i.e. they are found in two different, disconnected components. Or there is “marginalization” of defectors, i.e. connected networks emerge with a center of cooperators and a periphery of defectors.

14.
A regression discontinuity (RD) research design is appropriate for program evaluation problems in which treatment status (or the probability of treatment) depends on whether an observed covariate exceeds a fixed threshold. In many applications the treatment-determining covariate is discrete. This makes it impossible to compare outcomes for observations “just above” and “just below” the treatment threshold, and requires the researcher to choose a functional form for the relationship between the treatment variable and the outcomes of interest. We propose a simple econometric procedure to account for uncertainty in the choice of functional form for RD designs with discrete support. In particular, we model deviations of the true regression function from a given approximating function—the specification errors—as random. Conventional standard errors ignore the group structure induced by specification errors and tend to overstate the precision of the estimated program impacts. The proposed inference procedure that allows for specification error also has a natural interpretation within a Bayesian framework.

15.
Structural vs. atheoretic approaches to econometrics
In this paper I attempt to lay out the sources of conflict between the so-called “structural” and “experimentalist” camps in econometrics. Critics of the structural approach often assert that it produces results that rely on too many assumptions to be credible, and that the experimentalist approach provides an alternative that relies on fewer assumptions. Here, I argue that this is a false dichotomy. All econometric work relies heavily on a priori assumptions. The main difference between structural and experimental (or “atheoretic”) approaches is not in the number of assumptions but the extent to which they are made explicit.

16.
The “global game with strategic substitutes and complements” of Karp et al. (2007) is used to model the decision of where to fish. A complete information game is assumed, but the model is generalized to S > 1 sites. In this game, a fisherman’s payoff depends on fish density in each site and the actions of other fishermen which can lead to congestion or agglomeration effects. Stable and unstable equilibria are characterized, as well as notions of equilibrium dominance. The model is applied to the Alaskan flatfish fishery by specifying a strategic interaction function (response to congestion) that is a non-linear function of the degree of congestion present in a given site. Results suggest that the interaction function may be non-monotonic in congestion.

17.
A.-B. El-Sayed 《Metrika》 1978, 25(1): 193-208
The variation of the equivocation and the average mutual information over cascaded channels is studied, using the generalized information measures (entropies of degree β) and Rényi information measures (entropies of order α). The equivocation inequality, which indicates that the equivocation can never decrease as we go further from the input on a sequence of cascaded channels, is shown to be satisfied by Rényi entropies (for 0 < α < 1) for all channels and all probability distributions. The generalized entropies are shown to satisfy this inequality for binary symmetric channels for certain probability distributions. The relations among the mutual information measures between the different terminals of the sequence of cascaded channels are studied, considering both the generalized and Rényi entropies. A necessary and sufficient condition for transmitting information without any loss across cascaded channels is obtained when Rényi entropies are applied. The same condition is proved to be sufficient for achieving the equivocation equality (which indicates the case of no information loss across cascaded channels) over the class of binary symmetric channels, when the generalized entropies are applied.
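The equivocation inequality can be checked numerically in a simple special case (a uniform input over cascaded binary symmetric channels, with "Rényi equivocation" taken as the Rényi entropy of the posterior, one common convention; the crossover probabilities are illustrative): two BSCs with crossover probabilities p and q in series behave like one BSC with crossover p + q - 2pq, so the equivocation can only grow as we move further from the input.

```python
import math

def renyi_bernoulli(p, alpha=0.5):
    """Renyi entropy of order alpha (in bits) of a Bernoulli(p) distribution."""
    return math.log2(p ** alpha + (1 - p) ** alpha) / (1 - alpha)

def cascade(p, q):
    """Effective crossover probability of two BSCs in series."""
    return p + q - 2 * p * q

p_eff, equivocations = 0.0, []
for crossover in [0.1, 0.1, 0.1]:          # three identical BSCs in series
    p_eff = cascade(p_eff, crossover)      # 0.1 -> 0.18 -> 0.244
    equivocations.append(renyi_bernoulli(p_eff))

print(equivocations)   # non-decreasing along the cascade
```

Since the effective crossover probability drifts toward 1/2 and the order-α entropy of a Bernoulli(p) is increasing in p on [0, 1/2], the sequence of equivocations is monotone non-decreasing, matching the inequality stated in the abstract for 0 < α < 1.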

18.
Airline traffic forecasting is important to airlines and regulatory authorities. This paper examines a number of approaches to forecasting short- to medium-term air traffic flows. It contributes as a rare replication, testing a variety of alternative modelling approaches. The econometric models employed include autoregressive distributed lag (ADL) models, time-varying parameter (TVP) models and an automatic method for econometric model specification. A vector autoregressive (VAR) model and various univariate alternatives are also included to deliver unconditional forecast comparisons. Various approaches for taking into account interactions between contemporaneous air traffic flows are examined, including pooled ADL models and the enhanced models with the addition of a “world trade” variable. Based on the analysis of a number of forecasting error measures, it is concluded, first, that pooled ADL models that include the “world trade” variable outperform the alternatives, in particular univariate methods; and, second, that automatic modelling procedures are enhanced through judgmental intervention. In contrast to earlier results, the TVP models do not improve accuracy. Depending on the preferred error measure, the difference in accuracy may be substantial.
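The workhorse ADL(1,1) specification can be sketched on synthetic data (toy series, not airline traffic; the OLS solver and coefficients are illustrative): y_t = c + phi·y_{t-1} + b0·x_t + b1·x_{t-1} + e_t, estimated by least squares and then used for a one-step-ahead forecast.

```python
import random

def ols(X, y):
    """Solve the normal equations X'X b = X'y by Gauss-Jordan elimination."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for i in range(k):
        piv = max(range(i, k), key=lambda r: abs(A[r][i]))  # partial pivoting
        A[i], A[piv] = A[piv], A[i]
        for r in range(k):
            if r != i:
                f = A[r][i] / A[i][i]
                A[r] = [a - f * b for a, b in zip(A[r], A[i])]
    return [A[i][k] / A[i][i] for i in range(k)]

random.seed(1)
x = [random.gauss(0, 1) for _ in range(201)]           # exogenous driver
y = [0.0]
for t in range(1, 201):   # true process: c = 1, phi = 0.5, b0 = 2, b1 = -1
    y.append(1 + 0.5 * y[t - 1] + 2 * x[t] - 1 * x[t - 1] + random.gauss(0, 0.1))

rows = [[1.0, y[t - 1], x[t], x[t - 1]] for t in range(1, 201)]
c, phi, b0, b1 = ols(rows, y[1:])

x_next = x[-1]   # placeholder: the contemporaneous regressor must itself be forecast
forecast = c + phi * y[-1] + b0 * x_next + b1 * x[-1]
print(round(phi, 2), round(b0, 2), round(b1, 2))
```

The recovered coefficients sit close to the true values; in the pooled variant of the paper, the same regression is stacked across routes, with a shared “world trade” regressor added to each row.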

19.
Selecting the most promising candidates to fill an open position can be a difficult task when there are many applicants. Each applicant achieves certain performance levels in various categories and the resulting information can be overwhelming. We demonstrate how data envelopment analysis (DEA) can be used as a fair screening and sorting tool to support the candidate selection and decision-making process. Each applicant is viewed as an entity with multiple achievements. Without any a priori preference or information on the multiple achievements, DEA identifies the non-dominated solutions, which, in our case, represent the “best” candidates. A DEA-aided recruiting process was developed that (1) determines the performance levels of the “best” candidates relative to other applicants; (2) evaluates the degree of excellence of “best” candidates’ performance; (3) forms consistent tradeoff information on multiple recruiting criteria among search committee members, and, then, (4) clusters the applicants.
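The screening idea can be sketched with a plain Pareto-dominance filter (full DEA scores each candidate by solving a linear program; this filter only reproduces the "keep the non-dominated applicants" step, and the applicants and scores below are made up): with no a priori weights on the achievement categories, keep every candidate whom no other applicant beats in all categories at once.

```python
def dominated(a, b):
    """True if candidate a is dominated by b: b is at least as good in
    every category and strictly better in at least one (higher is better)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def screen(candidates):
    """Return the non-dominated ("best") candidates."""
    return [name for name, a in candidates.items()
            if not any(dominated(a, b) for other, b in candidates.items()
                       if other != name)]

applicants = {                 # (publications, experience, interview score)
    "Ana":  (8, 3, 9),
    "Ben":  (5, 7, 6),
    "Caro": (4, 6, 5),         # dominated by Ben
    "Dee":  (8, 2, 9),         # dominated by Ana
}
print(screen(applicants))      # -> ['Ana', 'Ben']
```

Ana and Ben survive because each leads on a different criterion, while Caro and Dee are weakly worse than someone on every criterion; DEA would then go further and assign each survivor an efficiency score.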

20.
Job competition between workers has important implications for “downgrading” and “bumping down”. To account for these phenomena, a matching model is considered in which highly educated and poorly educated workers compete for skilled jobs. An exogenous increase in the proportion of highly skilled workers increases the proportion of these workers in low-level jobs (downgrading). Another of the paper's findings is that changes in the composition of the workforce affect workers' opportunities to accumulate experience. An increase in the relative supply of highly educated workers reduces the opportunities for poorly educated workers to learn on the job. Both education and experience are required in order to access skilled jobs.

