Similar Articles
20 similar articles found (search time: 31 ms)
1.
Due to the complexity of present-day supply chains, it is important to select the simplest supply chain scheduling decision support system (DSS) that will determine and place orders satisfactorily. We propose a generic design framework, termed the explicit filter methodology, to achieve this objective. In doing so we compare the explicit filter approach with the implicit filter approach utilised in previous OR research, the latter focusing on minimising a cost function. Although the eventual results may well be similar with both approaches, the explicit filter approach makes it much clearer to the designer both why and how an ordering system will reduce the Bullwhip effect. The “explicit filter” approach produces a range of DSS designs corresponding to best practice. These may be “mixed and matched” to generate a number of competitive delivery pipelines to suit the specific business scenario.
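The filtering intuition can be sketched numerically: a smoothed replenishment rule acts as a low-pass filter on demand, so order variance (the Bullwhip ratio) falls below demand variance. The rule, smoothing parameter, and demand series below are invented for illustration and are not the paper's actual DSS designs.

```python
# Illustrative sketch only: exponential smoothing of orders damps demand
# fluctuations, reducing the Bullwhip ratio (order variance / demand variance).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

demand = [100, 120, 80, 130, 70, 125, 75, 110, 90, 115, 85, 105]

# Naive replenishment: pass demand straight through as orders.
naive_orders = list(demand)

# Smoothed replenishment: order = alpha * demand + (1 - alpha) * previous order.
alpha = 0.3
smoothed_orders = [demand[0]]
for d in demand[1:]:
    smoothed_orders.append(alpha * d + (1 - alpha) * smoothed_orders[-1])

bullwhip_naive = variance(naive_orders) / variance(demand)      # exactly 1.0
bullwhip_smoothed = variance(smoothed_orders) / variance(demand)  # below 1.0
```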

2.
In a previous experiment, we showed that the risk assessments of purchasing experts are certainly no better than those of subjects untrained in purchasing, and worse than the decisions made by formal models (J. Purchas. Supply Manage. 9 (2003) 191–198). Since both of these results are rather counterintuitive, we conducted a series of experiments aimed at replicating and extending these findings. These new experiments show that our previous results are robust, and reveal an additional finding that is both worrying and puzzling: for the purchasing decision tasks in our experiments, experts actually perform worse with growing experience. It therefore seems that, at least for the kinds of purchasing decisions under study, it does not make much sense to use expert judgments at all. However, we show that expert judgments can be used in combination with formal models to improve the predictive accuracy of purchasing predictions. In our case, superior predictions are made when we combine the prediction of a formal model with the prediction of the ‘average expert’, thereby combining the robust linear trends encapsulated in the formal model with the more intuitive configural rules used by experts. We provide several explanations for this phenomenon.
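The combination scheme can be illustrated in a few lines: average the individual experts' judgments, then blend that 'average expert' with the formal model's prediction. The weights and risk numbers below are hypothetical, not those estimated in the study.

```python
# Hypothetical illustration of the model-plus-average-expert combination.

def combine(model_pred, expert_preds, w=0.5):
    """Weighted blend of a formal model's prediction with the mean expert judgment."""
    avg_expert = sum(expert_preds) / len(expert_preds)
    return w * model_pred + (1 - w) * avg_expert

model_pred = 0.30              # formal model's predicted risk (invented)
experts = [0.10, 0.50, 0.60]   # individual expert risk assessments (invented)
blended = combine(model_pred, experts)   # 0.5*0.30 + 0.5*0.40 = 0.35
```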

3.
We extend the strategic contract model in which the owner designs incentive schemes for her manager before the latter takes output decisions. First, we introduce private knowledge regarding costs within each owner–manager pair. Under adverse selection, we show that delegation involves a trade-off between strategic commitment and the cost of an extra informational rent linked to decentralization. Which policies will arise in equilibrium? We introduce an initial stage of the game in which owners can simultaneously choose between control and delegation. We show that if decision variables are strategic substitutes, choosing output control through a quantity-lump-sum-transfer contract is a dominating strategy; if decision variables are strategic complements, this policy is a dominated strategy. Further, two types of dominant-strategy equilibria may arise: in the first, both principals use delegation; in the second, both principals implement delegation for a low-cost manager and output control for a high-cost one. Copyright © 2005 John Wiley & Sons, Ltd.

4.
The aim of this paper is to study, within the framework of an ‘overlapping generations model’, the evolution of temporary equilibria. At date t, there are ‘newborn’ agents and ‘old’ agents who were born in previous periods; the old agents hold cash balances (fiat money) that they carried over from the previous period. At the beginning of period t, all agents receive a random endowment of consumption goods. The agents then exchange these endowments and money on spot markets at date t (trading in futures markets is not considered). Once a temporary equilibrium is reached, the economy moves to the next date. Agents who were born at date t then become old and meet agents born at period t+1. It is shown that the evolution of temporary equilibria in this model leads to an analysis of the ergodic properties of a certain class of Markov processes with stationary transition probabilities.
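The ergodicity question can be illustrated with a toy stationary Markov chain: long-run state frequencies converge to the stationary distribution. The two-state chain below is an invented stand-in for the equilibrium process the paper analyses.

```python
import random

# Toy ergodicity demo: a 2-state Markov chain with a stationary (time-invariant)
# transition matrix. Long-run visit frequencies approach the stationary
# distribution, which here solves pi0 * 0.1 = pi1 * 0.5, so pi0 = 5/6.

random.seed(42)
P = {0: [(0, 0.9), (1, 0.1)],   # transition probabilities from state 0
     1: [(0, 0.5), (1, 0.5)]}   # transition probabilities from state 1

def step(state):
    u, acc = random.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if u < acc:
            return nxt
    return nxt  # guard against floating-point rounding of the cumulative sum

state, visits = 0, [0, 0]
n = 100_000
for _ in range(n):
    state = step(state)
    visits[state] += 1

freq0 = visits[0] / n   # empirical long-run frequency of state 0
```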

5.
Annual data on U.S. hospitals for 1985–1988 are evaluated by ownership type—profit, nonprofit, state and local government, and U.S. Department of Veterans Affairs (VA)—for changes in hospital productivity over time. Distance functions are used to measure Malmquist indices of productivity change, which are then decomposed into indices of efficiency change and technology change. In contrast to previous studies using this approach, we allow for variable returns to scale and use both input and output orientations. We find that changes in technology dominate changes in inefficiency in determining changes in productivity.
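The decomposition referred to here follows the standard Malmquist construction: productivity change factors into efficiency change times technology change, computed from four distance-function values. A sketch with invented distances:

```python
import math

# Standard Malmquist decomposition (Fare et al. style). The distance values
# below are invented for illustration, not estimates from the hospital data.

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """d_a_b = distance of period-b data measured against period-a technology."""
    eff_change = d_t1_t1 / d_t_t
    tech_change = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return eff_change * tech_change, eff_change, tech_change

m, ec, tc = malmquist(d_t_t=0.8, d_t_t1=1.1, d_t1_t=0.7, d_t1_t1=0.9)
# m > 1 indicates productivity growth between t and t+1;
# m equals the geometric mean sqrt((d_t_t1/d_t_t) * (d_t1_t1/d_t1_t)).
```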

6.
We investigate the relationship between the knowledge requirements of projects and clients’ decisions whether to procure services from external management consultants for the execution of these projects. Using data from interviews with client decision-makers regarding the execution of 86 projects, we find that knowledge requirements are strongly associated with the decision whether or not to involve external consultants. The results highlight the closeness of the relationships between clients and consultants, supporting Kitay and Wright's [Kitay, J., & Wright, C. (2004). Take the money and run? Organisational boundaries and consultants’ roles, The Service Industries Journal, 24(3) 1–18] view of the permeability of the boundaries between many client organizations and their consultants. The findings also confirm our expectation that clients use the services of external management consultants in order to procure functional or industry-specific knowledge which consultants can pool and apply efficiently across many projects.

7.
Optimality criteria are derived for stochastic programs with convex objective and convex constraints. The problem consists in selecting a first-stage decision x1 ∈ R^{n1} (and a second-stage recourse decision) so as to satisfy the constraints and minimize total expected cost, where σ is a probability measure. The (basic) Kuhn–Tucker conditions are obtained in terms of conditions on the existence of saddle points of a Lagrangian associated with the stochastic program. We also give an interpretation of these results in terms of equilibrium theory, with particular emphasis on a nonstandard price system associated with the restriction that the (first-stage) decision x1 must be chosen independently of the random elements of the problem.
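A minimal two-stage example (a newsvendor-style toy, not the paper's general convex program) shows the structure: the first-stage decision x1 is fixed before the random element is realized, and expected cost is minimized over a finite set of scenarios.

```python
# Toy two-stage stochastic program: choose x1 before demand is known,
# minimizing first-stage cost plus expected recourse cost. All parameters
# are invented for the sketch.

scenarios = list(range(1, 11))            # equally likely demands 1..10
prob = 1 / len(scenarios)
c, q = 1.0, 4.0                           # first-stage cost, recourse penalty

def expected_cost(x1):
    return c * x1 + q * sum(prob * max(d - x1, 0) for d in scenarios)

best_x1 = min(range(0, 11), key=expected_cost)   # grid search over x1
# The optimum matches the critical-fractile rule: the smallest demand d
# with cumulative probability F(d) >= (q - c) / q = 0.75, i.e. x1 = 8.
```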

8.
We examine how the use of high‐frequency data impacts the portfolio optimization decision. Prior research has documented that an estimate of realized volatility is more precise when based upon intraday returns rather than daily returns. Using the framework of a professional investment manager who wishes to track the S&P 500 with the 30 Dow Jones Industrial Average stocks, we find that the benefits of using high‐frequency data depend upon the rebalancing frequency and estimation horizon. If the portfolio is rebalanced monthly and the manager has access to at least the previous 12 months of data, daily data have the potential to perform as well as high‐frequency data. However, substantial improvements in the portfolio optimization decision from high‐frequency data are realized if the manager rebalances daily or has less than a 6‐month estimation window. These findings are robust to transaction costs. Copyright © 2009 John Wiley & Sons, Ltd.
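The realized-volatility estimator the abstract refers to can be sketched directly: sum the squared intraday returns rather than squaring the single daily return. The returns below are invented for illustration.

```python
import math

# Realized variance from intraday log-returns (invented numbers): summing
# squared intraday returns gives a less noisy daily-variance estimate than
# the single squared daily return.

intraday = [0.002, -0.001, 0.003, -0.002, 0.001, 0.0015, -0.0005, 0.002]

realized_var = sum(r * r for r in intraday)   # realized variance
realized_vol = math.sqrt(realized_var)        # realized volatility

daily_return = sum(intraday)                  # log-returns aggregate by summing
naive_var = daily_return ** 2                 # one-observation variance estimate
```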

9.
This paper investigates behavioral factors influencing a supply manager's decision to insource or outsource the manufacture of a product component. To do so we posit a theoretical framework that integrates the heretofore distinct operational make–buy literature and the behavioral decision-making literature. Within the framework, three factors influencing the make–buy decision are brought into account: the decision-maker's perception of supply risk or “strategic vulnerability”, the degree of core competency represented by the product component under consideration, and the formality of the information about supply alternatives. The results of a controlled experimental survey show that strategic vulnerability and core competency do influence the make–buy decision; that strategic vulnerability has greater influence than core competency; and that information formality moderates the make–buy decision when the strategic vulnerability and core competency conditions are mixed. The practical implications of these results include the notion that management can ensure a more rational make–buy decision if they understand the biases that influence the decision and point these biases out to the decision maker.

10.
Various rational and behavioral models have been proposed to explain contrarian portfolio returns. In this article, I test the gradual information diffusion model of Hong and Stein [Hong, H., & Stein J. C. (1999). A unified theory of underreaction, momentum trading, and overreaction in asset markets. Journal of Finance, 54, 2143–2184]. Specifically, I study contrarian strategies based on past long-term returns and fundamental value-to-price ratios. Using ex post returns as a proxy for expected returns and size-controlled analyst coverage as a proxy for the rate of information diffusion, I show that contrarian portfolio returns decline monotonically with increasing rates of information diffusion. These results are consistent with the predictions of the Hong and Stein model. In addition, I show that analyst coverage is more important among glamour than value stocks, supporting the view that investors are more prone to decision biases when it comes to pricing hard-to-value glamour stocks for which information is relatively more ambiguous.

11.
In most democracies, at least two out of any three individuals vote for the same party in sequential elections. This paper presents a model in which vote persistence is partly due to the dependence of utility on the previous voting decision. This dependence is termed ‘habit formation’. The model and its implications are supported by individual-level panel data on the presidential elections in the USA in 1972 and 1976. For example, it is found that the voting probability is a function of the lagged choice variable, even when the endogeneity of the lagged variable is accounted for, and that the tendency to vote for different parties in sequential elections decreases with the age of the voter. Furthermore, the effect of habit is estimated using structural estimation, while allowing for unobserved differences among respondents. The structural habit parameter implies that the effect of previous votes on the current decision is quite strong. The habit model fits the data better than the traditional ‘party identification’ model. Copyright © 2003 John Wiley & Sons, Ltd.

12.
Technical and scale efficiencies in Data Envelopment Analysis are associated with a two-dimensional section (a convex set) representing the amounts by which the input and output vectors of a reference decision-making unit may be scaled and still lie in the production possibility set. We describe a simple algorithm, closely resembling the simplex algorithm of linear programming, to traverse the boundary of this set. Given the output of our algorithm, the scalar efficiency measures and returns-to-scale characterization are trivially determined. Moreover, the set may be graphically displayed for any problem in any number of dimensions with only a minimum of additional computing effort.
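For intuition about the efficiency scores involved, here is a toy constant-returns example with one input and one output; the general case treated in the paper requires a linear program, but in this special case efficiency reduces to a unit's output/input ratio relative to the best ratio observed. All data are invented.

```python
# Toy single-input, single-output efficiency scores under constant returns
# to scale: the frontier is the steepest output/input ray through the origin.

units = {"A": (2.0, 4.0), "B": (3.0, 9.0), "C": (5.0, 10.0)}  # (input, output)

best_ratio = max(y / x for x, y in units.values())            # frontier slope
efficiency = {name: (y / x) / best_ratio for name, (x, y) in units.items()}
# Unit B defines the frontier (ratio 3.0), so its efficiency is exactly 1.0;
# A and C each achieve ratio 2.0, i.e. efficiency 2/3.
```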

13.
In human resource management (HRM) and allied fields (e.g., organizational behavior, management, and industrial and organizational psychology), tests of mediation are frequently conducted using the hierarchical multiple regression (HMR) strategy of Baron and Kenny [Baron, R. M., & Kenny, D. A. (1986). The moderator–mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51, 1173–1182]. Although previous research has identified a number of serious problems with this approach, the present study adds to the literature by identifying yet additional problems with its use in inferring the existence of mediation. Using a statistical simulation, we found that certain patterns of correlation coefficients guarantee inferences about mediation, whereas other patterns preclude such inferences. On the basis of various analyses, including logistic regression and inspection of three-dimensional plots, we identified the patterns of correlation coefficients needed to satisfy Baron and Kenny's (1986) conditions for inferring mediation. The same patterns have no necessary relation to actual causal connections among variables in mediation models. Moreover, as a consequence of the failure of the HMR strategy to detect mediating effects, many instances of actual mediation in HRM and allied fields may have gone undetected. In view of the foregoing, we conclude that the HMR strategy should no longer be used in testing for mediation.
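The three Baron and Kenny regressions can be reproduced in a small simulation with true mediation built in (sample size, coefficients, and seed all invented): regress Y on X, M on X, and Y on X and M jointly.

```python
import random

# Simulated full mediation X -> M -> Y, then the three Baron-Kenny
# regressions via ordinary least squares on centered data.

random.seed(1)
n = 5000
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.6 * x + random.gauss(0, 1) for x in X]   # true a = 0.6
Y = [0.5 * m + random.gauss(0, 1) for m in M]   # true b = 0.5, no direct path

def center(v):
    mu = sum(v) / len(v)
    return [x - mu for x in v]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x, m, y = center(X), center(M), center(Y)

c_total = dot(x, y) / dot(x, x)   # step 1: Y on X (total effect ~ a*b = 0.30)
a_hat = dot(x, m) / dot(x, x)     # step 2: M on X
# step 3: Y on X and M jointly (solve the 2x2 normal equations)
det = dot(x, x) * dot(m, m) - dot(x, m) ** 2
c_direct = (dot(x, y) * dot(m, m) - dot(m, y) * dot(x, m)) / det  # ~ 0
b_hat = (dot(m, y) * dot(x, x) - dot(x, y) * dot(x, m)) / det     # ~ 0.5
```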

14.
Are productivity shocks the only driving force of international business fluctuations? In this paper I argue that another source of uncertainty, changes in market expectations or ‘sunspots’, is also important. One major shortcoming of existing IRBC models is the ‘cross-country correlation puzzle’: models tend to generate cross-country consumption correlations that are too high, and output, investment and employment correlations that are too low, when compared to the data. I show that with an empirically supported level of increasing returns, an otherwise standard model possesses multiple, indeterminate convergent paths to the steady state, which allow sunspots to influence the economy. The model displays time-series properties that in many ways match the data better than the conventional model. It is especially successful in generating realistic consumption and output correlations.

15.
We consider an n-good model of optimal accumulation determined by a technology, a utility function, and a discount factor. A technology is δ-productive if it contains an input-output pair such that the discounted output vector strictly dominates the input vector. We show that a δ-productive technology has a non-trivial modified golden rule. We also report a counterexample of David Starrett showing that the modified golden rule need not have turnpike properties, and that there may exist non-trivial periodic optimal consumption plans even when there is no non-trivial modified golden rule.

16.
In a previous paper (“Land Use in a Circular City”, Journal of Economic Theory, 1974), I considered efficient land use and travel patterns in a circular city consisting of a homogeneous economic activity and a network of radial and circumferential roads. My analysis assumed that under decentralized optimum conditions, the price of traveling circumferentially through a radian would increase with distance from the city center. Under this and a second pricing assumption, an optimum would involve either restricting inward trip penetration or providing travelers with an inner ring road. This paper provides numerical illustrations of the optimum when trip penetration is restricted. The results suggest that the underlying pricing assumptions are likely not valid.

17.
A fuzzy-QFD approach to supplier selection   (total citations: 5; self-citations: 0; citations by others: 5)
This article suggests a new method that transfers the house of quality (HOQ) approach, typical of quality function deployment (QFD) problems, to the supplier selection process. To test its efficacy, the method is applied to a supplier selection process for a medium-to-large industry that manufactures complete clutch couplings. The study starts by identifying the features that the purchased product should have (internal variables “WHAT”) in order to satisfy the company's needs; it then seeks to establish the relevant supplier assessment criteria (external variables “HOW”) in order to come up with a final ranking based on the fuzzy suitability index (FSI). The whole procedure was implemented using fuzzy numbers; the application of a fuzzy algorithm allowed the company to define by means of linguistic variables the relative importance of the “WHAT”, the “HOW”-“WHAT” correlation scores, the resulting weights of the “HOW”, and the impact of each potential supplier. Special attention is paid to the various subjective assessments in the HOQ process, and symmetrical triangular fuzzy numbers are suggested to capture the vagueness in people's verbal assessments.
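The triangular-fuzzy arithmetic behind an FSI-style ranking can be sketched as follows. The weights and linguistic ratings are invented, and the centroid defuzzification is one common choice, not necessarily the article's exact algorithm.

```python
# Triangular fuzzy numbers (TFNs) represented as (low, mid, high) tuples:
# weight each supplier's ratings by criterion importance, sum, defuzzify, rank.

def tfn_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def tfn_mul(a, b):
    # Component-wise product; adequate for positive TFNs.
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def defuzzify(a):
    return sum(a) / 3.0   # centroid of a triangular fuzzy number

weights = [(0.5, 0.7, 0.9), (0.1, 0.3, 0.5)]   # importance of two "HOW" criteria
suppliers = {
    "S1": [(5, 7, 9), (3, 5, 7)],               # linguistic ratings per criterion
    "S2": [(3, 5, 7), (7, 9, 10)],
}

fsi = {}
for name, ratings in suppliers.items():
    total = (0.0, 0.0, 0.0)
    for w, r in zip(weights, ratings):
        total = tfn_add(total, tfn_mul(w, r))
    fsi[name] = defuzzify(total)

best = max(fsi, key=fsi.get)   # supplier with the highest suitability index
```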

18.
In this paper a system of competing firms is considered in which adjustments of output are subject to delays. Stability of the oligopoly problem is considered under the Cournot strategy. If each firm calculates its optimal output based on knowledge of its own production at that time and of its competitors' outputs at a previous time, stability is not affected by the information delays. However, if all the information available to each firm is subject to a delay, then stability is affected, and the likelihood of stability increases with decreasing delays.
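The delayed best-response dynamics can be simulated for a linear Cournot duopoly (all parameters invented): each firm responds to rival output observed several periods earlier, and in this linear case the process still converges to the Cournot equilibrium.

```python
# Linear Cournot duopoly with delayed rival information: inverse demand
# p = a - b*(q1 + q2), marginal cost c, best response (a - c - b*q_rival)/(2b).

a, b, c = 10.0, 1.0, 1.0
delay = 2   # each firm observes its rival's output `delay` periods late

def best_response(q_rival):
    return max((a - c - b * q_rival) / (2 * b), 0.0)

hist1, hist2 = [4.0] * (delay + 1), [1.0] * (delay + 1)   # initial output paths
for _ in range(200):
    q1_next = best_response(hist2[-1 - delay])
    q2_next = best_response(hist1[-1 - delay])
    hist1.append(q1_next)
    hist2.append(q2_next)

q_star = (a - c) / (3 * b)   # Cournot equilibrium output = 3.0 per firm
```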

19.
In the early 1980s, the Japanese government decided to pursue policies of liberalization that opened Japan's telecommunication market to competition and moved toward the privatization of Nippon Telegraph and Telephone (NTT), the dominant domestic telecommunication service provider. The purpose of this paper is to assess the effects of liberalization on the productive performance of NTT. To assess the productive consequences of liberalization, we first provide basic productivity measures for NTT. The Total Factor Productivity (TFP) measures are then decomposed to separate out the effects of liberalization from other factors such as scale, technology, and capacity utilization. The TFP decomposition is based on the parameter estimates of a dynamic cost model. During the 1958–87 period, NTT's TFP level increased at an average annual rate of 3.4%. However, TFP improved at a significantly faster rate following the decision to adopt policies of liberalization. NTT's average annual TFP growth rate was 5.12% for the 1982–87 period, as compared to 0.26% per year for the previous five-year (1977–82) period. The decomposition of TFP growth indicates that liberalization was a major source of productivity improvement for NTT.
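The growth-rate arithmetic in the abstract is simple compounding: the average annual rate is recovered from TFP index levels at a period's endpoints. The index levels below are invented to reproduce rates like those reported.

```python
# Average annual growth from index levels: (end/start)^(1/years) - 1.

def avg_annual_growth(level_start, level_end, years):
    return (level_end / level_start) ** (1.0 / years) - 1.0

# Invented TFP index levels constructed to match the reported growth rates:
tfp_1977 = 100.0
tfp_1982 = tfp_1977 * 1.0026 ** 5   # ~0.26%/yr before liberalization
tfp_1987 = tfp_1982 * 1.0512 ** 5   # ~5.12%/yr after liberalization

g_pre = avg_annual_growth(tfp_1977, tfp_1982, 5)
g_post = avg_annual_growth(tfp_1982, tfp_1987, 5)
```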

20.
How do individuals' spatial decisions affect the institutions for public goods provision over time? This paper describes a dynamic model in which the provision mechanism for a public project is itself the object of individuals' locational choice. Individuals in an ongoing society must choose between a location with a Majority Rule mechanism and one with a Voluntary Contribution mechanism. Each mechanism determines a funding decision for a local public project which is repeated over time. Generations of individuals asynchronously supersede their ‘parents’, creating an entry/exit process that allows individuals with possibly different beliefs to enter society. A self-confirming equilibrium (SCE) belief process describes an evolution of beliefs in this society consistent with a self-confirming equilibrium (Fudenberg and Levine, 1993) of the repeated location/provision game. It is shown that the process with belief mutation as new individuals enter society results in a globally absorbing state in which the Majority Rule mechanism is the unique survivor of the two.
