Similar documents
20 similar documents found (search time: 203 ms)
1.
A continuity axiom for bargaining solutions is introduced, which is satisfied by all Pareto optimal and continuous (in the Hausdorff metric) solutions. It is shown by two examples how this axiom can be used to characterize solutions having certain kinds of monotonicity properties. One of the solutions is the lexicographic maximin solution. The other is the lexicographic extension of the Kalai-Smorodinsky solution. The former is an efficient (Pareto optimal) extension of the symmetric proportional solution. The latter is an efficient extension of the Kalai-Smorodinsky solution.
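As a concrete illustration of the lexicographic maximin idea, the following is a minimal sketch (not taken from the paper): it works on a finite, discretized set of utility vectors rather than on a full bargaining set, and the example points are hypothetical.

# Leximin sketch: the lexicographic maximin solution picks the utility vector
# whose coordinates, sorted in ascending order, are lexicographically largest.

def leximin_key(utilities):
    """Sort utilities ascending; Python tuples then compare lexicographically."""
    return tuple(sorted(utilities))

def leximin_solution(feasible_points):
    """Return the leximin-maximal point among a finite set of utility vectors."""
    return max(feasible_points, key=leximin_key)

if __name__ == "__main__":
    # Hypothetical 2-player feasible utility vectors.
    points = [(1.0, 3.0), (2.0, 2.0), (2.5, 1.5), (1.9, 2.2)]
    print(leximin_solution(points))   # (2.0, 2.0): its worst coordinate is largest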

2.
We study environments where a production process is jointly shared by a finite group of agents. The social decision involves the determination of input contributions and output distribution. For environments with decreasing returns to scale, we define a competitive solution, which leads to a Pareto optimal outcome. Since there is a finite number of agents, the competitive solution is prone to manipulation. We construct a mechanism for which the set of Nash equilibria coincides with the set of competitive solution outcomes. We also define a marginal-cost-pricing equilibrium (MCPE) solution for environments with increasing returns to scale; these solutions are Pareto optimal under certain conditions. We construct another mechanism that realizes the MCPE.

3.
Though “teams” are supposed to work together for the benefit of the firm, suboptimal outcomes may emerge when individuals within a team are more concerned with their own status and outcomes relative to their “teammates,” behaving as if they were competitors. Using a version of the stag hunt coordination game, we develop hypotheses regarding the roles of status and competitiveness in coordination on Pareto optimal solutions. We test these hypotheses using three studies, with manipulations for both role and status. Status is found to play a significant role, resulting in suboptimal outcomes for competitors but not teammates.
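For readers unfamiliar with the game, a standard stag hunt payoff bimatrix is sketched below; the payoff numbers are illustrative assumptions, not the parameters used in the studies.

% Stag hunt bimatrix; entries are (row payoff, column payoff).
\[
\begin{array}{c|cc}
 & \text{Stag} & \text{Hare} \\ \hline
\text{Stag} & (4,4) & (0,3) \\
\text{Hare} & (3,0) & (3,3)
\end{array}
\]

With these payoffs, both (Stag, Stag) and (Hare, Hare) are pure Nash equilibria, and (Stag, Stag) Pareto dominates; coordination failure corresponds to settling on the safer, risk-dominant (Hare, Hare) outcome.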

4.
Elia Werczberger, Socio, 1981, 15(6): 331-339.
This paper is concerned with multi-objective linear programming problems in which the objective functions can be partially ranked. We represent the set of admissible weight vectors by a system of linear constraints and solve for the policy most likely to be optimal. If each admissible weight vector has the same probability of being correct, the optimal policy maximizes the hypervolume of the polytope of weight vectors having this policy as a solution. The proposed algorithm requires the enumeration of the subset of admissible efficient solutions of a multi-objective linear program. For each admissible solution, we estimate the ratio of the volume of the corresponding polytope of weight vectors to the volume of the polytope of all admissible weight vectors. An algorithm is outlined for numerical integration using the Monte Carlo method. The model is extended to the case where several objectives are expressed as linear constraints with multiple parameter vectors and there is uncertainty about the weighting of these parameters. A numerical example is provided.
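A minimal Monte Carlo sketch of the volume-ratio estimate follows; the objective values and the partial ranking w1 >= w2 >= w3 are hypothetical, and the code illustrates the idea rather than reproducing the paper's algorithm.

# Estimate, for each efficient solution, the share of admissible weight vectors
# under which it maximizes the weighted sum of objectives.
import random

def sample_simplex(k):
    """Uniform sample from the k-dimensional probability simplex."""
    cuts = sorted(random.random() for _ in range(k - 1))
    points = [0.0] + cuts + [1.0]
    return [points[i + 1] - points[i] for i in range(k)]

def winner(weights, alternatives):
    """Index of the alternative maximizing the weighted objective sum."""
    scores = [sum(w * z for w, z in zip(weights, alt)) for alt in alternatives]
    return max(range(len(alternatives)), key=scores.__getitem__)

def volume_shares(alternatives, admissible, n_samples=100_000):
    """Rejection sampling; assumes at least one sample satisfies the constraints."""
    counts = [0] * len(alternatives)
    kept = 0
    for _ in range(n_samples):
        w = sample_simplex(len(alternatives[0]))
        if admissible(w):               # keep only weights obeying the partial ranking
            counts[winner(w, alternatives)] += 1
            kept += 1
    return [c / kept for c in counts]

if __name__ == "__main__":
    # Three efficient solutions, each scored on three objectives (hypothetical).
    efficient = [(5, 1, 1), (3, 3, 2), (1, 2, 5)]
    admissible = lambda w: w[0] >= w[1] >= w[2]
    print(volume_shares(efficient, admissible))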

5.
A rather general class of strategic games is described where the coalitional improvements are acyclic and hence strong Nash equilibria exist: the players derive their utilities from the use of certain facilities; all players using a facility extract the same amount of local utility therefrom, which amount depends both on the set of users and on their actions, and is decreasing in the set of users; the ultimate utility of each player is the minimum of the local utilities at all relevant facilities. Two important subclasses are “games with structured utilities,” basic properties of which were discovered in the 1970s and 1980s, and “bottleneck congestion games,” which attracted researchers’ attention quite recently. The former games are representative in the sense that every game from the whole class is isomorphic to one of them. The minimum aggregation is shown to be necessary for the existence of strong Nash equilibria (indeed, even of Pareto optimal Nash equilibria) in all games of this type.

6.
This study was prompted by a recently published article in this journal on facility location by D. R. Sule. We show that the claim made by Sule of a novel and extremely simple algorithm yielding optimum solutions is not true. Otherwise, the algorithm would represent a breakthrough in decision-making for which a number of notoriously hard problems could be efficiently recast as location problems and easily solved.

In addition, several variants of common location problems addressed by Sule are reviewed. In the last twenty years, many methods of accurately solving these problems have been proposed. In spite of their increased sophistication and efficiency, none of them claims to be a panacea. Therefore, researchers have concurrently developed a battery of fast but approximate solution techniques. Sule's method was essentially proposed by Kuehn and Hamburger in 1963, and has been adapted many times since.

We exhibit several examples (including the one employed by Sule) in which Sule's algorithms lead to nonoptimal solutions. We present computational results on problems of size even greater than those utilized by Sule, and show that a method devised by Erlenkotter is both faster and yields better results.

In a cost-benefit analysis of exact and approximate methods, we conclude that planning consists of generating an array of corporate scenarios, submitting them to the “optimizing black box,” and evaluating their respective merits. Therefore, much is to be gained by eliminating the vagaries of the black box—that is, by using an exact method—even if the data collection and the model representation introduce sizable inaccuracies. Ironically, large problems (those that typically require most attention) cannot be solved exactly in acceptable computational times. Pending the imminent design of a new generation of exact algorithms, the best heuristics are those that guarantee a certain degree of accuracy of their solutions.
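For reference, an "add" heuristic in the spirit of Kuehn and Hamburger can be sketched in a few lines; the data below are hypothetical, and this illustrates the heuristic idea only, not Sule's procedure or Erlenkotter's exact method.

# Greedy "add" heuristic for the uncapacitated facility location problem.

def greedy_add(fixed_cost, assign_cost):
    """fixed_cost[j]: cost of opening site j; assign_cost[i][j]: cost of serving
    customer i from site j. Returns (open sites, total cost)."""
    n_cust = len(assign_cost)
    sites = range(len(fixed_cost))
    open_sites = set()

    def total_cost(opened):
        if not opened:
            return float("inf")
        serve = sum(min(assign_cost[i][j] for j in opened) for i in range(n_cust))
        return serve + sum(fixed_cost[j] for j in opened)

    while True:
        best_j, best_c = None, total_cost(open_sites)
        for j in sites:
            if j not in open_sites:
                c = total_cost(open_sites | {j})
                if c < best_c:
                    best_j, best_c = j, c
        if best_j is None:          # no single addition improves the cost: stop
            return open_sites, total_cost(open_sites)
        open_sites.add(best_j)

if __name__ == "__main__":
    f = [12, 10, 15]                                    # opening costs
    c = [[2, 9, 8], [7, 3, 6], [9, 6, 2], [4, 8, 7]]    # customer x site costs
    print(greedy_add(f, c))

Because each step adds only the single site with the largest immediate saving, heuristics of this kind can terminate at nonoptimal configurations, which is exactly the behavior documented above for Sule's algorithms.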

7.
The use of equations to describe agent-based model dynamics allows access to mathematical theory that is not otherwise available. In particular, equation models can be effective at solving optimization problems—that is, problems concerning how an agent-based model can be most effectively steered into a particular state. In order to illustrate this strategy, we describe a modified version of the well-known SugarScape model and implement taxation. The optimization problem is to determine tax structures that minimize deaths but maximize tax income. Tax rates are dependent upon the amount of sugar available in a particular region; the rates change over time. A system of discrete difference equations is built to capture agent-based model dynamics. The equations are shown to capture the dynamics very well both with and without taxation. A multi-objective optimization technique known as Pareto optimization is then used to solve the problem. Rather than collapsing the two objectives into a single weighted cost function, Pareto optimization determines a suite of solutions, each of which is optimal depending on the priorities of the researcher. In this case, Pareto optimization allows analysis of the tradeoff between taxes collected and deaths caused by taxation. The strategies contained here serve as a framework for a broad class of models.
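The Pareto-filtering step can be illustrated with a short sketch; the candidate tax policies and their objective values below are hypothetical, not output of the SugarScape equations.

# Extract the Pareto-optimal set from candidates scored on two objectives:
# tax income (maximize) and deaths (minimize).

def pareto_front(candidates):
    """candidates: list of (tax_income, deaths). A candidate is Pareto optimal
    if no distinct candidate has >= income and <= deaths."""
    front = []
    for a in candidates:
        dominated = any(b[0] >= a[0] and b[1] <= a[1] and b != a for b in candidates)
        if not dominated:
            front.append(a)
    return front

if __name__ == "__main__":
    policies = [(100, 12), (120, 15), (90, 9), (120, 11), (80, 9)]
    print(pareto_front(policies))   # (120, 11) dominates (120, 15) and (100, 12)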

8.
In financial markets, different investors have different attitudes or preferences regarding investment policies and reinsurance problems. For investors with different investment utilities, providing an optimal investment strategy is not only a very hard problem but also an urgent one. In this paper, we derive an analytical solution for the optimal allocation problem of investment-reinsurance with a general-form utility function. The general utility function allows for a varying relative risk aversion coefficient, which is an important feature in finance theory. However, obtaining analytical solutions for general utility functions has been difficult or impossible. The solution presented in this paper is constructed through the homotopy analysis method (HAM) and written in the form of a Taylor series expansion. The fully nonlinear Hamilton–Jacobi–Bellman (HJB) equation is decomposed into an infinite series of linear PDEs, which can be solved analytically. In the end, three examples are presented to illustrate the convergence and accuracy of the method; they also demonstrate that investors with different risk preferences have different investment-reinsurance strategies.
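A schematic of the HAM decomposition is given below in generic notation; the operators, initial guess, and convergence-control parameter are assumptions for illustration rather than the paper's exact construction.

% HAM zeroth-order deformation equation for a nonlinear HJB problem N[V] = 0,
% with auxiliary linear operator L, initial guess V_0, embedding parameter q,
% and convergence-control parameter hbar:
\[
(1-q)\,\mathcal{L}\!\left[\Phi(t,x;q) - V_0(t,x)\right] = q\,\hbar\,\mathcal{N}\!\left[\Phi(t,x;q)\right],
\qquad q \in [0,1].
\]
% Expanding Phi in q gives the series solution
\[
V(t,x) = V_0(t,x) + \sum_{m=1}^{\infty} V_m(t,x),
\qquad
V_m(t,x) = \frac{1}{m!}\left.\frac{\partial^m \Phi(t,x;q)}{\partial q^m}\right|_{q=0},
\]
% where each V_m satisfies a linear PDE (the m-th order deformation equation),
% which is the sense in which the nonlinear HJB equation is decomposed into an
% infinite series of linear problems.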

9.
In this paper, we study the structure of optimal contracts in the banking system when there is no risk of moral hazard. We consider a risk management problem under a policy that reduces excessive risk-taking behavior by making all banks bear part of the risk that they transfer to other parties in the market. First, we characterize the optimal solutions to the risk management problem, and, second, we find a necessary and sufficient condition under which the “risk of the tail events” will not be transferred. In particular, we study the problem using two known risk measures, value at risk and conditional value at risk, and show that in these cases, the optimal solutions are in the form of stop-loss policies.
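The two risk measures and the effect of a stop-loss retention can be illustrated with a small empirical sketch; the loss distribution and retention level are hypothetical and unrelated to the paper's model.

# Empirical VaR and CVaR of a loss sample, before and after a stop-loss
# contract that caps the retained loss at retention level d.
import random

def var(losses, alpha=0.95):
    """Empirical VaR: the alpha-quantile of the loss sample."""
    s = sorted(losses)
    return s[int(alpha * len(s)) - 1]

def cvar(losses, alpha=0.95):
    """Empirical CVaR: average of losses at or above the VaR level."""
    v = var(losses, alpha)
    tail = [x for x in losses if x >= v]
    return sum(tail) / len(tail)

if __name__ == "__main__":
    random.seed(0)
    losses = [random.lognormvariate(0.0, 1.0) for _ in range(100_000)]
    d = 3.0                                   # retention level of the stop-loss policy
    retained = [min(x, d) for x in losses]    # ceded part is max(x - d, 0)
    for name, sample in [("full loss", losses), ("retained under stop-loss", retained)]:
        print(f"{name}: VaR95 = {var(sample):.3f}, CVaR95 = {cvar(sample):.3f}")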

10.
This article applies a variant of game theory to the Pareto multi-value problematique, that is, situations where members of a group, community or society are faced with alternative allocations, institutional arrangements, or states of the world and may collectively choose an allocation, institutional arrangement or state of the world if they can agree on it. This type of multiple value decision situation is increasingly prevalent not only at the level of societal and political issues but also at the level of many enterprises, particularly those advocating corporate social responsibility. Because actors hold and apply values from different perspectives, there are potentially contradictory value judgments and incompatible equilibria. In a world of contradiction, incommensurability, and disequilibrium, to what extent can conflicts be resolved and social equilibrium accomplished? Force works, but it is inherently unstable. Drawing on an extension of classical game theory, generalized game theory (GGT), this article addresses the multi-value problematique in terms of collective “resolution procedures.” These regulative procedures—or social algorithms—are applied to problems of conflict and suboptimality in a multiple value world such as Pareto envisioned. This paper (the first of two) outlines key elements of GGT, defines the Pareto multi-value problematique, and points out several of the critical weaknesses, theoretical as well as empirical, of the Pareto approach. GGT is then applied in defining and analyzing several major procedures to realize improvements in a multi-value world characterized by conflict and suboptimality. A second article conceptualizes a complex of societal games making up a social system with 2-phase multi-level game processes; it applies the conceptualization to the different societal procedures for multi-value choice under conditions of conflict. Procedures such as democratic voting, adjudication and administrative decision-making, and multi-lateral negotiation are capable of producing outcomes that in many cases are widely accepted as legitimate and become social equilibria (at least within some range of conditions). These procedures and the conditions for their activation and implementation are modelled and explicated through a generalized game approach.

11.
In the last decade, a number of models for the dynamic facility location problem have been proposed. The various models contain differing assumptions regarding the revenues and costs realized in the opening, operation, and closure of a facility as well as considering which of the facility sites are candidates for acquisition or disposal at the beginning of a time period. Since the problem becomes extremely large for practical applications, much of the research has been directed toward developing efficient solution techniques. Most of the models and solutions assume that the facilities will be disposed of at the end of the time horizon since distant future conditions usually cannot be forecast with any reasonable degree of accuracy. The problem with this approach is that the “optimal” solution is optimal for only one hypothesized post-horizon facility configuration and may become nonoptimal under a different configuration. Post-optimality analysis is needed to assure management that the “optimal” decision to open or close a facility at a given point in time won't prove to be “nonoptimal” when the planning horizon is extended or when design parameters in subsequent time periods change. If management has some guarantee that the decision to open or close a facility in a given time period won't change, it can safely direct attention to the accuracy of the design parameters within that time period.

This paper proposes a mixed integer linear programming model to determine which of a finite set of warehouse sites will be operating in each time period of a finite planning horizon. The model is general in the sense that it can reflect a number of acquisition alternatives—purchase, lease or rent. The principal assumptions of the model are:
a) Warehouses are assumed to have infinite capacity in meeting customer demand.
b) In each time period, any non-operating warehouse is a candidate for becoming operational, and likewise any operating warehouse is a candidate for disposal.
c) During a given time period, the fixed costs of becoming operational at a site are greater than the disposal value at that site, to reflect the nonrecoverable costs involved in operating a warehouse. These costs are separate from the acquisition and liquidation values of the site.
d) During a time period, the operation of a warehouse incurs overhead and maintenance costs as well as a depreciation in the disposal value.

To solve the model, it is first simplified and a partial optimal solution is obtained by the iterative examination of both lower and upper bounds on the savings realized if a site is opened in a given time period. An attempt is made to fix each warehouse open or closed in each time period. The bounds are based on the delta and omega tests proposed by Efroymson and Ray (1966) and Khumawala (1972) with adjustment for changes in the value of the warehouse between the beginning and end of a time period. A complete optimal solution is obtained by solving the reduced model with Benders' decomposition procedure. The optimal solution is then tested to determine which time periods contain “tentative” decisions that may be affected by post-horizon data, by analyzing the relationships involving the lower (or upper) bounds used in the model simplification for that time period. If the warehouse decisions made in a time period satisfy these relationships and are thus unaffected by data changes in subsequent time periods, then the decisions made in earlier time periods will also be unaffected by future changes.
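A stylized version of such a multi-period open/close formulation is sketched below; the notation is assumed for illustration and omits the acquisition/liquidation detail and the bounding tests of the actual model.

% y_{jt} = 1 if site j operates in period t; x_{ijt} = fraction of customer i's
% demand served from site j in period t; f_{jt}, m_{jt}, s_{jt}, c_{ijt} are the
% opening cost, operating overhead, disposal value, and service cost.
\[
\min \sum_{t}\sum_{j}\Bigl( f_{jt}\,[y_{jt} - y_{j,t-1}]^{+} + m_{jt}\,y_{jt} - s_{jt}\,[y_{j,t-1} - y_{jt}]^{+} \Bigr)
      + \sum_{t}\sum_{i}\sum_{j} c_{ijt}\,x_{ijt}
\]
\[
\text{s.t.}\quad \sum_{j} x_{ijt} = 1 \ \ \forall i,t; \qquad
x_{ijt} \le y_{jt} \ \ \forall i,j,t; \qquad
y_{jt} \in \{0,1\},\ x_{ijt} \ge 0,
\]
% where [z]^+ = max(z, 0) marks openings and closings and is linearized with
% auxiliary binary open/close indicator variables in a full MIP implementation.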

12.

Establishing quality is one of the most important elements of higher education around the world. Even in countries where higher education systems perform very successfully, quality remains under ongoing observation. Barrier-free campuses are among the most crucial criteria in quality development and evaluation. While such campuses are strictly required in American and European higher education, awareness of this issue has also become a focal point in Turkey, and the necessary measures are being taken. It is a fact that learners with disabilities are able to benefit from educational services less than others, even though all individuals in a community have equal rights. Therefore, higher education has great responsibilities for “raising awareness”, “adapting standards by the community”, “understanding”, and “meeting the needs” of learners with disabilities. People with disabilities in our country face several problems in areas such as education, employment, transportation and communication. Parallel to developing technology, there are solutions to such problems. Universities are institutions where the scientists, educationalists, doctors, lawyers, engineers, politicians, etc. who shape the future of a country are educated. Young people learn at universities, broaden their horizons, produce new ideas, and shape their futures. At this stage, undoubtedly, universities are not expected to ignore learners with disabilities. They, too, have the right, as other individuals do, to receive education in their areas of interest, to contribute to production, to benefit from all the facilities provided by the university, and to be active members of society.


13.
This paper summarizes an actual multi-item periodic-review inventory control study conducted in a developing country. The firms of developing countries are usually less fortunate than their counterparts in industrialized countries in terms of management information systems, personnel and other infrastructure needed to successfully implement the results of research studies. Therefore, the management scientists of these countries are more obliged to formulate managerial problems in such a way that the “decision-framing” models thus developed require the least estimating and updating effort possible, yet sacrifice little of the optimality that could otherwise have been obtained. In this study, two formulations of an actual inventory control problem, one much easier and less costly to implement than the other, are shown to be equivalent with respect to the optimal solution. This result offers an instrument for implementation improvements in inventory control systems.

14.
In recent years, China's overall regional development strategy has been continuously enriched and improved, gradually forming a new pattern of regional development. In summary, this new pattern can be characterized as a “four-three-two-one” structure. Overall, China's regional development is crossing the “middle-income trap” in a gradient, stepwise fashion, but at the same time some problems that cannot be ignored have emerged, such as the “dual-core driving” issue in urban development, the north-south divergence in regional development, and misconceptions about industrialization and urbanization. Solving these problems is crucial to the practice of China's regional economic development and deserves in-depth research and attention.

15.
The research program of theoretical pluralism would imply, for sociology, confronting Marxist with non-Marxist sociology. This seems useful only if both sociologies are refutable in principle. However, certain principles of Marxist sociology prevent the refutation of its most fundamental hypotheses. This is shown by analyzing an often-cited book by E. Hahn. The “basis” of Marxist sociology is historical materialism. Thus, if hypotheses from Marxist sociology are falsified, historical materialism is falsified too. Hahn maintains, without presenting any empirical evidence, that historical materialism has found the solution to all problems and thus cannot be refuted. Marxist sociology, however, is, according to Hahn, falsifiable. This statement and the thesis mentioned before are inconsistent, for if historical materialism is true, Marxist sociology, which follows from historical materialism, cannot be wrong. But even if there were no inconsistency, the following situation would be possible: Marxist sociology proves entirely wrong, but historical materialism (the “basis” of Marxist sociology) is not abandoned. Furthermore, Hahn maintains that the central hypotheses of Marxist sociology, namely those concerning the “essence of appearances,” cannot be tested by means of empirical research (observation and experiment). But he is at a loss for arguments; so every hypothesis can be immunized against falsification by declaring that it describes the “essence” of certain “appearances.” A further immunization strategy Hahn introduces is the rule that only Marxist laws may be used in an explanation, so that Marxist and non-Marxist hypotheses cannot be confronted. This rule eliminates a very effective kind of criticism, namely the confrontation of inconsistent theories.

16.
Complex systems that are required to perform very reliably are often designed to be “fault-tolerant,” so that they can function even though some component parts have failed. Often fault-tolerance is achieved through redundancy, involving the use of extra components. One prevalent redundant component configuration is the m-out-of-n system, where at least m of n identical and independent components must function for the system to function adequately.

Often machines containing m-out-of-n systems are scheduled for periodic overhauls, during which all failed components are replaced, in order to renew the machine's reliability. Periodic overhauls are appropriate when repair of component failures as they occur is impossible or very costly. This will often be the case for machines which are sent on “missions” during which they are unavailable for repair. Examples of such machines include computerized control systems on space vehicles, military and commercial aircraft, and submarines.

An interesting inventory problem arises when periodic overhauls are scheduled. How many spare parts should be stocked at the maintenance center in order to meet demands? Complex electronic equipment is rarely scrapped when it fails. Instead, it is sent to a repair shop, from which it eventually returns to the maintenance center to be used as a spare. A Markov model of spares availability at such a maintenance center is developed in this article. Steady-state probabilities are used to determine the initial spares inventory that minimizes total shortage cost and inventory holding cost. The optimal initial spares inventory will depend upon many factors, including the values of m and n, component failure rate, repair rate, time between overhauls, and the shortage and holding costs.

In a recent paper, Lawrence and Schaefer [4] determined the optimal maintenance center inventories for fault-tolerant repairable systems. They found optimal maintenance center inventories for machines containing several sets of redundant systems under a budget constraint on total inventory investment. This article extends that work in several important ways. First, we relax the assumption that the parts have constant failure rates. In this model, component failure rates increase as the parts age. Second, we determine the optimal preventive maintenance policy, calculating the optimal age at which a part should be replaced even if it has not failed because the probability of subsequent failure has become unacceptably high. Third, we relax the earlier assumption that component repair times are independent, identically distributed random variables. In this article we allow congestion to develop at the repair shop, making repair times longer when there are many items requiring repair. Fourth, we introduce a more efficient solution method, marginal analysis, as an alternative to dynamic programming, which was used in the earlier paper. Fifth, we modify the model in order to deal with an alternative objective of maximizing the job-completion rate.

In this article, the notation and assumptions of the earlier model are reviewed. The requisite changes in the model development and solution in order to extend the model are described. Several illustrative examples are included.
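The reliability of an m-out-of-n block between overhauls is the binomial tail probability; a minimal sketch follows, with hypothetical parameters and independent of the article's Markov spares model.

# Reliability of an m-out-of-n system of identical, independent components,
# each functioning over the mission with probability p.
from math import comb

def m_out_of_n_reliability(m, n, p):
    """P(at least m of n independent components with reliability p function)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m, n + 1))

if __name__ == "__main__":
    # e.g. a 2-out-of-3 redundant controller with component reliability 0.9
    print(m_out_of_n_reliability(2, 3, 0.9))   # 0.972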

17.
We search for (Nash) implementable solutions on a class of one-to-one matching problems which includes both the housing market (Shapley and Scarf, Journal of Mathematical Economics, 1974, 1, 23–28) and marriage problems (Gale and Shapley, American Mathematical Monthly, 1962, 69, 9–15). We show that the core correspondence is implementable. We show, furthermore, that any solution that is Pareto efficient, individually rational, and implementable is a supersolution of the core correspondence. That is, the core correspondence is the minimal solution that is Pareto efficient, individually rational, and implementable. A corollary of independent interest in the context of the housing market is that the core correspondence is the only single-valued solution that is Pareto efficient, individually rational, and implementable.
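For the housing market, the unique core allocation can be computed with Gale's top trading cycles algorithm; the sketch below uses hypothetical preferences and is not code from the paper.

# Top trading cycles (TTC) for a Shapley-Scarf housing market.
# Agent i initially owns house i; prefs[i] lists houses from best to worst.

def top_trading_cycles(prefs):
    remaining = set(range(len(prefs)))       # agents (and their houses) still present
    assignment = {}
    while remaining:
        # Each remaining agent points to the owner of their best remaining house;
        # with the initial endowment, house h is owned by agent h.
        point = {i: next(h for h in prefs[i] if h in remaining) for i in remaining}
        # Follow pointers from any remaining agent until a cycle repeats.
        seen, i = [], next(iter(remaining))
        while i not in seen:
            seen.append(i)
            i = point[i]
        cycle = seen[seen.index(i):]          # agents on the cycle
        for j in cycle:                       # trade along the cycle and remove
            assignment[j] = point[j]
        remaining -= set(cycle)
    return assignment

if __name__ == "__main__":
    prefs = [[1, 0, 2], [0, 2, 1], [2, 1, 0]]   # agent i's ranking of houses
    print(top_trading_cycles(prefs))            # {0: 1, 1: 0, 2: 2}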

18.
James A. Yunker, Socio, 1976, 10(4): 173-179.
Among the problems confronting those who aspire to the development of a realistic and practicable optimal growth theory is that the human population is not homogeneous with respect to age. Those who are relatively young are apt to prefer a different pattern of capital accumulation from that preferred by those who are relatively old. This paper proposes a tentative solution to this particular problem. Essentially, the proposal is that society should follow, for one planning period, the optimal private plan of an individual who is at the median age of the population at the beginning of that planning period; it then revises the plan, switching to the optimal private plan of another individual (one planning period younger than the first) who is at the median age at the commencement of the new planning period. Thus the optimal social plan consists of a succession of one-period implementations of the first periods of the optimal private plans of individuals who are at the median age at that time. An example of the application of the method is given. An important sidelight of the paper is a critique of standard constant-rate exponential discounting in the social planning of optimal capital accumulation, and the proposal that it be replaced by “mortality discounting.”
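The contrast between the two discounting schemes can be written schematically as follows; the notation is assumed for illustration, with S(t) a survival probability, and is not taken from the paper.

% Constant-rate exponential discounting versus "mortality discounting", where
% future utility is weighted by the probability of still being alive at time t:
\[
W_{\exp} = \int_0^{\infty} e^{-\rho t}\,u(c_t)\,dt
\qquad\text{versus}\qquad
W_{\text{mort}} = \int_0^{\infty} S(t)\,u(c_t)\,dt .
\]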

19.
This paper considers scheduling spatially distributed jobs with degradation. A mixed integer programming (MIP) model is developed for the linear degradation case in which no new jobs arrive. Properties of the model are analyzed, following which three heuristics are developed: enhanced greedy, chronological decomposition, and simulated annealing. Numerical tests are conducted to: (i) establish limits of the exact MIP solution, (ii) identify the best heuristic based on an analysis of performance on small problem instances for which exact solutions are known, (iii) solve large problem instances and obtain lower bounds to establish solution quality, and (iv) study the effect of three key model parameters. Findings from our computational experiments indicate that: (i) exact solutions are limited to instances with fewer than 14 jobs; (ii) the enhanced greedy heuristic followed by the application of the simulated annealing heuristic yields high-quality solutions for large problem instances in reasonable computation time; and (iii) the factors “degradation rate” and “work hours” have a significant effect on the objective function. To demonstrate applicability of the model, a case study is presented based on a pothole repair scenario from Buffalo, New York, USA. Findings from the case study indicate that scheduling spatially dispersed jobs with degradation, such as potholes, requires: (i) careful consideration of the number of servers assigned, degradation rate and depot location; (ii) appropriate modeling of continuously arriving jobs; and (iii) appropriate incorporation of equity considerations.
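A toy version of the simulated annealing idea for a linear-degradation schedule is sketched below; the cost model, travel times, and cooling parameters are hypothetical simplifications, not the paper's implementation.

# Simulated annealing over job sequences when each job's cost grows linearly
# while it waits and travel time is incurred between job sites.
import math
import random

def schedule_cost(order, duration, rate, travel):
    """Total degradation cost: job j's cost grows at rate[j] per unit of time
    until its start; travel[a][b] is the time to move between job sites."""
    t, cost, prev = 0.0, 0.0, None
    for j in order:
        if prev is not None:
            t += travel[prev][j]
        cost += rate[j] * t          # degradation accumulated before work starts
        t += duration[j]
        prev = j
    return cost

def anneal(duration, rate, travel, iters=20_000, t0=10.0, cooling=0.9995):
    order = list(range(len(duration)))
    random.shuffle(order)
    best = cur = schedule_cost(order, duration, rate, travel)
    best_order, temp = order[:], t0
    for _ in range(iters):
        a, b = random.sample(range(len(order)), 2)     # propose a swap of two positions
        order[a], order[b] = order[b], order[a]
        new = schedule_cost(order, duration, rate, travel)
        if new <= cur or random.random() < math.exp((cur - new) / temp):
            cur = new
            if new < best:
                best, best_order = new, order[:]
        else:
            order[a], order[b] = order[b], order[a]    # reject: undo the swap
        temp *= cooling
    return best_order, best

if __name__ == "__main__":
    random.seed(1)
    n = 8
    duration = [random.uniform(0.5, 2.0) for _ in range(n)]
    rate = [random.uniform(0.1, 1.0) for _ in range(n)]
    travel = [[0 if i == j else random.uniform(0.2, 1.0) for j in range(n)] for i in range(n)]
    print(anneal(duration, rate, travel))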

20.
In this paper, for each solution for TU games, we define its “dual” and “anti-dual”. Then, we apply these notions to axioms: two axioms are (anti-)dual to each other if whenever a solution satisfies one of them, its (anti-)dual satisfies the other. It turns out that these definitions allow us not only to organize existing axiomatizations of various solutions but also to find new axiomatizations of some solutions. As an illustration, we show that two well-known axiomatizations of the core are essentially equivalent in the sense that one can be derived from the other, and derive new axiomatizations of the Shapley value and the Dutta–Ray solution.
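As background for the Shapley value axiomatizations, the value can be computed directly from its definition as an average marginal contribution; the sketch below uses a hypothetical three-player game and is not the paper's axiomatic derivation.

# Shapley value of a TU game via averaging marginal contributions over orderings.
from itertools import permutations

def shapley_value(players, v):
    """v maps coalitions (frozensets) to worth; returns {player: Shapley value}."""
    value = {i: 0.0 for i in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for i in order:
            value[i] += v(coalition | {i}) - v(coalition)
            coalition = coalition | {i}
    return {i: value[i] / len(orders) for i in players}

if __name__ == "__main__":
    # Hypothetical 3-player glove-style game: player 1 is needed to create worth.
    worth = {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 0,
             frozenset({1, 2}): 1, frozenset({1, 3}): 1, frozenset({2, 3}): 0,
             frozenset({1, 2, 3}): 1}
    print(shapley_value([1, 2, 3], lambda s: worth[frozenset(s)]))   # {1: 2/3, 2: 1/6, 3: 1/6}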

