20 similar documents retrieved (search time: 341 ms)
1.
Radner (1993) proposed a model of decentralized associative computation as a means to understand information processing in
organizations. In the model, in which an organization processes a single cohort of data, resources are measured by the number
of managers. This paper (i) explains why resources should instead be measured by the time the managers are busy, (ii) shows
that, nevertheless, the characterization of sufficient conditions for efficient networks in Radner (1993) and Keren and Levhari
(1979) is valid for either measure, (iii) shows that measuring resources by the number of operations leads to sharper results
on necessary conditions for efficiency, (iv) strengthens Radner's results on the irregularity of efficient hierarchies, and
(v) compares the relative costs of parallelization under the two measures.
Received: 28 February 1997 / Accepted: 30 September 1997
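The distinction between the resource measures in (i) and (iii) can be illustrated with a toy sketch (a minimal illustration, not Radner's model itself): for any associative aggregation of n items the number of pairwise operations is fixed at n − 1, while the delay depends on how many managers work in parallel.

```python
def parallel_reduce_stats(n, p):
    """Greedy parallel pairwise aggregation of n data items by p managers.

    Each period, every manager may combine two available partial results.
    Returns (delay_in_periods, total_operations). The operation count is
    n - 1 regardless of p; only the delay varies with the number of
    managers -- the point of measuring resources by operations rather
    than by the number of managers.
    """
    available = n          # partial results ready to be combined
    delay = 0
    operations = 0
    while available > 1:
        combos = min(p, available // 2)   # at most p combinations per period
        available -= combos               # two inputs in, one output out
        operations += combos
        delay += 1
    return delay, operations
```

With n = 8 items, four managers finish in 3 periods while one manager needs 7, but both configurations perform exactly 7 operations.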
2.
This paper examines the effects of changes in information-processing technology on the efficient organizational forms of
data-processing in decision-making systems. Data-processing is modelled in the framework of the dynamic parallel processing model of associative computation with endogenous set-up costs of the processors. In such a model, the conditions for efficient organization of information-processing are defined and the architecture of the efficient structures is considered. It is shown that decreasing returns to scale of the function describing data-processing technology and information overload of the system are, respectively, necessary and sufficient conditions for hierarchical information-processing. Moreover, the size
of the efficient structures is determined exclusively by their information workload and the current state of information-processing
technology.
Received: 5 June 1996 / Accepted: 17 June 1999
3.
4.
Timothy Van Zandt, Review of Economic Design (1997) 3(1): 15-27
This paper defines and characterizes essential decentralized networks for calculating the associative aggregate of one or
more cohorts of data. A network is essential if it is not possible to eliminate an instruction or manager and still calculate
the aggregate of each cohort. We show that for essential networks, the graphs that depict the operations and data dependencies
are trees or forests. These results assist in the characterization of efficient networks.
Received: 15 October 1994 / Accepted: 6 March 1997
5.
The paper examines the application of the concept of economic efficiency to organizational issues of collective information processing in decision making. Information processing is modeled in the framework of the dynamic parallel processing model of associative computation with an endogenous setup cost of the processors. The model is extended to include the specific features of collective information processing in the team of decision makers which may lead to an error in data analysis. In such a model, the conditions for efficient organization of information processing are defined and the architecture of the efficient structures is considered. We show that specific features of collective decision making procedures require a broader framework for judging organizational efficiency than has traditionally been adopted. In particular, and contrary to the results available in economic literature, we show that there is no unique architecture for efficient information processing structures, but a number of various efficient forms. The results indicate that technological progress resulting in faster data processing (ceteris paribus) will lead to more regular information processing structures. However, if the relative cost of the delay in data analysis increases significantly, less regular structures could be efficient. Copyright © 2007 John Wiley & Sons, Ltd.
6.
A strategic analysis of network reliability
Abstract. We consider a non-cooperative model of information networks where communication is costly and not fully reliable. We examine
the nature of Nash networks and efficient networks.
We find that if the society is large, and link formation costs are moderate, Nash networks as well as efficient networks will
be ‘super-connected’, i.e., every link is redundant in the sense that the network remains connected even after the link is deleted.
This contrasts with the properties of a deterministic model of information decay, where Nash networks typically involve unique
paths between agents. We also find that if costs are very low or very high, or if links are highly reliable then there is
virtually no conflict between efficiency and stability. However, for intermediate values of costs and link reliability, Nash
networks may be underconnected relative to the social optimum.
7.
Hao Li, Review of Economic Design (1999) 4(2): 101-126
This paper analyzes organizational structures that minimize information processing costs for a specific organizational task.
Organizations consist of agents of limited ability connected in a network. These agents collect and process information, and
make decisions. Organizations implement strategies – mappings from environmental circumstances to decisions. The strategies
are exogenously given from a class of “pie” problems to be defined in this paper. The notion of efficiency is lexicographic:
the primary criterion is minimizing the number of agents, and the secondary criterion is minimizing the number of connections
between the agents. In this modeling framework, efficient organizations are not hierarchical for a large number of problems.
Hierarchies often fail to exploit fully the information processing capabilities of the agents because in a hierarchy, subordinates
have a single superior.
Received: 1 December 1995 / Accepted: 11 October 1998
8.
This paper introduces a non-cooperative game-theoretic model of sequential network formation, in which players propose links
and demand payoffs. Payoff division is therefore endogenous. We show that if the value of networks satisfies size monotonicity,
then each and every equilibrium network is efficient. The result holds not only when players make absolute participation demands,
but also when they are allowed to make link-specific demands.
9.
Robert M. O’Brien, Quality and Quantity (2011) 45(6): 1429-1444
In general the age–period–cohort (APC) conundrum refers to the problem of separating the effects of age-groups, periods, and
cohorts. This formulation, however, fails to differentiate two fundamental problems in APC analysis: (1) the problem of the
complete confounding of the linear effects of age with the effects of period and cohort, the linear effects of cohorts with
period and age, and the linear effects of period with age and cohort; and (2) the problem of model identification. We elucidate
both problems and show how the first problem makes the partitioning of variance between cohort effects, period effects, and
age effects and the deviation of their effects from linearity problematic even when these approaches do not suffer from the
problems associated with model identification. We conclude by examining the effects of this linear confounding on estimates of the individual effect coefficients for age-groups, periods, and cohorts when a linear constraint is imposed on the matrix of independent variables to produce an identifiable model.
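The complete confounding of the linear effects described in problem (1) can be seen in a few lines (an illustrative sketch, not the article's estimator): because cohort = period − age identically, shifting all three linear slopes by any constant t leaves every fitted value unchanged, so the slopes are not separately identified.

```python
def linear_predictor(age, period, coefs):
    """Purely linear APC predictor; the cohort value is determined
    by the identity cohort = period - age."""
    b_age, b_period, b_cohort = coefs
    cohort = period - age
    return b_age * age + b_period * period + b_cohort * cohort

def shifted(coefs, t):
    """A different coefficient vector yielding identical predictions:
    the shift adds t * (period - age - cohort) = 0 to every fitted value."""
    b_age, b_period, b_cohort = coefs
    return (b_age - t, b_period + t, b_cohort - t)
```

For example, (2, 1, -1) and the shifted vector (-1, 4, -4) give the same prediction, 90, for age 30 in period 2000, and agree at every other (age, period) pair as well.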
10.
Termination and Coordination in Partnerships
It is common practice for firms to pool their expertise by forming partnerships such as joint ventures and strategic alliances. A central organization problem in such partnerships is that managers may behave noncooperatively in order to advance the interests of their parent firms. We ask whether contracts can be designed so that managers will maximize total profits. We characterize first best contracts for a variety of environments and show that efficiency imposes some restrictions on the ownership shares. In addition, we evaluate the performance of two termination contracts that are widely used in practice: the shotgun rule and price competition. We find that although these contracts do not achieve full efficiency, they both perform well. We provide insight into when each rule is more efficient.
11.
Production and operations planning in organizations quite often is a multi-level sequential process, involving aggregate planning, master production scheduling, and detailed operations planning and scheduling. To obtain good planning results, it is desirable to have a proper planning horizon for each level of planning. There have been a considerable number of studies dealing with planning horizons for aggregate planning or production smoothing problems. There are also many planning horizon studies for single-item lot sizing problems. No study has addressed the issues associated with the planning horizons for master production schedules (which is a multi-item lot sizing problem in nature), particularly with respect to the relationship to the aggregate plan.

This study addresses the issue of planning horizons for companies employing a make-to-stock competitive strategy facing a seasonal demand for their products. We formulate the aggregate planning problem and the master scheduling problem as two separate mathematical programs to approximate the two-stage process that typically takes place in practice. Rolling planning horizons are used to approximate the periodic updates of the plans commonly done in practice. The models also incorporate resource requirements planning concepts to estimate loads on the critical work centers.

The planning process is simulated as a single-pass procedure where the results of aggregate planning are passed to the master production scheduling model once per month and the results of the master scheduling model (i.e., the portion of the master schedule actually implemented) are passed back to the aggregate planning model for the next planning session.

The experimental results show that when the planner faces extreme cost structures such as high smoothing costs/high setup costs or low smoothing costs/low setup costs, the planning horizon effects are reduced to a minimum. Master schedule planning horizons need not be as long as aggregate planning horizons. Alternatively, non-extreme cost structures such as high smoothing costs/low setup costs and low smoothing costs/high setup costs should be handled with equal planning horizons for both aggregate planning and master scheduling.

It is also found that the firm's cost structure has an impact on the appropriate planning horizon for both aggregate planning and master scheduling. Some cost conditions allow for smaller master schedule horizons. The best horizon choice seems to be equal planning horizons for both aggregate planning and master scheduling, even though the cost savings is slight in some cases.

Finally, the proper length of the planning horizon for master scheduling is affected by the planning horizon of the aggregate plans.
12.
Spatial social networks
We introduce a spatial cost topology in the network formation model analyzed by Jackson and Wolinsky, Journal of Economic Theory (1996), 71: 44–74. This cost topology might represent geographical, social, or individual differences. It describes variable
costs of establishing social network connections. Participants form links based on a cost-benefit analysis. We examine the
pairwise stable networks within this spatial environment. Incentives vary enough to show a rich pattern of emerging behavior.
We also investigate the subgame perfect implementation of pairwise stable and efficient networks. We construct a multistage
extensive form game that describes the formation of links in our spatial environment. Finally, we identify the conditions
under which the subgame perfect Nash equilibria of these network formation games are stable.
We are very grateful for the constructive comments of Matt Jackson and an anonymous referee. We would also like to thank Vince Crawford, Marco Slikker, Edward Droste, Hans Haller, Dimitrios Diamantaras, and Sudipta Sarangi for comments on previous drafts of this paper. We acknowledge Jay Hogan for his programming support.
Part of this research was done while visiting the CentER for Economic Research, Tilburg University, Tilburg, The Netherlands. Financial support from the Netherlands Organization for Scientific Research (NWO), grant B46-390, is gratefully acknowledged.
13.
Many resources such as supercomputers, legal advisors, and university classrooms are shared by many members of an organization.
When the supply of shared resources is limited, conflict usually results between contending demanders. If these conflicts
can be adequately resolved, then value is created for the organization. In this paper we use the methodology of applied mechanism
design to examine alternative processes for the resolution of such conflicts for a particular class of scheduling problems.
We construct a laboratory environment, within which we evaluate the outcomes of various allocation mechanisms. In particular,
we are able to measure efficiency, the value attained by the resulting allocations as a percentage of the maximum possible
value. Our choice of environment and parameters is guided by a specific application, the allocation of time on NASA's Deep
Space Network, but the results also provide insights relevant to other scheduling and allocation applications. We find (1)
experienced user committees using decision support algorithms produce reasonably efficient allocations in lower conflict situations
but perform badly when there is a high level of conflict between demanders, (2) there is a mechanism, called the Adaptive
User Selection Mechanism (AUSM), which charges users for time and yields high efficiencies in high conflict situations but,
because of the prices paid, in which the net surplus available to the users is less than that resulting from the inefficient
user committee (a reason why users may not appreciate ‘market solutions’ to organization problems) and (3) there is a modification
of AUSM in which tokens, or internal money, replaces real money, which results in highly efficient allocations without extracting
any of the users' surplus. Although the distribution of surplus is still an issue, the significant increase in efficiency
provides users with a strong incentive to replace inefficient user committees with the more efficient AUSM.
14.
Searching for efficient networks can prove a very difficult analytical and even computational task. In this paper, we explore
the possibility of using the genetic algorithms (GA) technique to identify efficient network structures in the case of non-trivial
payoff functions. The robustness of this method in predicting optimal networks is tested on the two simple stylized models
introduced by Jackson and Wolinsky (1996), for which the efficient networks are known over the whole state space of the parameters’
values. This approach allows us to obtain new exploratory results in the case of the linear-spatialized connections model
proposed by Johnson and Gilles (Rev Econ Des 5:273–299, 2000), for which the efficient allocation of bilateral connections
is driven by contradictory forces that push either for a centralized structure around a coordinating agent, or for only locally
and evenly distributed connections.
Murat Yıldızoğlu gratefully acknowledges the support of the CCRDT program of Aquitaine Region.
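A minimal version of this approach can be sketched as follows (an illustrative reimplementation, not the authors' code; the payoff is a Jackson-Wolinsky-style symmetric connections value with made-up parameter values): networks on N agents are encoded as link bitstrings, and an elitist genetic algorithm searches for the value-maximizing structure.

```python
import random

N = 6                                  # number of agents
L = N * (N - 1) // 2                   # one bit per potential link

def to_adj(bits):
    """Unpack a bitstring into a symmetric adjacency matrix."""
    adj = [[0] * N for _ in range(N)]
    k = 0
    for i in range(N):
        for j in range(i + 1, N):
            adj[i][j] = adj[j][i] = bits[k]
            k += 1
    return adj

def value(bits, delta=0.7, cost=0.25):
    """Connections-model total payoff: each ordered pair of agents at
    shortest-path distance d contributes delta**d; each undirected link
    costs `cost` to both endpoints. Parameters are illustrative."""
    adj = to_adj(bits)
    total = 0.0
    for s in range(N):
        dist, frontier = {s: 0}, [s]
        while frontier:                 # BFS shortest paths from s
            nxt = []
            for u in frontier:
                for v in range(N):
                    if adj[u][v] and v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
            frontier = nxt
        total += sum(delta ** d for t, d in dist.items() if t != s)
    return total - 2 * cost * sum(bits)

def ga_search(generations=150, pop_size=40, mut_rate=0.2, seed=1):
    """Elitist GA: keep the best half of the population, refill it with
    one-point crossover plus single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(L)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=value, reverse=True)
        elite = pop[: pop_size // 2]
        pop = list(elite)
        while len(pop) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, L)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < mut_rate:        # single-bit mutation
                i = rng.randrange(L)
                child[i] = 1 - child[i]
            pop.append(child)
    return max(pop, key=value)
```

With delta = 0.7 and cost = 0.25 the efficient architecture in the symmetric connections model is known to be a star, which the GA typically recovers on this small search space.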
15.
A distributed call-center staff scheduling model
Staff scheduling is a key issue in call-center operations, since 60%-70% of a call center's costs are personnel costs. In recent years, the emergence of distributed call centers has posed new challenges for the modelling and solution of the scheduling problem. Based on an analysis of distributed call-center scheduling, this paper proposes a principle of fair scheduling and a fairness-based mixed-integer scheduling model. Computational experiments show that the model balances the workload well across the sub-centers of a distributed call center and obtains satisfactory solutions in a short time.
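The workload-balancing idea can be illustrated with a simple heuristic (a sketch only; the paper formulates a mixed-integer program, which this greedy rule merely approximates): assign each block of work, largest first, to the currently least-loaded sub-center.

```python
import heapq

def balance_workload(tasks, n_branches):
    """Longest-processing-time greedy: sort workloads in descending
    order and assign each to the currently least-loaded branch.
    Returns {branch: (total_load, assigned_tasks)}."""
    heap = [(0, b, []) for b in range(n_branches)]
    heapq.heapify(heap)
    for t in sorted(tasks, reverse=True):
        load, b, assigned = heapq.heappop(heap)
        heapq.heappush(heap, (load + t, b, assigned + [t]))
    return {b: (load, assigned) for load, b, assigned in heap}
```

For workloads [5, 4, 3, 3, 2, 1] split over two sub-centers the rule yields loads of 9 and 9, a perfectly balanced schedule in this instance.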
16.
Steel producers first optimize the order data according to the order contracts, merging orders that can be produced together into a single batch, and then carry out production in units of charges (heats). Multiple objectives are optimized subject to the production rules and the charge-composition requirements; the optimization problem is modelled as a mixed-integer programming model and solved with the Lingo software. Computational results on a real instance demonstrate the effectiveness and feasibility of the model.
17.
This paper provides two theorems which characterize the domains of valuation functions for which there exist Pareto efficient and truthful dominant-strategy mechanisms (balanced Groves mechanisms). Theorem 1 characterizes the existence of balanced Groves
mechanisms for a general class of valuation functions. Theorem 2 provides new balance-permitting domains of valuation functions
by reducing the problem of solving partial differential equations to the problem of solving a polynomial function. It shows
that a balanced Groves mechanism exists if and only if each valuation function in the family under consideration can be obtained
by solving a polynomial function with order less than , where n is the number of individuals.
Received: 5 January 1997 / Accepted: 25 May 1999
18.
John A. Weymark, Review of Economic Design (1999) 4(4): 389-393
Sprumont (1991) has established that the only allocation rule for the division problem that is strategy-proof, efficient,
and anonymous is the uniform rule when the domain is the set of all possible profiles of continuous single-peaked preferences.
Sprumont's characterization of the uniform rule is shown to hold on any larger domain of single-peaked preferences.
Received: 15 December 1998 / Accepted: 12 April 1999
19.
We consider a simple case of team production, where a set of workers have to contribute a single input (say labour) and then
share the joint output amongst themselves. Different incentive issues arise when the skills as well as the levels of effort
expended by workers are not publicly observable. We study one of these issues in terms of a very simple model in which two
types of workers, skilled and unskilled, supply effort inelastically. Thus, we assume away the problem of moral hazard in
order to focus on that of adverse selection. We also consider a hierarchical structure of production in which the workers
need to be organised in two tiers. We look for reward schemes which specify higher payments to workers who have been assigned
to the top-level jobs when the principal detects no lies, distribute the entire output in all circumstances, and induce workers
to reveal their true abilities. We contemplate two scenarios. In the first one, each individual worker knows only her own type,
while in the second scenario each worker also knows the abilities of all other workers. Our general conclusion is that the
adverse selection problem can be solved in our context. However, the range of satisfactory reward schemes depends on the informational
framework.
20.
This study develops an integrative model that explains the relationship between Chinese culture, managers' strategic decision making (SDM) processes, and organizational performance. For the study, 1200 participants were randomly selected from a business club's company register, resulting in 204 valid respondents. The results highlighted two significant SDM paths used by managers: (1) the cognitive-speed path, which suggested that Overseas Chinese managers (the Chinese who live outside of Mainland China) focus on the big picture, draw analogies from past experiences, and use extensive networks to reduce the duration of the decision process; and (2) the social-political path, which shows that Overseas Chinese managers focus on collective interests, strive to maintain harmony, and to save face while using a collaborative style to handle conflict; this approach reduces dysfunctional political behavior, while reinforcing the decision team's focus on common goals. From these results we concluded that a speedier decision making process (based on intuition, experience, and networks) accompanied by the appropriate use of political behavior (that created harmony, through a hierarchical structure, during conflict management) in the Overseas Chinese managers' strategic decision making process could positively influence organizational performance.