Similar Articles
20 similar articles found.
1.
Summary In this paper we consider Anonymous Sequential Games with Aggregate Uncertainty. We prove existence of equilibrium when there is a general state space representing aggregate uncertainty. When the economy is stationary and the underlying process governing aggregate uncertainty is Markov, we provide Markov representations of the equilibria. The paper's table of notation covers: the agents' characteristics space and action space; the aggregate distribution on agents' characteristics; spaces of probability measures and of continuous functions; the state space of aggregate uncertainty, its Borel structure, and the distributions and conditional distributions defined on it; period-by-period distributional strategies; the transition process for agents' types; utility and value functions; the consistency and best-response correspondences; the set of equilibrium distributional strategies; and the expanded state space, value function and invariant characteristics transition function used in the Markov construction. We wish to acknowledge very helpful conversations with C. d'Aspremont, B. Lipman, A. McLennan and J-F. Mertens. The financial support of the SSHRCC and the ARC at Queen's University is gratefully acknowledged. This paper was begun while the first author visited CORE. The financial support of CORE and the excellent research environment is gratefully acknowledged. The usual disclaimer applies.
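As a rough, hedged illustration of the structure listed in the notation (an interpretation, not the paper's exact definitions), an equilibrium distributional strategy can be read as a fixed point of a best-response correspondence that already embeds the consistency requirement:

```latex
% Interpretive sketch: \mu is a distributional strategy, C(\mu) the consistency
% correspondence, B(\mu) the best-response correspondence (restricted to
% consistent distributions), and E the set of equilibrium distributional strategies.
\[
  B(\mu) \subseteq C(\mu), \qquad
  E = \{\, \mu : \mu \in B(\mu) \,\}.
\]
```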

2.
Summary Four types of turnover could be identified which, alongside personal factors, determine labour turnover. Part of the turnover is due to trying out a job in view of imperfect information (trial turnover); the propensity to change jobs is therefore relatively high among workers with short job tenure and among younger workers (aged 20 to 30). A further form of turnover is the workers' reaction to differences in the net advantages of different jobs (wage turnover): on balance, workers move from low-wage to high-wage firms, from small to large firms, and from shrinking to expanding ones. The existence of a dual labour market implies that disadvantaged workers (unskilled workers with little firm-specific training) quit frequently, while advantaged (skilled) workers change jobs relatively rarely (unskilled-labour turnover); indeed, the propensity to change jobs among unskilled and blue-collar workers (above all in the industrial sector) is far above average. The number of job changes is determined not only by the propensity to change but also by alternative employment opportunities (cyclical turnover); turnover varies so clearly with labour-market tightness that it can be regarded as a tightness indicator. Industry differences in various measures of turnover can essentially be traced back to the four types of turnover described. (For measuring job changes it is advisable to compute survival probabilities for cohorts of firm entrants and tenure-specific turnover rates.)
Summary There are four main features of labour turnover. A substantial part of turnover is due to job shopping in view of imperfect information. Therefore, workers with short job tenure and younger employees (20–30 years) reveal a high propensity to quit. Another type of quit behaviour is the reaction of workers to differences in the net advantages of various jobs. The employed move from the low-wage to the high-wage sector, from small to large-scale enterprises and from shrinking to expanding firms. The existence of a dual labour market implies that disprivileged workers (with low general and specific training) quit frequently and privileged workers rarely change jobs. In fact, the turnover rate of white-collar workers and persons with higher formal education is far below average. The actual level of labour turnover depends not only on the propensity to quit but also on the alternative job opportunities. Voluntary quits are so closely related to the tightness of the labour market that they can be regarded as a labour market indicator. Regression analysis shows that the differences in various measures of turnover are essentially due to these four features of turnover. (For measurement of labour turnover it is recommended to use the probabilities of a batch of entrants to survive to certain points in time, as well as job tenure-specific turnover rates.)
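To make the two recommended measures concrete, here is a minimal sketch (not from the article) of how cohort survival probabilities and tenure-specific turnover rates could be computed; the tenure data, field names and checkpoints are illustrative assumptions.

```python
import numpy as np

# Hypothetical data: observed job tenure (in months) for one entry cohort,
# and whether the spell ended with a quit (True) or is still ongoing (False).
tenures = np.array([2, 5, 7, 12, 12, 18, 24, 30, 36, 48])
quit_flags = np.array([True, True, True, True, False, True, False, True, False, False])

def survival_probabilities(tenures, checkpoints):
    """Share of the entry cohort still employed at each checkpoint (in months)."""
    n = len(tenures)
    return {t: float((tenures >= t).sum()) / n for t in checkpoints}

def tenure_specific_turnover(tenures, quit_flags, bins):
    """Quits per person-at-risk within each tenure interval [lo, hi)."""
    rates = {}
    for lo, hi in bins:
        at_risk = (tenures >= lo).sum()                            # reached the interval
        quits = ((tenures >= lo) & (tenures < hi) & quit_flags).sum()
        rates[(lo, hi)] = quits / at_risk if at_risk else float("nan")
    return rates

print(survival_probabilities(tenures, checkpoints=[6, 12, 24, 36]))
print(tenure_specific_turnover(tenures, quit_flags, bins=[(0, 6), (6, 12), (12, 24), (24, 48)]))
```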

3.
In a seminal contribution to the literature on bureaucracy, Breton and Wintrobe (The Logic of Bureaucratic Conduct: An Economic Analysis of Competition, Exchange, and Efficiency in Private and Public Organization. New York, NY: Cambridge University Press, 1982) develop a model wherein subordinates and superiors in a bureaucratic structure trade with each other to advance the objectives of the superiors. The success of such an organizational arrangement (for superiors) is based upon the development of vertical trust networks in a way that facilitates the promise of informal payments by superiors in return for informal services provided by their subordinates. Breton and Wintrobe [Journal of Political Economy 94 (1986) 905] also provide a theoretical application of their model by describing the Nazi bureaucracy as a conglomeration of competing agencies that zealously carried out the "Final Solution" to the "Jewish question". As an extension, this note develops two compelling empirical examples of vertical and horizontal trust networks within the Nazi regime: Einsatzgruppe A's (Special Action Detachment) attempt to liquidate all Lithuanian Jews after the German invasion of the U.S.S.R. in 1941 and the 20 July 1944 attempt to assassinate Adolf Hitler. JEL Classification: D23, D73.

4.
Conclusion When this research was started, it was guessed that the Dorfman-Steiner rule would lose its relevance in an intertemporal setting. This belief has turned out to be false: along the optimal paths of p(t) and s(t), the two must be equal. The only difference with the Dorfman-Steiner result is that they will be different from unity. The author is chargé de cours at the Faculté Universitaire Catholique de Mons (Belgium). He has greatly benefited from comments by M. Beuthe and J. J. Lambin.
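For reference, and as textbook background rather than anything taken from this paper, the static Dorfman-Steiner condition equates the advertising-to-sales ratio with the ratio of the advertising elasticity of demand to the absolute price elasticity:

```latex
% Static Dorfman-Steiner rule (textbook form, not this paper's dynamic version):
% s = advertising outlay, p = price, q(p,s) = demand,
% \eta_s = advertising elasticity, \eta_p = absolute price elasticity.
\[
  \frac{s}{p\,q} = \frac{\eta_s}{\eta_p},
  \qquad
  \eta_s = \frac{\partial q}{\partial s}\,\frac{s}{q},
  \quad
  \eta_p = -\frac{\partial q}{\partial p}\,\frac{p}{q}.
\]
```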

5.
Summary Suppose Y_n is a sequence of i.i.d. random variables taking values in Y, a complete, separable, non-finite metric space. The probability law governing the process is unknown to a Bayesian statistician, who observes the process and holds a prior over the possible laws. Generalizing Freedman [8], we show that generically (i.e., for a residual family of law-prior pairs) the posterior beliefs do not weakly converge to a point mass at the true law. Furthermore, for every open set G of laws, generically, the Bayesian will attach probability arbitrarily close to one to G infinitely often. The above result is applied to a two-armed bandit problem with geometric discounting where arm k yields an outcome in a complete, separable metric space Y_k. If the infimum of the possible rewards from playing arm k is less than the infimum from playing arm k', then arm k is (generically) chosen only finitely often. If the infima of the rewards are equal, then both arms are played infinitely often.
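Purely to fix ideas about the decision environment (two arms, posterior updating, geometric discounting), the sketch below uses a finite-dimensional Beta-Bernoulli toy with a myopic posterior-mean rule; the parameters and the policy are assumptions for illustration, and this well-behaved parametric case is precisely a setting where the paper's generic non-convergence result does not bite.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-armed Bernoulli bandit with Beta priors and geometric discounting.
true_p = [0.4, 0.6]            # unknown success probabilities (hypothetical)
alpha = np.ones(2)             # Beta posterior parameters per arm
beta = np.ones(2)
delta = 0.95                   # geometric discount factor
total, discount = 0.0, 1.0

for t in range(200):
    # Myopic rule: pull the arm with the highest posterior mean reward.
    post_mean = alpha / (alpha + beta)
    k = int(np.argmax(post_mean))
    reward = rng.random() < true_p[k]
    alpha[k] += reward
    beta[k] += 1 - reward
    total += discount * reward
    discount *= delta

print("posterior means:", alpha / (alpha + beta))
print("discounted reward:", round(total, 3))
```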

6.
No abstract. This article is the unabridged version of the author's contribution "Klassische Nationalökonomie" (classical economics) to the Staatslexikon (Volume IV).

7.
We examine behavior in a Coasian contracting game with incomplete information. Experimental subjects propose contracts, while automaton property right holders or robot players with uncertain preferences respond to those proposals. The most common pattern of proposals observed in these games results in too many agreements and, in some games, payoffs that are stochastically dominated by those resulting from rational proposals (which imply fewer agreements). In this sense, we observe a winner's curse similar to that observed in bidding games under incomplete information, such as the common value auction (Kagel, J.H. and Levin, D. (1986) American Economic Review, 76, 894–920) and the takeover game (Samuelson, W. and Bazerman, M.H. (1985) In Research in Experimental Economics, Vol. 3. JAI Press, Greenwich, pp. 105–137; Ball, S.B., Bazerman, M.H., and Carroll, J.S. (1990) Organizational Behavior and Human Decision Processes, 48, 1–22; Holt, C. and Sherman, R. (1994) American Economic Review, 84, 642–652). While the naïve model of behavior nicely predicts the winner's curse in those previous bidding games, it does not do so here. Instead, an alternative model we call the guarantor model explains the anomalous behavior best. Hence, we suggest this is a new variant of the winner's curse.
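For context on the winner's curse benchmark cited above, here is a small simulation of the takeover game of Samuelson and Bazerman in its textbook parameterization (firm value uniform on [0, 100], worth 1.5 times as much to the bidder, seller accepts any bid at or above the true value); these numbers are not data from the present experiment.

```python
import numpy as np

rng = np.random.default_rng(1)

def expected_profit(bid, n=200_000):
    """Monte Carlo expected profit of bidding `bid` in the takeover game."""
    v = rng.uniform(0, 100, size=n)     # firm value known only to the seller
    accepted = v <= bid                 # seller accepts iff the bid covers v
    profit = np.where(accepted, 1.5 * v - bid, 0.0)
    return profit.mean()

for bid in (0, 25, 50, 75, 100):
    print(bid, round(expected_profit(bid), 2))
# Conditional on acceptance, E[1.5*v - bid] = 0.75*bid - bid = -bid/4 < 0 for any
# positive bid: naive bidders who ignore the acceptance condition lose on average,
# which is the winner's curse.
```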

8.
The paper is motivated by Joseph A. Schumpeter's The Crisis of the Tax State. It inquires whether the buildup of government debt in peacetime prosperity is a threat to the stability, existence or creation of viable tax states. The paper begins by setting out Schumpeter's conception of the tax state and the nature of recent political-economic events which have reinvigorated the concept. Next the paper sets out some simple debt dynamics and sketches a debt-induced business cycle arising from heavy reliance on debt finance in peacetime prosperity. Finally, the paper assesses threats to the tax state in light of recent work on path dependence and positive feedback. An attempt is made to throw some light on whether the plethora of new, and often small, states spawned by the demise of communism can be viable tax states. Essay on Government, the Tax State and Economic Dynamics submitted to the Third Schumpeter Prize Competition.
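The "simple debt dynamics" are not reproduced in the abstract; as hedged background on the kind of arithmetic involved (a textbook identity, not the paper's own equations), the debt-to-GDP ratio evolves as:

```latex
% b_t = debt/GDP, s_t = primary surplus/GDP, r = interest rate, g = GDP growth.
\[
  b_{t+1} = \frac{1+r}{1+g}\, b_t - s_t ,
\]
% so with r > g and persistent primary deficits (s_t < 0), the debt ratio
% grows without bound unless fiscal policy adjusts.
```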

9.
Summary This paper gives an empirical reexamination of the Linear-Expenditure hypothesis for Austria. It starts with a brief theoretical discussion of the principal properties and restrictions of the Linear-Expenditure-System (LES). To obtain empirical estimates of the parameters of the LES, two different estimation procedures are applied, i.e. the original method used by Stone and a simplified version of the Systems-Least-Squares approach (following the Marquardt algorithm). There are no essential differences between these estimates. They all seem plausible and satisfy the theoretical restrictions. Usually the stability (i.e. time-invariance) of the parameters is accepted without proof. Using the Moving-Window-Regression technique, however, most of the estimates vary significantly in time. To obtain a direct proof of the time-dependence of the parameters, the LES is reestimated, now including trend factors. Especially the results considering time-dependent marginal budget shares are considerably better than the static-model results. The conclusion of this paper is that the static version of the LES does not explain consumer behaviour in Austria and that much more effort should be spent on the estimation of dynamic demand systems.
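For readers unfamiliar with the LES, Stone's linear expenditure system in its standard textbook form (not the paper's Austrian estimates) writes expenditure on each good as a committed part plus a fixed share of supernumerary income:

```latex
% p_i = price, q_i = quantity, y = total expenditure,
% \gamma_i = committed (subsistence) quantities, \beta_i = marginal budget shares.
\[
  p_i q_i = p_i \gamma_i + \beta_i \Bigl( y - \sum_j p_j \gamma_j \Bigr),
  \qquad \beta_i \ge 0, \quad \sum_i \beta_i = 1 .
\]
% Time-dependent marginal budget shares, as in the paper's dynamic variant,
% would replace \beta_i with \beta_i(t).
```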

10.
In The Sensory Order, Friedrich A. Hayek describes the human mind as an apparatus of classification that evolves through experience and that reaches decisions by modeling the alternative courses of action that are available to it. Hayek's mechanistic conception of mind argues against the possibility of central planning and against the cogency of any rule that denigrates subjective decision making by employers or other economic agents. As implied by Gödel's proof, no brain, human or mechanical, can ever be sufficiently complex to explain itself. There will therefore always be certain knowledge and rules that cannot be articulated to the satisfaction of a central planner or tribunal.

11.
Summary. This paper shows that information effects per se are not responsible for the Giffen goods anomaly affecting traders' demands in multi-asset noisy rational expectations equilibrium markets. The role that information plays in traders' strategies also matters. In a market with risk-averse, uninformed traders, informed agents have a dual trading motive: speculation and market making. The former entails using prices to assess the effect of error terms; the latter requires employing them to disentangle noise traders' demands within aggregate orders. In a correlated environment this complicates the signal extraction problem and may generate upward-sloping demand curves. Assuming (i) that competitive, risk-neutral market makers price the assets or (ii) that the uninformed traders' risk tolerance coefficient grows unboundedly removes the market-making component from informed traders' demands, rendering them well behaved in prices. Received: 30 April 2002, Revised: 3 December 2003, JEL Classification Numbers: G100, G120, G140. Support from the Barcelona Economics Program of CREA and the Ente per gli Studi Monetari e Finanziari Luigi Einaudi is gratefully acknowledged. I thank Anat Admati, Jordi Caballé, Giacinta Cestone, and Xavier Vives for useful suggestions. The comments provided by the Associate Editor and an anonymous referee greatly improved the paper's exposition.

12.
13.
I propose to show how to translate the economic analysis of institutions developed in the tradition of worst-case political economy into the lingua franca of robust statistics. An institution will be defined as contingent upon a design theory, and the difficulty we consider is the use of the institution by the designer. The technical bridge between institutional robustness and statistical robustness is the possibility of exploratory data analysis [EDA] with respect to the design theory. In an institutional context, with EDA comes the recognition that the model is not completely specified, that we do not fully understand the structure of the world before studying it. In a statistical context, with EDA comes a non-normal error distribution. The relationship between evolutionary institutions and robust institutions is discussed. A conjecture that rule utilitarianism can be thought of as robust utilitarianism is defended with the historical example of William Paley's discussion of the utility of murder.
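To illustrate the statistical half of the analogy with a generic robust-statistics example (not drawn from the paper), the sketch below contaminates normal errors with heavy-tailed outliers, the kind of non-normal error distribution EDA can reveal, and compares the sample mean with the median:

```python
import numpy as np

rng = np.random.default_rng(42)

n = 1000
clean = rng.normal(loc=0.0, scale=1.0, size=n)
# Contaminated (non-normal) errors: 10% of observations come from a
# heavy-tailed distribution, as EDA might reveal in practice.
mask = rng.random(n) < 0.10
errors = np.where(mask, rng.standard_cauchy(n) * 10, clean)

print("mean  :", round(errors.mean(), 3))      # dragged around by outliers
print("median:", round(float(np.median(errors)), 3))  # barely moves
```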

14.
In general, synergies across license valuations complicate the auction design process. Theory suggests that a simple (i.e., non-combinatorial) auction will have difficulty in assigning licenses efficiently in such an environment. This difficulty increases with increases in fitting complexity. In some environments, bidding may become mutually destructive. Experiments indicate that a properly designed combinatorial auction is superior to a simple auction in terms of economic efficiency and revenue generation in bidding environments with a low amount of fitting complexity. Concerns that a combinatorial auction will cause a threshold problem are not borne out when bidders for small packages can communicate.
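As a hypothetical numerical illustration of the threshold problem (constructed for this note, not one of the experimental environments), suppose a package bidder bids 10 for the bundle {A, B} while two small bidders value A and B at 6 each; the small bidders jointly value the licenses at 12 but each would rather the other carry the cost of topping the package bid:

```python
# Hypothetical threshold-problem illustration for a two-licence combinatorial auction.
package_bid = 10                    # package bidder's standing bid on {A, B}
small_values = {"A": 6, "B": 6}     # each small bidder's value for one licence

def small_bidders_win(contributions):
    """Small bidders displace the package bid only if their bids sum above it."""
    return sum(contributions.values()) > package_bid

# Coordinated outcome: both contribute a bit above half the threshold and win.
print(small_bidders_win({"A": 5.5, "B": 5.5}))   # True, joint surplus 12 - 11 = 1
# Free riding: each hopes the other raises its bid, the threshold is missed,
# and the (less efficient) package bidder wins.
print(small_bidders_win({"A": 5.5, "B": 4.0}))   # False
```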

15.
Summary This paper defines a choice process over social outcomes in which agents choose the institutional rules or mechanisms themselves without outside interference. Truly endogenizing the mechanism selection process in this way, however, involves facing an infinite regress problem in which outcomes are chosen by games which are themselves chosen by games, ad infinitum. This paper allows the possibility of such an infinite regress, which we call fully endogenous mechanism selection. We introduce the notion of Free Choice, which restricts the class of mechanisms in the regress to those which prevent agents from being locked in to an equilibrium outcome by the actions of others. Under this condition, the infinite regress is shown to get truncated, with the number of selection iterations endogenously determined. It turns out that the outcomes resulting from a Free Choice-constrained regress are (weakly) Pareto optimal; in particular, these outcomes solve a weighted Rawlsian maxmin criterion. We also show that these outcomes are invariant to the equilibrium concept used to evaluate games in the regress. This paper is based on the author's dissertation from the University of Minnesota (November, 1989). I am very grateful for the guidance, advice, and encouragement from my advisor, Marcel K. Richter, and for the many helpful suggestions from David Levine. I have also benefited from conversations with Nabil Al-Najjar, Gerhard Glomm, Leonid Hurwicz, James Jordan, Ramon Marimon, Andrew McClennan, Ariel Rubinstein, and William Thomson.

16.
Summary We consider the problem of choosing an allocation in an economy in which there are one private good and one public good. Our purpose is to identify the class of procedures for choosing an allocation which satisfy strategy-proofness, individual rationality, no exploitation and non-bossiness. Any such procedure is a scheme of semi-convex cost sharing determined by the minimum demand principle. I wish to thank Professors Salvador Barbera, Matthew Jackson, Herve Moulin and William Thomson for their helpful suggestions and two anonymous referees for their detailed comments. Conversations with Professors Hideo Konishi, Shinji Oseto, Ken-ichi Shimomura and Stephen Ching were helpful. This work is supported by the Japan Economic Research Foundation and Research Grants PB89-0294 and PB89-0075 from the Direcion General de Investigacion Cientifica y Tecnica, Spanish Ministry of Education.

17.
Summary This paper examines the efficiency properties of competitive equilibrium in an economy with adverse selection. The agents (firms and households) in this economy exchange contracts, which specify all the relevant aspects of their interaction. Markets are assumed to be complete, in the sense that all possible contracts can, in principle, be traded. Since prices are specified as part of the contract, they cannot be used as free parameters to equate supply and demand in the market for the contract. Instead, equilibrium is achieved by adjusting the probability of trade. If the contract space is sufficiently rich, it can be shown that rationing will not be observed in equilibrium. A further refinement of equilibrium is proposed, restricting agents' beliefs about contracts that are not traded in equilibrium. Incentive-efficient and constrained incentive-efficient allocations are defined to be solutions to appropriately specified mechanism design problems. Constrained incentive efficiency is an artificial construction, obtained by adding the constraint that all contracts yield the same rate of return to firms. Using this notion, analogues of the fundamental theorems of welfare economics can be proved: all refined equilibria are constrained incentive-efficient and all constrained incentive-efficient allocations satisfying some additional conditions can be decentralized as refined equilibria. A constrained incentive-efficient equilibrium is typically not incentive-efficient, however. The source of the inefficiency is the equilibrium condition that forces all firms to earn the same rate of return on each contract. The paper's notation section defines: a finite set of outcomes; generic contracts (lotteries over outcomes); the set A of contracts and the set A_0 obtained by adding the null contract (no trade); the set S = {1, ..., |S|} of seller types; L(s), the number of type-s sellers; M, the number of buyers; the sellers' and buyers' utility functions and their extensions to contracts; the allocations f and g of sellers and buyers; and the sellers' and buyers' trading functions. This paper has had a long gestation period, during which I have been influenced by helpful conversations with many persons, by their work, or both. Among those who deserve special mention are Martin Hellwig, Roger Myerson, Edward Prescott, Robert Townsend and Yves Younés. Earlier versions were presented to the NBER/CEME Conference on Decentralization at the University of Toronto and the NBER Conference on General Equilibrium at Brown University. I would like to thank John Geanakoplos, Walter Heller, Andreu Mas Colell, Michael Peters, Michel Poitevin, Lloyd Shapley, John Wooders, Nicholas Yannelis and an anonymous referee for their helpful comments and especially Robert Rosenthal for his careful reading of two drafts. The financial support of the National Science Foundation under Grant No. 912202 is gratefully acknowledged.

18.
The smile effect is an empirical observation about the implied volatility of options with the same expiration date across different exercise prices. However, its shape has been under discussion, as it seems to depend on the option's underlying security. In this paper, which addresses a gap in the scarce empirical research on the topic, we used liquid equity options on 9 stocks traded on the London International Financial Futures and Options Exchange (LIFFE) between August 1990 and December 1991. We tested two hypotheses concerning two different phenomena: (1) the intensification of the smile as maturity approaches; and (2) the association between the smile and the volatility of the underlying stock. In order to estimate implied volatilities for unavailable exercise prices, we modelled the smile using cubic B-spline curves. We found empirical support for the smile intensification (the U-shape is more pronounced) as maturity approaches as well as when volatility rises. However, we found two major sources of disagreement with the literature on stochastic volatility models. First, as maturity approaches, the implied volatility of out-of-the-money options tends to be higher than the implied volatility of in-the-money options. Second, as the volatility of the underlying asset increases, the implied volatility of in-the-money options tends to be higher than the implied volatility of out-of-the-money options. Received: September 2001, Accepted: September 2003, JEL Classification: G13. Correspondence to: João L. C. Duque. We thank Professor Dean A. Paxson (University of Manchester), António Sousa Câmara (University of Strathclyde), Ser-Huang Poon (University of Lancaster) and the attendees of the 26th EFA Annual Conference for helpful comments on previous versions of this paper. We also want to thank two anonymous referees for their relevant comments and suggestions. Financial support granted by the Fundação para a Ciência e a Tecnologia (FCT) and the Programa Praxis XXI is gratefully acknowledged.
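To show the kind of interpolation the paper describes, here is a generic sketch using SciPy's cubic B-spline routine with made-up strike and volatility numbers rather than LIFFE data:

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# Hypothetical implied volatilities for one expiry across exercise prices.
strikes = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
implied_vol = np.array([0.32, 0.27, 0.24, 0.26, 0.31])   # U-shaped smile

# Cubic (k=3) B-spline through the observed (strike, vol) points.
smile = make_interp_spline(strikes, implied_vol, k=3)

# Estimate implied volatility at exercise prices with no traded option.
for k_new in (85.0, 95.0, 105.0, 115.0):
    print(k_new, round(float(smile(k_new)), 4))
```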

19.
The computer revolution took very long to pay off in productivity growth in the computer-using sectors. The relative wage of skilled workers, however, has risen sharply from the early days of the computer revolution onward. As skilled workers' wages reflect their productivity, the two observations together pose a puzzle. This paper provides a micro-based explanation for the long diffusion period of the computer revolution. The general equilibrium model of growth zooms in on the research process and provides an explanation for sluggish growth with booming relative wages of the skilled. Technological progress in firms is driven by research aimed at improving the production technology (innovation) and by assimilation of ideas or principles present outside the firm (learning). A new General Purpose Technology (GPT) like the computer revolution generates an initial slowdown in economic growth and an increase in the skill premium. Acknowledgement I am indebted to Theo van de Klundert for suggestions and encouragement. Suggestions by Jan Boone, Bas Jacobs, Patrick Francois, Henri de Groot, Lex Meijdam, Niek Nahuis, Sjak Smulders, Harald Uhlig and anonymous referees have contributed to the paper.

20.
This paper notes that Gini's concept of Transvariazione corresponds to the idea of overlapping of distributions. It shows that this concept is implicit in some measures of income inequality such as the Pietra and Gini Index and is at the basis of various measures of distance between distributions. An empirical illustration is provided, based on Israeli data for the period 1978–1994.
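As a concrete gloss on the link between inequality and overlapping (generic formulas with made-up income samples, not the Israeli data), the Pietra index is half the relative mean deviation, and the overlap of two distributions can be measured by summing the pointwise minimum of their empirical densities:

```python
import numpy as np

def pietra_index(incomes):
    """Pietra (Robin Hood) index: half the relative mean deviation."""
    x = np.asarray(incomes, dtype=float)
    return np.abs(x - x.mean()).sum() / (2 * x.size * x.mean())

def overlap_coefficient(sample_a, sample_b, bins=50):
    """Share of probability mass common to two empirical distributions."""
    lo = min(sample_a.min(), sample_b.min())
    hi = max(sample_a.max(), sample_b.max())
    pa, _ = np.histogram(sample_a, bins=bins, range=(lo, hi))
    pb, _ = np.histogram(sample_b, bins=bins, range=(lo, hi))
    pa = pa / pa.sum()
    pb = pb / pb.sum()
    return float(np.minimum(pa, pb).sum())

rng = np.random.default_rng(3)
group_a = rng.lognormal(mean=9.0, sigma=0.5, size=5000)   # hypothetical incomes
group_b = rng.lognormal(mean=9.3, sigma=0.6, size=5000)

print("Pietra index, group A:", round(pietra_index(group_a), 3))
print("Overlap of A and B   :", round(overlap_coefficient(group_a, group_b), 3))
```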
