Similar Documents
Retrieved 20 similar documents (search time: 453 ms)
1.
Smeral, Egon. Empirica (1978) 5(2): 243–277
Summary. The present study analyses the simultaneous problem of consumption and saving by means of a consistent demand system; for this purpose the linear expenditure system (LES), developed by R. Stone, has been modified and used as a methodological base. For the sake of operationality, saving takes the character of a consumer good and becomes an argument of the utility function. The usual neoclassical assumption of utility maximization allows the derivation of a linear expenditure system of consumption and saving (LESSC) when prices and income are given. The simultaneous LESSC model has notable weaknesses, however: the assumption of certainty, the static character of the model, and the disregard for major savings motives and for private expenditure on homebuilding led to poor elasticity estimates. The assumption of directly additive utility functions furthermore causes collinearity between income and price elasticities, so that the meaning of the derived elasticities is greatly reduced.

The income elasticities derived from the LESSC are positive throughout but show remarkable variance. The calculation of the Friedman bias demonstrates a rather strong bias due to the assumption of certainty. A modification resulted in income elasticities of private consumption and saving of around 0.93 (unmodified: 0.88) and 1.41 (unmodified: 1.76), respectively. The demand for consumption goods of great necessity was income-inelastic, whereas the demand for goods less important to survival was income-elastic. An analysis of income elasticities of the disaggregated system, and of the relation between transitory components of consumption and income in Austria, suggests that unexpected changes in income are reflected not only in saving but also in changes of the consumption structure.

The respective price elasticities are all negative and smaller than 1 in absolute value. Lower price elasticities were measured for less important consumption goods, and higher ones for easily substitutable goods. Marked cross-price elasticities could only be discovered for clothing and food products. Generally it can be said that an increase in the prices of goods of daily need both hits expenditure on easily substitutable consumption goods and causes dissaving. A comparison with the elasticities calculated through OLS shows greater reliability of the LESSC elasticities as far as data of differing aggregation levels are concerned.

"Mécanique Sociale may one day take her place along with Mécanique Celeste, throned each upon the double-sided height of one maximum principle, the supreme pinnacle of moral as of physical science."
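Stone's linear expenditure system, which the study above modifies, can be illustrated directly: spending on each good is the cost of a committed quantity plus a fixed share of "supernumerary" income. The prices, committed quantities, and budget shares below are made-up illustrative values, not estimates from the paper:

```python
def les_expenditures(prices, gamma, beta, income):
    """Linear expenditure system: spending on good i is
    p_i * gamma_i + beta_i * supernumerary income, where supernumerary
    income is what remains after buying the committed bundle gamma.
    The marginal budget shares beta_i must sum to 1."""
    committed = sum(p * g for p, g in zip(prices, gamma))
    super_income = income - committed
    return [p * g + b * super_income
            for p, g, b in zip(prices, gamma, beta)]

def income_elasticity(prices, gamma, beta, income, i):
    """Income elasticity of demand for good i: beta_i * y / expenditure_i."""
    e = les_expenditures(prices, gamma, beta, income)
    return beta[i] * income / e[i]

# Illustrative three-good example: food, clothing, and saving treated
# as a consumer good, as in the LESSC.
prices = [1.0, 2.0, 1.0]
gamma = [10.0, 2.0, 0.0]   # committed quantities (none for saving)
beta = [0.3, 0.3, 0.4]     # marginal budget shares, sum to 1
income = 100.0

spend = les_expenditures(prices, gamma, beta, income)
print(spend)  # expenditures exhaust income
print(income_elasticity(prices, gamma, beta, income, 0))  # < 1: a necessity
print(income_elasticity(prices, gamma, beta, income, 2))  # > 1: saving is income-elastic
```

Goods with positive committed quantities come out income-inelastic and saving income-elastic, matching the qualitative pattern (0.93 vs. 1.41) the abstract reports.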

2.
Power, luck and the right index (total citations: 1; self-citations: 0; by others: 1)
Summary. We have pointed out the theoretical drawbacks of the traditional indices for measuring a priori voting power inasmuch as they imply treating the coalition value as a private good. This criticism caused us to view the coalition outcome as a public good. From this perspective, and from additional considerations with respect to power, luck, and decisiveness, we obtained a description of the characteristics of an adequate measure of a priori voting power. These characteristics were found to be fulfilled by an index presented by Holler (1978). Through the above analysis this index has received its theoretical justification. An independent view of this index was then provided by means of an axiomatic characterization. This characterization makes possible abstract comparison of the index with previously established private-good indices. While we have restricted our attention to simple games, the index presented can be generalized to provide a value on games in characteristic function form. We leave this topic for future consideration.
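The Holler (1978) index the abstract endorses is commonly known as the Public Good Index: each player's score is the number of minimal winning coalitions containing that player, normalized across players. A minimal sketch for weighted majority games (the weights and quota below are made-up illustrative values):

```python
from itertools import combinations

def public_good_index(weights, quota):
    """Holler's Public Good Index for a weighted majority game.

    A coalition is winning if its total weight meets the quota, and
    minimal winning if every member is critical (removing any one
    member makes it losing). Returns normalized per-player scores.
    """
    n = len(weights)
    counts = [0] * n
    for r in range(1, n + 1):
        for coal in combinations(range(n), r):
            total = sum(weights[i] for i in coal)
            if total < quota:
                continue
            # minimal winning: dropping any member breaks the quota
            if all(total - weights[i] < quota for i in coal):
                for i in coal:
                    counts[i] += 1
    s = sum(counts)
    return [c / s for c in counts]

# Example: weights (3, 2, 2) with quota 4. The minimal winning
# coalitions are {1,2}, {1,3}, {2,3}, so all three players score equally
# despite player 1's larger weight.
print(public_good_index([3, 2, 2], 4))
```

Because only minimal winning coalitions count, the index reflects the public-good view of the coalition outcome: surplus members of non-minimal coalitions get no credit.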

3.
In general, synergies across license valuations complicate the auction design process. Theory suggests that a simple (i.e., non-combinatorial) auction will have difficulty in assigning licenses efficiently in such an environment. This difficulty increases with increases in fitting complexity. In some environments, bidding may become mutually destructive. Experiments indicate that a properly designed combinatorial auction is superior to a simple auction in terms of economic efficiency and revenue generation in bidding environments with a low amount of fitting complexity. Concerns that a combinatorial auction will cause a threshold problem are not borne out when bidders for small packages can communicate.
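The winner-determination step that distinguishes a combinatorial auction from a simple one can be sketched as a set-packing problem: select the revenue-maximizing set of non-overlapping package bids. The bids below are hypothetical, and real auctions use integer programming rather than this brute-force enumeration:

```python
from itertools import combinations

def winner_determination(bids):
    """Brute-force winner determination for a combinatorial auction.

    bids: list of (package, amount) pairs where package is a frozenset
    of license labels. Returns (best_revenue, chosen_bids), maximizing
    total revenue over sets of pairwise-disjoint packages.
    """
    best_revenue, best_combo = 0.0, []
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            packages = [pkg for pkg, _ in combo]
            # disjointness check: no license may be sold twice
            if len(frozenset().union(*packages)) == sum(map(len, packages)):
                revenue = sum(amt for _, amt in combo)
                if revenue > best_revenue:
                    best_revenue, best_combo = revenue, list(combo)
    return best_revenue, best_combo

# Synergy example: the package {A, B} is worth more to one bidder than
# A and B are worth separately to others.
bids = [(frozenset("A"), 5.0), (frozenset("B"), 5.0), (frozenset("AB"), 12.0)]
revenue, winners = winner_determination(bids)
print(revenue)  # 12.0: the package bid wins
```

With the package bid lowered below the sum of the singleton bids (e.g. 9.0), the two small bidders win instead; whether they can coordinate to beat a large package bid is exactly the threshold problem the abstract mentions.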

4.
The experimental treatments analysed in this paper are simple in that there is a unique Nash equilibrium in which each player has a dominant strategy. However, the data show quite clearly that subjects do not always choose this strategy. In fact, when the dominant strategy is not a focal outcome it does not even describe the average decision adequately. It is shown that average individual decisions are best described by a decision-error model based on a censored distribution, as opposed to the truncated regression model typically used in similar studies. Moreover, in the treatments where the dominant strategy is not focal, dynamics are important, with average subject decisions initially corresponding to the focal outcome and then adjusting towards the Nash prediction. Overall, 66.7% of subjects are consistent with payoff maximization, 27.8% are consistent with an alternative preference maximization, and 5.6% are random.
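The censored-versus-truncated distinction the abstract turns on is easy to see in simulation: censoring piles observations up at the bounds of the decision space, while truncation discards them. All numbers below are made-up illustrative values, not the experiment's data:

```python
import random

random.seed(0)

LOW, HIGH = 0.0, 100.0  # bounds of the decision space

def mean(xs):
    return sum(xs) / len(xs)

# Latent desired decisions: noisy draws around a focal point near the bound.
latent = [random.gauss(95.0, 15.0) for _ in range(10_000)]

# Censored: out-of-range decisions are recorded at the bound itself.
censored = [min(max(x, LOW), HIGH) for x in latent]

# Truncated: out-of-range decisions are dropped from the sample entirely.
truncated = [x for x in latent if LOW <= x <= HIGH]

print(mean(latent), mean(censored), mean(truncated))
# A spike of observations sits exactly at HIGH only under censoring:
print(censored.count(HIGH), truncated.count(HIGH))
```

Both observed means understate the latent mean, but only the censored sample shows the mass point at the boundary; fitting a truncated model to boundary-censored decisions therefore misreads the data, which is the paper's econometric point.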

5.
This paper examines the tie between the popular black-box neoclassical quantity-setting firm under demand uncertainty and a firm with a rudimentary but explicit employment-relation organizational structure in which workers are offered fixed wages for following management directives. Surprisingly, the quantity-setting firm unambiguously mimics optimal employment-relation hiring and work rules when the contract is incentive-compatible ex post. The attitude toward risk is shown to be the key determinant of whether or not the quantity-setting firm replicates the optimal employment-relation contract when ex post incentive compatibility is relaxed.

6.
The paper investigates a climate-economy model with an iso-elastic welfare function in which one parameter measures relative risk-aversion and a distinct parameter measures resistance to intertemporal substitution. We show both theoretically and numerically that climate policy responds differently to variations in the two parameters; in particular, higher values of one parameter combined with lower values of the other lead to increased emissions control. We also argue that climate-economy models based on intertemporal expected-utility maximization, i.e. models in which the two parameters are forced to coincide, may misinterpret the sensitivity of climate policy to risk-aversion.
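The symbols for the two parameters were lost in extraction. A standard way to disentangle them, used widely in the recursive-utility literature (an assumption here, not necessarily the paper's exact specification), is an Epstein-Zin welfare function:

```latex
U_t = \left[ (1-\beta)\, c_t^{\,1-1/\psi}
      + \beta \left( \mathbb{E}_t\!\left[ U_{t+1}^{\,1-\gamma} \right] \right)^{\frac{1-1/\psi}{1-\gamma}}
      \right]^{\frac{1}{1-1/\psi}}
```

Here \(\gamma\) is relative risk-aversion, \(\psi\) is the elasticity of intertemporal substitution (so \(1/\psi\) is the resistance to intertemporal substitution), and \(\beta\) is the discount factor. Intertemporal expected-utility maximization is the special case \(\gamma = 1/\psi\), which is exactly the restriction the abstract warns may misattribute policy sensitivity to risk-aversion.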

7.
Summary. This paper views uncertainty and economic fluctuations as being primarily endogenous and internally propagated phenomena. The most important endogenous uncertainty examined in this paper is price uncertainty, which arises when agents do not have structural knowledge and are compelled to make decisions on the basis of their beliefs. We assume that agents adopt Rational Beliefs as in Kurz [1994a]. The trading of endogenous uncertainty is accomplished by using Price Contingent Contracts (PCC) rather than the Arrow-Debreu state-contingent contracts. The paper provides a full construction of the price state space, which requires expanding the exogenous state space to include the state of beliefs. This construction is central to the analysis of equilibrium with endogenous uncertainty, and the paper provides an existence theorem for a Rational Belief Equilibrium with PCC. It shows how the PCC complete the markets for trading endogenous uncertainty and lead to an allocation which is Pareto optimal. The paper also demonstrates that endogenous uncertainty is generically present in this new equilibrium.

This research was supported in part by the Fondazione Eni Enrico Mattei of Milan, Italy, and by the National Science Council of Taiwan. The authors thank Carsten K. Nielsen for valuable suggestions.

8.
Summary. A number of authors have used formal models of computation to capture the idea of bounded rationality in repeated games. Most of this literature has used computability by a finite automaton as the standard. A conceptual difficulty with this standard is that the decision problem is not closed. That is, for every strategy implementable by an automaton there is some best response implementable by an automaton, but there may not exist any algorithm for finding such a best response that can be implemented by an automaton. However, such algorithms can always be implemented by a Turing machine, the most powerful formal model of computation. In this paper we investigate whether the decision problem can be closed by adopting Turing machines as the standard of computability. The answer we offer is negative. Indeed, for a large class of discounted repeated games (including the repeated Prisoner's Dilemma) there exist strategies implementable by a Turing machine for which no best response is implementable by a Turing machine.

The work was begun while Nachbar was a visitor at the Center for Mathematical Studies in Economics and Management Science at Northwestern University; he is grateful for their hospitality. We are also grateful to Robert Anderson and Neil Gretsky and to seminar audiences at UCLA for useful comments, and to the National Science Foundation and the UCLA Academic Senate Committee on Research for financial support. This paper is an outgrowth of work reported in "Learning and Computability in Discounted Supergames."

9.
Convergence empirics across economies with (some) capital mobility (total citations: 4; self-citations: 3; by others: 4)
This paper uses a model of growth and imperfect capital mobility across multiple economies to characterize the dynamics of (cross-country) income distributions. This allows convenient study of the convergence hypothesis, and reveals, where appropriate, polarization and clumping within subgroups. The data show little cross-country convergence; instead, the important features are persistence, immobility, and polarization, exemplified by convergence club or twin peaks dynamics.
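The "twin peaks" dynamics described above are typically studied via a Markov transition matrix over income classes, whose ergodic (long-run) distribution can pile up at both tails. The transition matrix below is made-up for illustration, not estimated from the paper's data:

```python
def ergodic_distribution(P, iters=10_000):
    """Long-run distribution of a finite Markov chain by power iteration.
    P[i][j] is the probability of moving from income class i to class j."""
    n = len(P)
    dist = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return dist

# Three classes: poor, middle, rich. The middle class is unstable and
# drains toward both tails, producing polarization ("twin peaks").
P = [
    [0.90, 0.10, 0.00],  # poor mostly stay poor
    [0.40, 0.20, 0.40],  # middle polarizes toward both tails
    [0.00, 0.10, 0.90],  # rich mostly stay rich
]
dist = ergodic_distribution(P)
print(dist)  # bimodal: mass concentrates at the two tails
```

For this matrix the ergodic distribution is (4/9, 1/9, 4/9): persistence at the tails plus an emptying middle, the qualitative picture the abstract reports.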

10.
The ground-zero premise (so to speak) of the biological sciences is that survival and reproduction are the basic, continuing, inescapable problems for all living organisms; life is at bottom a survival enterprise. It follows that survival is the paradigmatic problem for human societies as well; it is a prerequisite for any other, more exalted objectives. Although the term adaptation is also familiar to social scientists, until recently it has been used only selectively, and often very imprecisely. Here a more rigorous and systematic approach to the concept of adaptation is proposed in terms of basic needs. The concept of basic human needs has a venerable history, tracing back at least to Plato and Aristotle. Yet the development of a formal theory of basic needs has lagged far behind. The reason is that the concept of objective, measurable needs is inconsistent with the theoretical assumptions that have dominated economic and social theory for most of this century, namely value relativism and cultural determinism. Nevertheless, there have been a number of efforts over the past 30 years to develop more universalistic criteria for basic needs, both for use in monitoring social well-being (social indicators) and for public policy formulation. Here I will advance a strictly biological approach to operationalizing the concept of basic needs. It is argued that much of our economic and social life (and the motivations behind our revealed preferences and subjective utility assessments), not to mention the actions of modern governments, is either directly or indirectly related to the meeting of our basic survival needs. Furthermore, these needs can be specified to a first approximation and supported empirically to varying degrees, with the obvious caveat that there are major individual and contextual variations in their application.

Equally important, complex human societies generate an array of instrumental needs which, as the term implies, serve as intermediaries between our primary needs and the specific economic, cultural and political contexts within which those needs must be satisfied. An explicit framework of Survival Indicators, including a profile of Personal Fitness and an aggregate index of Population Fitness, is briefly elucidated. Finally, it is suggested that a basic needs paradigm could provide an analytical tool (a "biologic") for examining more closely the relationship between our social, economic and political behaviors and institutions and their survival consequences, as well as providing a predictive tool of some value.

11.
The computer revolution took very long to pay off in productivity growth in the computer-using sectors. The relative wage of skilled workers, however, has risen sharply from the early days of the computer revolution onward. As skilled workers' wages reflect their productivity, the two observations together pose a puzzle. This paper provides a micro-based explanation for the long diffusion period of the computer revolution. The general equilibrium model of growth zooms in on the research process and provides an explanation for sluggish growth with booming relative wages of the skilled. Technological progress in firms is driven by research aimed at improving the production technology (innovation) and by assimilation of ideas or principles present outside the firm (learning). A new General Purpose Technology (GPT) like the computer revolution generates an initial slowdown in economic growth and an increase in the skill premium.

Acknowledgement: I am indebted to Theo van de Klundert for suggestions and encouragement. Suggestions by Jan Boone, Bas Jacobs, Patrick Francois, Henri de Groot, Lex Meijdam, Niek Nahuis, Sjak Smulders, Harald Uhlig and anonymous referees have contributed to the paper.

12.
We examine behavior in a Coasian contracting game with incomplete information. Experimental subjects propose contracts, while automaton property right holders or robot players with uncertain preferences respond to those proposals. The most common pattern of proposals observed in these games results in too many agreements and, in some games, payoffs that are stochastically dominated by those resulting from rational proposals (which imply fewer agreements). In this sense, we observe a winner's curse similar to that observed in bidding games under incomplete information, such as the common value auction (Kagel, J.H. and Levin, D. (1986) American Economic Review. 76, 894–920) and the takeover game (Samuelson, W. and Bazerman, M.H. (1985) In Research in Experimental Economics, Vol. 3. JAI Press, Greenwich, pp. 105–137; Ball, S.B., Bazerman, M.H., and Carroll, J.S. (1990) Organizational Behavior and Human Decision Processes. 48, 1–22; Holt, C. and Sherman, R. (1994) American Economic Review. 84, 642–652). While the naïve model of behavior nicely predicts the winner's curse in those previous bidding games, it does not do so here. Instead, an alternative model we call the guarantor model explains the anomalous behavior best. Hence, we suggest this is a new variant of the winner's curse.
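The winner's curse in the takeover game cited above (Samuelson and Bazerman, 1985) can be reproduced in a few lines under the standard textbook parameterization (not this paper's design): the firm's value v is uniform on [0, 100], it is worth 1.5v to the buyer, and the seller accepts any bid b ≥ v. Conditional on acceptance the expected value is only b/2, so every positive bid loses money on average:

```python
import random

def expected_profit(bid, trials=200_000, seed=1):
    """Buyer's average profit from bidding `bid` in the takeover game.
    v ~ U[0, 100]; the seller accepts iff bid >= v; the firm is worth
    1.5 * v to the buyer, who pays the bid on acceptance."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        v = rng.uniform(0.0, 100.0)
        if bid >= v:  # acceptance selects the low-value firms
            total += 1.5 * v - bid
    return total / trials

# Analytically, E[profit] = -bid**2 / 400 < 0 for any bid in (0, 100]:
# conditional on acceptance E[v] = bid / 2, so the conditional profit is
# 1.5 * bid / 2 - bid = -bid / 4, and acceptance occurs with prob. bid / 100.
for bid in (20, 60, 100):
    print(bid, expected_profit(bid))  # all negative
```

The adverse-selection logic is the naïve model's prediction for those earlier games; the abstract's point is that this mechanism does not account for the pattern in the Coasian contracting game, motivating the authors' guarantor model.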

13.
In The Sensory Order, Friedrich A. Hayek describes the human mind as an apparatus of classification that evolves through experience and that reaches decisions by modeling the alternative courses of action that are available to it. Hayek's mechanistic conception of mind argues against the possibility of central planning and against the cogency of any rule that denigrates subjective decision making by employers or other economic agents. As implied by Gödel's proof, no brain, human or mechanical, can ever be sufficiently complex to explain itself. There will therefore always be certain knowledge and rules that cannot be articulated to the satisfaction of a central planner or tribunal.

14.
Convention, Social Order, and the Two Coordinations (total citations: 2; self-citations: 0; by others: 2)
The word coordination has two meanings, and these meanings are often conflated. One meaning, associated with Thomas Schelling, is seen in situations like choosing whether to drive on the left or the right; the drivers must coordinate to each other's behavior. The other meaning, associated with Friedrich Hayek, means that a concatenation of activities is arranged so as to produce good results. Along with the Schelling sense of coordination comes the notion of convention, such as driving on the right. Some conventions are consciously designed; others emerge without design (or are emergent). Along with the Hayek sense of coordination comes the notion of social order. Some social orders, such as the skeleton of activities within the firm or within the hypothetical socialist economy, are consciously planned. Other social orders, such as the catallaxy of the free society, function without central planning (or are spontaneous). Distinguishing between the two coordinations (and, in parallel fashion, between convention and social order) clarifies thinking and resolves some confusions that have arisen in discussions of coordination and spontaneous order. The key distinctions are discussed in the context of the thought of, on the one hand, Menger, Schelling, David Lewis, and the recent path-dependence theorists, and, on the other hand, Smith, Hayek, Polanyi, Coase, and the modern Austrian economists. The paper concludes with a typology that encompasses the several distinctions.

15.
We estimate the respective contributions of institutions, geography, and trade in determining income levels around the world, using recently developed instrumental variables for institutions and trade. Our results indicate that the quality of institutions trumps everything else. Once institutions are controlled for, conventional measures of geography have at best weak direct effects on incomes, although they have a strong indirect effect by influencing the quality of institutions. Similarly, once institutions are controlled for, trade is almost always insignificant, and often enters the income equation with the wrong (i.e., negative) sign. We relate our results to recent literature, and where differences exist, trace their origins to choices on samples, specification, and instrumentation.
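The instrumental-variables strategy the abstract relies on can be sketched with a two-stage least squares estimator. The data below are synthetic, and the confounding structure and coefficients are invented for illustration, not drawn from the paper:

```python
import random

def ols_slope(x, y):
    """Simple OLS slope of y on x (with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

def two_stage_least_squares(z, x, y):
    """2SLS with one instrument z for one endogenous regressor x:
    first stage regresses x on z, second stage regresses y on the
    first-stage fitted values (equivalently, the Wald/IV ratio)."""
    pi = ols_slope(z, x)
    x_hat = [pi * zi for zi in z]  # fitted values (up to a constant)
    return ols_slope(x_hat, y)

# Synthetic world: institutional quality x is endogenous because it
# shares the unobserved shock u with income y; the instrument z moves
# x but affects y only through x (the exclusion restriction).
rng = random.Random(42)
n = 50_000
z = [rng.gauss(0, 1) for _ in range(n)]
u = [rng.gauss(0, 1) for _ in range(n)]  # confounder
x = [zi + ui + rng.gauss(0, 1) for zi, ui in zip(z, u)]
true_beta = 1.0
y = [true_beta * xi + 2.0 * ui + rng.gauss(0, 1) for xi, ui in zip(x, u)]

print(ols_slope(x, y))                   # biased upward by the confounder
print(two_stage_least_squares(z, x, y))  # close to the true 1.0
```

The exclusion restriction carries all the weight here, which is why the abstract's conclusions hinge on the quality of the instruments for institutions and trade.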

16.
Collective action can take place at a plurality of levels. It has to be based on a constitution which defines the basic rules of interaction. Here, we are concerned with the problem of the constitutional setting of bottom-up formal institutions with a club nature. The pressure to improve the efficiency of services pushes local administrations to co-ordinate to produce public goods. This process has stimulated the birth of different forms of agencies or private companies with a club nature. The aim of this paper is to discuss the effects of institutional interdependence on the efficiency of this kind of collective action. In order to shed some light on this problem, the paper first discusses the problem of the relativity of efficiency to the institutional setting. A framework of analysis is then discussed to identify the main factors affecting collective action. Finally some evidence will be provided by a comparative institutional analysis performed on some case studies concerning local associational forms among communes in north-eastern Italy.

17.
The paper compares the relative efficiency of country models in the relationship between finance and investment. Results, confirmed under three different panel data estimators (the Arellano-Bond GMM method and random- and fixed-effects estimates), suggest that: (i) the UK's thick market reduces informational asymmetries for large firms and for those firms providing good signals to shareholders; (ii) Japanese vertical (between firms and banks) and horizontal (among firms) integration almost eliminates financial constraints (the horizontal integration effect) and equates agency costs across firms (the vertical integration effect). These results are consistent with the short-termist hypothesis, which assumes that the Japanese economic system can process information more efficiently, reducing managerial myopic behaviour and thereby producing positive effects on long-term growth.
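Of the three panel estimators the abstract lists, the fixed-effects (within) estimator is the simplest to show: demean each firm's series to sweep out its firm-specific effect, then run OLS on the demeaned data. Everything below is a synthetic illustration, not the paper's dataset:

```python
import random

def within_estimator(groups, x, y):
    """Fixed-effects (within) slope: demean x and y within each group
    to remove group-specific intercepts, then OLS on the demeaned data."""
    sums = {}
    for g, xi, yi in zip(groups, x, y):
        sx, sy, c = sums.get(g, (0.0, 0.0, 0))
        sums[g] = (sx + xi, sy + yi, c + 1)
    means = {g: (sx / c, sy / c) for g, (sx, sy, c) in sums.items()}
    num = den = 0.0
    for g, xi, yi in zip(groups, x, y):
        mx, my = means[g]
        num += (xi - mx) * (yi - my)
        den += (xi - mx) ** 2
    return num / den

# Synthetic firm panel: each firm has its own intercept alpha that is
# correlated with x, which biases pooled OLS but not the within fit.
rng = random.Random(7)
groups, x, y = [], [], []
for g in range(200):                   # 200 firms
    alpha = rng.gauss(0, 2)            # firm fixed effect
    for _ in range(10):                # 10 periods each
        xi = alpha + rng.gauss(0, 1)   # regressor correlated with the effect
        groups.append(g)
        x.append(xi)
        y.append(0.5 * xi + alpha + rng.gauss(0, 0.5))

print(within_estimator(groups, x, y))  # close to the true slope 0.5
```

Arellano-Bond GMM extends this idea to dynamic panels by first-differencing and instrumenting with lagged levels, which the within estimator above deliberately omits.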

18.
Summary. Suppose Y_n is a sequence of i.i.d. random variables taking values in Y, a complete, separable, non-finite metric space. The probability law, indexed by a parameter, is unknown to a Bayesian statistician who holds a prior over the parameters and observes this process. Generalizing Freedman [8], we show that generically (i.e., for a residual family of parameter-prior pairs) the posterior beliefs do not weakly converge to a point mass at the true parameter. Furthermore, for every open set G of parameters, generically, the Bayesian will attach probability arbitrarily close to one to G infinitely often. The above result is applied to a two-armed bandit problem with geometric discounting where arm k yields an outcome in a complete, separable metric space Y_k. If the infimum of the possible rewards from playing arm k is less than the infimum from playing arm k', then arm k is (generically) chosen only finitely often. If the infima of the rewards are equal, then both arms are played infinitely often.
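The negative result above lives in infinite-dimensional outcome spaces; in the familiar finite-dimensional case the posterior does concentrate at the truth. A toy Beta-Bernoulli contrast (a standard illustration of Bayesian updating, not the paper's construction):

```python
import random

def posterior_after(n_flips, true_p, seed=3):
    """Beta(1, 1) prior on a coin's bias; return the posterior mean and
    variance after observing n_flips tosses of a coin with bias true_p.
    In this finite-dimensional setting the posterior concentrates at the
    truth, unlike the generic infinite-dimensional failure shown above."""
    rng = random.Random(seed)
    a, b = 1.0, 1.0  # Beta(1, 1) = uniform prior
    for _ in range(n_flips):
        if rng.random() < true_p:
            a += 1.0  # success observed
        else:
            b += 1.0  # failure observed
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1.0))
    return mean, var

m_small, v_small = posterior_after(10, 0.7)
m_big, v_big = posterior_after(10_000, 0.7)
print(m_big, v_big)  # mean near 0.7, variance shrinking toward 0
```

The paper's point is precisely that this reassuring concentration is not generic once the outcome space is a non-finite metric space, which is what drives the bandit results about arms being abandoned or played forever.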

19.
Patrick Gunning refuses to acknowledge the most salient arguments against the Chicago law-and-economics case for negligence made by Austrian economists. Because of this, he makes the same errors in his defense of Coase that permeate the Chicago paradigm. In particular, his defense of Coasean-type analysis completely ignores Austrian cost theory, i.e., the view that all economically relevant costs are strictly subjective and therefore conceptually impossible to measure. He also fails to grasp the implications of disequilibrium market-process theory for the use of any kind of least-cost-avoider rule in the economic analysis of the law. As a result, Gunning's defense of Coase suffers from the same pretense of knowledge as the analysis that he is defending.

20.
No abstract. Based on a lecture delivered to the Nationalökonomische Gesellschaft in Vienna on 27 April 1954.
