Similar documents
20 similar documents found (search time: 62 ms)
1.
In this paper, by using a combination of long-run and short-run restrictions, we identify a small structural VECM which includes inflation, unemployment and the federal funds rate, and we study the dynamic interactions at different frequencies among these variables. Our results show that: (a) in accordance with the traditional view of economic fluctuations, aggregate demand shocks and monetary policy shocks push inflation and unemployment in opposite directions in the short run; (b) the permanent supply shock explains the long-run movement of inflation and unemployment. These conclusions are at odds with the prediction of “natural-rate” models but are consistent with the idea of a propagation mechanism which links productivity shocks to inflation and unemployment at medium and low frequencies. Thus, with respect to some recent studies (e.g. Beyer and Farmer, ECB Working Paper 121, 2002, and Ireland, J Monet Econ 44:279–291, 1999), we offer a different interpretation of the low-frequency comovements between inflation and unemployment characterizing the US economy in recent decades.
Antonio Ribba
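The structural identification in the paper combines long-run and short-run restrictions; as a rough illustration of the reduced-form side only, the Python sketch below estimates a VECM on the three series with statsmodels and inspects orthogonalized impulse responses. The file name, column names, and lag/rank choices are placeholders, not the authors' specification.

```python
# A minimal sketch (not the authors' identification scheme): a reduced-form
# VECM on inflation, unemployment and the federal funds rate, estimated with
# statsmodels. The CSV file name and column names are placeholders.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Hypothetical quarterly data with columns: inflation, unemployment, fedfunds
data = pd.read_csv("us_macro_quarterly.csv", index_col=0, parse_dates=True)
endog = data[["inflation", "unemployment", "fedfunds"]]

# Choose the cointegration rank with Johansen's trace test
rank_test = select_coint_rank(endog, det_order=0, k_ar_diff=2, method="trace")
print(rank_test.summary())

# Fit the VECM at the selected rank and look at orthogonalized impulse
# responses; the structural long-run/short-run restrictions of the paper
# would require additional identification steps not shown here.
model = VECM(endog, k_ar_diff=2, coint_rank=rank_test.rank, deterministic="ci")
res = model.fit()
irf = res.irf(periods=20)
irf.plot(orth=True)
```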

2.
We explore the implications of the farsightedness assumption on the conjectures of players in a coalitional Great Fish War model with symmetric players, derived from the seminal model of Levhari and Mirman (Bell J Econ 11:649–661, 1980). The farsightedness assumption for players in a coalitional game acknowledges the fact that a deviation by a single player will lead to the formation of a new coalition structure as the result of possibly successive moves by her rivals to improve their payoffs. It departs from mainstream game theory in that it relies on so-called rational conjectures, as opposed to the traditional Nash conjectures formed by players about the behavior of their rivals. For values of the biological parameter and the discount factor more plausible than those used in the current literature, the farsightedness assumption predicts a wide scope for cooperation in non-trivial coalitions, sustained by credible threats of successive deviations that defeat the shortsighted payoff of any prospective deviator. Compliance or deterrence of deviations may also be addressed by acknowledging that information on the fish stock or on the catch policies actually implemented may be available only with a delay (dynamic farsightedness). In that case, the requirements are stronger and the sizes and number of possible farsighted stable coalitions are different. In the sequential-move version, which could mimic some characteristics of fishery models, the results are no less appealing, even if the assumption of a dominant player or dominant coalition with a first-mover advantage provides a case for cooperation with the traditional Nash conjectures.

3.
Pelikan (J Evol Econ 21:341–366, 2011) develops an interesting conceptual framework that adds to prior work on generalised Darwinism. Despite claims to the contrary, we show that it is similar to the approach developed by Hodgson and Knudsen (J Evol Econ 16(4):343–366, 2006a; J Econ Behav Organ 75(1):12–24, 2010), Aldrich et al. (J Evol Econ 18(5):577–596, 2008) and others. Pelikan also mischaracterises the Hodgson–Knudsen position on Lamarckism. We show why the term is misleading (rather than strictly wrong) when applied to social evolution.

4.
Would I lie to you? On social preferences and lying aversion
This paper reinterprets the evidence on lying or deception presented in Gneezy (Am. Econ. Rev. 95(1):384–394, 2005). We show that Gneezy’s data are consistent with the simple hypothesis that people are one of two kinds: either a person will never lie, or a person will lie whenever she prefers the outcome obtained by lying over the outcome obtained by telling the truth. This implies that so long as lying induces a preferred outcome over truth-telling, a person’s decision of whether to lie may be completely insensitive to other changes in the induced outcomes, such as exactly how much she monetarily gains relative to how much she hurts an anonymous partner. We run new but broadly similar experiments to those of Gneezy in order to test this hypothesis. While we also confirm that there is an aversion to lying in our subject population, our data cannot reject the simple hypothesis described above either.
Electronic Supplementary Material: The online version of this article contains supplementary material, which is available to authorized users.

5.
The paper by Ghosh and Saha (Econ Theory 30:575–586, 2007) shows that entry can be socially excessive even if there are no scale economies. We show that exogenous cost asymmetry is responsible for this result. In a simple model with R&D investment by the more cost-efficient firm, thus creating endogenous cost asymmetry, we show that entry is socially insufficient rather than excessive if the slope of the marginal cost of R&D is not very high.

6.
In the general context of smooth two-player games, this paper shows that there is a close connection between (constant) consistent conjectures in a given game and the evolutionary stability of these conjectures. Evolutionarily stable conjectures are consistent and consistent conjectures are the only interior candidates to be evolutionarily stable. Examples are provided to illustrate the result.

7.
This paper applies conjectural variations (CVs) to a model of public good provision and shows that CVs are superior to Nash beliefs. In addition to imposing consistency, as Bresnahan does, I show that consistent conjectures (CCs) are obtained from individual payoff maximization. CCs emerge as the unique subgame perfect Nash equilibrium (NE) in a two-stage game in which beliefs are chosen in Stage 1 and quantities in Stage 2. There is an individual payoff advantage to non-Nash behavior, generating a Prisoner's Dilemma in conjectures in addition to the usual free-rider problem associated with public goods. The correct and payoff-maximizing conjecture is the unique equilibrium in an evolutionary framework against a player with Nash conjectures. The consistent conjecture equilibrium is the unique evolutionary equilibrium when both players' conjectures evolve. Hence, the NE prediction is too optimistic when players have rational conjectures.

8.
Endogenous timing in a mixed oligopoly with semipublic firms
An endogenous order of moves is analyzed in a mixed market where a firm jointly owned by the public sector and private domestic shareholders (a semipublic firm) competes with n private firms. We show that there is an equilibrium in which firms take production decisions simultaneously. This result is strikingly different from that obtained by Pal (Econ Lett 61:181–185, 1998), who shows that when a public firm competes with n private firms, all firms producing simultaneously in the same period cannot be sustained as a subgame perfect Nash equilibrium outcome. Our result differs from Pal's for two reasons: first, we consider a semipublic firm rather than a public firm; second, Pal assumes that the public firm is less efficient than the private firms, whereas in our paper all firms are equally efficient.

9.
The aim of this paper is a sensitivity analysis of the core-periphery model of ‘new economic geography’ put forward in Grazi et al. (Environ Resour Econ 38:135–153, 2007). This model comprises interregional trade, agglomeration advantages and resource (land) use or environmental externalities. Grazi et al. (2007; henceforth GBR) compare a social welfare (SW) indicator with the ecological footprint (EF) indicator for measuring the spatial sustainability of a set of land use configurations. Their main result is that the SW and the EF indicator can yield completely different rankings, and only for extreme parameterizations of environmental externalities do the rankings coincide. We adapt the model by interpreting total natural land as a resource constraint and differentiate between weak and strong sustainability. In a sensitivity analysis we show that the main results of GBR (2007) correspond to the case of weak sustainability in our adapted model version. In the case of strong sustainability, our adapted model version shows the same welfare rankings for both indicators without the extreme parameterization that is necessary to obtain the same results in the original GBR (2007) model.

10.
The main aim of our paper is to study network formation in experimental setups in discrete and continuous time. Our design is inspired by the theoretical model of network formation by Bala and Goyal (Econometrica, 68(5): 1181–1229, 2000) as well as the experiments by Callander and Plott (J. Public Econ., 89: 1469–1495, 2005) and Falk and Kosfeld (IEW Working Paper, University of Zürich, Zürich, Switzerland, No. 146, 2003). In particular, we analyze the role of star-shaped networks, which are strict Nash equilibria of the corresponding network formation game. Our experimental results show that strict Nash networks prove to be a good indicator for predicting network formation, particularly in continuous time. In explaining our results, it turns out that, among others, the complexity of coordinating on stars, inequity aversion toward unequal payoff distributions in the network, and the groups’ degrees of activity are the most important determinants of the formation of strict Nash networks.
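The abstract does not specify which variant of the Bala–Goyal model was implemented in the experiment; as an illustration only, the sketch below assumes the two-way flow connections model without decay, in which a center-sponsored star is a strict Nash network for link costs between 0 and 1, and verifies this by brute-force enumeration for a small group.

```python
# A toy check, not the experimental design itself: in the two-way flow
# connections model of Bala and Goyal (2000) without decay (an assumed
# specification), a center-sponsored star is a strict Nash network when the
# link cost satisfies 0 < c < 1. We verify this by brute force for small n.
from itertools import combinations

def reach(n, links):
    """Component size seen by each player in the undirected closure."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for (i, j) in links:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
    sizes = {}
    for v in range(n):
        sizes[find(v)] = sizes.get(find(v), 0) + 1
    return [sizes[find(v)] for v in range(n)]

def payoff(n, strategies, c, i):
    links = [(a, b) for a in range(n) for b in strategies[a]]
    return reach(n, links)[i] - c * len(strategies[i])

def all_strategies(n, i):
    others = [j for j in range(n) if j != i]
    return [set(s) for r in range(len(others) + 1) for s in combinations(others, r)]

def is_strict_nash(n, strategies, c):
    for i in range(n):
        current = payoff(n, strategies, c, i)
        for s in all_strategies(n, i):
            if s == strategies[i]:
                continue
            dev = dict(strategies)
            dev[i] = s
            if payoff(n, dev, c, i) >= current:  # strictness: deviation must be strictly worse
                return False
    return True

n, c = 4, 0.5
center_sponsored_star = {0: {1, 2, 3}, 1: set(), 2: set(), 3: set()}
print(is_strict_nash(n, center_sponsored_star, c))  # expected: True
```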

11.
Truncated distributions commonly arise in economics and related areas; see, for example, Lee (Econ Lett 3:165–169, 1979), Lien (Econ Lett 19:243–247, 1985; Econ Lett 20:45–47, 1986), Burdett (Econ Lett 52:263–267, 1996), Sercu (Insur: Math and Econ 20:79–95, 1997), Abadir and Magdalinos (Econom Theory 18:1276–1287, 2002), and Horrace (J Econom 126:335–354, 2005). In this note, we consider the most commonly encountered truncated distributions with heavy tails: the truncated t distribution and the truncated F distribution. For each of these distributions, we derive explicit expressions for the moments and estimation procedures by the method of moments and the method of maximum likelihood. We illustrate an application to a popular data set in the econometric literature.
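The note derives explicit expressions; the sketch below is only a numerical stand-in, computing truncated-t moments by quadrature and fitting the degrees of freedom and scale by maximum likelihood on simulated data. The truncation interval and starting values are assumptions.

```python
# A numerical sketch (the paper derives closed-form expressions; here we only
# illustrate the objects involved): moments of a t distribution truncated to
# [a, b] via quadrature, and ML estimation of the degrees of freedom and
# scale from truncated data. The truncation bounds are placeholders.
import numpy as np
from scipy import stats, integrate, optimize

a, b = 0.0, 5.0  # assumed truncation interval

def truncated_t_moment(k, df, loc=0.0, scale=1.0):
    """k-th moment of a t(df, loc, scale) distribution truncated to [a, b]."""
    mass = stats.t.cdf(b, df, loc, scale) - stats.t.cdf(a, df, loc, scale)
    integrand = lambda x: x**k * stats.t.pdf(x, df, loc, scale) / mass
    value, _ = integrate.quad(integrand, a, b)
    return value

def negative_log_likelihood(params, x):
    df, scale = params
    if df <= 0 or scale <= 0:
        return np.inf
    mass = stats.t.cdf(b, df, scale=scale) - stats.t.cdf(a, df, scale=scale)
    return -np.sum(stats.t.logpdf(x, df, scale=scale) - np.log(mass))

# Simulate truncated-t data by rejection sampling, then fit by MLE
rng = np.random.default_rng(0)
raw = stats.t.rvs(df=4, scale=1.5, size=20000, random_state=rng)
sample = raw[(raw >= a) & (raw <= b)]

fit = optimize.minimize(negative_log_likelihood, x0=[5.0, 1.0], args=(sample,),
                        method="Nelder-Mead")
df_hat, scale_hat = fit.x
print("MLE:", df_hat, scale_hat)
print("first two truncated moments at the MLE:",
      truncated_t_moment(1, df_hat, scale=scale_hat),
      truncated_t_moment(2, df_hat, scale=scale_hat))
```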

12.
13.
There is a long history of analyzing the workability of markets using concentration ratios as indicators. In this paper, we discuss a comparatively new concept, the Coordination Failure Diagnostics (CFD) concept, introduced by Grossekettler (1982, 1999, 2005, 2008). The CFD concept analyzes real market processes by means of time-series analysis and investigates whether or not they operate efficiently. Furthermore, the concept can be used as a tool for detecting cartels. We therefore develop a System of Cartel Markers which can be used to analyze real markets. We analyze the German cement industry as an example of a cartel and find significant differences from the competitive benchmark.

14.
In this paper, we study the transmission of traits through generations in multifactorial inheritance models with sex- and time-dependent heritability. We further analyze the implications of these models under heavy-tailedness of traits’ distributions. Among other results, we show that in the case of a trait (for instance, a medical or behavioral disorder, or a phenotype with significant heritability affecting human capital in an economy) with a not very thick-tailed initial density, the trait distribution becomes increasingly more peaked, that is, increasingly more concentrated and unequally spread, over time. These patterns are reversed for traits with sufficiently heavy-tailed initial distributions (e.g., a medical or behavioral disorder for which there is no strongly expressed risk group, or a relatively equally distributed ability with significant genetic influence). Such traits’ distributions become less peaked over time and increasingly more spread in the population. The proof of the results in the paper is based on general results on majorization properties of heavy-tailed distributions obtained recently in Ibragimov (Econom Theory 23:501–517, 2007), also presented in the author’s Ph.D. dissertation (Ibragimov, New majorization theory in economics and martingale convergence results in econometrics. Yale University, 2005), and several extensions of those results derived in this work.
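The formal results rest on majorization theory for heavy-tailed distributions; the toy Monte Carlo below only illustrates the qualitative claim under an assumed, very simple transmission rule (each child's trait is the average of two randomly matched parents' traits), tracking a crude peakedness measure across generations for a moderately and an extremely heavy-tailed initial distribution.

```python
# A toy Monte Carlo, not the paper's formal argument. We assume a simple
# additive transmission rule -- each child's trait is the average of two
# randomly matched parents' traits -- and track a crude peakedness measure
# (share of the population within one unit of the median) over generations,
# for alpha-stable initial distributions with tail indices 1.8 and 0.7.
import numpy as np
from scipy import stats

def simulate(alpha, n_pop=100_000, generations=6, seed=0):
    rng = np.random.default_rng(seed)
    traits = stats.levy_stable.rvs(alpha, beta=0.0, size=n_pop, random_state=rng)
    peakedness = []
    for _ in range(generations):
        med = np.median(traits)
        peakedness.append(np.mean(np.abs(traits - med) <= 1.0))
        mothers = rng.choice(traits, size=n_pop)
        fathers = rng.choice(traits, size=n_pop)
        traits = 0.5 * (mothers + fathers)   # assumed transmission rule
    return peakedness

print("alpha = 1.8 (not too thick-tailed):", simulate(1.8))  # peakedness rises
print("alpha = 0.7 (extremely thick-tailed):", simulate(0.7))  # peakedness falls
```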

15.
Market objectives can conflict with long-term goals. Behind the conflict is the impatience axiom introduced by T. Koopmans to describe choices over time. The conflict is resolved here by introducing a new concept, sustainable markets. These differ from Arrow-Debreu markets in that traders have sustainable preferences and no bounds on short sales. Sustainable preferences are sensitive to the basic needs of the present without sacrificing the needs of future generations and embody the essence of sustainable development (Chichilnisky in Soc Choice Welf 13(2):231–257, 1996a; Res Energy Econ 73(4):467–491, 1996b). Theorems 1 and 2 show that limited arbitrage is a necessary and sufficient condition describing diversity and ensuring the existence of a sustainable market equilibrium where the invisible hand delivers sustainable as well as efficient solutions (Chichilnisky in Econ Theory 95:79–108, 1995; Chichilnisky and Heal in Econ Theory 12:163–176, 1998). In sustainable markets prices have a new role: they reflect both the value of instantaneous consumption and the value of the long-run future. The latter are connected to the independence of the axiom of choice at the foundations of mathematics (Gödel 1940).

16.
In this paper we analyze per capita incomes of the G7 countries using the common cycles test developed by Vahid and Engle (Journal of Applied Econometrics, 8:341–360, 1993) and extended by Hecq et al. (Oxford Bulletin of Economics and Statistics, 62:511–532, 2000; Econometric Reviews, 21:273–307, 2002) and the common trend test developed by Johansen (Journal of Economic Dynamics and Control, 12:231–254, 1988). Our main contribution is that we impose the common cycle and common trend restrictions in decomposing the innovations into permanent and transitory components. Our main finding is that permanent shocks explain the bulk of the variation in incomes for the G7 countries over short time horizons, which is in sharp contrast to much of the recent literature. We attribute this to the greater forecasting accuracy achieved from the variance decomposition analysis, which we later confirm by performing a post-sample forecasting exercise.
Paresh Kumar Narayan
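As a rough illustration of the common-trend step only (the Vahid–Engle common-cycle restrictions and the permanent–transitory decomposition are not implemented), the sketch below runs Johansen's trace test on G7 per capita incomes; the file and column layout are placeholders.

```python
# A minimal sketch of the Johansen trace test used to determine the number of
# common trends. File and column names are placeholders for log per capita
# incomes of the G7 countries.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

incomes = pd.read_csv("g7_log_per_capita_income.csv", index_col=0, parse_dates=True)

# r cointegrating relations among 7 series imply (7 - r) common trends
result = coint_johansen(incomes, det_order=0, k_ar_diff=1)
for r, (stat, crit) in enumerate(zip(result.lr1, result.cvt[:, 1])):
    print(f"H0: rank <= {r}: trace = {stat:.2f}, 5% critical value = {crit:.2f}")
```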

17.
Group polarization in the team dictator game reconsidered
While most papers on team decision-making find that teams behave more selfishly, less trustingly and less altruistically than individuals, Cason and Mui (1997) report that teams are more altruistic than individuals in a dictator game. Using a within-subjects design we re-examine group polarization by letting subjects make individual as well as team decisions in an experimental dictator game. In our experiment teams are more selfish than individuals, and the most selfish team member has the strongest influence on team decisions. Various explanations for the different findings in Cason and Mui (1997) and in our paper are discussed.

18.
For a sample of over 700 celebrity appointments to corporate boards of directors over the period 1985–2006, we find positive excess market returns at the time of their announcement. The 1-, 2-, and 3-year long-run performance of the appointing firms provides corroborating evidence of the value of these appointments. We conclude that the appointment of celebrities as directors increases a firm’s visibility in a fashion consistent with Merton’s (J Finance 42:483–510, 1987) investor recognition hypothesis.
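The authors' exact event-study specification is not reproduced here; the following sketch shows a generic market-model calculation of abnormal returns around a single announcement date, with the return series, event window, and estimation window as placeholders.

```python
# A generic market-model event-study sketch (not the authors' exact
# specification). `stock` and `market` are daily return series aligned on
# dates; `event_date` is the announcement date of a celebrity appointment.
import numpy as np
import pandas as pd

def abnormal_returns(stock: pd.Series, market: pd.Series, event_date,
                     estimation_window=250, event_window=(-1, 1)):
    """Market-model abnormal returns around a single announcement."""
    data = pd.concat({"stock": stock, "market": market}, axis=1).dropna()
    event_loc = data.index.get_loc(event_date)

    # Estimate alpha and beta on a pre-event window ending 10 days before the event
    est = data.iloc[event_loc - estimation_window - 10:event_loc - 10]
    beta, alpha = np.polyfit(est["market"], est["stock"], deg=1)

    # Abnormal return = actual - predicted over the event window
    window = data.iloc[event_loc + event_window[0]:event_loc + event_window[1] + 1]
    ar = window["stock"] - (alpha + beta * window["market"])
    return ar, ar.sum()  # daily ARs and the cumulative abnormal return (CAR)

# Usage (with hypothetical series): ar, car = abnormal_returns(stock_returns, market_returns, "2006-03-15")
```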

19.
This paper studies the optimal pricing of a two-sided monopoly platform when one side is affected by congestion. We show that the divide-and-conquer pricing strategy (or skewed pricing) depends not only on the relative magnitude of the sides’ price elasticities of demand but also on the marginal congestion cost that an agent imposes on the others. Compared with the no-congestion case, this pricing strategy gives rise to some interesting features that violate the results of Rochet and Tirole (J Eur Econ Assoc 1:990–1029, 2003; Rand J Econ 37:645–667, 2006). In the case of equal price elasticities of demand, the non-congested side is charged the higher price. On the other hand, in the case of different price elasticities, the platform’s congestion pricing depends on a certain threshold of the marginal congestion cost. We show, under some conditions, that the divide-and-conquer pricing strategy is reversed. In the social context, Rochet and Tirole’s (J Eur Econ Assoc 1:990–1029, 2003) cost-allocation condition is modified by the congestion cost. We show that congestion affects not only the buyers’ contribution to the sellers’ surplus but also the sellers’ contribution to the buyers’.

20.
We extend the tax versus permits literature by considering permit supply functions and pollution tax functions that are generalizations of the usual constant permit supply and constant pollution tax rate. In our model, pollution is not uniformly mixed and the regulator is uncertain about the polluting firms’ abatement costs. We determine the optimal permit supply functions and the optimal pollution tax functions. Using these functions, we show that permits lead unambiguously to lower total expected costs than taxes. We analyze the magnitude of this difference for a simple model of climate change. By relating the optimal permit supply functions to Weitzman (Am Econ Rev 68:683–691, 1978) we provide a new interpretation of his results.
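The paper's generalized permit supply and tax functions (and non-uniform mixing) are not reproduced here; the toy Monte Carlo below only illustrates the classic constant-instrument benchmark that the paper generalizes, finding the best constant tax and the best constant cap by grid search and comparing expected welfare under cost uncertainty.

```python
# A toy prices-vs-quantities comparison in the spirit of Weitzman, not the
# paper's generalized setting. A single firm has uncertain benefits of
# emissions, B(e) = (b0 + theta) e - (beta/2) e^2, and damages are
# D(e) = (delta/2) e^2. We find the best constant tax and the best constant
# cap by grid search over Monte Carlo draws of the cost shock theta.
import numpy as np

rng = np.random.default_rng(0)
b0, beta = 10.0, 1.0
theta = rng.normal(0.0, 2.0, size=100_000)  # regulator's uncertainty

def expected_welfare_tax(t, delta):
    e = np.clip((b0 + theta - t) / beta, 0.0, None)   # firm sets B'(e) = t
    return np.mean((b0 + theta) * e - 0.5 * beta * e**2 - 0.5 * delta * e**2)

def expected_welfare_cap(cap, delta):
    e = np.minimum(cap, np.clip((b0 + theta) / beta, 0.0, None))  # cap may not bind
    return np.mean((b0 + theta) * e - 0.5 * beta * e**2 - 0.5 * delta * e**2)

grid = np.linspace(0.0, 20.0, 401)
for delta in (2.0, 0.5):   # steep vs flat marginal damage
    best_tax = max(expected_welfare_tax(t, delta) for t in grid)
    best_cap = max(expected_welfare_cap(q, delta) for q in grid)
    better = "quantities (permits)" if best_cap > best_tax else "prices (tax)"
    print(f"delta = {delta}: tax welfare = {best_tax:.3f}, "
          f"cap welfare = {best_cap:.3f} -> {better} preferred")
```

With constant instruments the ranking flips with the slope of marginal damage; the paper's point is that once the instruments are allowed to be functions rather than constants, the optimal permit supply functions dominate taxes unambiguously.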
