20 similar documents found (search time: 500 ms)
1.
In this paper the long-run trend in RPI inflation (core inflation) for the UK over the 1961–1997 period is estimated within the framework of a multivariate common trends model which extends
the bivariate VAR approach of Quah and Vahey (1995). In this context core inflation is directly linked to money and wage growth and interpreted
as the long-run forecast of inflation from a small-scale, cointegrated macroeconomic system.
First version received: September 1999/Final version received: October 2001
We thank two anonymous referees for many helpful comments and suggestions. Work on this paper was partially conducted when C. Morana was at Heriot-Watt University.
2.
Armando Levy. Empirical Economics, 2003, 28(1): 3–22
This paper proposes a semi-parametric approach to estimation in Tobit models. A generalized additive Tobit model of residential local long distance (intra-LATA) telephone demand is estimated on a cross-section of residential telephone
consumers across twenty-eight states. While past studies of telecommunications demand have used fully parametric models, the
model presented here is non-parametric in two dimensions: first, no parametric assumption is made about the error distribution, and second, the demand equation is non-parametric with respect to price. We find that, for a 40% cut in tariffs, the elasticity of demand is substantially lower (in absolute value) than that found in previous studies.
First version received: July 2000/Final version received: March 2001
I thank the referee and Associate Editor for suggestions which improved the paper. The views expressed here are those of the author and not of Analysis Group | Economics.
3.
John Geanakoplos. Economic Theory, 2003, 21(2-3): 585–603
Summary. The existence of Nash and Walras equilibrium is proved via Brouwer's Fixed Point Theorem, without recourse to Kakutani's
Fixed Point Theorem for correspondences. The domain of the Walras fixed-point map is confined to the price simplex, even when there is production and preferences are weakly quasi-convex. The key idea is to replace optimization with “satisficing improvement,”
i.e., to replace the Maximum Principle with the “Satisficing Principle.”
Received: July 9, 2001; revised version: February 25, 2002
I wish to thank Ken Arrow, Don Brown, and Andreu Mas-Colell for helpful comments. I first thought about using Brouwer's theorem without Kakutani's extension when I heard Herb Scarf's lectures on mathematical economics as an undergraduate in 1974, and then again when I read Tim Kehoe's 1980 Ph.D. dissertation under Herb Scarf, but I did not resolve my confusion until I had to discuss Kehoe's presentation at the celebration for Herb Scarf's 65th birthday in September 1995.
Correspondence to: C. D. Aliprantis
4.
Until recently, considerable effort has been devoted to the estimation of panel data regression models without adequate attention
being paid to the drivers of interaction amongst cross-section and spatial units. We discuss some new methodologies in this
emerging area and demonstrate their use in measurement and inferences on cross-section and spatial interactions. Specifically,
we highlight the important distinction between spatial dependence driven by unobserved common factors and that based on a
spatial weights matrix. We argue that purely factor-driven models of spatial dependence may be inadequate because of their
connection with the exchangeability assumption. The three methods considered are appropriate for different asymptotic settings: estimation under structural constraints is suited to fixed N with T → ∞, whilst the methods based on GMM and common correlated effects are appropriate when T ≫ N → ∞. Limitations and potential enhancements of the existing methods are discussed, and several directions for new research are highlighted.
5.
On the choice of functional form in stochastic frontier modeling
This paper examines the effect of functional form specification on the estimation of technical efficiency using a panel data
set of 125 olive-growing farms in Greece for the period 1987–93. The generalized quadratic Box-Cox transformation is used
to test the relative performance of alternative, widely used, functional forms and to examine the effect of prior choice on
final efficiency estimates. Other than the functional specifications nested within the Box-Cox transformation, the comparative
analysis includes the minflex Laurent translog and generalized Leontief that possess desirable approximation properties. The
results indicate that technical efficiency measures are very sensitive to the choice of functional specification. Perhaps
most importantly, the choice of functional form affects the identification of the factors affecting individual performance
– the sources of technical inefficiency. The analysis also shows that while specification searches do narrow down the set
of feasible alternatives, the identification of the most appropriate functional specification might not always be (statistically)
feasible.
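As a rough illustration of the nesting idea (a minimal sketch, not the authors' estimation code; the data values here are made up), the Box-Cox transform's λ parameter nests the linear (λ = 1) and log (λ → 0) specifications that the generalized quadratic Box-Cox model compares:

```python
import math

def box_cox(y, lam):
    """Box-Cox transform: (y**lam - 1)/lam, with log(y) as the lam -> 0 limit."""
    if abs(lam) < 1e-8:
        return [math.log(v) for v in y]
    return [(v ** lam - 1.0) / lam for v in y]

y = [1.0, 2.0, 4.0]
print(box_cox(y, 1.0))  # linear case: y - 1
print(box_cox(y, 0.0))  # log case: ln(y)
```

In practice λ is estimated jointly with the model parameters and the nested functional forms are compared by likelihood-ratio tests, which is the kind of specification search the paper conducts.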
First version received: November 1999/Final version received: July 2001
The authors wish to thank Almas Heshmati, Robert Romain, and an anonymous referee for insightful comments and suggestions. Special thanks go to the associate editor who handled the paper, and whose careful reading and suggestions have improved the paper substantially. The second author wishes to acknowledge the financial support from “President SSHRC” at the University of Saskatchewan. The usual caveats with respect to opinions expressed in the paper apply. Senior authorship is shared. This is University of Nebraska-Lincoln Agricultural Research Division Article No. 13270.
6.
The proper panel econometric specification of the gravity equation: A three-way model with bilateral interaction effects
We argue that the proper specification of a panel gravity model should include main (exporter, importer, and time) as well
as time-invariant exporter-by-importer (bilateral) interaction effects. In a panel of 11 APEC countries, the latter are highly
significant and account for the largest part of variation.
First version received: February 2001/Final version received: June 2002
We are grateful to two anonymous referees and Robert Kunst for their helpful comments.
7.
Social Security and personal saving: 1971 and beyond
Feldstein (1996, 1974) reported that Social Security in the U.S.A. reduced personal saving (“saving”) in 1992 (1971) by $416
($61) billion. I reestimate his life-cycle consumption specification using data from the latest NIPA revision, correct his
calculations, and find that the implied reduction in 1992 (1971) saving is now $280 ($22) billion, 48% (16%) of actual net
private saving, with a standard error of $114 ($14) billion. If structural breaks around WWII and the 1972 Social Security
amendments (which raised real per capita SSW by 22%) are allowed, and the market value of Treasury debt is included in the specification, the reduction in 1971 and 1992
saving attributable to Social Security is at most 0.55 times its standard error, and 12% of net private saving. I then reestimate
the preferred specification of Coates and Humphreys (1999), allowing for these structural breaks and relaxing other restrictions.
The implied effect of Social Security on saving is again statistically zero.
First version received: September 2000/Final version received: September 2001
I thank Les Oxley for pointing out that correcting for AR(1) residuals is not a categorical imperative but a cultural relative, in which case common factor restrictions are crucial.
8.
The economic effects of restrictions on government budget deficits: imperfect private credit markets
Summary. The present paper is an extension of Ghiglino and Shell [7] to the case of imperfect consumer credit markets. We show that
with constraints on individual credit and only anonymous (i.e., non-personalized) lump-sum taxes, strong (or “global”) irrelevance
of government budget deficits is not possible, and weak (or “local”) irrelevance can hold only in very special situations.
This is in sharp contrast to the result for perfect credit markets. With credit constraints and anonymous consumption taxes,
weak irrelevance holds if the number of tax instruments is sufficiently large and at least one consumer's credit constraint
is not binding. This is an extension of the result for perfect credit markets.
Received: August 28, 2001; revised version: March 25, 2002
We thank Todd Keister, Bruce Smith, and two referees for helpful comments.
Correspondence to: C. Ghiglino
9.
Summary. We provide a detailed portfolio analysis for a financial market with an atomless continuum of assets. In the context of an
exact arbitrage pricing theory (EAPT), we go beyond the characterization of the existence of important portfolios (normalized
riskless, mean, cost, factor and mean-variance efficient portfolios) to furnish exact portfolio compositions in terms of explicit
portfolio weights. Such an analysis has not been furnished before in the context of the asymptotic arbitrage pricing theory
(APT). We also characterize conditions under which a mean-variance efficient portfolio is a benchmark portfolio used in the
EAPT to proxy essential risk. We illustrate our results with several examples of specific financial markets.
Received: May 30, 2002; revised version: August 15, 2002
Some of the results reported here constituted part of Cowles Foundation Discussion Paper No. 1139, circulated under the title “Hyperfinite Asset Pricing Theory”; additional results were obtained when Sun visited the Department of Economics at Johns Hopkins University during March 2002. This paper was presented at the Conference on Economic Design held at NYU on July 6–9, 2002.
Correspondence to: M. A. Khan
10.
Summary. We prove existence of a competitive equilibrium in a version of a Ramsey (one sector) model in which agents are heterogeneous
and gross investment is constrained to be non-negative. We do so by converting the infinite-dimensional fixed point problem
stated in terms of prices and commodities into a finite-dimensional Negishi problem involving individual weights in a social
value function. This method allows us to obtain detailed results concerning the properties of competitive equilibria. Because of the simplicity of the techniques utilized, our approach can readily be adapted by practitioners to analogous problems often studied in macroeconomics.
Received: September 13, 2001; revised version: December 9, 2002
We are grateful to Tapan Mitra for pointing out errors as well as making very valuable suggestions. Thanks are due to Raouf Boucekkine and Jorge Duran for additional helpful discussions. We also thank an anonymous referee for his/her helpful comments. The second author acknowledges the financial support of the Belgian Ministry of Scientific Research (Grant ARC 99/04-235 “Growth and incentive design”) and of the Belgian Federal Government (Grant PAI P5/10, “Equilibrium theory and optimization for public policy and industry regulation”).
Correspondence to: C. Le Van
11.
Estimating a mixture of stochastic frontier regression models via the EM algorithm: A multiproduct cost function application
Steven B. Caudill. Empirical Economics, 2003, 28(3): 581–598
Researchers have become increasingly interested in estimating mixtures of stochastic frontiers. Mester (1993), Caudill (1993),
and Polachek and Yoon (1987), for example, estimate stochastic frontier models for different regimes, assuming sample separation
information is given. Building on earlier work by Lee and Porter (1984), Douglas, Conway, and Ferrier (1995) estimate a stochastic
frontier switching regression model in the presence of noisy sample separation information. The purpose of this paper is to
extend earlier work by estimating a mixture of stochastic frontiers assuming no sample separation information. This case is more likely to occur in practice than even noisy sample separation information.
In order to estimate a mixture of stochastic frontiers with no sample separation information, an EM algorithm to obtain maximum
likelihood estimates is developed. The algorithm is used to estimate a mixture of stochastic (cost) frontiers using data on
U.S. savings and loans for the years 1986, 1987, and 1988. Statistical evidence is found supporting the existence of a mixture
of stochastic frontiers.
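The E- and M-steps can be illustrated with a deliberately simplified sketch: EM for a two-component mixture of plain normals with no sample-separation information. The paper's frontier version adds a composed (one-sided plus two-sided) error structure; the data, starting values, and two-component setup below are illustrative assumptions, not the author's algorithm.

```python
import math
import random

def em_two_normals(x, iters=200):
    """EM for a two-component normal mixture, no sample-separation information."""
    mu1, mu2 = min(x), max(x)                  # crude starting means
    s1 = s2 = (max(x) - min(x)) / 4 or 1.0     # common starting spread
    pi1 = 0.5                                  # starting mixing weight
    for _ in range(iters):
        # E-step: posterior probability each observation came from component 1
        r = []
        for v in x:
            d1 = pi1 * math.exp(-0.5 * ((v - mu1) / s1) ** 2) / s1
            d2 = (1 - pi1) * math.exp(-0.5 * ((v - mu2) / s2) ** 2) / s2
            r.append(d1 / (d1 + d2))
        # M-step: responsibility-weighted updates of weight, means, and spreads
        n1 = sum(r)
        n2 = len(x) - n1
        pi1 = n1 / len(x)
        mu1 = sum(ri * v for ri, v in zip(r, x)) / n1
        mu2 = sum((1 - ri) * v for ri, v in zip(r, x)) / n2
        s1 = math.sqrt(sum(ri * (v - mu1) ** 2 for ri, v in zip(r, x)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - ri) * (v - mu2) ** 2 for ri, v in zip(r, x)) / n2) or 1e-6
    return pi1, mu1, mu2

random.seed(0)
x = [random.gauss(0, 1) for _ in range(300)] + [random.gauss(6, 1) for _ in range(300)]
pi1, mu1, mu2 = em_two_normals(x)
print(round(mu1, 1), round(mu2, 1))   # component means recovered near 0 and 6
```

The mixture version avoids needing regime labels: each observation's regime membership is treated as missing data and recovered probabilistically in the E-step, which is exactly what makes the no-sample-separation case tractable.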
First version received: March 13, 2001/Final version received: June 17, 2002
I am grateful to Ram Acharya, Janice Caudill, and especially James R. Barth for several helpful comments on an earlier version of the paper. During the revision process I benefitted greatly from the suggestions of the Associate Editor and three anonymous referees.
12.
Rolando F. Peláez. Empirical Economics, 2003, 28(2): 417–429
This paper provides the strongest evidence to date on the predictability of real stock prices over long horizons. Ex ante forecasts account for over two-thirds of the variation of the growth rate of real stock prices over ten-year spans from 1940
through 2001. The paper forecasts negative growth rates of real stock prices over the next ten years. This bearish long-run
outlook is buttressed by the long-run relationship between the growth rates of real stock prices, inflation, dividends, and
productivity.
First version received: June 2000/Final version received: June 2001
Special thanks to an anonymous referee for helpful comments.
13.
We estimate a linear and a piecewise linear Phillips curve model with regional labor market data for the West German and Neue Länder. Employing regional observations allows us to country-difference the data. This eliminates, under the assumption of homogeneous Länder, supply shocks and changes in the formation of expectations as possible identification failures. With seemingly unrelated regressions we find a flat Phillips curve in the Neue Länder. For the West German Länder a piecewise linear model with a higher inflation-unemployment tradeoff for the regime of low unemployment rates fits the data very well. The results hold true if we control for endogeneity of the unemployment rate. With a kinked but upward sloping aggregate supply curve there seems to be room for stabilization policies, at least in the range of aggregate demand shifts that our data covers.
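A stylized sketch of the kinked-curve idea (not the paper's SUR estimation): with a known kink in the unemployment rate, the inflation-unemployment slope can be estimated separately in each regime. The synthetic data, slope values, and kink location below are illustrative assumptions.

```python
import random

def ols_slope(xs, ys):
    """Slope of a simple OLS fit y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

random.seed(1)
kink = 6.0  # assumed known kink in the unemployment rate (percent)
u = [random.uniform(2, 12) for _ in range(400)]
# Synthetic kinked Phillips curve: steep tradeoff below the kink, nearly flat above
infl = [(-1.5 * x if x < kink else -1.5 * kink - 0.2 * (x - kink))
        + random.gauss(0, 0.3) for x in u]

low = [(x, y) for x, y in zip(u, infl) if x < kink]
high = [(x, y) for x, y in zip(u, infl) if x >= kink]
b_low = ols_slope(*zip(*low))
b_high = ols_slope(*zip(*high))
print(round(b_low, 2), round(b_high, 2))  # steep slope below the kink, flat above
```

In the paper the kink location is part of the specification and endogeneity of the unemployment rate is controlled for; this sketch only shows why regime-specific slopes capture a kinked tradeoff.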
First version received: December 2000/Final version accepted: January 2002
An earlier version of this paper was written while the second author was at Universidad Carlos III in Madrid. He thanks Juan Dolado and is grateful for financial support by the TMR Program on New Approaches for the Study of Economic Fluctuations. He would also like to thank Bertrand Koebel for his critique of that earlier version. Both authors are grateful to an editor and three anonymous referees for very helpful comments. Moreover, we wish to thank participants of the seminar on Quantitative Wirtschaftsforschung by Jürgen Wolters and Peter Kuhbier, Freie Universität Berlin. Finally, we profited from discussions with participants at the conferences of the European Economic Association in Lausanne and the Verein für Socialpolitik in Magdeburg where the paper was presented. Of course, all errors are our sole responsibility.
14.
Summary. This paper discusses and develops “non-welfaristic” arguments on distributive justice à la J. Rawls and A. K. Sen, and formalizes,
in cooperative production economies, “non-welfaristic” distribution rules as game form types of resource allocation schemes.
First, it conceptualizes the Needs Principle, which the distribution rule should satisfy if it takes individuals' needs into account. Second, one class of distribution rules satisfying the Needs Principle, the class of J-based Capability Maximum Rules, is proposed. Third, axiomatic characterizations of the class of J-based Capability Maximum Rules are provided.
Received: July 30, 1999; revised version: March 11, 2002
We are grateful to an anonymous referee of this journal, Professors Marc Fleurbaey, Nicolas Gravel, Ryo-ichi Nagahisa, Prasanta Pattanaik, Kotaro Suzumura, Koich Tadenuma, and Yongsheng Xu for their fruitful comments. An earlier version of this paper was published under the title “A Game Form Approach to Theories of Distributive Justice: Formalizing Needs Principle” as Discussion Paper No. 407 of the Institute of Social and Economic Research, Osaka University, and in the proceedings of the International Conference on Logic, Game, and Social Choice held at Oisterwijk in May 1999. That version was also presented at the 3rd Decentralization Conference in Japan held at Hitotsubashi University in September 1997, at the annual meeting of the Japan Association of Economics and Econometrics held at Waseda University in September 1997, and at the 4th International Conference of Social Choice and Welfare held at the University of British Columbia in July 1998. This research was partially supported by the Japanese Ministry of Education and the Ministry of Health and Welfare.
Correspondence to: N. Yoshihara
15.
Summary. Using a general equilibrium framework, this paper analyzes the equilibrium provision of a pure public bad commodity (for
example, pollution). Considering a finite economy with one desired private good and one pure public “bad”, we explicitly introduce the concept of Lindahl equilibrium and Lindahl prices into a pure public bad economy. Then, the Lindahl provision
is analyzed and compared with the Cournot-Nash provision. The main results for economies with heterogeneous agents state that
the asymptotic Lindahl allocation of the pure public bad is the null allocation. In contrast, the asymptotic Cournot-Nash
provision of the public bad might approach infinity. Further results are obtained from a broader analysis of large finite economies with pure public bad commodities.
Received: July 26, 2001; revised version: March 12, 2002
We are indebted to Nicholas Yannelis and an anonymous referee for their valuable comments and suggestions.
Correspondence to: B. Shitovitz
16.
Summary. A well-known result in the medical insurance literature is that zero co-insurance is never second-best for insurance contracts
subject to moral hazard. We replace the usual expected utility assumption with a version of the rank-dependent utility (RDU)
model that has greater experimental support. When consumers exhibit such preferences, we show that zero co-insurance may in
fact be optimal, especially for low-risk consumers. Indeed, it is even possible that the first-best and second-best contracts
are identical. In this case, there is no “market failure”, despite the informational asymmetry. We argue that these RDU results are in
better accord with the empirical evidence from US health insurance markets.
Received: February 26, 2001; revised version: October 4, 2002
The authors would particularly like to thank Simon Grant, John Quiggin, Peter Wakker and an anonymous referee for valuable comments and suggestions on earlier drafts. The paper has also benefitted from the input of seminar audiences at The Australian National University, University of Auckland, University of Melbourne and University of Sydney. Ryan also gratefully acknowledges the financial support of the ARC, through Grant number A000000055.
Correspondence to: R. Vaithianathan
17.
Summary. Suppose a large economy with individual risk is modeled by a continuum of pairwise exchangeable random variables (i.i.d.,
in particular). Then the relevant stochastic process is jointly measurable only in degenerate cases. Yet in Monte Carlo simulation,
the average of a large finite draw of the random variables converges almost surely. Several necessary and sufficient conditions
for such “Monte Carlo convergence” are given. Also, conditioned on the associated Monte Carlo -algebra, which represents macroeconomic risk, individual agents' random shocks are independent. Furthermore, a converse to
one version of the classical law of large numbers is proved.
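The "Monte Carlo convergence" in view can be illustrated in a deliberately elementary way: for one realization of a large finite i.i.d. draw, the sample average settles near the common mean. The uniform shocks and sample sizes below are an assumed toy example, not the paper's construction.

```python
import random

random.seed(42)

# One realization of increasingly large finite draws of i.i.d. Uniform(0, 1)
# shocks (common mean 0.5): the running sample average settles near the mean,
# the elementary counterpart of the convergence the paper characterizes.
draws = [random.random() for _ in range(100_000)]
for n in (100, 10_000, 100_000):
    avg = sum(draws[:n]) / n
    print(n, round(avg, 3))
```

The paper's point is subtler: for a continuum of exchangeable shocks the process is typically not jointly measurable, yet averages of large finite draws like this one still converge almost surely, and the paper gives conditions for exactly when.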
Received: October 29, 2001; revised version: April 24, 2002
Part of this work was done when Yeneng Sun was visiting SITE at Stanford University in July 2001. An early version of some results was included in a presentation to Tom Sargent's macro workshop at Stanford. We are grateful to him and Felix Kübler in particular for their comments, and also to Marcos Lisboa for several discussions with Peter Hammond, during which the basic idea of the paper began to take shape.
Correspondence to: P.J. Hammond
18.
Robert J. Aumann. Economic Theory, 2003, 21(2-3): 233–239
Summary. Evidence is adduced that the sages of the ancient Babylonian Talmud, as well as some of the medieval commentators thereon,
were well aware of sophisticated concepts of modern theories of risk-bearing.
Received: April 10, 2002; revised version: May 7, 2002
Presented at the Institute for Mathematical Studies in the Social Sciences-Economics, Stanford University, August 4, 1981. Subsequent to that presentation, the author's attention was drawn to an article by Zvi Ilani, “Models in the Economics of Uncertainty: The Cost of Concluding a Conditional Contract, according to the Talmud and the Halachic Literature,” Iyunim Bekalkala (Investigations in Economics), The Israel Association for Economics, Jerusalem, Nissan 5740 (April 1980), 246–261 (in Hebrew). Inter alia, Ilani treats the Talmudic passage that forms the subject of this paper, and provides a fairly comprehensive review of the medieval commentaries thereon; undoubtedly, he was the first to recognize in print the relevance of this passage to modern economic theories of uncertainty. It is not clear, though, whether or not his understanding of the passage agrees with ours. The current paper appeared in January 2002 in the Research Bulletin Series of the Research Center on Jewish Law and Economics, Department of Economics, Bar Ilan University.
19.
Susanne Maidorn. Empirical Economics, 2003, 28(2): 387–402
Assuming full hysteresis in the Austrian labour market, a simple macroeconomic framework is used to model the effect of four
structural shocks, i.e. shocks to productivity, demand, wages and labour supply. By using SVAR analysis, we derive impulse-response
functions that show the effects of these shocks on unemployment. A distinctive feature of our study is the deliberate use of overidentifying restrictions, which allows for a likelihood ratio test. The objection to SVAR methodology, that
it relies on arbitrary assumptions, can thus be overcome, as invalid sets of identifying restrictions are rejected.
First version received: September 2000/Final version received: March 2002
I thank Juan F. Jimeno, Martin Wagner, Helmut Hofer and Bernhard Böhm for their assistance; Robert Kunst and Martin Spitzer for their discussion of an earlier version of this paper; Thomas Sparla, Michael Roos and two anonymous referees for their helpful comments.
20.
N. E. Sofronidis. Economic Theory, 2008, 34(2): 395–399
Our purpose in this article is to prove that, given any integer n ≥ 2 and any non-empty compact Polish spaces S_1, ..., S_n, if for any u ∈ C(S_1 × ... × S_n, R)^n we denote by MNE(u) the set of mixed Nash equilibria of (S_1, ..., S_n, u), then MNE(u) is a non-empty compact subset of P(S_1) × ... × P(S_n), and if u_k → u in C(S_1 × ... × S_n, R)^n as k → ∞, then lim sup_{k → ∞} MNE(u_k) ⊆ MNE(u).
The author would like to thank the referee for offering critical comments on this paper.