Similar Literature (20 results)
1.
The application of the Box-Cox transformation to the dependent and independent variables is discussed. Maximum likelihood and iterative GLS estimators are used, and bootstrapping is carried out to compare the bootstrap sample variability with the finite-sample variability (RMSE) and to improve RMSE estimation. The biases of the parameter estimators are shown to be substantial in small samples. The standard errors obtained from the Hessian matrix are a poor measure of the finite-sample variability, and the t-ratios of the linear parameter estimators may not be normally distributed in small samples. The authors acknowledge the helpful comments of two referees.
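As a purely illustrative sketch of the kind of exercise described above (none of it taken from the paper: the data-generating process, the sample size of 50, and the choice of a pairs bootstrap are all assumptions made here), the following Python code estimates a Box-Cox model for the dependent variable by profile maximum likelihood and compares the conventional asymptotic standard error of the slope with its bootstrap standard error.

```python
# Illustrative sketch (not the authors' code): Box-Cox model (y^lam - 1)/lam = a + b*x + e,
# estimated by profile maximum likelihood over lam, followed by a pairs bootstrap that
# compares bootstrap variability of the slope with a conventional asymptotic standard error.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n = 50                                                    # deliberately small sample
x = rng.uniform(1.0, 5.0, n)
y = np.exp(0.5 + 0.3 * x + 0.2 * rng.standard_normal(n))  # strictly positive dependent variable

def boxcox(y, lam):
    return np.log(y) if abs(lam) < 1e-8 else (y ** lam - 1.0) / lam

def fit(y, x):
    X = np.column_stack([np.ones_like(x), x])
    def neg_profile_loglik(lam):
        z = boxcox(y, lam)
        resid = z - X @ np.linalg.lstsq(X, z, rcond=None)[0]
        sigma2 = resid @ resid / len(y)
        # concentrated Gaussian log-likelihood, including the Jacobian term (lam - 1) * sum(log y)
        return 0.5 * len(y) * np.log(sigma2) - (lam - 1.0) * np.log(y).sum()
    lam_hat = minimize_scalar(neg_profile_loglik, bounds=(-2.0, 2.0), method="bounded").x
    z = boxcox(y, lam_hat)
    beta_hat = np.linalg.lstsq(X, z, rcond=None)[0]
    resid = z - X @ beta_hat
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se_slope = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])  # conventional SE, lam treated as fixed
    return lam_hat, beta_hat[1], se_slope

lam_hat, b_hat, se_asym = fit(y, x)

B = 499
boot = np.empty(B)
for b in range(B):                                        # pairs (case) bootstrap
    idx = rng.integers(0, n, n)
    boot[b] = fit(y[idx], x[idx])[1]

print(f"slope {b_hat:.3f}, conventional SE {se_asym:.3f}, bootstrap SE {boot.std(ddof=1):.3f}")
```

In small samples the two standard errors can differ noticeably, which is the comparison the abstract is concerned with.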

2.
Let (E_k) be a sequence of differential information economies converging to a limit differential information economy E (written E_k → E). For ε ≥ 0, consider the set of ε-private core allocations of an economy (for ε = 0 this is the private core of Yannelis (1991)). Under appropriate conditions, we prove the following stability results:
(1) (upper semicontinuity): if E_k → E, each f_k is an ε-private core allocation of E_k, and f_k → f L¹-weakly, then f is an ε-private core allocation of E.
(2) (lower semicontinuity): if E_k → E, f is an ε-private core allocation of E, and ε > 0, then there exist ε-private core allocations f_k of E_k with f_k → f L¹-weakly.
JEL Classification Numbers: D82, D50, D83, C62, C71, D46, D61. Most of this work was done in Spring 2001, when Balder held a visiting professorship at the University of Illinois. Presentations based on this paper were given by Balder at the Midwestern Theory Conference in Madison, Wisconsin (May 2001) and at the SAET Conference in Ischia, Italy (June 2001).

3.
There are many bootstrap methods that can be used for econometric analysis. In certain circumstances, such as regression models with independent and identically distributed error terms, appropriately chosen bootstrap methods generally work very well. However, there are many other cases, such as regression models with dependent errors, in which bootstrap methods do not always work well. This paper discusses a large number of bootstrap methods that can be useful in econometrics. Applications to hypothesis testing are emphasized, and simulation results are presented for a few illustrative cases.
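One device that recurs in this literature is resampling under the null hypothesis. The sketch below is a minimal illustration rather than anything reproduced from the paper: it runs a wild (Rademacher) bootstrap t-test of H0: b2 = 0 in a linear regression with heteroskedastic errors, with the data-generating process and all tuning choices assumed here.

```python
# Illustrative sketch: wild (Rademacher) bootstrap t-test of H0: b2 = 0 in
# y = b0 + b1*x1 + b2*x2 + u with heteroskedastic errors, resampling under the null.
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
u = rng.standard_normal(n) * (1.0 + np.abs(x1))          # heteroskedastic errors
y = 1.0 + 0.5 * x1 + 0.0 * x2 + u                        # the null is true in this DGP

def tstat(y, X, j):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    XtX_inv = np.linalg.inv(X.T @ X)
    meat = X.T @ (X * (e ** 2)[:, None])                 # HC0 "meat" matrix
    V = XtX_inv @ meat @ XtX_inv                         # heteroskedasticity-robust covariance
    return beta[j] / np.sqrt(V[j, j])

X = np.column_stack([np.ones(n), x1, x2])
t_obs = tstat(y, X, 2)

# restricted fit imposing b2 = 0 supplies the fitted values and residuals to resample
Xr = X[:, :2]
br = np.linalg.lstsq(Xr, y, rcond=None)[0]
fit_r, res_r = Xr @ br, y - Xr @ br

B = 999
t_boot = np.empty(B)
for b in range(B):
    v = rng.choice([-1.0, 1.0], size=n)                  # Rademacher weights
    t_boot[b] = tstat(fit_r + res_r * v, X, 2)

p_value = np.mean(np.abs(t_boot) >= np.abs(t_obs))
print(f"t = {t_obs:.2f}, wild-bootstrap p-value = {p_value:.3f}")
```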

4.
A formula is derived for the probability that a "randomly selected" n-person matrix game has exactly k pure strategy equilibria. It is shown that for all n ≥ 2, this probability converges to e^{-1}/k! as the sizes of the strategy sets of at least two players increase without bound. Thus the number of pure strategy equilibria in large random n-person matrix games is approximately Poisson distributed with mean one. The latter is a known result obtained by a new proof in this note. Journal of Economic Literature Classification Number: C72.
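A quick simulation makes the Poisson(1) limit easy to see. The check below is illustrative only; the two-player case, the strategy-set sizes, and the uniform payoff draws are assumptions made here (continuous payoffs simply guarantee that ties occur with probability zero).

```python
# Illustrative check of the Poisson(1) limit: draw random two-player matrix games with
# i.i.d. continuous payoffs and count pure-strategy Nash equilibria across many games.
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(2)
m1, m2, games = 30, 30, 2000                 # strategy-set sizes and number of simulated games

def count_pure_nash(A, B):
    # cell (i, j) is a pure equilibrium iff row i is a best reply to column j and vice versa
    best_row = A == A.max(axis=0, keepdims=True)    # player 1's best replies to each column
    best_col = B == B.max(axis=1, keepdims=True)    # player 2's best replies to each row
    return int(np.sum(best_row & best_col))

counts = np.array([count_pure_nash(rng.random((m1, m2)), rng.random((m1, m2)))
                   for _ in range(games)])

for k in range(5):
    print(f"k={k}: empirical {np.mean(counts == k):.3f}   Poisson(1) {exp(-1) / factorial(k):.3f}")
```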

5.
We consider Friedrich Hayek's Road to Serfdom in light of global ideological and economic developments during the 60 years since its publication. Specific problems considered include socialism and planning, whether national socialism was really socialism, whether Hayek's views could be labeled social democratic and whether his critique of social democracy was too strong, and his discussion of the prospects for international economic order. While often right and enormously influential, Hayek himself agreed that some of his predictions did not come true.

6.
Hilaire Belloc's The Servile State is often seen as an antisocialist tract arguing that "socialism is slavery." It is typically assumed that an appreciation and defense of free market capitalism, as well as a general dislike of government intervention, must motivate its thesis. Nevertheless, The Servile State is an argument against what Belloc saw as unbridled capitalism, not against collectivism. Belloc defines capitalism as a state in which the distribution of wealth in society is skewed: the majority of people are a dispossessed proletariat, while a minority makes up the capitalist, property-owning class. For Belloc, capitalism is an inherently unstable system, and servile measures arise to ameliorate its insecurity and instability.

7.
The recombinant estimation technique of Mullin and Reiley (2006) can be a useful tool for analyzing data from normal-form games. The recombinant estimator falls within a general category of statistics known as U-statistics. This classification has both theoretical and practical implications: (1) the recombinant estimator is optimal (minimum variance) among unbiased estimators, (2) there is a computationally simple method for computing its asymptotic standard error, and (3) the estimation technique can be extended to multiple outcomes and to other types of inferential procedures commonly used for experimental data, such as the sign test. Simulation evidence suggests that researchers should use the asymptotic standard error rather than the standard error of Mullin and Reiley (2006), since the latter exhibits a downward bias. JEL Classification: C12, C90. Although the idea of recombinant estimation appears previously in the literature (for example, Mitzkewitz and Nagel (1993) and Mehta et al. (1994)), Mullin and Reiley (2006) is the first attempt at formalizing the econometric methodology and proposing a method for standard-error calculation.
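A minimal sketch of the recombinant idea under assumed names and data (the two roles, the "offer"/"threshold" actions, and the outcome function h are hypothetical, not taken from Mullin and Reiley 2006): every observed role-1 action is recombined with every observed role-2 action, the estimator is the grand mean over all n·m pairings, and the asymptotic standard error follows from the usual first-order variance formula for a two-sample U-statistic.

```python
# Illustrative recombinant estimation for a two-player game experiment. The outcome
# function h is evaluated on all n*m recombined pairings of role-1 and role-2 actions;
# the grand mean is a two-sample U-statistic, and its first-order asymptotic variance
# is estimated from the sample variances of the row and column means.
import numpy as np

rng = np.random.default_rng(3)
n, m = 40, 40
offers = rng.uniform(0.0, 10.0, n)            # hypothetical role-1 actions
thresholds = rng.uniform(0.0, 10.0, m)        # hypothetical role-2 acceptance cutoffs

def h(offer, threshold):
    return float(offer >= threshold)          # outcome of interest: a trade occurs

H = np.array([[h(a, b) for b in thresholds] for a in offers])   # n x m recombinant matrix

theta_hat = H.mean()
row_means, col_means = H.mean(axis=1), H.mean(axis=0)
var_hat = row_means.var(ddof=1) / n + col_means.var(ddof=1) / m
print(f"recombinant estimate {theta_hat:.3f}, asymptotic s.e. {np.sqrt(var_hat):.3f}")
```

The point of the U-statistic variance is that the n·m recombined outcomes are not independent, so naively treating them as independent observations would understate the standard error.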

8.
This note shows the empirical dangers of the presence of large additive outliers when testing for unit roots using standard unit root statistics. Using recently proposed procedures applied to four Latin-American inflation series, I show that the unit root hypothesis cannot be rejected. JEL classification: C2, C3, C5. I want to thank Pierre Perron for useful comments on a preliminary version of this paper. Helpful comments from an anonymous referee and Yiagadeesen Samy are appreciated. I thank the Editor Baldev Raj for useful comments about the final structure of this paper. Finally, I also thank André Lucas for helpful suggestions concerning the use of his nice computer program Robust Inference Plus Estimation (RIPE). First revision received: August 2001 / Final revision received: December 2002

9.
In this paper, we use p-best response sets, a set-valued extension of p-dominance, in order to provide a new sufficient condition for the robustness of equilibria to incomplete information: if there exists a set S which is a p-best response set with , and there exists a unique correlated equilibrium μ* whose support is in S, then μ* is a robust Nash equilibrium.

10.
This paper studies equilibrium selection based on a class of perfect foresight dynamics and relates it to the notion of p-dominance. A continuum of rational players is repeatedly and randomly matched to play a symmetric n×n game. There are frictions: opportunities to revise actions follow independent Poisson processes. The dynamics has stationary states, each of which corresponds to a Nash equilibrium of the static game. A strict Nash equilibrium is linearly stable under the perfect foresight dynamics if, independent of the current action distribution, there exists a consistent belief that any player necessarily plays the Nash equilibrium action at every revision opportunity. It is shown that a strict Nash equilibrium is linearly stable under the perfect foresight dynamics with a small degree of friction if and only if it is the p-dominant equilibrium with p < 1/2. It is also shown that if a strict Nash equilibrium is the p-dominant equilibrium with p < 1/2, then it is uniquely absorbing (and globally accessible) for a small friction (but not vice versa). Set-valued stability concepts are introduced and their existence is shown. Journal of Economic Literature Classification Numbers: C72, C73.
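For reference, the standard definition of p-dominance as it is used in this literature (stated here for a symmetric two-player game with action set A and payoff function u; the notation is generic and not taken from the paper): a* is a p-dominant equilibrium if a* is a best reply to every belief that puts probability at least p on the opponent playing a*.

```latex
% p-dominance in a symmetric game (generic notation): a* is p-dominant if
\[
  \sum_{a' \in A} \lambda(a')\, u(a^{*}, a') \;\ge\; \sum_{a' \in A} \lambda(a')\, u(a, a')
  \qquad \text{for all } a \in A \text{ and all } \lambda \in \Delta(A) \text{ with } \lambda(a^{*}) \ge p.
\]
```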

11.
Summary. Assume that L is a topological vector lattice and Y is a closed subset of L_+ × R^N, where R^N denotes the N-dimensional Euclidean space. It is shown that the set Y − L_+ × R^N_+ is closed if Y has appropriate monotonicity properties. The result is applicable to the case of L equal to L_∞ with the Mackey topology τ(L_∞, L_1).

12.
Modern theories of government finance stress the importance of an economy's fiscal deficits in determining the course of monetary policy. Modern growth theory stresses the role of monetary factors in economic growth. This paper explores how the two are interrelated, using a simple AK growth model with money, reserve requirements, and government debt. We provide a comprehensive look at the coordination of macroeconomic policy and its effects on long-run growth under three alternative coordinating arrangements. We uncover some unconventional results regarding the relationship between growth and a number of policy variables; these rest squarely on the constraints of the coordination process.

13.
In this paper we consider n-person nonzero-sum games where the strategy spaces of the players are compact subsets of R^s. The main result states that if the payoff functions are semicontinuous and strongly quasi-concave, then an ε-Nash equilibrium in pure strategies exists for every positive ε. Journal of Economic Literature Classification Number: C72.

14.
In this paper we examine long-run house price convergence across US states using a novel econometric approach advocated by Pesaran (2007) and Pesaran et al. (2009). Our empirical modelling strategy employs a probabilistic test statistic for convergence based on the percentage of unit root rejections among all state house price differentials. Using a sieve bootstrap procedure, we construct confidence intervals and find evidence in favour of convergence. We also conclude that the speed of adjustment towards long-run equilibrium is inversely related to distance.

15.
Constructing bootstrap confidence intervals for impulse response functions (IRFs) from structural vector autoregression (SVAR) models has become standard practice in empirical macroeconomic research. The accuracy of such confidence intervals can deteriorate severely, however, if the bootstrap IRFs are biased. We document an apparently common source of bias in the estimation of the VAR error covariance matrix which can be easily reduced by a scale adjustment. This bias is generally unrecognized because it only affects the bootstrap estimates of the error variance, not the original OLS estimates. Nevertheless, as we illustrate here, analytically, with sampling experiments, and in an example from the literature, the bootstrap error variance bias can have significant distorting effects on bootstrap IRF confidence intervals. We also show that scale-adjusted bootstrap confidence intervals can be expected to exhibit improved coverage accuracy.
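A minimal sketch of the mechanism under stated assumptions (a bivariate VAR(1), a residual-based recursive bootstrap, and a degrees-of-freedom rescaling of the OLS residuals standing in for the scale adjustment; none of this is reproduced from the paper):

```python
# Illustrative residual-based bootstrap for a VAR(1). OLS residuals are optionally
# inflated by sqrt(T / (T - k)), with k the number of regressors per equation, before
# resampling, which offsets the downward bias in the bootstrap error covariance.
import numpy as np

rng = np.random.default_rng(4)
T = 80
A = np.array([[0.5, 0.1], [0.2, 0.4]])
y = np.zeros((T + 1, 2))
for t in range(T):                                       # simulate y_t = A y_{t-1} + u_t, u_t ~ N(0, I)
    y[t + 1] = A @ y[t] + rng.standard_normal(2)

Y, X = y[1:], np.column_stack([np.ones(T), y[:-1]])      # intercept plus one lag
B_hat = np.linalg.lstsq(X, Y, rcond=None)[0]             # (k x 2) coefficient matrix
U = Y - X @ B_hat
k = X.shape[1]
U_adj = U * np.sqrt(T / (T - k))                         # degrees-of-freedom rescaling

def mean_bootstrap_cov(U_pool, n_boot=199):
    draws = []
    for _ in range(n_boot):
        idx = rng.integers(0, T, T)                      # resample residuals with replacement
        y_b = np.zeros((T + 1, 2))
        for t in range(T):
            y_b[t + 1] = B_hat[0] + B_hat[1:].T @ y_b[t] + U_pool[idx[t]]
        Yb, Xb = y_b[1:], np.column_stack([np.ones(T), y_b[:-1]])
        Ub = Yb - Xb @ np.linalg.lstsq(Xb, Yb, rcond=None)[0]
        draws.append((Ub.T @ Ub) / T)                    # bootstrap estimate of the error covariance
    return np.mean(draws, axis=0)

print("mean bootstrap error covariance, raw residuals:\n", mean_bootstrap_cov(U))
print("mean bootstrap error covariance, rescaled residuals:\n", mean_bootstrap_cov(U_adj))
```

With the true error covariance equal to the identity, the first average tends to lie somewhat below it while the rescaled version sits closer, which is the flavour of the distortion and correction described in the abstract.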

16.
This paper introduces a model of commodity price speculation and proves that the optimal trading strategy is of the (S,s) form when a no-expected-loss condition holds. A strong form of this condition is that the retail price charged to consumers at time t exceeds the discounted expected wholesale price of the commodity at time t+1, where β ∈ (0,1) is the speculator's discount factor. We are extremely grateful to Herbert Scarf for pointing out an important error in a previous draft of this paper and for suggesting the key argument in a revised proof that fixed the problem. We also benefited from helpful feedback from an anonymous referee, William Brainard, Zvi Eckstein, and participants of seminars at Yale, the Operations Research Center at MIT, and the Econometric Society Winter School at the Indian Statistical Institute, New Delhi.
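A plausible symbolic rendering of the strong form of this condition (the symbols p^r_t for the retail price at time t, p^w_{t+1} for the wholesale price at time t+1, and E_t for the expectation conditional on time-t information are notation assumed here, not taken from the paper):

```latex
% Hypothetical notation: p^r_t retail price at t, p^w_{t+1} wholesale price at t+1,
% E_t conditional expectation, beta the speculator's discount factor.
\[
  p^{r}_{t} \;\ge\; \beta\, \mathbb{E}_{t}\!\left[ p^{w}_{t+1} \right],
  \qquad \beta \in (0,1).
\]
```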

17.
In this paper the existence problem of undominated Nash equilibrium in normal form games is analyzed. It is shown that an undominated Nash equilibrium exists if (a) the strategy sets are convex polytopes in R^n and (b) the utility functions are affine with respect to each player's own strategy. It is shown by counterexamples that, first, it is not sufficient to have concave utility functions instead of affine ones under condition (b) even when condition (a) is satisfied, and, second, it is not sufficient to have just compact and convex strategy sets instead of polytopes in condition (a) even when condition (b) is satisfied.

18.
The European Commission has proposed air quality standards for NO2, SO2 and PM10 to be in force by 2010. This paper presents a study that gauged their costs and benefits. An analysis of the expected emissions for 2010 (reference emission scenario), using simplified air quality models, showed that non-compliance with these standards will occur in cities only, not in rural areas. Most compliance problems are expected for PM10, the fewest for SO2. Central estimates of the costs of meeting the standards range from 21 MECU (SO2) to 79 MECU (NO2) and 87–225 MECU (PM10). The estimated benefits are 83–3783 MECU (SO2), 408–5900 MECU (NO2), and 5007–51247 MECU (PM10). Uncertainties are high, due to errors and uncertainty in various steps of the methodology, mainly the estimation of human health effects, in particular effects on mortality, and the valuation of a statistical life. In the case of PM10, additional uncertainty results from the small size of the air quality database. Notwithstanding the uncertainties, the indications are that the benefits exceed the costs.

19.
This paper investigates the time series properties of per capita CO2 emissions and per capita GDP levels for a sample of 86 countries over the period 1960-2000. For that purpose, we employ a state-of-the-art panel stationarity test which incorporates multiple shifts in level and slope and controls for cross-sectional dependence through bootstrap methods. Our analysis yields clear-cut evidence that per capita GDP levels are nonstationary for the world as a whole, while per capita CO2 is found to be regime-wise trend stationary. The analysis of country groups shows that for Africa and Asia, per capita CO2 is best described as nonstationary, while per capita GDP appears stationary around a broken trend. In addition, we find evidence of regime-wise trend stationarity in both variables for the country groups consisting of America, Europe and Oceania. The results of our analysis carry important implications for the statistical modelling of the Environmental Kuznets curve for CO2, since the differing order of integration in the two variables for the world as a whole and for Africa and Asia calls into question the validity of panel cointegration techniques which assume that both variables are nonstationary and cointegrated with one another. Cointegration techniques would not be appropriate either for America, Europe and Oceania, which are characterised by per capita GDP and CO2 emissions being stationary around a broken trend. Similar conclusions are reached when we analyse country groups based on levels of development. Failure to properly characterise the time series properties of the data, by not controlling for an unknown number of structural breaks and for cross-sectional dependence, could be responsible for the fragility and lack of robustness surrounding the estimation of environmental Kuznets curves.

20.
Endogenous Licensing in Cumulative Innovation
This paper analyzes the endogeneity of licensing arrangements in cost-reducing cumulative innovation. The following results are obtained. First, for the first-generation patentee, ex post licensing matters for rent extraction, while ex ante licensing matters for efficiency. Second, if the second-generation innovator does not exit, then the firms' profits as well as social welfare are all irrelevant to whether ex ante licensing is allowed. Third, costly litigation can occur on the equilibrium path, and its occurrence is also irrelevant to ex ante licensing. Interestingly, the conditional probability of the first-generation patentee winning litigation first decreases and then increases in patent breadth. Fourth, optimal patent breadth depends on the tradeoff between litigation costs and the antitrust effect. Translated from Shijie Jingji Wenhui 世界经济文汇 (World Economic Papers), 2006, (6): 1–29.

