Similar Literature
20 similar documents retrieved
1.
The practical relevance of several concepts of exogeneity of treatments for the estimation of causal parameters based on observational data is discussed. We show that the traditional concepts, such as strong ignorability and weak and super-exogeneity, are too restrictive if interest lies in average effects (i.e. not in distributional effects of the treatment). We suggest a new definition of exogeneity, KL-exogeneity. It does not rely on distributional assumptions and is not based on counterfactual random variables. As a consequence, it can be empirically tested using a proposed test that is simple to implement and distribution-free.

2.
This study develops a methodology of inference for a widely used Cliff–Ord type spatial model containing spatial lags in the dependent variable, exogenous variables, and the disturbance terms, while allowing for unknown heteroskedasticity in the innovations. We first generalize the GMM estimator suggested in  and  for the spatial autoregressive parameter in the disturbance process. We also define IV estimators for the regression parameters of the model and give results concerning the joint asymptotic distribution of those estimators and the GMM estimator. Much of the theory is kept general to cover a wide range of settings.
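A minimal sketch of the IV step for the regression parameters of such a Cliff–Ord model, using the standard instrument set [X, WX, W²X]. The weights matrix, data-generating values, and the omission of the GMM step for the disturbance autoregressive parameter are all simplifying assumptions, not the paper's full procedure:

```python
import numpy as np

def spatial_2sls(y, X, W):
    """2SLS for y = rho*W@y + X@beta + u with instrument set [X, W@X, W@W@X]."""
    Z = np.column_stack([W @ y, X])                   # regressors, including the spatial lag
    Xex = X[:, 1:]                                    # drop the constant when forming lagged instruments
    H = np.column_stack([X, W @ Xex, W @ W @ Xex])
    Zhat = H @ np.linalg.lstsq(H, Z, rcond=None)[0]   # project regressors onto the instruments
    return np.linalg.lstsq(Zhat, y, rcond=None)[0]    # (rho, beta_0, beta_1, ...)

rng = np.random.default_rng(0)
n = 200
W = np.zeros((n, n))
for i in range(n):                                    # simple two-neighbour contiguity weights
    W[i, (i + 1) % n] = W[i, (i - 1) % n] = 0.5
X = np.column_stack([np.ones(n), rng.normal(size=n)])
rho, beta = 0.4, np.array([1.0, 2.0])
u = rng.normal(size=n) * (1 + 0.5 * np.abs(X[:, 1]))  # heteroskedastic innovations (no spatial lag here)
y = np.linalg.solve(np.eye(n) - rho * W, X @ beta + u)
print(spatial_2sls(y, X, W))                          # approximately [0.4, 1.0, 2.0]
```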

3.
In this paper we propose estimators for the regression coefficients in censored duration models which are distribution free, impose no parametric specification on the baseline hazard function, and can accommodate general forms of censoring. The estimators are shown to have desirable asymptotic properties and Monte Carlo simulations demonstrate good finite sample performance. Among the data features the new estimators can accommodate are covariate-dependent censoring, double censoring, and fixed (individual or group specific) effects. We also examine the behavior of the estimator in an empirical illustration.

4.
We introduce test statistics based on generalized empirical likelihood methods that can be used to test simple hypotheses involving the unknown parameter vector in moment condition time series models. The test statistics generalize those in Guggenberger and Smith [2005. Generalized empirical likelihood estimators and tests under partial, weak and strong identification. Econometric Theory 21 (4), 667–709] from the i.i.d. to the time series context and are alternatives to those in Kleibergen [2005a. Testing parameters in GMM without assuming that they are identified. Econometrica 73 (4), 1103–1123] and Otsu [2006. Generalized empirical likelihood inference for nonlinear and time series models under weak identification. Econometric Theory 22 (3), 513–527]. The main feature of these tests is that their empirical null rejection probabilities are not affected much by the strength or weakness of identification. More precisely, we show that the statistics are asymptotically distributed as chi-square under both classical asymptotic theory and weak instrument asymptotics of Stock and Wright [2000. GMM with weak identification. Econometrica 68 (5), 1055–1096]. We also introduce a modification to Otsu's (2006) statistic that is computationally more attractive. A Monte Carlo study reveals that the finite-sample performance of the suggested tests is very competitive.
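As a point of reference for the identification-robust tests described above, here is a sketch of an S-type (Anderson–Rubin/GMM) statistic in the spirit of Stock and Wright (2000), cited in the abstract, for a linear i.i.d. IV model; the GEL statistics studied in the paper share the same chi-square null limit but are not reproduced here. The data-generating process and theta0 are illustrative.

```python
import numpy as np
from scipy import stats

def s_statistic(y, d, Z, theta0):
    """S(theta0) = n * gbar' Omega^{-1} gbar for moments g_i = z_i * (y_i - d_i * theta0)."""
    n, k = Z.shape
    g = Z * (y - d * theta0)[:, None]                 # n x k moment contributions
    gbar = g.mean(axis=0)
    Omega = np.cov(g, rowvar=False, bias=True)        # robust variance of the moments
    S = n * gbar @ np.linalg.solve(Omega, gbar)
    return S, 1 - stats.chi2.cdf(S, df=k)             # chi-square(k) null limit

rng = np.random.default_rng(1)
n, k = 500, 3
Z = rng.normal(size=(n, k))
v = rng.normal(size=n)
d = Z @ np.full(k, 0.1) + v                           # deliberately weak first stage
y = 1.0 * d + 0.8 * v + rng.normal(size=n)            # endogenous d, true theta = 1
print(s_statistic(y, d, Z, theta0=1.0))               # H0 is true here, so rejection is unlikely
```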

5.
6.
We provide analytical formulae for the asymptotic bias (ABIAS) and mean-squared error (AMSE) of the IV estimator, and obtain approximations thereof based on an asymptotic scheme which essentially requires the expectation of the first stage F-statistic to converge to a finite (possibly small) positive limit as the number of instruments approaches infinity. Our analytical formulae can be viewed as generalizing the bias and MSE results of [Richardson and Wu 1971. A note on the comparison of ordinary and two-stage least squares estimators. Econometrica 39, 973–982] to the case with nonnormal errors and stochastic instruments. Our approximations are shown to compare favorably with approximations due to [Morimune 1983. Approximate distributions of k-class estimators when the degree of overidentifiability is large compared with the sample size. Econometrica 51, 821–841] and [Donald and Newey 2001. Choosing the number of instruments. Econometrica 69, 1161–1191], particularly when the instruments are weak. We also construct consistent estimators for the ABIAS and AMSE, and we use these to further construct a number of bias corrected OLS and IV estimators, the properties of which are examined both analytically and via a series of Monte Carlo experiments.
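The analytic ABIAS/AMSE formulae are not reproduced here; the sketch below simply simulates the finite-sample bias and MSE of 2SLS as the first-stage strength (and hence the expected first-stage F) varies, to illustrate the phenomenon the approximations describe. Sample size, instrument count, and error correlation are arbitrary choices.

```python
import numpy as np

def iv_bias_mse(pi, n=200, K=5, rho=0.5, reps=2000, seed=0):
    """Monte Carlo bias and MSE of 2SLS when each of K instruments has first-stage coefficient pi."""
    rng = np.random.default_rng(seed)
    errors = []
    for _ in range(reps):
        Z = rng.normal(size=(n, K))
        e = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
        x = Z @ np.full(K, pi) + e[:, 0]              # first stage
        y = 1.0 * x + e[:, 1]                         # true coefficient = 1
        xhat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
        b = (xhat @ y) / (xhat @ x)                   # 2SLS with a single regressor
        errors.append(b - 1.0)
    errors = np.array(errors)
    return errors.mean(), (errors ** 2).mean()

for pi in (0.02, 0.1, 0.3):                           # weak to strong instruments (E[F] rises with pi)
    print(pi, iv_bias_mse(pi))                        # bias and MSE shrink as the instruments strengthen
```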

7.
This paper considers a linear triangular simultaneous equations model with conditional quantile restrictions. The paper adjusts for endogeneity by adopting a control function approach and presents a simple two-step estimator that exploits the partially linear structure of the model. The first step consists of estimation of the residuals of the reduced-form equation for the endogenous explanatory variable. The second step is series estimation of the primary equation with the reduced-form residual included nonparametrically as an additional explanatory variable. This paper imposes no functional form restrictions on the stochastic relationship between the reduced-form residual and the disturbance term in the primary equation conditional on observable explanatory variables. The paper presents regularity conditions for consistency and asymptotic normality of the two-step estimator. In addition, the paper provides some discussions on related estimation methods in the literature.
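A minimal sketch of the two-step procedure described above, under an illustrative data-generating process: step 1 recovers the reduced-form residual by least squares, step 2 runs a series regression of the outcome on the endogenous regressor and a low-order polynomial basis in that residual (the basis order is an assumption, not the paper's choice):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
z = rng.normal(size=(n, 2))                           # instruments
v = rng.normal(size=n)
x = z @ np.array([1.0, -0.5]) + v                     # reduced form for the endogenous regressor
u = 0.7 * v ** 2 - 0.7 + 0.3 * rng.normal(size=n)     # disturbance related to v nonlinearly
y = 1.0 + 2.0 * x + u                                 # primary equation, true slope = 2

# Step 1: reduced-form residuals
Z1 = np.column_stack([np.ones(n), z])
vhat = x - Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]

# Step 2: series regression with the residual entering through a polynomial basis
X2 = np.column_stack([np.ones(n), x, vhat, vhat ** 2, vhat ** 3])
coef = np.linalg.lstsq(X2, y, rcond=None)[0]
print(coef[1])                                        # estimate of the coefficient on x, close to 2
```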

8.
This paper is motivated by recent evidence that many univariate economic and financial time series have both nonlinear and long memory characteristics. Hence, this paper considers a general nonlinear, smooth transition regime autoregression which is embedded within a strongly dependent, long memory process. A time domain MLE with simultaneous estimation of the long memory, linear AR and nonlinear parameters is shown to have desirable asymptotic properties. The Bayesian and Hannan–Quinn information criteria are shown to provide consistent model selection procedures. The paper also considers an alternative two-step estimator where the original time series is fractionally filtered from an initial semi-parametric estimate of the long memory parameter. Simulation evidence indicates that the time domain MLE is generally superior to the two-step estimator. The paper also includes some applications of the methodology and estimation of a fractionally integrated, nonlinear autoregressive-ESTAR model to forward premium and real exchange rates.
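A minimal sketch of the two-step route mentioned above: fractionally filter the series with a given long-memory parameter d (here fixed rather than estimated semi-parametrically), then fit an ESTAR autoregression to the filtered series by nonlinear least squares. The data and starting values are illustrative only.

```python
import numpy as np
from scipy.optimize import least_squares

def frac_diff(x, d):
    """Apply the truncated binomial expansion of (1 - L)^d to the series x."""
    w = [1.0]
    for k in range(1, len(x)):
        w.append(-w[-1] * (d - k + 1) / k)
    w = np.array(w)
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])

def estar_resid(p, y):
    """Residuals of y_t = phi1*y_{t-1} + phi2*y_{t-1}*G(y_{t-1}) with an exponential transition G."""
    phi1, phi2, gamma, c = p
    ylag = y[:-1]
    G = 1.0 - np.exp(-gamma * (ylag - c) ** 2)
    return y[1:] - (phi1 * ylag + phi2 * ylag * G)

rng = np.random.default_rng(3)
y_raw = np.cumsum(rng.normal(size=500)) * 0.05 + rng.normal(size=500)   # illustrative persistent series
z = frac_diff(y_raw, d=0.3)                           # step 1: fractional filtering at a given d
fit = least_squares(estar_resid, x0=[0.5, -0.3, 1.0, 0.0], args=(z,),
                    bounds=([-np.inf, -np.inf, 0.0, -np.inf], np.inf))
print(fit.x)                                          # step 2: fitted ESTAR parameters
```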

9.
When location shifts occur, cointegration-based equilibrium-correction models (EqCMs) face forecasting problems. We consider alleviating such forecast failure by updating, intercept corrections, differencing, and estimating the future progress of an ‘internal’ break. Updating leads to a loss of cointegration when an EqCM suffers an equilibrium-mean shift, but helps when collinearities are changed by an ‘external’ break with the EqCM staying constant. Both mechanistic corrections help compared to retaining a pre-break estimated model, but an estimated model of the break process could outperform. We apply the approaches to EqCMs for UK M1, compared with updating a learning function as the break evolves.
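A stylized illustration, not the paper's EqCM application: an AR(1)-with-mean stand-in suffers a mean shift shortly before the forecast origin, and forecasts from the pre-break model are compared with an intercept-corrected version and a differenced (random-walk) forecast.

```python
import numpy as np

rng = np.random.default_rng(4)
T, H = 200, 10
mu = np.where(np.arange(T + H) < T - 5, 0.0, 2.0)     # equilibrium-mean shift shortly before the origin
y = np.zeros(T + H)
for t in range(1, T + H):
    y[t] = mu[t] + 0.7 * (y[t - 1] - mu[t]) + rng.normal(scale=0.3)

rho, m = 0.7, y[:T].mean()                            # model fitted before forecasting (AR coefficient treated as known)
fc_plain = m + rho ** np.arange(1, H + 1) * (y[T - 1] - m)
ic = y[T - 1] - (m + rho * (y[T - 2] - m))            # latest one-step forecast error
fc_ic = fc_plain + ic                                 # intercept-corrected forecasts
fc_diff = np.full(H, y[T - 1])                        # differenced model: random-walk forecast

actual = y[T:]
for name, f in [("no correction", fc_plain), ("intercept correction", fc_ic), ("differencing", fc_diff)]:
    print(name, np.sqrt(np.mean((actual - f) ** 2)))  # the mechanistic corrections typically lower the RMSE here
```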

10.
11.
We study the scope of local indirect least squares (LILS) methods for nonparametrically estimating average marginal effects of an endogenous cause X on a response Y in triangular structural systems that need not exhibit linearity, separability, or monotonicity in scalar unobservables. One main finding is negative: in the fully nonseparable case, LILS methods cannot recover the average marginal effect. LILS methods can nevertheless test the hypothesis of no effect in the general nonseparable case. We provide new nonparametric asymptotic theory, treating both the traditional case of observed exogenous instruments Z and the case where one observes only error-laden proxies for Z.
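A minimal sketch of the mechanics of an LILS-type ratio at a point z0: the local slope of E[Y|Z=z] divided by the local slope of E[X|Z=z], each from a Gaussian-kernel local linear fit. As the abstract stresses, in fully nonseparable models this ratio need not equal the average marginal effect; the bandwidth and data-generating process below are arbitrary.

```python
import numpy as np

def local_slope(z, v, z0, h):
    """Slope of a Gaussian-kernel local linear regression of v on z at the point z0."""
    w = np.exp(-0.5 * ((z - z0) / h) ** 2)
    X = np.column_stack([np.ones_like(z), z - z0])
    WX = X * w[:, None]
    return np.linalg.solve(X.T @ WX, WX.T @ v)[1]

rng = np.random.default_rng(5)
n = 2000
z = rng.uniform(-2, 2, size=n)                        # observed exogenous instrument
u = rng.normal(size=n)                                # scalar unobservable
x = 0.8 * z + 0.5 * u + rng.normal(scale=0.3, size=n) # endogenous cause
y = np.sin(x) + u                                     # outcome; u enters both equations, so x is endogenous
z0, h = 0.0, 0.3
print(local_slope(z, y, z0, h) / local_slope(z, x, z0, h))   # LILS-style ratio at z0
```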

12.
Regional employment offices provide placement services to job-seekers and employers and organize active labor market programs. In this paper, we carry out a quantitative evaluation of the employment offices’ performance in Switzerland based on production efficiency measures. We use Data Envelopment Analysis (DEA) to estimate the performance of all employment offices and then account for factors in the local operating environment that are outside managerial control. This approach, and the ranking of employment offices, may easily be interpreted by policymakers and provides guidelines for raising the efficiency of the public employment service. Our findings suggest that there is considerable room for improved efficiency in the employment service, which could lead to a lower level of structural unemployment. We also find that differences in the external operating environment have a significant influence upon the efficiency of employment offices.
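A minimal sketch of the DEA building block, assuming an input-oriented, constant-returns-to-scale formulation (the paper's exact orientation and the second-stage adjustment for the operating environment are not shown): each office's efficiency score solves a small linear programme.

```python
import numpy as np
from scipy.optimize import linprog

def dea_score(X, Y, j0):
    """Input-oriented CRS efficiency of unit j0; X is (units x inputs), Y is (units x outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                       # decision vector: [theta, lambda_1..lambda_n]
    A_in = np.hstack([-X[j0].reshape(m, 1), X.T])     # sum_j lambda_j x_ij <= theta * x_i,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # sum_j lambda_j y_rj >= y_r,j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[j0]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

rng = np.random.default_rng(6)
X = rng.uniform(1, 10, size=(12, 2))                  # inputs, e.g. staff and budget (made-up data)
Y = rng.uniform(1, 10, size=(12, 2))                  # outputs, e.g. placements (made-up data)
print([round(dea_score(X, Y, j), 3) for j in range(12)])   # 1.0 marks offices on the frontier
```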

13.
This paper proposes a contemporaneous smooth transition threshold autoregressive model (C-STAR) as a modification of the smooth transition threshold autoregressive model surveyed in Teräsvirta [1998. Modelling economic relationships with smooth transition regressions. In: Ullah, A., Giles, D.E.A. (Eds.), Handbook of Applied Economic Statistics. Marcel Dekker, New York, pp. 507–552.], in which the regime weights depend on the ex ante probability that a latent regime-specific variable will exceed a threshold value. We argue that the contemporaneous model is well suited to rational expectations applications (and pricing exercises), in that it does not require the initial regimes to be predetermined. We investigate the properties of the model and evaluate its finite-sample maximum likelihood performance. We also propose a method to determine the number of regimes based on a modified Hansen [1992. The likelihood ratio test under nonstandard conditions: testing the Markov switching model of GNP. Journal of Applied Econometrics 7, S61–S82.] procedure. Furthermore, we construct multiple-step-ahead forecasts and evaluate the forecasting performance of the model. Finally, an empirical application to the short-term interest rate yield is presented and discussed.
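One stylized reading of the regime-weighting idea, not the paper's C-STAR likelihood: the weight on the upper regime at time t is the ex ante probability that a latent Gaussian variable, centred on the lagged observation, exceeds a threshold. All parameter values are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
T, c, sigma_latent = 300, 0.0, 1.0
phi_low, phi_high = 0.3, 0.9                          # regime-specific AR(1) coefficients
y = np.zeros(T)
for t in range(1, T):
    w = norm.cdf((y[t - 1] - c) / sigma_latent)       # ex ante prob. the latent variable exceeds the threshold c
    phi_t = (1 - w) * phi_low + w * phi_high          # probability-weighted mix of the two regimes
    y[t] = phi_t * y[t - 1] + rng.normal(scale=0.2)
print(np.round(y[:5], 3))
```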

14.
In this paper we propose an approach to both estimate and select unknown smooth functions in an additive model with potentially many functions. Each function is written as a linear combination of basis terms, with coefficients regularized by a proper linearly constrained Gaussian prior. Given any potentially rank deficient prior precision matrix, we show how to derive linear constraints so that the corresponding effect is identified in the additive model. This allows for the use of a wide range of bases and precision matrices in priors for regularization. By introducing indicator variables, each constrained Gaussian prior is augmented with a point mass at zero, thus allowing for function selection. Posterior inference is calculated using Markov chain Monte Carlo and the smoothness in the functions is both the result of shrinkage through the constrained Gaussian prior and model averaging. We show how using non-degenerate priors on the shrinkage parameters enables the application of substantially more computationally efficient sampling schemes than would otherwise be the case. We show the favourable performance of our approach when compared to two contemporary alternative Bayesian methods. To highlight the potential of our approach in high-dimensional settings we apply it to estimate two large seemingly unrelated regression models for intra-day electricity load. Both models feature a variety of different univariate and bivariate functions which require different levels of smoothing, and where component selection is meaningful. Priors for the error disturbance covariances are selected carefully and the empirical results provide a substantive contribution to the electricity load modelling literature in their own right.
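A minimal sketch of the basic building block, assuming a truncated-linear basis and a rank-deficient second-difference prior precision: the conditional posterior mean of the basis coefficients is a ridge-type penalised fit. The identification constraints, point-mass selection priors, and MCMC of the full approach are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(8)
n, K, sigma2, tau = 300, 20, 0.1, 50.0
x = np.sort(rng.uniform(0, 1, size=n))
y = np.sin(2 * np.pi * x) + rng.normal(scale=np.sqrt(sigma2), size=n)

knots = np.linspace(0, 1, K + 2)[1:-1]                # interior knots
B = np.maximum(x[:, None] - knots[None, :], 0.0)      # truncated-linear basis terms
D = np.diff(np.eye(K), n=2, axis=0)                   # second-difference operator
P = D.T @ D                                           # rank-deficient prior precision matrix

A = B.T @ B / sigma2 + tau * P                        # conditional posterior precision of the coefficients
b = B.T @ y / sigma2
beta_hat = np.linalg.solve(A, b)                      # conditional posterior mean (a ridge-type fit)
fhat = B @ beta_hat
print(np.round(fhat[:5], 3))                          # smoothed values of the estimated function
```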

15.
In this paper, we propose a flexible, parametric class of switching regime models allowing for both skewed and fat-tailed outcome and selection errors. Specifically, we model the joint distribution of each outcome error and the selection error via a newly constructed class of multivariate distributions which we call generalized normal mean–variance mixture distributions. We extend Heckman’s two-step estimation procedure for the Gaussian switching regime model to the new class of models. When the distributions of the outcome errors are asymmetric, we show that an additional correction term accounting for skewness in the outcome error distribution (besides the analogue of the well known inverse Mills ratio) needs to be included in the second step regression. We use the two-step estimators of parameters in the model to construct simple estimators of average treatment effects and establish their asymptotic properties. Simulation results confirm the importance of accounting for skewness in the outcome errors when estimating the model parameters, the average treatment effect, and the treatment effect for the treated.
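For orientation, a sketch of the Gaussian baseline that the paper extends: Heckman's two-step procedure with a first-step probit and the inverse Mills ratio in the second-step regression, shown for a single selected outcome equation. The paper's additional skewness-correction term and the regime-by-regime application are not included, and the data-generating process is illustrative.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(9)
n = 5000
x = rng.normal(size=n)
z = rng.normal(size=n)
e_sel = rng.normal(size=n)                            # selection error
u = 0.6 * e_sel + 0.8 * rng.normal(size=n)            # outcome error, correlated with selection
select = 0.5 * x + 1.0 * z + e_sel > 0                # selection / regime indicator
y = 1.0 + 2.0 * x + u                                 # outcome, used only where select is True

# Step 1: probit for selection, then the inverse Mills ratio
W = sm.add_constant(np.column_stack([x, z]))
probit = sm.Probit(select.astype(float), W).fit(disp=0)
idx = W @ probit.params
imr = norm.pdf(idx) / norm.cdf(idx)

# Step 2: outcome regression on the selected sample, augmented with the IMR
Xs = sm.add_constant(np.column_stack([x[select], imr[select]]))
print(sm.OLS(y[select], Xs).fit().params)             # roughly [1.0, 2.0, 0.6]; 0.6 = cov(u, e_sel)
```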

16.
Scanner data for fast moving consumer goods typically amount to panels of time series where both N and T are large. To reduce the number of parameters and to shrink parameters towards plausible and interpretable values, Hierarchical Bayes models turn out to be useful. Such models contain, at the second level, a stochastic model that describes the parameters of the first level.
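A minimal sketch of the two-level idea, with hyperparameters fixed rather than sampled: unit-level slopes get a second-level normal model, so each conditional posterior mean is a precision-weighted average of the unit's own least-squares estimate and the population mean.

```python
import numpy as np

rng = np.random.default_rng(10)
N, T = 50, 30                                         # many cross-section units, moderate T
mu, tau2, sigma2 = 1.5, 0.2, 1.0                      # second-level mean/variance and noise variance (fixed here)
beta = rng.normal(mu, np.sqrt(tau2), size=N)          # true unit-level slopes drawn from the second level
x = rng.normal(size=(N, T))
y = beta[:, None] * x + rng.normal(scale=np.sqrt(sigma2), size=(N, T))

b_ols = (x * y).sum(axis=1) / (x * x).sum(axis=1)     # per-unit least-squares slopes
prec_data = (x * x).sum(axis=1) / sigma2              # data precision per unit
prec_prior = 1.0 / tau2                               # second-level (prior) precision
b_post = (prec_data * b_ols + prec_prior * mu) / (prec_data + prec_prior)

print(np.mean((b_ols - beta) ** 2), np.mean((b_post - beta) ** 2))   # shrinkage lowers the MSE
```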

17.
This paper considers the issue of selecting the number of regressors and the number of structural breaks in multivariate regression models in the possible presence of multiple structural changes. We develop a modified Akaike information criterion (AIC), a modified Mallows’ Cp criterion and a modified Bayesian information criterion (BIC). The penalty terms in these criteria are shown to be different from the usual terms. We prove that the modified BIC consistently selects the regressors and the number of breaks whereas the modified AIC and the modified Cp criterion tend to overfit with positive probability. The finite sample performance of these criteria is investigated through Monte Carlo simulations and it turns out that our modification is successful in comparison to the classical model selection criteria and the sequential testing procedure robust to heteroskedasticity and autocorrelation.
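An illustrative sketch of break-number selection by an information criterion, using the standard BIC penalty on a univariate mean-shift model with at most one break; the paper's modified criteria use different penalty terms and handle multiple breaks in multivariate regressions.

```python
import numpy as np

def bic_breaks(y, trim=10):
    """Compare BIC for zero versus one mean break in a univariate series."""
    T = len(y)
    ssr0 = np.sum((y - y.mean()) ** 2)
    bic = {0: T * np.log(ssr0 / T) + 1 * np.log(T)}   # one mean, no break
    crit = []
    for tb in range(trim, T - trim):                  # admissible break dates
        ssr = np.sum((y[:tb] - y[:tb].mean()) ** 2) + np.sum((y[tb:] - y[tb:].mean()) ** 2)
        crit.append(T * np.log(ssr / T) + 3 * np.log(T))   # two means plus one break date
    bic[1] = min(crit)
    return min(bic, key=bic.get), bic

rng = np.random.default_rng(11)
y = np.r_[rng.normal(0, 1, 100), rng.normal(1.5, 1, 100)]   # a single mean shift
print(bic_breaks(y))                                  # the one-break model is selected
```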

18.
This paper proposes a testing strategy for the null hypothesis that a multivariate linear rational expectations (LRE) model may have a unique stable solution (determinacy) against the alternative of multiple stable solutions (indeterminacy). The testing problem is addressed by a misspecification-type approach in which the overidentifying restrictions test obtained from the estimation of the system of Euler equations of the LRE model through the generalized method of moments is combined with a likelihood-based test for the cross-equation restrictions that the model places on its reduced form solution under determinacy. The resulting test has no power against a particular class of indeterminate equilibria, hence non-rejection of the null hypothesis cannot be interpreted conclusively as evidence of determinacy. On the other hand, this test (i) circumvents the nonstandard inferential problem generated by the presence of the auxiliary parameters that appear under indeterminacy and that are not identifiable under determinacy, (ii) does not involve inequality parametric restrictions and hence the use of nonstandard inference, (iii) is consistent against the dynamic misspecification of the LRE model, and (iv) is computationally simple. Monte Carlo simulations show that the suggested testing strategy delivers reasonable size coverage and power against dynamic misspecification in finite samples. An empirical illustration focuses on the determinacy/indeterminacy of a New Keynesian monetary business cycle model of the US economy.
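A sketch of the first ingredient only — the overidentifying-restrictions (Hansen J) test from two-step GMM estimation of a linear moment condition — under an illustrative data-generating process with valid instruments; the likelihood-based test of the cross-equation restrictions is not shown.

```python
import numpy as np
from scipy import stats

def gmm_j_test(y, X, Z):
    """Two-step GMM for E[z_t (y_t - x_t' theta)] = 0 and the Hansen J statistic."""
    n, k = Z.shape
    p = X.shape[1]
    theta1 = np.linalg.lstsq(Z @ np.linalg.lstsq(Z, X, rcond=None)[0], y, rcond=None)[0]   # 2SLS first step
    g = Z * (y - X @ theta1)[:, None]
    Wmat = np.linalg.inv(g.T @ g / n)                 # efficient weighting matrix
    A = X.T @ Z @ Wmat @ Z.T @ X
    b = X.T @ Z @ Wmat @ Z.T @ y
    theta2 = np.linalg.solve(A, b)                    # two-step efficient GMM estimate
    gbar = Z.T @ (y - X @ theta2) / n
    J = n * gbar @ Wmat @ gbar
    return J, 1 - stats.chi2.cdf(J, df=k - p)         # chi-square with (#moments - #parameters) df

rng = np.random.default_rng(12)
n = 800
Zx = rng.normal(size=(n, 3))
v = rng.normal(size=n)
x = Zx @ np.array([1.0, 0.5, -0.5]) + v
X = np.column_stack([np.ones(n), x])
y = X @ np.array([0.5, 1.0]) + 0.6 * v + rng.normal(size=n)
Z = np.column_stack([np.ones(n), Zx])                 # instruments include the constant
print(gmm_j_test(y, X, Z))                            # valid instruments, so typically a large p-value
```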

19.
This paper considers the linear model with endogenous regressors and multiple changes in the parameters at unknown times. It is shown that minimization of a Generalized Method of Moments criterion yields inconsistent estimators of the break fractions, but minimization of the Two Stage Least Squares (2SLS) criterion yields consistent estimators of these parameters. We develop a methodology for estimation and inference of the parameters of the model based on 2SLS. The analysis covers the cases where the reduced form is either stable or unstable. The methodology is illustrated via an application to the New Keynesian Phillips Curve for the US.
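A minimal sketch of the estimation idea under a stable reduced form (one of the cases covered): form first-stage fitted values on the full sample, then choose the break date that minimises the total second-stage residual sum of squares across the two subsamples. The trimming, the data-generating process, and the single-break restriction are simplifications.

```python
import numpy as np

def second_stage_ssr(y, Xhat):
    b = np.linalg.lstsq(Xhat, y, rcond=None)[0]
    return np.sum((y - Xhat @ b) ** 2)

def break_2sls(y, X, Z, trim=20):
    """Pick the break date minimising the total 2SLS (second-stage) SSR over two subsamples."""
    T = len(y)
    Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]   # first stage on the full sample (stable reduced form)
    ssr = [second_stage_ssr(y[:tb], Xhat[:tb]) + second_stage_ssr(y[tb:], Xhat[tb:])
           for tb in range(trim, T - trim)]
    return trim + int(np.argmin(ssr))

rng = np.random.default_rng(13)
T = 400
Z = np.column_stack([np.ones(T), rng.normal(size=(T, 2))])
v = rng.normal(size=T)
x = Z @ np.array([0.0, 1.0, -1.0]) + v                # endogenous regressor
X = np.column_stack([np.ones(T), x])
beta = np.where(np.arange(T)[:, None] < 250, [1.0, 1.0], [1.0, 2.0])   # slope shift at t = 250
y = (X * beta).sum(axis=1) + 0.5 * v + rng.normal(size=T)
print(break_2sls(y, X, Z))                            # close to the true break date 250
```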

20.
We introduce a class of instrumental quantile regression methods for heterogeneous treatment effect models and simultaneous equations models with nonadditive errors and offer computable methods for estimation and inference. These methods can be used to evaluate the impact of endogenous variables or treatments on the entire distribution of outcomes. We describe an estimator of the instrumental variable quantile regression process and the set of inference procedures derived from it. We focus our discussion of inference on tests of distributional equality, constancy of effects, conditional dominance, and exogeneity. We apply the procedures to characterize the returns to schooling in the U.S.
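A minimal sketch of an instrumental variable quantile regression estimator for a single endogenous binary treatment, in the spirit of the inverse quantile regression idea: profile over candidate treatment effects, run an ordinary quantile regression of y - a*d on the exogenous variables and the instrument, and pick the value at which the instrument's coefficient is closest to zero. The grid, the data-generating process, and the focus on the median are illustrative choices; the paper's full inference procedures are not shown.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(14)
n, tau = 4000, 0.5
z = rng.binomial(1, 0.5, size=n).astype(float)        # binary instrument
v = rng.normal(size=n)
d = (0.3 + 1.0 * z + v > 0).astype(float)             # endogenous treatment
y = 1.0 + 2.0 * d + 0.8 * v + rng.normal(size=n)      # treatment effect of 2 at the median

X = sm.add_constant(z)                                # exogenous part: constant plus the instrument
grid = np.linspace(0.0, 4.0, 81)                      # candidate values of the treatment effect
coef_z = [sm.QuantReg(y - a * d, X).fit(q=tau).params[1] for a in grid]
alpha_hat = grid[np.argmin(np.abs(coef_z))]
print(alpha_hat)                                      # close to 2
```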
