Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
This paper considers a two-way error component model with no lagged dependent variable and investigates the performance of various testing and estimation procedures applied to this model by means of Monte Carlo experiments. The following results were found: (1) The Chow test performed poorly in testing the stability of cross-section regressions over time and in testing the stability of time-series regressions across regions. (2) The Roy-Zellner test performed well and is recommended for testing the poolability of the data. (3) The Hausman specification test, employed to test the orthogonality assumption, gave a low frequency of Type I errors. (4) The Lagrange multiplier test, employed to test for zero variance components, did well except in cases where it was badly needed. (5) The problem of negative estimates of the variance components was found to be more serious in the two-way model than in the one-way model. However, replacing the negative variance estimates by zero did not have a serious effect on the performance of the second-round GLS estimates of the regression coefficients. (6) As in the one-way model, all the two-stage estimation methods performed reasonably well. (7) Better estimates of the variance components did not necessarily lead to better second-round GLS estimates of the regression coefficients.
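The zero-truncation of negative variance-component estimates mentioned in result (5) can be sketched concretely. The snippet below is a minimal illustration, not the paper's exact procedure: it uses ANOVA-type estimates for a simpler one-way model (the two-way case adds a time component but handles negative estimates the same way), and the function name is illustrative.

```python
def truncated_variance_components(ms_between, ms_within, group_size):
    """ANOVA-type variance component estimates for a one-way error
    component model (illustrative sketch):
        sigma2_e  = MS_within
        sigma2_mu = (MS_between - MS_within) / T
    A negative raw estimate of sigma2_mu is replaced by zero before it
    is plugged into the second-round (feasible) GLS step.
    """
    sigma2_e = ms_within
    sigma2_mu = max(0.0, (ms_between - ms_within) / group_size)
    return sigma2_mu, sigma2_e

# When MS_between < MS_within, the raw estimate (0.8 - 1.0)/5 = -0.04
# is negative, so it is set to 0.0:
print(truncated_variance_components(ms_between=0.8, ms_within=1.0, group_size=5))
```

As result (5) notes, this truncation has little effect on the quality of the resulting second-round GLS coefficient estimates.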

2.
Bayesian and empirical Bayesian estimation methods are reviewed and proposed for the row and column parameters in two-way contingency tables without interaction. Rasch's multiplicative Poisson model for misreadings is discussed in an example. The case is treated where assumptions of exchangeability are reasonable a priori for the unknown parameters. Two different types of prior distributions are compared. It appears that gamma priors yield more tractable results than lognormal priors.

3.
H. Nyquist, Metrika, 1987, 34(1): 177-183
Robust alternatives to the method of moments estimator for the simple structural errors-in-variables model are proposed. Consistency and asymptotic normality of the estimators are established, and the asymptotic variance is derived using the influence curve. Results from a simulation experiment indicate superior small-sample performance of the robust alternatives over the method of moments estimator when the measurement errors follow a contaminated normal distribution. Research reported in this paper was supported by a grant from Sundsvallsbanken.

4.
Summary: In this paper the concept of 'rank-interaction' is introduced, and a distribution-free method for testing against the presence of rank-interaction is suggested for a two-way layout (classification) with m (> 1) observations per cell. Roughly speaking, rank-interaction is the phenomenon in which the ranks of the levels of some relevant variable differ across the classes of the other factor. The exact null distribution of the test statistic has been computed in some cases, and the asymptotic distribution under the null hypothesis has been derived. A test suggested by J. V. Bradley in his book 'Distribution-free Statistical Tests' [2] is discussed. In the opinion of the authors it is doubtful whether the asymptotic null distribution of the test statistic, as given by Bradley, is correct. Bradley's test was intended to be sensitive to the presence of interactions defined in the usual way, and hence not only to rank-interaction; the same applies to methods proposed by some other authors. We claim that situations exist where one should test against rank-interaction and not against the usual, more general alternative.

5.
This paper analyzes an approach to correcting spurious regressions involving unit-root nonstationary variables by generalized least squares (GLS) using asymptotic theory. This analysis leads to a new robust estimator and a new test for dynamic regressions. The robust estimator is consistent for structural parameters not just when the regression error is stationary but also when it is unit-root nonstationary under certain conditions. We also develop a Hausman-type test for the null hypothesis of cointegration for dynamic ordinary least squares (OLS) estimation. We demonstrate our estimation and testing methods in three applications: (i) long-run money demand in the U.S., (ii) output convergence among industrial and developing countries, and (iii) purchasing power parity (PPP) for traded and non-traded goods.

6.
A large-scale regional econometric model is estimated using six estimation techniques, including Iterated Instrumental Variables and Iterated Two-Stage Least-Squares. Following estimation, the model is simulated and a seventh technique called PANGLOSS is derived. The seven techniques are then compared in ex post and ex ante tests.

7.
Robust normal reference bandwidth for kernel density estimation
Bandwidth selection is the central problem in kernel density estimation, the most popular method of density estimation. The classical normal reference bandwidth usually oversmooths the density estimate, while more sophisticated bandwidth selectors have computational problems (they may even fail to exist) and are not robust against outliers in the sample. A highly robust normal reference bandwidth is proposed, which adapts to different types of densities.
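For context, the classical robust-scale variant of the normal reference rule, h = 0.9 min(sd, IQR/1.34) n^(-1/5), can be written in a few lines. This is the standard textbook rule, shown only as a baseline; it is not the further-robustified bandwidth the abstract proposes.

```python
import numpy as np

def robust_normal_reference_bandwidth(x):
    """Classical normal reference rule with a robust scale estimate:
    h = 0.9 * min(sample sd, IQR / 1.34) * n**(-1/5).
    Taking the minimum of the two scale estimates guards against
    outliers inflating the standard deviation."""
    x = np.asarray(x, dtype=float)
    n = x.size
    sd = x.std(ddof=1)
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    return 0.9 * min(sd, iqr / 1.34) * n ** (-0.2)
```

For a standard normal sample both scale estimates are close to 1, so the rule reduces to roughly 0.9 n^(-1/5); heavy contamination shrinks the IQR-based estimate and hence the bandwidth, which is the robustness the rule buys.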

8.
This paper investigates a class of penalized quantile regression estimators for panel data. The penalty serves to shrink a vector of individual-specific effects toward a common value. The degree of this shrinkage is controlled by a tuning parameter λ. It is shown that the class of estimators is asymptotically unbiased and Gaussian when the individual effects are drawn from a class of zero-median distribution functions. The tuning parameter λ can thus be selected to minimize estimated asymptotic variance. Monte Carlo evidence reveals that the estimator can significantly reduce the variability of the fixed-effects version of the estimator without introducing bias.
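The objective such an estimator minimizes can be sketched as the quantile (check) loss plus an l1 penalty on the individual effects. This is an illustrative sketch of the penalized objective only (names and the l1 form of the penalty are assumptions here, not the paper's notation), not the optimization routine.

```python
import numpy as np

def check(u, tau):
    """Koenker's check function: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def penalized_qr_objective(beta, alpha, y, x, groups, tau, lam):
    """Penalized panel quantile regression objective (sketch):
    quantile loss of y - x'beta - alpha_i plus lam * sum_i |alpha_i|,
    so larger lam shrinks the individual effects alpha_i toward zero."""
    resid = y - x @ beta - alpha[groups]
    return check(resid, tau).sum() + lam * np.abs(alpha).sum()
```

As λ grows, all α_i are driven to a common value (zero), recovering a pooled quantile regression; λ = 0 gives the unpenalized fixed-effects version whose variability the penalty is meant to reduce.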

9.
Fixed-width confidence interval estimation problems for location parameters of negative exponential populations have been studied. Three-stage sampling procedures have been developed for both the one- and two-sample situations. Our discussions are primarily concerned with second-order expansions of various characteristics of the proposed procedures including those for the achieved coverage probability in either problem. Some simulated results are also presented to indicate the usefulness of our procedures for moderate sample sizes.

10.
Mixed-integer programming models for the two-group discriminant problem appear to be more promising, in terms of accuracy, than linear programming models, but at a substantial computational cost. This paper poses a particular mixed-integer model and suggests heuristics, based on linear programming, for obtaining suboptimal but “good” solutions to it. The heuristics are compared to the mixed-integer model using Monte Carlo simulation with Gaussian data.

11.
In 1980, Palm and Zellner presented a number of joint or system estimation and testing procedures for the final equations, transfer functions and structural equations of a multivariate dynamic model with normally distributed vector moving average errors. In this paper, we show that there is a flaw in their argument which renders most of the procedures invalid. Possible alternative procedures are described.

12.
Joint two-step estimation procedures which have the same asymptotic properties as the maximum likelihood (ML) estimator are developed for the final equation, transfer function and structural form of a multivariate dynamic model with normally distributed vector-moving average errors. The ML estimator under fixed and known initial values is obtained by iterating the procedure until convergence. The asymptotic distribution of the two-step estimators is used to construct large sample testing procedures for the different forms of the model.

13.
In this paper, we study a robust and efficient estimation procedure for the order of finite mixture models based on minimizing a penalized density power divergence. For this task, we use the locally conic parametrization approach developed by Dacunha-Castelle and Gassiat (ESAIM Probab Stat 285–317, 1997a; Ann Stat 27:1178–1209, 1999), and verify that the resulting minimum penalized density power divergence estimator is consistent. Simulation results are provided for illustration.

14.
Counternarcotics interdiction efforts have traditionally relied on historically determined sorting criteria or “best guess” to find and classify suspected smuggling traffic. We present a more quantitative approach which incorporates customized database applications, graphics software and statistical modeling techniques to develop forecasting and classification models. Preliminary results show that statistical methodology can improve interdiction rates and reduce forecast error. The idea of predictive modeling is thus gaining support in the counterdrug community. The problem is divided into sea, air and land forecasting, only part of which will be addressed here. The maritime problem is solved using multiple regression in lieu of multivariate time series. This model predicts illegal boat counts by behavior and geographic region. We developed support software to present the forecasts and to automate the process of performing periodic model updates. During the period, the model was in use at Coast Guard Headquarters. Because of deterrence provided by improved intervention, the vessel seizure rate declined from 1 every 36 hours to 1 every 6 months. Due in part to the success of the sea model, the maritime movement of marijuana has ceased to be a major threat. The air problem is more complex, and required us to locally design data collection and display software. Intelligence analysts are using a customized relational database application with a map overlay to perform visual pattern recognition of smuggling routes. We are solving the modeling portion of the air problem using multiple regression for regional forecasts of traffic density, and discriminant analysis to develop tactical models that classify “good guys” and “bad guys”. The air models are still under development, but we discuss some modeling considerations and preliminary results. The land problem is even more difficult, and data collection is still in progress.

15.
This paper considers the consistent estimation of nonlinear errors-in-variables models. It adopts the functional modeling approach by assuming that the true but unobserved regressors are random variables but making no parametric assumption on the distribution from which the latent variables are drawn. This paper shows how the information extracted from the replicate measurements can be used to identify and consistently estimate a general nonlinear errors-in-variables model. The identification is established through characteristic functions. The estimation procedure involves nonparametric estimation of the conditional density of the latent variables given the measurements using the identification results at the first stage, and at the second stage, a semiparametric nonlinear least-squares estimator is proposed. The consistency of the proposed estimator is also established. Finite sample performance of the estimator is investigated through a Monte Carlo study.

16.
This paper describes a deep-learning-based time-series forecasting method that was ranked third in the accuracy challenge of the M5 competition. We solved the problem using a deep-learning approach based on DeepAR, which is an auto-regressive recurrent network model conditioned on historical inputs. To address the intermittent and irregular characteristics of sales demand, we modified the training procedure of DeepAR; instead of using actual values for the historical inputs, our model uses values sampled from a trained distribution and feeds them to the network as past values. We obtained the final result using an ensemble of multiple models to make a robust and stable prediction. To appropriately select a model for the ensemble, each model was evaluated using the average weighted root mean squared scaled error, calculated for all levels of a wide range of past periods.
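The per-series building block of the evaluation metric mentioned above, the root mean squared scaled error (RMSSE), is easy to state: forecast MSE divided by the mean squared one-step naive error on the training history. This sketch shows only the single-series RMSSE; the M5 metric additionally weights and averages it across aggregation levels, which is not reproduced here.

```python
import numpy as np

def rmsse(y_train, y_true, y_pred):
    """Root mean squared scaled error for one series:
    sqrt( mean((y_true - y_pred)^2) / mean(diff(y_train)^2) ).
    The denominator is the in-sample MSE of the one-step naive
    forecast, so RMSSE < 1 means beating the naive benchmark."""
    naive_mse = np.mean(np.diff(y_train) ** 2)
    return np.sqrt(np.mean((y_true - y_pred) ** 2) / naive_mse)
```

Because the scale comes from the training history, the score is comparable across the intermittent, differently-scaled sales series that the competition involves.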

17.
This paper carries out a Bayesian analysis of the Hildreth-Houck (1968) random coefficient model and applies it to some cross-section production function data. Posterior distributions for mean coefficients, actual coefficients, variances and variance ratios are derived. The variance ratio posteriors are largely uninformative but they do lead to relatively informative densities on the variances, and the problem of negative variance estimates, obtained with previous techniques, is overcome. Posterior densities for the mean coefficients are not extremely sensitive to the variance ratios.

18.
This paper proposes a new semiparametric estimator for the truncated regression model under the independence restriction. Many existing approaches such as those in Lee (1992) and Honoré and Powell (1994) are moment-based methods, whereas our approach makes use of the entire truncated distribution. As a result, our approach is expected to require weaker identification and to have more favorable performance. Our simulation results suggest that our estimator outperforms that of Lee (1992) and Honoré and Powell (1994) in a variety of designs. Our estimator is shown to be consistent and asymptotically normal.

19.
The existing semiparametric estimation literature has mainly focused on univariate Tobit models and no semiparametric estimation has been considered for bivariate Tobit models. In this paper, we consider semiparametric estimation of the bivariate Tobit model proposed by Amemiya (1974), under the independence condition without imposing any parametric restriction on the error distribution. Our estimator is shown to be consistent and asymptotically normal, and simulation results show that our estimator performs well in finite samples. It is also worth noting that while Amemiya’s (1974) instrumental variables estimator (IV) requires the normality assumption, our semiparametric estimator actually outperforms his IV estimator even when normality holds. Our approach can be extended to higher dimensional multivariate Tobit models.

20.
Maximum likelihood is used to estimate a generalized autoregressive conditional heteroskedastic (GARCH) process where the residuals have a conditional stable distribution (GARCH-stable). The scale parameter is modelled such that a GARCH process with normally distributed residuals is a special case. The usual methods of estimating the parameters of the stable distribution assume constant scale and will underestimate the characteristic exponent when the scale parameter follows a GARCH process. The parameters of the GARCH-stable model are estimated with daily foreign currency returns. Estimates of characteristic exponents are higher with the GARCH-stable than when independence is assumed. Monte Carlo hypothesis testing procedures, however, reject our GARCH-stable model at the 1% significance level in four out of five cases.
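The time-varying scale at the heart of such a model follows the standard GARCH(1,1) recursion. The sketch below shows that recursion for the familiar conditional-variance case (the normal special case the abstract mentions); the GARCH-stable model drives the stable distribution's scale parameter with the same kind of recursion, and the function name here is illustrative.

```python
def garch_scale_path(residuals, omega, alpha, beta, sigma2_0):
    """GARCH(1,1) conditional-variance recursion:
        sigma2_t = omega + alpha * e_{t-1}**2 + beta * sigma2_{t-1}.
    Returns the variance path implied by a residual sequence and a
    starting value sigma2_0 (one sigma2 per residual)."""
    sigma2 = [sigma2_0]
    for e in residuals[:-1]:
        sigma2.append(omega + alpha * e ** 2 + beta * sigma2[-1])
    return sigma2
```

Because large residuals inflate subsequent scale values, fitting a constant-scale stable law to such data mistakes volatility clustering for heavy tails, which is why the constant-scale methods underestimate the characteristic exponent.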


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号