Similar documents (20 results found)
1.
This paper uses local-to-unity theory to evaluate the asymptotic mean-squared error (AMSE) and forecast expected squared error from least-squares estimation of an autoregressive model with a root close to unity. We investigate unconstrained estimation, estimation imposing the unit root constraint, pre-test estimation, model selection estimation, and model average estimation. We find that the asymptotic risk depends only on the local-to-unity parameter, facilitating simple graphical comparisons. Our results strongly caution against pre-testing. Strong evidence supports averaging based on Mallows weights. In particular, our Mallows averaging method has uniformly and substantially smaller risk than the conventional unconstrained estimator, and this holds for autoregressive roots far from unity. Our averaging estimator is a new approach to forecast combination.
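As a rough illustration of the Mallows-weighted averaging idea (a minimal sketch, not the paper's exact formula), the code below combines an unrestricted AR(1) fit with a fit that imposes the unit root, choosing the weight by grid search over a Mallows-type criterion; the series and setup are hypothetical.

```python
import numpy as np

def mallows_average_ar1(y):
    """Combine unrestricted and unit-root-restricted AR(1) fits with a Mallows-type weight.

    Both models include an intercept; the restricted model imposes a coefficient
    of 1 on the lagged level (a random walk with drift)."""
    y = np.asarray(y, dtype=float)
    y1, y0 = y[1:], y[:-1]                       # y_t and y_{t-1}
    X = np.column_stack([np.ones_like(y0), y0])  # intercept + lag

    # Unrestricted OLS: y_t = a + rho * y_{t-1} + e_t  (2 parameters)
    beta_u, *_ = np.linalg.lstsq(X, y1, rcond=None)
    e_u = y1 - X @ beta_u

    # Restricted fit imposing rho = 1: y_t - y_{t-1} = a + e_t  (1 parameter)
    a_r = np.mean(y1 - y0)
    e_r = y1 - (a_r + y0)

    # Error-variance estimate from the larger (unrestricted) model
    n = len(y1)
    sigma2 = e_u @ e_u / (n - 2)

    # Mallows-type criterion over the weight w placed on the unrestricted model
    def criterion(w):
        e_w = w * e_u + (1 - w) * e_r
        k_w = w * 2 + (1 - w) * 1                # "effective" parameter count
        return e_w @ e_w + 2 * sigma2 * k_w

    grid = np.linspace(0.0, 1.0, 101)
    w_star = grid[np.argmin([criterion(w) for w in grid])]
    beta_avg = w_star * beta_u + (1 - w_star) * np.array([a_r, 1.0])
    return w_star, beta_avg

rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(200))          # near-unit-root series for illustration
print(mallows_average_ar1(y))
```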

2.
This paper is the second of a series of two which describe the estimation and simulation of a stock-flow consistent macro-economic model of the UK economy. The first part (Davis, 1987) surveyed the theoretical literature on stock-adjustment dynamics, criticized existing UK forecasting models for omitting many potential stock-flow interactions and gave an outline of the model which is constructed here. The estimation and simulation results suggest that variables encapsulating such stock-flow effects are frequently significant in the estimation of key equations, and that their inclusion may make a sizeable difference to the simulation properties of a model.

3.
The normal-gamma stochastic frontier model was proposed in Greene (1990) and Beckers and Hammond (1987) as an extension of the normal-exponential proposed in the original derivations of the stochastic frontier by Aigner, Lovell and Schmidt (1977). The normal-gamma model has the virtue of providing a richer and more flexible parameterization of the inefficiency distribution in the stochastic frontier model than either of the canonical forms, normal-half normal and normal-exponential. However, several attempts to operationalize the normal-gamma model have met with very limited success, as the log likelihood is possessed of a significant degree of complexity. This note will propose an alternative approach to estimation of this model based on the method of maximum simulated likelihood estimation, as opposed to the received attempts, which have approached the problem by direct maximization.
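A stylized sketch of maximum simulated likelihood for a normal-gamma composed error (not Greene's exact implementation): the integral over the gamma-distributed inefficiency term is replaced by an average over draws generated from fixed uniforms via the inverse gamma CDF, so the simulated log likelihood stays smooth in the parameters. The data and parameterization below are hypothetical.

```python
import numpy as np
from scipy import stats, optimize

def simulated_loglik(params, y, X, u_uniforms):
    """Simulated log likelihood for a normal-gamma stochastic frontier (production form).

    params = (beta..., log sigma_v, log P, log theta); u_uniforms are R fixed uniform
    draws per observation, transformed to gamma draws so the criterion is smooth."""
    k = X.shape[1]
    beta = params[:k]
    sigma_v, P, theta = np.exp(params[k:])              # positivity via log-parameterization
    eps = y - X @ beta                                  # composed error: v - u
    u = stats.gamma.ppf(u_uniforms, a=P, scale=theta)   # (n, R) inefficiency draws
    dens = stats.norm.pdf((eps[:, None] + u) / sigma_v) / sigma_v
    return np.sum(np.log(dens.mean(axis=1) + 1e-300))

rng = np.random.default_rng(1)
n, R = 300, 100
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
u_true = rng.gamma(shape=1.5, scale=0.4, size=n)
y = X @ np.array([1.0, 0.5]) - u_true + 0.3 * rng.standard_normal(n)

u_uniforms = rng.uniform(size=(n, R))                   # held fixed across optimizer iterations
start = np.array([1.0, 0.5, np.log(0.3), np.log(1.5), np.log(0.4)])
fit = optimize.minimize(lambda p: -simulated_loglik(p, y, X, u_uniforms),
                        start, method="Nelder-Mead")
print(fit.x)
```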

4.
This paper provides a probabilistic and statistical comparison of the log-GARCH and EGARCH models, which both rely on multiplicative volatility dynamics without positivity constraints. We compare the main probabilistic properties (strict stationarity, existence of moments, tails) of the EGARCH model, which are already known, with those of an asymmetric version of the log-GARCH. The quasi-maximum likelihood estimation of the log-GARCH parameters is shown to be strongly consistent and asymptotically normal. Similar estimation results are only available for the EGARCH(1,1) model, and under much stronger assumptions. The comparison is pursued via simulation experiments and estimation on real data.
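A minimal Gaussian QML sketch for an asymmetric log-GARCH(1,1) of the form log(sigma_t^2) = omega + alpha_plus * 1{eps_{t-1}>0} * log(eps_{t-1}^2) + alpha_minus * 1{eps_{t-1}<=0} * log(eps_{t-1}^2) + beta * log(sigma_{t-1}^2). This is an illustrative recursion, not the authors' exact specification; zero returns are handled with a small floor, and the placeholder data are simulated.

```python
import numpy as np
from scipy.optimize import minimize

def asym_log_garch_qmle(eps):
    """Gaussian QML for an asymmetric log-GARCH(1,1) on a demeaned return series eps."""
    eps = np.asarray(eps, dtype=float)
    log_e2 = np.log(np.maximum(eps**2, 1e-12))   # floor avoids log(0) on zero returns
    pos = (eps > 0).astype(float)

    def neg_loglik(params):
        omega, a_pos, a_neg, beta = params
        log_s2 = np.empty_like(eps)
        log_s2[0] = np.log(np.var(eps))          # crude initialization
        for t in range(1, len(eps)):
            log_s2[t] = (omega
                         + a_pos * pos[t-1] * log_e2[t-1]
                         + a_neg * (1 - pos[t-1]) * log_e2[t-1]
                         + beta * log_s2[t-1])
        s2 = np.exp(log_s2)
        return 0.5 * np.sum(np.log(s2) + eps**2 / s2)

    start = np.array([0.0, 0.05, 0.05, 0.9])
    return minimize(neg_loglik, start, method="Nelder-Mead").x

rng = np.random.default_rng(2)
returns = 0.01 * rng.standard_normal(1000)       # placeholder data
print(asym_log_garch_qmle(returns))
```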

5.
Model averaging has become a popular method of estimation, following increasing evidence that model selection and estimation should be treated as one joint procedure. Weighted-average least squares (WALS) is a recent model-averaging approach, which takes an intermediate position between frequentist and Bayesian methods, allows a credible treatment of ignorance, and is extremely fast to compute. We review the theory of WALS and discuss extensions and applications.

6.
This paper proposes an alternative to maximum likelihood estimation of the parameters of the censored regression (or censored ‘Tobit’) model. The proposed estimator is a generalization of least absolute deviations estimation for the standard linear model, and, unlike estimation methods based on the assumption of normally distributed error terms, the estimator is consistent and asymptotically normal for a wide class of error distributions, and is also robust to heteroscedasticity. The paper gives the regularity conditions and proofs of these large-sample results, and proposes classes of consistent estimators of the asymptotic covariance matrix for both homoscedastic and heteroscedastic disturbances.
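To make the idea concrete, the sketch below minimizes the censored least-absolute-deviations objective sum_i |y_i - max(0, x_i'beta)| with a derivative-free optimizer on simulated, heavy-tailed data; the paper concerns the estimator's asymptotic properties, not this particular numerical routine, and the data are placeholders.

```python
import numpy as np
from scipy.optimize import minimize

def censored_lad(y, X, beta0):
    """Censored LAD: minimize sum_i |y_i - max(0, x_i' beta)| (Powell-type objective)."""
    def objective(beta):
        return np.sum(np.abs(y - np.maximum(0.0, X @ beta)))
    # Non-smooth objective: a derivative-free method is a simple (if crude) choice.
    return minimize(objective, beta0, method="Nelder-Mead").x

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
u = rng.standard_t(df=3, size=n)                 # heavy-tailed, non-normal errors
y_star = X @ np.array([0.5, 1.0]) + u
y = np.maximum(y_star, 0.0)                      # left-censoring at zero
print(censored_lad(y, X, beta0=np.array([0.0, 0.0])))
```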

7.
Multilevel structural equation modeling (multilevel SEM) has become an established method for analyzing multilevel multivariate data. The first useful estimation method was the pseudobalanced method, which is approximate because it assumes that all groups have the same size and ignores unbalance when it exists. Full information maximum likelihood (ML) estimation is now available, often combined with robust chi-squares and standard errors to accommodate unmodeled heterogeneity (MLR). More recently, diagonally weighted least squares (DWLS) estimation has also become available. This article compares the pseudobalanced method, ML(R), and two DWLS methods by simulating a multilevel factor model with unbalanced data. The simulations included different sample sizes at the individual and group levels and different intraclass correlations (ICCs). The within-group part of the model posed no problems. In the between part of the model, the different ICC sizes had no effect. There is a clear interaction effect between the number of groups and the estimation method: ML reaches unbiasedness fastest, then the two DWLS methods, then MLR, and then the pseudobalanced method (which needs more than 200 groups). We conclude that both ML(R) and DWLS are genuine improvements on the pseudobalanced approximation. With small sample sizes, the robust methods are not recommended.

8.
Time series data arise in many medical and biological imaging scenarios. In such images, a time series is obtained at each of a large number of spatially dependent data units. It is interesting to organize these data into model-based clusters. A two-stage procedure is proposed. In stage 1, a mixture of autoregressions (MoAR) model is used to marginally cluster the data. The MoAR model is fitted using maximum marginal likelihood (MMaL) estimation via a minorization–maximization (MM) algorithm. In stage 2, a Markov random field (MRF) model induces a spatial structure onto the stage 1 clustering. The MRF model is fitted using maximum pseudolikelihood (MPL) estimation via an MM algorithm. Both the MMaL and MPL estimators are proved to be consistent. Numerical properties are established for both MM algorithms. A simulation study demonstrates the performance of the two-stage procedure. An application to the segmentation of a zebrafish brain calcium image is presented.

9.
This paper gives an overview of the sixteen papers included in this special issue, which cover a wide range of topics: a class of tests for correlation, estimation of realized volatility, modeling time series and continuous-time models with long-range dependence, estimation and specification testing of time series models, estimation in a factor model with high-dimensional problems, a finite-sample examination of quasi-maximum likelihood estimation in an autoregressive conditional duration model, and estimation in a dynamic additive quantile model.

10.
We consider Bayesian inference techniques for agent-based (AB) models, as an alternative to simulated minimum distance (SMD). Three computationally heavy steps are involved: (i) simulating the model, (ii) estimating the likelihood and (iii) sampling from the posterior distribution of the parameters. Computational complexity of AB models implies that efficient techniques have to be used with respect to points (ii) and (iii), possibly involving approximations. We first discuss non-parametric (kernel density) estimation of the likelihood, coupled with Markov chain Monte Carlo sampling schemes. We then turn to parametric approximations of the likelihood, which can be derived by observing the distribution of the simulation outcomes around the statistical equilibria, or by assuming a specific form for the distribution of external deviations in the data. Finally, we introduce Approximate Bayesian Computation techniques for likelihood-free estimation. These allow embedding SMD methods in a Bayesian framework, and are particularly suited when robust estimation is needed. These techniques are first tested in a simple price discovery model with one parameter, and then employed to estimate the behavioural macroeconomic model of De Grauwe (2012), with nine unknown parameters.
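As a bare-bones illustration of the likelihood-free route, the sketch below applies rejection ABC to a toy one-parameter price process standing in for a simple price discovery model; the simulator, summary statistic, prior, and tolerance are all placeholders, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_prices(theta, n=200):
    """Toy stand-in for a one-parameter price model: AR(1)-type adjustment."""
    p = np.zeros(n)
    for t in range(1, n):
        p[t] = theta * p[t-1] + rng.standard_normal()
    return p

def summary(p):
    """Summary statistic: first-order autocorrelation of the simulated series."""
    return np.corrcoef(p[:-1], p[1:])[0, 1]

# "Observed" data generated at a known parameter value, for illustration only.
obs_summary = summary(simulate_prices(0.7))

# Rejection ABC: draw from the prior, simulate, keep draws whose summaries are close.
draws, tol, accepted = 5000, 0.05, []
for _ in range(draws):
    theta = rng.uniform(0.0, 0.99)               # uniform prior over the parameter
    if abs(summary(simulate_prices(theta)) - obs_summary) < tol:
        accepted.append(theta)

print(len(accepted), np.mean(accepted))          # accepted draws approximate the posterior
```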

11.
The standard generalized method of moments (GMM) estimation of Euler equations in heterogeneous-agent consumption-based asset pricing models is inconsistent under fat tails because the GMM criterion is asymptotically random. To illustrate this, we generate asset returns and consumption data from an incomplete-market dynamic general equilibrium model that is analytically solvable and exhibits power laws in consumption. Monte Carlo experiments suggest that the standard GMM estimation is inconsistent and susceptible to Type II errors (incorrect nonrejection of false models). Estimating an overidentified model by dividing agents into age cohorts appears to mitigate Type I and II errors.
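For reference, the moment condition behind the standard GMM estimation is E[beta * (C_{t+1}/C_t)^(-gamma) * R_{t+1} - 1 | instruments] = 0. The sketch below minimizes a GMM criterion with an identity weighting matrix on simulated consumption growth and returns; the data are hypothetical placeholders, not the paper's equilibrium model.

```python
import numpy as np
from scipy.optimize import minimize

def gmm_criterion(params, cons_growth, returns, instruments):
    """GMM criterion for the consumption Euler equation, identity weighting matrix."""
    beta, gamma = params
    pricing_error = beta * cons_growth**(-gamma) * returns - 1.0      # u_t
    g_bar = (instruments * pricing_error[:, None]).mean(axis=0)       # sample moments
    return g_bar @ g_bar

rng = np.random.default_rng(5)
T = 1000
cons_growth = np.exp(0.02 + 0.02 * rng.standard_normal(T))   # gross consumption growth
returns = np.exp(0.04 + 0.15 * rng.standard_normal(T))       # gross asset return
# Instruments: constant and lagged return (circular shift, fine for illustration).
instruments = np.column_stack([np.ones(T), np.roll(returns, 1)])

fit = minimize(gmm_criterion, x0=np.array([0.95, 2.0]),
               args=(cons_growth, returns, instruments), method="Nelder-Mead")
print(fit.x)
```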

12.
This paper develops a concrete formula for the asymptotic distribution of two-step, possibly non-smooth semiparametric M-estimators under general misspecification. Our regularity conditions are relatively straightforward to verify and also weaker than those available in the literature. The first-stage nonparametric estimation may depend on finite-dimensional parameters. We characterize: (1) conditions under which the first-stage estimation of nonparametric components does not affect the asymptotic distribution, (2) conditions under which the asymptotic distribution is affected by the derivatives of the first-stage nonparametric estimator with respect to the finite-dimensional parameters, and (3) conditions under which one can allow non-smooth objective functions. Our framework is illustrated by applying it to three examples: (1) profiled estimation of a single index quantile regression model, (2) semiparametric least squares estimation under model misspecification, and (3) a smoothed matching estimator.

13.
In this paper we present an exact maximum likelihood treatment for the estimation of a Stochastic Volatility in Mean (SVM) model based on Monte Carlo simulation methods. The SVM model incorporates the unobserved volatility as an explanatory variable in the mean equation. The same extension is developed elsewhere for Autoregressive Conditional Heteroscedastic (ARCH) models, known as the ARCH in Mean (ARCH-M) model. The estimation of ARCH models is relatively easy compared with that of the Stochastic Volatility (SV) model. However, efficient Monte Carlo simulation methods for SV models have been developed to overcome some of these problems. The details of modifications required for estimating the volatility-in-mean effect are presented in this paper together with a Monte Carlo study to investigate the finite sample properties of the SVM estimators. Taking these developments of estimation methods into account, we regard SV and SVM models as practical alternatives to their ARCH counterparts and therefore it is of interest to study and compare the two classes of volatility models. We present an empirical study of the intertemporal relationship between stock index returns and their volatility for the United Kingdom, the United States and Japan. This phenomenon has been discussed in the financial economic literature but has proved hard to find empirically. We provide evidence of a negative but weak relationship between returns and contemporaneous volatility which is indirect evidence of a positive relation between the expected components of the return and the volatility process.
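To fix notation, the sketch below simulates one common parameterization of an SV-in-mean process, y_t = d * exp(h_t) + exp(h_t/2) * eps_t, with h_t an AR(1) in logs; the paper's exact specification and its Monte Carlo likelihood machinery are more involved, and the parameter values here are arbitrary.

```python
import numpy as np

def simulate_svm(T, d=-0.05, mu=-1.0, phi=0.97, sigma_eta=0.15, seed=6):
    """Simulate a stochastic-volatility-in-mean series: the unobserved variance
    exp(h_t) enters the mean of y_t with coefficient d."""
    rng = np.random.default_rng(seed)
    h = np.empty(T)
    y = np.empty(T)
    h[0] = mu
    for t in range(T):
        if t > 0:
            h[t] = mu + phi * (h[t-1] - mu) + sigma_eta * rng.standard_normal()
        y[t] = d * np.exp(h[t]) + np.exp(h[t] / 2) * rng.standard_normal()
    return y, h

y, h = simulate_svm(2000)
print(y[:5], np.corrcoef(y, np.exp(h))[0, 1])    # weak return-volatility correlation
```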

14.
Tong's threshold models have been found useful in modelling nonlinearities in the conditional mean of a time series. The threshold model is extended to the so-called double-threshold ARCH (DTARCH) model, which can handle the situation where both the conditional mean and the conditional variance specifications are piecewise linear given previous information. Potential applications of such models include financial data with different (asymmetric) behaviour in a rising versus a falling market and business cycle modelling. Model identification, estimation and diagnostic checking techniques are developed. Maximum likelihood estimation can be achieved via an easy-to-use iteratively weighted least squares algorithm. Portmanteau-type statistics are also derived for checking model adequacy. An illustrative example demonstrates that asymmetric behaviour in the mean and the variance could be present in financial series and that the DTARCH model is capable of capturing these phenomena.
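The structure of a DTARCH model is easiest to see in a small simulator: both the AR mean and the ARCH variance switch between two regimes depending on the previous observation. The parameter values below are arbitrary placeholders, and the estimation (iteratively weighted least squares) is not shown.

```python
import numpy as np

def simulate_dtarch(T, seed=7):
    """Simulate a two-regime double-threshold ARCH model (threshold variable: y_{t-1})."""
    rng = np.random.default_rng(seed)
    # Regime-specific (intercept, AR coefficient) for the mean and (omega, alpha)
    # for the ARCH(1) variance; regime 0: y_{t-1} <= 0, regime 1: y_{t-1} > 0.
    phi = [(0.0, 0.5), (0.1, -0.3)]
    arch = [(0.10, 0.40), (0.05, 0.15)]
    y, h = np.zeros(T), np.ones(T)
    eps_prev = 0.0
    for t in range(1, T):
        r = 1 if y[t-1] > 0 else 0
        mean_t = phi[r][0] + phi[r][1] * y[t-1]
        h[t] = arch[r][0] + arch[r][1] * eps_prev**2
        eps_t = np.sqrt(h[t]) * rng.standard_normal()
        y[t] = mean_t + eps_t
        eps_prev = eps_t
    return y

print(simulate_dtarch(1000)[:5])
```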

15.
This paper develops a unified framework for fixed effects (FE) and random effects (RE) estimation of higher-order spatial autoregressive panel data models with spatial autoregressive disturbances and heteroscedasticity of unknown form in the idiosyncratic error component. We derive the moment conditions and optimal weighting matrix without distributional assumptions for a generalized moments (GM) estimation procedure of the spatial autoregressive parameters of the disturbance process and define both an RE and an FE spatial generalized two-stage least squares estimator for the regression parameters of the model. We prove consistency of the proposed estimators and derive their joint asymptotic distribution, which is robust to heteroscedasticity of unknown form in the idiosyncratic error component. Finally, we derive a robust Hausman test of the spatial RE against the spatial FE model.

16.
吕敏红  张惠玲 《价值工程》2012,31(20):301-302
In recent years, semiparametric models have become a powerful tool for regression problems and a focal point of regression analysis, attracting the attention of many researchers. This article studies a semiparametric regression model with AR(p) errors. The correlation in the errors is first removed, the model is then transformed into a classical semiparametric regression model, and the model parameters are estimated by penalized least squares.
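A rough sketch of the route described above, for the AR(1) case: estimate the error autocorrelation from preliminary residuals, quasi-difference the data to whiten the errors, and refit the partially linear model by penalized (spline) least squares. The basis, penalty, tuning constant, and data below are simplified placeholders, not the article's exact procedure.

```python
import numpy as np

def spline_basis(t, n_knots=10):
    """Truncated-power cubic spline basis (plus polynomial terms) on a scalar index t."""
    knots = np.quantile(t, np.linspace(0.05, 0.95, n_knots))
    poly = np.column_stack([t, t**2, t**3])
    trunc = np.maximum(t[:, None] - knots[None, :], 0.0) ** 3
    return np.column_stack([poly, trunc])

def penalized_fit(y, x, Z, lam):
    """Penalized LS: parametric part x unpenalized, a simple ridge on the spline block."""
    D = np.column_stack([np.ones_like(x), x, Z])
    P = np.zeros((D.shape[1], D.shape[1]))
    P[2:, 2:] = lam * np.eye(Z.shape[1])          # penalize only the spline block
    coef = np.linalg.solve(D.T @ D + P, D.T @ y)
    return coef, y - D @ coef

rng = np.random.default_rng(8)
n = 400
t = np.linspace(0, 1, n)
x = rng.standard_normal(n)
e = np.zeros(n)
for i in range(1, n):                             # AR(1) errors with rho = 0.6
    e[i] = 0.6 * e[i-1] + 0.3 * rng.standard_normal()
y = 1.5 * x + np.sin(2 * np.pi * t) + e

Z = spline_basis(t)
_, resid = penalized_fit(y, x, Z, lam=1.0)        # step 1: preliminary fit
rho_hat = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1]**2)

# Step 2: quasi-difference everything and refit on the (roughly) whitened data.
yq, xq, Zq = (v[1:] - rho_hat * v[:-1] for v in (y, x, Z))
coef, _ = penalized_fit(yq, xq, Zq, lam=1.0)
print(rho_hat, coef[1])                           # coef[1] estimates the parametric slope
```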

17.
The spatio-temporal variation in the demand for transportation, particularly taxis, in the highly dynamic urban space of a metropolis such as New York City is impacted by various factors such as commuting, weather, road work and closures, disruptions in transit services, etc. This study endeavors to explain the user demand for taxis through space and time by proposing a generalized spatio-temporal autoregressive (STAR) model. It deals with the high dimensionality of the model by proposing the use of LASSO-type penalized methods for tackling parameter estimation. The forecasting performance of the proposed models is measured using the out-of-sample mean squared prediction error (MSPE), and the proposed models are found to outperform other alternative models such as vector autoregressive (VAR) models. The proposed modeling framework has an easily interpretable parameter structure and is suitable for practical application by taxi operators. The efficiency of the proposed model also helps with model estimation in real-time applications.
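A compact illustration of how a LASSO penalty can tame a high-dimensional spatio-temporal autoregression: each zone's demand is regressed on its own lags and on spatially lagged demand, and the penalty shrinks most coefficients to zero. The panel, spatial weight matrix, and tuning constant are synthetic placeholders, and scikit-learn's Lasso is used rather than the authors' estimator.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(9)
T, N, p = 300, 25, 2                              # time points, zones, temporal lags

# Synthetic demand panel and a row-normalized spatial weight matrix.
Y = rng.poisson(lam=10, size=(T, N)).astype(float)
W = rng.uniform(size=(N, N))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)

# Stack regressors: own lags Y_{t-k} and spatial lags W @ Y_{t-k}, k = 1..p.
rows_X, rows_y = [], []
for t in range(p, T):
    lags = [Y[t-k] for k in range(1, p+1)] + [Y[t-k] @ W.T for k in range(1, p+1)]
    rows_X.append(np.concatenate(lags))
    rows_y.append(Y[t])
X = np.array(rows_X)                              # (T-p, 2*p*N) regressor matrix
y = np.array(rows_y)                              # (T-p, N): one regression per zone

fit = Lasso(alpha=0.5, max_iter=10000).fit(X, y)  # multi-output Lasso across zones
print(fit.coef_.shape, (np.abs(fit.coef_) > 1e-8).mean())   # sparsity of the fit
```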

18.
This paper presents a simple two-step estimator for a simultaneous equations model that contains an ordinal endogenous variable. The estimation rules are extensions of the Heckman (1978) estimators, also considered by Amemiya (1978). Asymptotic covariance matrices of the estimators are also derived. The estimator is applied to an economic model in which the statewide extent of teacher bargaining and teacher-bargaining legislation are determined jointly.

19.
In this paper we propose a flexible model to describe nonlinearities and long-range dependence in time series dynamics. The new model is a multiple regime smooth transition extension of the Heterogeneous Autoregressive (HAR) model, which is specifically designed to model the behavior of the volatility inherent in financial time series. The model is able to simultaneously approximate long memory behavior, as well as describe sign and size asymmetries. A sequence of tests is developed to determine the number of regimes, and an estimation and testing procedure is presented. Monte Carlo simulations evaluate the finite-sample properties of the proposed tests and estimation procedures. We apply the model to several Dow Jones Industrial Average index stocks using transaction level data from the Trades and Quotes database that covers ten years of data. We find strong support for long memory and both sign and size asymmetries. Furthermore, the new model, when combined with the linear HAR model, is viable and flexible for purposes of forecasting volatility.
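The linear HAR backbone referenced above is simple to write down: daily realized volatility is regressed on its own lagged daily value and on weekly and monthly averages. The sketch below fits that baseline by OLS on a simulated placeholder series; the multiple-regime smooth-transition extension of the paper is not shown.

```python
import numpy as np

def har_design(rv):
    """Build HAR regressors: lagged daily, weekly (5-day), and monthly (22-day) means."""
    T = len(rv)
    rows, targets = [], []
    for t in range(22, T):
        daily = rv[t-1]
        weekly = rv[t-5:t].mean()
        monthly = rv[t-22:t].mean()
        rows.append([1.0, daily, weekly, monthly])
        targets.append(rv[t])
    return np.array(rows), np.array(targets)

rng = np.random.default_rng(10)
rv = np.abs(np.cumsum(0.1 * rng.standard_normal(1000))) + 0.5   # placeholder volatility series

X, y = har_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["const", "daily", "weekly", "monthly"], beta.round(3))))
```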

20.
We use frequency domain techniques to estimate a medium-scale dynamic stochastic general equilibrium (DSGE) model on different frequency bands. We show that goodness of fit, forecasting performance and parameter estimates vary substantially with the frequency bands over which the model is estimated. Estimates obtained using subsets of frequencies are characterized by significantly different parameters, an indication that the model cannot match all frequencies with one set of parameters. In particular, we find that: (i) the low-frequency properties of the data strongly affect parameter estimates obtained in the time domain; (ii) the importance of economic frictions in the model changes when different subsets of frequencies are used in estimation. This is particularly true for the investment adjustment cost and habit persistence: when low frequencies are present in the estimation, the investment adjustment cost and habit persistence are estimated to be higher than when low frequencies are absent.
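Estimation on a subset of frequencies can be illustrated with a band-restricted Whittle likelihood for a toy AR(1) process: the periodogram is matched to the model's spectral density only at the chosen Fourier frequencies. This is a stand-in for the DSGE setting, where the spectral density would come from the model's state-space solution; the data and band choices are placeholders.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def band_whittle_ar1(y, band):
    """Estimate an AR(1) coefficient by minimizing the Whittle criterion over a band.

    band = (low, high) in radians; only Fourier frequencies inside the band contribute."""
    y = y - y.mean()
    T = len(y)
    freqs = 2 * np.pi * np.arange(1, T // 2) / T
    periodogram = np.abs(np.fft.fft(y)[1:T // 2])**2 / (2 * np.pi * T)
    keep = (freqs >= band[0]) & (freqs <= band[1])

    def neg_whittle(rho):
        g = 1.0 / (2 * np.pi * np.abs(1 - rho * np.exp(-1j * freqs[keep]))**2)
        sigma2 = np.mean(periodogram[keep] / g)   # concentrate out the innovation variance
        spec = sigma2 * g
        return np.sum(np.log(spec) + periodogram[keep] / spec)

    return minimize_scalar(neg_whittle, bounds=(-0.99, 0.99), method="bounded").x

rng = np.random.default_rng(11)
e = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.8 * y[t-1] + e[t]

print(band_whittle_ar1(y, band=(0.0, np.pi)))         # all frequencies
print(band_whittle_ar1(y, band=(np.pi / 16, np.pi)))  # low frequencies excluded
```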
