Similar Articles
1.
Quasi maximum likelihood estimation and inference in multivariate volatility models remains a challenging computational task if, for example, the dimension of the parameter space is high. One of the reasons is that typically numerical procedures are used to compute the score and the Hessian, and often they are numerically unstable. We provide analytical formulae for the score and the Hessian for a variety of multivariate GARCH models including the Vec and BEKK specifications as well as the recent dynamic conditional correlation model. By means of a Monte Carlo investigation of the BEKK–GARCH model we illustrate that employing analytical derivatives for inference is clearly preferable to numerical methods.
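As a univariate illustration of the idea (the paper itself derives these formulae for the multivariate Vec, BEKK and DCC models), the score of a Gaussian GARCH(1,1) quasi-likelihood can be obtained analytically by differentiating the variance recursion, and then checked against finite differences. This is a sketch only; the parameter values and the variance initialization are illustrative assumptions:

```python
import numpy as np

def garch11_loglik_and_score(params, r):
    """Gaussian quasi-log-likelihood of a GARCH(1,1) and its analytical score.

    h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}
    The score is obtained by propagating dh_t/dtheta through the recursion,
    which is the univariate analogue of the paper's analytical derivatives.
    """
    omega, alpha, beta = params
    T = len(r)
    h = np.empty(T)
    dh = np.zeros((T, 3))           # dh_t / d(omega, alpha, beta)
    h[0] = r.var()                  # a common initialization choice (assumption)
    ll = 0.0
    score = np.zeros(3)
    for t in range(1, T):
        h[t] = omega + alpha * r[t-1]**2 + beta * h[t-1]
        dh[t] = np.array([1.0, r[t-1]**2, h[t-1]]) + beta * dh[t-1]
        ll += -0.5 * (np.log(h[t]) + r[t]**2 / h[t])
        # d/dtheta of -0.5*(log h + r^2/h) = 0.5*(r^2/h - 1)/h * dh/dtheta
        score += 0.5 * (r[t]**2 / h[t] - 1.0) / h[t] * dh[t]
    return ll, score
```

Comparing the analytical score against central finite differences verifies the recursion; the analytical version avoids step-size tuning and is what gradient-based optimizers and sandwich-form standard errors need.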

2.
This paper develops a new model for the analysis of stochastic volatility (SV) models. Since volatility is a latent variable in SV models, it is difficult to evaluate the exact likelihood. In this paper, a non-linear filter which yields the exact likelihood of SV models is employed. Solving a series of integrals in this filter by piecewise linear approximations with randomly chosen nodes produces the likelihood, which is maximized to obtain estimates of the SV parameters. A smoothing algorithm for volatility estimation is also constructed. Monte Carlo experiments show that the method performs well with respect to both parameter estimates and volatility estimates. We illustrate our model by analysing daily stock returns on the Tokyo Stock Exchange. Since the method can be applied to more general models, the SV model is extended so that several characteristics of daily stock returns are allowed, and this more general model is also estimated. Copyright © 1999 John Wiley & Sons, Ltd.
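A crude way to see how such a filter computes the likelihood by numerical integration is to replace the paper's piecewise linear approximation with randomly chosen nodes by a fixed discrete grid; the filtering recursion is otherwise the same. The model below is the basic log-normal SV model, and the grid width, node count and parameter values are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def sv_grid_loglik(y, mu, phi, sigma, n_grid=100):
    """Approximate log-likelihood of a basic SV model via a discrete-grid
    nonlinear filter (a crude stand-in for the paper's piecewise linear
    integration with random nodes; requires |phi| < 1).

    y_t = exp(h_t/2) eps_t,   h_t = mu + phi*(h_{t-1}-mu) + sigma*eta_t
    """
    sd_h = sigma / np.sqrt(1 - phi**2)          # stationary sd of h
    grid = np.linspace(mu - 4*sd_h, mu + 4*sd_h, n_grid)
    w = grid[1] - grid[0]                        # quadrature weight
    # transition kernel P[i, j] = p(h_t = grid[j] | h_{t-1} = grid[i]) * w
    P = norm.pdf(grid[None, :], mu + phi*(grid[:, None] - mu), sigma) * w
    prior = norm.pdf(grid, mu, sd_h) * w         # stationary initial density
    ll = 0.0
    for t in range(len(y)):
        like = norm.pdf(y[t], 0.0, np.exp(grid / 2))  # p(y_t | h_t)
        joint = prior * like
        c = joint.sum()                          # p(y_t | y_{1:t-1})
        ll += np.log(c)
        post = joint / c
        prior = post @ P                         # predict one step ahead
    return ll
```

Each step integrates the filtered density against the transition kernel and multiplies by the observation density; the normalizing constants are the one-step predictive likelihoods, so their log-sum is the log-likelihood. Refining the grid (or, as in the paper, adapting the nodes) improves the approximation.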

3.
This paper studies the empirical performance of stochastic volatility models for twenty years of weekly exchange rate data for four major currencies. We concentrate on the effects of the distribution of the exchange rate innovations for both parameter estimates and for estimates of the latent volatility series. The density of the log of squared exchange rate innovations is modelled as a flexible mixture of normals. We use three different estimation techniques: quasi-maximum likelihood, simulated EM, and a Bayesian procedure. The estimated models are applied for pricing currency options. The major findings of the paper are that: (1) explicitly incorporating fat-tailed innovations increases the estimates of the persistence of volatility dynamics; (2) the estimation error of the volatility time series is very large; (3) this in turn causes standard errors on calculated option prices to be so large that these prices are rarely significantly different from a model with constant volatility. © 1998 John Wiley & Sons, Ltd.

4.
In this paper we present an exact maximum likelihood treatment for the estimation of a Stochastic Volatility in Mean (SVM) model based on Monte Carlo simulation methods. The SVM model incorporates the unobserved volatility as an explanatory variable in the mean equation. The same extension is developed elsewhere for Autoregressive Conditional Heteroscedastic (ARCH) models, known as the ARCH in Mean (ARCH‐M) model. The estimation of ARCH models is relatively easy compared with that of the Stochastic Volatility (SV) model. However, efficient Monte Carlo simulation methods for SV models have been developed to overcome some of these problems. The details of modifications required for estimating the volatility‐in‐mean effect are presented in this paper together with a Monte Carlo study to investigate the finite sample properties of the SVM estimators. Taking these developments of estimation methods into account, we regard SV and SVM models as practical alternatives to their ARCH counterparts and therefore it is of interest to study and compare the two classes of volatility models. We present an empirical study of the intertemporal relationship between stock index returns and their volatility for the United Kingdom, the United States and Japan. This phenomenon has been discussed in the financial economic literature but has proved hard to find empirically. We provide evidence of a negative but weak relationship between returns and contemporaneous volatility which is indirect evidence of a positive relation between the expected components of the return and the volatility process. Copyright © 2002 John Wiley & Sons, Ltd.

5.
We propose a new estimation method for the factor loading matrix in generalized orthogonal GARCH (GO-GARCH) models. The method is based on eigenvectors of suitably defined sample autocorrelation matrices of squares and cross-products of returns. The method is numerically more attractive than likelihood-based estimation. Furthermore, the new method does not require strict assumptions on the volatility models of the factors, and therefore is less sensitive to model misspecification. We provide conditions for consistency of the estimator, and study its efficiency relative to maximum likelihood estimation using Monte Carlo simulations. The method is applied to European sector returns.

6.
Measuring VaR and ES based on extreme value theory
This paper applies extreme value theory to estimate the tails of financial return series, measuring market risk by computing the Value at Risk (VaR) and Expected Shortfall (ES) of the returns. A GARCH model estimated by quasi-maximum likelihood is fitted to the return data, and the Generalized Pareto Distribution (GPD) from extreme value theory is used to model the tail of the innovation distribution, yielding VaR and ES values based on the tail estimates. Using daily log returns of the Shanghai Composite Index as the sample, both conditional and unconditional extreme-value VaR and ES measures are obtained. The empirical study shows that at very high confidence levels (e.g., 99%), the extreme value approach measures risk more accurately, while at the 95% level combining other methods with the extreme value approach works well. Measuring risk with ES allows one to gauge the likely magnitude of losses when adverse events occur.
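As a sketch of the unconditional peaks-over-threshold step (the paper additionally filters returns through a QML-estimated GARCH model before applying the GPD to the innovation tail), VaR and ES at level q can be computed from a GPD fit to threshold exceedances. The threshold rule and test data are illustrative assumptions:

```python
import numpy as np
from scipy.stats import genpareto

def evt_var_es(losses, u, q=0.99):
    """Unconditional VaR and ES at level q from a GPD fitted to losses
    exceeding threshold u (peaks-over-threshold).

    Uses the standard tail formulas
        VaR_q = u + (beta/xi) * [((n/n_u)(1-q))^(-xi) - 1]
        ES_q  = VaR_q/(1-xi) + (beta - xi*u)/(1-xi)
    which assume a nonzero shape estimate with xi < 1.
    """
    losses = np.asarray(losses)
    exceed = losses[losses > u] - u
    n, n_u = len(losses), len(exceed)
    xi, _, beta = genpareto.fit(exceed, floc=0.0)   # fix GPD location at 0
    var_q = u + beta / xi * ((n / n_u * (1 - q))**(-xi) - 1)
    es_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)
    return var_q, es_q
```

For the conditional version in the paper, the same computation is applied to standardized GARCH residuals and the result is scaled by the one-step-ahead conditional volatility.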

7.
In this paper, we propose the two-component realized EGARCH (REGARCH-2C) model, which accommodates high-frequency information and long-memory volatility through the realized measure of volatility and the component volatility structure, to forecast the VIX. We obtain the risk-neutral dynamics of the REGARCH-2C model and derive the corresponding model-implied VIX formula. The parameters of the REGARCH-2C model are estimated via joint maximum likelihood using observations on returns, the realized measure and the VIX. Our empirical results demonstrate that the proposed REGARCH-2C model provides more accurate VIX forecasts than a variety of competing models, including the GARCH, GJR-GARCH, nonlinear GARCH, Heston–Nandi GARCH, EGARCH, REGARCH and two two-component GARCH models. This result is robust to alternative realized measures. Our empirical evidence highlights the importance of incorporating the realized measure as well as the component volatility structure for VIX forecasting.

8.
This paper gives an overview of the sixteen papers included in this special issue, which cover a wide range of topics: a class of tests for correlation, estimation of realized volatility, modelling of time series and continuous-time models with long-range dependence, estimation and specification testing of time series models, estimation in a factor model with high-dimensional problems, a finite-sample examination of quasi-maximum likelihood estimation in an autoregressive conditional duration model, and estimation in a dynamic additive quantile model.

9.
Estimation methods for stochastic volatility models: a survey
Although stochastic volatility (SV) models have an intuitive appeal, their empirical application has been limited mainly due to difficulties involved in their estimation. The main problem is that the likelihood function is hard to evaluate. However, recently, several new estimation methods have been introduced and the literature on SV models has grown substantially. In this article, we review this literature. We describe the main estimators of the parameters and the underlying volatilities focusing on their advantages and limitations both from the theoretical and empirical point of view. We complete the survey with an application of the most important procedures to the S&P 500 stock price index.

10.
We analyze the predictive performance of various volatility models for stock returns. To compare their performance, we choose loss functions for which volatility estimation is of paramount importance. We deal with two economic loss functions (an option pricing function and a utility function) and two statistical loss functions (a goodness-of-fit measure for a value-at-risk (VaR) calculation and a predictive likelihood function). We implement the tests for superior predictive ability of White [Econometrica 68 (5) (2000) 1097] and Hansen [Hansen, P. R. (2001). An unbiased and powerful test for superior predictive ability. Brown University]. We find that, for option pricing, simple models like the RiskMetrics exponentially weighted moving average (EWMA) or a simple moving average, which do not require estimation, perform as well as other more sophisticated specifications. For a utility-based loss function, an asymmetric quadratic GARCH seems to dominate, and this result is robust to different degrees of risk aversion. For a VaR-based loss function, a stochastic volatility model is preferred. Interestingly, the RiskMetrics EWMA model, proposed to calculate VaR, seems to be the worst performer. For the predictive likelihood-based loss function, modeling the conditional standard deviation instead of the variance seems to be a dominant modeling strategy.
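For concreteness, the estimation-free EWMA benchmark and a predictive-likelihood-style loss can be sketched as follows. The λ = 0.94 value is the standard RiskMetrics daily setting; the warm-up choice for the first variance is an assumption, and the loss is the average negative Gaussian predictive log-likelihood up to constants:

```python
import numpy as np

def ewma_variance(r, lam=0.94):
    """One-step-ahead RiskMetrics EWMA variance forecasts; no parameters
    are estimated, which is the paper's point about this benchmark."""
    var = np.empty(len(r))
    var[0] = r.var()                    # warm-up choice (assumption)
    for t in range(1, len(r)):
        var[t] = lam * var[t-1] + (1 - lam) * r[t-1]**2
    return var

def predictive_loglik_loss(r, var_forecast):
    """Average negative Gaussian predictive log-likelihood, up to constants;
    lower values indicate better variance forecasts."""
    return np.mean(np.log(var_forecast) + r**2 / var_forecast)
```

Competing volatility models can then be ranked by evaluating each one's forecasts under the same loss, which is the setup the superior-predictive-ability tests formalize.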

11.
We propose a new generic and highly efficient Accelerated Gaussian Importance Sampler (AGIS) for the numerical evaluation of (very) high-dimensional density functions. A specific case of interest to us is the evaluation of likelihood functions for a broad class of dynamic latent variable models. The feasibility of our method is strikingly illustrated by means of an application to a first-order dynamic stochastic volatility model for daily stock returns, whose likelihood for an actual sample of size 2022 (!) is evaluated with high numerical accuracy by means of 10,000 Monte Carlo replications. The estimated model parsimoniously dominates ARCH and GARCH alternatives, one of which includes twelve lags.

12.
Adding multivariate stochastic volatility of a flexible form to large vector autoregressions (VARs) involving over 100 variables has proved challenging owing to computational considerations and overparametrization concerns. The existing literature works with either homoskedastic models or smaller models with restrictive forms for the stochastic volatility. In this paper, we develop composite likelihood methods for large VARs with multivariate stochastic volatility. These involve estimating large numbers of parsimonious models and then taking a weighted average across these models. We discuss various schemes for choosing the weights. In our empirical work involving VARs of up to 196 variables, we show that composite likelihood methods forecast much better than the most popular large VAR approach, which is computationally practical in very high dimensions: the homoskedastic VAR with Minnesota prior. We also compare our methods to various popular approaches that allow for stochastic volatility using medium and small VARs involving up to 20 variables. We find our methods to forecast appreciably better than these as well.

13.
This paper provides a probabilistic and statistical comparison of the log-GARCH and EGARCH models, which both rely on multiplicative volatility dynamics without positivity constraints. We compare the main probabilistic properties (strict stationarity, existence of moments, tails) of the EGARCH model, which are already known, with those of an asymmetric version of the log-GARCH. The quasi-maximum likelihood estimation of the log-GARCH parameters is shown to be strongly consistent and asymptotically normal. Similar estimation results are only available for the EGARCH(1,1) model, and under much stronger assumptions. The comparison is pursued via simulation experiments and estimation on real data.

14.
Applied researchers interested in estimating key parameters of dynamic stochastic general equilibrium models face an array of choices regarding numerical solution and estimation methods. We focus on the likelihood evaluation of models with occasionally binding constraints. We document how solution approximation errors and likelihood misspecification, related to the treatment of measurement errors, can interact and compound each other.

15.
Novel transition-based misspecification tests of semiparametric and fully parametric univariate diffusion models based on the estimators developed in [Kristensen, D., 2010. Pseudo-maximum likelihood estimation in two classes of semiparametric diffusion models. Journal of Econometrics 156, 239-259] are proposed. It is demonstrated that transition-based tests in general lack power in detecting certain departures from the null since they integrate out local features of the drift and volatility. As a solution to this, tests that directly compare drift and volatility estimators under the relevant null and alternative are also developed which exhibit better power against local alternatives.

16.
The transformed-data maximum likelihood estimation (MLE) method for structural credit risk models developed by Duan [Duan, J.-C., 1994. Maximum likelihood estimation using price data of the derivative contract. Mathematical Finance 4, 155–167] is extended to account for the fact that observed equity prices may have been contaminated by trading noises. With the presence of trading noises, the likelihood function based on the observed equity prices can only be evaluated via some nonlinear filtering scheme. We devise a particle filtering algorithm that is practical for conducting MLE of the structural credit risk model of Merton [Merton, R.C., 1974. On the pricing of corporate debt: The risk structure of interest rates. Journal of Finance 29, 449–470]. We implement the method on the Dow Jones 30 firms and on 100 randomly selected firms, and find that ignoring trading noises can lead to significantly overestimating the firm's asset volatility. The estimated magnitude of trading noise is in line with what a firm's liquidity would predict based on three common liquidity proxies. A simulation study is then conducted to ascertain the performance of the estimation method.
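The core filtering step can be sketched with a generic bootstrap particle filter, which delivers an unbiased likelihood estimate for any state-space model once a state-propagation sampler and an observation density are supplied. In the paper these would encode the Merton pricing map and the trading-noise distribution; both are abstracted away here, so this is a sketch of the technique rather than the paper's implementation:

```python
import numpy as np

def bootstrap_pf_loglik(y, f_sample, g_density, init_sample, n_part=1000, seed=0):
    """Bootstrap particle filter estimate of log p(y_{1:T}) for a generic
    state-space model.

    f_sample(x, rng):   propagate the particle array one step
    g_density(y_t, x):  observation density evaluated at each particle
    init_sample(n, rng): draw n particles from the initial state law
    (Weights can degenerate if the observation density is very peaked;
    no safeguards are included in this sketch.)
    """
    rng = np.random.default_rng(seed)
    x = init_sample(n_part, rng)
    ll = 0.0
    for yt in y:
        x = f_sample(x, rng)                  # propagate
        w = g_density(yt, x)                  # weight
        ll += np.log(w.mean())                # predictive likelihood estimate
        idx = rng.choice(n_part, size=n_part, p=w / w.sum())
        x = x[idx]                            # multinomial resampling
    return ll
```

In a linear-Gaussian special case the estimate can be checked against the exact Kalman-filter likelihood, which is a standard sanity test before moving to the nonlinear pricing map.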

17.
In this paper, we propose a component conditional autoregressive range (CCARR) model for forecasting volatility. The proposed CCARR model assumes that the price range comprises both a long-run (trend) component and a short-run (transitory) component, which has the capacity to capture the long memory property of volatility. The model is intuitive and convenient to implement by using the maximum likelihood estimation method. Empirical analysis using six stock market indices highlights the value of incorporating a second component into range (volatility) modelling and forecasting. In particular, we find that the proposed CCARR model fits the data better than the CARR model, and that it generates more accurate out-of-sample volatility forecasts and contains more information content about the true volatility than the popular GARCH, component GARCH and CARR models.
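The one-component CARR(1,1) baseline that the CCARR extends has a simple QML objective with unit-exponential innovations. A minimal sketch of that baseline (initialization and parameter values are assumptions; the paper's two-component extension replaces the single conditional range with the sum of a trend and a transitory component):

```python
import numpy as np

def carr11_negloglik(params, R):
    """Negative QML log-likelihood of a CARR(1,1) for price ranges R_t > 0,
    with unit-exponential innovations:
        R_t = lam_t * eps_t,   lam_t = omega + alpha*R_{t-1} + beta*lam_{t-1}
    """
    omega, alpha, beta = params
    lam = np.empty(len(R))
    lam[0] = R.mean()                 # initialization choice (assumption)
    nll = 0.0
    for t in range(1, len(R)):
        lam[t] = omega + alpha * R[t-1] + beta * lam[t-1]
        nll += np.log(lam[t]) + R[t] / lam[t]
    return nll
```

Minimizing this objective over (omega, alpha, beta), subject to positivity, gives the CARR estimates against which the two-component model is compared.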

18.
Conventionally the parameters of a linear state space model are estimated by maximizing a Gaussian likelihood function, even when the input errors are not Gaussian. In this paper we propose estimation by estimating functions fulfilling Godambe's optimality criterion. We discuss the issue of an unknown starting state vector, and we also develop recursive relations for the third- and fourth-order moments of the state predictors required for the calculations. We conclude with a simulation study demonstrating the proposed procedure on the estimation of the stochastic volatility model. The results suggest that the new estimators outperform the Gaussian likelihood estimators.

19.
The measurement of liquidity based on low-frequency data is a crucial issue in the financial market microstructure literature. This paper extends the commonly used LOT liquidity model by incorporating the volatility dynamics and distributional properties of return series, thereby significantly improving both the goodness-of-fit of the liquidity model and the estimation performance of existing low-frequency liquidity measures. The new models, which take the form of a heavy-tailed censored GARCH model, are challenging to estimate. We therefore provide an approximate maximum likelihood estimation method to circumvent this problem. A real data analysis is conducted to evaluate the performance of the new liquidity measures. The results show overwhelming evidence that our new measures have advantages over the existing measures in both estimation error and correlation with the high-frequency bid-ask spread. A robustness check further illustrates that the advantages of our new measures are stable across different stock industries and different turnover levels.

20.
We develop an efficient and analytically tractable method for estimation of parametric volatility models that is robust to price-level jumps. The method entails first integrating intra-day data into the Realized Laplace Transform of volatility, which is a model-free estimate of the daily integrated empirical Laplace transform of the unobservable volatility. The estimation is then done by matching moments of the integrated joint Laplace transform with those implied by the parametric volatility model. In the empirical application, the best fitting volatility model is a non-diffusive two-factor model where low activity jumps drive its persistent component and more active jumps drive the transient one.
