Similar Articles
A total of 20 similar articles were found (search time: 21 ms)
1.
The realized-GARCH framework is extended to incorporate the two-sided Weibull distribution for the purpose of volatility and tail risk forecasting in a financial time series. Further, the realized range, as a competitor to realized variance or daily returns, is employed as the realized measure in the realized-GARCH framework. Sub-sampling and scaling methods are applied to both the realized range and the realized variance to help deal with inherent microstructure noise and inefficiency. A Bayesian Markov chain Monte Carlo (MCMC) method is adapted and employed for estimation and forecasting, while various MCMC efficiency and convergence measures are used to assess the validity of the method. In addition, the properties of the MCMC estimator are assessed and compared with maximum likelihood via a simulation study. Compared to a range of well-known parametric GARCH and realized-GARCH models, tail risk forecasting results across seven market indices, as well as two individual assets, clearly favour the proposed realized-GARCH model incorporating the two-sided Weibull distribution, especially when employing the sub-sampled realized variance and sub-sampled realized range.
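For readers unfamiliar with the realized measures named above, here is a minimal Python sketch of the (Parkinson-scaled) realized range and the plain realized variance computed from evenly spaced intra-day bars; the function names and inputs are illustrative, and the sub-sampling and scaling corrections used in the paper are not reproduced.

    import numpy as np

    def realized_range(high, low):
        """Sum of squared intra-day log high-low ranges, scaled by 1/(4 ln 2)
        (the Parkinson factor) so the measure is unbiased for Brownian motion."""
        high = np.asarray(high, dtype=float)
        low = np.asarray(low, dtype=float)
        return np.sum((np.log(high) - np.log(low)) ** 2) / (4.0 * np.log(2.0))

    def realized_variance(close):
        """Plain realized variance: sum of squared intra-day log returns."""
        r = np.diff(np.log(np.asarray(close, dtype=float)))
        return np.sum(r ** 2)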

2.
In high-frequency finance, the terms ‘realized skewness’ and ‘realized kurtosis’ refer to the realized third- and fourth-order moments of high-frequency returns normalized (divided) by the realized variance. Before these two normalized realized moments are computed, one must choose a holding interval and a sampling interval, and these choices implicitly influence the magnitudes of the computed higher-order moments; that is, the measures have been found to be interval-variant. To date, few theoretical or empirical studies in the high-frequency finance literature have properly investigated the effects of these two types of intervaling on the behaviour of the ensuing measures of realized skewness and realized kurtosis. This paper fills the gap by analyzing, theoretically and empirically, why and how these two normalized realized higher-order moments of market returns are influenced by the selected holding and sampling intervals. Using simulated data and price index data from the G7 countries, we then illustrate, via count-based signature plots, the theoretical and empirical relationships between the realized higher-order moments and the sampling and holding intervals.
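As an illustration of how the two intervalings enter the computation, the sketch below treats the span of the price series as the holding interval and thins it by `sampling_step` to set the sampling interval; the sqrt(N) and N scalings are one common convention and not necessarily the one adopted by the authors.

    import numpy as np

    def realized_moments(prices, sampling_step=1):
        """Realized variance, skewness and kurtosis for one holding interval,
        computed from returns sampled every `sampling_step` observations."""
        p = np.asarray(prices, dtype=float)[::sampling_step]
        r = np.diff(np.log(p))
        n = r.size
        rv = np.sum(r ** 2)
        rskew = np.sqrt(n) * np.sum(r ** 3) / rv ** 1.5   # normalized by RV^(3/2)
        rkurt = n * np.sum(r ** 4) / rv ** 2               # normalized by RV^2
        return rv, rskew, rkurt

Changing either the length of `prices` (the holding interval) or `sampling_step` (the sampling interval) changes the computed skewness and kurtosis, which is the interval-variance the paper studies.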

3.
Given a time series of intra-day tick-by-tick price data, how can realized variance be estimated? The obvious estimator—the sum of squared returns between trades—is biased by microstructure effects such as bid–ask bounce, so in the past practitioners were advised to drop most of the data and sample at most every five minutes or so. Recently, however, numerous alternative estimators have been developed that make more efficient use of the available data and improve substantially over those based on sparsely sampled returns. Yet, from a practical viewpoint, the choice of which particular estimator to use is not a trivial one, because the study of their relative merits has primarily focused on the speed of convergence to their asymptotic distributions, which in itself is not necessarily a reliable guide to finite-sample performance (especially when the assumptions on the price or noise process are violated). In this paper we compare a comprehensive set of nineteen realized variance estimators using simulated data from an artificial “zero-intelligence” market that has been shown to mimic some key properties of actual markets. In evaluating the competing estimators, we concentrate on efficiency but also pay attention to implementation, practicality, and robustness. One of our key findings is that, for scenarios frequently encountered in practice, the best variance estimator is not always the one suggested by theory. In fact, an ad hoc implementation of a subsampling estimator, a realized kernel, or maximum-likelihood realized variance delivers the best overall result. We make firm practical recommendations on choosing and implementing a realized variance estimator, as well as on data sampling.
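A rough sketch of the two simplest estimators discussed above — the sparsely sampled sum of squared returns and its subsampled average over all starting offsets — follows; the kernel and likelihood-based alternatives compared in the paper are not reproduced here.

    import numpy as np

    def sparse_rv(log_prices, k):
        """Realized variance from every k-th observation (e.g. ~5-minute sampling)."""
        r = np.diff(np.asarray(log_prices, dtype=float)[::k])
        return np.sum(r ** 2)

    def subsampled_rv(log_prices, k):
        """Average the sparse estimator over the k possible starting offsets,
        which uses all of the data while keeping the coarse sampling grid."""
        log_prices = np.asarray(log_prices, dtype=float)
        return float(np.mean([sparse_rv(log_prices[j:], k) for j in range(k)]))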

4.
We estimate the daily integrated variance and covariance of stock returns using high-frequency data in the presence of jumps, market microstructure noise and non-synchronous trading. For this purpose, we propose jump-robust two-time-scale (co)variance estimators and verify their reduced bias and mean squared error in simulation studies. We use these estimators to construct the ex-post portfolio realized volatility (RV) budget, which determines each portfolio component’s contribution to the RV of the portfolio return. These RV budgets provide insight into the risk concentration of a portfolio. Furthermore, they can be used directly in a portfolio strategy, the equal-risk-contribution allocation strategy. Out-of-sample, this yields both a higher average return and a lower standard deviation than the equal-weight portfolio for the stocks in the Dow Jones Industrial Average over the period October 2007–May 2009.
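A hypothetical sketch of the realized-volatility budget described above: given portfolio weights and a realized covariance matrix (e.g. from a jump-robust two-time-scale estimator), it returns each asset's share of the portfolio's realized variance. The equal-risk-contribution strategy then seeks weights that equalize these shares.

    import numpy as np

    def rv_budget(weights, rcov):
        """Fraction of portfolio realized variance attributed to each component:
        w_i * (Sigma w)_i / (w' Sigma w); the fractions sum to one."""
        w = np.asarray(weights, dtype=float)
        sigma_w = np.asarray(rcov, dtype=float) @ w
        return w * sigma_w / (w @ sigma_w)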

5.
Volatility measurement and estimation based on intra-day high-frequency data has grown in popularity during the last few years. A significant part of the research uses volatility and variance measures based on the sum of squared high-frequency returns. These measures, introduced and mathematically justified in a series of papers by Andersen et al. [1999. (Understanding, optimizing, using and forecasting) realized volatility and correlation. Leonard N. Stern School Finance Department Working Paper Series, 99-061, New York University; 2000a. The distribution of realized exchange rate volatility. Journal of the American Statistical Association 96, no. 453: 42–55; 2000b. Exchange rate returns standardized by realized volatility are (nearly) Gaussian. Multinational Finance Journal 4, no. 3/4: 159–179; 2003. Modeling and forecasting realized volatility. NBER Working Paper Series 8160.], are referred to as ‘realized variance’. From the theory of quadratic variation for diffusions, it can be shown that realized variance measures based on sufficiently frequently sampled returns are error-free volatility estimates. Our objective here is to examine realized variance measures when well-documented market microstructure effects, such as return autocorrelation and volatility clustering, are included in the return-generating process. We find that using squared returns as a measure of realized variance leads to estimation errors at the sampling frequencies adopted in the literature. In the case of return autocorrelation, there are systematic biases. Further, the standard deviation of the error between the measured and the true variance increases as the sampling frequency decreases and when volatility is non-constant.

6.
We consider the properties of three estimation methods for integrated volatility, i.e. realized volatility, Fourier, and wavelet estimation, when a typical sample of high-frequency data is observed. We employ several different generating mechanisms for the instantaneous volatility process, e.g. Ornstein–Uhlenbeck, long memory, and jump processes. The possibility of market microstructure contamination is also entertained using models with bid–ask bounce and price discreteness, in which case alternative estimators with theoretical justification under market microstructure noise are also examined. The estimation methods are compared in a simulation study which reveals a general robustness towards persistence or jumps in the latent stochastic volatility process. However, bid–ask bounce effects render realized volatility and especially the wavelet estimator less useful in practice, whereas the Fourier method remains useful and is superior to the other two estimators in that case. More strikingly, even compared to bias correction methods for microstructure noise, the Fourier method is superior with respect to RMSE while having only slightly higher bias. A brief empirical illustration with high-frequency GE data is also included.

7.
A central limit theorem is proved for the realized volatility estimator of integrated volatility based on a specific random sampling scheme, in which prices are sampled at every ‘continued price change’ in bid or ask quotation data. The estimator is shown to be robust to market microstructure noise induced by price discreteness and bid–ask spreads. More general sampling schemes are also treated in the case where the price process is a diffusion.

8.
As a means of validating an option pricing model, we compare the ex-post intra-day realized variance of options with the realized variance of the associated underlying asset that would be implied under the assumptions of the Black and Scholes (BS), Heston, and Bates models. Based on data for the S&P 500 index, we find that the BS model is strongly directionally biased due to the presence of stochastic volatility. The Heston model reduces the mismatch in realized variance between the two markets, but deviations are still significant. With the exception of short-dated options, we achieve the best approximations after controlling for the presence of jumps in the underlying dynamics. Finally, we provide evidence that, although heavily biased, the realized variance based on the BS model contains relevant predictive information that can be exploited when high-frequency option data are not available.

9.
This paper develops a two-step estimation methodology that allows us to apply catastrophe theory to stock market returns with time-varying volatility and to model stock market crashes. In the first step, we utilize high-frequency data to estimate daily realized volatility from returns. In the second step, we apply stochastic cusp catastrophe theory to the returns normalized by the estimated volatility to study possible discontinuities in the markets. We support our methodology through simulations in which we discuss the importance of stochastic noise and volatility in a deterministic cusp catastrophe model. The methodology is empirically tested on nearly 27 years of US stock market returns covering several important recessions and crisis periods. While we find that the stock markets showed signs of bifurcation in the first half of the period, catastrophe theory was not able to confirm this behaviour in the second half. Interpreting the results, we find that the US stock market’s downturns were more likely driven by endogenous market forces during the first half of the studied period, whereas during the second half exogenous forces appear to drive the market’s instability. The results suggest that the proposed methodology provides an important shift in the application of catastrophe theory to stock markets.

10.
In this article I study the statistical properties of a bias-corrected realized variance measure when high-frequency asset prices are contaminated with market microstructure noise. The analysis is based on a pure jump process for asset prices and explicitly distinguishes among different sampling schemes, including calendar time, business time, and transaction time sampling. Two main findings emerge from the theoretical and empirical analysis. First, based on the mean-squared error (MSE) criterion, a bias correction to realized variance (RV) allows for the more efficient use of higher-frequency data than the conventional RV estimator. Second, sampling in business time or transaction time is generally superior to the common practice of calendar time sampling in that it leads to a further reduction in MSE. Using IBM transaction data, I estimate a 2.5-minute optimal sampling frequency for RV in calendar time, which drops to about 12 seconds when a first-order bias correction is applied. This results in a more than 65% reduction in MSE. If, in addition, prices are sampled in transaction time, a further reduction of about 20% can be achieved.
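In spirit, the bias correction studied here adds a first-order autocovariance term to the plain sum of squared returns; the generic sketch below (not necessarily the exact estimator of the article) shows the idea.

    import numpy as np

    def rv_plain(returns):
        """Conventional realized variance: sum of squared high-frequency returns."""
        r = np.asarray(returns, dtype=float)
        return np.sum(r ** 2)

    def rv_first_order_corrected(returns):
        """Add twice the realized first-order autocovariance, which offsets the
        upward bias induced by i.i.d. microstructure noise."""
        r = np.asarray(returns, dtype=float)
        return np.sum(r ** 2) + 2.0 * np.sum(r[1:] * r[:-1])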

11.
We propose a new threshold–pre-averaging realized estimator for the integrated co-volatility of two assets using non-synchronous observations in the simultaneous presence of microstructure noise and jumps. We derive a noise-robust Hayashi–Yoshida estimator that allows for a very general structure of jumps in the underlying process. Based on the new estimator, different aspects and components of co-volatility are compared to examine the effect of jumps on systematic risk, using tick-by-tick data from the Chinese stock market during 2009–2011. We find that controlling for jumps contributes significantly to beta estimation and that common jumps mostly dominate the jump effect, although there is also evidence that idiosyncratic jumps may lead to significant deviations. We also find that previous realized beta estimates that do not control for noise and jumps tend to considerably underestimate systematic risk.
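For context, the baseline Hayashi–Yoshida covariance estimator for non-synchronous observations is sketched below; the thresholding (jump truncation) and pre-averaging (noise reduction) layers that the paper adds on top are omitted.

    import numpy as np

    def hayashi_yoshida(times_x, prices_x, times_y, prices_y):
        """Sum the products of log-returns of the two assets whose observation
        intervals overlap; no synchronization or interpolation is needed."""
        rx = np.diff(np.log(np.asarray(prices_x, dtype=float)))
        ry = np.diff(np.log(np.asarray(prices_y, dtype=float)))
        cov = 0.0
        for i in range(rx.size):
            a0, a1 = times_x[i], times_x[i + 1]
            for j in range(ry.size):
                b0, b1 = times_y[j], times_y[j + 1]
                if max(a0, b0) < min(a1, b1):   # the two return intervals overlap
                    cov += rx[i] * ry[j]
        return cov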

12.
Using two newly available ultrahigh-frequency datasets, we investigate empirically how frequently one can sample certain foreign exchange and U.S. Treasury security returns without contaminating estimates of their integrated volatility with market microstructure noise. Using the standard realized volatility estimator, we find that one can sample dollar/euro returns as frequently as once every 15 to 20 s without contaminating estimates of integrated volatility; 10-year Treasury note returns may be sampled as frequently as once every 2 to 3 min on days without U.S. macroeconomic announcements, and as frequently as once every 40 s on announcement days. Using a simple realized kernel estimator, this sampling frequency can be increased to once every 2 to 5 s for dollar/euro returns and to about once every 30 to 40 s for T-note returns. These sampling frequencies, especially in the case of dollar/euro returns, are much higher than those generally recommended in the empirical literature on realized volatility in equity markets. The higher sampling frequencies for dollar/euro and T-note returns likely reflect the superior depth and liquidity of these markets.
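As a reference for the realized kernel estimator mentioned above, here is a generic Bartlett-weighted version; the exact kernel and bandwidth used in the paper are not specified here, so treat this as an illustrative sketch.

    import numpy as np

    def realized_kernel(returns, H):
        """Plain RV plus Bartlett-weighted realized autocovariances up to lag H;
        the autocovariance terms absorb the microstructure-noise contribution."""
        r = np.asarray(returns, dtype=float)
        rk = np.sum(r ** 2)
        for h in range(1, H + 1):
            weight = 1.0 - h / (H + 1.0)          # Bartlett kernel weight
            gamma_h = np.sum(r[h:] * r[:-h])      # realized autocovariance at lag h
            rk += 2.0 * weight * gamma_h
        return rk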

13.
We find that augmenting a regression of excess bond returns on the term structure of forward rates with an estimate of the mean realized jump size almost doubles the R2 of the forecasting regression. The return predictability from augmenting with the jump mean easily dominates that offered by augmenting with options-implied volatility and realized volatility from high-frequency data. In out-of-sample forecasting exercises, inclusion of the jump mean can reduce the root mean square prediction error by up to 40%. The incremental return predictability captured by the realized jump mean largely accounts for the countercyclical movements in bond risk premia. This result is consistent with the setting of an incomplete market in which the conditional distribution of excess bond returns is affected by a jump risk factor that does not lie in the span of the term structure of yields.

14.
This paper proposes an innovative econometric approach for the computation of 24-h realized volatilities across stock markets in Europe and the US. In particular, we deal with the problem of non-synchronous trading hours and intermittent high-frequency data during overnight non-trading periods. Using high-frequency data for the Euro Stoxx 50 and the S&P 500 Index between 2003 and 2011, we combine squared overnight returns and realized daytime variances to obtain synchronous 24-h realized volatilities for both markets. Specifically, we use a piece-wise weighting procedure for daytime and overnight information to take into account structural breaks in the relation between the two. To demonstrate the new possibilities that our approach opens up, we use the new 24-h volatilities to estimate a bivariate extension of Corsi et al.’s [Econom. Rev., 2008, 27(1–3), 46–78] HAR-GARCH model. The results suggest that the contemporaneous transatlantic volatility interdependence is remarkably stable over the sample period.
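Stripped of the piece-wise weighting and structural-break handling, the basic combination step can be sketched as below; the weights are hypothetical placeholders for the ones the paper estimates from daytime and overnight information.

    import numpy as np

    def realized_variance_24h(overnight_return, daytime_rv, w_night=1.0, w_day=1.0):
        """Combine the squared overnight (close-to-open) return with the daytime
        realized variance into a synchronous 24-hour variance measure."""
        return w_night * overnight_return ** 2 + w_day * daytime_rv

    def realized_volatility_24h(overnight_return, daytime_rv, **kwargs):
        """24-hour realized volatility: square root of the combined variance."""
        return np.sqrt(realized_variance_24h(overnight_return, daytime_rv, **kwargs))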

15.
This paper examines the incorporation of higher moments in portfolio selection problems utilising high-frequency data. Our approach combines innovations from the realised volatility literature with a portfolio selection methodology utilising higher moments. We provide an empirical study of the measurement of higher moments from tick-by-tick data and implement the model for a selection of stocks from the DOW 30 over the time period 2005–2011. We demonstrate a novel estimator for moments and co-moments in the presence of microstructure noise.

16.
Volatility movements are known to be negatively correlated with stock index returns. Hence, investing in volatility appears to be attractive for investors seeking risk diversification. The most common instruments for investing in pure volatility are variance swaps, which now enjoy an active over-the-counter (OTC) market. This paper investigates the risk-return tradeoff of variance swaps on the Deutscher Aktienindex and Euro STOXX 50 index over the time period from 1995 to 2004. We synthetically derive variance swap rates from the smile in option prices. Using quotes from two large investment banks over two months, we validate that the synthetic values are close to OTC market prices. We find that variance swap returns exhibit an option-like profile compared to returns of the underlying index. Given this pattern, it is crucial to account for the non-normality of returns in measuring the performance of variance swap investments. As in the US, the average returns of selling variance swaps are found to be strongly positive and too large to be compatible with standard equilibrium models. The magnitude of the estimated risk premium is related to variance uncertainty and past index returns. This indicates that the variance swap rate does not seem to incorporate all past information relevant for forecasting future realized variance.
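For readers less familiar with the instrument, the payoff and a common return definition for a variance swap position are straightforward; the sketch below is definitional and does not reproduce the paper's smile-based synthesis of swap rates.

    def variance_swap_payoff(realized_var, swap_rate, var_notional=1.0):
        """Payoff to the long side at maturity: variance notional times the
        difference between annualized realized variance and the swap rate."""
        return var_notional * (realized_var - swap_rate)

    def variance_swap_return(realized_var, swap_rate):
        """One common return definition for a long variance swap: the payoff
        per unit of the swap rate agreed at inception."""
        return realized_var / swap_rate - 1.0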

17.
This paper proposes a dynamic equilibrium model that provides a unified explanation for stylized facts observed in stock index markets, such as the fat tails of the risk-neutral return distribution relative to the physical distribution, negative expected returns on deep OTM call options, and negative realized variance risk premiums. In particular, we focus on the U-shaped pricing kernel against the stock index return, which is closely related to the negative call returns. We assume that the stock index return follows a time-changed Lévy process and that a representative investor has power utility over aggregate consumption, which is modelled as a linear regression on the stock index return and its stochastic activity rate. This model offers a macroeconomic interpretation of the stylized facts from the perspective of the sensitivity of the activity rate and the stock index return to aggregate consumption, as well as the investor’s risk aversion.

18.
This paper investigates whether positive and negative returns share the same dynamic volatility process. The well-established stylized facts on volatility persistence and asymmetric effects are re-examined in light of this dichotomy. To analyze the dynamics of down and up volatilities estimated from daily returns, I use a bivariate generalization of the standard EGARCH model. As a robustness check, I also investigate various specifications of down and up realized measures estimated from high-frequency data. The empirical findings point to a marked diversity in the volatilities of positive and negative daily returns in terms of persistence and sensitivity to good and bad news. A simple forecasting exercise highlights the striking performance of the proposed approach even during the crisis period.
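One standard pair of down/up realized measures is the realized semivariances, sketched below; whether this is the exact specification used in the paper is not stated in the abstract, so it is offered only as an illustration.

    import numpy as np

    def realized_semivariances(returns):
        """Split realized variance into the part driven by negative returns
        ('down') and the part driven by positive returns ('up')."""
        r = np.asarray(returns, dtype=float)
        return np.sum(r[r < 0.0] ** 2), np.sum(r[r > 0.0] ** 2)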

19.
Modeling the joint distribution of spot and futures returns is crucial for establishing optimal hedging strategies. This paper proposes a new class of dynamic copula-GARCH models that exploits information from high-frequency data for hedge ratio estimation. Copula theory facilitates the construction of a flexible distribution, and the inclusion of realized volatility measures constructed from high-frequency data enables the copula forecasts to adapt swiftly to changing markets. Estimation results for equity index returns show that including realized measures of volatility and correlation greatly enhances the explanatory power of the model. Moreover, the out-of-sample forecasting results show that hedged portfolios constructed from the proposed model are superior to those constructed from the prevailing models in reducing the (estimated) conditional hedged-portfolio variance. Finally, the economic gains from exploiting high-frequency data to estimate hedge ratios are examined. Hedgers obtain additional benefits by including high-frequency data in their hedging decisions, and more risk-averse hedgers generate greater benefits.
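The quantity being forecast ultimately feeds a minimum-variance hedge ratio; a bare-bones sketch that uses realized measures directly (rather than the copula-GARCH forecasts of the paper) is shown below.

    import numpy as np

    def minimum_variance_hedge_ratio(spot_returns, futures_returns):
        """Realized spot-futures covariance divided by realized futures variance,
        computed from synchronized intra-day returns."""
        s = np.asarray(spot_returns, dtype=float)
        f = np.asarray(futures_returns, dtype=float)
        return np.sum(s * f) / np.sum(f ** 2)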

20.
We separate noise from information-related variance for stocks traded on the Indonesian Stock Exchange using a realized variance estimator. We find that the average optimal sampling frequency for estimating the realized variance is 9 min and that market quality improved after 2004. The positive relation between the standard deviation of the noise variance and the square root of the efficient realized variance implies that as uncertainty about asset values increases, the risk of transacting with traders who have superior information increases as well. Furthermore, variance ratio comparisons reveal that private information is a significant trading component on the IDX.
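A textbook way to back out the noise component, assuming i.i.d. microstructure noise, uses the fact that at the highest sampling frequency the plain RV is dominated by noise; the sketch below is illustrative and not necessarily the estimator used in the paper.

    import numpy as np

    def noise_variance_estimate(tick_returns):
        """Under i.i.d. noise, E[sum of squared tick returns] is approximately
        2 * n * noise_variance, so dividing the highest-frequency RV by 2n
        gives a rough estimate of the noise variance."""
        r = np.asarray(tick_returns, dtype=float)
        return np.sum(r ** 2) / (2.0 * r.size)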
