20 similar articles found (search time: 0 ms)
1.
《International Journal of Forecasting》2021,37(4):1677-1690
Volatility proxies like realised volatility (RV) are extensively used to assess the forecasts of squared financial returns produced by volatility models. But are volatility proxies identified as expectations of the squared return? If not, then the results of these comparisons can be misleading, even if the proxy is unbiased. Here, a tripartite distinction is introduced between strong, semi-strong, and weak identification of a volatility proxy as an expectation of the squared return. The definition implies that semi-strong and weak identification can be studied and corrected for via a multiplicative transformation. Well-known tests can be used to check for identification and bias, and Monte Carlo simulations show that they are well-sized and powerful, even in fairly small samples. As an illustration, 12 volatility proxies used in three seminal studies are revisited. Half of the proxies satisfy neither semi-strong nor weak identification, but their corrected transformations do. It is then shown how correcting for identification can change the rankings of volatility forecasts.
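The multiplicative correction the abstract describes can be illustrated with a toy simulation. The sketch below is not the paper's procedure; the GARCH(1,1) parameters, the deliberately downward-biased proxy, and all function names are hypothetical choices for illustration.

```python
import numpy as np

def simulate_garch(n, omega=0.05, alpha=0.10, beta=0.85, seed=0):
    """Simulate GARCH(1,1) returns together with the true conditional variances."""
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    sigma2 = np.empty(n)
    sigma2[0] = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    for t in range(n):
        if t > 0:
            sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

def multiplicative_correction(proxy, r):
    """Scale factor that makes the proxy match the squared return on average."""
    return np.mean(r ** 2) / np.mean(proxy)

r, sigma2 = simulate_garch(50_000)
proxy = 0.7 * sigma2                       # hypothetical proxy, biased downward
c = multiplicative_correction(proxy, r)    # should be roughly 1 / 0.7
corrected = c * proxy
```

Because the correction factor is a ratio of sample means, the corrected proxy matches the average squared return by construction; the paper's point is that forecast rankings can change once such a correction is applied.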
2.
Indirect estimation of large conditionally heteroskedastic factor models, with an application to the Dow 30 stocks (cited by: 1; self-citations: 0; citations by others: 1)
We derive indirect estimators of conditionally heteroskedastic factor models in which the volatilities of common and idiosyncratic factors depend on their past unobserved values by calibrating the score of a Kalman-filter approximation with inequality constraints on the auxiliary model parameters. We also propose alternative indirect estimators for large-scale models, and explain how to apply our procedures to many other dynamic latent variable models. We analyse the small sample behaviour of our indirect estimators and several likelihood-based procedures through an extensive Monte Carlo experiment with empirically realistic designs. Finally, we apply our procedures to weekly returns on the Dow 30 stocks.
3.
We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification of the conditional variance, conditional correlation, innovation distribution, and estimation approach. All of the models belong to the dynamic conditional correlation class, which is particularly suitable because it allows consistent estimation of the risk-neutral dynamics with a manageable amount of computational effort for relatively large-scale problems. It turns out that increasing the sophistication in the marginal variance processes (i.e., nonlinearity, asymmetry and component structure) leads to important gains in pricing accuracy. Enriching the model with more complex existing correlation specifications does not improve the performance significantly. Estimating the standard dynamic conditional correlation model by composite likelihood, in order to take into account potential biases in the parameter estimates, generates only slightly better results. To remedy the poor performance of the correlation models, we propose a new model that allows for correlation spillovers without too many parameters. This model performs about 60% better than the existing correlation models we consider. Relaxing the Gaussian innovation assumption in favour of Laplace innovations improves the pricing accuracy only marginally. In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.
4.
《管理科学学报(英文)》2017,2(1):1-33
Volatility models have been playing important roles in economics and finance. Using a generalized spectral second order derivative approach, we propose a new class of generally applicable omnibus tests for the adequacy of linear and nonlinear volatility models. Our tests have a convenient asymptotic null N(0,1) distribution, and can detect a wide range of misspecifications for volatility dynamics, including both neglected linear and nonlinear volatility dynamics. Distinct from the existing diagnostic tests for volatility models, our tests are robust to time-varying higher order moments of unknown form (e.g., time-varying skewness and kurtosis). They check a large number of lags and are therefore expected to be powerful against neglected volatility dynamics that occur at higher-order lags or display long-memory properties. Despite using a large number of lags, our tests do not suffer much from the loss of a large number of degrees of freedom, because our approach naturally discounts higher order lags, which is consistent with the stylized fact that economic or financial markets are affected more by recent events than by events in the remote past. No specific estimation method is required, and parameter estimation uncertainty has no impact on the convenient limit N(0,1) distribution of the test statistics. Moreover, there is no need to formulate an alternative volatility model, and only estimated standardized residuals are needed to implement our tests. We do not have to calculate tedious and model-specific score functions or derivatives of volatility models with respect to estimated parameters, which are required in some existing popular diagnostic tests for volatility models. We examine the finite sample performance of the proposed tests. It is documented that the new tests are rather powerful in detecting neglected nonlinear volatility dynamics which the existing tests can easily miss. They are useful diagnostic tools for practitioners when modelling volatility dynamics.
5.
Toshiaki Watanabe 《Journal of Applied Econometrics》1999,14(2):101-121
This paper develops a new method for the analysis of stochastic volatility (SV) models. Since volatility is a latent variable in SV models, it is difficult to evaluate the exact likelihood. In this paper, a non-linear filter which yields the exact likelihood of SV models is employed. Solving a series of integrals in this filter by piecewise linear approximations with randomly chosen nodes produces the likelihood, which is maximized to obtain estimates of the SV parameters. A smoothing algorithm for volatility estimation is also constructed. Monte Carlo experiments show that the method performs well with respect to both parameter estimates and volatility estimates. We illustrate the method by analysing daily stock returns on the Tokyo Stock Exchange. Since the method can be applied to more general models, the SV model is extended so that several characteristics of daily stock returns are allowed, and this more general model is also estimated. Copyright © 1999 John Wiley & Sons, Ltd.
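The idea of evaluating the exact SV likelihood with a nonlinear filter can be sketched with a simple grid-based approximation of the filtering integrals (a stand-in for the paper's piecewise linear approximation with random nodes). All parameter values below are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def simulate_sv(n, mu=-1.0, phi=0.95, sigma_eta=0.3, seed=1):
    """Simulate a basic SV model: r_t = exp(h_t/2)*eps_t, with h_t an AR(1)."""
    rng = np.random.default_rng(seed)
    sd_h = sigma_eta / np.sqrt(1.0 - phi ** 2)
    h = np.empty(n)
    h[0] = mu + sd_h * rng.standard_normal()
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    return np.exp(h / 2) * rng.standard_normal(n), h

def sv_grid_filter(r, mu, phi, sigma_eta, m=100):
    """Likelihood of the SV model, exact up to discretization on a log-vol grid."""
    sd_h = sigma_eta / np.sqrt(1.0 - phi ** 2)
    grid = np.linspace(mu - 4 * sd_h, mu + 4 * sd_h, m)
    # transition kernel: row i holds p(h_t = grid | h_{t-1} = grid[i])
    P = norm.pdf(grid[None, :], mu + phi * (grid[:, None] - mu), sigma_eta)
    P /= P.sum(axis=1, keepdims=True)
    p = norm.pdf(grid, mu, sd_h)
    p /= p.sum()                                  # stationary prior on the grid
    loglik, h_filt = 0.0, np.empty(len(r))
    for t, rt in enumerate(r):
        joint = p * norm.pdf(rt, 0.0, np.exp(grid / 2))  # predict * measure
        c = joint.sum()
        loglik += np.log(c)                       # likelihood contribution
        p = joint / c                             # filtering density of h_t
        h_filt[t] = p @ grid                      # filtered log-volatility
        p = p @ P                                 # one-step prediction
    return loglik, h_filt

r, h = simulate_sv(1000)
loglik, h_filt = sv_grid_filter(r, mu=-1.0, phi=0.95, sigma_eta=0.3)
```

Maximizing `loglik` over (mu, phi, sigma_eta) with a numerical optimizer would mimic the estimation step the abstract describes.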
6.
《International Journal of Forecasting》2023,39(2):720-735
This paper develops a new class of dynamic models for forecasting extreme financial risk. This class of models is driven by the score of the conditional distribution with respect to both the duration between extreme events and the magnitude of these events. It is shown that the models are a feasible method for modeling the time-varying arrival intensity and magnitude of extreme events. It is also demonstrated how exogenous variables such as realized measures of volatility can easily be incorporated. An empirical analysis based on a set of major equity indices shows that both the arrival intensity and the size of extreme events vary greatly during times of market turmoil. The proposed framework performs well relative to competing approaches in forecasting extreme tail risk measures.
7.
In this paper we introduce a calibration procedure for validating agent-based models. Starting from the well-known financial model of Brock and Hommes (1998), we show how an appropriate calibration enables the model to describe price time series. We formulate the calibration problem as a nonlinear constrained optimization that can be solved numerically via a gradient-based method. The calibration results show that the simplest version of the Brock and Hommes model, with two trader types, fundamentalists and trend-followers, nicely replicates the price series of four different market indices: the S&P 500, the Euro Stoxx 50, the Nikkei 225 and the CSI 300. We show how the parameter values of the calibrated model are important in interpreting trader behavior in the different markets investigated. These parameters are then used for price forecasting. To further improve the forecasting, we modify our calibration approach by enlarging the traders' information set. Finally, we show how this new approach improves the model's ability to predict market prices.
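A minimal sketch of calibrating a two-type model by minimizing one-step-ahead prediction errors, assuming a heavily simplified Brock-Hommes-style law of motion (price deviations from the fundamental, logit switching on past squared forecast errors). The dynamics, parameter values, and grid search below are hypothetical stand-ins for the paper's constrained gradient-based optimization.

```python
import numpy as np

def chartist_share(x_prev, x_prev2, g, beta):
    """Logit switching on the two types' past squared forecast errors."""
    u_c = -(g * x_prev2 - x_prev) ** 2    # chartists had predicted g * x[t-2]
    u_f = -x_prev ** 2                    # fundamentalists had predicted 0
    return 1.0 / (1.0 + np.exp(beta * (u_f - u_c)))

def simulate_bh(n, g=0.9, beta=2.0, R=1.01, sigma=0.1, seed=2):
    """Price deviations from the fundamental under two-type expectations."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(2, n):
        n_c = chartist_share(x[t - 1], x[t - 2], g, beta)
        x[t] = n_c * g * x[t - 1] / R + sigma * rng.standard_normal()
    return x

def one_step_mse(x, g, beta=2.0, R=1.01):
    """Mean squared one-step-ahead prediction error on an observed series."""
    pred = [chartist_share(x[t - 1], x[t - 2], g, beta) * g * x[t - 1] / R
            for t in range(2, len(x))]
    return np.mean((x[2:] - np.array(pred)) ** 2)

x = simulate_bh(3000)
grid = np.linspace(0.5, 1.3, 81)
g_hat = grid[np.argmin([one_step_mse(x, g) for g in grid])]  # calibrated trend strength
```

On simulated data the grid search recovers the trend-extrapolation parameter used to generate the series; the paper replaces this crude search with constrained numerical optimization on observed index prices.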
8.
By using a dynamic factor model, we can substantially improve the reliability of real-time output gap estimates for the U.S. economy. First, we use a factor model to extract a series for the common component in GDP from a large panel of monthly real-time macroeconomic variables. This series is immune to revisions to the extent that revisions are due to unbiased measurement errors or idiosyncratic news. Second, our model is able to handle the unbalanced arrival of the data. This yields favorable nowcasting properties and thus starting conditions for the filtering of data into a trend and deviations from a trend. Combined with the method of augmenting data with forecasts prior to filtering, this greatly reduces the end-of-sample imprecision in the gap estimate. The increased precision has economic importance for real-time policy decisions and improves real-time inflation forecasts.
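The first step the abstract describes, extracting a common component from a large panel, can be sketched with a principal-components factor estimate. The panel below is simulated and all dimensions and names are hypothetical.

```python
import numpy as np

def extract_common_factor(panel):
    """Principal-components estimate of a single common factor from a panel."""
    z = (panel - panel.mean(axis=0)) / panel.std(axis=0)   # standardize each series
    eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))
    loadings = eigvecs[:, -1]             # eigenvector of the largest eigenvalue
    return z @ loadings, loadings

# hypothetical monthly panel: one persistent common factor plus idiosyncratic noise
rng = np.random.default_rng(3)
T, N = 300, 20
f = 0.1 * np.cumsum(rng.standard_normal(T))        # common component
lam = rng.uniform(0.5, 1.5, N)                     # factor loadings
panel = np.outer(f, lam) + 0.5 * rng.standard_normal((T, N))
factor, _ = extract_common_factor(panel)
```

The estimated factor is identified only up to sign and scale, which is why comparisons against the true component use the absolute correlation. Handling unbalanced data arrival, as in the paper, would additionally require an EM or state space treatment of the missing observations.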
9.
《International Journal of Forecasting》2023,39(1):298-313
This paper develops a nowcasting model for the German economy. The model outperforms a number of alternatives and produces forecasts not only for GDP but also for other key variables. We show that the inclusion of a foreign factor improves the model’s performance, while financial variables do not. Additionally, a comprehensive model averaging exercise reveals that factor extraction in a single model delivers slightly better results than averaging across models. Finally, we estimate a “news” index for the German economy in order to assess the overall performance of the model beyond forecast errors in GDP. The index is constructed as a weighted average of the nowcast errors related to each variable included in the model.
10.
Matteo Manera 《Journal of Productivity Analysis》2006,26(2):121-146
The empirical analysis of the economic interactions between factors of production, output and corresponding prices has received much attention over the last two decades. Most contributions in this area have agreed on the neoclassical principle of a representative optimizing firm and typically use theory-based structural equation models (SEM). A popular alternative to SEM is the vector autoregression (VAR) methodology. The most recent attempts to link the SEM approach with VAR analysis in the area of factor demands concentrate on single-equation models, whereas no effort has been devoted to comparing these alternative approaches when a firm is assumed to face a multi-factor technology and to decide simultaneously the optimal quantity for each input. This paper bridges this gap. First, we illustrate how the SEM and the VAR approaches can both represent valid alternatives to model systems of dynamic factor demands. Second, we show how to apply both methodologies to estimate dynamic factor demands derived from a cost-minimizing capital-labour-energy-materials (KLEM) technology with adjustment costs (ADC) on the quasi-fixed capital factor. Third, we explain how to use both models to calculate some widely accepted indicators of the production structure of an economic sector, such as price and quantity elasticities, and alternative measures of ADC. In particular, we propose and discuss some theoretical and empirical justifications of the differences between observed elasticities, measures of ADC, and the assumption of exogeneity of output and/or input prices. Finally, we offer some suggestions for the applied researcher.
11.
Conventionally, the parameters of a linear state space model are estimated by maximizing a Gaussian likelihood function, even when the input errors are not Gaussian. In this paper we propose estimation by estimating functions fulfilling Godambe's optimality criterion. We discuss the issue of an unknown starting state vector, and we also develop recursive relations for the third- and fourth-order moments of the state predictors required for the calculations. We conclude with a simulation study demonstrating the proposed procedure on the estimation of the stochastic volatility model. The results suggest that the new estimators outperform Gaussian maximum likelihood.
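For reference, the conventional benchmark the paper compares against, the Gaussian likelihood of a linear state space model computed with the Kalman filter, looks roughly like this for a local-level model (all parameter values are hypothetical):

```python
import numpy as np

def kalman_loglik(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Gaussian log-likelihood of the local-level model via the Kalman filter:
    y_t = a_t + eps_t,  a_{t+1} = a_t + eta_t,  near-diffuse initial state."""
    a, p, ll = a0, p0, 0.0
    for yt in y:
        f = p + sigma_eps2                 # prediction-error variance
        v = yt - a                         # one-step prediction error
        ll -= 0.5 * (np.log(2.0 * np.pi * f) + v * v / f)
        k = p / f                          # Kalman gain
        a += k * v                         # filtered state mean
        p = p * (1.0 - k) + sigma_eta2     # next predicted state variance
    return ll

rng = np.random.default_rng(4)
n = 500
state = np.cumsum(0.5 * rng.standard_normal(n))   # true sigma_eta^2 = 0.25
y = state + rng.standard_normal(n)                # true sigma_eps^2 = 1.0
ll_true = kalman_loglik(y, 1.0, 0.25)
ll_wrong = kalman_loglik(y, 1.0, 4.0)             # badly misspecified state noise
```

Maximizing this criterion gives the Gaussian quasi-maximum-likelihood estimator; the paper's estimating-function approach targets the same parameters but exploits higher-order moment information when the errors are non-Gaussian.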
12.
《International Journal of Forecasting》2019,35(4):1304-1317
This paper is concerned with the forecasting of probability density functions. Density functions are nonnegative and have a constrained integral, and thus do not constitute a vector space. The implementation of established functional time series forecasting methods for such nonlinear data is therefore problematic. Two new methods are developed and compared to two existing methods. The comparison is based on the densities derived from cross-sectional and intraday returns. For such data, one of our new approaches is shown to dominate the existing methods, while the other is comparable to one of the existing approaches.
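One standard way around the constraint problem the abstract mentions is to map densities into an unconstrained space before forecasting, for example with a centered log-ratio transform. The sketch below uses a naive drift extrapolation on a simulated sequence of densities; it is an illustration of the general idea, not one of the paper's methods.

```python
import numpy as np

def clr(p):
    """Centered log-ratio transform: maps a probability vector into a vector space."""
    lp = np.log(p)
    return lp - lp.mean()

def inv_clr(v):
    """Inverse map: exponentiate and renormalize, so the result is a valid density."""
    w = np.exp(v - v.max())
    return w / w.sum()

# hypothetical sequence of return densities: normals with a slowly widening scale
grid = np.linspace(-5.0, 5.0, 201)
probs = []
for s in 1.0 + 0.1 * np.arange(10):
    d = np.exp(-0.5 * (grid / s) ** 2)
    probs.append(d / d.sum())              # discretized probability vector

V = np.array([clr(p) for p in probs])      # forecast in unconstrained clr space
v_fc = V[-1] + (V[-1] - V[-2])             # naive random-walk-with-drift step
p_fc = inv_clr(v_fc)                       # forecast density: nonnegative, sums to 1
```

Whatever linear forecast is made in the transformed space, the inverse map guarantees the forecast is again a valid density, which is exactly the property ordinary functional time series methods lack here.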
13.
14.
This paper develops a novel time-varying multivariate Copula-MIDAS-GARCH (TVM-Copula-MIDAS-GARCH) model with exogenous explanatory variables to model the joint distribution of returns. The model accounts for mixed-frequency factors that affect the time-varying dependence structure of financial assets. Furthermore, we examine the effectiveness of the proposed model in VaR-based portfolio selection. We conduct an empirical analysis estimating the 90%, 95%, and 99% VaRs of a portfolio constituted of the Shanghai Composite Index, the Shanghai SE Fund Index, and the Shanghai SE Treasury Bond Index. The empirical results show that the proposed TVM-Copula-MIDAS-GARCH model effectively captures the nonlinear time-varying dependence among the three indices and performs better in portfolio selection.
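Copula-based portfolio VaR of the kind evaluated in the paper can be sketched by simulation. The sketch below uses a plain Gaussian copula with fixed correlations and hypothetical marginal distributions, not the time-varying Copula-MIDAS-GARCH specification; the three margins loosely mimic an equity index, a fund index, and a bond index.

```python
import numpy as np
from scipy.stats import norm, t as student_t

def copula_portfolio_var(corr, margins, weights, level=0.99, n_sim=100_000, seed=5):
    """Portfolio VaR by simulating returns from a Gaussian copula with given margins."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(corr)
    z = rng.standard_normal((n_sim, len(weights))) @ L.T   # correlated normals
    u = norm.cdf(z)                                        # copula (uniform) draws
    rets = np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(margins)])
    port = rets @ weights
    return -np.quantile(port, 1.0 - level)                 # loss quantile

corr = np.array([[1.0, 0.5, 0.2],
                 [0.5, 1.0, 0.3],
                 [0.2, 0.3, 1.0]])
# hypothetical margins: fat-tailed equity and fund returns, near-normal bond returns
margins = [student_t(df=5, scale=0.020),
           student_t(df=6, scale=0.015),
           norm(scale=0.004)]
weights = np.array([0.4, 0.3, 0.3])
var99 = copula_portfolio_var(corr, margins, weights, level=0.99)
var95 = copula_portfolio_var(corr, margins, weights, level=0.95)
```

In the paper both the margins (GARCH-filtered) and the dependence parameters (MIDAS-driven, time-varying) would be estimated rather than fixed, but the simulate-and-take-a-quantile step is the same.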
15.
We consider the problem of causal effect heterogeneity from a Bayesian point of view. This is accomplished by introducing a three-equation system, similar in spirit to the work of Heckman and Vytlacil (1998), describing the joint determination of a scalar outcome, an endogenous “treatment” variable, and an individual-specific causal return to that treatment. We describe a Bayesian posterior simulator for fitting this model which recovers far more than the average causal effect in the population, the object which has been the focus of most previous work. Parameter identification and generalized methods for flexibly modeling the outcome and return heterogeneity distributions are also discussed. Combining data sets from High School and Beyond (HSB) and the 1980 Census, we illustrate our methods in practice and investigate heterogeneity in returns to education. Our analysis decomposes the impact of key HSB covariates on log wages into three parts: a “direct” effect and two separate indirect effects through educational attainment and returns to education. Our results strongly suggest that the quantity of schooling attained is determined, at least in part, by the individual’s own return to education. Specifically, a one percentage point increase in the return to schooling parameter is associated with the receipt of (approximately) 0.14 more years of schooling. Furthermore, when we control for variation in returns to education across individuals, we find no difference in predicted schooling levels for men and women. However, women are predicted to attain approximately 1/4 of a year more schooling than men on average as a result of higher rates of return to investments in education.
16.
Prodosh Simlai 《The Quarterly Review of Economics and Finance》2014,54(1):17-30
In this paper we investigate housing price volatility within a spatial econometrics setting. We propose an extended spatial regression model of the real estate market that includes the effects of both conditional heteroskedasticity and spatial autocorrelation. Our suggested model has features similar to those of autoregressive conditional heteroskedasticity (ARCH) in the time-series context. We utilize the spatial ARCH (SARCH) model to analyze Boston housing price data used by Harrison and Rubinfeld (1978) and Gilley and Pace (1996). We show that measuring the variability of housing prices is an important issue and our SARCH model captures the conditional spatial variability of Boston housing prices. We argue that there is a different source of spatial variation, which is independent of traditional housing and neighborhood characteristics, and is captured by the SARCH model.
17.
18.
Kai‐Li Wang Christopher Fawson Christopher B. Barrett James B. McDonald 《Journal of Applied Econometrics》2001,16(4):521-536
Many asset prices, including exchange rates, exhibit periods of stability punctuated by infrequent, substantial, often one‐sided adjustments. Statistically, this generates empirical distributions of exchange rate changes that exhibit high peaks, long tails, and skewness. This paper introduces a GARCH model, with a flexible parametric error distribution based on the exponential generalized beta (EGB) family of distributions. Applied to daily US dollar exchange rate data for six major currencies, evidence based on a comparison of actual and predicted higher‐order moments and goodness‐of‐fit tests favours the GARCH‐EGB2 model over more conventional GARCH‐t and EGARCH‐t model alternatives, particularly for exchange rate data characterized by skewness. Copyright © 2001 John Wiley & Sons, Ltd.
19.
《International Journal of Forecasting》2020,36(4):1318-1328
We introduce a new class of stochastic volatility models with autoregressive moving average (ARMA) innovations. The conditional mean process has a flexible form that can accommodate both a state space representation and a conventional dynamic regression. The ARMA component introduces serial dependence, which results in standard Kalman filter techniques not being directly applicable. To overcome this hurdle, we develop an efficient posterior simulator that builds on recently developed precision-based algorithms. We assess the usefulness of these new models in an inflation forecasting exercise across all G7 economies. We find that the new models generally provide competitive point and density forecasts compared to standard benchmarks, and are especially useful for Canada, France, Italy, and the U.S.
20.
Local influence analysis for Poisson autoregression with an application to stock transaction data
In statistical diagnostics and sensitivity analysis, the local influence method plays an important role and has certain advantages over other methods in several situations. In this paper, we use this method to study time series of count data, employing a Poisson autoregressive model. We consider case-weight, scale, data, and additive perturbation schemes, and obtain the corresponding vectors and matrices of derivatives for the slope and normal curvature measures. Based on the curvature diagnostics, we take a stepwise local influence approach to deal with data with possible masking effects. Finally, the effectiveness of the established results is illustrated by analysing a stock transaction data set.
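A Poisson autoregression of the INGARCH(1,1) type, together with a crude case-level influence screen in the spirit of case-weight perturbation, can be sketched as follows. The model parameters, the injected outlier, and the per-observation log-likelihood screen are all illustrative and far simpler than the paper's curvature-based diagnostics.

```python
import numpy as np
from scipy.special import gammaln

def simulate_ingarch(n, omega=1.0, alpha=0.3, beta=0.5, seed=6):
    """Poisson INGARCH(1,1): lambda_t = omega + alpha*y_{t-1} + beta*lambda_{t-1}."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n, dtype=np.int64)
    lam = np.zeros(n)
    lam[0] = omega / (1.0 - alpha - beta)     # unconditional mean intensity
    y[0] = rng.poisson(lam[0])
    for t in range(1, n):
        lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
        y[t] = rng.poisson(lam[t])
    return y, lam

def case_loglik(y, lam):
    """Per-observation Poisson log-likelihood contributions."""
    return y * np.log(lam) - lam - gammaln(y + 1)

y, lam = simulate_ingarch(500)
y_out = y.copy()
y_out[250] = y.max() + 30                     # inject an artificial outlier
# crude influence screen: the case whose contribution deteriorates the fit most,
# evaluated at the intensities fitted to the clean series
contrib = case_loglik(y_out, lam)
most_influential = int(np.argmin(contrib))
```

The injected outlier dominates the screen by a wide margin; the local influence method of the paper formalizes this by differentiating the likelihood with respect to the perturbation weights and examining the curvature of the resulting surface.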