Similar Literature
 Found 20 similar documents (search time: 873 ms)
1.
Increasing attention has been focused on the analysis of realized volatility, which can be treated as a proxy for the true volatility. In this paper, we study the potential use of realized volatility as a proxy in stochastic volatility model estimation. We estimate the leveraged stochastic volatility model using realized volatility computed with five popular methods from transaction data at six sampling frequencies (from 1-minute to 60-minute), based on the trust region method. Availability of the realized volatility allows us to estimate the model parameters via maximum likelihood and thus avoids the computational challenge of high-dimensional integration. Six stock indices are considered in the empirical investigation. We uncover some consistent findings and interesting patterns in the empirical results. In general, a significant leverage effect is consistently detected at each sampling frequency, and volatility persistence becomes weaker at lower sampling frequencies.
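At its simplest, the realized-volatility proxy in this entry is the sum of squared intraday returns over a day. A minimal sketch in Python/NumPy, assuming a hypothetical one-day price path and sampling grid (the paper's five RV estimators are not reproduced here):

```python
import numpy as np

def realized_variance(prices: np.ndarray) -> float:
    """Naive realized variance: sum of squared intraday log returns."""
    log_returns = np.diff(np.log(prices))
    return float(np.sum(log_returns ** 2))

# Hypothetical example: one trading day of 1-minute prices (390 minutes),
# resampled to a 5-minute grid to mimic a lower sampling frequency.
rng = np.random.default_rng(0)
minute_prices = 100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.0005, 390)))

rv_1min = realized_variance(minute_prices)
rv_5min = realized_variance(minute_prices[::5])
print(f"RV (1-min): {rv_1min:.6f}, RV (5-min): {rv_5min:.6f}")
```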

2.
The purpose of this paper is to examine the relationship between least squares and maximum likelihood estimation, where the likelihood function is the product of two explicit functions. We illustrate the correspondence for the particular case of the logit model, and show that it can be estimated with commonly available non-linear least squares estimation packages. Unlike the conventional non-linear least squares approach, the estimates obtained with the proposed method are maximum likelihood estimates for all sample sizes.
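The paper's exact reweighting that makes NLS coincide with ML is not reproduced here; the sketch below merely contrasts the two estimators it relates, a plain NLS fit of the logit response versus direct ML, on hypothetical simulated data:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.normal(size=500)
true_beta = np.array([-0.5, 1.2])
p = 1.0 / (1.0 + np.exp(-(true_beta[0] + true_beta[1] * x)))
y = rng.binomial(1, p)

def logistic(beta):
    return 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * x)))

# Plain (conventional) non-linear least squares on the 0/1 response.
nls = minimize(lambda b: np.sum((y - logistic(b)) ** 2), x0=[0.0, 0.0])

# Direct maximum likelihood (negative Bernoulli log-likelihood).
def negloglik(beta):
    q = np.clip(logistic(beta), 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(q) + (1 - y) * np.log(1 - q))

mle = minimize(negloglik, x0=[0.0, 0.0])
print("NLS:", nls.x, " MLE:", mle.x)
```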

3.
Stochastic Volatility: Likelihood Inference and Comparison with ARCH Models (cited 41 times: 0 self-citations, 41 by others)
In this paper, Markov chain Monte Carlo sampling methods are exploited to provide a unified, practical likelihood-based framework for the analysis of stochastic volatility models. A highly effective method is developed that samples all the unobserved volatilities at once using an approximating offset mixture model, followed by an importance reweighting procedure. This approach is compared with several alternative methods using real data. The paper also develops simulation-based methods for filtering, likelihood evaluation and model failure diagnostics. The issue of model choice using non-nested likelihood ratios and Bayes factors is also investigated. These methods are used to compare the fit of stochastic volatility and GARCH models. All the procedures are illustrated in detail.
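The offset-mixture sampler referenced here linearizes the SV model by working with log squared returns, log y_t^2 = h_t + log eps_t^2, and approximates the log chi-square(1) error by a normal mixture. A minimal sketch of the linearization step only, on simulated data (the mixture constants and the actual MCMC sweep are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)
T, mu, phi, sigma_eta = 1000, -1.0, 0.95, 0.2

# Simulate a basic SV model: y_t = exp(h_t / 2) * eps_t,
# h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t.
h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()
y = np.exp(h / 2) * rng.normal(size=T)

# Linearization used by the offset-mixture approach:
# log y_t^2 = h_t + log eps_t^2, with a small offset for numerical safety.
y_star = np.log(y ** 2 + 1e-8)
noise = y_star - h  # distributed as log chi-square(1)
print(f"noise mean {noise.mean():.3f} (log chi2_1 mean is about -1.27), "
      f"noise var {noise.var():.3f} (theory: pi^2/2, about 4.93)")
```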

4.
This paper explores the importance of specification in estimated general equilibrium models with changing monetary policy parameters and stochastic volatility. Simulated data are used to estimate models with incorrectly specified exogenous shocks (time-varying vs. constant variance) and models that misspecify the way Taylor rule parameters change over time (constant vs. drifting vs. regime-switching). The model correctly identifies some changes in monetary policy parameters, even when misspecified. The inclusion of stochastic volatility greatly improves model fit even when the data are generated using constant-variance exogenous shocks; this relationship is stronger in data generated from models with changing policy parameters.
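As a concrete illustration of one of the data-generating processes compared here, the sketch below simulates interest rates from a Taylor rule whose coefficients switch between two Markov regimes; all coefficient values and transition probabilities are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500
# Hypothetical Taylor-rule coefficients (phi_pi, phi_y) per regime.
coefs = {0: (1.5, 0.5), 1: (0.9, 0.1)}       # "hawkish" vs. "dovish"
P = np.array([[0.95, 0.05], [0.05, 0.95]])   # regime transition matrix

inflation = rng.normal(2.0, 1.0, T)
output_gap = rng.normal(0.0, 1.0, T)

regime, rates = 0, np.empty(T)
for t in range(T):
    phi_pi, phi_y = coefs[regime]
    rates[t] = phi_pi * inflation[t] + phi_y * output_gap[t] + rng.normal(0, 0.25)
    regime = rng.choice(2, p=P[regime])      # draw next period's regime
print("simulated policy-rate std:", rates.std().round(3))
```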

5.
The portfolio optimization problem is investigated using a multivariate stochastic volatility model with factor dynamics, fat‐tailed errors and leverage effects. The efficient Markov chain Monte Carlo method is used to estimate model parameters, and the Rao–Blackwellized auxiliary particle filter is used to compute the likelihood and to predict conditional means and covariances. The proposed models are applied to sector indices of the Tokyo Stock Price Index (TOPIX), which consists of 33 stock market indices classified by industrial sectors. The portfolio is dynamically optimized under several expected utilities, and two additional static strategies are considered as benchmarks. An extensive empirical study indicates that our proposed dynamic factor model with leverage or fat‐tailed errors significantly improves the predictions of the conditional mean and covariances, as well as various measures of portfolio performance.
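Once a model delivers one-step-ahead conditional means and covariances, the simplest dynamic portfolio step (under mean-variance-type utility, a simplification of the several expected utilities considered in the paper) reduces to weights proportional to the inverse covariance times the mean. A minimal sketch with hypothetical predicted moments, not the paper's particle-filter output:

```python
import numpy as np

def mean_variance_weights(mu, sigma, risk_aversion=5.0, fully_invested=True):
    """w = (1/gamma) * Sigma^{-1} mu, optionally rescaled to sum to one."""
    w = np.linalg.solve(sigma, mu) / risk_aversion
    return w / w.sum() if fully_invested else w

# Hypothetical one-step-ahead predictions for three sector indices.
mu_pred = np.array([0.004, 0.002, 0.003])
sigma_pred = np.array([[0.0010, 0.0004, 0.0003],
                       [0.0004, 0.0012, 0.0005],
                       [0.0003, 0.0005, 0.0009]])
print(mean_variance_weights(mu_pred, sigma_pred).round(3))
```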

6.
This paper analyzes the business cycle properties of the Hong Kong economy during the 1984–2011 period, which includes the financial crisis of 1997/98 and the economic crisis of 2008–2010. We show that the volatilities of output, of the growth rate of output and of real interest rates in Hong Kong are higher than the corresponding average volatilities among developed economies. Furthermore, interest rates are countercyclical. We build a stochastic neoclassical small open‐economy model, estimated with a Bayesian likelihood approach, that seeks to replicate the main business cycle characteristics of Hong Kong and through which we quantify the role played by exogenous total factor productivity (TFP) shocks (transitory and permanent), real interest rate shocks and financial frictions. The main finding is that financial frictions, jointly with the assumption that the country spread is endogenous, seem important in explaining the countercyclicality of the real interest rates.
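The business cycle facts cited here boil down to simple second moments: standard deviations of detrended series and the sign of the interest rate's correlation with output. A minimal sketch on hypothetical data, using log differences as a crude detrending step (the paper's filtering choices are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 112  # e.g. quarterly observations, 1984-2011
log_gdp = np.cumsum(rng.normal(0.01, 0.02, T))
# A real rate built to be countercyclical on purpose, plus noise.
real_rate = 2.0 - 30.0 * np.diff(log_gdp, prepend=log_gdp[0]) \
            + rng.normal(0, 0.5, T)

output_growth = np.diff(log_gdp)
print("std of output growth (%):", round(100 * output_growth.std(), 2))
print("corr(real rate, output growth):",
      round(np.corrcoef(real_rate[1:], output_growth)[0, 1], 2))
```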

7.
Given a simple stochastic model of technology adoption, we derive a function for technological diffusion that is logistic in the deterministic part and has an error term based on the binomial distribution. We derive two estimators, a generalized least squares (GLS) estimator and a maximum likelihood (ML) estimator, which should be more efficient than the ordinary least squares (OLS) estimators typically used to estimate technological diffusion functions. We compare the two new estimators with OLS using Monte Carlo techniques and find that, under perfect specification, GLS and ML are equally efficient and both are more efficient than OLS. There was no evidence of bias in any of the estimators. We used the estimators on some example data and found evidence suggesting that, under conditions of misspecification, the estimated variance-covariance matrix of the ML estimator is badly biased. We verified the existence of the bias with a second Monte Carlo experiment performed with a known misspecification. In the second experiment, GLS was the most efficient estimator, followed by ML, with OLS least efficient. We conclude that GLS is the estimator of choice.
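The GLS idea here exploits the binomial error structure: if F(t) is the logistic adoption share, the variance of the observed share is roughly F(1-F)/m, so residuals can be weighted by the model-implied binomial standard deviation. A sketch with hypothetical data (closer to a minimum chi-square criterion than the paper's exact GLS estimator):

```python
import numpy as np
from scipy.optimize import least_squares

def logistic_share(t, a, b):
    """Deterministic logistic diffusion path."""
    return 1.0 / (1.0 + np.exp(-(a + b * t)))

rng = np.random.default_rng(5)
t = np.arange(20)
m = 200  # hypothetical population size behind the binomial error
true = (-4.0, 0.5)
shares = rng.binomial(m, logistic_share(t, *true)) / m

def weighted_resid(params):
    f = logistic_share(t, *params)
    w = np.sqrt(np.clip(f * (1 - f) / m, 1e-8, None))  # binomial std dev
    return (shares - f) / w

fit = least_squares(weighted_resid, x0=[0.0, 0.1])
print("GLS-style estimates (a, b):", fit.x.round(3))
```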

8.
I describe a strategy for structural estimation that uses simulated maximum likelihood (SML) to estimate the structural parameters appearing in a model's first‐order conditions (FOCs). Generalized method of moments (GMM) is often the preferred method for estimation of FOCs, as it avoids distributional assumptions on stochastic terms, provided all structural errors enter the FOCs additively, giving a single composite additive error. But SML has advantages over GMM in models where multiple structural errors enter the FOCs nonadditively. I develop new simulation algorithms required to implement SML based on FOCs, and I illustrate the method using a model of U.S. multinational corporations.
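Simulated maximum likelihood in this spirit replaces an intractable integral over a latent structural error with a Monte Carlo average of conditional densities. A toy sketch under that logic; the model below is hypothetical and far simpler than the first-order conditions of a structural model:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(6)
n, S = 400, 200
x = rng.uniform(0.5, 2.0, n)
eps = rng.normal(0, 0.3, n)                        # latent structural error
y = np.exp(0.8 * x + eps) + rng.normal(0, 0.5, n)  # true theta = 0.8

sim_eps = rng.normal(0, 0.3, (S, 1))  # common draws reused across theta

def neg_sml(theta):
    # p(y_i | x_i) is approximated by the average over simulated eps draws
    # of the measurement-error density, here assumed N(0, 0.5^2) and known.
    mean_s = np.exp(theta * x[None, :] + sim_eps)
    dens = norm.pdf(y[None, :] - mean_s, scale=0.5).mean(axis=0)
    return -np.sum(np.log(np.clip(dens, 1e-300, None)))

res = minimize_scalar(neg_sml, bounds=(0.1, 2.0), method="bounded")
print("SML estimate of theta:", round(res.x, 3))
```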

9.
Estimation and forecasting for realistic continuous‐time stochastic volatility models is hampered by the lack of closed‐form expressions for the likelihood. In response, Andersen, Bollerslev, Diebold, and Labys (Econometrica, 71 (2003), 579–625) advocate forecasting integrated volatility via reduced‐form models for the realized volatility, constructed by summing high‐frequency squared returns. Building on the eigenfunction stochastic volatility models, we present analytical expressions for the forecast efficiency associated with this reduced‐form approach as a function of sampling frequency. For popular models like GARCH, multifactor affine, and lognormal diffusions, the reduced-form procedures perform remarkably well relative to the optimal (infeasible) forecasts.
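The reduced-form approach advocated here forecasts next-period integrated volatility with a simple time-series model fitted to the realized-volatility series itself. A minimal AR(1)-in-logs sketch on a hypothetical RV series (the eigenfunction SV machinery is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(7)
T = 500
log_rv = np.empty(T)      # hypothetical daily realized variance, in logs
log_rv[0] = -9.0
for t in range(1, T):
    log_rv[t] = -0.9 + 0.9 * log_rv[t - 1] + rng.normal(0, 0.4)

# Fit AR(1) by OLS: log RV_t = c + rho * log RV_{t-1} + u_t.
X = np.column_stack([np.ones(T - 1), log_rv[:-1]])
c, rho = np.linalg.lstsq(X, log_rv[1:], rcond=None)[0]
forecast = c + rho * log_rv[-1]
print(f"rho = {rho:.3f}, one-step-ahead log-RV forecast = {forecast:.3f}")
```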

10.
In this paper, we use the local maximum likelihood (LML) method proposed by Kumbhakar et al. (J Econom, 2007) to estimate stochastic cost frontier models for a sample of 3,691 U.S. commercial banks. This method addresses several deficiencies in the econometric estimation of frontier functions. In particular, we relax the assumption that all banks share the same production technology and provide bank-specific measures of returns to scale and cost inefficiency. The LML method is applied to estimate cost frontiers in which a truncated normal distribution is used to model technical inefficiency. This formulation allows the cost frontier, the inefficiency effects and the heteroskedasticity in both the noise and inefficiency components to be quite flexible.
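The building block here is the composed-error frontier likelihood. As a simplified stand-in, the sketch below fits a normal/half-normal cost frontier by global (not local) maximum likelihood on simulated data; the paper's truncated-normal, locally weighted version is not reproduced:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, halfnorm

rng = np.random.default_rng(8)
n = 1000
x = rng.normal(size=n)
# Cost frontier: ln C = b0 + b1*x + v + u, with inefficiency u >= 0.
v = rng.normal(0, 0.3, n)
u = halfnorm.rvs(scale=0.5, size=n, random_state=rng)
ln_cost = 1.0 + 0.7 * x + v + u

def negloglik(params):
    b0, b1, ln_sv, ln_su = params
    sv, su = np.exp(ln_sv), np.exp(ln_su)
    sigma, lam = np.hypot(sv, su), su / sv
    eps = ln_cost - b0 - b1 * x   # composed error v + u (cost side)
    # Normal/half-normal density: f(e) = (2/sigma) phi(e/sigma) Phi(lam*e/sigma).
    ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(lam * eps / sigma))
    return -ll.sum()

res = minimize(negloglik, x0=[0.0, 0.0, np.log(0.3), np.log(0.3)])
print("b0, b1:", res.x[:2].round(3),
      "sigma_v, sigma_u:", np.exp(res.x[2:]).round(3))
```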

11.
We present a maximum-likelihood-based composed-error model to estimate the market power of firms. In our model, the stochastic part of the supply relation includes two random components: the conventional two-sided error term and a random term capturing firm-specific conduct. Moreover, we provide a generalization of the scaled Stevenson stochastic frontier model in the context of doubly truncated normal distributions. We estimate the market power of Chicago-based airlines as an empirical example demonstrating the applicability of our estimation procedure.

12.
The objective of this paper is to put forward a new autoregressive asymmetric stochastic volatility (ARSVA) model for modeling volatility, and to compare the results obtained for this model with those of an autoregressive stochastic volatility model and another asymmetric volatility model, the asymmetric generalized autoregressive conditional heteroskedasticity model. The results obtained from maximum likelihood estimation show that volatility behavior is asymmetric in the majority of cases, a fact captured better by the ARSVA model than by the alternative models. Moreover, the ARSVA model is able to reproduce other stylized facts of such series, such as high kurtosis, no autocorrelation of returns, slow decay of the autocorrelation function of the squared returns, and high persistence.
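The stylized facts listed in this entry are easy to verify numerically: excess kurtosis, near-zero autocorrelation of returns, and slowly decaying autocorrelation of squared returns. A minimal sketch on hypothetical returns from a basic (symmetric, not ARSVA) SV process:

```python
import numpy as np

def acf(x, lag):
    """Sample autocorrelation at a given lag."""
    x = x - x.mean()
    return float(np.sum(x[:-lag] * x[lag:]) / np.sum(x * x))

rng = np.random.default_rng(9)
T = 5000
h = np.zeros(T)
for t in range(1, T):                 # persistent log-volatility
    h[t] = 0.97 * h[t - 1] + 0.25 * rng.normal()
r = np.exp(h / 2) * rng.normal(size=T)

kurt = np.mean((r - r.mean()) ** 4) / r.var() ** 2
print(f"kurtosis: {kurt:.2f} (normal = 3)")
print("ACF of returns, lag 1:", round(acf(r, 1), 3))
print("ACF of squared returns, lags 1, 5, 20:",
      [round(acf(r ** 2, k), 3) for k in (1, 5, 20)])
```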

13.
A simple algebraic estimation procedure is developed to estimate the parameters of diffusion models of new product acceptance. The procedure requires knowledge of the location of the point of inflection (based on actual data, analogous products, or management judgment). It is conceptually easy to use and can be implemented with a hand calculator. Since the procedure does not employ period-by-period time-series diffusion data, it is not expected to provide the best fit to the data compared to the maximum likelihood and nonlinear least squares estimation procedures. However, it can provide very reasonable estimates of the relative magnitudes of the parameters. In that respect, the procedure can be used to generate good starting values for the maximum likelihood and nonlinear least squares estimation procedures. In the absence of data, using management judgment about the point of inflection, the procedure can be implemented in a decision-support system to develop conditional diffusion curves for a new product. Data from four diverse innovations are used to illustrate the procedure.
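For a Bass-type diffusion curve, the inflection point pins the parameters down algebraically: the peak time satisfies t* = ln(q/p)/(p+q) and the peak adoption rate equals m(p+q)^2/(4q). A sketch that recovers p and q from an assumed peak time and peak adoption level (this uses the standard Bass geometry and is not necessarily the paper's exact procedure; the inputs are hypothetical):

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical inputs: market potential m, peak time t_star,
# and peak (non-cumulative) adoption rate n_star.
m, t_star, n_star = 1_000_000.0, 6.0, 80_000.0

# Bass relations: n_star = m*(p+q)^2/(4q) and t_star = ln(q/p)/(p+q).
# Let b = p + q; then q = m*b^2/(4*n_star) and p = b - q.
def peak_time_gap(b):
    q = m * b ** 2 / (4.0 * n_star)
    p = b - q
    return np.log(q / p) / b - t_star

# An interior peak requires q > p, which brackets b in (2n*/m, 4n*/m).
lo, hi = 2.0 * n_star / m + 1e-9, 4.0 * n_star / m - 1e-9
b = brentq(peak_time_gap, lo, hi)
q = m * b ** 2 / (4.0 * n_star)
p = b - q
print(f"p = {p:.4f}, q = {q:.4f}")
```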

14.
The GARCH diffusion model has attracted a great deal of attention in recent years, as it is able to describe financial time series better than many other models. This paper considers the problem of warrant pricing when the underlying asset follows the GARCH diffusion model. An analytical approximate solution for European option prices is derived by means of the Fourier transform. The approximate solution can be computed quickly with the fast Fourier transform (FFT) algorithm. Monte Carlo simulations show that the approximate solution is accurate and that the FFT implementation is fast and efficient, which enables us to investigate the volatility smile implied by the GARCH diffusion model. A method is then developed for maximum likelihood (ML) estimation of the GARCH diffusion model based on the efficient importance sampling (EIS) procedure. Furthermore, the empirical performance of the GARCH diffusion model is investigated in the valuation of Hang Seng Index (HSI) warrants traded on the Hong Kong Stock Exchange (HKEx). Empirical results show that the GARCH diffusion model outperforms the Black–Scholes (B–S) model in terms of pricing accuracy, indicating that a pricing model incorporating stochastic volatility can improve the pricing of warrants.
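Fourier-based option pricing of this kind needs only the characteristic function of the log price; the damped-transform representation of Carr and Madan then gives the call price as a one-dimensional integral amenable to the FFT. A sketch using the Black–Scholes characteristic function as a stand-in, since the paper's GARCH diffusion characteristic function is not reproduced here:

```python
import numpy as np

def bs_log_price_cf(u, s0, r, sigma, T):
    """Characteristic function of ln S_T under Black-Scholes."""
    mu = np.log(s0) + (r - 0.5 * sigma ** 2) * T
    return np.exp(1j * u * mu - 0.5 * sigma ** 2 * u ** 2 * T)

def call_price_fourier(strike, s0, r, sigma, T, alpha=1.5):
    """Carr-Madan damped transform, evaluated by simple quadrature."""
    k = np.log(strike)
    v = np.linspace(1e-6, 200.0, 20001)
    numer = np.exp(-r * T) * bs_log_price_cf(v - (alpha + 1) * 1j,
                                             s0, r, sigma, T)
    denom = alpha ** 2 + alpha - v ** 2 + 1j * (2 * alpha + 1) * v
    integrand = np.real(np.exp(-1j * v * k) * numer / denom)
    return np.exp(-alpha * k) / np.pi * np.sum(integrand) * (v[1] - v[0])

# Cross-checking against the Black-Scholes closed form is a useful sanity test.
print("Fourier price:", round(call_price_fourier(100, 100, 0.03, 0.2, 0.5), 4))
```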

15.
Researchers have become increasingly interested in estimating mixtures of stochastic frontiers. Mester (1993), Caudill (1993), and Polachek and Yoon (1987), for example, estimate stochastic frontier models for different regimes, assuming sample separation information is given. Building on earlier work by Lee and Porter (1984), Douglas, Conway, and Ferrier (1995) estimate a stochastic frontier switching regression model in the presence of noisy sample separation information. The purpose of this paper is to extend this earlier work by estimating a mixture of stochastic frontiers assuming no sample separation information, a case more likely to occur in practice than even noisy sample separation information. To estimate a mixture of stochastic frontiers with no sample separation information, an EM algorithm for obtaining maximum likelihood estimates is developed. The algorithm is used to estimate a mixture of stochastic (cost) frontiers using data on U.S. savings and loans for the years 1986, 1987, and 1988. Statistical evidence is found supporting the existence of a mixture of stochastic frontiers.
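An EM algorithm of this kind alternates between posterior component probabilities (E-step) and weighted parameter updates (M-step). A minimal sketch for a two-component normal mixture with no separation information, as a simplified stand-in for the mixture-of-frontiers likelihood:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(10)
x = np.concatenate([rng.normal(-1.0, 0.5, 300), rng.normal(1.5, 0.8, 200)])

# Initial guesses for (weight, means, stds) of the two components.
w, mu, sd = 0.5, np.array([-0.5, 0.5]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: posterior probability that each point came from component 0.
    d0 = w * norm.pdf(x, mu[0], sd[0])
    d1 = (1 - w) * norm.pdf(x, mu[1], sd[1])
    g = d0 / (d0 + d1)
    # M-step: weighted updates of the component parameters.
    w = g.mean()
    mu = np.array([np.average(x, weights=g), np.average(x, weights=1 - g)])
    sd = np.array([np.sqrt(np.average((x - mu[0]) ** 2, weights=g)),
                   np.sqrt(np.average((x - mu[1]) ** 2, weights=1 - g))])
print(f"weight: {w:.3f}, means: {mu.round(3)}, stds: {sd.round(3)}")
```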

16.
We build upon recent research that attributes the moderation of output volatility since the 1980s to the reduced volatility of Total Factor Productivity (TFP) by investigating the linkage between energy price fluctuations and the stochastic process for TFP. First, we estimate a joint stochastic process for the energy price and TFP and establish that, until around 1982, energy prices negatively affected TFP; this spillover has since disappeared. Second, we show that, within the framework of a Dynamic Stochastic General Equilibrium (DSGE) model, the disappearance of this energy-productivity spillover accounts for close to 68 percent of the moderation in output volatility.
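The first estimation step described here amounts to a joint time-series model in which lagged energy price growth can enter the TFP process; the spillover is the coefficient on that term, allowed to differ before and after a break around 1982. A sketch with hypothetical data and a hard-coded break date (not the paper's estimated joint process):

```python
import numpy as np

rng = np.random.default_rng(11)
T, break_t = 240, 120                   # e.g. quarterly data with a 1982 break
energy = rng.normal(0, 1, T)            # energy price growth (hypothetical)
spill = np.where(np.arange(T) < break_t, -0.3, 0.0)  # spillover dies at break
tfp = spill * np.roll(energy, 1) + rng.normal(0, 0.5, T)

def ols_slope(y, x):
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

pre = ols_slope(tfp[1:break_t], energy[:break_t - 1])
post = ols_slope(tfp[break_t + 1:], energy[break_t:-1])
print(f"energy->TFP coefficient, pre-break: {pre:.3f}, post-break: {post:.3f}")
```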

17.
In recent years, diffusion models for interest rates have become very popular. In this paper, we select a suitable diffusion model for the Italian short rate. Our data set consists of the yields on 3‐month BOT (Buoni Ordinari del Tesoro) from 1981 to 2001, for a total of 470 observations. We investigate stochastic volatility models, paying particular attention to affine models. Estimating diffusion models via maximum likelihood, which would be efficient, is usually infeasible because the transition density is not available. Recently, Gallant and Tauchen (1996) proposed a method of moments that attains full efficiency, hence its name, the Efficient Method of Moments (EMM); it selects the moments as the scores of an auxiliary model, computed via simulation; thus, EMM is suited to diffusions whose transition density is unknown but which are convenient to simulate. The auxiliary model is selected from a family of densities that spans the density space. As a by‐product, EMM provides diagnostics that are easy to compute and interpret. We find evidence that one‐factor models and multi‐factor affine models are rejected, while a logarithmic specification of the volatility provides the best fit to the data.
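EMM only requires that the candidate diffusion be easy to simulate. A minimal Euler-Maruyama sketch for a square-root (CIR-type) short-rate diffusion dr = kappa*(theta - r)dt + sigma*sqrt(r)dW, with hypothetical parameter values; the auxiliary-model scores are not computed here:

```python
import numpy as np

def simulate_cir(r0, kappa, theta, sigma, T_years, steps_per_year, rng):
    """Euler-Maruyama path of dr = kappa*(theta - r)dt + sigma*sqrt(r)dW."""
    dt = 1.0 / steps_per_year
    n = int(T_years * steps_per_year)
    r = np.empty(n + 1)
    r[0] = r0
    for t in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))
        r[t + 1] = r[t] + kappa * (theta - r[t]) * dt \
                   + sigma * np.sqrt(max(r[t], 0.0)) * dw
    return r

rng = np.random.default_rng(12)
path = simulate_cir(r0=0.10, kappa=0.5, theta=0.08, sigma=0.05,
                    T_years=20, steps_per_year=250, rng=rng)
print(f"mean {path.mean():.4f}, std {path.std():.4f}  (theta = 0.08)")
```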

18.
This paper develops a Bayesian model comparison of two broad classes of varying-volatility models, the generalized autoregressive conditional heteroskedasticity and stochastic volatility models, for financial time series. The leverage effect, jumps and heavy‐tailed errors are incorporated into both models. For estimation, efficient Markov chain Monte Carlo methods are developed, and the model comparisons are based on the marginal likelihood. The empirical analyses use daily return data on US stock indices, individual securities, and the exchange rates of UK sterling and the Japanese yen against the US dollar. The estimation results indicate that the stochastic volatility model with leverage and Student‐t errors yields the best performance among the competing models.
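Marginal-likelihood comparison integrates the likelihood over the prior, p(y) = integral of p(y|theta)p(theta)dtheta. The paper uses efficient MCMC-based estimators; the sketch below shows only the naive prior-sampling Monte Carlo version for a toy model, to make the quantity being compared concrete (all priors and data are hypothetical):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(13)
y = rng.normal(0.7, 1.0, 50)            # hypothetical data

def log_marginal_likelihood(y, prior_mean, prior_sd, draws=20_000):
    """Naive Monte Carlo: p(y) is the mean over prior draws of p(y|theta)."""
    theta = rng.normal(prior_mean, prior_sd, draws)
    loglik = norm.logpdf(y[None, :], loc=theta[:, None], scale=1.0).sum(axis=1)
    m = loglik.max()                     # log-mean-exp for stability
    return m + np.log(np.mean(np.exp(loglik - m)))

# Compare two "models": a prior centered near the truth vs. far from it.
lml_a = log_marginal_likelihood(y, prior_mean=0.5, prior_sd=1.0)
lml_b = log_marginal_likelihood(y, prior_mean=5.0, prior_sd=1.0)
print(f"log Bayes factor (A vs. B): {lml_a - lml_b:.2f}")
```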

19.
This article proposes a threshold stochastic volatility model that generates volatility forecasts specifically designed for value at risk (VaR) estimation. The method incorporates extreme downside shocks by modelling left-tail returns separately from other returns. Left-tail returns are generated with a t-distributional process based on the historically observed conditional excess kurtosis. This specification allows VaR estimates to be generated with extreme downside impacts, yet remains widely applicable empirically. The article applies the model to daily returns of seven major stock indices over a 22-year period and compares its forecasts with those of several other forecasting methods. Based on back-testing outcomes and likelihood ratio tests, the new model provides reliable estimates and outperforms the others.
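The back-testing step checks whether the observed frequency of VaR violations matches the target coverage; Kupiec's unconditional-coverage likelihood ratio test is the standard tool for this, though the paper's exact battery of tests is not reproduced. A self-contained sketch with hypothetical violation counts:

```python
import numpy as np
from scipy.stats import chi2

def kupiec_lr(n_obs, n_violations, coverage=0.01):
    """Kupiec unconditional-coverage LR test for VaR back-testing."""
    x, n, p = n_violations, n_obs, coverage
    phat = x / n
    log_l0 = (n - x) * np.log(1 - p) + x * np.log(p)        # null: rate = p
    log_l1 = (n - x) * np.log(1 - phat) + x * np.log(phat)  # observed rate
    lr = -2.0 * (log_l0 - log_l1)
    return lr, 1.0 - chi2.cdf(lr, df=1)

# Hypothetical: 1% VaR, 1500 trading days, 24 violations (15 expected).
lr, pval = kupiec_lr(1500, 24, coverage=0.01)
print(f"LR = {lr:.2f}, p-value = {pval:.3f}")
```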

20.
This article analyses multivariate stochastic volatilities (SVs) with a common factor influencing volatilities in the prices of crude oil and agricultural commodities used for both biofuel and non-biofuel purposes. Modelling volatility is crucial because it is an important input for asset allocation, risk management and derivative pricing. We develop an SV model comprising a latent common volatility factor with two asymptotic regimes and a smooth transition between them. In contrast to conventional volatility models, the SVs are generated by the logistic transformation of latent factors, which comprise two components: the common volatility factor and an idiosyncratic component. We estimate an SV model with a common factor for oil, corn and wheat from 8 August 2005 to 10 October 2014, using a Markov chain Monte Carlo method to estimate the SVs and extract the common volatility factor. We find that the volatilities of the oil and grain markets are persistent. According to the estimated common volatility factor, high-volatility periods match the 2007–2009 recession and the 2007–2008 financial crisis quite well. Finally, the extracted common volatility factor exhibits a distinct pattern.
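The distinctive piece here is that volatility is a logistic transformation of latent factors, bounded between a low and a high regime. A minimal simulation sketch of one series driven by a common factor plus an idiosyncratic component, with all parameter values hypothetical:

```python
import numpy as np

rng = np.random.default_rng(14)
T = 1500
sig_low, sig_high = 0.005, 0.04   # the two asymptotic volatility regimes

# Latent factors: persistent common factor + idiosyncratic component.
f = np.zeros(T)
z = np.zeros(T)
for t in range(1, T):
    f[t] = 0.98 * f[t - 1] + 0.2 * rng.normal()   # common volatility factor
    z[t] = 0.90 * z[t - 1] + 0.1 * rng.normal()   # idiosyncratic component

# Smooth transition between regimes via the logistic transformation.
sigma = sig_low + (sig_high - sig_low) / (1.0 + np.exp(-(f + z)))
returns = sigma * rng.normal(size=T)
print(f"volatility range: {sigma.min():.4f} to {sigma.max():.4f}")
```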
