Similar Documents
20 similar documents found (search time: 31 ms).
1.
This paper examines out-of-sample option pricing performances for the affine jump diffusion (AJD) models by using the S&P 500 stock index and its associated option contracts. In particular, we investigate the role of time-varying jump risk premia in the AJD specifications. Our empirical analysis shows strong evidence in favor of time-varying jump risk premia in pricing cross-sectional options. We also find that, during a period of low volatility, the role of jump risk premia becomes less pronounced, making the differences across pricing performances of the AJD models not as substantial as during a period of high volatility. This finding can possibly explain the poor pricing performances of the sophisticated AJD models in some previous studies whose sample periods can be characterized by low volatility.

2.
This paper examines a model of short-term interest rates that incorporates stochastic volatility as an independent latent factor into the popular continuous-time mean-reverting model of Chan et al. (J Financ 47:1209–1227, 1992). I demonstrate that this two-factor specification can be efficiently estimated within a generalized method of moments (GMM) framework using a judicious choice of moment conditions. The GMM procedure is compared to a Kalman filter estimation approach. Empirical estimation is implemented on US Treasury bill yields using both techniques. A Monte Carlo study of the finite sample performance of the estimators shows that GMM produces more heavily biased estimates than does the Kalman filter, and with generally larger mean squared errors.
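The two-factor model above is beyond a short sketch, but the GMM logic for the underlying one-factor mean-reverting model of Chan et al. (1992) can be illustrated on simulated data. This is a hedged sketch, not the paper's estimator: all parameter values are illustrative, and the four just-identified moment conditions are solved sequentially (least squares for the drift, a one-dimensional root for the elasticity parameter).

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)

# Simulate a discretized CKLS-type short rate:
#   r_{t+1} - r_t = alpha + beta*r_t + sigma * r_t**gamma * e_{t+1}
alpha, beta, sigma, gamma = 0.01, -0.1, 0.05, 0.5   # illustrative values
T = 20000
r = np.empty(T + 1)
r[0] = 0.06
for t in range(T):
    r[t + 1] = max(r[t] + alpha + beta * r[t]
                   + sigma * r[t] ** gamma * rng.standard_normal(), 1e-4)

rt, rt1 = r[:-1], r[1:]

# Moments 1-2: E[eps] = 0, E[eps*r] = 0  ->  OLS of (r_{t+1}-r_t) on (1, r_t)
X = np.column_stack([np.ones_like(rt), rt])
a_hat, b_hat = np.linalg.lstsq(X, rt1 - rt, rcond=None)[0]
eps = rt1 - rt - a_hat - b_hat * rt

# Moments 3-4: E[eps^2 - s^2 r^{2g}] = 0 and E[(eps^2 - s^2 r^{2g}) r] = 0.
# Eliminating s^2 leaves one equation in g, solved by bisection.
def h(g):
    w = rt ** (2 * g)
    return (eps**2 * rt).mean() / (eps**2).mean() - (w * rt).mean() / w.mean()

g_hat = brentq(h, 0.0, 3.0)
s_hat = np.sqrt((eps**2).mean() / (rt ** (2 * g_hat)).mean())
```

With 20,000 simulated observations the elasticity and volatility parameters are recovered closely; shrinking the sample reproduces the biases that motivate the Monte Carlo comparison in the abstract.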

3.
We develop novel methods for estimation and filtering of continuous-time models with stochastic volatility and jumps using so-called Approximate Bayesian Computation (ABC), which builds likelihoods based on limited information. The proposed estimators and filters are computationally attractive relative to standard likelihood-based versions since they rely on low-dimensional auxiliary statistics and so avoid computation of high-dimensional integrals. Despite their computational simplicity, we find that the estimators and filters perform well in practice and lead to precise estimates of model parameters and latent variables. We show how the methods can incorporate intra-daily information to improve estimation and filtering. In particular, the availability of realized volatility measures helps us in learning about parameters and latent states. The method is employed in the estimation of a flexible stochastic volatility model for the dynamics of the S&P 500 equity index. We find evidence of the presence of a dynamic jump rate and in favor of a structural break in parameters at the time of the recent financial crisis. We find evidence that possible measurement error in log price is small and has little effect on parameter estimates. Smoothing shows that, recently, volatility and the jump rate have returned to the low levels of 2004–2006.
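The core ABC idea of matching low-dimensional auxiliary statistics can be sketched in a few lines. The toy model, prior range, and tolerances below are all illustrative assumptions, not the paper's specification: we estimate the diffusive volatility of a jump-contaminated return series by rejection sampling on the sample standard deviation and excess kurtosis.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" returns from a toy jump model with unknown diffusive vol sigma0
sigma0, jump_prob, jump_scale, n = 1.0, 0.05, 3.0, 2000

def simulate(sigma, rng):
    z = sigma * rng.standard_normal(n)
    jumps = jump_scale * rng.standard_normal(n) * (rng.random(n) < jump_prob)
    return z + jumps

obs = simulate(sigma0, rng)

# Low-dimensional auxiliary statistics: std and excess kurtosis
def stats(x):
    s = x.std()
    k = ((x - x.mean())**4).mean() / x.var()**2 - 3.0
    return np.array([s, k])

s_obs = stats(obs)

# ABC rejection: keep prior draws whose simulated statistics land near s_obs
draws, kept = 5000, []
for _ in range(draws):
    sigma = rng.uniform(0.1, 3.0)          # flat prior on sigma
    d = np.abs(stats(simulate(sigma, rng)) - s_obs)
    if d[0] < 0.05 and d[1] < 1.0:         # componentwise tolerances
        kept.append(sigma)

posterior_mean = float(np.mean(kept))
```

The accepted draws approximate the posterior of sigma without ever evaluating a likelihood, which is exactly what makes the approach attractive for models whose likelihood involves high-dimensional integrals.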

4.

We propose a fully Bayesian approach to non-life risk premium rating, based on hierarchical models with latent variables for both claim frequency and claim size. Inference is based on the joint posterior distribution and is performed by Markov Chain Monte Carlo. Rather than plug-in point estimates of all unknown parameters, we take into account all sources of uncertainty simultaneously when the model is used to predict claims and estimate risk premiums. Several models are fitted to both a simulated dataset and a small portfolio regarding theft from cars. We show that interaction among latent variables can improve predictions significantly. We also investigate when interaction is not necessary. We compare our results with those obtained under a standard generalized linear model and show through numerical simulation that geographically located and spatially interacting latent variables can successfully compensate for missing covariates. However, when applied to the real portfolio data, the proposed models are not better than standard models due to the lack of spatial structure in the data.
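The full hierarchical model requires MCMC, but the "no plug-in" principle can be sketched with a conjugate Poisson-Gamma claim-frequency model, for which the per-policy posterior is available in closed form. All parameter values below are illustrative assumptions: the point is that the premium is computed from posterior predictive draws, so parameter uncertainty is propagated rather than discarded.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy portfolio: Poisson claim counts with a Gamma latent claim frequency
alpha, beta = 2.0, 20.0            # Gamma(alpha, rate=beta) prior, mean 0.1
n_pol = 500
exposure = rng.uniform(0.5, 2.0, n_pol)
lam = rng.gamma(alpha, 1 / beta, n_pol)       # latent per-policy frequency
counts = rng.poisson(lam * exposure)

# Conjugate posterior per policy: Gamma(alpha + counts, rate = beta + exposure).
# Instead of plugging in the posterior mean, draw from the posterior and
# propagate the uncertainty into the predicted number of claims.
n_draws = 2000
lam_draws = rng.gamma(alpha + counts, 1 / (beta + exposure), (n_draws, n_pol))
pred_claims = rng.poisson(lam_draws * exposure)   # posterior predictive

premium = pred_claims.mean(axis=0)   # expected claims per policy
```

In the full model of the abstract the same logic runs inside an MCMC sampler, with claim size and spatially interacting latent variables layered on top.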

5.
This study aims to shed light on the debate concerning the choice between discrete-time and continuous-time hazard models in making bankruptcy or any binary prediction using interval censored data. Building on the theoretical suggestions from various disciplines, we empirically compare widely used discrete-time hazard models (with logit and clog-log links) and the continuous-time Cox Proportional Hazards (CPH) model in predicting bankruptcy and financial distress of United States small and medium-sized enterprises (SMEs). Consistent with the theoretical arguments, we report that discrete-time hazard models are superior to the continuous-time CPH model in making binary predictions using interval censored data. Moreover, hazard models developed using a failure definition based jointly on bankruptcy laws and firms’ financial health exhibit superior goodness of fit and classification measures, in comparison to models that employ a failure definition based either on bankruptcy laws or firms’ financial health alone.
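A discrete-time hazard model with a logit link is, operationally, a logistic regression on stacked firm-year ("person-period") data: each firm contributes one row per year survived, with the event indicator equal to 1 in the failure year. The sketch below simulates such data and recovers the hazard coefficient with a from-scratch Newton-Raphson logit fit; the data-generating values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulate firm-level failure with a constant per-year logit hazard
n_firms, max_t = 3000, 10
b0, b1 = -2.5, 1.0                      # illustrative hazard coefficients
x = rng.standard_normal(n_firms)        # a single firm covariate

rows_x, rows_y = [], []
for i in range(n_firms):
    h = 1.0 / (1.0 + np.exp(-(b0 + b1 * x[i])))   # per-period hazard
    for t in range(max_t):
        failed = rng.random() < h
        rows_x.append(x[i])
        rows_y.append(1.0 if failed else 0.0)
        if failed:
            break                        # firm exits the risk set

X = np.column_stack([np.ones(len(rows_x)), rows_x])
y = np.array(rows_y)

# Newton-Raphson for the logit MLE on the firm-year panel
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))

b0_hat, b1_hat = beta
```

Replacing the logit link with the clog-log link changes only the link function in the likelihood; the interval-censored data layout stays the same, which is why these models handle such data so naturally.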

6.
We investigate whether the diversification discount occurs partly as an artifact of poor corporate governance. In panel data models, we find that the discount narrows by 16% to 21% when we add governance variables as regression controls. We also estimate Heckman selection models that account for the endogeneity of diversification and dynamic panel generalized method of moments models that account for the endogeneity of both diversification and governance. We find that the diversification discount persists even with these controls for endogeneity. However, in selection models the discount disappears entirely when we introduce governance variables in the second stage, and in dynamic panel GMM models the discount narrows by 37% when we include governance variables.

7.
Structural equation models (SEMs) have been widely used in behavioural, educational, medical and socio-psychological research for exploring and confirming relations among observed and latent variables. In the existing SEMs, the unknown coefficients in the measurement and structural equations are assumed to be constant with respect to time. This assumption does not always hold, as the relations among the observed and latent variables vary with time in some situations. In this paper, we propose nonlinear dynamical structural equation models to cope with these situations, and explore the nonlinear dynamics of the relations among the variables involved. A local maximum likelihood-based estimation procedure is proposed. We investigate a bootstrap resampling-based test for the hypothesis that the coefficient is constant with respect to time, as well as confidence bands for the unknown coefficients. Intensive simulation studies are conducted to show the empirical performance of the proposed estimation procedure, hypothesis test statistic and confidence band. Finally, a real example in relation to the stock market of Hong Kong is presented to demonstrate the proposed methodologies.

8.
We conduct out-of-sample density forecast evaluations of the affine jump diffusion models for the S&P 500 stock index and its option contracts. We also examine the time-series consistency between the model-implied spot volatilities using options & returns and only returns. In particular, we focus on the role of the time-varying jump risk premia. Particle filters are used to estimate the model-implied spot volatilities. We also propose the beta transformation approach for recursive parameter updating. Our empirical analysis shows that the inconsistencies between options & returns and only returns are resolved by the introduction of the time-varying jump risk premia. For density forecasts, the time-varying jump risk premia models dominate the other models in terms of likelihood criteria. We also find that for medium-term horizons, the beta transformation can weaken the systematic effect of misspecified AJD models using options & returns.

9.
According to the bivariate mixture hypothesis (BMH) as proposed by Tauchen and Pitts (1983) and Harris (1986, 1987), the daily price changes and the corresponding trading volume on speculative markets follow a joint mixture of distributions with the unobservable number of daily information events serving as the mixing variable. Using German stock market data of 15 major companies, the distributional properties of the BMH are tested employing maximum-likelihood as well as generalised method of moments estimation techniques. In addition to providing a new approach for the pointwise estimation of the latent information arrival rate based on the maximum-likelihood method, we investigate the time-series properties of the BMH. The major results can be summarised as follows: (i) the distributional characteristics of the data (especially leptokurtosis and skewness in the distribution of price changes and volume, respectively) cannot be explained satisfactorily by the BMH; univariate mixture models for price changes and trading volume separately reveal a possible specification error in the model; (ii) a univariate normal mixture model can account for the observed distributional characteristics of price changes; (iii) the estimated process of the latent information rate cannot fully explain the time-series characteristics of the data (especially the volatility clustering or ARCH-effects).
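Finding (ii), that a univariate normal mixture captures the leptokurtosis of price changes, is easy to illustrate with a two-component expectation-maximization (EM) fit. The sketch below uses simulated "price changes" from a calm and an information-driven regime; the component volatilities and mixing weight are illustrative assumptions, and the components are restricted to zero mean for simplicity.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "price changes": mixture of a calm and an information-driven regime
n = 4000
calm = rng.normal(0.0, 0.5, size=int(0.7 * n))
busy = rng.normal(0.0, 2.0, size=n - calm.size)
x = np.concatenate([calm, busy])

# EM for a two-component zero-mean normal mixture (weight p, vols s1 < s2)
p, s1, s2 = 0.5, 0.3, 1.0               # starting values
for _ in range(200):
    # E-step: responsibility of the low-volatility component
    d1 = p * np.exp(-x**2 / (2 * s1**2)) / s1
    d2 = (1 - p) * np.exp(-x**2 / (2 * s2**2)) / s2
    w = d1 / (d1 + d2)
    # M-step: update the weight and the component volatilities
    p = w.mean()
    s1 = np.sqrt((w * x**2).sum() / w.sum())
    s2 = np.sqrt(((1 - w) * x**2).sum() / (1 - w).sum())
```

The fitted mixture reproduces the fat tails that a single normal cannot, which is the mechanism the BMH attributes to a random number of daily information events.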

10.
In high-frequency financial data not only returns, but also waiting times between consecutive trades are random variables. Therefore, it is possible to apply continuous-time random walks (CTRWs) as phenomenological models of the high-frequency price dynamics. An empirical analysis performed on the 30 DJIA stocks shows that the waiting-time survival probability for high-frequency data is non-exponential. This fact imposes constraints on agent-based models of financial markets.
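The non-exponential survival test amounts to comparing the empirical survival function of waiting times with the exponential survival implied by their mean. As a hedged stand-in for real trade data, the sketch below uses Weibull waiting times with shape below one, a common heavy-tailed alternative, and measures the maximum gap from the matched exponential.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy inter-trade waiting times: a Weibull with shape < 1 is "stretched"
# relative to an exponential, mimicking the non-exponential survival
# probability reported for high-frequency data.
w = rng.weibull(0.6, size=5000)   # shape 0.6, unit scale (illustrative)

# Empirical survival function P(T > t) on a grid
t = np.linspace(0.0, 5.0, 200)
surv_emp = (w[None, :] > t[:, None]).mean(axis=1)

# Survival of the exponential with the same mean (the memoryless benchmark)
surv_exp = np.exp(-t / w.mean())

max_gap = np.abs(surv_emp - surv_exp).max()
```

A large `max_gap` is the qualitative signature reported for the DJIA stocks; for exponential waiting times it would shrink toward zero as the sample grows.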

11.
Abstract

In this article we investigate three related investment-consumption problems for a risk-averse investor: (1) an investment-only problem that involves utility from only terminal wealth, (2) an investment-consumption problem that involves utility from only consumption, and (3) an extended investment-consumption problem that involves utility from both consumption and terminal wealth. Although these problems have been studied quite extensively in continuous-time frameworks, we focus on discrete time. Our contributions are (1) to model these investment-consumption problems using a discrete model that incorporates the environment risk and mortality risk, in addition to the market risk that is typically considered, and (2) to derive explicit expressions of the optimal investment-consumption strategies to these modeled problems. Furthermore, economic implications of our results are presented. It is reassuring that many of our findings are consistent with the well-known results from the continuous-time models, even though our models have the additional features of modeling the environment uncertainty and the uncertain exit time.

12.
We propose alternative generalized method of moments (GMM) tests that are analytically solvable in many econometric models, yielding in particular analytical GMM tests for asset pricing models with time-varying risk premiums. We also provide simulation evidence showing that the proposed tests have good finite sample properties and that their asymptotic distribution is reliable for the sample size commonly used. We apply our tests to study the number of latent factors in the predictable variations of the returns on portfolios grouped by industries. Using data from October 1941 to September 1986 and two sets of instrumental variables, we find that the tests reject a one-factor model but not a two-factor one.

13.
Journal of Banking & Finance, 2006, 30(11): 3131–3146
Subsequent to the influential paper of [Chan, K.C., Karolyi, G.A., Longstaff, F.A., Sanders, A.B., 1992. An empirical comparison of alternative models of the short-term interest rate. Journal of Finance 47, 1209–1227], the generalised method of moments (GMM) has been a popular technique for estimation and inference relating to continuous-time models of the short-term interest rate. GMM has been widely employed to estimate model parameters and to assess the goodness-of-fit of competing short-rate specifications. The current paper conducts a series of simulation experiments to document the bias and precision of GMM estimates of short-rate parameters, as well as the size and power of the J-test of over-identifying restrictions of [Hansen, L.P., 1982. Large sample properties of generalised method of moments estimators. Econometrica 50, 1029–1054]. While the J-test appears to have appropriate size and good power in sample sizes commonly encountered in the short-rate literature, GMM estimates of the speed of mean reversion are shown to be severely biased. Consequently, it is dangerous to draw strong conclusions about the strength of mean reversion using GMM. In contrast, the parameter capturing the levels effect, which is important in differentiating between competing short-rate specifications, is estimated with little bias.
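The severe small-sample bias in the estimated speed of mean reversion is easy to reproduce by simulation. The sketch below uses a discretized Vasicek model, for which the just-identified GMM drift conditions reduce to least squares; the parameter values and 25-year monthly sample size are illustrative, chosen to resemble the short-rate literature.

```python
import numpy as np

rng = np.random.default_rng(4)

# Monte Carlo: bias in the estimated speed of mean reversion for a
# discretized Vasicek model dr = kappa*(mu - r)dt + sigma dW.
kappa, mu, sigma, dt = 0.5, 0.06, 0.02, 1 / 12
T = 300                      # 25 years of monthly observations
n_rep, k_hats = 500, []
for _ in range(n_rep):
    r = np.empty(T + 1)
    r[0] = mu
    for t in range(T):
        r[t + 1] = (r[t] + kappa * (mu - r[t]) * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal())
    # Just-identified GMM with moments E[eps]=0, E[eps*r]=0 is OLS of
    # (r_{t+1} - r_t) on (1, r_t); the slope estimates -kappa*dt.
    X = np.column_stack([np.ones(T), r[:-1]])
    slope = np.linalg.lstsq(X, np.diff(r), rcond=None)[0][1]
    k_hats.append(-slope / dt)

bias = float(np.mean(k_hats) - kappa)
```

The average estimate overstates kappa substantially, the same direction of bias the paper documents, because the persistence parameter of a near-unit-root AR(1) is biased downward in short samples.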

14.
In an L-framework, we present majorant-preserving and sandwich-preserving extension theorems for linear operators. These results are then applied to price systems derived by a reasonable restriction of the class of applicable equivalent martingale measures. Our results prove the existence of a no-good-deal pricing measure for price systems consistent with bounds on the Sharpe ratio. We treat both discrete- and continuous-time market models. Within this study we present definitions of no-good-deal pricing measures that are equivalent to the existing ones and extend them to discrete-time models. We introduce the corresponding version of dynamic no-good-deal pricing measures in the continuous-time setting.

15.
In the last decade, fast Fourier transform (FFT) methods have become the standard tool for pricing and hedging with affine jump diffusion (AJD) models, even though the FFT theoretical framework is still in development and the early solutions are known to have serious problems in terms of stability and accuracy. This popularity stems from the substantial computational gain that the FFT approach offers over standard Fourier transform methods that make use of a canonical inverse Lévy formula. In this work we revisit a classic FT method and find that changing the quadrature algorithm and using alternative, less flawed representations of the pricing formulas can improve the computational performance to levels only about three times slower than FFT. This makes it possible to combine reasonable computational speed with the well-known stability and accuracy of canonical FT methods.
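As a hedged illustration of the canonical inverse-transform approach (not the authors' specific quadrature scheme), the Black-Scholes case can be priced by Gil-Pelaez-style inversion of the characteristic function with adaptive quadrature and checked against the closed form. All contract parameters are illustrative.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Characteristic function of ln S_T under Black-Scholes (risk-neutral)
S0, K, r, sig, T = 100.0, 105.0, 0.03, 0.2, 1.0

def phi(u):
    return np.exp(1j * u * (np.log(S0) + (r - 0.5 * sig**2) * T)
                  - 0.5 * sig**2 * T * u**2)

# Gil-Pelaez style inversion for the two in-the-money probabilities
k = np.log(K)

def Pj(share_measure):
    if share_measure:   # P1: under the share measure, CF is phi(u-i)/phi(-i)
        f = lambda u: (np.exp(-1j * u * k) * phi(u - 1j)
                       / (1j * u * phi(-1j))).real
    else:               # P2: risk-neutral probability S_T > K
        f = lambda u: (np.exp(-1j * u * k) * phi(u) / (1j * u)).real
    return 0.5 + quad(f, 0.0, 200.0, limit=400)[0] / np.pi

call_ft = S0 * Pj(True) - K * np.exp(-r * T) * Pj(False)

# Closed-form Black-Scholes for comparison
d1 = (np.log(S0 / K) + (r + 0.5 * sig**2) * T) / (sig * np.sqrt(T))
call_bs = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - sig * np.sqrt(T))
```

For AJD models only `phi` changes, which is the appeal of the transform approach; the trade-off the abstract discusses is between this adaptive, per-strike quadrature and the FFT's simultaneous evaluation over a strike grid.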

16.
Abstract:  Econometric models involving a discrete outcome dependent variable abound in the finance and accounting literatures. However, much of the literature to date utilises a basic or standard logit model. Capitalising on recent developments in the discrete choice literature, we examine three advanced (or non-IID) logit models, namely: nested logit, mixed logit and latent class MNL. Using an illustration from corporate takeovers research, we compare the explanatory and predictive performance of each class of advanced model relative to the standard model. We find that in all cases the more advanced logit model structures, which correct for the highly restrictive IID and IIA conditions, provide significantly greater explanatory power than standard logit. Mixed logit and latent class MNL models exhibited the highest overall predictive accuracy on a holdout sample, while the standard logit model performed the worst. Moreover, the analysis of marginal effects of all models indicates that use of advanced models can lead to more insightful and behaviourally meaningful interpretations of the role and influence of explanatory variables and parameter estimates in model estimation. The results of this paper have implications for the use of better-suited logit structures in future research and practice.

17.
Prior to the financial crisis, most economists probably did not view the zero lower bound (ZLB) as a major problem for central banks. Using a range of structural and statistical models, we find that previous research understated the ZLB threat by ignoring uncertainty about model parameters and latent variables, focusing too much on the Great Moderation experience, and relying on structural models whose dynamics cannot generate sustained ZLB episodes. Our analysis also suggests that the Federal Reserve's asset purchases, while materially improving macroeconomic conditions, did not prevent the ZLB constraint from having first‐order adverse effects on real activity and inflation.

18.
This paper empirically studies the role of macro-factors in explaining and predicting daily bond yields. In general, macro-finance models use low-frequency data to match with macroeconomic variables available only at low frequencies. To deal with this, we construct and estimate a tractable no-arbitrage affine model with both conventional latent factors and macro-factors by imposing cross-equation restrictions on the daily yields of bonds with different maturities, credit risks, and inflation indexation. The estimation results using both the US and the UK data show that the estimated macro-factors significantly predict actual inflation and the output gap. In addition, our daily macro-term structure model forecasts better than no-arbitrage models with only latent factors as well as other statistical models.

19.
Abstract

The evaluation of multiple integrals which occur in order statistics distribution theory is involved because the integration is to be carried out over an ordered range of variables of integration. This difficulty is sometimes completely obviated by transforming the ordered variates to unordered ones. Several such transformations are available in the theory of multiple integrals. In previous papers [2, 3] the author used one such transformation, and gave alternative simplified proofs of several known results in the distribution theory of order statistics from the exponential and the power function distributions. In this paper we use such a known transformation to derive moments (and distributions if necessary) of order statistics from the Pareto distribution. Malik [4] has derived moments of order statistics from this distribution without the transformation of the ordered variates to unordered ones. The process of direct integration used by Malik becomes complicated for dealing with the moments of more than two ordered variates. Further, the method which we use here is uniformly applicable to derive the moments or the distributions of one or more ordered variables, and gives the distributions and moments without any complicated steps in integration. The transformation used by us considerably simplifies the manipulations necessary for the derivation of moments or the Mellin transforms, and thus we hope that our paper would at least be of pedagogical interest.
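A numerical check of the transformation idea for the standard Pareto distribution with cdf 1 - x^(-a): since X = (1 - U)^(-1/a) is increasing in U, and the ordered uniform U_(r) follows Beta(r, n - r + 1), beta moments give the closed form E[X_(r:n)] = Gamma(n+1) Gamma(n-r+1-1/a) / (Gamma(n-r+1) Gamma(n+1-1/a)) for n - r + 1 > 1/a. The specific a, n, r below are illustrative; Monte Carlo on ordered uniforms confirms the formula.

```python
import numpy as np
from math import lgamma, exp

rng = np.random.default_rng(5)

# Mean of the r-th order statistic from a standard Pareto(a) sample of size n
a, n, r = 3.0, 10, 4   # illustrative values; requires n - r + 1 > 1/a

# Closed form via the uniform transformation and beta moments
mean_exact = exp(lgamma(n + 1) + lgamma(n - r + 1 - 1 / a)
                 - lgamma(n - r + 1) - lgamma(n + 1 - 1 / a))

# Monte Carlo check: order n uniforms, transform, take the r-th smallest
reps = 200000
U = np.sort(rng.random((reps, n)), axis=1)
X = (1.0 - U) ** (-1.0 / a)        # increasing transform preserves order
mean_mc = X[:, r - 1].mean()
```

The same beta-moment route yields any product moment of several order statistics at once, which is exactly where Malik's direct integration becomes cumbersome.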

20.
We propose new generalized method of moments (GMM) estimators for the number of latent factors in linear factor models. The estimators are appropriate for data with a large (small) number of cross-sectional observations and a small (large) number of time series observations. The estimation procedure is simple and robust to the configurations of idiosyncratic errors encountered in practice. In addition, the method can be used to evaluate the validity of observable candidate factors. Monte Carlo experiments show that the proposed estimators have good finite-sample properties. Applying the estimators to international stock markets, we find that international stock returns are explained by one strong global factor. This factor is highly correlated with the Fama–French factors from the U.S. stock market. This result can be interpreted as evidence of market integration. We also find two weak factors closely related to markets in Europe and the Americas, respectively.
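The problem of estimating the number of latent factors can be illustrated with a simple eigenvalue-ratio criterion on simulated panel data. This is a deliberate stand-in for exposition, not the paper's GMM estimators: the number of factors is read off where the ratio of consecutive covariance eigenvalues peaks.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy panel of returns with k0 = 2 latent factors plus idiosyncratic noise
T, N, k0 = 200, 50, 2
F = rng.standard_normal((T, k0))       # factor realizations
L = rng.standard_normal((N, k0))       # loadings
R = F @ L.T + 0.5 * rng.standard_normal((T, N))

# Eigenvalue-ratio criterion: factor eigenvalues grow with N, noise
# eigenvalues stay bounded, so the ratio jumps at the true factor count.
eig = np.linalg.eigvalsh(R.T @ R / T)[::-1]   # descending order
ratios = eig[:9] / eig[1:10]
k_hat = int(np.argmax(ratios)) + 1
```

The same separation between "strong" and bounded eigenvalues underlies the distinction the abstract draws between the one strong global factor and the two weak regional factors.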


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号