Similar Documents
20 similar documents found (search time: 31 ms)
1.
Many empirical studies suggest that the distribution of risk factors has heavy tails. It is commonly assumed that the underlying risk factors follow a multivariate normal distribution, an assumption in conflict with this empirical evidence. We instead consider a multivariate t distribution to capture the heavy tails, together with a quadratic function of the risk-factor changes to describe a non-linear asset. Although Monte Carlo analysis is by far the most powerful method for evaluating a portfolio Value-at-Risk (VaR), a major drawback of this method is that it is computationally demanding. In this paper, we first express the portfolio risk in terms of the returns by using a quadratic approximation for the portfolio. Second, we model the risk factors of the returns using both a multivariate normal and a multivariate t distribution. We then provide a bootstrap algorithm with importance resampling and develop the Laplace method to improve simulation efficiency, so as to estimate the portfolio loss probability and evaluate the portfolio VaR. Importance resampling is a powerful tool for reducing the number of random draws required in the bootstrap setting. In the simulation study and sensitivity analysis of the bootstrap method, we observe that the estimates of the quantile and tail probability obtained with importance resampling are more efficient than those from the naive Monte Carlo method. We also note that the estimates of the quantile and the tail probability are not sensitive to the estimated parameters of the multivariate normal and the multivariate t distribution. The research of Shih-Kuei Lin was partially supported by the National Science Council under grant NSC 93-2146-H-259-023. The research of Cheng-Der Fuh was partially supported by the National Science Council under grant NSC 94-2118-M-001-028.
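
A minimal sketch of the plain Monte Carlo baseline that such importance-resampling schemes aim to improve on: simulate multivariate-t risk-factor changes, apply a quadratic (delta–gamma) approximation of the portfolio loss, and read the empirical loss quantile off as the VaR estimate. The sensitivities, covariance matrix, and degrees of freedom below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mvt_sample(cov, df, size, rng):
    """Draw multivariate-t risk-factor changes by scaling correlated normals."""
    z = rng.multivariate_normal(np.zeros(cov.shape[0]), cov, size=size)
    g = rng.chisquare(df, size=size) / df
    return z / np.sqrt(g)[:, None]

d = 3                                   # number of risk factors (assumed)
delta = np.array([1.0, -0.5, 0.8])      # first-order sensitivities (assumed)
gamma = 0.1 * np.eye(d)                 # second-order sensitivities (assumed)
cov = 0.01 * np.eye(d)                  # covariance of risk-factor changes (assumed)

dx = mvt_sample(cov, df=5, size=100_000, rng=rng)
# Quadratic (delta-gamma) approximation of the loss: L = -(delta'dx + 0.5 dx'Gamma dx)
loss = -(dx @ delta + 0.5 * np.einsum("ij,jk,ik->i", dx, gamma, dx))

var_99 = np.quantile(loss, 0.99)        # naive Monte Carlo 99% VaR
print(f"99% VaR (naive Monte Carlo): {var_99:.4f}")
```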

2.
We propose a new procedure to estimate the loss given default (LGD) distribution. Owing to the complicated shape of the LGD distribution, using a smooth density function as a driver to estimate it may result in a decline in model fit. To overcome this problem, we first apply the logistic regression to estimate the LGD cumulative distribution function. Then, we convert the result into the LGD distribution estimate. To implement the newly proposed estimation procedure, we collect a sample of 5269 defaulted debts from Moody’s Default and Recovery Database. A performance study is performed using 2000 pairs of in-sample and out-of-sample data-sets with different sizes that are randomly selected from the entire sample. Our results show that the newly proposed procedure has better and more robust performance than its alternatives, in the sense of yielding more accurate in-sample and out-of-sample LGD distribution estimates. Thus, it is useful for studying the LGD distribution.
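
A hedged sketch of the general idea (not the authors' exact implementation): fit a logistic regression of the indicator 1{LGD ≤ t} on debt covariates over a grid of thresholds t, collect the fitted probabilities as an estimated LGD cumulative distribution function for a given debt, and difference it to obtain a distribution estimate. The data, covariates, and threshold grid below are simulated for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 2))                         # illustrative debt covariates
lgd = np.clip(rng.beta(0.4, 0.6, size=n) + 0.05 * X[:, 0], 0.0, 1.0)

thresholds = np.linspace(0.05, 0.95, 19)            # grid of LGD thresholds (assumed)
x_new = np.array([[0.0, 0.0]])                      # debt whose LGD law we want

cdf = []
for t in thresholds:
    y = (lgd <= t).astype(int)                      # indicator 1{LGD <= t}
    fitted = LogisticRegression(max_iter=1000).fit(X, y)
    cdf.append(fitted.predict_proba(x_new)[0, 1])   # estimated P(LGD <= t | x_new)

cdf = np.maximum.accumulate(cdf)                    # enforce a monotone CDF
pmf = np.diff(np.concatenate(([0.0], cdf)))         # probability mass per LGD bin
print(np.round(pmf, 3))
```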

3.
Elliptical distributions are useful for modelling multivariate data, multivariate normal and Student t distributions being two special classes. In this paper, we provide a definition for the elliptical tempered stable (ETS) distribution based on its characteristic function, which involves a unique spectral measure. This definition provides a framework for connecting infinitely divisible distributions (in particular the ETS distribution) with fractional calculus. In addition, a definition for the ETS copula is discussed. A simulation study shows the accuracy of this definition, in comparison to the normal copula, for measuring the dependency of data. An empirical study of stock market index returns for 20 countries shows the usefulness of the theoretical results.

4.
Abstract

In this paper we investigate the valuation of investment guarantees in a multivariate (discrete-time) framework. We present how to build multivariate models in general, and we survey the most important multivariate GARCH models. A direct multivariate application of regime-switching models is also discussed, as is the estimation of these models using maximum likelihood and their comparison in a multivariate setting. The computation of the CTE provision is also presented. We estimate the models with a multivariate dataset (Canada, United States, United Kingdom, and Japan) and compare the quality of their fit using multiple criteria and tests. We observe that multivariate GARCH models provide a better overall fit than regime-switching models. However, regime-switching models appropriately represent the fat tails of the returns distribution, which is where most GARCH models fail. This leads to significant differences in the value of the CTE provisions; in general, provisions computed with regime-switching models are higher. Thus, the results of this multivariate analysis are in line with what has been obtained in the univariate literature.
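
The CTE (conditional tail expectation) provision being compared across models is, in essence, the average of the worst (1 − α) fraction of simulated outcomes. A minimal sketch on illustrative simulated losses:

```python
import numpy as np

def cte(losses, alpha=0.95):
    """Average loss beyond the alpha-quantile of the simulated losses."""
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

rng = np.random.default_rng(2)
simulated_losses = rng.lognormal(mean=0.0, sigma=0.6, size=50_000)  # illustrative only
print(f"CTE at 95%: {cte(simulated_losses):.3f}")
```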

5.
The Tweedie distribution, which features a probability mass at zero, is a convenient tool for insurance claims modeling and pure premium determination in general insurance. Motivated by the fact that an insurance policy typically provides multiple types of coverage, we propose a copula-based multivariate Tweedie regression for modeling the semi-continuous claims while accommodating the association among different types. The proposed approach also allows for dispersion modeling, resulting in a multivariate version of the double generalized linear model. We demonstrate the application in insurance ratemaking using a portfolio of policyholders of automobile insurance from the state of Massachusetts in the United States.

6.
The use of mixture distributions for modeling asset returns has a long history in finance. New methods of demonstrating support for the presence of mixtures in the multivariate case are provided. The use of a two-component multivariate normal mixture distribution, coupled with shrinkage via a quasi-Bayesian prior, is motivated, and shown to be numerically simple and reliable to estimate, unlike the majority of multivariate GARCH models in existence. Equally important, it provides a clear improvement, with respect to out-of-sample density forecasting, over the GARCH models that are feasible for use with a large number of assets, such as constant conditional correlation, dynamic conditional correlation, and their extensions. A generalization to a mixture of multivariate Laplace distributions is motivated via univariate and multivariate analysis of the data, and an expectation–maximization algorithm is developed for its estimation in conjunction with a quasi-Bayesian prior. It is shown to deliver significantly better forecasts than the mixed normal, with fast and numerically reliable estimation. Crucially, the distribution theory required for portfolio theory and risk assessment is developed.
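
A minimal sketch of fitting a two-component multivariate normal mixture to returns by EM, using scikit-learn's GaussianMixture (plain maximum likelihood, without the quasi-Bayesian shrinkage prior described above). The two-asset return sample is simulated purely for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Two illustrative regimes: a calm component and a rarer, more volatile one.
calm = rng.multivariate_normal([0.0005, 0.0004], 1e-4 * np.eye(2), size=900)
stress = rng.multivariate_normal([-0.002, -0.003], 9e-4 * np.eye(2), size=100)
returns = np.vstack([calm, stress])

gm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(returns)
print("mixture weights:", np.round(gm.weights_, 3))
print("component means:", np.round(gm.means_, 5))
```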

7.
This paper seeks to characterise the distribution of extreme returns for a UK share index over the years 1975 to 2000. In particular, the suitability of the following distributions is investigated: Gumbel, Frechet, Weibull, Generalised Extreme Value, Generalised Pareto, Log‐Normal and Generalised Logistic. Daily returns for the FT All Share index were obtained from Datastream, and the maxima and minima of these daily returns over a variety of selection intervals were calculated. Plots of summary statistics for the weekly maxima and minima on statistical distribution maps suggested that the best fitting distribution would be either the Generalised Extreme Value or the Generalised Logistic. The results from fitting each of these two distributions to extremes of a series of UK share returns support the conclusion that the Generalised Logistic distribution best fits the UK data for extremes over the period of the study. The Generalised Logistic distribution has fatter tails than either the log‐normal or the Generalised Extreme Value distribution, hence this finding is of importance to investors who are concerned with assessing the risk of a portfolio.
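
A hedged sketch of the block-maxima step described above: take weekly maxima of daily returns and fit a Generalised Extreme Value distribution by maximum likelihood with scipy; fitting and comparing the Generalised Logistic would follow the same pattern. The daily returns here are simulated stand-ins for the FT All Share series.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
daily_returns = stats.t.rvs(df=4, scale=0.01, size=5 * 52 * 26, random_state=rng)

weekly_max = daily_returns.reshape(-1, 5).max(axis=1)   # 5 trading days per week
shape, loc, scale = stats.genextreme.fit(weekly_max)    # GEV fit by maximum likelihood
print(f"GEV fit: shape={shape:.3f}, loc={loc:.4f}, scale={scale:.4f}")
```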

8.
Abstract

Credibility is a form of insurance pricing that is widely used, particularly in North America. The theory of credibility has been called a “cornerstone” in the field of actuarial science. Students of the North American actuarial bodies also study loss distributions, that is, the process of statistical inference that relates a set of data to a theoretical (loss) distribution. In this work, we develop a direct link between credibility and loss distributions through the notion of a copula, a tool for understanding relationships among multivariate outcomes.

This paper develops credibility using a longitudinal data framework. In a longitudinal data framework, one might encounter data from a cross section of risk classes (towns) with a history of insurance claims available for each risk class. For the marginal claims distributions, we use generalized linear models, an extension of linear regression that also encompasses Weibull and Gamma regressions. Copulas are used to model the dependencies over time; specifically, this paper is the first to propose using a t-copula in the context of generalized linear models. The t-copula is the copula associated with the multivariate t-distribution; like the univariate t-distribution, it seems especially suitable for empirical work. Moreover, we show that the t-copula gives rise to easily computable predictive distributions that we use to generate credibility predictors. Like Bayesian methods, our copula credibility prediction methods allow us to provide an entire distribution of predicted claims, not just a point prediction.

We present an illustrative example of Massachusetts automobile claims, and compare our new credibility estimates with those currently existing in the literature.

9.
Systematic longevity risk is increasingly relevant for public pension schemes and insurance companies that provide life benefits. In view of this, mortality models should incorporate dependence between lives. However, the independent lifetime assumption is still heavily relied upon in the risk management of life insurance and annuity portfolios. This paper applies a multivariate Tweedie distribution to incorporate dependence, which it induces through a common shock component. Model parameter estimation is developed based on the method of moments and generalized to allow for truncated observations. The estimation procedure is explicitly developed for various important distributions belonging to the Tweedie family, and finally assessed using simulation.

10.
In this paper, we propose an explicit estimation of Value-at-Risk (VaR) and Expected Shortfall (ES) for linear portfolios when the risk-factor changes follow a convex mixture of generalized Laplace distributions (M-GLD). We introduce the dynamic Delta-GLD-VaR, Delta-GLD-ES, Delta-MGLD-VaR and Delta-MGLD-ES by using conditional-correlation multivariate GARCH. The generalized Laplace distribution imposes less restrictive assumptions during estimation, which should improve the precision of the VaR and ES through the varying shape and fat tails of the risk factors relative to the historical sample data. We also suggest some areas of application, including measuring price risk in agriculture, risk management and financial portfolio optimization.

11.
In this paper, a new class of composite models is proposed for modeling actuarial claims data of mixed sizes. The models are developed using the Stoppa distribution and a mode-matching procedure. The use of the Stoppa distribution allows for more flexibility over the thickness of the tail, and the mode-matching procedure gives a simple derivation of composite models with a variety of distributions. In particular, the Weibull–Stoppa and the Lognormal–Stoppa distributions are investigated. Their performance is compared with existing composite models in the context of the well-known Danish fire insurance data-set. The results suggest the composite Weibull–Stoppa model outperforms the existing composite models in all seven goodness-of-fit measures considered.

12.
Quantitative Finance, 2013, 13(6): 426–441
Abstract

The benchmark theory of mathematical finance is the Black–Scholes–Merton (BSM) theory, based on Brownian motion as the driving noise process for stock prices. Here the distributions of financial returns of the stocks in a portfolio are multivariate normal. Risk management based on BSM underestimates tails. Hence estimation of tail behaviour is often based on extreme value theory (EVT). Here we discuss a semi-parametric replacement for the multivariate normal involving normal variance–mean mixtures. This allows a more accurate modelling of tails, together with various degrees of tail dependence, while (unlike EVT) the whole return distribution can be modelled. We use a parametric component, incorporating the mean vector μ and covariance matrix Σ, and a non-parametric component, which we can think of as a density on [0,∞), modelling the shape (in particular the tail decay) of the distribution. We work mainly within the family of elliptically contoured distributions, focusing particularly on normal variance mixtures with self-decomposable mixing distributions. We discuss efficient methods to estimate the parametric and non-parametric components of our model and provide an algorithm for simulating from such a model. We fit our model to several financial data series. Finally, we calculate value at risk (VaR) quantities for several portfolios and compare these VaRs to those obtained from simple multivariate normal and parametric mixture models.
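
A minimal sketch of simulating from a normal variance mixture: draw a positive mixing variable W, then set X = μ + √W·Az with z standard normal and AAᵀ = Σ. Here the mixing law is a parametric inverse-gamma, which makes X multivariate Student-t; the paper's semi-parametric component (a non-parametric mixing density) would replace this parametric choice.

```python
import numpy as np

rng = np.random.default_rng(5)
mu = np.array([0.0, 0.0])
sigma = np.array([[1.0, 0.6],
                  [0.6, 1.0]])
A = np.linalg.cholesky(sigma)

n, df = 100_000, 5
w = df / rng.chisquare(df, size=n)            # inverse-gamma mixing variable (assumed)
z = rng.standard_normal(size=(n, 2))
x = mu + np.sqrt(w)[:, None] * (z @ A.T)      # normal variance mixture draw

print("sample correlation:")
print(np.round(np.corrcoef(x.T), 3))
```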

13.
The quality of operational risk data sets suffers from missing or contaminated data points. This may lead to implausible characteristics of the estimates. Outliers, especially, can make a modeler's task difficult and can result in arbitrarily large capital charges. Robust statistics provides ways to deal with these problems as well as measures for the reliability of estimators. We show that using maximum likelihood estimation can be misleading and unreliable assuming typical operational risk severity distributions. The robustness of the estimators for the Generalized Pareto distribution, and the Weibull and Lognormal distributions is measured considering both global and local reliability, which are represented by the breakdown point and the influence function of the estimate.
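
A hedged illustration of the non-robustness point: a single extreme outlier in a lognormal severity sample can noticeably shift the maximum-likelihood parameter estimates, and hence a high quantile of the kind used for capital charges. The data, distribution, and quantile level below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
losses = stats.lognorm.rvs(s=1.0, scale=np.exp(10), size=500, random_state=rng)

def q999(sample):
    """99.9% severity quantile under a lognormal ML fit (location fixed at zero)."""
    s, loc, scale = stats.lognorm.fit(sample, floc=0)
    return stats.lognorm.ppf(0.999, s, loc=loc, scale=scale)

clean = q999(losses)
contaminated = q999(np.append(losses, 1e9))   # add a single extreme outlier
print(f"99.9% quantile: clean {clean:.3e}, with outlier {contaminated:.3e}")
```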

14.
This paper analyzes the optimal portfolio choice problem when security returns have a joint multivariate normal distribution with unknown parameters. For the case of limited, but sufficient (sample plus prior) information, we show that for a general family of conjugate priors, the optimal portfolio choice is obtained by the use of a mean-variance analysis that differs from traditional mean-variance analysis due to estimation risk. We also consider two illustrative cases of insufficient sample information and minimal prior information and show that in these cases it is asymptotically optimal for an investor to limit diversification to a subset of the securities. These theoretical results corroborate observed investor behavior in capital markets.

15.
In view of the shortcomings of traditional heavy-tailed distributions such as the Gamma, Lognormal and Weibull in fitting catastrophe risk, this paper, on the one hand, theoretically analyses the POT model and its relative advantages in fitting heavy-tailed catastrophe risk, and, on the other hand, applies the POT model and the GPD distribution to fit data on direct economic losses from earthquakes in China from 1952 to 2008, finding that the POT model fits the heavy tail of catastrophe risk better than the Gamma, Lognormal...
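
A minimal sketch of the peaks-over-threshold (POT) step: choose a high threshold, take the exceedances, and fit a Generalised Pareto distribution to them with scipy. The losses below are simulated, not the earthquake loss series used in the paper, and the threshold choice is an assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
losses = stats.lognorm.rvs(s=1.2, scale=100.0, size=2000, random_state=rng)  # simulated

u = np.quantile(losses, 0.95)                        # high threshold (assumed choice)
excesses = losses[losses > u] - u                    # peaks over the threshold
shape, loc, scale = stats.genpareto.fit(excesses, floc=0)
print(f"threshold u={u:.1f}, GPD shape={shape:.3f}, scale={scale:.1f}")
```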

16.
Given multivariate time series, we study the problem of forming portfolios with maximum mean reversion while constraining the number of assets in these portfolios. We show that it can be formulated as a sparse canonical correlation analysis and study various algorithms to solve the corresponding sparse generalized eigenvalue problems. After discussing penalized parameter estimation procedures, we study the sparsity versus predictability trade-off and the significance of predictability in various markets.

17.
The realized-GARCH framework is extended to incorporate the two-sided Weibull distribution, for the purpose of volatility and tail risk forecasting in a financial time series. Further, the realized range, as a competitor for realized variance or daily returns, is employed as the realized measure in the realized-GARCH framework. Sub-sampling and scaling methods are applied to both the realized range and realized variance, to help deal with inherent micro-structure noise and inefficiency. A Bayesian Markov Chain Monte Carlo (MCMC) method is adapted and employed for estimation and forecasting, while various MCMC efficiency and convergence measures are employed to assess the validity of the method. In addition, the properties of the MCMC estimator are assessed and compared with maximum likelihood, via a simulation study. Compared to a range of well-known parametric GARCH and realized-GARCH models, tail risk forecasting results across seven market indices, as well as two individual assets, clearly favour the proposed realized-GARCH model incorporating the two-sided Weibull distribution; especially those employing the sub-sampled realized variance and sub-sampled realized range.

18.
This paper examines international equity market co-movements using time-varying copulae. We examine distributions from the class of Symmetric Generalized Hyperbolic (SGH) distributions for modelling the univariate marginals of equity index returns. Based on goodness-of-fit testing, we show that the SGH class outperforms the normal distribution, that the Student-t assumption on the marginals leads to the best performance, and that it can thus be used to fit a multivariate copula for the joint distribution of equity index returns. We show that the Student-t copula is not only superior to the Gaussian copula, whose dependence structure relates to the multivariate normal distribution, but also outperforms some alternative mixture copula models that allow for asymmetric dependencies in the tails of the distribution. The Student-t copula with Student-t marginals allows realistic modelling of simultaneous co-movements and captures tail dependency in the equity index returns. From the point of view of risk management, it is a good candidate for modelling the returns arising in an international equity index portfolio, where extreme losses are known to have a tendency to occur simultaneously. We apply copulae to the estimation of Value-at-Risk and Expected Shortfall, and show that the Student-t copula with Student-t marginals is superior to the alternative copula models investigated, as well as to the RiskMetrics approach.
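
A hedged sketch of the simulation step only: draw from a Student-t copula, map to Student-t marginals for two index returns, and read portfolio VaR and Expected Shortfall off the simulated equally weighted portfolio. The degrees of freedom, correlation, and marginal scale are illustrative assumptions rather than fitted values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
nu_cop = 5                                           # copula degrees of freedom (assumed)
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])                        # copula correlation (assumed)
n = 100_000

# Student-t copula sample: scale correlated normals by a chi-square mixing
# variable, then push through the t CDF to obtain dependent uniforms.
z = rng.multivariate_normal(np.zeros(2), corr, size=n)
w = rng.chisquare(nu_cop, size=n) / nu_cop
u = stats.t.cdf(z / np.sqrt(w)[:, None], df=nu_cop)

# Student-t marginals for the two index returns (assumed df and scale).
returns = 0.01 * stats.t.ppf(u, df=4)
loss = -returns.mean(axis=1)                         # equally weighted portfolio loss

var_99 = np.quantile(loss, 0.99)
es_99 = loss[loss >= var_99].mean()
print(f"99% VaR: {var_99:.4f}, 99% ES: {es_99:.4f}")
```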

19.
A bivariate generalized autoregressive conditional heteroskedastic model with dynamic conditional correlation and leverage effect (DCC-GJR-GARCH) for modelling financial time series data is considered. For robustness it is helpful to assume a multivariate Student-t distribution for the innovation terms. This paper proposes a new modified multivariate t-distribution which is a robustifying distribution and offers independent marginal Student-t distributions with different degrees of freedom, thereby highlighting the relationship among different assets. A Bayesian approach with adaptive Markov chain Monte Carlo methods is used for statistical inference. A simulation experiment illustrates good performance in estimation over reasonable sample sizes. In the empirical studies, the pairwise relationship between the Australian stock market and foreign exchange market, and between the US stock market and crude oil market are investigated, including out-of-sample volatility forecasts.

20.
We develop portfolio optimization problems for a nonlife insurance company seeking to find the minimum capital required that simultaneously satisfies solvency and portfolio performance constraints. Motivated by standard insurance regulations, we consider solvency capital requirements based on three criteria: ruin probability, conditional Value-at-Risk, and expected policyholder deficit ratio. We propose a novel semiparametric formulation for each problem and explore the advantages of implementing this methodology over other potential approaches. When liabilities follow a Lognormal distribution, we provide sufficient conditions for convexity for each problem. Using different expected return on capital target levels, we construct efficient frontiers when portfolio assets are modeled with a special class of multivariate GARCH models. We find that the correlation between asset returns plays an important role in the behavior of the optimal capital required and the portfolio structure. The stability and out-of-sample performance of our optimal solutions are empirically tested with respect to both the solvency requirement and portfolio performance, through a double rolling window estimation exercise.
