Similar Literature (20 results)
1.
The CreditRisk+ model is widely used in industry for computing the loss distribution of a credit portfolio. The standard CreditRisk+ model assumes independence among a set of common risk factors, a simplifying assumption that leads to computational ease. In this article, we propose to model the common risk factors by a class of multivariate extreme copulas that generalizes bivariate Fréchet copulas. Further, we present a conditional compound Poisson model to approximate the credit portfolio and provide a cost-efficient recursive algorithm to calculate the loss distribution. The new model is more flexible than the standard model and retains computational advantages over other dependence models for the risk factors.
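
The paper's recursion is specific to its conditional compound Poisson construction; as a baseline illustration of this type of algorithm, here is a minimal sketch of the classical Panjer recursion for an ordinary compound Poisson loss on a discretized severity (all parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def panjer_poisson(lam, f, smax):
    """Aggregate-loss pmf g for a compound Poisson sum via Panjer recursion.

    lam  : Poisson frequency parameter
    f    : severity pmf on the lattice 0, 1, ..., len(f)-1
    smax : largest aggregate loss (in lattice units) to compute
    """
    g = np.zeros(smax + 1)
    g[0] = np.exp(lam * (f[0] - 1.0))            # P(S = 0)
    for s in range(1, smax + 1):
        jmax = min(s, len(f) - 1)
        j = np.arange(1, jmax + 1)
        g[s] = (lam / s) * np.sum(j * f[j] * g[s - j])
    return g

# Illustrative severity on {1, 2, 3} with mean 1.9, frequency 5
f = np.array([0.0, 0.4, 0.3, 0.3])
g = panjer_poisson(lam=5.0, f=f, smax=60)
print(g.sum(), (np.arange(61) * g).sum())        # ~1 and ~5 * 1.9 = 9.5
```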

2.
Operational risk data, when available, are usually scarce, heavy-tailed and possibly dependent. In this work, we introduce a model that captures such real-world characteristics and explicitly deals with heterogeneous pairwise and tail dependence of losses. By considering flexible families of copulas, we can easily move beyond modeling bivariate dependence among losses and estimate the total risk capital for the seven- and eight-dimensional distributions of event types and business lines. Using real-world data, we then evaluate the impact of realistic dependence modeling on estimating the total regulatory capital, which turns out to be up to 38% smaller than what the standard Basel approach would prescribe.
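
A sketch of the general exercise, under assumptions of our own (a t-copula with a common pairwise correlation, identical lognormal marginals, and a comonotonic sum standing in for the simple add-up of marginal risk measures); the paper's copula families, dimensions and data differ:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
d, nu, n = 7, 4.0, 200_000                       # 7 event types, t-copula df, sample size
R = 0.3 * np.ones((d, d)) + 0.7 * np.eye(d)      # hypothetical correlation matrix

# Sample the t-copula: U = t_nu.cdf(Z / sqrt(W/nu)), Z ~ N(0, R), W ~ chi2(nu)
Z = rng.multivariate_normal(np.zeros(d), R, size=n)
W = rng.chisquare(nu, size=(n, 1))
U = stats.t.cdf(Z / np.sqrt(W / nu), df=nu)

# Hypothetical heavy-tailed lognormal marginals per event type
losses = stats.lognorm.ppf(U, s=1.5, scale=np.exp(10.0))
total = losses.sum(axis=1)

var_999 = np.quantile(total, 0.999)              # diversified total capital
add_up = stats.lognorm.ppf(0.999, s=1.5, scale=np.exp(10.0)) * d
print(var_999, add_up, 1 - var_999 / add_up)     # relative capital reduction
```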

3.
ABSTRACT

Modeling multivariate time-series aggregate losses is an important actuarial topic that is very challenging because losses can be serially dependent, with heterogeneous dependence structures across loss types and business lines. In this paper, we investigate a flexible class of multivariate Cox Hidden Markov Models for the joint arrival process of loss events. Some of the nice properties possessed by this class of models, such as closed-form expressions, thinning properties and model versatility, are discussed in detail. We provide the expectation-maximization (EM) algorithm for efficient model calibration. Applying the proposed model to an operational risk dataset, we demonstrate that the model offers sufficient flexibility to capture most characteristics of the observed loss frequencies. By modeling the log-transformed loss severities through a mixture of Erlang distributions, we can model the aggregate losses. Finally, out-of-sample testing shows that the proposed model is adequate for predicting short-term future operational risk losses.
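
For the severity part, a minimal sketch of a mixture of Erlangs with a common rate, the class the authors fit by EM, applied here to raw (rather than log-transformed) severities; weights, shapes and rate below are hypothetical:

```python
import numpy as np
from scipy import stats

# Hypothetical mixture of Erlangs with common rate: weights, integer shapes, rate
w = np.array([0.5, 0.3, 0.2])
shapes = np.array([1, 3, 7])
rate = 2.0

def erlang_mix_pdf(x):
    """Density of the Erlang mixture (dense in the positive continuous laws)."""
    return sum(wk * stats.erlang.pdf(x, a=k, scale=1.0 / rate)
               for wk, k in zip(w, shapes))

def erlang_mix_rvs(n, rng):
    """Simulate severities: pick a component, then draw an Erlang variate."""
    comp = rng.choice(len(w), size=n, p=w)
    return rng.gamma(shape=shapes[comp], scale=1.0 / rate)

rng = np.random.default_rng(1)
x = erlang_mix_rvs(100_000, rng)
print(x.mean(), (w * shapes / rate).sum())   # simulated vs analytic mean (1.4)
```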

4.
Abstract

Credibility is a form of insurance pricing that is widely used, particularly in North America. The theory of credibility has been called a “cornerstone” in the field of actuarial science. Students of the North American actuarial bodies also study loss distributions, the process of statistical inference that relates a set of data to a theoretical (loss) distribution. In this work, we develop a direct link between credibility and loss distributions through the notion of a copula, a tool for understanding relationships among multivariate outcomes.

This paper develops credibility using a longitudinal data framework. In a longitudinal data framework, one might encounter data from a cross section of risk classes (towns) with a history of insurance claims available for each risk class. For the marginal claims distributions, we use generalized linear models, an extension of linear regression that also encompasses Weibull and Gamma regressions. Copulas are used to model the dependencies over time; specifically, this paper is the first to propose using a t-copula in the context of generalized linear models. The t-copula is the copula associated with the multivariate t-distribution; like the univariate t-distributions, it seems especially suitable for empirical work. Moreover, we show that the t-copula gives rise to easily computable predictive distributions that we use to generate credibility predictors. Like Bayesian methods, our copula credibility prediction methods allow us to provide an entire distribution of predicted claims, not just a point prediction.

We present an illustrative example of Massachusetts automobile claims, and compare our new credibility estimates with those currently existing in the literature.
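
For reference, the t-copula used here is the copula of the multivariate t-distribution. With \nu degrees of freedom and correlation matrix \Sigma it can be written (a standard definition, not specific to this paper) as

C_{\nu,\Sigma}(u_1,\dots,u_n) \;=\; t_{\nu,\Sigma}\!\bigl(t_\nu^{-1}(u_1),\dots,t_\nu^{-1}(u_n)\bigr),

where t_{\nu,\Sigma} is the joint distribution function of the multivariate t and t_\nu^{-1} is the quantile function of the univariate t with the same degrees of freedom; as \nu \to \infty it reduces to the Gaussian copula, while for finite \nu it exhibits tail dependence.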

5.
In this paper, we propose to identify the dependence structure that exists between returns on equity and commodity futures and its development over the past 20 years. The key point is that we do not impose any dependence structure, but let the data select it. To do so, we model the dependence between commodity (metal, agriculture and energy) and stock markets using a flexible approach that allows us to investigate whether the co-movement is: (i) symmetrical and frequent, (ii) symmetrical and mostly present during extreme events, or (iii) asymmetrical and mostly present during extreme events. We also allow this dependence to be time-varying from January 1990 to February 2012. Our analysis uncovers three major stylised facts. First, we find that the dependence between commodity and stock markets is time-varying, symmetrical and occurs most of the time (as opposed to mostly during extreme events). Second, not allowing for time-varying parameters in the dependence distribution generates a bias towards evidence of tail dependence. Similarly, considering only tail dependence may lead to false evidence of asymmetry. Third, a growing co-movement between industrial metals and equity markets is identified as early as 2003; this co-movement spreads to all commodity classes and becomes unambiguously stronger with the global financial crisis after Fall 2008.

6.
In this paper, we study data from the yearly reports that the four major Swedish non-life insurers have sent to the Swedish Financial Supervisory Authority (FSA). We aim to find marginal distributions of, and dependence between, losses on the five largest lines of business (LoBs) in order to create models for solvency capital requirement (SCR) calculation. We try to use data in an optimal way by sensibly defining an accounting-year loss in terms of actuarial liability predictions and by pooling observations from several companies when possible to decrease the uncertainty about the underlying distributions and their parameters. We find that dependence between LoBs is weaker in our data than what is assumed in the Solvency II standard formula. We also find dependence between companies that may affect financial stability and must be taken into account when estimating loss distribution parameters. Moreover, we discuss under what circumstances an insurer is better (or worse) off using an internal model for SCR calculation instead of the standard formula.
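
The standard-formula aggregation against which such estimated dependence is compared is the square-root formula SCR = sqrt(s' Corr s); a minimal sketch with hypothetical stand-alone SCRs and placeholder correlation entries (not the actual Solvency II matrix):

```python
import numpy as np

# Stand-alone SCRs for five hypothetical lines of business (same unit)
scr = np.array([120.0, 80.0, 60.0, 45.0, 30.0])

# Hypothetical inter-LoB correlation matrix (the standard formula fixes these)
corr = np.array([
    [1.00, 0.50, 0.25, 0.25, 0.25],
    [0.50, 1.00, 0.25, 0.25, 0.50],
    [0.25, 0.25, 1.00, 0.25, 0.25],
    [0.25, 0.25, 0.25, 1.00, 0.25],
    [0.25, 0.50, 0.25, 0.25, 1.00],
])

scr_agg = np.sqrt(scr @ corr @ scr)   # SCR = sqrt(s' Corr s)
print(scr_agg, scr.sum())             # diversified vs undiversified capital
```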

7.
Abstract

Dufresne et al. (1991) introduced a general risk model defined as the limit of compound Poisson processes. Such a model is either a compound Poisson process itself or a process with an infinite number of small jumps. Later, in a series of now classical papers, the joint distribution of the time of ruin, the surplus before ruin, and the deficit at ruin was studied (Gerber and Shiu 1997, 1998a, 1998b; Gerber and Landry 1998). These works use the classical and the perturbed risk models and hint that the results can be extended to gamma and inverse Gaussian risk processes.

In this paper we work out this extension to a generalized risk model driven by a nondecreasing Lévy process. Unlike the classical case, which models the individual claim size distribution and obtains the aggregate claims distribution from it, here the aggregate claims distribution is known in closed form: it is simply the one-dimensional distribution of a subordinator. Embedded in this wide family of risk models we find the gamma, inverse Gaussian, and generalized inverse Gaussian processes. Expressions for the Gerber-Shiu function are given in some of these special cases, and numerical illustrations are provided.
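
For reference, the Gerber-Shiu expected discounted penalty function studied in this line of work is, in standard notation (not specific to this paper's generalization),

\phi(u) \;=\; \mathbb{E}\!\left[ e^{-\delta\tau}\, w\bigl(U(\tau^-),\,|U(\tau)|\bigr)\, \mathbf{1}_{\{\tau<\infty\}} \;\middle|\; U(0)=u \right],

where \tau is the ruin time of the surplus process U, \delta \ge 0 is a discount rate, U(\tau^-) is the surplus immediately before ruin, |U(\tau)| is the deficit at ruin, and w is a penalty function of these two quantities.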

8.
Quantitative Finance, 2013, 13(3): 373-382
In this paper we have analysed asset returns of the New York Stock Exchange and the Helsinki Stock Exchange using various time-independent models (normal, general stable, truncated Lévy, mixed diffusion-jump, compound normal, Student t and power exponential distributions) and the time-dependent GARCH(1, 1) model with Gaussian and Student t distributed innovations. In order to study changes of pattern at different event horizons, as well as changes of pattern over time for a given event horizon, we have analysed high-frequency or short-horizon intraday returns from 20-second intervals up to a full trading day, medium-frequency or medium-horizon daily returns, and low-frequency or long-horizon returns with holding periods of up to 30 days. As for changes of pattern over time, we found that for medium-frequency returns there are relatively long periods of business-as-usual when the return-generating process is well described by a normal distribution. We also found periods of ferment, when volatility grows and complex time dependences tend to emerge, but the known time dependences cannot explain the variability of the distribution. Such changes of pattern are also observed for high-frequency or short-horizon returns, with the exception that the return-generating process never becomes normal. It also turned out that the time dependence of the distribution shape is far more prominent at high frequencies or short horizons than the time dependence of the variance. For long-horizon or low-frequency returns, the distribution is found to converge towards a normal distribution, with the time dependences vanishing after a few days.
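
A sketch of the simplest model comparison in this spirit, fitting the normal and Student t candidates by maximum likelihood and comparing AIC; the synthetic series below is a placeholder for the NYSE/HEX return data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Placeholder "daily returns": in practice, load the actual return series instead
returns = stats.t.rvs(df=4, scale=0.01, size=2500, random_state=rng)

# Fit the normal and Student-t candidates by maximum likelihood
mu, sigma = stats.norm.fit(returns)
df_, loc, scale = stats.t.fit(returns)

ll_norm = stats.norm.logpdf(returns, mu, sigma).sum()
ll_t = stats.t.logpdf(returns, df_, loc, scale).sum()

# Higher log-likelihood wins once the extra parameter is penalized (AIC)
print(f"normal AIC: {2*2 - 2*ll_norm:.1f}, t AIC: {2*3 - 2*ll_t:.1f}")
```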

9.
Pricing distressed CDOs with stochastic recovery
In this article, a framework for the joint modelling of default and recovery risk in a portfolio of credit-risky assets is presented. The model accounts both for the correlation of defaults and for the correlation between default rates and recovery rates. Nested Archimedean copulas are used to model different dependence structures. For the recovery rates, a very flexible continuous distribution with bounded support is applied, which allows for efficient sampling of the loss process. Thanks to the relaxation of the constant 40% recovery assumption and the negative correlation of default rates and recovery rates, the model is especially suited for distressed market situations and the pricing of super senior tranches. A calibration to CDO tranche spreads of the European iTraxx portfolio is performed to demonstrate the fitting capability of the model. Applications to delta hedging as well as base correlations are presented.

10.
In the recent insurance literature, a variety of finite-dimensional parametric models have been proposed for analyzing the hump-shaped, heavy-tailed, and highly skewed loss data often encountered in applications. These parametric models are relatively simple, but they lack flexibility in the sense that an actuary analyzing a new dataset cannot be sure that any one of these parametric models will be appropriate. As a consequence, the actuary must make a non-trivial choice among a collection of candidate models, putting him/herself at risk of various model misspecification biases. In this paper, we argue that, at least in cases where prediction of future insurance losses is the ultimate goal, there is reason to consider a single but more flexible nonparametric model. We focus here on Dirichlet process mixture models, and we reanalyze several of the standard insurance datasets to support our claim that model misspecification biases can be avoided by taking a nonparametric approach, at little to no cost compared to existing parametric approaches.
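
A minimal sketch of a nonparametric fit in this spirit, using scikit-learn's truncated variational approximation to a Dirichlet process mixture of Gaussians on log losses; the data below are a synthetic placeholder, and the paper's inference scheme and datasets may differ:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(3)
# Placeholder heavy-tailed losses; in practice, use a standard insurance dataset
losses = rng.lognormal(mean=8.0, sigma=1.8, size=1500)
X = np.log(losses).reshape(-1, 1)      # fit the mixture on the log scale

# Truncated Dirichlet-process mixture of Gaussians (variational inference);
# surplus components get weights near zero, so the truncation level is generous
dpm = BayesianGaussianMixture(
    n_components=30,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
).fit(X)

active = dpm.weights_ > 1e-2
print(f"components effectively used: {active.sum()}")
```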

11.
The loss distribution approach is one of the three advanced measurement approaches to Pillar I modeling proposed by Basel II in 2001. In this paper, one possible approximation of the aggregate and maximum loss distribution in the extremely low frequency/high severity case is given, i.e., the case where both the loss sizes and the loss inter-arrival times have infinite mean. Independent but not identically distributed losses are considered, and the minimum loss amount is assumed to increase over time. A Monte Carlo simulation algorithm is presented and several quantiles are estimated. The same approximation is used for modeling the maximum and aggregate worldwide economic losses caused by very rare and very extreme events such as 9/11, the Russian rouble crisis, and the U.S. subprime mortgage crisis. The model parameters are fitted to a data sample of operational losses, and the respective aggregate and extremal loss quantiles are calculated.
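
A minimal Monte Carlo sketch of aggregate- and maximum-loss quantiles in the infinite-mean regime (i.i.d. Pareto severities with tail index alpha < 1 and a Poisson count; the paper's non-identically-distributed losses and increasing minimum loss are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(4)
n_sims, freq = 100_000, 3.0            # hypothetical: ~3 losses per period
alpha, xmin = 0.8, 1.0                 # Pareto tail index < 1 => infinite mean

agg = np.empty(n_sims)
mx = np.empty(n_sims)
for i in range(n_sims):
    n = rng.poisson(freq)
    # Pareto(alpha) via inverse transform: xmin * U**(-1/alpha) has tail x**-alpha
    x = xmin * rng.uniform(size=n) ** (-1.0 / alpha) if n else np.array([0.0])
    agg[i] = x.sum()
    mx[i] = x.max()

for q in (0.5, 0.9, 0.99, 0.999):
    print(q, np.quantile(agg, q), np.quantile(mx, q))
# For alpha < 1 the aggregate is dominated by the single largest loss,
# so high quantiles of the sum and of the maximum are close.
```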

12.
What is the catastrophe risk a life insurance company faces? What is the correct price of a catastrophe cover? During a review of the current standard model, due to Strickler, we found that this model has some serious shortcomings. We therefore present a new model for the pricing of catastrophe excess-of-loss cover (Cat XL). The new model for the annual claim cost C is based on a compound Poisson process of catastrophe costs. To evaluate the distribution of the cost of each catastrophe, we use the Peaks Over Threshold model for the total number of lost lives in each catastrophe and the beta-binomial model for the proportion of these corresponding to customers of the insurance company. To estimate the parameters of the model, international and Swedish data were collected and compiled, listing accidents claiming at least twenty and four lives, respectively. Fitting the new model to data, we find the fit to be good. Finally, we give the price of a Cat XL contract and perform a sensitivity analysis of how some of the parameters affect the expected value and standard deviation of the cost, and thus the price.
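
A compressed sketch of the pricing recipe described above (compound Poisson catastrophes, a generalized Pareto draw for lives lost above the threshold, a beta-binomial draw for the insurer's share of those lives, and an excess-of-loss layer), with every parameter value hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_years, cat_rate = 20_000, 0.8         # hypothetical catastrophe frequency
xi, beta, u = 0.5, 30.0, 20.0           # GPD for lives lost above 20 lives
a, b = 0.5, 200.0                       # beta-binomial insured-share parameters
sum_insured = 0.3                       # benefit per lost life (in MSEK, say)
deduct, limit = 5.0, 50.0               # Cat XL layer: 50 xs 5

annual = np.zeros(n_years)
for t in range(n_years):
    for _ in range(rng.poisson(cat_rate)):
        lives = int(u + stats.genpareto.rvs(xi, scale=beta, random_state=rng))
        insured = stats.betabinom.rvs(lives, a, b, random_state=rng)
        gross = insured * sum_insured
        annual[t] += min(max(gross - deduct, 0.0), limit)   # layer payout

print("pure premium:", annual.mean(), " std:", annual.std())
```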

13.
Up to the 2007 crisis, research within bottom-up CDO models mainly concentrated on the dependence between defaults. Since then, due to substantial increases in market prices of systemic credit risk protection, more attention has been paid to recovery rate assumptions. In this paper, we use stochastic orders theory to assess the impact of recovery on CDOs and show that, in a factor copula framework, a decrease of recovery rates leads to an increase of the expected loss on senior tranches, even though the expected loss on the portfolio is kept fixed. This result applies to a wide range of latent factor models and is not specific to the Gaussian copula model. We then suggest introducing stochastic recovery rates in such a way that the expected loss conditional on the factor (or, equivalently, the large portfolio approximation) is the same as in the recovery markdown case. However, granular portfolios behave differently. We show that a markdown is associated with riskier portfolios than under the stochastic recovery rate framework. As a consequence, the expected loss on a senior tranche is larger in the former case, whatever the attachment point. We also deal with implementation and numerical issues related to the pricing of CDOs within the stochastic recovery rate framework. Because the conditional (on the factor) losses given default differ across names, the standard recursion approach becomes problematic. We suggest approximating the conditional loss distributions through expansions around some base distribution. Finally, we show that the independence and comonotonic cases provide easy-to-compute bounds on the expected losses of senior or equity tranches.

14.
This paper suggests formulas able to capture potentially strong dependence among credit losses in downturns without assuming any specific distribution for the variables involved. We first show that the current model adopted by regulators (Basel) is equivalent to a conditional distribution derived from the Gaussian copula (which exhibits no tail dependence). We then use conditional distributions derived from copulas that do express tail dependence (stronger dependence across higher losses) to estimate the probability of credit losses in extreme scenarios (crises). Next, we use data on historical credit losses incurred by American banks to compare the suggested approach to the Basel formula with respect to their performance when predicting the extreme losses observed in 2009 and 2010. Our results indicate that the copula approach generally outperforms the Basel method in two of the three credit segments investigated. The proposed method extends to other differentiable copula families, which gives flexibility to future practical applications of the model.
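
The Basel benchmark referenced here is the Vasicek one-factor Gaussian-copula conditional default probability; a minimal implementation (the PD, correlation and confidence level below are illustrative):

```python
import numpy as np
from scipy.stats import norm

def basel_conditional_pd(pd, rho, q=0.999):
    """Vasicek one-factor Gaussian-copula stress PD used in the Basel IRB formula."""
    return norm.cdf((norm.ppf(pd) + np.sqrt(rho) * norm.ppf(q))
                    / np.sqrt(1.0 - rho))

# Illustrative: 1% PD, 15% asset correlation, 99.9% confidence
print(basel_conditional_pd(0.01, 0.15))   # stressed PD, roughly 0.11
```

The paper's alternative replaces the Gaussian copula in this conditional distribution with copula families that do exhibit tail dependence; the sketch above only reproduces the regulatory benchmark being compared against.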

15.
Reference dependence, loss aversion, and risk seeking for losses together comprise the preference-based component of prospect theory that sets its value function apart from the standard risk-aversion model. Using an elasticity analysis, we show that this distinctive preference component serves to underpin negative-feedback trading propensities, but cannot manifest itself in behavior directly or holistically at the individual-choice level. We then propose and demonstrate that the market interaction between prospect-theory investors and regular CRRA investors allows this preference component to dominate in equilibrium behavior and hence helps to reestablish the intuitive link between prospect-theory preferences and negative-feedback trading patterns. In the model, the interaction also reconciles the contrarian behavior of prospect-theory investors with asymmetric volatility and short-term return reversal. The results suggest that prospect-theory preferences can lead investors to behave endogenously as contrarian noise traders in the market interaction process.
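
The preference component in question is conventionally summarized by the Tversky-Kahneman value function; a one-line sketch with their classic parameter estimates (illustrative only; the paper's specification may differ):

```python
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman value function: concave for gains, convex and steeper
    (loss-averse) for losses, measured relative to a reference point at zero."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

for dx in (100.0, -100.0):
    print(dx, pt_value(dx))   # losses loom about 2.25x larger than equal gains
```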

16.
This paper provides a large-deviations approximation of the tail distribution of total financial losses on a portfolio consisting of many positions. Applications include the total default losses on a bank portfolio, or the total claims against an insurer. The results may be useful in allocating exposure limits, and in allocating risk capital across different lines of business. Assuming that, for a given total loss, the distress caused by the loss is larger if the loss occurs within a smaller time period, we provide a large-deviations estimate of the likelihood that there will exist a sub-period of the future planning period during which a total loss of the critical severity occurs. Under conditions, this calculation reduces to the calculation of the likelihood of the same-sized loss over an initial time interval whose length is a property of the portfolio and the critical loss level.
Received: March 2003. Mathematics Subject Classification: 60F10, 91B28. JEL Classification: G21, G22, G33.
Amir Dembo is with the Department of Statistics, Stanford University. His research was partially supported by NSF grant #DMS-0072331. Jean-Dominique Deuschel is with the Department of Mathematics, Technische Universität, Berlin. His research was partially supported by DFG grant #663/2-3 and DFG FZT 86. Darrell Duffie is with the Graduate School of Business, Stanford University. We are extremely grateful for research assistance by Nicolae Gârleanu and Gustavo Manso, for conversations with Michael Gordy, and for comments from Michael Stutzer, Peter Carr, David Heath, and David Siegmund.

17.
ABSTRACT

This paper considers a Cramér–Lundberg risk setting in which the components of the underlying model change over time. We allow the more general setting of the cumulative claim process being modeled as a spectrally positive Lévy process. We provide an intuitively appealing mechanism to create such parameter uncertainty: at Poisson epochs, we resample the model components from a finite number d of settings. The result is a setup particularly suited to describing situations in which the risk reserve dynamics are affected by external processes. We extend the classical Cramér–Lundberg approximation (asymptotically characterizing the all-time ruin probability in a light-tailed setting) to this more general setup. In addition, for the situation in which the driving Lévy processes are sums of Brownian motions and compound Poisson processes, we find an explicit uniform bound on the ruin probability. In passing, we propose an importance-sampling algorithm facilitating efficient estimation, and prove that it has bounded relative error. In a series of numerical experiments we assess the accuracy of the asymptotics and bounds, and illustrate that neglecting the resampling can lead to substantial underestimation of the risk.
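
A minimal Monte Carlo sketch of the resampling mechanism for a compound Poisson reserve with exponential claims (finite horizon, two illustrative parameter settings; the paper's general Lévy setting, asymptotics and importance-sampling refinement are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(6)
# Illustrative settings: (premium rate c, claim rate lam, mean claim size mu)
settings = [(1.5, 1.0, 1.0), (2.5, 2.0, 1.1)]
q = 0.2                                   # Poisson rate of parameter resampling
u0, horizon, n_paths = 10.0, 50.0, 20_000

ruined = 0
for _ in range(n_paths):
    t, u = 0.0, u0
    c, lam, mu = settings[rng.integers(len(settings))]
    while t < horizon:
        # Competing exponential clocks: next claim vs next resampling epoch
        t_claim = rng.exponential(1.0 / lam)
        t_swap = rng.exponential(1.0 / q)
        dt = min(t_claim, t_swap, horizon - t)
        u += c * dt                       # premiums accrue continuously
        t += dt
        if dt == t_claim:
            u -= rng.exponential(mu)      # exponential claim size
            if u < 0:
                ruined += 1
                break
        elif dt == t_swap:
            c, lam, mu = settings[rng.integers(len(settings))]

print("finite-horizon ruin probability estimate:", ruined / n_paths)
```

Freezing the parameters at a single setting (q = 0) in this sketch illustrates the paper's warning: neglecting the resampling can substantially understate the ruin probability.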

18.
The increased trading in multi-name financial products has required the development of state-of-the-art multivariate models. These models should be computationally tractable and, at the same time, flexible enough to explain the stylized facts of asset log-returns and of their dependence structure. The popular class of multivariate Lévy models provides a variety of tractable models, but suffers from one major shortcoming: Lévy models can replicate single-name derivative prices for a given time-to-maturity, but not for the whole range of quoted strikes and maturities, especially during periods of market turmoil. Moreover, there is a significant discrepancy between the moment term structure of Lévy models and the one observed in the market. Sato processes, on the other hand, exhibit a moment term structure that is more in line with empirical evidence and allow for a better replication of single-name option price surfaces. In this paper, we propose a general framework for multivariate models characterized by independent and time-inhomogeneous increments, where the asset log-return processes at unit time are modeled as linear combinations of independent self-decomposable random variables, at least one of which is shared by all the assets. As examples, we consider two general subclasses within this new framework, where we assume a normal variance-mean mixture with a one-sided tempered stable mixing density or a difference of one-sided tempered stable laws for the distribution of the risk factors. Particular attention is given to the models' ability to explain the asset dependence structure. A numerical study reveals the advantages of these new types of models.

19.
In this paper, we consider a Sparre Andersen risk model perturbed by a spectrally negative Lévy process (SNLP). Assuming that the interclaim times follow a Coxian distribution, we show that the Laplace transforms and defective renewal equations for the Gerber–Shiu functions can be obtained by employing the roots of a generalized Lundberg equation. When the SNLP is a combination of a Brownian motion and a compound Poisson process with exponential jumps, explicit expressions and asymptotic formulas for the Gerber–Shiu functions are obtained for exponential claim size distribution and heavy-tailed claim size distribution, respectively.

20.
We contribute to the empirical literature on household finances by introducing a Bayesian multivariate two-part model developed to further our understanding of household finances. Our flexible approach allows for the potential interdependence between the holding of assets and liabilities at the household level, and it encompasses a two-part process to allow for differences between the influences on whether an asset or liability is held and those on the amount held. Furthermore, the framework is dynamic, allowing for persistence in household finances over time. Our findings endorse the joint modelling approach and provide evidence supporting the importance of dynamics. In addition, we find that certain independent variables exert different influences on the binary and continuous parts of the model, thereby highlighting the flexibility of our framework and revealing a detailed picture of the nature of household finances.
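
A static, non-Bayesian sketch of the two-part structure (a logit for whether a household holds the asset, and a regression on log amounts for holders) on simulated covariates; the paper's dynamic Bayesian multivariate version goes well beyond this:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 5000
income = rng.normal(0, 1, n)            # standardized covariates (hypothetical)
age = rng.normal(0, 1, n)
X = sm.add_constant(np.column_stack([income, age]))

# Simulate a two-part process: whether a household holds an asset, then how much
holds = rng.uniform(size=n) < 1 / (1 + np.exp(-(0.2 + 0.8 * income + 0.3 * age)))
log_amt = 1.0 + 0.5 * income + 0.1 * age + rng.normal(0, 0.7, n)

part1 = sm.Logit(holds.astype(float), X).fit(disp=0)   # holding decision
part2 = sm.OLS(log_amt[holds], X[holds]).fit()         # amount, holders only
print(part1.params, part2.params, sep="\n")
# The two parts can have different coefficients, the point the abstract makes
# about covariates influencing holding and amount differently.
```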
