Similar Articles
20 similar articles found.
1.
We study a family of distributions generated from multiply monotone functions that includes a multivariate Pareto and a previously unidentified exponential-Pareto distribution. We utilize an established link with Archimedean survival copulas to provide further examples, including a multivariate Weibull distribution, that may be used to fit light- or heavy-tailed phenomena, and which exhibit various forms of dependence, ranging from positive to negative. Because the model is intended for the study of joint lifetimes, we consider the effect of truncation and formulate properties required for a number of parameter estimation procedures based on moments and quantiles. For the quantile-based estimation procedure applied to the multivariate Weibull distribution, we also address the problem of optimal quantile selection.
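
A minimal sketch of the Archimedean-survival-copula route to dependent Weibull lifetimes mentioned above, using the Marshall-Olkin frailty construction with a Clayton copula as a stand-in; the copula family, parameters and margins are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(9)
a = 2.0                                        # Clayton dependence parameter (illustrative)
n, d = 50_000, 2
v = rng.gamma(1.0 / a, 1.0, size=(n, 1))       # gamma frailty shared within each row
e = rng.exponential(1.0, size=(n, d))
u = (1.0 + e / v) ** (-1.0 / a)                # Marshall-Olkin: Clayton-copula uniforms
k, lam = 1.5, 1.0                              # Weibull shape and scale for the margins
lifetimes = lam * (-np.log(u)) ** (1.0 / k)    # treat u as survival probs -> Weibull margins
print(np.corrcoef(lifetimes, rowvar=False)[0, 1])
```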

2.
A simple option-pricing formula based on the Weibull distribution is introduced. The simplicity of the algebraic form and ease of implementation are comparable to those of Black-Scholes. Application to S&P 500 options shows that the pricing biases present in the Black-Scholes model are eliminated. Prices produced by the presented model generally lie within or close to the bid-ask spread. For long-term options (over one year), the Weibull formula exhibits significantly higher precision than the Black-Scholes formula does. While a rigorous comparison of all available models is necessary, the simplicity and precision of the proposed model are its main advantages over the existing models.
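
The abstract does not reproduce the paper's closed-form expression, but the idea can be sketched numerically: take the terminal price to be Weibull-distributed under the pricing measure, calibrate the scale so the mean matches the forward, and integrate the discounted payoff. Everything below, including the calibration choice, is an assumption made for illustration:

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.special import gamma as gamma_fn
from scipy.integrate import quad

def weibull_call(S0, K, r, T, k):
    """Price a European call assuming S_T ~ Weibull(shape=k, scale=lam)."""
    forward = S0 * np.exp(r * T)
    lam = forward / gamma_fn(1.0 + 1.0 / k)   # calibration guess: E[S_T] = forward
    payoff = lambda s: (s - K) * weibull_min.pdf(s, k, scale=lam)
    value, _ = quad(payoff, K, np.inf)        # integrate the in-the-money region
    return np.exp(-r * T) * value

print(weibull_call(S0=100.0, K=105.0, r=0.02, T=1.0, k=2.0))
```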

3.
In this paper, Bayesian methods with both Jeffreys and conjugate priors for estimating parameters of the lognormal–Pareto composite (LPC) distribution are considered. With the Jeffreys prior, the posterior distributions for parameters of interest are derived and their properties are described. The conjugate priors are proposed and the conditional posterior distributions are provided. In addition, simulation studies are performed to obtain the upper percentage points of the Kolmogorov–Smirnov and Anderson–Darling test statistics. Furthermore, these statistics are used to compare Bayesian and likelihood estimators. In order to assess the validity of Bayesian and likelihood estimators of the LPC distribution, the well-known Danish fire insurance data set is reanalyzed.

4.
In this paper we develop several composite Weibull-Pareto models and suggest their use to model loss payments and other forms of actuarial data. These models all comprise a Weibull distribution up to a threshold point, and some form of Pareto distribution thereafter. They are similar in spirit to some composite lognormal-Pareto models that have previously been considered in the literature. All of these models are applied, and their performance compared, in the context of a real-world fire insurance data set.
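
A generic sketch of the splicing these models share: a Weibull body below a threshold and a Pareto tail above it, with the mixing weight chosen so the density is continuous at the join. The papers' specific smoothness constraints and parameterisations may differ:

```python
import numpy as np
from scipy.stats import weibull_min

def composite_pdf(x, k, lam, theta, alpha):
    """Weibull head (shape k, scale lam) up to theta, Pareto(alpha) tail after."""
    head_pdf = weibull_min.pdf(theta, k, scale=lam)
    head_cdf = weibull_min.cdf(theta, k, scale=lam)
    tail_hazard = alpha / theta                      # Pareto(alpha, scale=theta) density at theta
    # Mixing weight r chosen so the spliced density is continuous at theta.
    r = tail_hazard / (tail_hazard + head_pdf / head_cdf)
    x = np.asarray(x, dtype=float)
    head = r * weibull_min.pdf(x, k, scale=lam) / head_cdf
    tail = (1 - r) * alpha * theta**alpha / x**(alpha + 1)
    return np.where(x <= theta, head, tail)

print(composite_pdf([0.5, 1.0, 2.0, 5.0], k=1.5, lam=1.0, theta=2.0, alpha=2.5))
```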

5.
Recently, Cooray & Ananda (2005) proposed a composite lognormal-Pareto model for use with loss payments data of the sort arising in the actuarial and insurance industries. Their model is based on a lognormal density up to an unknown threshold value and a two-parameter Pareto density thereafter. Here we identify and discuss limitations of this composite lognormal-Pareto model which are likely to severely curtail its potential for practical application to real-world data sets. In addition, we present two different composite models based on lognormal and Pareto models in order to address these concerns. The performance of all three composite models is discussed and compared in the context of an example based upon a well-known fire insurance data set.

6.
Credibility is a form of insurance pricing that is widely used, particularly in North America. The theory of credibility has been called a “cornerstone” in the field of actuarial science. Students of the North American actuarial bodies also study loss distributions, the process of relating a set of data to a theoretical (loss) distribution by statistical inference. In this work, we develop a direct link between credibility and loss distributions through the notion of a copula, a tool for understanding relationships among multivariate outcomes.

This paper develops credibility using a longitudinal data framework. In a longitudinal data framework, one might encounter data from a cross section of risk classes (towns) with a history of insurance claims available for each risk class. For the marginal claims distributions, we use generalized linear models, an extension of linear regression that also encompasses Weibull and Gamma regressions. Copulas are used to model the dependencies over time; specifically, this paper is the first to propose using a t-copula in the context of generalized linear models. The t-copula is the copula associated with the multivariate t-distribution; like the univariate t-distributions, it seems especially suitable for empirical work. Moreover, we show that the t-copula gives rise to easily computable predictive distributions that we use to generate credibility predictors. Like Bayesian methods, our copula credibility prediction methods allow us to provide an entire distribution of predicted claims, not just a point prediction.

We present an illustrative example of Massachusetts automobile claims, and compare our new credibility estimates with those currently existing in the literature.
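
A minimal simulation sketch of the t-copula mechanics described above, with gamma claim margins standing in for the paper's GLM marginals; all parameters are illustrative:

```python
import numpy as np
from scipy.stats import multivariate_t, t, gamma

rng = np.random.default_rng(0)
nu = 5.0                                    # t-copula degrees of freedom
corr = np.array([[1.0, 0.6], [0.6, 1.0]])   # copula correlation matrix

z = multivariate_t.rvs(loc=[0.0, 0.0], shape=corr, df=nu, size=10_000,
                       random_state=rng)
u = t.cdf(z, df=nu)                         # uniforms carrying t-copula dependence
claims = gamma.ppf(u, a=2.0, scale=500.0)   # gamma claim margins (illustrative)
print(np.corrcoef(claims, rowvar=False))
```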

7.
We investigate alternative unconditional and conditional distributional models for the returns on Japan's Nikkei 225 stock market index. Among them is the recently introduced class of ARMA-GARCH models driven by α-stable (or stable Paretian) distributed innovations, designed to capture the observed serial dependence, conditional heteroskedasticity and fat-tailedness present in the return data. Of the eight entertained distributions, the partially asymmetric Weibull, Student's t and asymmetric α-stable present themselves as the most viable candidates in terms of overall fit. However, the tails of the sample distribution are approximated best by the asymmetric α-stable distribution. Good tail approximations are particularly important for risk assessments.
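
To make the stable-GARCH idea concrete, here is a toy simulation of a GARCH-type scale recursion driven by standardized α-stable innovations; since α-stable innovations have infinite variance, the recursion below uses absolute residuals (a power-one specification), which may differ from the paper's exact model:

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(8)
alpha, beta = 1.8, -0.1                      # stability and skewness (illustrative)
a0, a1, b1 = 0.01, 0.1, 0.85                 # scale-recursion coefficients
n = 1000
z = levy_stable.rvs(alpha, beta, size=n, random_state=rng)
r = np.empty(n)
sigma = np.empty(n)
sigma[0], r[0] = 0.01, 0.0
for t in range(1, n):
    # conditional scale driven by |return|, not squared return (variance is infinite)
    sigma[t] = a0 + a1 * abs(r[t - 1]) + b1 * sigma[t - 1]
    r[t] = sigma[t] * z[t]
print(r.std(), np.abs(r).max())              # sample moments of the simulated path
```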

8.
This paper seeks to characterise the distribution of extreme returns for a UK share index over the years 1975 to 2000. In particular, the suitability of the following distributions is investigated: Gumbel, Fréchet, Weibull, Generalised Extreme Value, Generalised Pareto, Log-Normal and Generalised Logistic. Daily returns for the FT All Share index were obtained from Datastream, and the maxima and minima of these daily returns over a variety of selection intervals were calculated. Plots of summary statistics for the weekly maxima and minima on statistical distribution maps suggested that the best-fitting distribution would be either the Generalised Extreme Value or the Generalised Logistic. The results from fitting each of these two distributions to extremes of a series of UK share returns support the conclusion that the Generalised Logistic distribution best fits the UK data for extremes over the period of the study. The Generalised Logistic distribution has fatter tails than either the Log-Normal or the Generalised Extreme Value distribution; hence this finding is of importance to investors who are concerned with assessing the risk of a portfolio.
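
The block-maxima step can be sketched as follows: form weekly maxima of daily returns and fit a GEV by maximum likelihood. The paper's preferred Generalised Logistic fit is omitted (scipy's genlogistic is a different family), and the returns below are synthetic placeholders for the FT All Share series:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
daily = rng.standard_t(df=4, size=5 * 252) * 0.01    # placeholder daily returns
weekly_max = daily[: len(daily) // 5 * 5].reshape(-1, 5).max(axis=1)

shape, loc, scale = genextreme.fit(weekly_max)       # ML fit of the GEV to block maxima
print(f"GEV fit: shape={shape:.3f}, loc={loc:.4f}, scale={scale:.4f}")
```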

9.
This paper is concerned with modelling the behaviour of random sums over time. Such models are particularly useful to describe the dynamics of operational losses, and to correctly estimate tail-related risk indicators. However, time-varying dependence structures make it a difficult task. To tackle these issues, we formulate a new Markov-switching generalized additive compound process combining Poisson and generalized Pareto distributions. This flexible model takes into account two important features: on the one hand, we allow all parameters of the compound loss distribution to depend on economic covariates in a flexible way. On the other hand, we allow this dependence to vary over time, via a hidden state process. A simulation study indicates that, even in the case of a short time series, this model is easily and well estimated with a standard maximum likelihood procedure. Relying on this approach, we analyse a novel data set of 819 losses resulting from frauds at the Italian bank UniCredit. We show that our model improves the estimation of the total loss distribution over time, compared to standard alternatives. In particular, this model provides estimates of the 99.9% quantile that are never exceeded by the historical total losses, a feature particularly desirable for banking regulators.

10.
Addressing the shortcomings of traditional heavy-tailed distributions such as the Gamma, Lognormal and Weibull in fitting catastrophe risk, this paper, on the one hand, analyses theoretically the POT model and its relative advantages in fitting heavy-tailed catastrophe risk, and on the other hand applies the POT model and the GPD distribution to fit data on direct economic losses from earthquakes in China between 1952 and 2008, finding that the POT model fits the heavy tail of catastrophe risk better than the Gamma, Lognormal...
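
The POT step can be sketched as follows: fix a high threshold, fit a GPD to the excesses, and read off a tail quantile. The data here are synthetic placeholders, not the paper's earthquake-loss series:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
losses = rng.pareto(1.5, size=2000) * 10.0          # heavy-tailed placeholder data
u = np.quantile(losses, 0.95)                       # threshold choice (95th percentile)
excesses = losses[losses > u] - u

xi, _, beta = genpareto.fit(excesses, floc=0.0)     # fix location at 0 for excesses
p_exceed = np.mean(losses > u)                      # empirical exceedance probability
q = 0.999                                           # tail quantile of interest
var_999 = u + beta / xi * (((1 - q) / p_exceed) ** (-xi) - 1)
print(f"GPD shape={xi:.3f}, scale={beta:.3f}, 99.9% quantile ~ {var_999:.1f}")
```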

11.
In recent years, several composite models based on the lognormal distribution have been developed for the Danish fire insurance data. In this note, we propose new composite models based on the lognormal distribution. At least one of the newly proposed models is shown to give a better fit to the Danish fire insurance data.

12.
We consider a classical risk model with the possibility of investment and a positive interest rate for the riskless bond. The stock price movement is modelled as a geometric Brownian motion, and the claim sizes are assumed to have a distribution belonging to a certain subclass of subexponential distributions. In this setting, we study the asymptotic behaviour of the optimal investment strategy under the ruin probability as a risk measure. This problem has already been considered before, but no results were obtained, for instance, for Weibull and Benktander-type-II distributions with certain parameters. We introduce a method which closes this gap.

13.
Composite models have a long history in actuarial science because they provide a flexible method of curve-fitting for heavy-tailed insurance losses. The ongoing research in this area continuously suggests methodological improvements for existing composite models and considers new composite models. A number of different composite models have been previously proposed in the literature to fit the popular data set related to Danish fire losses. This paper provides the most comprehensive analysis of composite loss models on the Danish fire losses data set to date by evaluating 256 composite models derived from 16 parametric distributions that are commonly used in actuarial science. Estimating these composite models involves inevitable computational challenges that, if not suitably addressed, may lead to sub-optimal solutions. General implementation strategies are developed for parameter estimation in order to arrive at an automatic way to reach a viable solution, regardless of the specific head and/or tail distributions specified. The results lead to an identification of new well-fitting composite models and provide valuable insights into the selection of certain composite models for which the tail-evaluation measures can be useful in making risk management decisions.

14.
Systematic longevity risk is increasingly relevant for public pension schemes and insurance companies that provide life benefits. In view of this, mortality models should incorporate dependence between lives. However, the independent lifetime assumption is still heavily relied upon in the risk management of life insurance and annuity portfolios. This paper applies a multivariate Tweedie distribution to incorporate dependence, which it induces through a common shock component. Model parameter estimation is developed based on the method of moments and generalized to allow for truncated observations. The estimation procedure is explicitly developed for various important distributions belonging to the Tweedie family, and finally assessed using simulation.
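
A sketch of the common-shock idea using the gamma member of the Tweedie family: with a shared scale parameter, sums of independent gammas are again gamma, so adding a common component to each life induces positive dependence while preserving gamma margins. Parameters are illustrative, and the paper's construction covers other Tweedie members as well:

```python
import numpy as np

rng = np.random.default_rng(7)
theta = 2.0                                   # common scale (required for gamma closure)
a0, a1, a2 = 1.5, 3.0, 4.0                    # shapes of shock and idiosyncratic parts
y0 = rng.gamma(a0, theta, size=100_000)       # common shock component
x1 = y0 + rng.gamma(a1, theta, size=100_000)  # marginally gamma(a0 + a1, theta)
x2 = y0 + rng.gamma(a2, theta, size=100_000)  # marginally gamma(a0 + a2, theta)
# theoretical correlation: a0 / sqrt((a0 + a1) * (a0 + a2)) ~ 0.30
print(np.corrcoef(x1, x2)[0, 1])
```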

15.
We introduce a jump-diffusion model for asset returns with jumps drawn from a mixture of normal distributions and show that this model adequately fits the historical data of the S&P500 index. We consider a delta-hedging strategy (DHS) for vanilla options under the diffusion model (DM) and the proposed jump-diffusion model (JDM), assuming discrete trading intervals and transaction costs, and derive an approximation for the probability density function (PDF) of the profit-and-loss (P&L) of the DHS under both models. We find that, under the log-normal model of Black–Scholes–Merton, the actual PDF of the P&L can be well approximated by the chi-squared distribution with specific parameters. We derive an approximation for the P&L volatility in the DM and JDM. We show that, under both DM and JDM, the expected loss due to transaction costs is inversely proportional to the square root of the hedging frequency. We apply mean–variance analysis to find the optimal hedging frequency given the hedger's risk tolerance. Since under the JDM it is impossible to reduce the P&L volatility by increasing the hedging frequency, we consider an alternative hedging strategy, following which the P&L volatility can be reduced by increasing the hedging frequency.
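
A self-contained Monte Carlo sketch of the discrete delta-hedging P&L under the diffusion model with proportional transaction costs; the jump-diffusion extension and the chi-squared approximation are not reproduced, and the cost level and other parameters are illustrative:

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, r, sigma, tau):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    return S * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d2)

def bs_delta(S, K, r, sigma, tau):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    return norm.cdf(d1)

def hedge_pnl(n_steps, n_paths=20_000, S0=100.0, K=100.0, r=0.0,
              sigma=0.2, T=0.25, cost=0.0005, seed=3):
    """P&L of selling one call and delta-hedging at n_steps equal intervals."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, S0)
    delta = bs_delta(S, K, r, sigma, T)
    cash = bs_call(S, K, r, sigma, T) - delta * S - cost * np.abs(delta) * S
    for i in range(1, n_steps + 1):
        S = S * np.exp((r - 0.5 * sigma**2) * dt
                       + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))
        cash = cash * np.exp(r * dt)
        if i < n_steps:                               # rebalance at interior dates
            new_delta = bs_delta(S, K, r, sigma, T - i * dt)
            trade = new_delta - delta
            cash -= trade * S + cost * np.abs(trade) * S
            delta = new_delta
    cash += delta * S - cost * np.abs(delta) * S      # unwind the stock position
    return cash - np.maximum(S - K, 0.0)              # settle the short call

for n in (13, 52):   # weekly vs daily rebalancing over a quarter
    p = hedge_pnl(n)
    print(f"n_steps={n:3d}: mean={p.mean():+.3f}, std={p.std():.3f}")
```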

16.
We propose a model for price formation in financial markets based on the clearing of a standard call auction with random orders, and statistically verify its validity for predicting the daily closing price distribution. The model considers random buy and sell orders, placed employing demand- and supply-side valuation distributions; an equilibrium equation then leads to a distribution for the clearing price and transacted volume. Bid and ask volumes are left as free parameters, permitting possibly heavy-tailed or very skewed order flow conditions. In highly liquid auctions, the clearing price distribution converges to an asymptotically normal central limit, with mean and variance in terms of the supply/demand-valuation distributions and order flow imbalance. By means of simulations, we illustrate the influence of variations in order flow and valuation distributions on price/volume, noting a distinction between high- and low-volume auction price variance. To verify the validity of the model statistically, we predict a year's worth of daily closing price distributions for five constituents of the Eurostoxx 50 index; Kolmogorov–Smirnov statistics and QQ-plots demonstrate with ample statistical significance that the model predicts closing price distributions accurately, and compares favourably with alternative methods of prediction.
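
A toy sketch of the clearing mechanism: buyers and sellers draw limit prices from demand- and supply-side valuation distributions, and the clearing price is the level at which cumulative demand meets cumulative supply. Distributions and order counts are illustrative, not the paper's calibrated inputs:

```python
import numpy as np

rng = np.random.default_rng(4)
buy_vals = rng.normal(100.5, 2.0, size=800)     # buyers' limit prices (demand side)
sell_vals = rng.normal(99.5, 2.0, size=750)     # sellers' limit prices (supply side)

grid = np.linspace(90.0, 110.0, 2001)
demand = (buy_vals[None, :] >= grid[:, None]).sum(axis=1)   # buyers willing at price p
supply = (sell_vals[None, :] <= grid[:, None]).sum(axis=1)  # sellers willing at price p
i = np.argmin(np.abs(demand - supply))                      # demand/supply crossing
print(f"clearing price ~ {grid[i]:.2f}, volume ~ {min(demand[i], supply[i])}")
```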

17.
Modifying the distributional assumptions of the Black-Scholes model is one way to accommodate the skewness of underlying asset returns. Simple models based on the compensated gamma and Weibull distributions of asset prices are shown to produce some improvements in option pricing. To evaluate these assertions, I construct and compare delta hedges of all S&P 500 options traded on the Chicago Board Options Exchange between September 2001 and October 2003 for the Weibull, Black-Scholes, and gamma models. I also compare implied volatilities and their smiles (i.e., nonlinearities) among the three models. None of the three models improves over the others as far as delta hedging is concerned. Volatilities implied by all three models exhibit statistically significant smiles.

18.
A rich variety of probability distributions has been proposed in the actuarial literature for fitting insurance loss data. Examples include: lognormal, log-t, various versions of Pareto, loglogistic, Weibull, gamma and its variants, and generalized beta of the second kind distributions, among others. In this paper, we supplement the literature by adding the log-folded-normal and log-folded-t families. Shapes of the density function and key distributional properties of the ‘folded’ distributions are presented along with three methods for the estimation of parameters: method of maximum likelihood; method of moments; and method of trimmed moments. Further, large- and small-sample properties of these estimators are studied in detail. Finally, we fit the newly proposed distributions to data which represent the total damage done by 827 fires in Norway for the year 1988. The fitted models are then employed in a few quantitative risk management examples, where point and interval estimates for several value-at-risk measures are calculated.
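
Reading the name literally, a log-folded-normal variable can be built as X = exp(|Z|) with Z normal, so that log X is folded normal and the support is x ≥ 1. The sketch below follows that reading; the paper's exact parameterisation may differ:

```python
import numpy as np
from scipy.stats import foldnorm

mu, sigma = 1.0, 0.8
rng = np.random.default_rng(5)
x = np.exp(np.abs(rng.normal(mu, sigma, size=100_000)))   # sample X = exp(|Z|)

def log_folded_normal_pdf(x, mu, sigma):
    # change of variables: f_X(x) = f_{|Z|}(log x) / x, using scipy's foldnorm
    return foldnorm.pdf(np.log(x), c=mu / sigma, scale=sigma) / x

print(x.mean(), log_folded_normal_pdf(np.array([1.5, 3.0, 10.0]), mu, sigma))
```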

19.
In this paper, we provide three equivalent expressions for ruin probabilities in a Cramér–Lundberg model with gamma distributed claims. The results are solutions of integro-differential equations, derived by means of (inverse) Laplace transforms. All three formulas have infinite series forms, two involving Mittag–Leffler functions and the third one involving moments of the claims distribution. This last result applies to any other claim size distribution that exhibits finite moments.
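
A crude finite-horizon Monte Carlo check against which such formulas can be compared (the paper's series give the infinite-time probability); all parameters are illustrative, with the premium rate chosen above the expected claim outflow:

```python
import numpy as np

rng = np.random.default_rng(6)

def ruin_prob(u=10.0, c=1.2, lam=1.0, shape=2.0, scale=0.5,
              horizon=200.0, n_paths=5_000):
    """Cramér-Lundberg surplus u + c*t - sum of gamma claims, Poisson arrivals."""
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while True:
            w = rng.exponential(1.0 / lam)           # next inter-arrival time
            t += w
            if t > horizon:
                break
            surplus += c * w - rng.gamma(shape, scale)
            if surplus < 0:                          # ruin can only occur at claim epochs
                ruined += 1
                break
    return ruined / n_paths

print(ruin_prob())
```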

20.
We propose a dividend stock valuation model where multiple dividend growth series and their dependencies are modelled using a multivariate Markov chain. Our model advances existing Markov chain stock models. First, we determine assumptions that guarantee the finiteness of the price and risk as well as the fulfilment of transversality conditions. Then, we compute the first- and second-order price-dividend ratios by solving corresponding linear systems of equations and show that a different price-dividend ratio is attached to each combination of states of the dividend growth process of each stock. Subsequently, we provide a formula for the computation of the variances and covariances between stocks in a portfolio. Finally, we apply the theoretical model to the dividend series of three US stocks and perform comparisons with existing models. The results could also be applied for actuarial purposes as a general stochastic investment model and for calculating the initial endowment to fund a portfolio of dependent perpetuities.
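
To illustrate the kind of linear system involved, here is the single-stock analogue: with transition matrix P, gross dividend growth g per state and discount factor β, the first-order price-dividend ratio ψ solves ψ = M(1 + ψ) with M = β P diag(g), which is finite when the spectral radius of M is below 1. The paper's multivariate system couples several such chains; the numbers below are illustrative:

```python
import numpy as np

beta = 0.97                                   # one-period discount factor
P = np.array([[0.8, 0.2],                     # dividend-growth state transitions
              [0.3, 0.7]])
g = np.array([1.04, 0.98])                    # gross dividend growth per state

M = beta * P * g[None, :]                     # M_ij = beta * P_ij * g_j
assert np.max(np.abs(np.linalg.eigvals(M))) < 1.0, "price would be infinite"
psi = np.linalg.solve(np.eye(2) - M, M @ np.ones(2))   # price-dividend ratios
print(psi)
```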
