Similar Literature
20 similar documents retrieved.
1.
This article offers an alternative proof of the capital asset pricing model (CAPM) when asset returns follow a multivariate elliptical distribution. Empirical studies continue to demonstrate the inappropriateness of the normality assumption for modeling asset returns. The class of elliptically contoured distributions, which includes the more familiar Normal distribution, provides flexibility in modeling the thickness of tails associated with the possibility that asset returns take extreme values with nonnegligible probabilities. As summarized in this article, this class preserves several properties of the Normal distribution. Within this framework, we prove a new version of Stein's lemma for this class of distributions and use this result to derive the CAPM when returns are elliptical. Furthermore, using the probability distortion function approach based on the dual utility theory of choice under uncertainty, we also derive an explicit closed-form solution for call option prices when the underlying is log-elliptically distributed. The Black–Scholes call option price is a special case of this general result when the underlying is log-normally distributed.
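The log-normal special case mentioned at the end of the abstract reduces to the familiar Black–Scholes formula, which can be sketched in a few lines (a minimal illustration of the limiting case only, not of the paper's elliptical generalization; parameter values are arbitrary):

```python
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call on a log-normally distributed underlying."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# At-the-money call: S = K = 100, r = 5%, sigma = 20%, one year to maturity
price = bs_call(S=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)
```

For these inputs the price is roughly 10.45, comfortably above the no-arbitrage lower bound S - K·e^(-rT).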

2.
Parameter Estimation of the Generalized Pareto Distribution for Operational Risk Losses and Its Application
Extreme value theory shows that observations exceeding a sufficiently high threshold follow a generalized Pareto distribution, a result with wide application in financial risk measurement and actuarial science. Its application has been limited, however, by the lack of a generally accepted method for estimating the distribution's parameters. Building on a derivation of the conditional moments of the generalized Pareto distribution, this paper studies parameter estimation of the generalized Pareto distribution for operational risk losses. A numerical example of economic capital allocation is then presented, based on operational risk loss data from Chinese commercial banks for 1994–2008.
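The peaks-over-threshold step described above can be sketched with `scipy` (a minimal sketch on synthetic heavy-tailed losses, standing in for the bank loss data used in the paper; the threshold choice and tail index are illustrative assumptions):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Synthetic heavy-tailed operational losses (Lomax/Pareto-type, tail index 2)
losses = rng.pareto(2.0, 5000)

u = np.quantile(losses, 0.95)          # a high threshold
excesses = losses[losses > u] - u      # peaks over threshold
# Extreme value theory: excesses over a high threshold are approximately GPD
xi, loc, beta = genpareto.fit(excesses, floc=0)
```

A positive fitted shape parameter `xi` signals a heavy tail; for Pareto-type data with tail index 2 it should land near 0.5.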

3.
This paper empirically tests Hong and Stein's theoretical finding that, in an environment of short-sale constraints, investor disagreement over future equity prices leads to negatively skewed return distributions. This study uses data from the Indian equity market to examine the third and fourth moments of the return distribution. The skewness of the return distribution is estimated both from realized returns and from option prices. Empirical results provide partial supportive evidence for Hong and Stein's hypothesis.
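The realized-return side of the estimation amounts to computing sample third and fourth standardized moments (a sketch on synthetic returns with occasional large negative shocks; the option-implied estimator is not reproduced here):

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(1)
# Synthetic daily returns: Gaussian noise minus a heavy-tailed negative shock
returns = 0.01 * rng.standard_normal(2000) - 0.005 * rng.lognormal(0.0, 1.0, 2000)

s = skew(returns)        # third standardized moment: negative = left-skewed
k = kurtosis(returns)    # excess kurtosis: positive = fatter tails than normal
```

For this construction the sample skewness is negative and the excess kurtosis positive, the pattern the paper's hypothesis predicts under short-sale constraints.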

4.
In this paper, we study the family of renewal shot-noise processes. The Feynman–Kac formula is obtained based on piecewise deterministic Markov process theory and the martingale methodology. We then derive the Laplace transforms of the conditional moments and asymptotic moments of the processes. In general, by inverting the Laplace transforms, the asymptotic moments and the first conditional moments can be derived explicitly; however, other conditional moments may need to be estimated numerically. As an example, we develop a very efficient and general algorithm of Monte Carlo exact simulation for estimating the second conditional moments. The results can then be easily transformed to the counterparts of discounted aggregate claims for insurance applications, and we apply the first two conditional moments to the actuarial net premium calculation. Similarly, they can also be applied to credit risk and reliability modelling. Numerical examples with four distribution choices for interarrival times are provided to illustrate how the models can be implemented.
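A Poisson special case of the simulation idea can be sketched as follows (Poisson interarrivals and unit-exponential jump sizes are illustrative assumptions; the paper treats general renewal interarrival times and checks moments against Laplace-transform results):

```python
import numpy as np

rng = np.random.default_rng(8)

def shot_noise(T, rate, delta, rng, n_paths=20000):
    """Value at time T of a Poisson shot-noise process with exponential decay
    delta and unit-exponential jump sizes: X_T = sum_i Y_i * exp(-delta*(T - T_i))."""
    out = np.empty(n_paths)
    for k in range(n_paths):
        n = rng.poisson(rate * T)
        arrivals = rng.uniform(0.0, T, n)   # given N, Poisson arrivals are uniform on (0, T)
        jumps = rng.exponential(1.0, n)
        out[k] = np.sum(jumps * np.exp(-delta * (T - arrivals)))
    return out

vals = shot_noise(T=5.0, rate=2.0, delta=0.5, rng=rng)
```

The simulated mean can be checked against the closed-form first moment E[X_T] = (rate/delta)·(1 − e^(−delta·T)), here ≈ 3.67.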

5.
That the returns on financial assets and insurance claims are not well described by the multivariate normal distribution is generally acknowledged in the literature. This paper presents a review of the use of the skew-normal distribution and its extensions in finance and actuarial science, highlighting known results as well as potential directions for future research. When skewness and kurtosis are present in asset returns, the skew-normal and skew-Student distributions are natural candidates in both theoretical and empirical work. Their parameterization is parsimonious and they are mathematically tractable. In finance, the distributions are interpretable in terms of the efficient markets hypothesis. Furthermore, they lead to theoretical results that are useful for portfolio selection and asset pricing. In actuarial science, the presence of skewness and kurtosis in insurance claims data is the main motivation for using the skew-normal distribution and its extensions. The skew-normal has been used in studies on risk measurement and capital allocation, which are two important research fields in actuarial science. Empirical studies consider the skew-normal distribution because of its flexibility, interpretability, and tractability. This paper comprises four main sections: an overview of skew-normal distributions; a review of skewness in finance, including asset pricing, portfolio selection, and time series modeling; a review of applications in insurance, in which the use of alternative distribution functions is widespread; and a final section that summarizes some of the challenges associated with the use of skew-elliptical distributions and points out directions for future research.

6.
Risk Measurement Performance of Alternative Distribution Functions
This paper evaluates the performance of three extreme value distributions, i.e., generalized Pareto distribution (GPD), generalized extreme value distribution (GEV), and Box-Cox-GEV, and four skewed fat-tailed distributions, i.e., skewed generalized error distribution (SGED), skewed generalized t (SGT), exponential generalized beta of the second kind (EGB2), and inverse hyperbolic sine (IHS), in estimating conditional and unconditional value at risk (VaR) thresholds. The results provide strong evidence that the SGT, EGB2, and IHS distributions perform as well as the more specialized extreme value distributions in modeling the tail behavior of portfolio returns. All three distributions produce similar VaR thresholds and perform better than the SGED and the normal distribution in approximating the extreme tails of the return distribution. The conditional coverage and the out-of-sample performance tests show that the actual VaR thresholds are time varying to a degree not captured by unconditional VaR measures. In light of the fact that VaR-type measures are employed in many different types of financial and insurance applications, including the determination of capital requirements, capital reserves, the setting of insurance deductibles, the setting of reinsurance cedance levels, as well as the estimation of expected claims and expected losses, these results are important to financial managers, actuaries, and insurance practitioners.
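The basic comparison of unconditional VaR thresholds across candidate distributions can be sketched as follows (a minimal sketch using a Student-t fit as a generic fat-tailed stand-in for the paper's SGT/EGB2/IHS families, on synthetic returns):

```python
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(2)
# Synthetic fat-tailed daily returns (Student-t with 5 degrees of freedom)
returns = 0.01 * t.rvs(df=5, size=20000, random_state=rng)

# Unconditional 99% VaR thresholds implied by two fitted distributions
mu, sigma = norm.fit(returns)
df_hat, loc_hat, scale_hat = t.fit(returns)
var_normal = -norm.ppf(0.01, mu, sigma)
var_t = -t.ppf(0.01, df_hat, loc_hat, scale_hat)
```

As the abstract reports for its richer families, the fat-tailed fit yields a larger 99% VaR threshold than the normal fit, i.e., the normal understates extreme tail risk.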

7.
Abstract

Credibility is a form of insurance pricing that is widely used, particularly in North America. The theory of credibility has been called a “cornerstone” in the field of actuarial science. Students of the North American actuarial bodies also study loss distributions, the process of statistical inference of relating a set of data to a theoretical (loss) distribution. In this work, we develop a direct link between credibility and loss distributions through the notion of a copula, a tool for understanding relationships among multivariate outcomes.

This paper develops credibility using a longitudinal data framework. In a longitudinal data framework, one might encounter data from a cross section of risk classes (towns) with a history of insurance claims available for each risk class. For the marginal claims distributions, we use generalized linear models, an extension of linear regression that also encompasses Weibull and Gamma regressions. Copulas are used to model the dependencies over time; specifically, this paper is the first to propose using a t-copula in the context of generalized linear models. The t-copula is the copula associated with the multivariate t-distribution; like the univariate t-distributions, it seems especially suitable for empirical work. Moreover, we show that the t-copula gives rise to easily computable predictive distributions that we use to generate credibility predictors. Like Bayesian methods, our copula credibility prediction methods allow us to provide an entire distribution of predicted claims, not just a point prediction.

We present an illustrative example of Massachusetts automobile claims, and compare our new credibility estimates with those currently existing in the literature.

8.
Abstract

A model for pricing insurance and financial risks, based on recent developments in actuarial premium principles with elliptical distributions, is developed for application to incomplete markets and heavy-tailed distributions. The pricing model involves an application of a generalized variance premium principle from insurance pricing to the pricing of a portfolio of nontraded risks relative to a portfolio of traded risks. This pricing model for a portfolio of insurance or financial risks reflects preferences for features of the distributions other than mean and variance, including kurtosis. The model reduces to the Capital Asset Pricing Model for multinormal portfolios and to a form of the CAPM in the case where the traded and nontraded risks have the same elliptical distribution.

9.
The probabilistic behavior of the claim severity variable plays a fundamental role in calculation of deductibles, layers, loss elimination ratios, effects of inflation, and other quantities arising in insurance. Among several alternatives for modeling severity, the parametric approach continues to maintain the leading position, which is primarily due to its parsimony and flexibility. In this article, several parametric families are employed to model severity of Norwegian fire claims for the years 1981 through 1992. The probability distributions we consider include generalized Pareto, lognormal-Pareto (two versions), Weibull-Pareto (two versions), and folded-t. Except for the generalized Pareto distribution, the other five models are fairly new proposals that recently appeared in the actuarial literature. We use the maximum likelihood procedure to fit the models and assess the quality of their fits using basic graphical tools (quantile-quantile plots), two goodness-of-fit statistics (Kolmogorov-Smirnov and Anderson-Darling), and two information criteria (AIC and BIC). In addition, we estimate the tail risk of "ground up" Norwegian fire claims using the value-at-risk and tail-conditional median measures. We monitor the tail risk levels over time, for the period 1981 to 1992, and analyze predictive performances of the six probability models. In particular, we compute the next-year probability for a few upper tail events using the fitted models and compare them with the actual probabilities.

10.
A rich variety of probability distributions has been proposed in the actuarial literature for fitting insurance loss data. Examples include: lognormal, log-t, various versions of Pareto, loglogistic, Weibull, gamma and its variants, and generalized beta of the second kind distributions, among others. In this paper, we supplement the literature by adding the log-folded-normal and log-folded-t families. Shapes of the density function and key distributional properties of the 'folded' distributions are presented along with three methods for the estimation of parameters: method of maximum likelihood, method of moments, and method of trimmed moments. Further, large- and small-sample properties of these estimators are studied in detail. Finally, we fit the newly proposed distributions to data which represent the total damage done by 827 fires in Norway for the year 1988. The fitted models are then employed in a few quantitative risk management examples, where point and interval estimates for several value-at-risk measures are calculated.
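A log-folded-normal fit of the kind described above can be sketched with `scipy`'s folded-normal family (a sketch on synthetic losses constructed so that the log-losses are exactly folded normal; the fire-damage data and the moment-based estimators are not reproduced here):

```python
import numpy as np
from scipy.stats import foldnorm

rng = np.random.default_rng(3)
# Synthetic losses whose logs follow a folded normal: X = exp(|Z|), Z ~ N(1, 0.5)
z = rng.normal(1.0, 0.5, 3000)
losses = np.exp(np.abs(z))

# Maximum likelihood on the log scale; shape c = |mu|/sigma, scale = sigma
c, loc, scale = foldnorm.fit(np.log(losses), floc=0)

# Point estimate of 99% value-at-risk, mapped back to the loss scale
var99 = np.exp(foldnorm.ppf(0.99, c, loc, scale))
```

With these generating parameters the fit should recover a shape near 2 (= 1/0.5) and a scale near 0.5, and the parametric 99% VaR lands near exp(1 + 0.5·2.326) ≈ 8.7.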

11.
Abstract

Tail conditional expectations refer to the expected values of random variables conditioning on some tail events and are closely related to various coherent risk measures. In the univariate case, the tail conditional expectation is asymptotically proportional to Value-at-Risk, a popular risk measure. The focus of this paper is on asymptotic relations between the multivariate tail conditional expectation and Value-at-Risk for heavy-tailed scale mixtures of multivariate distributions. Explicit tail estimates of multivariate tail conditional expectations are obtained using the method of regular variation. Examples involving multivariate Pareto and elliptical distributions, as well as application to risk allocation, are also discussed.

12.
This article investigates the portfolio selection problem of an investor with three-moment preferences taking positions in commodity futures. To model the asset returns, we propose a conditional asymmetric t copula with skewed and fat-tailed marginal distributions, such that we can capture the impact on optimal portfolios of time-varying moments, state-dependent correlations, and tail and asymmetric dependence. In the empirical application with oil, gold, and equity data from 1990 to 2010, the conditional t copula portfolios achieve better performance than those based on more conventional strategies. The specification of higher moments in the marginal distributions and the type of tail dependence in the copula have significant implications for the out-of-sample portfolio performance.
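The basic mechanics of a (symmetric) t-copula with non-normal margins can be sketched as follows (a simplified sketch: the paper uses a conditional *asymmetric* t copula with skewed marginals; the symmetric copula, correlation 0.6, 5 degrees of freedom, and lognormal margin are all illustrative assumptions):

```python
import numpy as np
from scipy.stats import t, lognorm, multivariate_t

nu, rho = 5, 0.6
corr = np.array([[1.0, rho], [rho, 1.0]])
rng = np.random.default_rng(4)

# Sample from a t-copula: draw multivariate t, then push each margin through its CDF
z = multivariate_t(loc=[0.0, 0.0], shape=corr, df=nu).rvs(size=10000, random_state=rng)
u = t.cdf(z, df=nu)                 # copula samples on the unit square

# Attach a skewed, fat-tailed margin via the inverse CDF (lognormal as illustration)
x = lognorm.ppf(u, s=0.5)
```

The uniforms `u` carry the t-copula's tail dependence, which survives the marginal transformation; this separation of dependence from margins is what lets the paper vary the two independently.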

13.
Abstract

This article investigates performance of interval estimators of various actuarial risk measures. We consider the following risk measures: proportional hazards transform (PHT), Wang transform (WT), value-at-risk (VaR), and conditional tail expectation (CTE). Confidence intervals for these measures are constructed by applying nonparametric approaches (empirical and bootstrap), the strict parametric approach (based on the maximum likelihood estimators), and robust parametric procedures (based on trimmed means).

Using Monte Carlo simulations, we compare the average lengths and proportions of coverage (of the true measure) of the intervals under two data-generating scenarios: "clean" data and "contaminated" data. In the "clean" case, data sets are generated by the following (similar shape) parametric families: exponential, Pareto, and lognormal. Parameters of these distributions are selected so that all three families are equally risky with respect to a fixed risk measure. In the "contaminated" case, the "clean" data sets from these distributions are mixed with a small fraction of unusual observations (outliers). It is found that approximate knowledge of the underlying distribution combined with a sufficiently robust estimator (designed for that distribution) yields intervals with satisfactory performance under both scenarios.
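One of the nonparametric interval constructions mentioned above, a percentile bootstrap for the conditional tail expectation, can be sketched like this (a minimal sketch on "clean" exponential data; the paper's full study also covers PHT, WT, VaR, and the parametric and robust procedures):

```python
import numpy as np

rng = np.random.default_rng(5)
losses = rng.exponential(1.0, 500)      # a "clean" exponential sample

def cte(x, p=0.95):
    """Conditional tail expectation: mean loss beyond the p-th empirical quantile."""
    return x[x >= np.quantile(x, p)].mean()

# Percentile bootstrap: resample with replacement, recompute, take empirical quantiles
boot = np.array([cte(rng.choice(losses, losses.size, replace=True))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])   # 95% interval for the CTE
```

For an exponential(1) risk the true CTE at p = 0.95 is ln(20) + 1 ≈ 4.0 (by memorylessness), so the interval should bracket a value in that vicinity.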

14.
Abstract

Estimation of the tail index parameter of a single-parameter Pareto model has wide application in actuarial and other sciences. Here we examine various estimators from the standpoint of two competing criteria: efficiency and robustness against upper outliers. With the maximum likelihood estimator (MLE) being efficient but nonrobust, we desire alternative estimators that retain a relatively high degree of efficiency while also being adequately robust. A new generalized median type estimator is introduced and compared with the MLE and several well-established estimators associated with the methods of moments, trimming, least squares, quantiles, and percentile matching. The method of moments and least squares estimators are found to be relatively deficient with respect to both criteria and should become disfavored, while the trimmed mean and generalized median estimators tend to dominate the other competitors. The generalized median type estimator performs best overall. These findings provide a basis for revision and updating of prevailing viewpoints. Other topics discussed are applications to robust estimation of upper quantiles, tail probabilities, and actuarial quantities, such as stop-loss and excess-of-loss reinsurance premiums, which arise in connection with the solvency of portfolios. Robust parametric methods are compared with empirical nonparametric methods, which are typically nonrobust.
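The efficiency/robustness tension described above is easy to exhibit for the MLE itself (a sketch with a synthetic Pareto sample and a single injected outlier; the generalized median and trimmed estimators compared in the paper are not reproduced here):

```python
import numpy as np

def pareto_mle(x, xm):
    """Maximum likelihood estimate of the tail index of a single-parameter Pareto(alpha, xm)."""
    return x.size / np.log(x / xm).sum()

rng = np.random.default_rng(6)
xm, alpha = 1.0, 2.5
# Inverse-CDF sampling: F(x) = 1 - (xm/x)^alpha  =>  x = xm * U^(-1/alpha)
sample = xm * (1.0 - rng.random(4000)) ** (-1.0 / alpha)

alpha_hat = pareto_mle(sample, xm)                      # efficient on clean data
alpha_contam = pareto_mle(np.append(sample, 1e6), xm)   # one extreme upper outlier
```

On clean data the MLE sits close to the true index 2.5; a single extreme upper outlier pulls the estimate down, illustrating the nonrobustness the paper sets out to remedy.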

15.
Loss functions play an important role in analyzing insurance portfolios. A fundamental issue in the study of loss functions involves the selection of probability models for claim frequencies. In this article, we propose a semi-parametric approach based on the generalized method of moments (GMM) to solve the specification problems concerning claim frequency distributions. The GMM-based testing procedure provides a general framework that encompasses many specification problems of interest in actuarial applications. As an alternative approach to the Pearson χ2 and other goodness-of-fit tests, it is easy to implement and should be of practical use in applications involving selecting and validating probability models with complex characteristics.

16.
We consider the fractional independence (FI) survival model, studied by Willmot (1997), for which the curtate future lifetime and the fractional part of it satisfy the statistical independence assumption, called the fractional independence assumption.

The ordering of risks of the FI survival model is analyzed, and its consequences for the evaluation of actuarial present values in life insurance are discussed. Our main fractional reduction (FR) theorem states that two FI future lifetime random variables with identically distributed curtate future lifetimes are stochastically ordered (stop-loss ordered) if, and only if, their fractional parts are stochastically ordered (stop-loss ordered).

The well-known properties of these stochastic orders allow one to find lower and upper bounds for different types of actuarial present values, for example when the random payoff functions of the considered continuous life insurances are convex (concave), or decreasing (increasing), or convex not decreasing (concave not increasing) in the future lifetime as argument. These bounds are obtained under the assumption that some information concerning the moments of the fractional part is given. A distinction is made according to whether the fractional remaining lifetime has a fixed mean or a fixed mean and variance. In the former case, simple unique optimal bounds are obtained in the case of a convex (concave) present value function.

The obtained results are illustrated using the most important life insurance quantities in a continuous random environment, including bounds for net single premiums, net level annual premiums, and prospective net reserves.

17.
A convolution representation is derived for the equilibrium or integrated tail distribution associated with a compound distribution. This result allows for the derivation of reliability properties of compound distributions, as well as an explicit analytic representation for the stop-loss premium, of interest in connection with insurance claims modelling. This result is extended to higher order equilibrium distributions, or equivalently to higher stop-loss moments. Special cases where the counting distribution is mixed Poisson or discrete phase-type are considered in some detail. An approach to handle more general counting distributions is also outlined.
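The stop-loss premium mentioned above is, for a deductible d, the integral of the survival function beyond d: E[(S − d)₊] = ∫_d^∞ S̄(x) dx. A numerical check against a closed form can be sketched as follows (an exponential(1) aggregate claim is an illustrative assumption, chosen because its stop-loss premium is exactly e^(−d)):

```python
import numpy as np
from scipy.integrate import quad

def stop_loss(survival, d):
    """Stop-loss premium E[(S - d)_+] as the integral of the survival function above d."""
    return quad(survival, d, np.inf)[0]

# For an exponential(1) claim, S_bar(x) = exp(-x) and E[(S - d)_+] = exp(-d)
premium = stop_loss(lambda x: np.exp(-x), 2.0)
```

The numerical value agrees with e^(−2) ≈ 0.1353 to integration tolerance; the same routine accepts any survival function, including one assembled from the paper's convolution representation.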

18.
Abstract

Standard actuarial theory of multiple life insurance traditionally postulates independence for the remaining lifetimes, mainly due to computational convenience rather than realism. In this paper, we propose a general common shock model for modelling dependent coupled lives and apply it to a life insurance model. In the proposed shock model, we consider not only simultaneous deaths of the coupled members due to a single shock (e.g. a critical accident), but also a cumulative effect on the mortality rate when they survive shocks. Under the model, we derive a bivariate lifetime distribution and its marginal distributions in closed forms. We study the bivariate ageing property, dependence structure, and dependence orderings of the lifetime distribution. Based on it, we investigate the influence of dependence on the pricing of insurance policies involving multiple lives that are subject to common shocks. Furthermore, we discuss relevant useful stochastic bounds.
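The simultaneous-death mechanism can be illustrated with the classical Marshall–Olkin common-shock construction (a sketch of that special case only; the paper's model is more general and also captures cumulative, non-fatal shock effects, and the hazard rates below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
lam1, lam2, lam12 = 0.02, 0.03, 0.01    # individual hazards and common-shock hazard
n = 100_000
t1 = rng.exponential(1 / lam1, n)       # member 1's individual failure time
t2 = rng.exponential(1 / lam2, n)       # member 2's individual failure time
t12 = rng.exponential(1 / lam12, n)     # arrival time of the common fatal shock

x = np.minimum(t1, t12)                 # observed lifetime of member 1
y = np.minimum(t2, t12)                 # observed lifetime of member 2
p_simultaneous = np.mean(x == y)        # both die at the common-shock time
```

The simulated probability of simultaneous death matches the closed form λ₁₂/(λ₁+λ₂+λ₁₂) = 1/6, and the lifetimes are positively correlated, which is exactly the dependence that independence-based pricing ignores.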

19.
Mixed Normal Conditional Heteroskedasticity
Both unconditional mixed normal distributions and GARCH models with fat-tailed conditional distributions have been employed in the literature for modeling financial data. We consider a mixed normal distribution coupled with a GARCH-type structure (termed MN-GARCH) which allows for conditional variance in each of the components as well as dynamic feedback between the components. Special cases and relationships with previously proposed specifications are discussed and stationarity conditions are derived. For the empirically most relevant GARCH(1,1) case, the conditions for existence of arbitrary integer moments are given and analytic expressions of the unconditional skewness, kurtosis, and autocorrelations of the squared process are derived. Finally, employing daily return data on the NASDAQ index, we provide a detailed empirical analysis and compare both the in-sample fit and out-of-sample forecasting performance of the MN-GARCH as well as recently proposed Markov-switching models. We show that the MN-GARCH approach can generate a plausible disaggregation of the conditional variance process in which the components' volatility dynamics have a clearly distinct behavior, which is, for example, compatible with the well-known leverage effect.

20.
In this paper, a new class of composite models is proposed for modeling actuarial claims data of mixed sizes. The model is developed using the Stoppa distribution and a mode-matching procedure. The use of the Stoppa distribution allows for more flexibility over the thickness of the tail, and the mode-matching procedure gives a simple derivation of models composited with a variety of distributions. In particular, the Weibull-Stoppa and the Lognormal-Stoppa distributions are investigated. Their performance is compared with existing composite models in the context of the well-known Danish fire insurance data set. The results suggest the composite Weibull-Stoppa model outperforms the existing composite models in all seven goodness-of-fit measures considered.
