Similar Documents
20 similar documents retrieved.
1.
Abstract

Life insurance companies deal with two fundamental types of risks when issuing annuity contracts: financial risk and demographic risk. Recent work on the latter has focused on modeling the trend in mortality as a stochastic process. A popular method for modeling death rates is the Lee-Carter model. This methodology has become widely used, and various extensions and modifications have been proposed to obtain a broader interpretation and to capture the main features of the dynamics of mortality rates. In order to improve the measurement of uncertainty in survival probability estimates, in particular for older ages, the paper proposes an extension based on simulation procedures and on the bootstrap methodology. It aims to obtain more reliable and accurate mortality projections, based on the idea of achieving acceptable estimation accuracy by means of variance-reduction techniques. In this way the forecasting procedure becomes more efficient. The longevity question constitutes a critical element in the solvency appraisal of pension annuities. The demographic models used for the cash flow distributions in a portfolio affect the mathematical reserve and surplus calculations and influence the risk management choices for a pension plan. The paper extends the investigation of the impact of survival uncertainty for life annuity portfolios and for a guaranteed annuity option in the case where interest rates are stochastic. In a framework in which insurance companies need to use internal models for risk management purposes and for determining their solvency capital requirement, the authors consider the surplus value, calculated as the ratio of the market value of the projected assets to that of the liabilities, as a meaningful measure of the company’s financial position, expressing the degree to which the liabilities are covered by the assets.
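
The bootstrap idea in this abstract can be illustrated with a small numerical sketch. The code below is a minimal, simplified illustration: a plain SVD fit of the Lee-Carter model, a residual bootstrap, and a random-walk-with-drift projection of the period index. The function names and the input matrix `log_m` (log central death rates, ages × years) are hypothetical, and the paper's variance-reduction refinements are not reproduced here.

```python
import numpy as np

def fit_lee_carter(log_m):
    """Basic Lee-Carter fit of log m(x,t) = a_x + b_x * k_t via SVD."""
    a = log_m.mean(axis=1)                      # age pattern: row means
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b = U[:, 0] / U[:, 0].sum()                 # normalise so that sum(b) = 1
    k = s[0] * Vt[0, :] * U[:, 0].sum()         # mortality (period) index
    return a, b, k

def bootstrap_k_paths(log_m, horizon=20, n_boot=1000, seed=0):
    """Residual bootstrap of the projected period index k_t."""
    rng = np.random.default_rng(seed)
    a, b, k = fit_lee_carter(log_m)
    fitted = a[:, None] + np.outer(b, k)
    resid = log_m - fitted
    paths = np.empty((n_boot, horizon))
    for i in range(n_boot):
        # resample residuals, refit, and project k as a random walk with drift
        boot = fitted + rng.choice(resid.ravel(), size=resid.shape)
        _, _, k_b = fit_lee_carter(boot)
        drift, sd = np.diff(k_b).mean(), np.diff(k_b).std(ddof=1)
        paths[i] = k_b[-1] + np.cumsum(rng.normal(drift, sd, horizon))
    return a, b, paths
```

Projected death rates at forecast step h are then exp(a_x + b_x * k), so the spread of the bootstrap paths translates directly into uncertainty bands for survival probabilities and annuity values.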

2.
Abstract

The increasing risk of poverty in retirement has been well documented; it is projected that current and future retirees’ living expenses will significantly exceed their savings and income. In this paper, we consider a retiree who does not have sufficient wealth and income to fund her future expenses, and we seek the asset allocation that minimizes the probability of financial ruin during her lifetime. Building on the work of Young (2004) and Milevsky, Moore, and Young (2006), under general mortality assumptions, we derive a variational inequality that governs the ruin probability and optimal asset allocation. We explore the qualitative properties of the ruin probability and optimal strategy, present a numerical method for their estimation, and examine their sensitivity to changes in model parameters for specific examples. We then present an easy-to-implement allocation rule and demonstrate via simulation that it yields nearly optimal ruin probability, even under discrete portfolio rebalancing.
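
As a rough companion to the simulation study mentioned above, the sketch below estimates the lifetime ruin probability of a fixed allocation rule by Monte Carlo. It assumes a constant force of mortality, a constant spending rate, and a lognormal risky asset; all parameter values and the function name are illustrative and not taken from the paper.

```python
import numpy as np

def lifetime_ruin_probability(w0, spend, pi, mu=0.06, sigma=0.20, r=0.02,
                              hazard=0.05, dt=1/12, max_years=50,
                              n_paths=5000, seed=1):
    """Estimate the probability that wealth hits zero before death when a
    constant fraction `pi` is kept in the risky asset."""
    rng = np.random.default_rng(seed)
    lifetimes = rng.exponential(1.0 / hazard, n_paths)   # constant-hazard lifetimes
    ruined = 0
    for life in lifetimes:
        w, steps = w0, int(min(life, max_years) / dt)
        for _ in range(steps):
            ret = pi * (mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()) \
                  + (1 - pi) * r * dt
            w = w * (1.0 + ret) - spend * dt
            if w <= 0:
                ruined += 1
                break
    return ruined / n_paths

# Wealth equal to 20 years of spending, 40% in the risky asset:
print(lifetime_ruin_probability(w0=20.0, spend=1.0, pi=0.4))
```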

3.
Abstract

In today’s world of financial uncertainty, one major public concern is to assess (and possibly improve) the stability of companies that take on risks. Actuaries have been aware of that issue for a very long time and have great experience in modeling the activity of a risk business. During the first part of the twentieth century, they focused on the probability of ruin to assess the stability of their company. In his seminal paper of 1957 Bruno de Finetti criticized this approach and laid the foundations of what would become an increasingly popular topic: the study of dividend strategies. The contributions made by actuaries in that field constitute a substantial body of knowledge that is relevant not only to insurance but also to a much broader range of areas of practice. In this paper we aim at a taxonomical synthesis of the 50 years of actuarial research that followed de Finetti’s original paper.

4.
5.
Abstract

The reverse mortgage market has been expanding rapidly in developed economies in recent years. The onset of demographic transition places a rapidly rising number of households in an age window in which reverse mortgages have potential appeal. Increasing prices for residential real estate over the last decade have further stimulated interest.

Reverse mortgages involve various risks from the provider's perspective that may hinder the further development of these financial products. This paper addresses one method of transferring and financing the risks associated with these products, namely securitization. Securitization is becoming a popular and attractive alternative form of risk transfer for insurance liabilities. Here we demonstrate how to construct a securitization structure for reverse mortgages similar to the one applied in traditional insurance products.

Specifically, we investigate the merits of developing survivor bonds and survivor swaps for reverse mortgage products. In the case of survivor bonds, for example, we are able to compute premiums, both analytically and numerically through simulations, and to examine how the longevity risk may be transferred to the financial investors. Our numerical calculations provide an indication of the economic benefits derived from developing survivor bonds to securitize the “longevity risk component” of reverse mortgage products. Moreover, some sensitivity analysis of these economic benefits indicates that these survivor bonds are a promising tool for investment diversification.
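
A toy Monte Carlo pricer conveys how a survivor bond premium can be computed by simulation, as described above. The cohort mortality model below (a single cohort-level rate with a Gompertz-style aging term, a deterministic improvement, and a lognormal shock) and all parameter values are hypothetical placeholders, not the model used in the paper.

```python
import numpy as np

def survivor_bond_price(q0=0.01, aging=0.09, improvement=0.02, vol=0.10,
                        coupon=1.0, maturity=30, r=0.03, n_sims=10000, seed=2):
    """Monte Carlo price of a survivor bond whose year-t coupon equals
    `coupon` times the realised survivor index S_t of a reference cohort."""
    rng = np.random.default_rng(seed)
    pv = np.zeros(n_sims)
    for i in range(n_sims):
        survival, q = 1.0, q0
        for t in range(1, maturity + 1):
            # toy stochastic mortality rate: aging, improvement, lognormal shock
            q = min(q * np.exp(aging - improvement + vol * rng.standard_normal()), 1.0)
            survival *= (1.0 - q)
            pv[i] += coupon * survival * np.exp(-r * t)
    return pv.mean(), pv.std(ddof=1) / np.sqrt(n_sims)

print(survivor_bond_price())   # (price, Monte Carlo standard error)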

6.
Abstract

The precise measurement of the association between asset returns is important for financial investors and risk managers. In this paper, we focus on a recent class of association models: Dynamic Conditional Score (DCS) copula models. Our contributions are the following: (i) We compare the statistical performance of several DCS copulas for several portfolios. We study the Clayton, rotated Clayton, Frank, Gaussian, Gumbel, rotated Gumbel, Plackett and Student's t copulas. We find that the DCS model with the Student's t copula is the most parsimonious model. (ii) We demonstrate that the copula score function discounts extreme observations. (iii) We jointly estimate the marginal distributions and the copula, by using the Maximum Likelihood method. We use DCS models for mean, volatility and association of asset returns. (iv) We estimate robust DCS copula models, for which the probability of a zero return observation is not necessarily zero. (v) We compare different patterns of association in different regions of the distribution for different DCS copulas, by using density contour plots and Monte Carlo (MC) experiments. (vi) We undertake a portfolio performance study with the estimation and backtesting of MC Value-at-Risk for the DCS model with the Student's t copula.
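
The score-driven (DCS) dynamics are beyond a short example, but the final step of the abstract, Monte Carlo Value-at-Risk under a Student's t copula, can be sketched with a static copula and normal marginals. The function below and its parameters are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from scipy import stats

def t_copula_portfolio_var(mu, sigma, corr, nu=6.0, weights=None,
                           alpha=0.99, n_sims=100000, seed=3):
    """One-period Monte Carlo VaR of a portfolio whose asset returns have
    normal marginals coupled by a static Student's t copula."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    n = len(mu)
    weights = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, float)
    L = np.linalg.cholesky(np.asarray(corr, float))
    # multivariate t draws: correlated normals divided by sqrt(chi2 / nu)
    z = rng.standard_normal((n_sims, n)) @ L.T
    g = rng.chisquare(nu, n_sims) / nu
    u = stats.t.cdf(z / np.sqrt(g)[:, None], df=nu)   # copula uniforms
    returns = mu + sigma * stats.norm.ppf(u)          # map to normal marginals
    port = returns @ weights
    return -np.quantile(port, 1 - alpha)              # VaR reported as a positive loss

mu = [0.0005, 0.0003]
sigma = [0.012, 0.008]
corr = [[1.0, 0.5], [0.5, 1.0]]
print(t_copula_portfolio_var(mu, sigma, corr))
```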

7.
Abstract

The problem of allocating responsibility for risk among members of a portfolio arises in a variety of financial and risk-management contexts. Examples are particularly prominent in the insurance sector, where actuaries have long sought methods for distributing capital (net worth) across a number of distinct exposure units or accounts according to their relative contributions to the total “risk” of an insurer’s portfolio. Although substantial work has been done on this problem, no satisfactory solution has yet been presented for the case of inhomogeneous loss distributions, that is, losses X ~ F_{X|λ} such that F_{X|tλ}(x) ≠ F_{tX|λ}(x) for some t > 0. The purpose of this article is to show that the value-assignment method of nonatomic cooperative games proposed in 1974 by Aumann and Shapley may be used to solve risk-allocation problems involving losses of this type. This technique is illustrated by providing analytical solutions for a useful class of multivariate normal loss distributions.
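
For intuition only, the sketch below shows the baseline case that the Aumann-Shapley value generalizes: jointly normal losses under a standard-deviation risk principle, where the allocation reduces to the familiar covariance-based (Euler) rule. The inhomogeneous case that motivates the article is not covered by this sketch, and the function name and example figures are hypothetical.

```python
import numpy as np

def aumann_shapley_normal(mu, cov, c=2.58):
    """Per-line capital allocation for jointly normal losses under the risk
    measure rho(S) = E[S] + c * sd(S), S being the portfolio total.

    For this homogeneous standard-deviation part the Aumann-Shapley value
    coincides with the Euler/gradient allocation:
    line i receives E[X_i] + c * Cov(X_i, S) / sd(S)."""
    mu = np.asarray(mu, float)
    cov = np.asarray(cov, float)
    total_sd = np.sqrt(cov.sum())            # sd(S), S = sum of all lines
    cov_with_total = cov.sum(axis=1)          # Cov(X_i, S)
    alloc = mu + c * cov_with_total / total_sd
    assert np.isclose(alloc.sum(), mu.sum() + c * total_sd)   # full allocation
    return alloc

# Example: three lines of business (illustrative numbers)
mu = [10.0, 5.0, 8.0]
cov = [[4.0, 1.0, 0.5],
       [1.0, 2.0, 0.3],
       [0.5, 0.3, 3.0]]
print(aumann_shapley_normal(mu, cov))
```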

8.
Abstract

A portfolio of different insurance policies, such as temporary, endowment, and whole life, is studied in a stochastic mortality and interest environment. The first two moments of the present value of the benefits of the portfolio are derived. The riskiness of the portfolio as measured by the variance of the present value of the benefits can be divided into an insurance risk and an investment risk in two different ways. One way leads to a more natural interpretation of the two risk components. A simple portfolio is used to illustrate the results.
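
The two-way split of the variance rests on the conditional-variance identity Var(PV) = E[Var(PV | Y)] + Var(E[PV | Y]), where Y can be either the interest-rate path or the mortality outcome. The toy simulation below conditions on the interest-rate path for a whole-life benefit of 1; the mortality and interest models and all parameters are illustrative assumptions, not those of the paper.

```python
import numpy as np

def variance_split(n_interest=2000, n_lives=200, seed=4):
    """Split Var(PV) of a whole-life death benefit of 1 into an insurance
    component E[Var(PV | rates)] and an investment component Var(E[PV | rates])."""
    rng = np.random.default_rng(seed)
    cond_mean = np.empty(n_interest)
    cond_var = np.empty(n_interest)
    for i in range(n_interest):
        # toy stochastic annual rates: random walk around 3%
        rates = 0.03 + np.cumsum(0.005 * rng.standard_normal(60))
        disc = np.exp(-np.cumsum(rates))                            # discount factors
        lifetimes = np.minimum(rng.geometric(0.04, n_lives), 60)    # curtate lifetimes
        pv = disc[lifetimes - 1]                                    # benefit paid at death
        cond_mean[i], cond_var[i] = pv.mean(), pv.var(ddof=1)
    insurance_risk = cond_var.mean()          # E[Var(PV | rates)]
    investment_risk = cond_mean.var(ddof=1)   # Var(E[PV | rates])
    return insurance_risk, investment_risk

print(variance_split())
```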

9.
Abstract

This paper presents a general probabilistic model, including stochastic discounting, for life insurance contracts, either a single policy or a portfolio of policies. In § 4 we define prospective reserves and annual losses in terms of our model and we show that these are generalisations of the corresponding concepts in conventional life insurance mathematics. Our main results are in § 5 where we use the martingale property of the loss process to derive upper bounds for the probability of ruin for the portfolio. These results are illustrated by two numerical examples in § 6.

10.
Abstract

This paper examines the so-called 1/n investment puzzle that has been observed in defined contribution plans whereby some participants divide their contributions equally among the available asset classes. It has been argued that this is a very naive strategy since it contradicts the fundamental tenets of modern portfolio theory. We use simple arguments to show that this behavior is perhaps less naive than it at first appears. It is well known that the optimal portfolio weights in a mean-variance setting are extremely sensitive to estimation errors, especially those in the expected returns. We show that when we account for estimation error, the 1/n rule has some advantages in terms of robustness; we demonstrate this with numerical experiments. This rule can provide a risk-averse investor with protection against very bad outcomes.
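
The robustness argument can be checked with a short experiment: estimate means and covariances from a small sample, form the plug-in mean-variance portfolio, and compare its out-of-sample mean-variance utility with that of the 1/n portfolio under the true parameters. The "true" parameters below are assumed for illustration and are not the paper's calibration.

```python
import numpy as np

def experiment(n_assets=5, n_obs=60, n_trials=2000, gamma=4.0, seed=0):
    """Out-of-sample mean-variance utility of plug-in optimal weights vs 1/n."""
    rng = np.random.default_rng(seed)
    true_mu = np.linspace(0.04, 0.10, n_assets)          # assumed expected returns
    true_cov = 0.03 * (0.5 * np.eye(n_assets) + 0.5)     # common-correlation covariance
    util_opt, util_equal = [], []
    w_equal = np.full(n_assets, 1.0 / n_assets)
    for _ in range(n_trials):
        sample = rng.multivariate_normal(true_mu, true_cov, n_obs)
        mu_hat, cov_hat = sample.mean(axis=0), np.cov(sample.T)
        w_hat = np.linalg.solve(gamma * cov_hat, mu_hat)  # plug-in MV weights
        for w, store in ((w_hat, util_opt), (w_equal, util_equal)):
            store.append(w @ true_mu - 0.5 * gamma * w @ true_cov @ w)
    return np.mean(util_opt), np.mean(util_equal)

print(experiment())   # estimation error often makes the 1/n rule competitive
```

The plug-in portfolio uses the same formula as the true optimum but with noisy inputs, which is exactly where the sensitivity to estimation error shows up.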

11.
In several countries a major factor contributing to the current economic crisis was massive borrowing to fund investment projects on the basis of, in retrospect, grossly optimistic valuations. The purpose of this paper is to initiate an approach to project valuation and risk management in which ‘behavioural’ factors—Keynes’ ‘animal spirits’ or Greenspan’s ‘irrational exuberance’—can be explicitly included. An appropriate framework is risk-neutral valuation based on the use of the numéraire portfolio—the ‘benchmark’ approach advocated by Platen and Heath (A benchmark approach to quantitative finance. Springer, Berlin, 2006). In the paper, we start by discussing the ingredients of the problem: ‘animal spirits’, financial instability, market-consistent valuation, the numéraire portfolio and structural models of credit risk. We then study a project finance problem in which a bank lends money to an entrepreneur, collateralized by the value of the latter’s investment project. This contains all the components of our approach in a simple setting and illustrates what steps are required. In a final section, we briefly discuss the econometric problems that need to be solved next.

12.
Quantitative Finance, 2013, 13(4): 345–351
Abstract

In this paper, we investigate the relative performance of stocks and bonds for various investment horizons on the French market. We use a new matched block bootstrap approach to take account of estimation risk. Furthermore, in the light of non-normality of returns, we use two different risk approaches as inputs in portfolio optimization: the traditional variance, and a downside risk measure, the semi-variance. Our results suggest that an investor should avoid bonds in the long run due to the time diversification effect.
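
As a minimal illustration of the downside-risk input, the sketch below computes the semi-variance (mean squared shortfall below a target) of stock-bond mixes on a weight grid. The return samples are simulated placeholders, not the French data, and the matched block bootstrap used in the paper is not reproduced.

```python
import numpy as np

def semivariance(returns, target=0.0):
    """Downside semi-variance: mean squared shortfall below `target`."""
    shortfall = np.minimum(returns - target, 0.0)
    return np.mean(shortfall ** 2)

def downside_frontier(stock, bond, n_grid=5, target=0.0):
    """Mean and semi-variance of stock-bond mixes on a weight grid."""
    out = []
    for w in np.linspace(0.0, 1.0, n_grid):
        port = w * stock + (1 - w) * bond
        out.append((w, port.mean(), semivariance(port, target)))
    return out

# Illustrative monthly return samples (not the data used in the paper):
rng = np.random.default_rng(5)
stock = rng.normal(0.006, 0.05, 240)
bond = rng.normal(0.003, 0.015, 240)
for w, m, sv in downside_frontier(stock, bond):
    print(f"w_stock={w:.2f}  mean={m:.4f}  semivariance={sv:.6f}")
```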

13.
Abstract

Participating contracts provide a maturity guarantee for the policyholder. However, the terminal payoff to the policyholder should be related to financial risks of participating insurance contracts. We investigate an optimal investment problem under a joint value-at-risk and portfolio insurance constraint faced by the insurer who offers participating contracts. The insurer aims to maximize the expected utility of the terminal payoff to the insurer. We adopt a concavification technique and a Lagrange dual method to solve the problem and derive the representations of the optimal wealth process and trading strategies. We also carry out some numerical analysis to show how the joint value-at-risk and the portfolio insurance constraint impacts the optimal terminal wealth.

14.
Quantitative Finance, 2013, 13(6): 426–441
Abstract

The benchmark theory of mathematical finance is the Black–Scholes–Merton (BSM) theory, based on Brownian motion as the driving noise process for stock prices. Here the distributions of financial returns of the stocks in a portfolio are multivariate normal. Risk management based on BSM underestimates tails. Hence estimation of tail behaviour is often based on extreme value theory (EVT). Here we discuss a semi-parametric replacement for the multivariate normal involving normal variance–mean mixtures. This allows a more accurate modelling of tails, together with various degrees of tail dependence, while (unlike EVT) the whole return distribution can be modelled. We use a parametric component, incorporating the mean vector μ and covariance matrix Σ, and a non-parametric component, which we can think of as a density on [0,∞), modelling the shape (in particular the tail decay) of the distribution. We work mainly within the family of elliptically contoured distributions, focusing particularly on normal variance mixtures with self-decomposable mixing distributions. We discuss efficient methods to estimate the parametric and non-parametric components of our model and provide an algorithm for simulating from such a model. We fit our model to several financial data series. Finally, we calculate value at risk (VaR) quantities for several portfolios and compare these VaRs to those obtained from simple multivariate normal and parametric mixture models.
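
The sketch below simulates from one member of this family, a normal variance mixture with a gamma mixing variable of mean one, and reads off a portfolio VaR; heavier tails than the multivariate normal follow from the mixing. The mixing law, parameters, and function name are illustrative assumptions; the paper works with self-decomposable mixing distributions estimated semi-parametrically.

```python
import numpy as np

def variance_mixture_var(mu, cov, weights, mix_shape=2.0, alpha=0.99,
                         n_sims=200000, seed=7):
    """Portfolio VaR under an elliptical normal variance mixture
    X = mu + sqrt(W) * L @ Z, with W ~ Gamma(mix_shape, 1/mix_shape)
    (mean one) as mixing variable and Z standard normal."""
    rng = np.random.default_rng(seed)
    mu, weights = np.asarray(mu, float), np.asarray(weights, float)
    L = np.linalg.cholesky(np.asarray(cov, float))
    z = rng.standard_normal((n_sims, len(mu)))
    w = rng.gamma(mix_shape, 1.0 / mix_shape, n_sims)   # mixing variable, E[W] = 1
    x = mu + np.sqrt(w)[:, None] * (z @ L.T)
    port = x @ weights
    return -np.quantile(port, 1 - alpha)

mu = [0.0004, 0.0002]
cov = [[1.5e-4, 4.0e-5], [4.0e-5, 6.0e-5]]
print(variance_mixture_var(mu, cov, weights=[0.6, 0.4]))   # heavier tails than normal
```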

15.
Abstract

Solvency II splits life insurance risk into seven risk classes consisting of three biometric risks (mortality risk, longevity risk, and disability/morbidity risk) and four nonbiometric risks (lapse risk, expense risk, revision risk, and catastrophe risk). The best estimate liabilities for the biometric risks are valued with biometric life tables (mortality and disability tables), while those of the nonbiometric risks require alternative valuation methods. The present study is restricted to biometric risks encountered in traditional single-life insurance contracts with multiple causes of decrement. Based on the results of quantitative impact studies, process risk was deemed to be not significant enough to warrant an explicit calculation. It was therefore assumed to be implicitly included in the systematic/parameter risk, resulting in a less complex standard formula. For the purpose of internal models and improved risk management, it appears important to capture separately or simultaneously all risk components of biometric risks. Besides being of interest in its own right, this leads to a better understanding of the standard approach and the extent of its applicability. Based on a total balance sheet approach we express the liability risk solvency capital of an insurance portfolio as the value-at-risk and conditional value-at-risk of the prospective liability risk, understood as the random present value of future cash flows at a given time. The proposed approach is then applied to determine the biometric solvency capital for a portfolio of general life contracts. Using the conditional mean and variance of a portfolio’s prospective liability risk and a gamma distribution approximation we obtain simple solvency capital formulas as well as corresponding solvency capital ratios. To account for the possibility of systematic/parameter risk, we propose either to shift the biometric life tables or to apply a stochastic biometric model, which allows for random biometric rates. A numerical illustration for a cohort of immediate life annuities in arrears reveals the importance of process risk in the assessment of longevity risk solvency capital.
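
The gamma approximation mentioned in the abstract amounts to matching the first two moments of the prospective liability and reading capital off the fitted gamma distribution. The sketch below takes solvency capital as VaR and conditional VaR in excess of the mean; the exact capital definition and confidence level in the paper may differ, so treat the function as an assumption-laden illustration.

```python
from scipy import stats

def gamma_solvency_capital(mean, var, alpha=0.995):
    """Gamma-approximation solvency capital for a liability with given
    first two moments: (VaR_alpha - mean, CVaR_alpha - mean)."""
    shape = mean ** 2 / var        # moment matching: mean = k*theta, var = k*theta^2
    scale = var / mean
    var_alpha = stats.gamma.ppf(alpha, a=shape, scale=scale)
    # Gamma tail identity: E[L | L > q] = mean * (1 - Gamma(shape+1).cdf(q)) / (1 - alpha)
    tail = 1.0 - stats.gamma.cdf(var_alpha, a=shape + 1, scale=scale)
    cvar_alpha = mean * tail / (1.0 - alpha)
    return var_alpha - mean, cvar_alpha - mean

print(gamma_solvency_capital(mean=100.0, var=25.0))
```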

16.
Abstract

This paper considers a modification of the well known constant elasticity of variance model where it is used to model the growth optimal portfolio (GOP). It is shown that, for this application, there is no equivalent risk neutral pricing measure and therefore the classical risk neutral pricing methodology fails. However, a consistent pricing and hedging framework can be established by application of the benchmark approach.

Perfect hedging strategies can be constructed for European style contingent claims, where the underlying risky asset is the GOP. In this framework, fair prices for contingent claims are the minimal prices that permit perfect replication of the claims. Numerical examples show that these prices may differ significantly from the corresponding ‘risk neutral’ prices.

17.
We offer evidence that the use of relative performance evaluation (RPE) in CEOs’ incentive contracts influences the effect of risk‐taking incentives on both the magnitude and composition of firm risk. We find that, when the incentive design lacks RPE features, the incentive portfolio vega motivates CEOs to increase total risk through the systematic component because it can be hedged. In contrast, when the incentive design includes RPE features, CEOs prefer idiosyncratic risk because RPE filters out the systematic component of firm performance. We also document that the use of RPE reinforces the incentive portfolio vega's effect on the total risk.

18.
Abstract

We examine properties of risk measures that can be considered to be in line with some “best practice” rules in insurance, based on solvency margins. We give ample motivation that all economic aspects related to an insurance portfolio should be considered in the definition of a risk measure. As a consequence, conditions arise for comparison as well as for addition of risk measures. We demonstrate that imposing properties that are generally valid for risk measures, in all possible dependency structures, based on the difference of the risk and the solvency margin, though providing opportunities to derive nice mathematical results, violates best practice rules. We show that so-called coherent risk measures lead to problems. In particular we consider an exponential risk measure related to a discrete ruin model, depending on the initial surplus, the desired ruin probability, and the risk distribution.
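
One common reading of an exponential risk measure tied to a discrete ruin model is sketched below, purely as an assumption: a Lundberg-type bound exp(-R u) <= ε fixes the risk aversion R from the initial surplus u and the target ruin probability ε, and the requirement is the exponential premium (1/R) ln E[exp(R X)]. The loss sample and function name are illustrative and may not match the construction in the paper.

```python
import numpy as np

def exponential_premium(losses, surplus, ruin_prob):
    """Exponential premium implied by a Lundberg-type bound exp(-R*u) <= eps:
    take R = -ln(eps)/u and return (1/R) * ln E[exp(R*X)], estimated from a
    sample of annual aggregate losses (the MGF must exist at R)."""
    R = -np.log(ruin_prob) / surplus
    return np.log(np.mean(np.exp(R * np.asarray(losses, float)))) / R

# Illustrative normal annual losses (mean 100, sd 20), not from the paper:
rng = np.random.default_rng(6)
losses = rng.normal(100.0, 20.0, 200000)
for u in (50.0, 100.0, 200.0):
    print(f"surplus={u:>5.0f}  premium={exponential_premium(losses, u, 0.01):.2f}")
```

A larger initial surplus lowers R and hence the loading, which is exactly the dependence on the surplus and the desired ruin probability that the abstract highlights.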

19.

Norberg (1989) analyses the heterogeneity in a portfolio of group life insurances using a parametric empirical Bayesian approach. In the present paper the model of Norberg is compared to a parametric fully Bayesian model and to a non-parametric fully Bayesian model.

20.
Abstract

This paper examines a portfolio of equity-linked life insurance contracts and determines risk-minimizing hedging strategies within a discrete-time setup. As a principal example, I consider the Cox-Ross-Rubinstein model and an equity-linked pure endowment contract under which the policyholder receives max(S_T, K) at time T if he or she is then alive, where S_T is the value of a stock index at the term T of the contract and K is a guarantee stipulated by the contract. In contrast to most of the existing literature, I view the contracts as contingent claims in an incomplete model and discuss the problem of choosing an optimality criterion for hedging strategies. The subsequent analysis leads to a comparison of the risk (measured by the variance of the insurer’s loss) inherent in equity-linked contracts in the two situations where the insurer applies the risk-minimizing strategy and the insurer does not hedge. The paper includes numerical results that can be used to quantify the effect of hedging and describe how this effect varies with the size of the insurance portfolio and assumptions concerning the mortality.
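
A compact CRR sketch shows how the value and the initial stock holding of the pure endowment paying max(S_T, K) can be computed when mortality is independent of the index; scaling the purely financial hedge by the survival probability is in the spirit of risk-minimizing strategies but is a simplification of the paper's construction. All parameter values and the function name are illustrative.

```python
import numpy as np

def crr_pure_endowment(S0, K, r, sigma, T, n_steps, survival_prob):
    """Value and initial stock holding (delta) of an equity-linked pure
    endowment paying max(S_T, K) at T if the insured survives, in a
    Cox-Ross-Rubinstein binomial tree with independent mortality applied
    as a simple scaling factor (sketch only)."""
    dt = T / n_steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)                    # risk-neutral up probability
    j = np.arange(n_steps + 1)
    values = np.maximum(S0 * u**j * d**(n_steps - j), K)  # terminal payoff max(S_T, K)
    one_step = None
    for step in range(n_steps, 0, -1):                    # backward induction
        values = np.exp(-r * dt) * (q * values[1:] + (1 - q) * values[:-1])
        if step == 2:
            one_step = values.copy()                      # claim values after one step
    if one_step is None:                                  # one-step tree edge case
        one_step = np.maximum(S0 * np.array([d, u]), K)
    price = survival_prob * values[0]
    delta = survival_prob * (one_step[1] - one_step[0]) / (S0 * (u - d))
    return price, delta

print(crr_pure_endowment(S0=100.0, K=100.0, r=0.03, sigma=0.2, T=10.0,
                         n_steps=120, survival_prob=0.85))
```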
