Similar Literature
20 similar documents found.
1.
Longevity risk arising from uncertain mortality improvement is one of the major risks facing annuity providers and pension funds. In this article, we show how applying trend models from non-life claims reserving to age-period-cohort mortality trends provides new insight into estimating mortality improvement and quantifying its uncertainty. Age, period and cohort trends are modelled with distinct effects for each age, calendar year and birth year in a generalised linear models framework. The effects are distinct in the sense that they are not conjoined with age coefficients; borrowing from regression terminology, we denote them as main effects. Mortality models in this framework for age-period, age-cohort and age-period-cohort effects are assessed using national population mortality data from Norway and Australia to show the relative significance of cohort effects as compared to period effects. Results are compared with the traditional Lee–Carter model. The bilinear period effect in the Lee–Carter model is shown to resemble a main cohort effect in these trend models. However, the approach avoids the limitations of the Lee–Carter model when forecasting with the age-cohort trend model.
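For orientation, a generic age-period-cohort model with main effects only (a standard textbook formulation, not necessarily the paper's exact specification) can be written as

$$\log \mu_{x,t} = \alpha_x + \kappa_t + \gamma_{t-x},$$

where $\alpha_x$ is the age effect, $\kappa_t$ the period (calendar-year) effect and $\gamma_{t-x}$ the cohort (birth-year) effect. The Lee–Carter model, by contrast, uses the bilinear period term $\beta_x \kappa_t$.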

2.
We review a number of multi-population mortality models: variations of the Li & Lee model, and the common-age-effect (CAE) model of Kleinow. Model parameters are estimated using maximum likelihood. Although this introduces some challenging identifiability problems and complicates the estimation process, it allows a fair comparison of the different models. We propose to solve these identifiability problems by applying two-dimensional constraints over the parameters. Using data from six countries, we compare and rank, both visually and numerically, the models' fitting qualities and develop forecasting models that produce non-diverging, joint mortality rate scenarios. It is found that the CAE model fits best. But we also find that the Li & Lee model potentially suffers from robustness problems when calibrated using maximum likelihood.
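As a rough guide, the two model families compared here are usually written as follows (standard forms, which may differ in detail from the parameterisations estimated in the paper). The Li & Lee model for population $i$ is

$$\log m_{x,t,i} = A_x + B_x K_t + \alpha_{x,i} + \beta_{x,i}\,\kappa_{t,i},$$

with a common trend $B_x K_t$ shared by all populations, while the common-age-effect model of Kleinow imposes the same age sensitivity $\beta_x$ across populations:

$$\log m_{x,t,i} = \alpha_{x,i} + \beta_x\,\kappa_{t,i}.$$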

3.
Renshaw and Verrall specified the generalized linear model (GLM) underlying the chain-ladder technique and suggested some other GLMs which might be useful in claims reserving. The purpose of this paper is to construct bounds for the discounted loss reserve within the framework of GLMs. Exact calculation of the distribution of the total reserve is not feasible, and hence the determination of lower and upper bounds with a simpler structure is a possible way out. The paper ends with numerical examples illustrating the usefulness of the presented approximations.
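For context, the over-dispersed Poisson GLM that reproduces the chain-ladder reserve estimates (a standard formulation; the paper considers further members of this family) models incremental claims $C_{ij}$ in origin year $i$ and development year $j$ as

$$E[C_{ij}] = m_{ij}, \qquad \log m_{ij} = c + \alpha_i + \beta_j, \qquad \operatorname{Var}[C_{ij}] = \phi\, m_{ij}.$$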

4.
5.
Most mortality models proposed in recent literature rely on the standard ARIMA framework (in particular, a random walk with drift) to project mortality rates. As a result, the projections are highly sensitive to the calibration period. We therefore analyse the impact of allowing for multiple structural changes on a large collection of mortality models. We find that this may lead to more robust projections for the period effect, but that there is only a limited effect on the ranking of the models based on backtesting criteria, since there is often not yet sufficient statistical evidence for structural changes. However, there are cases for which we do find improvements in the estimates, and we therefore conclude that one should not rule out beforehand the possibility that structural changes have occurred.
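To illustrate the kind of time-series specification involved (a generic formulation, not the paper's exact model), a random walk with drift for the period effect $\kappa_t$,

$$\kappa_t = \kappa_{t-1} + \theta + \varepsilon_t, \qquad \varepsilon_t \sim N(0,\sigma^2),$$

can be extended to allow for structural changes by letting the drift (and possibly the variance) differ between regimes, e.g. $\theta = \theta_j$ for calendar years in regime $j$, with the break dates estimated from the data.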

6.
A new market for so-called mortality derivatives is now appearing, with survivor swaps (also called mortality swaps), longevity bonds and other specialized solutions. The development of these new financial instruments is triggered by the increased focus on the systematic mortality risk inherent in life insurance contracts, and their main purpose is thus to allow life insurance companies to hedge their systematic mortality risk. At the same time, this new class of financial contracts is interesting from an investor's point of view, since it broadens the opportunities for investors to diversify their portfolios. The systematic mortality risk stems from the uncertainty related to the future development of mortality intensities. Mathematically, this uncertainty is described by modeling the underlying mortality intensities via stochastic processes. We consider two different portfolios of insured lives, where the underlying mortality intensities are correlated, and study the combined financial and mortality risk inherent in a portfolio of general life insurance contracts. In order to hedge this risk, we allow for investments in survivor swaps and derive risk-minimizing strategies in markets where such contracts are available. The strategies are evaluated numerically.
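To fix ideas, a generic doubly stochastic setup of this kind (the details may differ from the paper's model) describes the mortality intensity of portfolio $i \in \{1,2\}$ by a diffusion driven by correlated Brownian motions,

$$d\mu^{(i)}_t = a_i\big(t,\mu^{(i)}_t\big)\,dt + b_i\big(t,\mu^{(i)}_t\big)\,dW^{(i)}_t, \qquad d\langle W^{(1)}, W^{(2)}\rangle_t = \rho\,dt,$$

and the risk-minimizing strategies are then derived in a market where survivor swaps written on the two portfolios can be traded.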

7.
It is well established that annuities can fully diversify idiosyncratic mortality risks. However, survival rates at the cohort level are changing, which raises the question of what scope remains for annuities in the presence of aggregate mortality risk. In an overlapping generations setting, we show that risk-free annuities exist but offer a return below the (fair) certainty-equivalent return, and agents do not fully annuitize their savings. Higher aggregate mortality risk increases savings and thus the mean level of the capital stock. This lowers the mean rate of return on capital, the survival premium on annuities and the share of individual savings held in annuities.

8.
In the first part of the paper, we consider the wide range of extrapolative stochastic mortality models that have been proposed over the last 15–20 years. A number of models that we consider are framed in discrete time and place emphasis on the statistical aspects of modelling and forecasting. We discuss how these models can be evaluated, compared and contrasted. We also discuss a discrete-time market model that facilitates valuation of mortality-linked contracts with embedded options. We then review several approaches to modelling mortality in continuous time. These models tend to be simpler in nature, but make it possible to examine the potential for dynamic hedging of mortality risk. Finally, we review a range of financial instruments (traded and over-the-counter) that could be used to hedge mortality risk. Some of these, such as mortality swaps, already exist, while others anticipate future developments in the market.

9.
It is well known that the exponential dispersion family (EDF) of univariate distributions is closed under Bayesian revision in the presence of natural conjugate priors. However, this is not the case for the general multivariate EDF. This paper derives a second-order approximation to the posterior likelihood of a naturally conjugated generalised linear model (GLM), i.e., a multivariate EDF subject to a link function (Section 5.5). It is not the same as a normal approximation. It does, however, lead to second-order Bayes estimators of the parameters of the posterior. The family of second-order approximations is found to be closed under Bayesian revision. This generates a recursion for repeated Bayesian revision of the GLM as additional data are acquired. The recursion simplifies greatly for a canonical link. The resulting structure is easily extended to a filter for estimation of the parameters of a dynamic generalised linear model (DGLM) (Section 6.2). The Kalman filter emerges as a special case. A second type of link function, related to the canonical link and with similar properties, is identified; we refer to it here as the companion canonical link. For a given GLM with canonical link, the companion to that link generates a companion GLM (Section 4). The recursive form of the Bayesian revision of this GLM is also obtained (Section 5.5.3). There is a perfect parallel between the development of the GLM recursion and that of its companion; a dictionary for translation between the two is given so that one is readily derived from the other (Table 5.1). The companion canonical link also generates a companion DGLM, and a filter for it is obtained (Section 6.3). Section 1.2 indicates how the theory developed here might be applied to loss reserving. A sequel paper providing numerical illustrations of this is planned.
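For reference, the univariate exponential dispersion family referred to here has densities of the standard form

$$f(y;\theta,\phi) = \exp\!\left\{\frac{y\theta - b(\theta)}{a(\phi)} + c(y,\phi)\right\},$$

and the canonical link is the link function for which the linear predictor coincides with the canonical parameter $\theta$; the companion canonical link is defined relative to this construction in the paper's Section 4.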

10.
Density forecasts have become important in finance and play a key role in modern risk management. Using a flexible density forecast evaluation framework that extends the Berkowitz likelihood ratio test, this paper evaluates in- and out-of-sample density forecasts of daily returns on the DAX, ATX and S&P 500 stock market indices from models of financial returns that are currently widely used in the financial industry. The results indicate that GARCH-t models produce good in-sample forecasts. No model considered in this study delivers fully acceptable out-of-sample forecasts. The empirical findings emphasize that proper distributional assumptions combined with an adequate specification of relevant conditional higher moments are necessary to obtain good density forecasts.
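As an indication of the starting point of such an evaluation, the following sketch implements the basic Berkowitz likelihood-ratio test (the paper extends this framework; the function name and the assumption that `pits` holds the probability integral transforms of realized returns under the forecast densities are ours):

```python
# Hedged sketch of the Berkowitz (2001) LR test for density forecasts;
# a standard construction, not the paper's exact (extended) framework.
import numpy as np
from scipy import stats

def berkowitz_lr_test(pits):
    # Transform PITs to normals: under a correct density forecast,
    # z_t should be i.i.d. standard normal.
    z = stats.norm.ppf(np.clip(pits, 1e-10, 1 - 1e-10))
    z_lag, z_cur = z[:-1], z[1:]

    # Unrestricted Gaussian AR(1): z_t = c + rho * z_{t-1} + eps_t.
    rho, c = np.polyfit(z_lag, z_cur, 1)
    resid = z_cur - (c + rho * z_lag)          # OLS residuals (mean ~ 0)
    sigma = np.sqrt(resid.var())
    ll_unrestricted = stats.norm.logpdf(resid, scale=sigma).sum()

    # Restricted model: c = 0, rho = 0, sigma = 1.
    ll_restricted = stats.norm.logpdf(z_cur).sum()

    lr = 2.0 * (ll_unrestricted - ll_restricted)
    p_value = stats.chi2.sf(lr, df=3)          # three restrictions
    return lr, p_value
```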

11.
This paper uses generalized spectral tests to examine whether international stock index returns are predictable from the history of the series. Unlike many other testing procedures, the generalized spectral tests used in this paper are robust to distributional assumptions and the presence of time-varying volatility, and they allow for various forms of non-linear predictability. We find evidence of predictability in mean for over half of the international returns examined. In addition, we find most of the predictability to be non-linear in nature. The patterns of predictability are consistent with calendar effects and, in some cases, long-run dependence. Regardless of the implications of return predictability, this study is important because the generalized spectrum is defined over a range of frequencies (corresponding to cycles of two days and greater), and we can therefore examine at which frequencies predictability occurs. This provides insight into whether there exists short-run dependence, long-run dependence, or both.

12.
Missing data is a problem that may be faced by actuaries when analysing mortality data. In this paper we deal with pension scheme data, where the future lifetime of each member is modelled by means of parametric survival models incorporating covariates, which may be missing for some individuals. Parameters are estimated by likelihood-based techniques. We analyse statistical issues, such as parameter identifiability, and propose an algorithm to handle the estimation task. Finally, we analyse the financial impact of including covariates maximally, compared with excluding parts of the mortality experience where data are missing; in particular we consider annuity factors and mis-estimation risk capital requirements.
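For orientation (a generic specification, not necessarily the paper's parametric family), a proportional-hazards survival model with covariate vector $z_i$ for member $i$ might take the form

$$\mu_x(z_i) = \exp\!\big(\beta^{\top} z_i\big)\,\mu_x^{\text{base}},$$

and when some components of $z_i$ are unobserved, the likelihood contribution of member $i$ is obtained by summing or integrating over the possible values of the missing covariates, weighted by their estimated distribution.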

13.
We evaluate linear stochastic discount factor models using an ex-post portfolio metric: the realized out-of-sample Sharpe ratio of mean–variance portfolios backed by alternative linear factor models. Using a sample of monthly US portfolio returns spanning the period 1968–2016, we find evidence that multifactor linear models have better empirical properties than the CAPM, not only when the cross-section of expected returns is evaluated in-sample, but also when they are used to inform one-month-ahead portfolio selection. When we compare portfolios associated with multifactor models with mean–variance decisions implied by the single-factor CAPM, we document statistically significant differences in Sharpe ratios of up to 10 percent. Linear multifactor models that provide the best in-sample fit also yield the highest realized Sharpe ratios.
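The following sketch shows one way such an exercise can be set up (our illustration, not the authors' code): estimate betas by time-series regression over a rolling window, build model-implied moments, form mean–variance (tangency) weights and track the realized out-of-sample Sharpe ratio. The array names `returns` (test-asset excess returns) and `factors` (factor excess returns) are assumptions.

```python
# Hedged sketch: rolling factor-model mean-variance portfolios and their
# realized out-of-sample Sharpe ratio (monthly data assumed).
import numpy as np

def tangency_weights(mu, sigma):
    """Mean-variance weights, normalized to sum to one (scaling does not affect the Sharpe ratio)."""
    w = np.linalg.solve(sigma, mu)
    return w / w.sum()

def realized_oos_sharpe(returns, factors, window=120):
    oos = []
    for t in range(window, returns.shape[0]):
        R = returns[t - window:t]                       # estimation window: months t-window .. t-1
        F = factors[t - window:t]
        X = np.column_stack([np.ones(window), F])
        coef, *_ = np.linalg.lstsq(X, R, rcond=None)    # time-series regressions, all assets at once
        betas = coef[1:].T                              # (N, K) factor loadings
        mu = betas @ F.mean(axis=0)                     # model-implied expected excess returns
        resid = R - X @ coef
        fac_cov = np.atleast_2d(np.cov(F, rowvar=False))
        sigma = betas @ fac_cov @ betas.T + np.diag(resid.var(axis=0))  # factor-model covariance
        w = tangency_weights(mu, sigma)
        oos.append(w @ returns[t])                      # realized portfolio return in month t
    oos = np.asarray(oos)
    return np.sqrt(12.0) * oos.mean() / oos.std()       # annualized Sharpe ratio
```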

14.
This article focuses on drawing critical comparative conclusions about the application of both linear and non-linear risk measures in non-convex portfolio optimization problems. We jointly assess a set of sophisticated real-world non-convex investment policy limitations, such as cardinality constraints, buy-in thresholds, transaction costs and particular normative rules, within the frame of four popular portfolio selection cases: (a) the mean-variance model, (b) the mean-semivariance model, (c) the mean-MAD (mean absolute deviation) model and (d) the mean-semi-MAD model. In such circumstances, the portfolio selection process reduces to a mixed-integer bi-objective (or, in general, multi-objective) mathematical programme. We develop all corresponding modelling procedures in detail and then solve the underlying problem using a novel generalized algorithm introduced specifically to cope with the above-mentioned singularities. The validity of the approach is verified through empirical testing on the S&P 500 universe of securities. The technical conclusions obtained not only confirm certain findings of the limited existing theory but also shed light on computational issues and running times. Moreover, the results are encouraging, since a sufficient number of efficient (Pareto optimal) portfolios produced by the models appear to possess superior out-of-sample returns with respect to the benchmark.
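For reference, a textbook mean-MAD formulation with cardinality and buy-in constraints (one of several possible ways to write the models compared here, not necessarily the paper's exact formulation) reads

$$\min_{w,\,z}\ \frac{1}{T}\sum_{t=1}^{T}\Big|\sum_{i=1}^{N} w_i (r_{it}-\bar r_i)\Big| \quad \text{s.t.}\quad \sum_{i=1}^{N} w_i \bar r_i \ge \rho,\ \ \sum_{i=1}^{N} w_i = 1,\ \ \ell_i z_i \le w_i \le u_i z_i,\ \ \sum_{i=1}^{N} z_i \le K,\ \ z_i \in \{0,1\},$$

where the binary variables $z_i$ encode the cardinality limit (at most $K$ assets) and the buy-in thresholds $\ell_i$.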

15.
Once a pricing kernel is established, bond prices and all other interest rate claims can be computed. Alternatively, the pricing kernel can be deduced from observed prices of bonds and selected interest rate claims. Examples of the former approach include the celebrated Cox, Ingersoll, and Ross (1985b) model and the more recent model of Constantinides (1992). Examples of the latter include the Black, Derman, and Toy (1990) model and the Heath, Jarrow, and Morton (1992) paradigm (hereafter HJM). In general, these latter models are not Markov. Fortunately, when suitable restrictions are imposed on the class of volatility structures of forward rates, finite-state variable HJM models do emerge. This article provides a linkage between the finite-state variable HJM models, which use observables to induce a pricing kernel, and the alternative approach, which proceeds directly to pricing after a complete specification of a pricing kernel. Given such linkages, we are able to reveal explicitly the relationship between state-variable models, such as Cox, Ingersoll, and Ross, and the finite-state variable HJM models. In particular, our analysis identifies the unique map from the set of investor forecasts about future levels of the drift of the pricing kernel, together with the manner in which these forecasts are revised, to the shape of the term structure and its volatility. For an economy with square-root innovations, the exact mapping is made transparent.

16.
17.
In this paper, we develop a framework for evaluating the impact of conservative accounting on the structure of residual income models of equity valuation. We explore specific examples of both unconditional and conditional conservatism and observe a common mathematical structure. We proceed to generalise our model and identify the joint dependency of conservatism and the persistence of abnormal earnings on the weights attached to book values, earnings and dividends. We are able to show theoretically the likely numerical impact of conservatism on price-earnings ratios and under-valuations produced by residual income models. We investigate empirically the interaction between conservatism and persistence and find they accord well with the theory developed. We briefly discuss the implications of testing the effect of conservatism on valuation and linear information dynamics.
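As background, the residual income (abnormal earnings) valuation model underlying this discussion expresses equity value as book value plus discounted expected residual income (standard form; conservatism then alters the weights attached to book value, earnings and dividends):

$$V_t = B_t + \sum_{\tau=1}^{\infty} \frac{E_t\!\left[X_{t+\tau} - r\,B_{t+\tau-1}\right]}{(1+r)^{\tau}},$$

where $B_t$ is book value, $X_t$ earnings and $r$ the cost of equity capital.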

18.
A pension buy-out is a specialized financial contract issued to offload pension liabilities in their entirety in exchange for an upfront premium. In this paper, we concentrate on the pricing of pension buy-outs under dependence between interest rate and mortality risks, with an explicit correlation structure, in a continuous-time framework. A change-of-measure technique is invoked to simplify the valuation. We also show how to obtain the buy-out price for a hypothetical benefit pension scheme using stochastic models to govern the dynamics of interest and mortality rates. Besides employing a non-mean-reverting specification of the Ornstein–Uhlenbeck process and a continuous version of the Lee–Carter setting for modelling mortality rates, we prefer Vasicek and Cox–Ingersoll–Ross models for short rates. We provide numerical results under various scenarios, along with confidence intervals, using Monte Carlo simulations.
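For reference, the two short-rate models named here are standard (the paper's calibration and its correlation with the mortality process are its own):

$$\text{Vasicek:}\quad dr_t = a(b - r_t)\,dt + \sigma\,dW_t, \qquad \text{CIR:}\quad dr_t = a(b - r_t)\,dt + \sigma\sqrt{r_t}\,dW_t,$$

while a non-mean-reverting Ornstein–Uhlenbeck specification for the mortality intensity takes a form such as $d\mu_t = c\,\mu_t\,dt + \xi\,d\widetilde W_t$, with $dW_t\,d\widetilde W_t = \rho\,dt$ capturing the dependence between the two risks.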

19.
20.
Among the many strategies for financial trading, pairs trading has played an important role in both practical and academic settings. Loosely speaking, it is a statistical arbitrage strategy that identifies and exploits pricing inefficiencies between two financial assets with a long-term relationship. When a significant deviation from this equilibrium is observed, a profit may result. In this paper, we propose a pairs trading strategy based entirely on linear state space models designed for modelling the spread formed by a pair of assets. Once an adequate state space model for the spread is estimated, we use the Kalman filter to calculate conditional probabilities that the spread will return to its long-term mean. The strategy is activated when these conditional probabilities are large: the spread is bought or sold accordingly. Two applications with real data from the US and Brazilian markets are presented, and even though they rely on limited evidence, they already indicate that a very basic portfolio consisting of a single spread outperforms some of the main market benchmarks.
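A minimal sketch of the filtering step is given below (our illustration, not the authors' implementation): the spread is modelled as a noisy AR(1) deviation around a long-term mean, a scalar Kalman filter tracks the latent deviation, and the trading signal is the one-step-ahead probability that the spread moves back toward the mean. The parameters `mean`, `phi`, `q` and `r` are assumed to have been estimated beforehand, e.g. by maximum likelihood.

```python
# Hedged sketch: scalar Kalman filter for a mean-reverting spread and a
# simple mean-reversion probability used as a trading signal.
import numpy as np
from scipy import stats

def kalman_filter_spread(spread, mean, phi, q, r):
    """Filter x_t where spread_t = mean + x_t + eps_t, x_t = phi * x_{t-1} + eta_t."""
    n = len(spread)
    x_filt = np.zeros(n)                          # filtered state E[x_t | data up to t]
    p_filt = np.zeros(n)                          # filtered state variance
    x, p = 0.0, q / max(1.0 - phi**2, 1e-8)       # start at the stationary distribution
    for t in range(n):
        x_pred, p_pred = phi * x, phi**2 * p + q  # predict
        k = p_pred / (p_pred + r)                 # Kalman gain
        x = x_pred + k * (spread[t] - mean - x_pred)
        p = (1.0 - k) * p_pred
        x_filt[t], p_filt[t] = x, p
    return x_filt, p_filt

def prob_reversion(x_filt, p_filt, phi, q, r):
    """P(next spread deviation is smaller in magnitude than the current filtered one)."""
    m = phi * x_filt                              # one-step-ahead mean of the deviation
    s = np.sqrt(phi**2 * p_filt + q + r)          # one-step-ahead std of the observed deviation
    upper = stats.norm.cdf(np.abs(x_filt), loc=m, scale=s)
    lower = stats.norm.cdf(-np.abs(x_filt), loc=m, scale=s)
    return upper - lower                          # trade when this probability is large
```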
