Similar Documents — 20 results found
1.
In this paper, we incorporate the Bühlmann credibility into three mortality models (the Lee–Carter model, the Cairns–Blake–Dowd model, and a linear relational model) to improve their forecasting performance, as measured by the MAPE (mean absolute percentage error), using mortality data for the UK. The results show that the MAPE reduction ratios for the three mortality models with the Bühlmann credibility are all significant. More importantly, the MAPEs under the three mortality models with the Bühlmann credibility are very close to each other for each age and forecast year. Thus, by incorporating the Bühlmann credibility we are able to bring the forecasting MAPEs resulting from the three different mortality models down to a lower and more consistent level. Moreover, we provide a credibility interpretation with an individual time trend for age x and a group time trend for all ages. Finally, we apply the forecasted mortality rates both with and without the Bühlmann credibility to the net single premiums of life insurance products, and compare the corresponding MAPEs.
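As a rough illustration of the error metric this abstract relies on, here is a minimal MAPE computation; the mortality rates below are made-up numbers for illustration, not the paper's UK data:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Hypothetical observed vs. forecast mortality rates for one age
# over five forecast years (illustrative values only).
actual   = [0.0102, 0.0098, 0.0095, 0.0091, 0.0089]
forecast = [0.0100, 0.0099, 0.0093, 0.0092, 0.0087]
print(round(mape(actual, forecast), 2))  # → 1.69
```

A lower MAPE means the forecast tracks the observed rates more closely; the paper's claim is that adding Bühlmann credibility shrinks this number for all three mortality models.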

2.
Longevity risk arising from uncertain mortality improvement is one of the major risks facing annuity providers and pension funds. In this article, we show how applying trend models from non-life claims reserving to age-period-cohort mortality trends provides new insight in estimating mortality improvement and quantifying its uncertainty. Age, period and cohort trends are modelled with distinct effects for each age, calendar year and birth year in a generalised linear models framework. The effects are distinct in the sense that they are not conjoined with age coefficients; borrowing from regression terminology, we denote them as main effects. Mortality models in this framework for age-period, age-cohort and age-period-cohort effects are assessed using national population mortality data from Norway and Australia to show the relative significance of cohort effects as compared to period effects. Results are compared with the traditional Lee–Carter model. The bilinear period effect in the Lee–Carter model is shown to resemble a main cohort effect in these trend models. However, the approach avoids the limitations of the Lee–Carter model when forecasting with the age-cohort trend model.

3.
This article investigates the natural hedging strategy to deal with longevity risks for life insurance companies. We propose an immunization model that incorporates a stochastic mortality dynamic to calculate the optimal life insurance–annuity product mix ratio to hedge against longevity risks. We model the dynamic of the changes in future mortality using the well‐known Lee–Carter model and discuss the model risk issue by comparing the results between the Lee–Carter and Cairns–Blake–Dowd models. On the basis of the mortality experience and insurance products in the United States, we demonstrate that the proposed model can lead to an optimal product mix and effectively reduce longevity risks for life insurance companies.
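The immunization idea behind natural hedging can be illustrated with a one-factor sketch: choose the product weight that zeroes the portfolio's first-order sensitivity to a longevity shock. The sensitivities below are made-up numbers for illustration, not the paper's estimates:

```python
# Hypothetical changes in per-unit reserves under a proportional drop in
# mortality rates (a longevity shock): life insurance reserves fall,
# annuity reserves rise.
dV_ins = -0.8   # illustrative sensitivity of the insurance reserve
dV_ann = +1.5   # illustrative sensitivity of the annuity reserve

# Choose weight w on insurance so the mixed portfolio is insensitive:
#   w * dV_ins + (1 - w) * dV_ann = 0
w = dV_ann / (dV_ann - dV_ins)
print(round(w, 4))  # → 0.6522
```

With these numbers, roughly 65% of the reserve in insurance and 35% in annuities offsets the longevity exposure to first order; the paper's model additionally lets the mortality shock itself be stochastic (Lee–Carter or Cairns–Blake–Dowd).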

4.
We provide a self-contained analysis of a class of continuous-time stochastic mortality models that have gained popularity in the last few years. We describe some of their advantages and limitations, examining whether their features survive equivalent changes of measures. This is important when using the same model for both market-consistent valuation and risk management of life insurance liabilities. We provide a numerical example based on the calibration to the French annuity market of a risk-neutral version of the model proposed by Lee & Carter (1992).

5.
Quantitative Finance, 2013, 13(3): 163–172
Support vector machines (SVMs) are a new nonparametric tool for regression estimation. We use this tool to estimate the parameters of a GARCH model for predicting the conditional volatility of stock market returns. GARCH models are usually estimated using maximum likelihood (ML) procedures, assuming that the data are normally distributed. In this paper, we show that GARCH models can be estimated using SVMs and that such estimates have higher predictive ability than those obtained via common ML methods.
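The ML benchmark the SVM estimates are compared against can be sketched as follows. This is a generic Gaussian GARCH(1,1) fit on simulated returns, not the paper's implementation; the starting values and simulation parameters are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def garch11_nll(params, r):
    """Negative Gaussian log-likelihood of a GARCH(1,1):
    sigma2_t = w + a * r_{t-1}^2 + b * sigma2_{t-1}."""
    w, a, b = params
    if w <= 0 or a < 0 or b < 0 or a + b >= 1:
        return np.inf              # stay in the stationary region
    s2 = np.empty_like(r)
    s2[0] = r.var()                # initialise at the sample variance
    for t in range(1, len(r)):
        s2[t] = w + a * r[t - 1] ** 2 + b * s2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * s2) + r ** 2 / s2)

# Simulate returns from a known GARCH(1,1), then recover (w, a, b) by ML.
rng = np.random.default_rng(0)
w0, a0, b0, n = 0.05, 0.10, 0.85, 4000
r = np.empty(n)
s2 = w0 / (1 - a0 - b0)            # start at the unconditional variance
for t in range(n):
    r[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = w0 + a0 * r[t] ** 2 + b0 * s2

fit = minimize(garch11_nll, x0=[0.1, 0.05, 0.80], args=(r,),
               method="Nelder-Mead")
w_hat, a_hat, b_hat = fit.x
print(w_hat, a_hat, b_hat)         # ML estimates of (w, a, b)
```

The paper's point is that replacing this Gaussian-likelihood step with an SVM regression yields volatility forecasts that predict better out of sample.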

6.
Missing data is a problem that may be faced by actuaries when analysing mortality data. In this paper we deal with pension scheme data, where the future lifetime of each member is modelled by means of parametric survival models incorporating covariates, which may be missing for some individuals. Parameters are estimated by likelihood-based techniques. We analyse statistical issues, such as parameter identifiability, and propose an algorithm to handle the estimation task. Finally, we analyse the financial impact of including covariates maximally, compared with excluding parts of the mortality experience where data are missing; in particular we consider annuity factors and mis-estimation risk capital requirements.

7.
We develop and implement a technique for closed-form maximum likelihood estimation (MLE) of multifactor affine yield models. We derive closed-form approximations to likelihoods for nine Dai and Singleton (2000) affine models. Simulations show our technique very accurately approximates true (but infeasible) MLE. Using US Treasury data, we estimate nine affine yield models with different market price of risk specifications. MLE allows non-nested model comparison using likelihood ratio tests; the preferred model depends on the market price of risk. Estimation with simulated and real data suggests our technique is much closer to true MLE than Euler and quasi-maximum likelihood (QML) methods.

8.
This paper considers discrete time GARCH and continuous time SV models and uses these for American option pricing. We first show that with a particular choice of framework the parameters of the SV models can be estimated using simple maximum likelihood techniques. We then perform a Monte Carlo study to examine their differences in terms of option pricing, and we study the convergence of the discrete time option prices to their implied continuous time values. Finally, a large scale empirical analysis using individual stock options and options on an index is performed comparing the estimated prices from discrete time models to the corresponding continuous time model prices. The results show that, while the overall differences in performance are small, for in-the-money put options on individual stocks the continuous time SV models do generally perform better than the discrete time GARCH specifications.

9.
Point and interval estimation of future disability inception and recovery rates is predominantly carried out by combining generalized linear models with time series forecasting techniques into a two-step method involving parameter estimation from historical data and subsequent calibration of a time series model. This approach may lead to both conceptual and numerical problems since any time trend components of the model are incoherently treated as both model parameters and realizations of a stochastic process. We suggest that this general two-step approach can be improved in the following way: First, we assume a stochastic process form for the time trend component. The corresponding transition densities are then incorporated into the likelihood, and the model parameters are estimated using the Expectation-Maximization algorithm. We illustrate the modeling procedure by fitting the model to Swedish disability claims data.

10.
Multiple state functional disability models do not generally include systematic trend and uncertainty. We develop and estimate a multistate latent factor intensity model with transition and recovery rates depending on a stochastic frailty factor to capture trend and uncertainty. We estimate the model parameters using U.S. Health and Retirement Study data between 1998 and 2012 with a Monte Carlo maximum likelihood estimation method. The model shows significant reductions in disability and mortality rates during this period and allows us to quantify uncertainty in transition rates arising from the stochastic frailty factor. Recovery rates are very sensitive to the stochastic frailty. There is an increase in expected future lifetimes as well as an increase in future healthy life expectancy. The proportion of lifetime spent in disability on average remains stable, with no strong support in the data for either morbidity compression or expansion. The model has widespread application in costing of government-funded aged care and in pricing and risk management of long-term-care insurance products.

11.
Maximum likelihood estimation of stochastic volatility models
We develop and implement a method for maximum likelihood estimation in closed-form of stochastic volatility models. Using Monte Carlo simulations, we compare a full likelihood procedure, where an option price is inverted into the unobservable volatility state, to an approximate likelihood procedure where the volatility state is replaced by proxies based on the implied volatility of a short-dated at-the-money option. The approximation results in a small loss of accuracy relative to the standard errors due to sampling noise. We apply this method to market prices of index options for several stochastic volatility models, and compare the characteristics of the estimated models. The evidence for a general CEV model, which nests both the affine Heston model and a GARCH model, suggests that the elasticity of variance of volatility lies between that assumed by the two nested models.

12.
Dynamics of Trade-by-Trade Price Movements: Decomposition and Models
In this article we introduce a decomposition of the joint distribution of price changes of assets recorded trade-by-trade. Our decomposition means that we can model the dynamics of price changes using quite simple and interpretable models which are easily extended in a great number of directions, including using durations and volume as explanatory variables. Thus we provide an econometric basis for empirical work on market microstructure using time series of transaction data. We use maximum likelihood estimation and testing methods to assess the fit of the model to one year of IBM stock price data taken from the New York Stock Exchange.

13.
A pension buy-out is a special financial asset issued to offload pension liabilities holistically in exchange for an upfront premium. In this paper, we concentrate on the pricing of pension buy-outs under dependence between interest rate and mortality risks with an explicit correlation structure in a continuous-time framework. A change-of-measure technique is invoked to simplify the valuation. We also present how to obtain the buy-out price for a hypothetical benefit pension scheme using stochastic models to govern the dynamics of interest and mortality rates. Besides employing a non-mean-reverting specification of the Ornstein–Uhlenbeck process and a continuous version of the Lee–Carter setting for modeling mortality rates, we prefer the Vasicek and Cox–Ingersoll–Ross models for short rates. We provide numerical results under various scenarios, along with confidence intervals, using Monte Carlo simulations.

14.
The issue of identification arises whenever structural models are estimated. Lack of identification means that the empirical implications of some model parameters are either undetectable or indistinguishable from the implications of other parameters. Therefore, identifiability must be verified prior to estimation. This paper provides a simple method for conducting local identification analysis in linearized DSGE models, estimated in both full and limited information settings. In addition to establishing which parameters are locally identified and which are not, researchers can determine whether the identification failures are due to data limitations, such as lack of observations for some variables, or whether they are intrinsic to the structure of the model. The methodology is illustrated using a medium-scale DSGE model.

15.
This paper is concerned with modelling the behaviour of random sums over time. Such models are particularly useful to describe the dynamics of operational losses, and to correctly estimate tail-related risk indicators. However, time-varying dependence structures make it a difficult task. To tackle these issues, we formulate a new Markov-switching generalized additive compound process combining Poisson and generalized Pareto distributions. This flexible model takes into account two important features: on the one hand, we allow all parameters of the compound loss distribution to depend on economic covariates in a flexible way. On the other hand, we allow this dependence to vary over time, via a hidden state process. A simulation study indicates that, even in the case of a short time series, this model is easily and well estimated with a standard maximum likelihood procedure. Relying on this approach, we analyse a novel dataset of 819 losses resulting from frauds at the Italian bank UniCredit. We show that our model improves the estimation of the total loss distribution over time, compared to standard alternatives. In particular, this model provides estimates of the 99.9% quantile that are never exceeded by the historical total losses, a feature particularly desirable for banking regulators.
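The baseline compound process can be sketched as follows: a homogeneous Poisson frequency with generalized Pareto severities, without the Markov-switching or covariate layers the paper adds. All parameter values are illustrative, not estimates from the UniCredit data:

```python
import numpy as np

def simulate_total_losses(lam, xi, sigma, n_years, rng):
    """Total loss per year: Poisson(lam) claim counts with
    generalized Pareto (shape xi, scale sigma) severities."""
    counts = rng.poisson(lam, n_years)
    # GPD sampling by inverse CDF: X = (sigma / xi) * (U**(-xi) - 1)
    return np.array([np.sum((sigma / xi) * (rng.uniform(size=k) ** (-xi) - 1.0))
                     for k in counts])

rng = np.random.default_rng(42)
totals = simulate_total_losses(lam=40, xi=0.3, sigma=1.0,
                               n_years=50_000, rng=rng)
var_999 = np.quantile(totals, 0.999)  # the regulatory 99.9% quantile
print(round(float(var_999), 2))
```

The paper's contribution is to let lam, xi and sigma depend on covariates through additive smooths and on a hidden Markov state, so that the simulated 99.9% quantile tracks the changing loss environment.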

16.
Friction models are used to examine the market reaction to the simultaneous disclosure of earnings and dividends in a thin‐trading environment. Friction modelling, a procedure using maximum likelihood estimation, can be used to replace both the market model and restricted least‐squares regression in event studies where there are two quantifiable variables and a number of possible interaction effects associated with the news that constitutes the study's event. The results indicate that the dividend signal can be separated from the earnings signal.

17.
In credit scoring, survival analysis models have been widely applied to answer the question as to whether and when an applicant would default. In this paper, we propose a novel mixture cure proportional hazards model under competing risks. Most existing mixture cure models either do not consider competing risks or generally assume that a subpopulation of subjects is immune to any risk from all the competing risks. Compared with existing models, the proposed model is more flexible since it assumes that a subpopulation of subjects is immune to a subset of risks instead of being immune to all the risks. To estimate model parameters, we derive the likelihood function of the proposed model, based on which an expectation maximization estimation algorithm is developed. A simulation algorithm is designed to simulate time-to-event observations from the proposed model, and simulation studies are conducted to verify the proposed methodology. A real world example of credit scoring for online customer loans based on the proposed model is demonstrated.

18.
The aim of this article is to propose a new approach to the estimation of mortality rates based on two extended Milevsky and Promislov models: the first with colored excitations modeled by Gaussian linear filters, and the second with excitations modeled by a continuous non-Gaussian process. Exact analytical formulas for theoretical mortality rates based on Gaussian linear scalar filter models are derived. The theoretical values obtained in both cases are compared with theoretical mortality rates based on a classical Lee–Carter model and verified against empirical Polish mortality data. The obtained results confirm the usefulness of the switched model based on the continuous non-Gaussian process for modeling mortality rates.

19.
Extract

While in some linear estimation problems the principle of unbiasedness can be said to be appropriate, we have just seen that in the present context we will have to appeal to other criteria. Let us first consider what we get from the maximum likelihood method. We do not claim any particular optimum property for this estimate of the risk distribution: it seems plausible however that one can prove a large sample result analogous to the classical result on maximum likelihood estimation.

20.
In this paper, we focus on a Multi-dimensional Data Analysis approach to the Lee–Carter (LC) model of mortality trends. In particular, we extend the bilinear LC model and specify a new model based on a three-way structure, which incorporates a further component in the decomposition of the log-mortality rates. A multi-way component analysis is performed using the Tucker3 model. The suggested methodology allows us to obtain combined estimates for the three modes: (1) time, (2) age groups and (3) different populations. From the results obtained by the Tucker3 decomposition, we can jointly compare, in both a numerical and graphical way, the relationships among all three modes and obtain a time-series component as a leading indicator of the mortality trend for a group of populations. Further, we carry out a correlation analysis of the estimated trends in order to assess the reliability of the results of the three-way decomposition. The model's goodness of fit is assessed using an analysis of the residuals. Finally, we discuss how the synthesised mortality index can be used to build concise projected life tables for a group of populations. An application which compares 10 European countries is used to illustrate the approach and provide deeper insight into the model and its implementation.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司), 京ICP备09084417号