Similar Documents
20 similar documents found.
1.
Abstract

This paper has been inspired by a very interesting article by Taylor (1979) in which he considered the effect of claims cost inflation on a compound Poisson risk process. The present paper divides naturally into two parts. In the first part we show, under very general conditions, that if claims costs are increasing and if the premiums are increasing at the same rate then ultimate ruin is certain for the risk process. In the second part we try to determine how fast the premiums should increase in order that ultimate ruin should not be certain for such a risk process.
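
As a purely illustrative sketch of the setting described in this abstract, the following Python snippet simulates a compound Poisson surplus process in which both claim sizes and the premium rate inflate at the same rate. The parameter values, the exponential claim distribution, and the inflation rate delta are assumptions made for illustration only, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ruin_probability(u=50.0, c=11.0, lam=1.0, mean_claim=10.0,
                     delta=0.05, horizon=100.0, n_paths=2000):
    """Monte Carlo estimate of the finite-horizon ruin probability for a
    compound Poisson surplus process in which claim sizes and the premium
    rate both inflate continuously at rate delta (illustrative values)."""
    ruined = 0
    for _ in range(n_paths):
        surplus, t = u, 0.0
        while t < horizon:
            w = rng.exponential(1.0 / lam)          # inter-claim waiting time
            # premium income over (t, t + w) with inflating rate c * e^{delta * s}
            surplus += c * (np.exp(delta * (t + w)) - np.exp(delta * t)) / delta
            t += w
            # claim size inflated to time t
            surplus -= rng.exponential(mean_claim) * np.exp(delta * t)
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

print(ruin_probability())
```

Increasing the horizon should push the estimated probability toward 1, in line with the paper's first result, even though the premium keeps pace with claims inflation.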

2.
Abstract

Introduction.

Livförsäkringsbolaget Framtiden, ömsesidigt (the Life Insurance Co. "The Future", Mutual), to the policyholders of which I have the honour of devoting my actuarial work, is a relatively new enterprise. It was founded in 1911. Its main business is industrial life insurance with monthly premiums. All industrial policies include the waiver of premiums on disablement by sickness or accident lasting at least four weeks. Disablement lasting more than four weeks involves the waiver of premiums from the beginning of the disablement. For industrial policies issued in recent years, the payment of premiums is limited to age 65. Besides ordinary endowment (or whole life) policies, many children's insurances are written, including some form of insurance of the breadwinner, at all events the waiver of premiums until the 20th birthday of the child if the supporter should die before that time, and for the periods during this time when the supporter is disabled by sickness or accident for at least four weeks. Attempts have been made to promote more protective forms of children's policies, but they have only partly succeeded, owing to lack of interest from agents and the public. Nevertheless, it has been possible to exclude pure children's tariffs, so that almost every policy issued contains an element of sickness insurance. In recent years this has held true also for the ordinary branch. The mass of experience collected is thus by no means unimportant, despite the smallness of the Company as compared with foreign companies. The experience is, moreover, collected during a relatively short period of time, giving a rather homogeneous material.

3.
Abstract

Two types of default risk are discussed in the article: the traditional "probability of ruin" (the insurer being unable to meet his obligations) and a "perceived probability of ruin" (the probability of the insured being affected by ruin). The explicit dependence of these probabilities on the actuarial loading factors of a mutual insurer is developed. The explicit mathematical formulae obtained for these complex relationships are accompanied by numerical results. A second concept presented in the paper relates to the idea of actuarially fair premiums. It is shown that the premium must also be a function of the payments of the other insureds as well as their claim distributions, thereby reflecting the simultaneity and mutual dependence of the insureds.

4.
Abstract

In a non-life insurance business an insurer often needs to build up a reserve to be able to meet his or her future obligations arising from incurred but not (completely) reported claims. To forecast these claims reserves, a simple but generally accepted algorithm is the classical chain-ladder method. Recent research has essentially focused on the underlying model for the claims reserves in order to arrive at appropriate bounds for the estimates of future claims reserves. Our research concentrates on scenarios with outlying data. On closer examination it is demonstrated that the forecasts for future claims reserves are heavily dependent on outlying observations. The paper focuses on two approaches to robustify the chain-ladder method: the first method detects and adjusts the outlying values, whereas the second method is based on a robust generalized linear model technique. In this way insurers will be able to find a reserve that is similar to the reserve they would have found had the data contained no outliers. Because the robust method flags the outliers, these observations can be singled out for further examination. The corresponding standard errors are obtained with the bootstrap technique. The robust chain-ladder method is applied to several run-off triangles with and without outliers, showing its excellent performance.
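
For readers unfamiliar with the classical chain-ladder method mentioned above, the following minimal Python sketch computes volume-weighted development factors and completes a small run-off triangle. The triangle values are invented for illustration, and the code shows only the classical (non-robust) algorithm, not the robust variants proposed in the paper.

```python
import numpy as np

# Illustrative cumulative run-off triangle (rows = accident years,
# columns = development years); NaN marks future, unobserved cells.
tri = np.array([
    [100.0, 160.0, 190.0, 200.0],
    [110.0, 170.0, 205.0, np.nan],
    [120.0, 185.0, np.nan, np.nan],
    [130.0, np.nan, np.nan, np.nan],
])

n = tri.shape[1]
factors = []
for j in range(n - 1):
    obs = ~np.isnan(tri[:, j + 1])               # accident years observed in column j+1
    factors.append(tri[obs, j + 1].sum() / tri[obs, j].sum())

# Project the unobserved cells column by column with the development factors.
proj = tri.copy()
for j in range(n - 1):
    missing = np.isnan(proj[:, j + 1])
    proj[missing, j + 1] = proj[missing, j] * factors[j]

ultimates = proj[:, -1]
reserves = ultimates - np.array([row[~np.isnan(row)][-1] for row in tri])
print("development factors:", np.round(factors, 3))
print("estimated reserves per accident year:", np.round(reserves, 1))
```

Because every cell of the triangle feeds into the development factors, a single outlying entry shifts all projected reserves, which is the sensitivity the robust variants are designed to tame.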

5.
Abstract

Consider a portfolio containing a number of risk classes. Each class has its own demand function, which determines the number of insureds in the class as a function of the premium. The insurer determines the premiums based on the number of insureds in each class. The "market" reacts by updating the number of policyholders, then the insurer updates the premiums, and so on. We show that this process has an equilibrium point, and we then characterize this point.
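
The premium-demand feedback described above can be pictured as a fixed-point iteration. The sketch below is a hypothetical one-class illustration with an invented linear demand curve and a premium rule that loads a fixed cost over the pool; none of the functional forms come from the paper.

```python
def demand(premium):
    """Hypothetical demand curve: number of insureds falls linearly in the premium."""
    return max(0.0, 1000.0 - 2.0 * premium)

def premium_rule(n_insureds):
    """Hypothetical premium rule: expected claim cost plus a fixed cost spread over the pool."""
    return 100.0 + 20000.0 / max(n_insureds, 1.0)

# Alternate between the market's reaction and the insurer's premium update
# until the premium stops moving, i.e. an (approximate) equilibrium point.
premium = 150.0
for _ in range(100):
    n = demand(premium)
    new_premium = premium_rule(n)
    if abs(new_premium - premium) < 1e-8:
        break
    premium = new_premium

print(f"equilibrium premium = {premium:.2f}, insureds = {demand(premium):.0f}")
```

With these invented parameters the iteration settles near a premium of about 127 and roughly 746 insureds; the paper's contribution is proving existence of such an equilibrium and characterizing it in the multi-class case.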

6.
Abstract

The conventional approach to evolutionary credibility theory assumes a linear state-space model for the longitudinal claims data, so that Kalman filters can be used to estimate the claims' expected values, which are assumed to form an autoregressive time series. We propose a class of linear mixed models as an alternative to linear state-space models for evolutionary credibility and show that their predictive performance is comparable to that of the Kalman filter when the claims are generated by a linear state-space model. More importantly, this approach can be readily extended to generalized linear mixed models for the longitudinal claims data. We illustrate its applications by addressing the "excess zeros" issue, namely that a substantial fraction of policies has no claims at various times in the period under consideration.
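
To make the state-space baseline mentioned above concrete, here is a minimal scalar Kalman filter for a latent AR(1) claims mean observed with noise. It is a generic textbook filter under assumed parameter values (phi, state and observation variances), not the specific model, data, or mixed-model alternative of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed AR(1) state-space model: theta_t = phi * theta_{t-1} + eta_t,
# observed claims y_t = theta_t + eps_t (all parameters illustrative).
phi, q, r, n = 0.9, 0.5, 2.0, 50
theta = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    theta[t] = phi * theta[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = theta[t] + rng.normal(0, np.sqrt(r))

# Scalar Kalman filter: predict, then update with the Kalman gain.
m, p = 0.0, 1.0            # filtered mean and variance of theta_0
filtered = []
for t in range(1, n):
    m_pred, p_pred = phi * m, phi**2 * p + q        # prediction step
    k = p_pred / (p_pred + r)                       # Kalman gain
    m = m_pred + k * (y[t] - m_pred)                # update with observation
    p = (1 - k) * p_pred
    filtered.append(m)

print("last filtered estimate vs true state:",
      round(filtered[-1], 3), round(theta[-1], 3))
```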

7.
Abstract

Adult polycystic kidney disease (APKD) is a single-gene autosomal dominant genetic disorder leading to end-stage renal disease (ESRD, meaning kidney failure). It is associated with mutations in at least two genes, APKD1 and APKD2, but diagnosis is mostly by ultrasonography. We propose a model for critical illness (CI) insurance and estimate rates of onset of ESRD from APKD using two studies. Other events leading to claims under CI policies are included in the model, which we use to study (a) extra premiums under CI policies if the presence of an APKD mutation is known, and (b) the possible costs arising from adverse selection if this information is unavailable to insurers. The extra premiums are typically very high, but because APKD is rare, the possible cost of adverse selection is low. However, APKD is just one of a significant number of single-gene disorders, and this benign conclusion cannot be assumed to apply to all genetic disorders taken together. Moreover, ignoring known genetic risks in underwriting sets a precedent that could have unintended consequences for the underwriting of nongenetic risks of similar magnitude.

8.
Abstract

In almost all stochastic claims reserving models one assumes that accident years are independent. In practice this assumption is violated most of the time: typical examples are claims inflation and accounting year effects that influence all accident years simultaneously. We study a Bayesian chain-ladder model that allows for the modeling of accounting (calendar) year effects. A case study of a general liability dataset shows that such accounting year effects contribute substantially to the prediction uncertainty and therefore need careful treatment within a risk management and solvency framework.

9.
Abstract

We present an unsupervised learning method for classifying consumer insurance claims according to their suspiciousness of fraud versus nonfraud. The predictor variables contained within a claim file that are used in this analysis can be binary, ordinal categorical, or continuous variates. They are constructed such that the ordinal position of the response to the predictor variable bears a monotonic relationship with the fraud suspicion of the claim. Thus, although no individual variable is of itself assumed to be determinative of fraud, each of the individual variables gives a "hint" or indication as to the suspiciousness of fraud for the overall claim file. The presented method statistically concatenates the totality of these "hints" to make an overall assessment of the ranking of fraud risk for the claim files without using any a priori fraud-classified or -labeled subset of data. We first present a scoring method for the predictor variables that puts all the variables (whether binary "red flag" indicators, ordinal categorical variables with different categories of possible response values, or continuous variables) onto a common -1 to 1 scale for comparison and further use. This allows us to aggregate variables with disparate numbers of potential values. We next show how to concatenate the individual variables and obtain a measure of variable worth for fraud detection, and then how to obtain an overall holistic claim file suspicion value capable of being used to rank the claim files for determining which claims to pay and the order in which to investigate claims further for fraud. The proposed method provides three useful outputs not usually available with other unsupervised methods: (1) an ordinal measure of overall claim file fraud suspicion level, (2) a measure of the importance of each individual predictor variable in determining the overall suspicion levels of claims, and (3) a classification function capable of being applied to existing claims as well as new incoming claims. The overall claim file score is also available to be correlated with exogenous variables such as claimant demographics or high-volume physician or lawyer involvement. We illustrate that the incorporation of continuous variables in their continuous form helps classification and that the method has internal and external validity via empirical analysis of real data sets. A detailed application to automobile bodily injury fraud detection is presented.
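
The core scoring idea, mapping heterogeneous indicators onto a common -1 to 1 scale and then aggregating them into one suspicion score per claim file, can be sketched as follows. The rank-based rescaling and the simple equal-weight averaging used here are illustrative assumptions, not the paper's exact concatenation procedure.

```python
import numpy as np

def to_common_scale(x):
    """Map a predictor (binary, ordinal, or continuous) onto [-1, 1]
    via its ranks, so that larger values mean 'more suspicious'."""
    x = np.asarray(x, dtype=float)
    ranks = x.argsort().argsort().astype(float)   # 0 .. n-1; ties handled arbitrarily for simplicity
    return 2.0 * ranks / (len(x) - 1) - 1.0

# Invented claim-file data: a binary red flag, an ordinal indicator (0-3),
# and a continuous variable (e.g. days until the claim was reported).
red_flag   = np.array([0, 1, 0, 1, 1, 0])
ordinal    = np.array([1, 3, 0, 2, 3, 1])
delay_days = np.array([2.0, 40.0, 1.0, 15.0, 60.0, 5.0])

scores = np.column_stack([to_common_scale(v) for v in (red_flag, ordinal, delay_days)])
suspicion = scores.mean(axis=1)               # naive equal-weight aggregation
ranking = np.argsort(-suspicion)              # most suspicious claim files first

print("suspicion scores:", np.round(suspicion, 2))
print("claims ordered for investigation:", ranking)
```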

10.
Abstract

Statistical extreme value theory provides a flexible and theoretically well-motivated approach to the study of large losses in insurance. We give a brief review of the modern version of this theory and a "step by step" example of how to use it in large-claims insurance. The discussion is based on a detailed investigation of a wind storm insurance problem. New results include a simulation study of estimators in the peaks-over-thresholds method with Generalised Pareto excesses, a discussion of Pareto and lognormal modelling and of methods to detect trends. Further results concern the use of meteorological information in wind storm insurance and, of course, the results of the study of the wind storm claims.
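
A minimal peaks-over-thresholds fit along the lines reviewed above might look like the following. The simulated "losses", the threshold choice, and the use of scipy's generalised Pareto fit are illustrative assumptions rather than the paper's wind storm analysis.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)

# Illustrative heavy-tailed loss sample standing in for storm claims.
losses = rng.pareto(2.5, size=5000) * 10.0

# Peaks over thresholds: keep exceedances above a high threshold.
u = np.quantile(losses, 0.95)
excesses = losses[losses > u] - u

# Fit a Generalised Pareto distribution to the excesses (location fixed at 0).
shape, loc, scale = genpareto.fit(excesses, floc=0)

# Tail estimate: P(X > x) is roughly p_u * (1 - GPD(x - u)) for x above the threshold.
p_u = (losses > u).mean()
x = u + 50.0
tail_prob = p_u * genpareto.sf(x - u, shape, loc=loc, scale=scale)

print(f"threshold u = {u:.1f}, fitted shape = {shape:.2f}, scale = {scale:.2f}")
print(f"estimated P(loss > {x:.1f}) = {tail_prob:.4f}")
```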

11.
Abstract

Empirical studies suggest that many insurance companies recontract with their clients on premiums by extrapolating past losses: a client is offered a decrease in premium if the monetary amounts of his claims do not exceed some prespecified quantities, and otherwise an increase in premium. In this paper, we formalize these empirical findings and investigate optimal reinsurance problems of a risk-averse insurer by introducing a loss-dependent premium principle, which uses a weighted average of historical losses and the expectation of future losses in place of the expectation in the expected premium principle. This premium principle satisfies the bonus-malus property and smooths the insurer's wealth. Explicit expressions for the optimal reinsurance strategies and value functions are derived. If the reinsurer applies the loss-dependent premium principle to continuously adjust his premium, we show that the insurer always needs less reinsurance when he also adopts this premium principle than when he adopts the expected premium principle.
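
As a rough reading of the premium principle described above, the following small Python function blends historical losses with the expected future loss and applies a proportional loading. The weight w, the loading theta, and the exact blending rule are illustrative assumptions; the paper's precise functional form may differ.

```python
def loss_dependent_premium(history, expected_loss, w=0.3, theta=0.2):
    """Hedged sketch of a loss-dependent premium: a weighted average of the
    observed historical losses and the expected future loss, with a
    proportional safety loading theta (all parameters illustrative)."""
    historical_mean = sum(history) / len(history)
    base = w * historical_mean + (1.0 - w) * expected_loss
    return (1.0 + theta) * base

# Example: a client with a benign claim history pays less than the pure
# expected-premium-principle price (1 + theta) * expected_loss.
print(loss_dependent_premium(history=[40.0, 55.0, 30.0], expected_loss=60.0))
print((1 + 0.2) * 60.0)
```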

12.
Abstract

This paper explores alternative methods for computing earnings per share (EPS) for a company whose capital structure consists of ordinary shares and warrants. The methods for computing EPS identified by the FASB (1996) are critically evaluated, and an alternative measure, the holding period approach, is developed within the framework of contingent claims analysis. Two types of errors are shown to characterize the accounting measures of EPS. One arises from the failure of accounting measures to fully recognize the contingent nature of the warrant. The other arises from the practice of not recognizing instances of anti-dilution. A further factor is the treatment of any difference between the proceeds from the issue of the warrants and their fair value at that time; this is ignored in existing measures and yet may have a significant effect on the value of the claims of ordinary shareholders on the company's earnings. Using a simulation method it is shown that the imputed earnings method of computing EPS is a very close approximation to the holding period method and is considerably more accurate than the treasury stock measures favoured by accounting standards bodies.
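
For context, the treasury stock method referred to above computes diluted EPS by assuming warrant proceeds are used to buy back shares at the average market price. The sketch below implements that textbook calculation with invented figures; it is not the paper's holding period or imputed earnings method.

```python
def treasury_stock_diluted_eps(earnings, shares, warrants, strike, avg_price):
    """Textbook treasury stock method: warrants are assumed exercised and the
    proceeds used to repurchase shares at the average market price, so only
    the incremental shares dilute EPS (no dilution if out of the money)."""
    if avg_price <= strike:
        return earnings / shares          # out-of-the-money warrants: no dilution
    incremental = warrants * (avg_price - strike) / avg_price
    return earnings / (shares + incremental)

# Invented example: earnings of 1m, 10m shares, 1m warrants with a strike
# of 4 while the average market price over the period is 5.
print(round(treasury_stock_diluted_eps(1_000_000, 10_000_000,
                                       1_000_000, 4.0, 5.0), 4))
```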

13.
Abstract

This paper uses fuzzy set theory (FST) to solve a problem in actuarial science, the financial pricing of property-liability insurance contracts. The fundamental concept of FST is the alternative formalization of membership in a set to include the degree or strength of membership. FST provides consistent mathematical rules for incorporating vague, subjective, or judgmental information into complex decision processes. It is potentially important in insurance pricing because much of the information about cash flows, future economic conditions, risk premiums, and other factors affecting the pricing decision is subjective and thus difficult to quantify by using conventional methods. To illustrate the use of FST, we "fuzzify" a well-known insurance financial pricing model, provide numerical examples of fuzzy pricing, and propose rules for project decision-making using FST. The results indicate that FST can lead to significantly different decisions than the conventional approach.
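
The fundamental FST notion of graded membership can be illustrated with a simple triangular membership function. The "acceptable profit loading" example below and its breakpoints are invented for illustration and are not the pricing model fuzzified in the paper.

```python
def triangular_membership(x, a, b, c):
    """Degree of membership in a fuzzy set with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy set "acceptable profit loading": fully acceptable at 5%,
# not acceptable at all below 2% or above 10%.
for loading in (0.02, 0.04, 0.05, 0.08, 0.10):
    mu = triangular_membership(loading, a=0.02, b=0.05, c=0.10)
    print(f"loading {loading:.0%}: membership {mu:.2f}")
```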

14.
Abstract

In classical risk theory, stationary premium and claim processes are often considered. In some cases it is more convenient to model non-stationary processes which describe a movement from the environmental conditions for which the premiums were calculated to less favorable circumstances. This is done by a Markov-modulated Poisson claim process. Moreover, the insurance company is allowed to stop the process at some random time, if the situation seems unfavorable, in order to calculate new premiums. This leads to an optimal stopping problem which is solved explicitly to some extent.

15.
Abstract

An insurance company can be considered as an adjustment institution for the policyholders. The individual risks of the policyholders are taken over by the company at the price of a comparatively small stake, the premium. This is calculated so that the premiums from all the policyholders will, according to statistical experience, on average cover the company's payments for claims. With respect to unfavourable random deviations from the average, the premiums contain security loadings. For the same purpose the company also takes other precautions, the most important of which are reinsurance and the building up of adjustment funds. On the other hand, extensive precautions increase the price of the insurance. Therefore the objective fixing of the precautions, in order to obtain satisfactory solidity as well as a reasonable price, constitutes a weighing problem that demands a measure of the effect of the various precautions.

16.

This paper derives two-sided bounds for tails of compound negative binomial distributions, both in the exponential and heavy-tailed cases. Two approaches are employed to derive the two-sided bounds in the case of exponential tails. One is the convolution technique, as in Willmot & Lin (1997). The other is based on an identity of compound negative binomial distributions; they can be represented as a compound Poisson distribution with a compound logarithmic distribution as the underlying claims distribution. This connection between the compound negative binomial, Poisson and logarithmic distributions results in two-sided bounds for the tails of the compound negative binomial distribution, which also generalize and improve a result of Willmot & Lin (1997). For the heavy-tailed case, we use the method developed by Cai & Garrido (1999b). In addition, we give two-sided bounds for stop-loss premiums of compound negative binomial distributions. Furthermore, we derive bounds for the stop-loss premiums of general compound distributions among the classes of HNBUE and HNWUE.
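
The compound representation mentioned above, a negative binomial claim count as a Poisson-stopped sum of logarithmically distributed counts, can be stated via probability generating functions. The display below is the standard identity in one common parametrisation and is quoted as background, not taken from the paper.

```latex
% PGF identity behind the compound representation (standard parametrisation):
% N ~ NegBin(r, p) equals, in distribution, a Poisson(\lambda) sum of i.i.d.
% Logarithmic(1 - p) counts with \lambda = -r \ln p.
\[
  P_N(z) = \left(\frac{p}{1-(1-p)z}\right)^{r}
         = \exp\!\bigl(\lambda\,(Q(z)-1)\bigr),
  \qquad
  Q(z) = \frac{\ln\bigl(1-(1-p)z\bigr)}{\ln p},
  \qquad
  \lambda = -r\ln p .
\]
```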

17.
Abstract

The reverse mortgage market has been expanding rapidly in developed economies in recent years. The onset of demographic transition places a rapidly rising number of households in an age window in which reverse mortgages have potential appeal. Increasing prices for residential real estate over the last decade have further stimulated interest.

Reverse mortgages involve various risks from the provider's perspective that may hinder the further development of these financial products. This paper addresses one method of transferring and financing the risks associated with these products through the form of securitization. Securitization is becoming a popular and attractive alternative form of risk transfer of insurance liabilities. Here we demonstrate how to construct a securitization structure for reverse mortgages similar to the one applied in traditional insurance products.

Specifically, we investigate the merits of developing survivor bonds and survivor swaps for reverse mortgage products. In the case of survivor bonds, for example, we are able to compute premiums, both analytically and numerically through simulations, and to examine how the longevity risk may be transferred to financial investors. Our numerical calculations provide an indication of the economic benefits derived from developing survivor bonds to securitize the "longevity risk component" of reverse mortgage products. Moreover, a sensitivity analysis of these economic benefits indicates that these survivor bonds are a promising tool for investment diversification.
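
A bare-bones picture of how a survivor bond's coupons depend on cohort survival is sketched below: coupons are proportional to the surviving fraction of a reference cohort and discounted at a flat rate. The constant-force mortality assumption, the discount rate, and the coupon schedule are all illustrative and are not the paper's pricing model.

```python
import numpy as np

# Illustrative flat mortality force and interest rate (assumptions only).
mu, rate, coupon, years = 0.04, 0.03, 100.0, 20

t = np.arange(1, years + 1)
survival = np.exp(-mu * t)           # fraction of the reference cohort still alive
discount = np.exp(-rate * t)         # flat-rate discount factors

# Survivor bond value: coupons scale with cohort survival, so investors
# absorb part of the longevity risk faced by the reverse mortgage provider.
price = float(np.sum(coupon * survival * discount))
print(f"illustrative survivor bond price: {price:.2f}")
```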

18.
Abstract

The aim of this paper is to analyse two functions that are of general interest in collective risk theory, namely F, the distribution function of the total amount of claims, and Π, the stop-loss premium. Section 2 presents certain basic formulae. Sections 17-18 present five claim distributions. Corresponding to these claim distributions, the functions F and Π were calculated under various assumptions as to the distribution of the number of claims. These calculations were performed on an electronic computer, and the numerical method used for this purpose is presented in sections 9, 19 and 20 under the name of the C-method, which has the advantage of furnishing upper and lower limits of the quantities under estimation. The means of these limits, in the following regarded as the "exact" results, are given in Tables 4-20. Sections 11-16 present certain approximation methods. The N-method of section 11 is an Edgeworth expansion, while the G-method given in section 12 is an approximation by a Pearson type III curve. The methods presented in sections 13-16, denoted A1-A4, are all applications and modifications of the Esscher method. These approximation methods have been applied for the calculation of F and Π in the cases mentioned above in which "exact" results were obtained. The results are given in Tables 4-20. The object of this investigation was to obtain information as to the precision of the approximation methods in question, and to compare their relative merits. These results are discussed in sections 21-24.
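
The N-method is described as an Edgeworth expansion; as background, a standard first-order Edgeworth approximation to the distribution of the standardized total claim amount reads as follows. This is the generic textbook form, not necessarily the exact expansion order used in the paper.

```latex
% First-order Edgeworth approximation for the total claims S with mean \mu,
% standard deviation \sigma and skewness \gamma_1:
\[
  F_S(x) \;\approx\; \Phi(z) \;-\; \frac{\gamma_1}{6}\,(z^{2}-1)\,\varphi(z),
  \qquad z = \frac{x-\mu}{\sigma},
\]
% where \Phi and \varphi denote the standard normal distribution and density.
```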

19.
Abstract

Credibility is a form of insurance pricing that is widely used, particularly in North America. The theory of credibility has been called a "cornerstone" in the field of actuarial science. Students of the North American actuarial bodies also study loss distributions and the statistical inference involved in relating a set of data to a theoretical (loss) distribution. In this work, we develop a direct link between credibility and loss distributions through the notion of a copula, a tool for understanding relationships among multivariate outcomes.

This paper develops credibility using a longitudinal data framework. In a longitudinal data framework, one might encounter data from a cross section of risk classes (towns) with a history of insurance claims available for each risk class. For the marginal claims distributions, we use generalized linear models, an extension of linear regression that also encompasses Weibull and Gamma regressions. Copulas are used to model the dependencies over time; specifically, this paper is the first to propose using a t-copula in the context of generalized linear models. The t-copula is the copula associated with the multivariate t-distribution; like the univariate t-distributions, it seems especially suitable for empirical work. Moreover, we show that the t-copula gives rise to easily computable predictive distributions that we use to generate credibility predictors. Like Bayesian methods, our copula credibility prediction methods allow us to provide an entire distribution of predicted claims, not just a point prediction.

We present an illustrative example of Massachusetts automobile claims, and compare our new credibility estimates with those currently existing in the literature.
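
To show what modelling dependence over time with a t-copula means operationally, the sketch below simulates correlated uniforms from a t-copula and turns them into Gamma-distributed claims for one risk class over several periods. The degrees of freedom, the exchangeable correlation, and the Gamma margins are invented for illustration and are not the Massachusetts calibration.

```python
import numpy as np
from scipy.stats import t as student_t, gamma

rng = np.random.default_rng(3)

def t_copula_sample(n, corr, df):
    """Draw n samples of uniforms whose dependence is a t-copula with
    correlation matrix corr and df degrees of freedom."""
    d = corr.shape[0]
    z = rng.multivariate_normal(np.zeros(d), corr, size=n)
    chi = rng.chisquare(df, size=(n, 1))
    x = z / np.sqrt(chi / df)                 # multivariate t draws
    return student_t.cdf(x, df)               # marginal transform to uniforms

# Five observation periods with exchangeable correlation 0.5, df = 4.
periods, rho, df = 5, 0.5, 4
corr = np.full((periods, periods), rho) + (1 - rho) * np.eye(periods)
u = t_copula_sample(10_000, corr, df)

# Gamma marginal claims per period (illustrative shape and scale).
claims = gamma.ppf(u, a=2.0, scale=500.0)
print("empirical correlation of claims between periods 1 and 2:",
      round(np.corrcoef(claims[:, 0], claims[:, 1])[0, 1], 2))
```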

20.
This paper extends basic results on arbitrage bounds and attainable claims to illiquid markets and general swap contracts where both claims and premiums may have multiple payout dates. Explicit consideration of swap contracts is essential in illiquid markets where the valuation of swaps cannot be reduced to the valuation of cumulative claims at maturity. We establish the existence of optimal trading strategies and the lower semicontinuity of the optimal value of optimal investment under conditions that extend the no-arbitrage condition in the classical linear market model. All results are derived with the "direct method" without resorting to duality arguments.
