Similar documents
20 similar documents found (search time: 15 ms)
1.
Summary

A formula for U(w, t), the distribution function of the waiting time of a potential customer who joins a single-server queue at epoch t after service has commenced without a queue, was derived for dam theory by Gani & Prabhu (1959) and for queues by Beneš (1960). Here we use it to calculate numerically the probability of non-ruin in risk theory, under the assumption that X(t), the accumulated claims during the interval (0, t), is a stochastic process with independent increments occurring at the event points of a stationary process. The difficulties encountered are described in some detail, and suggestions are made for attaining three-decimal accuracy in U(w, t).
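The computation the abstract describes can be prototyped directly. As a hedged sketch (the abstract does not fix the claim-number or claim-size laws; Poisson arrivals with exponential claim sizes, the premium rate c, and all parameter values below are illustrative assumptions), a crude Monte Carlo estimate of the non-ruin probability U(w, t) might look like:

```python
import random

def nonruin_prob(w, t, c=1.2, lam=1.0, mean_claim=1.0,
                 n_paths=20000, seed=1):
    """Crude Monte Carlo estimate of U(w, t): the probability that the
    reserve w + c*s - X(s) stays nonnegative at every claim epoch s <= t,
    here for Poisson(lam) claim arrivals with exponential claim sizes."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_paths):
        time, reserve, ruined = 0.0, float(w), False
        while True:
            gap = rng.expovariate(lam)        # time until the next claim
            if time + gap > t:
                break                         # no further claims before t
            time += gap
            reserve += c * gap                # premium income since last claim
            reserve -= rng.expovariate(1.0 / mean_claim)   # claim payment
            if reserve < 0:
                ruined = True
                break
        if not ruined:
            survived += 1
    return survived / n_paths

print(nonruin_prob(5.0, 10.0))   # larger initial reserves w push this toward 1
```

Three-decimal accuracy by simulation alone would need millions of paths, which is precisely why the paper pursues the analytic formula for U(w, t) instead.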

2.
Abstract

The traditional theory of collective risk is concerned with fluctuations in the capital reserve {Y(t): t ≥ 0} of an insurance company. The classical model represents {Y(t)} as a positive constant x (initial capital) plus a deterministic linear function (cumulative income) minus a compound Poisson process (cumulative claims). The central problem is to determine the ruin probability ψ(x) that capital ever falls to zero. It is known that, under reasonable assumptions, one can approximate {Y(t)} by an appropriate Wiener process and hence ψ(.) by the corresponding exponential function of (Brownian) first passage probabilities. This paper considers the classical model modified by the assumption that interest is earned continuously on current capital at rate β > 0. It is argued that Y(t) can in this case be approximated by a diffusion process Y*(t) which is closely related to the classical Ornstein-Uhlenbeck process. The diffusion {Y*(t)}, which we call compounding Brownian motion, reduces to the ordinary Wiener process when β = 0. The first passage probabilities for Y*(t) are found to form a truncated normal distribution, which approximates the ruin function ψ(.) for the model with compounding assets. The approximate expression for ψ(.) is compared against the exact expression for a special case in which the latter is known. Assuming parameter values for which one would anticipate a good approximation, the two expressions are found to agree extremely well over a wide range of initial asset levels.
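The β = 0 benchmark is easy to check numerically. A minimal sketch, with illustrative parameter values not taken from the paper: simulate the diffusion dY = (μ + βY) dt + σ dW by an Euler scheme and compare the β = 0 ruin frequency with the classical exponential first-passage formula ψ(x) = exp(−2μx/σ²); a positive interest rate β should visibly lower the ruin frequency.

```python
import numpy as np

def ruin_prob_mc(x, mu=0.5, sigma=1.0, beta=0.0, T=20.0, dt=0.02,
                 n_paths=4000, seed=0):
    """Monte Carlo estimate of P(capital hits 0 before T) for the diffusion
    dY = (mu + beta*Y) dt + sigma dW,  Y(0) = x  (Euler scheme)."""
    rng = np.random.default_rng(seed)
    y = np.full(n_paths, float(x))
    ruined = np.zeros(n_paths, dtype=bool)
    for _ in range(int(T / dt)):
        alive = ~ruined
        n_alive = int(alive.sum())
        if n_alive == 0:
            break
        dw = rng.normal(0.0, np.sqrt(dt), n_alive)
        y[alive] += (mu + beta * y[alive]) * dt + sigma * dw
        ruined = ruined | (y <= 0.0)          # absorbed once capital hits 0
    return float(ruined.mean())

x = 1.0
psi0 = np.exp(-2.0 * 0.5 * x / 1.0 ** 2)     # classical beta = 0 formula
mc0 = ruin_prob_mc(x, beta=0.0)              # should be near psi0
mc_int = ruin_prob_mc(x, beta=0.5)           # interest reduces ruin
print(psi0, mc0, mc_int)
```

The Euler scheme misses some boundary crossings between grid points, so the simulated frequency slightly underestimates the continuous-time probability; a finer dt narrows the gap.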

3.
Abstract

The Sparre Andersen risk model assumes that the interclaim times (the time between the origin and the first claim epoch is also counted as an interclaim time) and the amounts of claim are independent random variables such that the interclaim times have the common distribution function K(t), t ≥ 0, K(0) = 0, and the amounts of claim have the common distribution function P(y), −∞ < y < ∞. Although the Sparre Andersen risk process is not a process with strictly stationary increments in continuous time, it is asymptotically so if K(t) is not a lattice distribution. That is an immediate consequence of known properties of renewal processes. Another immediate consequence of such properties is the fact that if we assume that the time between the origin and the first claim epoch has not K(t) but K₀(t) = (1/k₁)∫₀ᵗ (1 − K(u)) du as its distribution function (k₁ denotes the mean of K(t)), then the so modified Sparre Andersen process has stationary increments (this holds even if K(t) is a lattice distribution).

In the present paper some consequences of the above-mentioned stationarity properties are given for the corresponding ruin probabilities in the case when the gross risk premium is positive.
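The stationarity-inducing first-interval distribution K₀(t) = (1/k₁)∫₀ᵗ (1 − K(u)) du, where k₁ is the mean of K, is easy to evaluate numerically for any K. A small sketch (the rate-2 exponential used as a sanity check is an illustrative choice, not from the paper):

```python
import math

def stationary_delay_cdf(K, mean_k, t, n=2000):
    """K0(t) = (1/k1) * integral_0^t (1 - K(u)) du via the trapezoidal rule;
    k1 is the mean of K.  K0 is the first-interval distribution that makes
    the renewal (and hence Sparre Andersen) process stationary."""
    if t <= 0.0:
        return 0.0
    h = t / n
    s = 0.5 * ((1.0 - K(0.0)) + (1.0 - K(t)))
    s += sum(1.0 - K(i * h) for i in range(1, n))
    return h * s / mean_k

# For exponential interclaim times the delay distribution equals K itself
# (the memoryless case), which gives a quick sanity check:
K = lambda u: 1.0 - math.exp(-2.0 * u)      # rate-2 exponential, mean 1/2
print(stationary_delay_cdf(K, 0.5, 1.0))    # ≈ K(1.0) = 1 - e^(-2) ≈ 0.8647
```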

4.
Abstract

We consider risk processes {St}t≥0 with the property that the rate β of the Poisson arrival process and the distribution B of the claim sizes are not fixed in time but depend on the state of an underlying Markov jump process {Zt}t≥0, so that β = βi and B = Bi when Zt = i. A variety of methods for assessing the ruin probabilities, including approximations, simulation and numerical methods, are studied; in particular we look at the Cramér-Lundberg approximation and at diffusion approximations with correction terms. The mathematical framework is Markov-modulated random walks in discrete and continuous time, and in particular Wiener-Hopf factorisation problems and conjugate distributions (Esscher transforms) are involved.
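A two-state instance of such a Markov-modulated model is straightforward to simulate. The following sketch (all switching rates, claim rates, claim means and the premium rate are illustrative assumptions, not values from the paper) exploits the memorylessness of the exponential to race the next environment switch against the next claim:

```python
import random

def ruin_mm(u, c=2.0, q=(0.5, 1.0), beta=(1.0, 2.0), mean=(1.0, 1.5),
            T=50.0, n_paths=3000, seed=7):
    """Ruin probability before T in a two-state Markov-modulated model:
    q[i] = rate of leaving environment i, beta[i] = Poisson claim rate and
    mean[i] = mean exponential claim size while the environment is in
    state i; premiums come in continuously at rate c."""
    rng = random.Random(seed)
    ruins = 0
    for _ in range(n_paths):
        t, x, i = 0.0, float(u), 0
        while t < T:
            t_switch = rng.expovariate(q[i])     # next environment switch
            t_claim = rng.expovariate(beta[i])   # next claim in this state
            step = min(t_switch, t_claim)
            if t + step > T:
                break
            t += step
            x += c * step                        # premium income
            if t_claim < t_switch:
                x -= rng.expovariate(1.0 / mean[i])
                if x < 0.0:
                    ruins += 1
                    break
            else:
                i = 1 - i                        # environment jumps
    return ruins / n_paths

print(ruin_mm(5.0))
```

Regenerating both exponential clocks after every event is legitimate only because of memorylessness; with non-exponential switching times one would have to track residual clocks explicitly.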

5.
Abstract

In classical risk theory, stationary premium and claim processes are often considered. In some cases it is more convenient to model non-stationary processes, which describe a movement from the environmental conditions for which the premiums were calculated to less favorable circumstances. This is done by a Markov-modulated Poisson claim process. Moreover, the insurance company is allowed to stop the process at some random time, if the situation seems unfavorable, in order to calculate new premiums. This leads to an optimal stopping problem which is solved explicitly to some extent.

6.
In this paper, a Sparre Andersen risk process with arbitrary interclaim time distribution is considered. We analyze various ruin-related quantities in relation to the expected present value of total operating costs until ruin, which was first proposed by Cai et al. [(2009a). On the expectation of total discounted operating costs up to default and its applications. Advances in Applied Probability 41(2), 495–522] in the piecewise-deterministic compound Poisson risk model. The analysis in this paper is applicable to a wide range of quantities including (i) the insurer's expected total discounted utility until ruin; and (ii) the expected discounted aggregate claim amounts until ruin. On the one hand, when claims belong to the class of combinations of exponentials, explicit results are obtained using the ruin theoretic approach of conditioning on the first drop via discounted densities (e.g. Willmot [(2007). On the discounted penalty function in the renewal risk model with general interclaim times. Insurance: Mathematics and Economics 41(1), 17–31]). On the other hand, without any distributional assumption on the claims, we also show that the expected present value of total operating costs until ruin can be expressed in terms of some potential measures, which are common tools in the literature of Lévy processes (e.g. Kyprianou [(2014). Fluctuations of Lévy processes with applications: introductory lectures, 2nd ed. Berlin Heidelberg: Springer-Verlag]). These potential measures are identified in terms of the discounted distributions of ascending and descending ladder heights. We shall demonstrate how the formulas resulting from the two seemingly different methods can be reconciled. The cases of (i) stationary renewal risk model and (ii) surplus-dependent premium are briefly discussed as well. Some interesting invariance properties in the former model are shown to hold true, extending a well-known ruin probability result in the literature.
Numerical illustrations concerning the expected total discounted utility until ruin are also provided.

7.
Abstract

Credibility is a form of insurance pricing that is widely used, particularly in North America. The theory of credibility has been called a “cornerstone” of actuarial science. Students of the North American actuarial bodies also study loss distributions, the process of statistically relating a set of data to a theoretical (loss) distribution. In this work, we develop a direct link between credibility and loss distributions through the notion of a copula, a tool for understanding relationships among multivariate outcomes.

This paper develops credibility using a longitudinal data framework, in which one might encounter data from a cross-section of risk classes (towns) with a history of insurance claims available for each risk class. For the marginal claims distributions, we use generalized linear models, an extension of linear regression that also encompasses Weibull and gamma regressions. Copulas are used to model the dependencies over time; specifically, this paper is the first to propose using a t-copula in the context of generalized linear models. The t-copula is the copula associated with the multivariate t-distribution; like the univariate t-distributions, it seems especially suitable for empirical work. Moreover, we show that the t-copula gives rise to easily computable predictive distributions that we use to generate credibility predictors. Like Bayesian methods, our copula credibility prediction methods allow us to provide an entire distribution of predicted claims, not just a point prediction.
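The sampling mechanism behind the t-copula follows from its definition via the multivariate t-distribution: draw correlated normals, divide by an independent √(χ²ν/ν), and (if copula-scale samples are wanted) push each margin through the univariate t CDF. A minimal sketch, with an illustrative 2×2 correlation matrix and degrees of freedom not taken from the paper:

```python
import numpy as np

def multivariate_t_sample(R, nu, n, seed=0):
    """Draw n samples from a multivariate t with correlation matrix R and
    nu degrees of freedom: X = Z / sqrt(W/nu), Z ~ N(0, R), W ~ chi2(nu).
    Applying the univariate t CDF to each column (e.g. scipy.stats.t.cdf)
    would map these draws onto samples from the t-copula itself."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(R)
    z = rng.standard_normal((n, R.shape[0])) @ L.T
    w = rng.chisquare(nu, size=(n, 1)) / nu
    return z / np.sqrt(w)

R = np.array([[1.0, 0.6], [0.6, 1.0]])
x = multivariate_t_sample(R, nu=5, n=20000)
print(np.corrcoef(x[:, 0], x[:, 1])[0, 1])   # close to the target 0.6
```

Because the final CDF transform is monotone in each margin, rank-based dependence (and hence the copula) is already fixed at this stage; the heavy tails of the χ² mixing are what give the t-copula its tail dependence, unlike the Gaussian copula.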

We present an illustrative example of Massachusetts automobile claims, and compare our new credibility estimates with those currently existing in the literature.

8.
Abstract

It was the Swiss actuary Chr. Moser who, in lectures at Bern University at the turn of the century, gave the name “self-renewing aggregate” to what Vajda (1947) has called the “unstationary community” of lives, namely where deaths at any epoch are immediately replaced by an equivalent number of births. It was Moser too (1926) who coined the expression “steady state” for the stationary community in which the age distribution at any time follows the life table (King, 1887). With such a distinguished actuarial history, excellently summarized by Saxer (1958, Ch. IV), it behoves every actuary to know at least the definitions and modus operandi of today's so-called renewal (point), or recurrent event, processes.

9.
Analysis of a generalized Gerber–Shiu function is considered in a discrete-time (ordinary) Sparre Andersen renewal risk process with time-dependent claim sizes. The results are then applied to obtain ruin-related quantities under some renewal risk processes with specific interclaim distributions, such as a discrete Kn distribution and a truncated geometric distribution (i.e. the compound binomial process). Furthermore, the discrete delayed renewal risk process is considered and results related to the ordinary process are derived as well.
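In the compound binomial special case, finite-horizon ruin probabilities satisfy a simple one-step recursion that can be evaluated directly. A hedged sketch (the claim probability Q and the claim-size p.m.f. below are illustrative choices, and "ruin" is taken here as the surplus falling below zero; conventions vary across papers):

```python
from functools import lru_cache

Q = 0.4                              # probability of a claim in each period
PMF = {1: 0.5, 2: 0.3, 3: 0.2}       # claim-size p.m.f. (illustrative)

@lru_cache(maxsize=None)
def psi(u, n):
    """P(ruin within n periods | current surplus u) in a compound binomial
    model: premium 1 per period, at most one claim per period, ruin when
    the surplus falls below zero."""
    if u < 0:
        return 1.0                   # already ruined
    if n == 0:
        return 0.0                   # horizon reached without ruin
    v = u + 1                        # surplus after the period's premium
    no_claim = (1.0 - Q) * psi(v, n - 1)
    claim = Q * sum(p * psi(v - y, n - 1) for y, p in PMF.items())
    return no_claim + claim

print(psi(0, 50), psi(5, 50))        # finite-horizon ruin probabilities
```

With mean claim outgo 0.4 × 1.7 = 0.68 per unit premium, the loading is positive, so psi(u, n) stabilises as n grows toward the infinite-horizon ruin probability.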

10.
Abstract

The purpose of this short note is to demonstrate the power of very straightforward branching process methods outside their traditional realm of application. We consider an insurance scheme where claims do not necessarily arise as a stationary process. Instead, the number of policy-holders changes: each of them generates a random number of new insurants, and each of these makes claims of random size at random instants, independently but with the same distribution across individuals. Premiums are assumed equal for all policy-holders. It is proved that, for an expanding portfolio, there is only one premium size which is fair in the sense that if the premium is larger than it, the profit of the insurer grows beyond all bounds with time, whereas a smaller premium leads to his inevitable ruin. (Branching process models for the development of the portfolio may seem unrealistic. However, they do include the classical theory, where independent and identically distributed claims arise at the points of a renewal process.)
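The dichotomy described (profit growing without bound above the fair premium, ruin below it) shows up clearly in a toy simulation. The following is an illustrative stand-in, not the paper's exact model: Poisson offspring with mean m > 1 and exponential claims of mean 1, so the fair premium in this toy is 1.

```python
import math
import random

def portfolio_profit(premium, n_gen=12, z0=200, m=1.5, seed=4):
    """Toy branching-process portfolio: each generation, every policy-holder
    pays `premium`, files an exponential(mean 1) claim, and is replaced by a
    Poisson(m) number of new insurants.  With m > 1 the portfolio expands."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method; adequate for small lam
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    z, profit = z0, 0.0
    for _ in range(n_gen):
        profit += z * premium - sum(rng.expovariate(1.0) for _ in range(z))
        z = sum(poisson(m) for _ in range(z))
    return profit

print(portfolio_profit(1.3), portfolio_profit(0.7))   # positive vs. negative drift
```

Because the portfolio size grows geometrically, the law of large numbers makes the sign of the cumulative profit essentially deterministic on either side of the fair premium.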

11.
Abstract

Some authors define the (elementary) compound Poisson process in the wide sense {χt, 0 ≤ t < ∞} with the help of probability distributions of the form pn(t) = ∫₀^∞ e^(−qτ)(qτ)^n/n! dqV(q, t), where τ is a so-called operational time, a continuous non-decreasing function of t vanishing for t = 0, and V(q, t) is a non-negative distribution function for every t.

12.
Abstract

1. For the definition of general processes, with special regard to those concerned in Collective Risk Theory, reference is made to Cramér (Collective Risk Theory, Skandia Jubilee Volume, Stockholm, 1955). Let the independent parameter of such a process be denoted by τ, with the origin at the point of departure of the process and on a scale independent of the number of expected changes of the random function. Denote by p(τ, n)dτ the asymptotic expression for the conditional probability of one change in the random function while the parameter passes from τ to τ + dτ, relative to the hypothesis that n changes have occurred while the parameter passes from 0 to τ. Assume further (unless the contrary is stated) that the probability of more than one change, while the parameter passes from τ to τ + dτ, is of smaller order than dτ.

13.
We present, in a Monte Carlo simulation framework, a novel approach for the evaluation of hybrid local volatility [Risk, 1994, 7, 18–20], [Int. J. Theor. Appl. Finance, 1998, 1, 61–110] models. In particular, we consider the stochastic local volatility model—see e.g. Lipton et al. [Quant. Finance, 2014, 14, 1899–1922], Piterbarg [Risk, 2007, April, 84–89], Tataru and Fisher [Quantitative Development Group, Bloomberg Version 1, 2010], Lipton [Risk, 2002, 15, 61–66]—and the local volatility model incorporating stochastic interest rates—see e.g. Atlan [ArXiV preprint math/0604316, 2006], Piterbarg [Risk, 2006, 19, 66–71], Deelstra and Rayée [Appl. Math. Finance, 2012, 1–23], Ren et al. [Risk, 2007, 20, 138–143]. For both model classes a particular (conditional) expectation needs to be evaluated which cannot be extracted from the market and is expensive to compute. We establish accurate and ‘cheap to evaluate’ approximations for the expectations by means of the stochastic collocation method [SIAM J. Numer. Anal., 2007, 45, 1005–1034], [SIAM J. Sci. Comput., 2005, 27, 1118–1139], [Math. Models Methods Appl. Sci., 2012, 22, 1–33], [SIAM J. Numer. Anal., 2008, 46, 2309–2345], [J. Biomech. Eng., 2011, 133, 031001], which was recently applied in the financial context [Available at SSRN 2529691, 2014], [J. Comput. Finance, 2016, 20, 1–19], combined with standard regression techniques. Monte Carlo pricing experiments confirm that our method is highly accurate and fast.
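The core idea of stochastic collocation, evaluating the expensive function only at a handful of quadrature nodes of the underlying variable and interpolating everywhere else, can be sketched in a few lines (the "expensive" function below is a stand-in, not one of the paper's conditional expectations):

```python
import numpy as np

g = lambda y: np.exp(0.4 * y)    # stand-in for an expensive (conditional) expectation

# Collocation: evaluate g only at the Gauss-Hermite nodes of the standard
# normal, then replace g everywhere else by its interpolating polynomial.
nodes, weights = np.polynomial.hermite_e.hermegauss(8)   # probabilists' rule
coeffs = np.polyfit(nodes, g(nodes), deg=len(nodes) - 1)
cheap_g = lambda y: np.polyval(coeffs, y)

y = np.linspace(-2.0, 2.0, 101)
max_err = np.max(np.abs(cheap_g(y) - g(y)))

# The same nodes and weights give E[g(Y)], Y ~ N(0, 1), essentially for free:
approx_mean = weights @ g(nodes) / np.sqrt(2.0 * np.pi)
print(max_err, approx_mean)      # small error; the exact mean is e^0.08 ≈ 1.0833
```

The gain is that g is called only 8 times, however many Monte Carlo samples later query cheap_g; the paper combines this idea with regression to handle the conditional expectations its hybrid models require.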

14.
This paper examines the use of random matrix theory as it has been applied to model large financial datasets, especially for the purpose of estimating the bias inherent in Mean-Variance portfolio allocation when a sample covariance matrix is substituted for the true underlying covariance. Such problems were observed and modeled in the seminal work of Laloux et al. [Noise dressing of financial correlation matrices. Phys. Rev. Lett., 1999, 83, 1467] and rigorously proved by Bai et al. [Enhancement of the applicability of Markowitz's portfolio optimization by utilizing random matrix theory. Math. Finance, 2009, 19, 639–667] under minimal assumptions. If the returns on assets to be held in the portfolio are assumed independent and stationary, then these results are universal in that they do not depend on the precise distribution of returns. This universality has been somewhat misrepresented in the literature, however, as asymptotic results require that an arbitrarily long time horizon be available before such predictions necessarily become accurate. In order to reconcile these models with the highly non-Gaussian returns observed in real financial data, a new ensemble of random rectangular matrices is introduced, modeled on the observations of independent Lévy processes over a fixed time horizon.
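The bias mechanism is easy to reproduce: even for i.i.d. returns whose true covariance is the identity, the sample covariance eigenvalues spread over the Marchenko-Pastur support [(1 − √q)², (1 + √q)²] with q = N/T, rather than concentrating at 1. A small sketch with illustrative dimensions:

```python
import numpy as np

T, N = 2000, 100                      # T return observations on N assets
rng = np.random.default_rng(3)
returns = rng.standard_normal((T, N))        # i.i.d.: true covariance is I
S = returns.T @ returns / T                  # sample covariance matrix
eig = np.linalg.eigvalsh(S)

q = N / T
lam_minus = (1.0 - np.sqrt(q)) ** 2          # Marchenko-Pastur lower edge
lam_plus = (1.0 + np.sqrt(q)) ** 2           # Marchenko-Pastur upper edge
print(eig.min(), eig.max(), lam_minus, lam_plus)
```

A Markowitz optimizer fed S mistakes this purely sampling-induced spread for structure, which is the bias the cited works quantify; the paper's point is that the q → const asymptotics behind these edges may be slow to kick in for realistic, heavy-tailed returns.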

15.
This article explores the relationships between several volatility forecasts built from multi-scale linear ARCH processes and linear market models for the forward variance. The structures of the forecast equations turn out to be identical, but with different dependencies on the forecast horizon. The process equations for the forward variance are induced by the process equations for an ARCH model, but postulated in a market model; in the ARCH case they differ from the usual diffusive type. The conceptual differences between the two approaches and their implications for volatility forecasts are analysed. The volatility forecast is compared with the realized volatility (the volatility that will occur between date t and t + ΔT) and the implied volatility (corresponding to an at-the-money option with expiry at t + ΔT). For the ARCH forecasts, the parameters are set a priori. An empirical analysis across multiple time horizons ΔT shows that a forecast provided by an I-GARCH(1) process (one time scale) does not correctly capture the dynamics of the realized volatility. An I-GARCH(2) process (two time scales, similar to GARCH(1,1)) does better, while a long-memory LM-ARCH process (multiple time scales) correctly replicates the dynamics of the implied and realized volatilities and consistently delivers good forecasts of the realized volatility.
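An I-GARCH(1) forecast of the kind tested here reduces to a single exponentially weighted recursion, which is also why its term structure of forecasts is flat in the horizon: the h-step variance forecast equals the one-step one. A minimal sketch on simulated returns (λ = 0.94 and the toy data are illustrative choices, not the article's settings):

```python
import numpy as np

def igarch_filter(returns, lam=0.94, var0=None):
    """One-time-scale I-GARCH(1) (RiskMetrics-style) variance recursion:
    sigma2[t+1] = lam * sigma2[t] + (1 - lam) * r[t]**2."""
    var = np.empty(len(returns) + 1)
    var[0] = var0 if var0 is not None else returns.var()
    for t, r in enumerate(returns):
        var[t + 1] = lam * var[t] + (1.0 - lam) * r * r
    return var

rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(1000)   # toy i.i.d. returns with 1% daily vol
sig2 = igarch_filter(r)
print(np.sqrt(sig2[-1]))               # hovers around the true 0.01
```

Multi-scale variants (I-GARCH(2), LM-ARCH) mix several such recursions with different λ, which is what lets the forecast depend on the horizon ΔT.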

16.
Abstract

1. In a s. or n.s. cPp (stationary or non-stationary compound Poisson process) the probability of the occurrence of m events, while the parameter (one- or more-dimensional) passes from zero to τ as measured on an absolute scale (the τ-scale), is defined as a mean of Poisson probabilities with intensities distributed according to distribution functions defining another random process, called the primary process with respect to the s. or n.s. cPp. Stationarity (in the weak sense) and non-stationarity of the primary process imply the same properties of the s. or n.s. cPp.

17.
Abstract

What follows has grown out of a discussion with Carl Philipson following a lecture [1] on the collective theory of risk. Although I give here nothing else but a refined interpretation of Paul Lévy's form (see, e.g., [2], p. 322) of identically distributed random variables, the result still seems of interest for all those working in the field of collective risk theory. I thank Carl Philipson for stimulating my interest in this matter.

18.
Abstract

In this paper asymptotic properties for the risk process will be studied when the number of risk units tends to infinity. The paper extends asymptotic properties for the classical risk process to more general processes. In the classical risk process the claim amounts are assumed independent and identically distributed, and the claim number process is a homogeneous Poisson process.

The key tool is point process theory with associated martingale theory. The results are illustrated by examples.

19.
The aim of this paper is to apply a nonparametric methodology developed by Donoho et al (2003, IEEE Trans. Signal Processing, 51(3), 614–27) for estimating an autocovariance sequence to the statistical analysis of security returns, and to discuss the advantages offered by this approach over other existing methods such as fixed-window-length segmentation procedures. Theoretical properties of adaptivity of this estimation method have been proved for a specific class of time series, namely the class of locally stationary processes, with an autocovariance structure which varies slowly over time in most cases but might exhibit abrupt changes of regime. This method is based on an algorithm that selects empirically from the data the tiling of the time–frequency plane which best exposes, in the least-squares sense, the underlying second-order time-varying structure of the time series, and so may properly describe the time-inhomogeneous variations of speculative prices. The applications we consider here mainly concern the analysis of structural changes occurring in stock market returns, VaR estimation, and the comparison between the variation structure of stock index returns in developed markets and in developing markets.
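For contrast, the fixed-window-length baseline such a method is compared against is a one-liner per window: estimate the lag-ℓ autocovariance on a sliding segment. The sketch below (an illustrative AR(1) regime change, not the paper's data) shows such an estimator reacting to an abrupt change in the second-order structure:

```python
import numpy as np

def local_autocov(x, lag, window):
    """Sliding fixed-window estimate of the lag-`lag` autocovariance."""
    out = np.empty(len(x) - window)
    for t in range(len(out)):
        seg = x[t:t + window] - x[t:t + window].mean()
        out[t] = np.mean(seg[:-lag] * seg[lag:])
    return out

rng = np.random.default_rng(5)
eps = rng.standard_normal(2000)
x = np.empty(2000)
x[0] = eps[0]
for t in range(1, 2000):
    phi = 0.0 if t < 1000 else 0.8     # abrupt change of regime at t = 1000
    x[t] = phi * x[t - 1] + eps[t]

ac = local_autocov(x, lag=1, window=200)
print(ac[:600].mean(), ac[1200:].mean())   # ≈ 0 before the break, clearly larger after
```

The fixed window forces a single bias-variance trade-off everywhere; the adaptive tiling approach the paper applies instead lets the effective window vary with the local degree of non-stationarity.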

20.
Nian Yang, Quantitative Finance, 2018, 18(10), 1767–1779
The stochastic-alpha-beta-rho (SABR) model is widely used by practitioners in interest rate and foreign exchange markets. The probability of hitting zero sheds light on the arbitrage-free small-strike implied volatility of the SABR model (see, e.g. De Marco et al. [SIAM J. Financ. Math., 2017, 8(1), 709–737], Gulisashvili [Int. J. Theor. Appl. Financ., 2015, 18, 1550013], Gulisashvili et al. [Mass at zero in the uncorrelated SABR model and implied volatility asymptotics, 2016b]), and the survival probability is also closely related to binary knock-out options. Besides, the study of the survival probability is mathematically challenging. This paper provides novel asymptotic formulas for the survival probability of the SABR model, together with error estimates. The formulas give the probability that the forward price does not hit a nonnegative lower boundary before a fixed time horizon.
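A brute-force benchmark for such asymptotic formulas is direct simulation of the hitting probability. A hedged sketch (all parameter values are illustrative; the boundary here is zero, and an Euler scheme's absorption check only approximates the continuous-time hitting event):

```python
import numpy as np

def sabr_survival_mc(f0, sigma0=0.045, alpha=0.3, beta=0.5, rho=-0.3,
                     T=5.0, n_steps=500, n_paths=4000, seed=2):
    """Monte Carlo estimate of the SABR survival probability
    P(F stays positive up to T), with dF = sigma F^beta dW1,
    dsigma = alpha sigma dW2, corr(dW1, dW2) = rho; Euler step for F,
    exact lognormal step for sigma, absorption of F at zero."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    f = np.full(n_paths, float(f0))
    sig = np.full(n_paths, float(sigma0))
    alive = np.ones(n_paths, dtype=bool)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n_paths)
        df = sig * np.maximum(f, 0.0) ** beta * np.sqrt(dt) * z1
        f = np.where(alive, f + df, f)          # absorbed paths stay frozen
        sig = sig * np.exp(alpha * np.sqrt(dt) * z2 - 0.5 * alpha ** 2 * dt)
        alive &= f > 0.0
    return float(alive.mean())

p_hi = sabr_survival_mc(0.04)
p_lo = sabr_survival_mc(0.01)
print(p_hi, p_lo)    # forwards starting further from zero survive more often
```

Such a simulation is slow and noisy near the small-strike region, which is exactly where closed-form survival asymptotics of the kind derived in the paper are most valuable.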
