Similar Documents
20 similar documents found (search time: 15 ms)
1.
Data insufficiency and reporting threshold are two main issues in operational risk modelling. When these conditions are present, maximum likelihood estimation (MLE) may produce very poor parameter estimates. In this study, we first investigate four methods to estimate the parameters of truncated distributions for small samples—MLE, expectation-maximization algorithm, penalized likelihood estimators, and Bayesian methods. Without any proper prior information, Jeffreys’ prior for truncated distributions is used. Based on a simulation study for the log-normal distribution, we find that the Bayesian method gives much more credible and reliable estimates than the MLE method. Finally, an application to the operational loss severity estimation using real data is conducted using the truncated log-normal and log-gamma distributions. With the Bayesian method, the loss distribution parameters and value-at-risk measure for every cell with loss data can be estimated separately for internal and external data. Moreover, confidence intervals for the Bayesian estimates are obtained via a bootstrap method.
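The truncation setup studied here can be made concrete with a short sketch: a minimal, illustrative MLE fit of a truncated log-normal, where the reporting threshold, sample size, and parameter values are hypothetical and not taken from the paper.

```python
import numpy as np
from scipy import optimize, stats

# Illustrative sketch: losses are reported only above a threshold H, so each
# observed loss contributes the truncated density f(x) / (1 - F(H)).
rng = np.random.default_rng(0)
H = 0.5                                    # hypothetical reporting threshold
x = rng.lognormal(mean=0.0, sigma=1.0, size=5000)
data = x[x > H]                            # only losses above H are observed

def neg_loglik(theta):
    mu, log_sig = theta
    dist = stats.lognorm(s=np.exp(log_sig), scale=np.exp(mu))
    # truncated log-likelihood: sum log f(x_i) - n * log(1 - F(H))
    return -(dist.logpdf(data).sum() - data.size * dist.logsf(H))

res = optimize.minimize(neg_loglik, x0=[np.log(data).mean(), 0.0],
                        method="Nelder-Mead")
mu_hat, sigma_hat = float(res.x[0]), float(np.exp(res.x[1]))
```

With ample data this truncated MLE recovers the true parameters closely; the paper's point is that for small samples the estimate degrades badly, which motivates the Bayesian alternative.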

2.
Systematic longevity risk is increasingly relevant for public pension schemes and insurance companies that provide life benefits. In view of this, mortality models should incorporate dependence between lives. However, the independent lifetime assumption is still heavily relied upon in the risk management of life insurance and annuity portfolios. This paper applies a multivariate Tweedie distribution to incorporate dependence, which it induces through a common shock component. Model parameter estimation is developed based on the method of moments and generalized to allow for truncated observations. The estimation procedure is explicitly developed for various important distributions belonging to the Tweedie family, and finally assessed using simulation.
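The common-shock mechanism can be sketched in a few lines. The gamma distribution is a member of the Tweedie family; in this hypothetical example two gamma risks share an additive gamma shock, and the shock variance is identified from the covariance, which is the moment condition underlying a method-of-moments fit.

```python
import numpy as np

# Hypothetical common-shock construction: X_i = Y_i + Z with independent
# gamma components sharing the same scale, so each X_i is again gamma and
# Cov(X1, X2) = Var(Z).
rng = np.random.default_rng(0)
n = 200_000
z = rng.gamma(shape=2.0, scale=1.0, size=n)      # common shock, Var(Z) = 2
x1 = rng.gamma(shape=3.0, scale=1.0, size=n) + z
x2 = rng.gamma(shape=4.0, scale=1.0, size=n) + z

# method of moments: estimate the shock variance from the covariance
var_z_hat = float(np.cov(x1, x2)[0, 1])
```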

3.
The increase in interconnectivity and developments in technology have caused cyber security to become a universal concern. This paper highlights the dangers of the evolution of cyber risk, the challenges of quantifying the impact of cyber-attacks, and the feasibility of traditional actuarial methodologies for quantifying cyber losses. We present a practical roadmap for assessing cyber risk, one that emphasizes the importance of developing a company- and culture-specific risk and resilience model. We develop a structure for a Bayesian network to model the financial loss as a function of the key drivers of risk and resilience. We use a qualitative scorecard assessment to determine the level of cyber risk exposure and evaluate the effectiveness of resilience efforts in the organization. We highlight the importance of capitalizing on the knowledge of experts within the organization and discuss methods for aggregating multiple assessments. From an enterprise risk management perspective, impact on value should be the primary concern of managers. This paper therefore takes a value-centric/reputational approach to risk management rather than a regulatory/capital-centric approach.

4.
Brown and Gibbons (1985) developed a theory of relative risk aversion estimation in terms of average market rates of return and the variance of market rates of return. However, the exact sampling distributions of the relative risk aversion estimators have not been derived. The main purpose of this paper is to derive the exact sampling distribution of an appropriate relative risk aversion estimator. First, we derive the density of Brown and Gibbons' maximum likelihood estimator and show that the central t distribution is not appropriate for testing the significance of the estimated relative risk aversion. We then derive a minimum variance unbiased estimator by a linear transformation of Brown and Gibbons' maximum likelihood estimator. Its density function is neither a central nor a noncentral t distribution; this new distribution has been tabulated. An empirical example illustrates the application of this new sampling distribution.

5.
We examine the question of deposit insurance through the lens of risk management by constructing the loss distribution faced by the Federal Deposit Insurance Corporation (FDIC). We take a novel approach by arguing that the risk management problem faced by the FDIC is similar to that of a bank managing a loan portfolio, only in the FDIC’s case the risk arises from the potential for loss of the individual banks in its portfolio. We explicitly estimate the cumulative loss distribution of FDIC-insured banks using two variations of the Merton model and find that reserves are sufficient to cover roughly 99.85% of the loss distribution, corresponding to about a BBB+ rating. However, under different stress scenarios (higher correlations, fat-tailed bank returns, increased loss severity) that level can be much lower: approximately 96%, corresponding to about a B+ rating. JEL classification: G210, G280. Any views expressed represent those of the author only and not necessarily those of the Federal Reserve Bank of New York or the Federal Reserve System.
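The portfolio view taken here can be illustrated with a one-factor Merton-style (Vasicek) simulation. The portfolio size, default probability, loss severity, and correlation values below are hypothetical, chosen only to show the stress mechanism discussed above: higher asset correlation fattens the loss tail and pushes the high quantile up.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_banks, n_sims = 100, 50_000
pd_, lgd = 0.01, 0.5                      # hypothetical PD and loss severity
thresh = norm.ppf(pd_)                    # default when asset return < thresh

def loss_quantile(rho, q=0.9985):
    m = rng.standard_normal((n_sims, 1))            # systematic factor
    e = rng.standard_normal((n_sims, n_banks))      # idiosyncratic factors
    assets = np.sqrt(rho) * m + np.sqrt(1.0 - rho) * e
    losses = lgd * (assets < thresh).mean(axis=1)   # equal-weight exposures
    return float(np.quantile(losses, q))

q_low, q_high = loss_quantile(0.1), loss_quantile(0.5)
```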

6.
A class of sequential procedures is developed for the point estimation of the parameter(s) of a population under a family of loss functions plus a cost function of general form. A condition on the initial sample size is determined which ensures the asymptotic ‘risk-efficiency’ of the proposed class. By means of various examples, it is shown that many sequential point estimation problems can be handled with the help of the proposed class.

7.
Accurate estimation of value-at-risk (VaR) and assessment of associated uncertainty is crucial for both insurers and regulators, particularly in Europe. Existing approaches link data and VaR indirectly by first linking data to the parameter of a probability model, and then expressing VaR as a function of that parameter. This indirect approach exposes the insurer to model misspecification bias or estimation inefficiency, depending on whether the parameter is finite- or infinite-dimensional. In this paper, we link data and VaR directly via what we call a discrepancy function, and this leads naturally to a Gibbs posterior distribution for VaR that does not suffer from the aforementioned biases and inefficiencies. Asymptotic consistency and root-n concentration rate of the Gibbs posterior are established, and simulations highlight its superior finite-sample performance compared to other approaches.

8.
With the publication of the Basel Accord, quantitative models of operational risk have become a major research topic for the banking industry. Following the Basel Accord, this paper uses the Loss Distribution Approach (LDA) to measure operational risk. The advantage of this method is that the frequency of loss events and the severity of losses are measured separately, and a compound distribution is then used to study the cumulative loss distribution over a given period. The paper discusses how to implement LDA within a commercial bank, introduces the concept of operational value-at-risk (VaR), and presents distribution functions capable of describing the loss distribution. Following the methods and strategies published in the Basel Accord, it also explores, in terms of loss event types, business lines, and methods for estimating the loss distribution, the feasibility and practicality of the Advanced Measurement Approaches and the practical issues that arise in implementation.
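A minimal LDA sketch along these lines: Poisson frequency, log-normal severity, Monte Carlo compounding over one year, and the operational VaR read off as a high quantile of the simulated aggregate loss. The frequency and severity parameters are hypothetical, not a real calibration.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu, sigma = 25.0, 10.0, 1.5      # hypothetical frequency / severity fit
n_years = 100_000                     # simulated one-year aggregate losses

counts = rng.poisson(lam, size=n_years)            # loss counts per year
agg = np.array([rng.lognormal(mu, sigma, size=k).sum() for k in counts])

exp_loss = float(agg.mean())
var_999 = float(np.quantile(agg, 0.999))   # operational VaR at 99.9%
```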

9.
Taking motor vehicle insurance as the object of study, this paper analyses the forms that operational risk takes in the core business processes of motor insurance, and measures the risk with the help of a topological data model and Monte Carlo simulation. The results show that loss severity exhibits pronounced heavy tails and loss events occur with high frequency, while the heavy-tail feature of the aggregate loss distribution is not obvious. These results provide a basis for the economic capital allocation and management of operational risk.

10.
11.
This paper introduces the class of Bayesian infinite mixture time series models first proposed in Lau & So (2004) for modelling long-term investment returns. This flexible class of time series models incorporates the full information contained in autoregressive components of various orders by utilizing the idea of Bayesian averaging or mixing. We adopt a Bayesian sampling scheme based on a weighted Chinese restaurant process for generating partitions of investment returns to estimate the models. Instead of using point estimates, as in the classical or non-Bayesian approach, estimation is performed by the full Bayesian approach, utilizing Bayesian averaging to incorporate all information contained in the posterior distributions of the random parameters. This provides a natural way to incorporate model risk or uncertainty. The proposed models can also be used to perform clustering of investment returns and to detect outliers. We employ monthly data from the Toronto Stock Exchange 300 (TSE 300) indices to illustrate the implementation of our models and compare the simulated results from the estimated models with the empirical characteristics of the TSE 300 data. We apply the Bayesian predictive distribution of the logarithmic returns, obtained by Bayesian averaging or mixing, to evaluate quantile-based and conditional tail expectation risk measures for segregated fund contracts via stochastic simulation. We compare the risk measures evaluated from our models with those from some well-known and important models in the literature, and highlight some features that can be obtained from our models.

12.
This paper presents an extension of the classical compound Poisson risk model in which the inter-claim time and the forthcoming claim amount are no longer independent random variables (rv's). Asymptotic tail probabilities for the discounted aggregate claims are presented when the force of interest is constant and the claim amounts are heavy-tailed rv's. Furthermore, we derive asymptotic finite-time ruin probabilities, as well as asymptotic approximations for some common risk measures associated with the discounted aggregate claims. A simulation study is performed in order to validate the results obtained in the interest-free risk model.

13.
14.
It is well known that the exponential dispersion family (EDF) of univariate distributions is closed under Bayesian revision in the presence of natural conjugate priors. However, this is not the case for the general multivariate EDF. This paper derives a second-order approximation to the posterior likelihood of a naturally conjugated generalised linear model (GLM), i.e., multivariate EDF subject to a link function (Section 5.5). It is not the same as a normal approximation. It does, however, lead to second-order Bayes estimators of parameters of the posterior. The family of second-order approximations is found to be closed under Bayesian revision. This generates a recursion for repeated Bayesian revision of the GLM with the acquisition of additional data. The recursion simplifies greatly for a canonical link. The resulting structure is easily extended to a filter for estimation of the parameters of a dynamic generalised linear model (DGLM) (Section 6.2). The Kalman filter emerges as a special case. A second type of link function, related to the canonical link, and with similar properties, is identified. This is called here the companion canonical link. For a given GLM with canonical link, the companion to that link generates a companion GLM (Section 4). The recursive form of the Bayesian revision of this GLM is also obtained (Section 5.5.3). There is a perfect parallel between the development of the GLM recursion and its companion. A dictionary for translation between the two is given so that one is readily derived from the other (Table 5.1). The companion canonical link also generates a companion DGLM. A filter for this is obtained (Section 6.3). Section 1.2 provides an indication of how the theory developed here might be applied to loss reserving. A sequel paper, providing numerical illustrations of this, is planned.

15.
Forecasting the outstanding claim liabilities to set adequate reserves is critical for a nonlife insurer's solvency. Chain–Ladder and Bornhuetter–Ferguson are two prominent actuarial approaches used for this task. The selection between the two approaches is often ad hoc due to different underlying assumptions. We introduce a Dirichlet model that provides a common statistical framework for the two approaches, with some appealing properties. Depending on the type of information available, the model inference naturally leads to either Chain–Ladder or Bornhuetter–Ferguson prediction. Using claims data on Workers' compensation insurance from several U.S. insurers, we discuss both frequentist and Bayesian inference.
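The Chain–Ladder side of this comparison reduces to volume-weighted development factors applied to the latest diagonal. Below is a minimal sketch on a made-up three-year cumulative triangle (np.nan marks future cells); none of the figures come from the paper's data.

```python
import numpy as np

tri = np.array([[100.0, 150.0, 175.0],     # made-up cumulative run-off triangle
                [110.0, 165.0, np.nan],
                [120.0, np.nan, np.nan]])

n = tri.shape[1]
factors = []
for j in range(n - 1):                     # volume-weighted link ratios
    ok = ~np.isnan(tri[:, j]) & ~np.isnan(tri[:, j + 1])
    factors.append(tri[ok, j + 1].sum() / tri[ok, j].sum())

ultimates, reserves = [], []
for row in tri:
    k = int(np.where(~np.isnan(row))[0][-1])   # latest known diagonal cell
    u = row[k]
    for j in range(k, n - 1):
        u *= factors[j]                        # develop to ultimate
    ultimates.append(float(u))
    reserves.append(float(u - row[k]))         # outstanding claim liability
```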

16.
We consider the distribution of the deficit at ruin in the Sparre Andersen renewal risk model given that ruin occurs. We show that if the individual claim amounts have a phase-type distribution, then there is a simple phase-type representation for the distribution of the deficit. We illustrate the application of this result with several examples.
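For the simplest phase-type case (exponential claims, a single phase) the result specializes to a familiar fact: by memorylessness, the deficit at ruin given ruin is again exponential with the claim-size rate. A hypothetical compound Poisson simulation (a special case of the Sparre Andersen model) illustrates this; all parameters are made up, and a high surplus ceiling is used as a practical proxy for survival.

```python
import numpy as np

rng = np.random.default_rng(0)
u0, c, lam = 2.0, 1.2, 1.0        # initial surplus, premium rate, claim rate
deficits = []
for _ in range(4000):
    u = u0
    while u < 50.0:               # surplus this high: treat the path as surviving
        u += c * rng.exponential(1.0 / lam)   # premiums earned until next claim
        u -= rng.exponential(1.0)             # Exp(1) claim amount
        if u < 0.0:
            deficits.append(-u)   # deficit at ruin
            break

mean_deficit = float(np.mean(deficits))       # should be near E[claim] = 1
```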

17.
This paper presents an explicit characterization for the joint probability density function of the surplus immediately prior to ruin and the deficit at ruin for a general risk process, which includes the Sparre-Andersen risk model with phase-type inter-claim times and claim sizes. The model can also accommodate a Markovian arrival process which enables claim sizes to be correlated with the inter-claim times. The marginal density function of the surplus immediately prior to ruin is specifically considered. Several numerical examples are presented to illustrate the application of this result.

18.
We introduce a jump-diffusion model for asset returns with jumps drawn from a mixture of normal distributions and show that this model adequately fits the historical data of the S&P500 index. We consider a delta-hedging strategy (DHS) for vanilla options under the diffusion model (DM) and the proposed jump-diffusion model (JDM), assuming discrete trading intervals and transaction costs, and derive an approximation for the probability density function (PDF) of the profit-and-loss (P&L) of the DHS under both models. We find that, under the log-normal model of Black–Scholes–Merton, the actual PDF of the P&L can be well approximated by the chi-squared distribution with specific parameters. We derive an approximation for the P&L volatility in the DM and JDM. We show that, under both DM and JDM, the expected loss due to transaction costs is inversely proportional to the square root of the hedging frequency. We apply mean–variance analysis to find the optimal hedging frequency given the hedger's risk tolerance. Since under the JDM it is impossible to reduce the P&L volatility by increasing the hedging frequency, we consider an alternative hedging strategy, following which the P&L volatility can be reduced by increasing the hedging frequency.
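The discrete-hedging experiment under the pure diffusion model can be sketched as follows (zero rates, no transaction costs, no jumps; all market parameters are illustrative): rebalancing ten times more often shrinks the standard deviation of the replication P&L by roughly the square root of ten, which is the diffusion-model behaviour that breaks down once jumps are added.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
S0, K, T, sig = 100.0, 100.0, 1.0, 0.2      # illustrative market parameters
n_paths, n_steps = 2000, 100
dt = T / n_steps
z = rng.standard_normal((n_paths, n_steps))
steps = (-0.5 * sig**2) * dt + sig * np.sqrt(dt) * z
S = S0 * np.exp(np.concatenate([np.zeros((n_paths, 1)),
                                np.cumsum(steps, axis=1)], axis=1))

def call_delta(s, t_left):
    d1 = (np.log(s / K) + 0.5 * sig**2 * t_left) / (sig * np.sqrt(t_left))
    return norm.cdf(d1)

call_price0 = S0 * call_delta(S0, T) - K * norm.cdf(
    (np.log(S0 / K) - 0.5 * sig**2 * T) / (sig * np.sqrt(T)))

def hedge_pnl(every):
    cash = np.full(n_paths, call_price0)    # sell the call, collect premium
    delta = np.zeros(n_paths)
    for i in range(0, n_steps, every):      # rebalance every `every` steps
        new_delta = call_delta(S[:, i], T - i * dt)
        cash -= (new_delta - delta) * S[:, i]
        delta = new_delta
    return cash + delta * S[:, -1] - np.maximum(S[:, -1] - K, 0.0)

std_coarse, std_fine = hedge_pnl(10).std(), hedge_pnl(1).std()
```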

19.
This paper extends the macroeconomic frailty model to include sectoral frailty factors that capture default correlations among firms in a similar business. We estimate sectoral and macroeconomic frailty factors and their effects on default intensity using the data for Japanese firms from 1992 to 2010. We find strong evidence for the presence of sectoral frailty factors even after accounting for the effects of observable covariates and macroeconomic frailty on default intensity. The model with sectoral frailties performs better than that without. Results show that accounting for the sources of unobserved sectoral default risk covariations improves the accuracy of default probability estimation.

20.
This study compares the performance of the widely used risk measure, value at risk (VaR), across a large sample of developed and emerging countries. The performance of VaR is assessed using both the unconditional and conditional tests of Kupiec and Christoffersen, respectively, as well as the quadratic loss function. The results indicate that VaR performs much more poorly when measuring the risk of developed countries than of emerging ones. One possible reason might be the deeper initial impact of the global financial crisis on developed countries. The results also provide evidence of the decoupling of the market risk of emerging and developed countries during the global financial crisis.
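The unconditional coverage test mentioned above is a short likelihood-ratio computation. Here is a minimal sketch with a made-up 250-day backtest (the conditional Christoffersen test, which additionally examines the clustering of exceptions, is not shown):

```python
from scipy.special import xlogy
from scipy.stats import chi2

def kupiec_lr(n, x, p):
    """Kupiec proportion-of-failures test: x VaR exceptions in n days at level p."""
    pi_hat = x / n
    ll_null = xlogy(n - x, 1.0 - p) + xlogy(x, p)           # H0: exception rate = p
    ll_alt = xlogy(n - x, 1.0 - pi_hat) + xlogy(x, pi_hat)  # MLE exception rate
    lr = -2.0 * (ll_null - ll_alt)
    return float(lr), float(chi2.sf(lr, df=1))  # asymptotically chi-square(1)

# made-up example: 5 exceptions in 250 days for a 99% VaR
lr, pval = kupiec_lr(n=250, x=5, p=0.01)
```

`xlogy` returns 0 when the count is 0, so the edge cases x = 0 and x = n are handled without log-of-zero warnings.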
