Similar Documents
20 similar documents found.
1.
Operational risk is an increasingly important area of risk management. Scenarios are an important modelling tool in operational risk management, as viable alternative methods may not exist: modelling, data and implementation issues can be challenging, and other methods fail to take expert information into account. The use of scenarios has been recommended by regulators; however, scenarios can be unreliable, unrealistic and fail to take quantitative data into account. These problems have also been identified by regulators such as the Basel Committee, and at present little literature exists on the problem of generating scenarios for operational risk. In this paper we propose a method for generating operational risk scenarios. We employ cluster analysis to generate scenarios that combine expert opinion scenarios with quantitative operational risk data. We show that this scenario generation method leads to significantly improved scenarios and significant advantages for operational risk applications. In particular, for operational risk modelling our method resolves the key problem of combining two sources of information without eliminating the information content gained from expert opinions, yields a tractable computational implementation, improves stress testing and what-if analyses, and applies to a wide range of quantitative operational risk data (including multivariate distributions). We conduct numerical experiments to demonstrate and validate the method's performance and compare it against scenarios generated from statistical property matching. Copyright © 2013 John Wiley & Sons, Ltd.
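The paper's own algorithm is not reproduced in this listing, but as a rough illustration of the general idea — clustering historical loss observations and treating the cluster centroids as data-driven scenarios that can sit alongside expert scenarios — a minimal sketch follows. The loss data, the number of clusters, and the expert scenario and its weight are all hypothetical assumptions, not the paper's calibration.

```python
# Illustrative sketch: k-means clustering of (event count, log severity) loss data
# to derive data-driven scenario centroids that can sit alongside expert scenarios.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical loss observations: annual event counts and log loss severities
counts = rng.poisson(lam=20, size=500)
log_sev = np.log(rng.lognormal(mean=10, sigma=2, size=500))
X = np.column_stack([counts, log_sev])

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
data_scenarios = kmeans.cluster_centers_              # (count, log-severity) centroids
data_weights = np.bincount(kmeans.labels_) / len(X)   # empirical scenario probabilities

# Hypothetical expert scenario on the same scale, with a judgmental weight
expert_scenarios = np.array([[40.0, np.log(5e6)]])
expert_weights = np.array([0.05])

scenarios = np.vstack([data_scenarios, expert_scenarios])
probs = np.concatenate([data_weights * (1 - expert_weights.sum()), expert_weights])
print(np.round(scenarios, 2), np.round(probs, 3))
```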

2.
Integrated risk management for financial institutions requires an approach for aggregating risk types (market, credit, and operational) whose distributional shapes vary considerably. We construct the joint risk distribution for a typical large, internationally active bank using the method of copulas. This technique allows us to incorporate realistic marginal distributions that capture essential empirical features of these risks such as skewness and fat-tails while allowing for a rich dependence structure. We explore the impact of business mix and inter-risk correlations on total risk. We then compare the copula-based method with several conventional approaches to computing risk.
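The abstract does not spell out its copula construction; the sketch below only shows the generic mechanics of aggregating three heavy-tailed marginals through a Gaussian copula and reading off a diversified risk figure. The marginal distributions, the inter-risk correlation matrix and the 99.9% level are illustrative assumptions, not the paper's calibration.

```python
# Illustrative Gaussian-copula aggregation of market, credit and operational risk.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200_000

# Assumed inter-risk correlation matrix (not the paper's calibration)
corr = np.array([[1.0, 0.5, 0.2],
                 [0.5, 1.0, 0.2],
                 [0.2, 0.2, 1.0]])
L = np.linalg.cholesky(corr)
z = rng.standard_normal((n, 3)) @ L.T
u = stats.norm.cdf(z)                                # copula: dependent uniforms

# Assumed marginal loss distributions capturing skewness / fat tails
market = stats.t.ppf(u[:, 0], df=4)                  # heavy-tailed, symmetric
credit = stats.lognorm.ppf(u[:, 1], s=1.0)           # skewed
operational = stats.lognorm.ppf(u[:, 2], s=2.0)      # very heavy right tail

total = market + credit + operational
var_999 = np.quantile(total, 0.999)
standalone = sum(np.quantile(x, 0.999) for x in (market, credit, operational))
print(f"diversified 99.9% VaR: {var_999:.1f}  vs sum of standalone: {standalone:.1f}")
```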

3.
Journal of Banking & Finance, 2006, 30(10): 2635-2658
Due to the new regulatory guidelines known as Basel II for banking and Solvency 2 for insurance, the financial industry is looking for qualitative approaches to and quantitative models for operational risk. Whereas a full quantitative approach may never be achieved, in this paper we present some techniques from probability and statistics which no doubt will prove useful in any quantitative modelling environment. The techniques discussed are advanced peaks over threshold modelling, the construction of dependent loss processes and the establishment of bounds for risk measures under partial information, and can be applied to other areas of quantitative risk management.
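As a hint of what peaks-over-threshold modelling involves in practice (the paper's advanced variants go well beyond this), a basic POT fit is sketched below: excesses over a high threshold are fitted with a generalized Pareto distribution and a tail quantile is read off. The data and the threshold choice are simulated assumptions.

```python
# Basic peaks-over-threshold (POT) sketch: fit a GPD to excesses over a threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
losses = rng.lognormal(mean=10, sigma=2, size=5000)   # hypothetical operational losses

u = np.quantile(losses, 0.95)                         # threshold choice is illustrative
excesses = losses[losses > u] - u
xi, loc, beta = stats.genpareto.fit(excesses, floc=0) # shape xi, scale beta

# POT tail estimator for a high quantile (VaR at level p)
p = 0.999
n, n_u = len(losses), len(excesses)
var_p = u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
print(f"threshold={u:.0f}, xi={xi:.2f}, VaR[{p}]={var_p:.0f}")
```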

4.
Data insufficiency and reporting threshold are two main issues in operational risk modelling. When these conditions are present, maximum likelihood estimation (MLE) may produce very poor parameter estimates. In this study, we first investigate four methods to estimate the parameters of truncated distributions for small samples—MLE, expectation-maximization algorithm, penalized likelihood estimators, and Bayesian methods. Without any proper prior information, Jeffreys’ prior for truncated distributions is used. Based on a simulation study for the log-normal distribution, we find that the Bayesian method gives much more credible and reliable estimates than the MLE method. Finally, an application to the operational loss severity estimation using real data is conducted using the truncated log-normal and log-gamma distributions. With the Bayesian method, the loss distribution parameters and value-at-risk measure for every cell with loss data can be estimated separately for internal and external data. Moreover, confidence intervals for the Bayesian estimates are obtained via a bootstrap method.
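To make the reporting-threshold issue concrete, here is a minimal sketch of maximum likelihood for a left-truncated log-normal severity — the baseline the paper shows can behave poorly for small samples; the Bayesian, EM and penalized alternatives the authors study are not reproduced. The threshold and the data are simulated assumptions.

```python
# MLE for a left-truncated log-normal severity with a reporting threshold H.
# (Baseline estimator; the paper argues it can be unstable for small samples.)
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
H = 10_000.0                                   # hypothetical reporting threshold
full = rng.lognormal(mean=9.0, sigma=2.0, size=2000)
data = full[full >= H]                         # only losses above H are recorded

def neg_loglik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    # density conditional on exceeding the threshold: f(x) / (1 - F(H))
    return -(np.sum(dist.logpdf(data)) - len(data) * np.log(dist.sf(H)))

res = optimize.minimize(neg_loglik, x0=[np.log(np.median(data)), 1.0],
                        method="Nelder-Mead")
print("truncated-MLE (mu, sigma):", res.x)
```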

5.
6.
An accurate operational risk loss distribution underpins accurate risk measurement. Analysing banks' operational loss data, researchers abroad have generally concluded that operational losses approximately follow a Poisson or negative binomial distribution. Based on operational risk loss data of Chinese commercial banks from 1994 to 2008, distributional tests and Bayesian Markov chain Monte Carlo frequency analysis show that the operational loss distribution of Chinese commercial banks approximately follows a Generalized Extreme Value (GEV) distribution.
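A minimal sketch of fitting a GEV distribution with SciPy, in the spirit of (but not reproducing) the paper's analysis; the data below are simulated block maxima, not the 1994–2008 Chinese loss sample.

```python
# Fitting a generalized extreme value (GEV) distribution to simulated block maxima.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical data: yearly maxima of heavy-tailed operational losses
losses = rng.lognormal(mean=12, sigma=1.5, size=(30, 250)).max(axis=1)

# SciPy's genextreme uses shape c = -xi relative to the usual GEV parameterisation
c, loc, scale = stats.genextreme.fit(losses)
print(f"xi={-c:.2f}, loc={loc:.3g}, scale={scale:.3g}")

# Illustrative goodness-of-fit check: Kolmogorov-Smirnov against the fitted GEV
print(stats.kstest(losses, "genextreme", args=(c, loc, scale)))
```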

7.
In recent years, studies on risk management have been prompted by the Basel Committee's requirements for regular banking supervision. However, some widely used risk management methods either calculate risk measures under a Gaussian distributional assumption or involve numerical difficulty. The primary aim of this paper is to present a realistic and fast method, GHICA, which overcomes these limitations in multivariate risk analysis. The idea is to first retrieve independent components (ICs) from the observed high-dimensional time series and then individually and adaptively fit the resulting ICs within the generalized hyperbolic (GH) distributional framework. For the volatility estimation of each IC, the local exponential smoothing technique is used to achieve the best possible estimation accuracy. Finally, the fast Fourier transform technique is used to approximate the density of the portfolio returns. The proposed GHICA method is applicable to covariance estimation as well. It is compared with the dynamic conditional correlation (DCC) method on simulated data with d = 50 GH-distributed components. We further apply the GHICA method to calculate risk measures for 20-dimensional German DAX portfolios and a dynamic exchange rate portfolio, and consider several alternative methods to compare the accuracy of the calculations.
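The full GHICA pipeline (ICA extraction, adaptive local volatility estimation, GH fitting, FFT density approximation) is involved; below is only a skeletal sketch of its first two building blocks — extracting independent components with FastICA and fitting a generalized hyperbolic law to each — using standard library routines rather than the authors' adaptive estimators. The data are simulated, and the generic GH maximum-likelihood fit in SciPy can be slow and fragile.

```python
# Skeletal sketch of two GHICA ingredients: ICA extraction + a GH fit per component.
# The paper's local exponential smoothing and FFT steps are not reproduced.
import numpy as np
from scipy import stats
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
# Hypothetical 5-dimensional return series built from mixed heavy-tailed sources
sources = stats.t.rvs(df=5, size=(1000, 5), random_state=6)
mixing = rng.standard_normal((5, 5))
returns = sources @ mixing.T

ica = FastICA(n_components=5, random_state=0)
ics = ica.fit_transform(returns)              # independent components, one per column

# Fit a generalized hyperbolic distribution to each IC (generic MLE; may be slow)
for j in range(ics.shape[1]):
    p, a, b, loc, scale = stats.genhyperbolic.fit(ics[:, j])
    print(f"IC{j}: p={p:.2f} a={a:.2f} b={b:.2f} loc={loc:.2f} scale={scale:.2f}")
```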

8.
Operational risk is one of the main risks that financial institutions have to face, and quantifying it is a precondition for managing it effectively. Because understanding of operational risk is still at an early stage, commercial banks' measurement and management of it remain exploratory. This paper analyses the main quantification methods for operational risk and conducts an empirical study of the operational risk of two Chinese joint-stock commercial banks.

9.
An important issue in global corporate risk management is whether the multinationality of a firm matters in terms of its effect on exchange risk exposure. In this paper, we examine the exchange risk exposure of US firms during 1983–2006, comparing multinational and non-multinational firms and focusing on the role of operational hedging. Since MNCs and non-multinationals differ in size and other characteristics, we construct matched samples of MNCs and non-multinationals based on the propensity score method. We find that multinationality in fact matters for a firm's exchange exposure, but not in the way usually presumed: exchange risk exposures are actually smaller and less significant for MNCs than for non-multinationals. The results are robust with respect to different samples and model specifications. There is evidence that operational hedging decreases a firm's exchange risk exposure and increases its stock returns. The effective deployment of operational risk management strategies provides one reason why MNCs may have insignificant exchange risk exposure estimates.
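The matching step mentioned here is standard: estimate each firm's probability of being an MNC from observable characteristics with a logistic regression, then pair MNCs with the non-MNCs whose propensity scores are closest. A generic sketch with fabricated firm characteristics follows; it is not the paper's dataset or covariate set.

```python
# Generic propensity-score matching sketch: logit propensity scores, then
# 1-nearest-neighbour matching of MNCs to non-MNCs. All data are fabricated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(12)
n = 500
size = rng.normal(8, 1.5, n)                 # log firm size (fabricated covariate)
leverage = rng.normal(0.3, 0.1, n)           # leverage (fabricated covariate)
is_mnc = (rng.random(n) < 1 / (1 + np.exp(-(size - 8)))).astype(int)

X = np.column_stack([size, leverage])
pscore = LogisticRegression().fit(X, is_mnc).predict_proba(X)[:, 1]

treated = np.where(is_mnc == 1)[0]
controls = np.where(is_mnc == 0)[0]
# Match each MNC to the non-MNC with the nearest propensity score
dist = np.abs(pscore[treated][:, None] - pscore[controls][None, :])
matches = controls[np.argmin(dist, axis=1)]
print("first 5 matched pairs (MNC index, control index):",
      list(zip(treated[:5], matches[:5])))
```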

10.
Böcker and Klüppelberg [Risk Mag., 2005, December, 90–93] presented a simple approximation of OpVaR of a single operational risk cell. The present paper derives approximations of similar quality and simplicity for the multivariate problem. Our approach is based on the modelling of the dependence structure of different cells via the new concept of a Lévy copula.
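For context, the single-cell approximation the paper starts from says that, for heavy-tailed (subexponential) severity F and expected frequency λt, OpVaR at confidence level κ is roughly F⁻¹(1 − (1 − κ)/(λt)). The sketch below numerically checks that closed form against a crude Monte Carlo for a lognormal/Poisson cell with made-up parameters; the paper's multivariate Lévy-copula construction is not reproduced.

```python
# Böcker-Klüppelberg single-cell closed-form OpVaR vs. a crude Monte Carlo check.
# OpVaR_t(kappa) ≈ F^{-1}(1 - (1 - kappa) / (lambda * t)) for heavy-tailed severity F.
import numpy as np
from scipy import stats

lam, t, kappa = 25.0, 1.0, 0.999           # Poisson intensity, horizon, confidence level
mu, sigma = 10.0, 2.0                      # hypothetical log-normal severity parameters
severity = stats.lognorm(s=sigma, scale=np.exp(mu))

opvar_closed = severity.ppf(1 - (1 - kappa) / (lam * t))

rng = np.random.default_rng(7)
n_sims = 50_000
counts = rng.poisson(lam * t, size=n_sims)
annual = np.array([rng.lognormal(mu, sigma, size=k).sum() for k in counts])
opvar_mc = np.quantile(annual, kappa)

print(f"closed form: {opvar_closed:.3e}   Monte Carlo: {opvar_mc:.3e}")
```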

11.
The mean-Gini framework has been suggested as a robust alternative to the portfolio approach to futures hedging given its optimality under general distributional conditions. However, calculation of the Gini hedge ratio requires estimation of the underlying price distribution. We estimate minimum-Gini hedge ratios using two widely-used estimation procedures, the empirical distribution function method and the kernel method, for three emerging market and three developed market currencies. We find that these methods yield different Gini hedge ratios. These differences increase with risk aversion and are statistically significant for all developed market currencies but only one emerging market currency. In-sample analyses show that the empirical distribution function method is more effective at risk reduction than the kernel method for developed market currencies, whereas the kernel method is superior for emerging market currencies. Post-sample analyses strengthen the superiority of the empirical distribution function method for developed market and, in several cases, for emerging market currencies.

JEL Classification: F31, G15
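One common way to compute the quantity discussed here uses the extended Gini coefficient Γ_ν(R) = −ν·cov(R, [1 − F(R)]^(ν−1)), with F estimated by the empirical distribution function, and searches over hedge ratios h for the one minimizing Γ_ν of the hedged position. The sketch below does this by grid search on simulated spot/futures returns; the currency data, risk-aversion parameter ν, and kernel-based variant studied in the paper are not reproduced.

```python
# Minimum-extended-Gini hedge ratio via the empirical distribution function (grid search).
import numpy as np

rng = np.random.default_rng(11)
n = 1000
futures = rng.normal(0, 0.01, n)                    # hypothetical futures returns
spot = 0.9 * futures + rng.normal(0, 0.004, n)      # correlated hypothetical spot returns

def extended_gini(r, nu):
    # Gamma_nu(R) = -nu * cov(R, (1 - F(R))**(nu - 1)), F = empirical CDF
    ranks = np.argsort(np.argsort(r)) + 1
    F = ranks / (len(r) + 1)
    return -nu * np.cov(r, (1 - F) ** (nu - 1))[0, 1]

nu = 5.0                                            # assumed risk-aversion parameter
hs = np.linspace(0.0, 2.0, 401)
ginis = [extended_gini(spot - h * futures, nu) for h in hs]
h_star = hs[int(np.argmin(ginis))]
print(f"minimum-Gini hedge ratio (nu={nu}): {h_star:.3f}")
```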

12.
This study explores the importance of imposing a correct distributional hypothesis in a risk management strategy, by comparing hedge ratios under the restrictive normality assumption to those under the generalized stable distribution. Concepts are illustrated for the case of a representative Pennsylvania dairy farm manager who purchases corn as a feed input. The results show that the time processes of corn prices and basis risk in five Pennsylvania regions do not correspond to the normal distribution; they correspond more closely to a member of the stable distribution family. The estimated hedge ratios under the stable distribution are typically larger than those under the normal distribution. The difference would be a bias from imposing a wrong distributional assumption.

13.
Operational risk is arguably the oldest risk faced by commercial banks, yet banks currently pay far more attention to market risk and credit risk than to operational risk. In fact, for banks with weak governance structures, operational risk management may be the area most deserving of attention and most likely to yield results. Measurement is the foundation of operational risk management: only by measuring operational risk accurately can effective risk management be implemented.

14.
Research on Current Operational Risk Problems in Chinese Commercial Banks
Through a comprehensive and in-depth analysis of the characteristics and causes of the operational risk currently facing China's banking industry, this project systematically proposes targeted countermeasures for preventing operational risk, providing theoretical ideas and practical guidance for resolving the operational risk problems of Chinese commercial banks.

15.
Basel II introduced a three-pillar approach that concentrated upon new capital ratios (Pillar I), new supervisory procedures (Pillar II), and better overall disclosure to ensure effective market discipline and transparency (Pillar III). Importantly, it introduced operational risk as a standalone area of the bank which, for the first time, had to be measured and managed, with capital allocated against the calculated operational risks. Concurrently, Solvency II was re-imagining regulation within the insurance industry and also developing operational risk measures. Given that Basel II was first published in 2004 and Solvency II was set to go live in January 2014, this paper analyses the strategic challenges of Basel II in the UK banking sector and uses the results to inform a survey of a major UK insurance provider. We report that the effectiveness of Basel II rested on: reliance upon people for effective decision making; good training for the empowerment of staff; Board-level engagement; and individuals' own world views and perceptions, which influenced the adoption of an organizational risk culture. We then take the findings to inform a survey utilizing structural equation modelling to analyse risk reporting and escalation in a large UK insurance company. The results indicate that attitude and uncertainty significantly affect individuals' intention to escalate operational risk and that, if these factors are not recognized by insurance companies and regulators, they will hinder the effectiveness of Solvency II implementation.

16.
Operational Risk in China's Property Insurance Companies: Analysis and Management Countermeasures
Risk management capability is one of the core competencies of a financial enterprise, and against the backdrop of the global financial crisis, a financial firm's own risk management is especially important. International experience data show that operational risk is the main cause of financial distress or bankruptcy among property insurers, so operational risk management is the focus of property insurers' risk management. Drawing on the banking industry's experience in managing operational risk, this paper defines the scope of operational risk, describes its characteristics, analyses the main operational risks currently facing China's property insurance industry, and proposes management countermeasures.

17.
Modeling Operational Risk With Bayesian Networks
Bayesian networks are an emerging tool for a wide range of risk management applications, one of which is the modeling of operational risk. This comes at a time when changes in the supervision of financial institutions have resulted in increased scrutiny of the risk management of banks and insurance companies, giving the industry an impetus to measure and manage operational risk. The more established methods for risk quantification are linear models such as time series models, econometric models, empirical actuarial models, and extreme value theory. Due to data limitations and complex interactions between operational risk variables, various nonlinear methods have been proposed, one of which is the focus of this article: Bayesian networks. Using an idealized example of a fictitious online business, we construct a Bayesian network that models various risk factors and their combination into an overall loss distribution. Using this model, we show how established Bayesian network methodology can be applied to: (1) form posterior marginal distributions of variables based on evidence, (2) simulate scenarios, (3) update the parameters of the model using data, and (4) quantify in real time how well the model predictions compare to actual data. A specific application of Bayesian networks to operational risk in an insurance setting is then suggested.
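As a toy illustration of the kind of network described (not the article's fictitious online-business model), the sketch below forward-samples a three-node discrete network — a risk driver, a failure event conditioned on it, and a loss severity conditioned on the event — to build a loss distribution with plain NumPy. All probability tables and parameters are made up.

```python
# Toy Bayesian-network-style loss model, forward-sampled with NumPy.
# Structure: SystemQuality -> FailureEvent -> Loss. All CPTs are hypothetical.
import numpy as np

rng = np.random.default_rng(8)
n = 100_000

# P(SystemQuality): 0 = good, 1 = poor
quality = rng.choice([0, 1], size=n, p=[0.8, 0.2])

# P(FailureEvent = 1 | SystemQuality)
p_fail = np.where(quality == 1, 0.15, 0.02)
failure = rng.random(n) < p_fail

# P(Loss | FailureEvent = 1): lognormal severity; no failure -> zero loss
loss = np.where(failure, rng.lognormal(mean=11, sigma=1.5, size=n), 0.0)

print("P(failure) =", failure.mean())
print("99.9% loss quantile =", np.quantile(loss, 0.999))
# 'Evidence' update by conditioning the samples, e.g. given poor system quality:
print("P(failure | poor quality) =", failure[quality == 1].mean())
```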

18.
Statistical Characteristics of Operational Risk in Commercial Banks and an Empirical Simulation of Risk Capital
Based on statistics of operational risk events that occurred in Chinese commercial banks in recent years, this paper identifies important characteristics of operational risk in Chinese commercial banks: internal fraud and the losses it causes account for the largest share; operational risk capital shows a clear procyclical effect; and fraud-related operational risk diverges from regional levels of the rule of law. After fitting the frequency and severity distributions of each loss type, the paper runs 10,231 Monte Carlo simulations of operational risk capital for Chinese commercial banks. The results show that, at a 99.9% confidence level, the Chinese commercial banking sector as a whole, having provisioned RMB 316.3 billion of operational risk capital, could roughly withstand the shock of all operational risk losses encountered over 150 years.
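The paper's loss-type-specific fits and its full simulation are not reproduced here; the sketch below only shows the generic frequency/severity Monte Carlo step — compound-Poisson aggregation of annual losses and a 99.9% quantile as a capital figure — using made-up parameters.

```python
# Generic loss-distribution-approach Monte Carlo: compound Poisson aggregate loss
# and its 99.9% quantile as an operational risk capital figure. Parameters are made up.
import numpy as np

rng = np.random.default_rng(9)
n_sims = 20_000
lam = 30                      # hypothetical annual event frequency
mu, sigma = 13.0, 2.0         # hypothetical log-normal severity parameters

annual_losses = np.empty(n_sims)
for i in range(n_sims):
    k = rng.poisson(lam)
    annual_losses[i] = rng.lognormal(mu, sigma, size=k).sum()

expected_loss = annual_losses.mean()
var_999 = np.quantile(annual_losses, 0.999)
print(f"expected loss: {expected_loss:.3e}, 99.9% VaR (capital): {var_999:.3e}")
```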

19.
Empirical Analysis of Operational Risk in Commercial Banks and Measurement of Risk Capital
This paper measures bank operational risk using the income model, a top-down approach, establishing from a macro perspective the relationship between commercial banks' net profit, economic growth, and non-performing assets. It conducts an empirical analysis of the operational risk of two Chinese commercial banks and, on that basis, derives a measure of operational risk capital. The study finds that the income model can, to some extent, reflect the size of operational risk; Chinese commercial banks currently face relatively serious operational risk; and market factors have a more pronounced impact on income than credit factors.
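The income model is a top-down approach: regress bank income on observable market and credit drivers and treat a multiple of the unexplained income volatility as a proxy for operational risk capital. A minimal sketch with made-up data follows; the factor choices, the multiplier and the data are illustrative assumptions, not the paper's specification.

```python
# Top-down "income model" sketch: regress net profit on macro / credit drivers and
# take a multiple of the residual volatility as an operational risk proxy. Data are made up.
import numpy as np

rng = np.random.default_rng(10)
T = 40                                        # hypothetical quarterly observations
gdp_growth = rng.normal(0.02, 0.01, T)        # market factor
npl_ratio = rng.normal(0.05, 0.02, T)         # credit factor (non-performing loans)
net_profit = 5.0 + 80.0 * gdp_growth - 30.0 * npl_ratio + rng.normal(0, 1.0, T)

X = np.column_stack([np.ones(T), gdp_growth, npl_ratio])
beta, *_ = np.linalg.lstsq(X, net_profit, rcond=None)
residuals = net_profit - X @ beta

sigma_unexplained = residuals.std(ddof=X.shape[1])
op_risk_capital = 3.1 * sigma_unexplained     # e.g. a 99.9%-style normal multiplier
print(f"betas: {beta.round(2)}, residual sigma: {sigma_unexplained:.2f}, "
      f"capital proxy: {op_risk_capital:.2f}")
```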

20.
Measuring Operational Risk with a Delta-EVT Model under the Head Office–Branch System
邹薇, 陈云. 《金融论坛》, 2007(6): 40-45
Under the head office–branch system, flaws in institutional design and failures in personnel management are among the important causes of commercial bank operational risk. With incomplete operational loss data, losses are first classified and specific risk factors are selected, such as errors in the head office's staffing of branch management, mismatches among personnel within an institution, staff quality problems, delays in branches reporting problems to the head office, and the head office's ability to deal with problem branches in a timely manner. Within the Delta-EVT framework, the Delta method is then used to compute the operational losses driven by these risk factors, while the excess losses caused by control failures or external events are handled with EVT; a threshold value links the two components. In this way the operational risk arising from poor institutional design, both in branch operations and in the transmission of information to the head office, can be estimated fairly accurately.
