Similar Literature
20 similar documents found.
1.
The loss distribution approach is one of the three advanced measurement approaches for Pillar I modeling proposed by Basel II in 2001. In this paper, one possible approximation of the aggregate and maximum loss distribution is given for the extremely low frequency/high severity case, i.e. the case of infinite mean of the loss sizes and of the loss inter-arrival times. Independent but not identically distributed losses are considered, with the minimum loss amount increasing over time. A Monte Carlo simulation algorithm is presented and several quantiles are estimated. The same approximation is used for modeling the maximum and aggregate worldwide economic losses caused by very rare and very extreme events such as 9/11, the Russian rouble crisis, and the U.S. subprime mortgage crisis. The model parameters are fitted to a data sample of operational losses, and the respective aggregate and extremal loss quantiles are calculated.
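A minimal sketch of the kind of Monte Carlo quantile estimation this abstract describes, assuming Pareto severities with tail index alpha < 1 (infinite mean), one loss per period, and a minimum loss amount that grows over time. All parameter values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_losses(alpha=0.8, x_min0=1.0, growth=0.01,
                    horizon=100, n_paths=100_000):
    """Aggregate and maximum loss with infinite-mean Pareto severities.

    alpha < 1 gives losses with infinite mean; the minimum loss x_min
    increases over time, as in the abstract. Values are illustrative.
    """
    agg = np.zeros(n_paths)
    mx = np.zeros(n_paths)
    for t in range(horizon):
        x_min = x_min0 * (1.0 + growth) ** t          # growing minimum loss
        # Pareto(alpha) by inversion: X = x_min * U**(-1/alpha), U in (0, 1]
        u = 1.0 - rng.random(n_paths)
        losses = x_min * u ** (-1.0 / alpha)
        agg += losses
        mx = np.maximum(mx, losses)
    return agg, mx

agg, mx = simulate_losses()
for q in (0.95, 0.99, 0.999):
    print(f"q={q}: aggregate {np.quantile(agg, q):.1f}, "
          f"maximum {np.quantile(mx, q):.1f}")
```

Because the mean is infinite, sample averages do not stabilise as paths are added; quantiles of the simulated aggregate and maximum, as estimated here, remain meaningful.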

2.
Operational risk is an increasingly important area of risk management. Scenarios are an important modelling tool in operational risk management, as alternative viable methods may not exist: other methods can face challenging modelling, data and implementation issues, and fail to take expert information into account. The use of scenarios has been recommended by regulators; however, scenarios can be unreliable, unrealistic and fail to take quantitative data into account. These problems have also been identified by regulators such as the Basel Committee, and at present little literature exists on generating scenarios for operational risk. In this paper we propose a method for generating operational risk scenarios. We employ cluster analysis to generate scenarios that combine expert opinion scenarios with quantitative operational risk data. We show that this scenario generation method leads to significantly improved scenarios and significant advantages for operational risk applications. In particular, for operational risk modelling our method resolves the key problem of combining two sources of information without eliminating the information content gained from expert opinions; it offers a tractable computational implementation for operational risk modelling, improved stress testing and what-if analyses, and it applies to a wide range of quantitative operational risk data (including multivariate distributions). We conduct numerical experiments to demonstrate and validate the method's performance and compare it against scenarios generated from statistical property matching. Copyright © 2013 John Wiley & Sons, Ltd.
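A rough sketch of the general idea, assuming expert scenarios can be encoded as points in the same feature space as the data (here, log frequency and log severity per risk cell) and weighted by simple duplication. This is not the paper's exact algorithm; scikit-learn's KMeans is used for the clustering step.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Quantitative loss data: (log frequency, log severity) per risk cell.
data = np.column_stack([rng.normal(2.0, 0.5, 500),
                        rng.normal(10.0, 1.5, 500)])

# Expert-opinion scenarios encoded in the same feature space (assumed).
expert = np.array([[3.0, 13.0],   # "high frequency / extreme severity"
                   [1.0, 12.0]])  # "rare but very large loss"

# Cluster the pooled sample so centroids reflect both sources;
# duplicating expert points is a crude way to weight expert opinion.
pooled = np.vstack([data, np.repeat(expert, 25, axis=0)])
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(pooled)

# Each centroid is a candidate scenario; its weight is the cluster share.
weights = np.bincount(km.labels_) / len(pooled)
for c, w in zip(km.cluster_centers_, weights):
    print(f"scenario (log-freq={c[0]:.2f}, log-sev={c[1]:.2f}) weight={w:.2f}")
```

The resulting centroids are scenarios anchored in the data but pulled towards the expert views, so neither source of information is discarded.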

3.
We derive worst-case scenarios in a life insurance model in the case where the interest rate and the various transition intensities are mutually dependent. Examples of this dependence are that (a) surrender intensities and interest rates are high at the same time, (b) the mortality intensities of a policyholder as active and as disabled are low at the same time, and (c) the mortality intensities of the policyholders in a portfolio are low at the same time. The set from which the worst-case scenario is taken reflects the dependence structure and allows us to relate the worst-case scenario-based reserve, qualitatively, to a Value-at-Risk-based calculation of solvency capital requirements. This opens perspectives for our results in relation to qualifying the standard formula of Solvency II or using a scenario-based approach in internal models. Our results are powerful for various applications and the techniques are non-standard in control theory, precisely because our worst-case scenario is deterministic and not adapted to the stochastic development of the portfolio. The formal results are illustrated in a series of numerical studies.

4.
Taking motor vehicle insurance as the object of study, this paper analyses the forms that operational risk takes in the core business processes of motor insurance and measures that risk using a topological data model together with Monte Carlo simulation. The results show that loss severity exhibits pronounced heavy tails and that loss events occur at high frequency, but the heavy-tail feature is not evident in the aggregate loss distribution. These results provide a basis for allocating and managing economic capital for operational risk.
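A minimal sketch of the frequency/severity Monte Carlo this abstract alludes to, assuming Poisson frequency and lognormal severity with illustrative parameters, and proxying economic capital by the 99.9% Value-at-Risk minus expected loss.

```python
import numpy as np

rng = np.random.default_rng(1)

lam, mu, sigma = 250.0, 8.0, 1.2   # illustrative: high frequency, heavy-ish severity
n_sims = 50_000

# Simulate one year of losses per path: Poisson count, lognormal severities.
counts = rng.poisson(lam, n_sims)
total = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])

el = total.mean()                        # expected annual loss
var999 = np.quantile(total, 0.999)       # 99.9% quantile of aggregate loss
print(f"EL={el:,.0f}  VaR99.9={var999:,.0f}  "
      f"economic capital ~ {var999 - el:,.0f}")
```

Consistent with the abstract's finding, summing many moderately heavy-tailed severities at high frequency produces an aggregate distribution whose tail is much less pronounced than that of the individual losses.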

5.
In this exploratory paper we propose 'worldmaking' as a framework for pluralistic, imaginative scenario development. Our points of departure are the need in scenario practice to embrace uncertainty, discomfort and knowledge gaps, and the connected need to capture, and make productive, fundamental plurality among understandings of the future. To help respond to these needs, we introduce what Nelson Goodman calls worldmaking. It holds that there is no singular, objective world (or "real reality"); instead, worlds are multiple, constructed through creative processes rather than given, and always in the process of becoming. We then explore how worldmaking can operationalise discordant pluralism in scenario practice by allowing participants to approach not only the future but also the present in a constructivist and pluralistic fashion, and by extending pluralism to ontological domains. Building on this, we investigate how scenario worldmaking could lead to more imaginative scenarios: worldmaking is framed as a fully creative process which gives participants ontological agency, and it helps make contrasts, tensions and complementarities between worlds productive. We go on to propose questions that can be used to operationalise scenario worldmaking, and conclude with the expected potential and limitations of the approach, as well as suggestions for practical experimentation.

6.
The literature on Losses Given Default (LGD) usually focuses on mean predictions, even though losses are extremely skewed and bimodal. This paper proposes a Quantile Regression (QR) approach to obtain a comprehensive view of the entire probability distribution of losses. The method allows new insights into covariate effects over the whole LGD spectrum. In particular, middle quantiles are explainable by observable covariates, while tail events, e.g. extremely high LGDs, seem to be driven rather by unobservable random events. A comparison of the QR approach with several alternatives from the recent literature reveals advantages when evaluating downturn and unexpected credit losses. In addition, we identify limitations of classical mean-prediction comparisons and propose alternative goodness-of-fit measures for validating forecasts of the entire LGD distribution.
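A short sketch of quantile regression across the LGD spectrum using statsmodels' QuantReg, on synthetic data; the covariates (collateralisation, seniority) are assumed for illustration and are not taken from the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2000
collateral = rng.uniform(0, 1, n)        # collateralisation ratio (assumed covariate)
seniority = rng.integers(0, 2, n)        # 1 = senior claim (assumed covariate)
noise = rng.beta(0.4, 0.4, n)            # bimodal disturbance, as LGDs often are
lgd = np.clip(0.8 - 0.5 * collateral - 0.2 * seniority
              + 0.4 * (noise - 0.5), 0, 1)

df = pd.DataFrame(dict(lgd=lgd, collateral=collateral, seniority=seniority))
# Fit one regression per quantile level tau to trace covariate effects
# over the whole LGD distribution, not just its mean.
for q in (0.25, 0.50, 0.75, 0.95):
    fit = smf.quantreg("lgd ~ collateral + seniority", df).fit(q=q)
    print(f"tau={q}: {fit.params.round(3).to_dict()}")
```

Comparing the coefficient paths across tau mirrors the abstract's point: covariates explain the middle of the distribution well, while the extreme upper quantiles respond only weakly to them.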

7.
Not all claims are reported when a database for financial operational risk is created. The probability of reporting increases with the size of the operational risk loss and converges towards one for big losses. Losses in operational risk have different causes and usually follow a wide variety of distributional shapes, so a method for modelling operational risk based on only one or two parametric models is bound to fail. In this paper, we introduce a semi-parametric method for modelling operational risk that is capable of taking under-reporting into account and of being guided by prior knowledge of the distributional shape.
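A sketch of one simple way to correct for under-reporting, assuming a logistic reporting probability that rises with log loss size and applying Horvitz-Thompson weights of 1/p(report) to the observed losses. This is an illustration of the under-reporting idea, not the paper's semi-parametric estimator.

```python
import numpy as np

rng = np.random.default_rng(3)

def p_report(x, x50=50.0, steep=1.5):
    """Assumed reporting probability: increasing in loss size x,
    equal to 1/2 at x50, converging to 1 for big losses."""
    return 1.0 / (1.0 + (x50 / x) ** steep)

# True losses (never fully observed in practice; simulated here).
true_losses = rng.lognormal(3.5, 1.2, 20_000)
reported = true_losses[rng.random(true_losses.size) < p_report(true_losses)]

# Horvitz-Thompson: weight each reported loss by 1 / p(report).
w = 1.0 / p_report(reported)
n_hat = w.sum()                              # estimate of the true loss count
mean_hat = (w * reported).sum() / n_hat      # bias-corrected mean severity
print(f"reported={reported.size}, estimated true count={n_hat:,.0f}")
print(f"naive mean={reported.mean():.1f}, corrected mean={mean_hat:.1f}")
print(f"true count={true_losses.size}, true mean={true_losses.mean():.1f}")
```

The naive mean overstates severity because small losses are disproportionately missing; the weighted estimate recovers both the count and the mean of the underlying population.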

8.
In this article, I identify challenges to the loss distribution approach in modeling operational risk. I propose a scenario-based methodology for operational risk assessment, which recognizes that each risk can occur under a number of wide-ranging scenarios and that the association between risks may behave differently under different scenarios. The model, developed internally in the company, provides a practical quantitative assessment of risk exposure that reflects a deep understanding of the company and its environment, making the risk calculation more responsive to the company's actual state and ensuring that the company attends to its key operational risks. In this model, qualitative and quantitative approaches are combined to build a loss distribution for individual and aggregate operational risk exposure. The model helps to portray the company's internal control systems and aspects of its business environment. These features can help the company increase its operational efficiency, reduce losses from undesirable incidents, and maintain the integrity of internal control.
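One way to read "combining qualitative and quantitative approaches" is as a mixture over expert-weighted scenarios, each with its own frequency/severity parameters. The sketch below is an assumed interpretation with illustrative numbers, not the author's model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Expert-elicited scenarios: (probability, Poisson rate, lognormal mu, sigma).
scenarios = [
    (0.70, 20.0, 7.0, 1.0),   # "business as usual"
    (0.25, 35.0, 7.5, 1.2),   # "control weakness"
    (0.05, 60.0, 8.5, 1.5),   # "severe disruption"
]
n_sims = 100_000

# Draw a scenario per simulated year, then a compound Poisson-lognormal loss.
probs = np.array([s[0] for s in scenarios])
idx = rng.choice(len(scenarios), size=n_sims, p=probs)
total = np.empty(n_sims)
for i, (_, lam, mu, sig) in enumerate(scenarios):
    m = idx == i
    counts = rng.poisson(lam, m.sum())
    total[m] = [rng.lognormal(mu, sig, k).sum() for k in counts]

print(f"mean={total.mean():,.0f}  VaR99.9={np.quantile(total, 0.999):,.0f}")
```

The scenario probabilities carry the qualitative judgment, while the per-scenario parameters can be fitted to loss data, so the aggregate distribution reflects both sources.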

9.
The quality of scenario planning activities can be difficult to assess, as one cannot know how likely any projected future scenario is. Here, we introduce one approach for gaining greater confidence. Historical analogy provides the means for achieving this, whereby the model upon which scenarios are constructed is analysed in terms of how well it predicts and establishes links with recent historical environments. We apply this approach to a previously developed scenario tree, constructed using the field anomaly relaxation method, as a case study to indicate how historical analogy can be used to assess and enhance the model from which the scenarios are constructed.

10.
The dynamic input-output model DIMITRI (Dynamic Input-output Model to study the Impacts of Technology Resulted Innovations) can be used for long-term scenario explorations of technology, demand and environmental effects. The model describes, at a sectoral level, the relations and dynamics between consumption, production and emissions. Technologies are introduced bottom-up at the sectoral level, through variations in the inputs from other sectors and changes in the coefficients for capital, labour and emissions. This paper presents a methodology for future explorations of technologies in four scenarios, based on the Intergovernmental Panel on Climate Change (IPCC) scenario framework. Trend analysis combines detailed information on specific technologies, and differentiations are made between scenarios based on their specific storylines. The adjustment of coefficients influences model outcomes such as production, balance of trade and emissions. The paper briefly outlines the methodology and presents the main outcomes of the four scenarios for the period 2000-2030.
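The static core of any such model is the Leontief relation x = (I - A)^-1 d, where A holds the technical coefficients that the technology scenarios adjust. A toy sketch under assumed three-sector numbers (DIMITRI itself is dynamic and far richer):

```python
import numpy as np

# Toy technical-coefficient matrix A: A[i, j] = input of good i
# required per unit output of sector j (all values assumed).
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.10, 0.30],
              [0.05, 0.25, 0.10]])
d = np.array([100.0, 150.0, 80.0])     # final demand per sector
e = np.array([0.5, 1.2, 0.8])          # emissions per unit output (assumed)

def output_and_emissions(A, d, e):
    x = np.linalg.solve(np.eye(3) - A, d)   # Leontief: x = (I - A)^-1 d
    return x, e @ x

x0, em0 = output_and_emissions(A, d, e)

# "Technology" scenario: sector 2 needs 30% less input from sector 3.
A2 = A.copy()
A2[2, 1] *= 0.7
x1, em1 = output_and_emissions(A2, d, e)
print(f"baseline output={x0.round(1)}, emissions={em0:.1f}")
print(f"scenario output={x1.round(1)}, emissions={em1:.1f}")
```

Changing a single coefficient propagates through the whole economy via the inverse, which is how bottom-up technology assumptions move sectoral production and emissions in the scenarios.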

11.
We investigate whether management's decision regarding the recognition of the valuation allowance (VA) for deferred tax assets provides incremental information about the persistence of accounting losses. We introduce a classification scheme that assigns loss firm-years into three categories based on whether management appears to have recognized a material change in the VA, and whether or not the firm has positive taxable income (e.g., a net operating loss). The results of our study show that our tax categories contain information about the persistence of accounting losses over the following three years beyond variables previously identified to predict loss persistence. This incremental information is consistent with management using private information about the firm's future prospects in setting the VA. Finally, we find that investors' pricing of the VA varies with the saliency of the tax signal and the information environment of the firm.

12.
Fixed income options contain substantial information on the price of interest rate volatility risk. In this paper, we ask whether those options also provide information related to other moments of the objective distribution of interest rates. Based on dynamic term structure models within the affine class, we find that interest rate options are useful for the identification of interest rate quantiles. Two three-factor models are adopted and their adequacy for estimating the Value at Risk of zero-coupon bonds is tested. We find significant differences in the quantitative assessment of risk when options are (or are not) included in the estimation of each of these dynamic models. Statistical backtests indicate that the estimated bond risk is clearly more adequate when options are adopted, although not yet completely satisfactory.
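For intuition on what "Value at Risk of zero-coupon bonds under an affine model" involves, here is a sketch with a one-factor Vasicek model (the simplest affine case; the paper uses three-factor models and option data). Parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)

kappa, theta, sigma, r0 = 0.5, 0.04, 0.015, 0.03   # illustrative Vasicek parameters
h, T = 10 / 252, 5.0                                # 10-day horizon, 5y bond

def vasicek_price(r, tau):
    """Zero-coupon bond price P = exp(lnA - B r) in the Vasicek model."""
    B = (1 - np.exp(-kappa * tau)) / kappa
    lnA = (B - tau) * (kappa**2 * theta - sigma**2 / 2) / kappa**2 \
          - sigma**2 * B**2 / (4 * kappa)
    return np.exp(lnA - B * r)

# Exact Ornstein-Uhlenbeck transition of the short rate over the horizon.
mean_h = theta + (r0 - theta) * np.exp(-kappa * h)
var_h = sigma**2 * (1 - np.exp(-2 * kappa * h)) / (2 * kappa)
r_h = rng.normal(mean_h, np.sqrt(var_h), 100_000)

p0 = vasicek_price(r0, T)
loss = p0 - vasicek_price(r_h, T - h)
print(f"price={p0:.4f}  VaR95={np.quantile(loss, 0.95):.4f}  "
      f"VaR99={np.quantile(loss, 0.99):.4f}")
```

The paper's point is that the quantiles produced this way depend heavily on how well the model parameters are identified, and that including option prices in the estimation sharpens them.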

13.
The stability of estimates is critical when applying advanced measurement approaches (AMA) such as the loss distribution approach (LDA) to operational risk capital modeling. Recent studies have identified issues with capital estimates obtained by applying the maximum likelihood estimation (MLE) method to truncated distributions: significant upward mean-bias, considerable uncertainty about the estimates, and non-robustness to both small and large losses. Although alternative estimation approaches have been proposed, there has been no comprehensive study of how these alternatives perform compared to the MLE method. This paper is the first comprehensive study of the performance of various potentially promising alternative approaches (including the minimum distance approach, quantile distance approach, scaling-based bias correction, upward scaling of lower quantiles, and right-truncated distributions) compared to MLE with regard to accuracy, precision and robustness. More importantly, based on the properties of each estimator, we propose a right-truncation with probability weighted least squares method that combines a right-truncated distribution with the minimization of a probability weighted distance (the quadratic upper-tail Anderson–Darling distance). We find that it significantly reduces the bias and volatility of capital estimates and improves their robustness to small losses near the threshold or to moving the threshold, as demonstrated by both simulation results and a real data application.
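For context, a minimal sketch of the baseline the paper critiques: MLE for a lognormal severity left-truncated at a reporting threshold H, with the truncation handled by dividing the density by the survival probability at H. Synthetic data and parameters are illustrative.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(13)

mu_true, sig_true, H = 9.0, 2.0, 10_000.0     # H = reporting threshold
x = rng.lognormal(mu_true, sig_true, 50_000)
x = x[x >= H]                                  # only losses above H are recorded

def negloglik(params):
    mu, sig = params
    if sig <= 0:
        return np.inf
    dist = stats.lognorm(s=sig, scale=np.exp(mu))
    # truncated density: f(x) / (1 - F(H)) for x >= H
    return -(dist.logpdf(x).sum() - x.size * np.log(dist.sf(H)))

res = optimize.minimize(negloglik, x0=[np.log(np.median(x)), 1.0],
                        method="Nelder-Mead")
print(f"MLE: mu={res.x[0]:.3f}, sigma={res.x[1]:.3f} "
      f"(true: {mu_true}, {sig_true})")
```

The instability the paper documents shows up when this fit is repeated on resampled data or with a shifted threshold H: the normalising term log(1 - F(H)) makes the estimates, and hence the capital numbers, very sensitive to losses near the threshold.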

14.
Common sense tells us that the future is an essential element in any strategy, and there is a good deal of literature on scenario planning, an important tool for considering the future in strategic terms. However, in many organizations there is serious resistance to the development of scenarios, and they are not broadly implemented by companies. Yet even organizations that do not rely heavily on scenario development do, in fact, construct visions to guide their strategies. What happens, then, when such a vision is not consistent with the future? To address this problem, the present article proposes a method for checking the content and consistency of an organization's vision of the future, no matter how it was conceived. The proposed method is grounded in theoretical concepts from the field of futures studies, which are described in this article. The study was motivated by the search for new ways of improving and using scenario techniques as a method for making strategic decisions. The method was tested on a company in the field of information technology in order to check its operational feasibility. The test showed that the proposed method is operationally feasible and capable of analyzing the vision of the company under study, indicating both its shortcomings and its points of inconsistency.

15.
16.
When correlations between assets turn positive, multi-asset portfolios can become riskier than single assets. This article presents the estimation of tail risk at very high quantiles using a semiparametric estimator that is particularly suitable for portfolios with a large number of assets. The estimator simultaneously captures the information contained in each individual asset return that composes the portfolio and the interrelation between assets. Notably, the accuracy of the estimates does not deteriorate as the number of assets in the portfolio increases, and the implementation is as easy for a large number of assets as for a small one. We estimate the probability distribution of large losses for the American stock market considering portfolios with ten, fifty and one hundred stocks of different market capitalization. In each case, the approximation of the portfolio tail risk is very accurate. We compare our results with well-known benchmark models.
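A sketch of one standard semiparametric route to very high portfolio-loss quantiles: a Hill estimate of the tail index on the largest losses, then Weissman extrapolation beyond the sample. This illustrates the flavour of such estimators; the authors' estimator, which models each asset and their interrelations directly, is different.

```python
import numpy as np

rng = np.random.default_rng(17)

# Illustrative portfolio: 50 correlated heavy-tailed asset returns.
n_assets, n_days = 50, 5000
common = rng.standard_t(4, n_days)
idio = rng.standard_t(4, (n_days, n_assets))
returns = 0.6 * common[:, None] + 0.8 * idio
losses = -returns.mean(axis=1)               # equally weighted portfolio loss

# Hill estimator on the k largest losses, then Weissman extrapolation.
k = 200
order = np.sort(losses)[::-1]                # descending order statistics
gamma = np.mean(np.log(order[:k] / order[k]))  # Hill tail-index estimate

def tail_quantile(p):
    """Weissman estimate of the loss level exceeded with probability p."""
    return order[k] * (k / (losses.size * p)) ** gamma

for p in (0.01, 0.001, 0.0001):
    print(f"P(loss > q) = {p}: q ~ {tail_quantile(p):.3f}")
```

Because only the top k order statistics enter the estimate, the computation scales trivially with the number of assets, echoing the abstract's point about implementation ease.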

17.
The futures literature invites researchers to investigate stakeholders' interests, actions and reactions, and to introduce an analysis of power and influence into scenario thinking. The purpose of this paper is to assess how the concept of dominance can help improve scenario building and futures thinking, as dominance transforms leadership within action processes. First, we examine power at work at different levels, using concepts that relate to dominance and leadership shifts. Secondly, we discuss methodological proposals for implementing the concepts of weak and strong dominance in action-based scenario design, and the implications of these concepts for refining the treatment of leadership in futures thinking. We conclude that paying attention to dominance transformations in scenarios is a promising direction for developing stakeholder and leadership analysis in scenario thinking, and we suggest further research on the connection between history and futures thinking.

18.
We derive an analytic approximation to the credit loss distribution of large portfolios by letting the number of exposures tend to infinity. Defaults and rating migrations for individual exposures are driven by a factor model in order to capture co-movements in changing credit quality. The limiting credit loss distribution obeys the empirical stylized facts of skewness and heavy tails. We show how portfolio features like the degree of systematic risk, credit quality and term to maturity affect the distributional shape of portfolio credit losses. Using empirical data, it appears that the Basle 8% rule corresponds to quantiles with confidence levels exceeding 98%. The limit law's relevance for credit risk management is investigated further by checking its applicability to portfolios with a finite number of exposures. Relatively homogeneous portfolios of 300 exposures can be well approximated by the limit law. A minimum of 800 exposures is required if portfolios are relatively heterogeneous. Realistic loan portfolios often contain thousands of exposures implying that our analytic approach can be a fast and accurate alternative to the standard Monte-Carlo simulation techniques adopted in much of the literature.
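The default-mode core of this kind of limit is the one-factor Gaussian (Vasicek) large-portfolio loss quantile, sketched below with illustrative PD and asset correlation; the paper's model additionally covers rating migrations and maturity effects.

```python
from math import sqrt
from scipy.stats import norm

def vasicek_loss_quantile(pd, rho, q, lgd=1.0):
    """q-quantile of the limiting loss fraction of a large, homogeneous
    portfolio in the one-factor Gaussian model:
    L_q = lgd * Phi((Phi^-1(pd) + sqrt(rho) * Phi^-1(q)) / sqrt(1 - rho))."""
    num = norm.ppf(pd) + sqrt(rho) * norm.ppf(q)
    return lgd * norm.cdf(num / sqrt(1.0 - rho))

# Illustrative parameters: 2% default probability, 15% asset correlation.
for q in (0.95, 0.98, 0.99, 0.999):
    print(f"q={q}: loss fraction = "
          f"{vasicek_loss_quantile(pd=0.02, rho=0.15, q=q):.4f}")
```

Reading off the confidence level at which the loss quantile reaches 8% of exposure is the kind of calculation behind the abstract's observation that the Basle 8% rule corresponds to confidence levels above 98% for their empirical parameters.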

19.
From the perspective of stakeholders' utility and welfare, this paper constructs a model comparing the PPP (public-private partnership) model with the public management model for building public rental housing, and analyses the comparison under symmetric and asymmetric information. The study finds that under symmetric information, the welfare generated by building public rental housing under the PPP model can never exceed that of the public management model. The asymmetric-information case, however, is closer to reality, and in that case the tighter the government's funding constraint and the greater the project risk, the more advantageous the PPP model becomes.

20.
Reverse stress tests are a relatively new stress-testing instrument that aims at finding exactly those scenarios that cause a bank to cross the frontier between survival and default; the most probable such scenario must then be identified. This paper sketches a framework for a quantitative reverse stress test for maturity-transforming banks that are exposed to credit and interest rate risk, and demonstrates how the model can be calibrated empirically. The main features of the proposed framework are: (1) the necessary steps of a reverse stress test (solving an inversion problem and computing the scenario probabilities) can be performed within one model; (2) scenarios are characterized by realizations of macroeconomic risk factors; (3) principal component analysis helps to reduce the dimensionality of the space of systematic risk factors; (4) due to data limitations, the results of reverse stress tests are exposed to considerable model and estimation risk, which makes numerous robustness checks necessary.
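A sketch of two of the steps named above: PCA to compress the macro factor space, and picking the most probable scenario on the default frontier. For zero-mean Gaussian factors and an assumed linear frontier w'x = c, the minimum-Mahalanobis-distance (i.e. most probable) point is x* = Sigma w c / (w' Sigma w). All numbers are illustrative, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(19)

# Illustrative macro factor history (GDP growth, unemployment, rates, ...).
X = rng.multivariate_normal(np.zeros(5),
                            0.5 * np.eye(5) + 0.5 * np.ones((5, 5)), 1000)
Sigma = np.cov(X, rowvar=False)

# Step 1: PCA to reduce the systematic-factor space.
eigval, eigvec = np.linalg.eigh(Sigma)
idx = np.argsort(eigval)[::-1]
explained = eigval[idx] / eigval.sum()
print("variance explained by first 2 PCs:", explained[:2].sum().round(3))

# Step 2 (assumed linear default frontier w'x = c): the most probable
# scenario on the frontier minimises the Mahalanobis distance, giving
# x* = Sigma w c / (w' Sigma w) for zero-mean Gaussian factors.
w = np.array([-1.0, 0.8, 0.5, 0.2, -0.3])   # assumed loss sensitivities
c = 3.0                                      # loss level that exhausts capital
x_star = Sigma @ w * c / (w @ Sigma @ w)
print("most probable default scenario:", x_star.round(2))
```

In the paper the frontier comes from the bank's full credit and interest rate risk model rather than a given linear w, which is what makes the inversion problem, and the attendant model risk, nontrivial.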
