Similar Documents
20 similar documents found.
1.
The loss distribution approach is one of the three advanced measurement approaches to Pillar I modeling proposed by Basel II in 2001. In this paper, one possible approximation of the aggregate and maximum loss distribution is given for the extremely low frequency/high severity case, i.e., the case in which both loss sizes and loss inter-arrival times have infinite means. Losses are assumed independent but not identically distributed, and the minimum loss amount is assumed to increase over time. A Monte Carlo simulation algorithm is presented and several quantiles are estimated. The same approximation is used to model the maximum and aggregate worldwide economic losses caused by very rare and very extreme events such as 9/11, the Russian rouble crisis, and the U.S. subprime mortgage crisis. The model parameters are fitted to a sample of operational losses, and the corresponding aggregate and extremal loss quantiles are calculated.
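A minimal Monte Carlo sketch of the kind of simulation described above, assuming Pareto-type loss sizes and inter-arrival times with tail index below one (hence infinite means) and a linearly increasing minimum loss amount; the distributions, parameters and the ten-year horizon are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_path(horizon=10.0, alpha=0.9, base_min_loss=1.0, trend=0.1):
    """One simulated path: Pareto(alpha) inter-arrival times and loss sizes
    (alpha < 1 implies infinite means), with the minimum loss amount
    increasing linearly in the arrival time."""
    t, aggregate, maximum = 0.0, 0.0, 0.0
    while True:
        t += rng.pareto(alpha) + 1.0            # heavy-tailed waiting time
        if t > horizon:
            break
        minimum = base_min_loss + trend * t     # minimum loss grows over time
        loss = minimum * (rng.pareto(alpha) + 1.0)
        aggregate += loss
        maximum = max(maximum, loss)
    return aggregate, maximum

sims = np.array([simulate_path() for _ in range(20_000)])
for q in (0.5, 0.9, 0.99, 0.999):
    agg_q, max_q = np.quantile(sims, q, axis=0)
    print(f"q={q:.3f}  aggregate quantile={agg_q:,.1f}  maximum quantile={max_q:,.1f}")
```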

2.
We propose a new procedure to estimate the loss given default (LGD) distribution. Owing to the complicated shape of the LGD distribution, using a smooth density function as a driver to estimate it may result in a decline in model fit. To overcome this problem, we first apply logistic regression to estimate the LGD cumulative distribution function and then convert the result into an estimate of the LGD distribution. To implement the proposed procedure, we collect a sample of 5269 defaulted debts from Moody’s Default and Recovery Database. A performance study is carried out on 2000 pairs of in-sample and out-of-sample data sets of different sizes, randomly selected from the entire sample. Our results show that the proposed procedure performs better and more robustly than its alternatives, in the sense of yielding more accurate in-sample and out-of-sample LGD distribution estimates, and is therefore useful for studying the LGD distribution.
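A minimal sketch of the CDF-based estimation idea: fit one logistic regression per LGD threshold to estimate the cumulative distribution function, then difference the (monotonized) CDF to obtain a distribution estimate. The synthetic data, covariates and threshold grid are assumptions for illustration, not the authors' specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=(n, 2))                                   # illustrative debt covariates
lgd = np.clip(rng.beta(0.4, 0.6, n) + 0.05 * x[:, 0], 0, 1)   # U-shaped LGDs on [0, 1]

thresholds = np.linspace(0.05, 0.95, 19)
x_new = np.array([[0.5, -0.2]])                               # a hypothetical new debt

# Step 1: estimate the conditional CDF P(LGD <= t | x) with one logistic
# regression per threshold t.
cdf = np.array([
    LogisticRegression().fit(x, (lgd <= t).astype(int)).predict_proba(x_new)[0, 1]
    for t in thresholds
])

# Step 2: convert the CDF estimate into a distribution estimate over bins,
# enforcing monotonicity before differencing.
cdf = np.maximum.accumulate(cdf)
pmf = np.diff(np.concatenate(([0.0], cdf, [1.0])))
print(np.round(pmf, 3))
```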

3.
The New Basel Accord allows internationally active banking organizations to calculate their credit risk capital requirements using an internal ratings based approach, subject to supervisory review. One of the modeling components is loss given default (LGD): the credit loss a bank incurs when extreme events impair the obligor’s ability to repay its debts to the bank. Among researchers and practitioners it is common to compute LGD as a forecast of historical losses using statistical models such as linear regression, Tobit or decision trees. However, these techniques do not seem to provide robust estimates and show low predictive performance, possibly because of factors that drive differences in LGD, such as the presence and quality of collateral, the timing of the business cycle, workout process management and M&A activity among banks. This paper evaluates an alternative way of modeling LGD based on advanced credibility theory, a technique typically used in actuarial modeling. It adds a statistical component to the credit and workout experts’ opinion embedded in the collateral and workout management process and improves the predictive power of the forecasts. The model is applied to an Italian bank’s retail portfolio of overdrafts; the application of credibility theory yields higher predictive power for the LGD estimates, and out-of-time backtesting shows stable accuracy relative to the traditional LGD model.
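The abstract does not give the credibility formulas; the sketch below uses a basic Bühlmann-style credibility factor to blend an expert-based segment LGD with the portfolio-level statistical LGD, purely to illustrate the blending idea. All figures are invented.

```python
def credibility_lgd(expert_lgd, portfolio_lgd, n_obs, k):
    """Bühlmann-style blend: Z = n / (n + k) is the credibility factor, so more
    segment observations shift weight from the portfolio LGD towards the
    expert-based segment estimate. All inputs here are illustrative."""
    z = n_obs / (n_obs + k)
    return z * expert_lgd + (1 - z) * portfolio_lgd

# A collateralised overdraft segment with 120 observed workouts, an expert
# estimate of 35% and a portfolio-wide statistical LGD of 55% (made-up figures).
print(credibility_lgd(expert_lgd=0.35, portfolio_lgd=0.55, n_obs=120, k=80))
```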

4.
We present a macro-variable-based empirical model for the credit risk of corporate bank loans. The model captures the well-known positive relationship between the probability of default (PD) and loss given default (LGD, i.e., the complement of the recovery rate) and their counter-cyclical movement over the business cycle. In the absence of proper micro data on LGD, we use a random-sampling method to estimate the annual average LGD. We specify a two-equation model for PD and LGD, estimated on Finnish time-series data from 1989 to 2008, and use a system of time-series models for the exogenous macro variables to derive the main macroeconomic shocks, which are then used in stress testing aggregate loan losses. We show that the endogenous LGD makes a considerable difference in stress tests compared with a constant-LGD assumption.
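A back-of-the-envelope sketch (with made-up numbers) of why an endogenous, PD-linked LGD changes stressed expected losses relative to a constant-LGD assumption; the linear LGD-PD link is an illustrative placeholder, not the paper's estimated equation.

```python
exposure = 100.0                       # aggregate corporate loan exposure (illustrative)
pd_base, pd_stress = 0.02, 0.08        # baseline vs stressed default rates (illustrative)

def lgd_of_pd(pd, a=0.30, b=2.5):
    """LGD increasing in PD, capped at 1 (illustrative functional form)."""
    return min(1.0, a + b * pd)

lgd_constant = lgd_of_pd(pd_base)      # LGD frozen at its baseline level

for label, pd in (("baseline", pd_base), ("stress", pd_stress)):
    el_constant = exposure * pd * lgd_constant
    el_endogenous = exposure * pd * lgd_of_pd(pd)
    print(f"{label:8s} EL with constant LGD = {el_constant:5.2f}, "
          f"EL with endogenous LGD = {el_endogenous:5.2f}")
```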

5.
In this article, a generic severity risk framework is developed in which loss given default (LGD) depends on the probability of default (PD) in an intuitive manner. By modeling the conditional mean of LGD as a function of PD, which in turn varies with systemic risk factors, the model allows an arbitrary functional relationship between PD and LGD. Based on this framework, several specifications of stochastic LGD are proposed together with detailed calibration methods. Combining these models with an extension of CreditRisk+ yields a versatile mixed Poisson credit risk model capable of handling both risk factor correlation and PD–LGD dependency. An efficient simulation algorithm based on importance sampling is also introduced for risk calculation. Empirical studies suggest that ignoring or incorrectly specifying severity risk can significantly underestimate credit risk, and that a properly defined severity risk model is critical for credit risk measurement as well as for downturn LGD estimation.
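A minimal simulation sketch of the PD-LGD dependency idea: a common systematic factor drives the conditional PD, and the conditional mean of LGD is an increasing function of that conditional PD. The Gaussian factor, the logistic link and all parameters are illustrative assumptions; the paper's calibration and importance-sampling algorithm are not reproduced.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n_obligors, n_scenarios = 1000, 20_000
pd_uncond, rho, ead = 0.02, 0.15, 1.0

z = rng.standard_normal(n_scenarios)            # common systematic factor
pd_cond = norm.cdf((norm.ppf(pd_uncond) - np.sqrt(rho) * z) / np.sqrt(1 - rho))

# Conditional mean LGD as an increasing function of the conditional PD
# (illustrative logistic link, not the paper's calibrated specification).
lgd_mean = 1.0 / (1.0 + np.exp(-(0.4 * np.log(pd_cond / (1 - pd_cond)) + 1.0)))

defaults = rng.poisson(n_obligors * pd_cond)    # mixed Poisson default counts
losses = defaults * ead * lgd_mean              # portfolio loss per scenario

for q in (0.95, 0.99, 0.999):
    print(f"loss quantile {q:.1%}: {np.quantile(losses, q):8.2f}")
```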

6.
Using loan-level foreclosure auction data, we study the loss given default (LGD) of defaulted residential mortgages originated in Korea, a low-LTV regime. We find that senior mortgages generate very low loss rates (5–10%), while losses on subordinated claims are in the 30–50% range. We document the effects of housing market cycles on loss severity by showing that collateral characteristics that are overvalued during the boom increase loss severity during the market downturn. We also investigate how a broad set of time-of-origination and post-origination information on loan, collateral and borrower characteristics, as well as the foreclosure auction process, influences the LGD of residential mortgages.

7.
This study employs a dataset of 14,322 defaulted leasing contracts from three German leasing companies to analyze different approaches to estimating the loss given default (LGD). Using the historical average LGD and a simple OLS regression as benchmarks, we compare hybrid finite mixture models (FMMs), model trees and regression trees, evaluating them by the mean absolute error, the root mean squared error, and the Theil inequality coefficient. The relative estimation accuracy of the methods depends, among other things, on the number of observations and on whether in-sample or out-of-sample estimates are considered; the latter are decisive for proper risk management and are required for regulatory purposes. FMMs aim to reproduce the distribution of realized LGDs and therefore perform best in-sample, but they perform poorly out-of-sample. Model trees, by contrast, are more robust and outperform all other methods if the sample size is sufficiently large.
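A short sketch of the three accuracy measures used in the comparison, on synthetic LGD predictions; the Theil coefficient below is the common U1 form, and the abstract does not state which variant the authors use.

```python
import numpy as np

def mae(y, y_hat):
    return np.mean(np.abs(y - y_hat))

def rmse(y, y_hat):
    return np.sqrt(np.mean((y - y_hat) ** 2))

def theil_u1(y, y_hat):
    """Theil inequality coefficient, U1 variant: 0 is a perfect fit."""
    return rmse(y, y_hat) / (np.sqrt(np.mean(y ** 2)) + np.sqrt(np.mean(y_hat ** 2)))

rng = np.random.default_rng(7)
realized = rng.beta(0.5, 0.5, 1000)                             # illustrative realized LGDs
predicted = np.clip(realized + rng.normal(0, 0.15, 1000), 0, 1) # illustrative predictions
print(f"MAE={mae(realized, predicted):.3f}  "
      f"RMSE={rmse(realized, predicted):.3f}  "
      f"Theil U1={theil_u1(realized, predicted):.3f}")
```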

8.
This paper extends the existing literature on deposit insurance by proposing a new approach for estimating the loss distribution of a Deposit Insurance Scheme (DIS) based on the Basel 2 regulatory framework. In particular, we generate the distribution of banks’ losses following the Basel 2 theoretical approach and focus on the part of this distribution that is not covered by capital (tail risk). We refine the approach by considering two major sources of systemic risk: the correlation between banks’ assets and interbank lending contagion. Applying the model to 2007 data for a sample of Italian banks shows that the target size of the Italian deposit insurance system covers up to 98.96% of its potential losses. Furthermore, it emerges that the introduction of bank contagion via the interbank lending market could lead to the collapse of the entire Italian banking system. Our analysis points out that the existing Italian deposit insurance system can be assessed as adequate only in normal times, not in bad market conditions with substantial contagion between banks. Overall, we argue that when estimating DIS loss distributions policy makers should explicitly consider, first, the regulatory framework within which banks operate, such as (Basel 2) capital requirements, and, second, potential sources of systemic risk such as the correlation between banks’ assets and the risk of interbank contagion.

9.
Journal of Financial Services Research - We propose a new procedure to predict the loss given default (LGD) distribution. Studies find empirical evidence that LGD values have a high concentration...

10.
Using equity returns for financial institutions, we estimate both catastrophic and operational risk measures over the period 1973–2003. We find evidence of cyclical components in both the catastrophic and operational risk measures obtained from the generalized Pareto distribution and the skewed generalized error distribution. Our new, comprehensive approach to measuring operational risk shows that approximately 18% of financial institutions’ returns represent compensation for operational risk, while depository institutions are exposed to operational risk levels that average 39% of the overall equity risk premium. Moreover, operational risk events are more likely to be the cause of large, unexpected catastrophic losses, although when they occur, the losses are smaller than those resulting from a combination of market risk, credit risk or other risk events.
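A minimal peaks-over-threshold sketch fitting the generalized Pareto distribution mentioned above to the tail of simulated equity losses; the data, threshold choice and quantile level are illustrative, and the skewed generalized error distribution part is omitted.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
returns = 0.01 * rng.standard_t(df=4, size=5000)        # illustrative daily equity returns
losses = -returns                                        # work with losses (positive = loss)

u = np.quantile(losses, 0.95)                            # peaks-over-threshold cutoff
excesses = losses[losses > u] - u

shape, _, scale = genpareto.fit(excesses, floc=0)        # GPD fitted to threshold excesses
p_exceed = np.mean(losses > u)

q = 0.999                                                # tail quantile implied by the GPD fit
var_q = u + genpareto.ppf(1 - (1 - q) / p_exceed, shape, loc=0, scale=scale)
print(f"GPD shape={shape:.3f}  scale={scale:.4f}  {q:.1%} loss quantile={var_q:.4f}")
```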

11.
Statistical analyses on actual data depict operational risk as an extremely heavy-tailed phenomenon, able to generate losses so extreme as to suggest the use of infinite-mean models. But no loss can actually destroy more than the entire value of a bank or of a company, and this upper bound should be considered when dealing with tail-risk assessment. Introducing what we call the dual distribution, we show how to deal with heavy-tailed phenomena with a remote yet finite upper bound. We provide methods to compute relevant tail quantities such as the Expected Shortfall, which is not available under infinite-mean models, allowing adequate provisioning and capital allocation. This also permits a measurement of fragility. The main difference between our approach and a simple truncation is in the smoothness of the transformation between the original and the dual distribution. Our methodology is useful with apparently infinite-mean phenomena, as in the case of operational risk, but it can be applied in all those situations involving extreme fat tails and bounded support.
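The abstract does not spell out the transformation, so the sketch below uses one natural smooth log-transformation that maps a bounded support [L, H) onto an unbounded one, fits a tail model on the transformed ("dual") scale, and maps the resulting quantile back below the finite upper bound H. It illustrates the idea only; the bounds, the data and the tail model are assumptions.

```python
import numpy as np
from scipy.stats import genpareto

L, H = 1.0, 1e9        # lower bound and a remote but finite upper bound (illustrative)

def to_dual(y):
    """Smooth map from the bounded support [L, H) to the unbounded support
    [L, inf); for y far below H it is nearly the identity."""
    return L - H * np.log((H - y) / (H - L))

def from_dual(z):
    """Inverse map: pull a value on the dual (unbounded) scale back below H."""
    return H - (H - L) * np.exp(-(z - L) / H)

rng = np.random.default_rng(5)
losses = L * (rng.pareto(0.8, 20_000) + 1.0)          # apparently infinite-mean losses
losses = losses[losses < H]                           # nothing can exceed the upper bound

dual = to_dual(losses)
shape, _, scale = genpareto.fit(dual - L, floc=0)     # tail model on the dual scale

q = 0.999
var_dual = L + genpareto.ppf(q, shape, loc=0, scale=scale)
print(f"{q:.1%} loss quantile mapped back to the bounded scale: {from_dual(var_dual):,.1f}")
```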

12.
A copula approach is used to examine the extreme return–volume relationship in six emerging East-Asian equity markets. The empirical results indicate that there is significant and asymmetric return–volume dependence at extremes for these markets. In particular, extremely high returns (large gains) tend to be associated with extremely large trading volumes, but extremely low returns (big losses) tend not to be related to either large or small volumes.
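A small sketch of quantifying upper-tail return-volume dependence with a one-parameter Gumbel copula calibrated from Kendall's tau; the copula family and the simulated data are illustrative, since the abstract does not specify which copulas are fitted.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(11)
n = 2000
returns = rng.standard_t(df=5, size=n)                    # illustrative daily returns
volume = np.exp(0.5 * returns + rng.normal(0, 1, n))      # illustrative trading volume

tau, _ = kendalltau(returns, volume)
theta = 1.0 / (1.0 - tau)                   # Gumbel parameter from Kendall's tau (tau > 0)
lambda_upper = 2.0 - 2.0 ** (1.0 / theta)   # upper tail dependence of the Gumbel copula
print(f"tau={tau:.3f}  theta={theta:.2f}  upper tail dependence={lambda_upper:.3f}")
```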

13.
In this paper we propose a method to enhance the performance of knowledge-based decision-support systems, whose knowledge is by nature volatile and incomplete in dynamically changing situations, by providing meta-knowledge augmented by the Qualitative Reasoning (QR) approach. The proposed system aims to overcome the potential incompleteness of the knowledge base. Using the deep meta-knowledge incorporated into the QR module, along with the knowledge gained from inductive learning, we identify the ongoing processes and amplify the effect of each pending process on the attribute values. In doing so, we apply the QR models to enhance or reveal patterns that would otherwise be less obvious; the enhanced patterns can eventually be used to improve the classification of the data samples. The success factor hinges on the completeness of the QR process knowledge base: with enough processes taking place, the influence of each process steers the prediction in a direction that better reflects the current trend. The preliminary results are encouraging and shed light on a smooth introduction of Qualitative Reasoning from physical laboratory applications to the business domain. © 2001 John Wiley & Sons, Ltd.

14.
In certain segments, IBNR calculations on paid triangles are more stable than on incurred triangles. However, calculations on payments often do not adequately take large losses into account. An IBNR method that separates large and attritional losses, and thus makes it possible to use paid amounts for the attritional losses and incurred amounts for the large losses, was introduced by Riegel (see Riegel, U. (2014). A bifurcation approach for attritional and large losses in chain ladder calculations. Astin Bulletin 44, 127–172). The method corresponds to a stochastic model based on Mack’s chain ladder model. In this paper, we analyse a quasi-additive version of this model, i.e. a version that is essentially based on the assumptions of the additive (incremental loss ratio) method. We describe the corresponding IBNR method and derive formulas for the mean squared error of prediction.
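The quasi-additive model builds on the additive (incremental loss ratio) method; a bare-bones version of that classical reserving step is sketched below on a tiny paid triangle with made-up figures. The large/attritional split and the mean-squared-error-of-prediction formulas of the paper are not reproduced.

```python
import numpy as np

# Incremental paid triangle (rows = accident years, columns = development years);
# NaN marks future cells. Premiums per accident year. All figures are made up.
premium = np.array([1000.0, 1100.0, 1200.0])
tri = np.array([[400.0, 150.0,  50.0],
                [450.0, 170.0, np.nan],
                [500.0, np.nan, np.nan]])
n = tri.shape[0]

# Additive (incremental loss ratio) method: m_k = observed increments in
# development year k divided by the premiums of the accident years observed there.
m = np.array([np.nansum(tri[:, k]) / premium[: n - k].sum() for k in range(n)])

paid_to_date = np.nansum(tri, axis=1)
ibnr = np.array([premium[i] * m[n - i:].sum() for i in range(n)])   # future development
ultimate = paid_to_date + ibnr

print("incremental loss ratios:", np.round(m, 4))
print("IBNR reserves:          ", np.round(ibnr, 1))
print("ultimate losses:        ", np.round(ultimate, 1))
```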

15.
Although there are many definitions of systemic risk, most agree that it manifests itself as an initial shock that causes the failure of one or more banks and then spreads to the entire system through a contagion mechanism that can result in the failure of further banks. Assuming that bank failures in the initial shock occur at random according to the failure probabilities of the individual banks, and that the ensuing contagion process is deterministic and depends on interbank exposures, we propose a network model for analysing systemic risk in the banking system that, in contrast to other proposed models, seeks to obtain the probability distribution of losses for the financial system resulting from the shock/contagion process. Calculating the probabilities of joint failures by simulation and assuming that the matrix of bilateral interbank exposures is known, we represent systemic risk in the financial system by means of a graph and use discrete modelling techniques to characterize the dynamics of contagion and the corresponding losses within the network. The probability distribution of losses, i.e. the risk profile of the Mexican banking system, is obtained through an efficient, complete enumeration of all possible bank default events in the system. This, in turn, allows the use of a wide variety of well-established risk measures to describe the fragility of the financial system. The model also allows stress tests on both the bank default probabilities and the interbank exposures, and is used to assess the risk of the Mexican banking system. Copyright © 2009 John Wiley & Sons, Ltd.
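A compact sketch of the shock/contagion mechanics for a toy three-bank system: enumerate every initial default event, propagate losses deterministically through a known bilateral exposure matrix, and collect the resulting loss distribution. Default probabilities, exposures and capital figures are invented, and independence of the initial defaults is a simplifying assumption made here.

```python
import itertools
import numpy as np

# Toy system: individual default probabilities, capital buffers, and the bilateral
# exposure matrix E[i, j] = amount bank i has lent to bank j. Figures are invented.
pd_bank = np.array([0.05, 0.10, 0.02])
capital = np.array([30.0, 20.0, 40.0])
exposure = np.array([[ 0.0, 25.0,  5.0],
                     [10.0,  0.0, 15.0],
                     [35.0, 10.0,  0.0]])
n = len(pd_bank)

def contagion(initial_defaults):
    """Deterministic contagion: a bank fails once its losses on exposures to
    already-failed banks exceed its capital buffer."""
    failed = set(initial_defaults)
    while True:
        loss_per_bank = exposure[:, sorted(failed)].sum(axis=1)
        newly = {i for i in range(n) if i not in failed and loss_per_bank[i] > capital[i]}
        if not newly:
            return failed, loss_per_bank
        failed |= newly

# Complete enumeration of initial default events (independent defaults assumed);
# system loss here is all interbank credit losses on claims against failed banks.
distribution = []
for event in itertools.product([0, 1], repeat=n):
    prob = np.prod([pd_bank[i] if d else 1.0 - pd_bank[i] for i, d in enumerate(event)])
    failed, loss_per_bank = contagion({i for i, d in enumerate(event) if d})
    distribution.append((loss_per_bank.sum(), prob))

for loss, prob in sorted(distribution):
    print(f"system loss = {loss:6.1f}   probability = {prob:.5f}")
```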

16.
As documented in the literature, the evidence on the effects of firm size, financial leverage, and R&D expenditures on firm earnings is inconclusive. Our hypothesis is that these inconsistent empirical results may be driven by the regression models used in the data analysis. Using the quantile regression (QR) approach developed by Koenker and Bassett (1978), this study analyses S&P 500 firms from 1996 to 2005. We find that the effects of firm size, financial leverage and R&D expenditures on firm earnings differ considerably across earnings quantiles. Comparing the results from the QR approach with those from the ordinary least squares (OLS) and least absolute deviation (LAD) methods, this study further explains the puzzling relationship between firm size, financial leverage, R&D expenditures and firm earnings.
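A short sketch of the QR-versus-OLS comparison on simulated data using statsmodels; the variable names mirror the abstract (size, leverage, R&D), but the data-generating process and coefficients are synthetic.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2024)
n = 1000
df = pd.DataFrame({
    "size": rng.normal(8, 1, n),            # e.g. log total assets (illustrative)
    "leverage": rng.uniform(0, 0.8, n),
    "rd": rng.exponential(0.05, n),         # R&D intensity (illustrative)
})
noise = rng.normal(0, 1, n)
# The R&D effect is built to be stronger in the upper part of the earnings distribution.
df["earnings"] = 0.5 * df["size"] - 1.0 * df["leverage"] + (1.0 + noise) * df["rd"] + noise

ols = smf.ols("earnings ~ size + leverage + rd", df).fit()
print("OLS rd coefficient:", round(ols.params["rd"], 2))
for q in (0.1, 0.5, 0.9):
    qr = smf.quantreg("earnings ~ size + leverage + rd", df).fit(q=q)
    print(f"quantile {q:.1f} rd coefficient:", round(qr.params["rd"], 2))
```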

17.
This paper suggests formulas able to capture potentially strong dependence among credit losses in downturns without assuming any specific distribution for the variables involved. We first show that the current model adopted by regulators (Basel) is equivalent to a conditional distribution derived from the Gaussian copula (which exhibits no tail dependence). We then use conditional distributions derived from copulas that do express tail dependence (stronger dependence across higher losses) to estimate the probability of credit losses in extreme scenarios (crises). Next, we use data on historical credit losses incurred by American banks to compare the suggested approach with the Basel formula in terms of their performance in predicting the extreme losses observed in 2009 and 2010. Our results indicate that, in general, the copula approach outperforms the Basel method in two of the three credit segments investigated. The proposed method extends to other differentiable copula families, which gives flexibility to future practical applications of the model.
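For reference, the regulatory benchmark discussed above is the Vasicek (Gaussian one-factor, i.e. Gaussian copula) conditional default probability behind the Basel formula; a minimal version is sketched below with illustrative inputs. The tail-dependent copula alternatives proposed in the paper are not reproduced.

```python
import numpy as np
from scipy.stats import norm

def basel_conditional_pd(pd, rho, q=0.999):
    """Vasicek / Basel conditional default probability: the default rate not
    exceeded with probability q under a Gaussian one-factor (Gaussian copula)
    model with asset correlation rho."""
    return norm.cdf((norm.ppf(pd) + np.sqrt(rho) * norm.ppf(q)) / np.sqrt(1.0 - rho))

# Illustrative inputs: 2% unconditional PD, 15% asset correlation, 99.9% confidence.
print(basel_conditional_pd(pd=0.02, rho=0.15))   # roughly 0.18
```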

18.
Basel II aims to aggressively improve on Basel I and is designed to capitalize on the technological advances that have permeated the financial industry since Basel I. This paper examines the correlation issues that arise and provides recommendations on implementation going forward. We provide the following results: (1) We demonstrate that fixing asset value correlations by regulators without a specification of business unit granularity and aggregation impacts franchise risk. (2) Loss distributions for credit risk are more sensitive to correlation assumptions than those for market risk; arbitrary, inaccurate correlation specifications can cause large errors in capital requirements. (3) Current regulations do not recognize that credit losses depend on four distinct correlations, not just one. (4) Recovery rates may be determined uniformly across banks. (5) Tail risk comes from LGD correlations and non-Gaussian risks. (6) The 1-year VaR horizon causes distortions, especially when regimes and pro-cyclicality are involved. (7) We recommend a quantitative measure for implementing market discipline, the third pillar of the Basel II accord. The paper therefore highlights many issues that may be addressed using the tools banks already employ for internal risk management.

19.
An accurate forecast of the loss given default (LGD) of loans plays a crucial role in banks' risk-based decision making. We theoretically analyze problems that arise when forecasting LGDs of bank loans and that lead to inconsistent estimates and low predictive power. We present several improvements to LGD estimation, accounting for length-biased sampling, for loan characteristics that differ depending on how the default ends, and for the different information sets available according to the default status. We empirically demonstrate the capability of our proposals on a data set of 69,985 defaulted bank loans. Our results are important not only for banks but also for regulators, because neglecting these issues leads to a significant underestimation of capital requirements.

20.
We analyze the market assessment of sovereign credit risk using a reduced-form model to price credit default swap (CDS) spreads, which enables us to derive values for the probability of default (PD) and loss given default (LGD) from quotes on sovereign CDS contracts. We compare different specifications of the model, allowing for both fixed and time-varying LGD, and use these values to analyze the sovereign credit risk of Polish debt throughout the global financial crisis. Our results suggest a low LGD and a relatively high PD during the recent financial crisis.
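The abstract links CDS quotes to PD and LGD through a reduced-form model; the back-of-the-envelope "credit triangle" below (spread roughly equal to hazard rate times LGD, flat hazard, continuous premium) is only a simplified stand-in for the paper's specifications, with an assumed LGD.

```python
import math

def implied_hazard_and_pd(spread_bps, lgd, horizon_years=5.0):
    """Credit triangle: spread ~= hazard * LGD, so hazard ~= spread / LGD;
    the cumulative PD over the horizon then follows from a flat hazard rate."""
    hazard = (spread_bps / 10_000.0) / lgd
    pd_cum = 1.0 - math.exp(-hazard * horizon_years)
    return hazard, pd_cum

# Illustrative sovereign CDS quote: 250 bp five-year spread and an assumed LGD of 60%.
hazard, pd_5y = implied_hazard_and_pd(spread_bps=250, lgd=0.60)
print(f"implied hazard: {hazard:.2%} per year, five-year cumulative PD: {pd_5y:.1%}")
```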
