Similar Documents
20 similar documents found.
1.
We analyze the market assessment of sovereign credit risk using a reduced-form model to price credit default swap (CDS) spreads, which enables us to derive values for the probability of default (PD) and loss given default (LGD) from quotes of sovereign CDS contracts. We compare different specifications of the model, allowing for both fixed and time-varying LGD, and use these values to analyze the sovereign credit risk of Polish debt throughout the global financial crisis. Our results suggest a low LGD and a relatively high PD during the recent financial crisis.
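The core idea of backing PD out of CDS quotes can be sketched with the standard "credit triangle" approximation (spread ≈ intensity × LGD), a far cruder device than the paper's full reduced-form model; all numbers below are illustrative.

```python
import math

# Back-of-the-envelope sketch: with a flat default intensity lam and an
# assumed LGD, the CDS par spread is approximately spread = lam * LGD,
# so the cumulative PD over horizon T is 1 - exp(-lam * T).

def implied_hazard(spread_bps, lgd):
    """Back out a flat default intensity from a CDS spread quoted in bps."""
    return (spread_bps / 10_000.0) / lgd

def implied_pd(spread_bps, lgd, horizon_years):
    """Cumulative risk-neutral default probability over the horizon."""
    lam = implied_hazard(spread_bps, lgd)
    return 1.0 - math.exp(-lam * horizon_years)

# Example: a 300 bp sovereign CDS quote with an assumed 40% LGD.
pd_5y = implied_pd(300, 0.40, 5.0)  # roughly 0.31
```

The paper instead estimates PD and LGD jointly from the term structure of quotes; this one-line approximation only shows why a single spread cannot separate the two without an assumption on one of them.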

2.
In this article, a generic severity risk framework in which loss given default (LGD) depends on the probability of default (PD) in an intuitive manner is developed. By modeling the conditional mean of LGD as a function of PD, which also varies with systemic risk factors, this model allows an arbitrary functional relationship between PD and LGD. Based on this framework, several specifications of stochastic LGD are proposed with detailed calibration methods. By combining these models with an extension of CreditRisk+, a versatile mixed Poisson credit risk model is developed that is capable of handling both risk factor correlation and PD–LGD dependency. An efficient simulation algorithm based on importance sampling is also introduced for risk calculation. Empirical studies suggest that ignoring or incorrectly specifying severity risk can significantly underestimate credit risk, and that a properly defined severity risk model is critical for credit risk measurement as well as downturn LGD estimation.
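The PD–LGD dependency mechanism can be illustrated with a toy one-factor mixed Poisson simulation, in no way the paper's calibration: a single gamma-distributed systemic factor drives both the default count and the conditional mean LGD, so loss severity worsens exactly when defaults cluster. All parameters are invented.

```python
import math
import random

def _poisson(rng, lam):
    # Knuth's method; adequate for the small intensities used here
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_losses(n_sims, n_obligors, pd0, lgd0, beta, seed=7):
    """Portfolio losses with unit exposure per obligor; beta controls
    how strongly the mean LGD responds to the systemic factor."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_sims):
        s = rng.gammavariate(2.0, 0.5)            # systemic factor, mean 1
        defaults = _poisson(rng, n_obligors * pd0 * s)
        lgd = min(1.0, max(0.0, lgd0 * (1.0 + beta * (s - 1.0))))
        losses.append(defaults * lgd)
    return losses

losses = simulate_losses(5_000, 100, 0.02, 0.45, 0.8)
```

With beta > 0 the tail of the loss distribution fattens relative to an independent-LGD run, which is the qualitative point of the paper's severity risk framework; the paper additionally uses importance sampling rather than plain Monte Carlo.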

3.
We present a screening model of the risk sensitivity of bank capital regulation. A banker funds a project with uninsured deposits and costly capital. Capital resolves a moral hazard problem in the choice of the probability of default (PD). The project’s loss given default (LGD) is the banker’s private information. The regulator receives a noisy signal about the LGD and imposes a minimum capital requirement. We show that the optimal sensitivity of capital regulation is non-monotonic in the accuracy of risk assessment. If the signal is inaccurate, the regulator should use risk-insensitive capital requirements. Given sufficient accuracy, the regulator should separate types via risk-sensitive capital requirements, reducing the risk-sensitivity of bank capital as accuracy improves.

4.
Through an empirical study, this paper proposes and validates a macro stress-testing method that can be used in banking supervision and in guarding against systemic risk. First, an ordered multinomial logistic model is used to estimate raw industry default probabilities; next, an MFD default probability model introduces macroeconomic shock factors to obtain default probabilities that incorporate macroeconomic conditions; then, the CreditRisk+ model is used to measure the changes in economic capital corresponding to credit risk under different macro stress scenarios; …

5.
We apply multiple machine learning (ML) methods to model loss given default (LGD) for corporate debt using a common dataset that is cross-sectional but collected over different time periods and shows much variation over time. We investigate the efficacy of three cross-validation (CV) schemes for hyper-parameter tuning, and of bootstrap aggregation (bagging), in preventing out-of-time model performance deterioration. The three CV methods are shuffled K-fold, unshuffled K-fold and sequential blocked, which completely destroy, partially preserve and fully retain the chronological order in the data, respectively. We find that it is important to keep the chronological order in the data when creating the training and testing samples: the more chronological order is retained, the more stable the out-of-time ML LGD model performance. By contrast, although bagging improves out-of-time fit in some cases, its effectiveness is rather marginal relative to that of the unshuffled K-fold and sequential blocked CV methods. Substantial uncertainty in relative out-of-time performance remains, however, so ongoing model performance monitoring and benchmarking are still essential for sound model risk management of corporate LGD and other ML models.
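The three splitting schemes can be sketched at the index level, assuming the n observations are already sorted chronologically. Each function returns (train_indices, test_indices) pairs; the function names are ours, not the paper's.

```python
import random

def _contiguous_blocks(n, k):
    """Split indices 0..n-1 into k contiguous, chronological blocks."""
    size = n // k
    return [list(range(i * size, (i + 1) * size if i < k - 1 else n))
            for i in range(k)]

def shuffled_kfold(n, k, seed=0):
    idx = list(range(n))
    random.Random(seed).shuffle(idx)              # destroys time order
    blocks = [idx[i::k] for i in range(k)]
    return [(sorted(sum(blocks[:i] + blocks[i + 1:], [])), blocks[i])
            for i in range(k)]

def unshuffled_kfold(n, k):
    blocks = _contiguous_blocks(n, k)             # order kept within folds,
    return [(sum(blocks[:i] + blocks[i + 1:], []), blocks[i])
            for i in range(k)]                    # but test may precede train

def sequential_blocked(n, k):
    blocks = _contiguous_blocks(n, k)             # test always follows train
    return [(sum(blocks[:i], []), blocks[i]) for i in range(1, k)]
```

Only `sequential_blocked` guarantees that every test observation postdates all training observations, which is the property the paper links to stable out-of-time performance.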

6.
An accurate forecast of the parameter loss given default (LGD) of loans plays a crucial role for risk-based decision making by banks. We theoretically analyze problems arising when forecasting LGDs of bank loans that lead to inconsistent estimates and a low predictive power. We present several improvements for LGD estimates, considering length-biased sampling, different loan characteristics depending on the type of default end, and different information sets according to the default status. We empirically demonstrate the capability of our proposals based on a data set of 69,985 defaulted bank loans. Our results are not only important for banks, but also for regulators, because neglecting these issues leads to a significant underestimation of capital requirements.

7.
We propose a new procedure to estimate the loss given default (LGD) distribution. Owing to the complicated shape of the LGD distribution, using a smooth density function as a driver to estimate it may result in a decline in model fit. To overcome this problem, we first apply the logistic regression to estimate the LGD cumulative distribution function. Then, we convert the result into the LGD distribution estimate. To implement the newly proposed estimation procedure, we collect a sample of 5269 defaulted debts from Moody’s Default and Recovery Database. A performance study is performed using 2000 pairs of in-sample and out-of-sample data-sets with different sizes that are randomly selected from the entire sample. Our results show that the newly proposed procedure has better and more robust performance than its alternatives, in the sense of yielding more accurate in-sample and out-of-sample LGD distribution estimates. Thus, it is useful for studying the LGD distribution.
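The conversion step of this procedure can be sketched on its own: a fitted conditional CDF P(LGD ≤ t | x), evaluated on a grid of thresholds, is differenced into a discrete LGD distribution. The logistic form below is a hypothetical stand-in for the paper's fitted per-threshold logistic regressions.

```python
import math

def fitted_cdf(threshold, x, a=4.0, b=-1.5):
    # hypothetical fitted model for P(LGD <= threshold | covariate x)
    return 1.0 / (1.0 + math.exp(-(a * threshold + b * x)))

def lgd_distribution(x, grid):
    """Difference the fitted CDF on the threshold grid into point masses."""
    cdf = [fitted_cdf(t, x) for t in grid]
    cdf = [max(cdf[:i + 1]) for i in range(len(cdf))]  # enforce monotone CDF
    probs = [cdf[0]]
    probs += [cdf[i] - cdf[i - 1] for i in range(1, len(cdf))]
    probs.append(1.0 - cdf[-1])                        # mass above last threshold
    return probs

grid = [i / 10 for i in range(11)]      # LGD thresholds 0.0, 0.1, ..., 1.0
probs = lgd_distribution(0.5, grid)
```

Because each threshold gets its own regression, the implied distribution can capture the bimodal, boundary-heavy shapes typical of empirical LGD without assuming a smooth density.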

8.
Journal of Financial Services Research - We propose a new procedure to predict the loss given default (LGD) distribution. Studies find empirical evidence that LGD values have a high concentration...

9.
We use an intensity-based framework to study the relation between macroeconomic fundamentals and cycles in defaults and rating activity. Using Standard and Poor's U.S. corporate rating transition and default data over the period 1980–2005, we directly estimate the default and rating cycle from micro data. We relate this cycle to the business cycle, bank lending conditions, and financial market variables. In line with earlier studies, the macro variables appear to explain part of the default cycle. However, we strongly reject the correct dynamic specification of these models. The problem is solved by adding an unobserved dynamic component to the model, which can be interpreted as an omitted systematic credit risk factor. By accounting for this latent factor, many of the observed macro variables lose their significance. There are a few exceptions, but the economic impact of the observed macro variables for credit risk remains low. We also show that systematic credit risk factors differ over transition types, with risk factors for downgrades being noticeably different from those for upgrades. We conclude that portfolio credit risk models based only on observable systematic risk factors omit one of the strongest determinants of credit risk at the portfolio level. This has obvious consequences for current modeling and risk management practices.

10.
Using loan-level foreclosure auction data we study the loss given default (LGD) of defaulted residential mortgages originated in Korea, a low-LTV regime. We find that senior mortgages generate very low loss rates (5–10%) while losses of subordinated claims are in the 30–50% range. We document the effects of housing market cycles on loss severity by showing that collateral characteristics that are overvalued during the boom increase loss severity during the market downturn. We also investigate how a broad set of time-of-origination and post-origination information on loan, collateral and borrower characteristics and the foreclosure auction process influences the LGD of residential mortgages.

11.
12.
The New Basel Accord allows internationally active banking organizations to calculate their credit risk capital requirements using an internal ratings based approach, subject to supervisory review. One of the modeling components is loss given default (LGD): it represents the credit loss for a bank when extreme events occur that affect the obligor's ability to repay its debts to the bank. Among researchers and practitioners, the use of statistical models such as linear regression, Tobit or decision trees is quite common in order to compute LGDs as forecasts of historical losses. However, these statistical techniques do not seem to provide robust estimates and show low performance. These results could be driven by factors that create differences in LGD, such as the presence and quality of collateral, the timing of the business cycle, workout process management and M&A activity among banks. This paper evaluates an alternative method of modeling LGD based on advanced credibility theory as typically used in actuarial modeling. This technique adds a statistical component to the credit and workout experts' opinions embedded in the collateral and workout management process and improves the predictive power of the forecasts. The model has been applied to an Italian bank's retail portfolio of overdrafts; the application of credibility theory yields higher predictive power for the LGD estimates, and out-of-time backtesting has shown stable accuracy of the estimates relative to the traditional LGD model.

13.
This paper explores the traditional and prevalent approach to credit risk assessment – the rating system. We first describe the rating systems of the two main credit rating agencies, Standard & Poor's and Moody's. Then we show how an internal rating system in a bank can be organized in order to rate creditors systematically. We suggest adopting a two-tier rating system: first, an obligor rating that can be easily mapped to a default probability bucket; second, a facility rating that determines the loss parameters in case of default, such as (i) “loss given default” (LGD), which depends on the seniority of the facility and the quality of the guarantees, and (ii) “usage given default” (UGD) for loan commitments, which depends on the nature of the commitment and the rating history of the borrower.
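The two-tier scheme separates cleanly in code: the obligor rating carries the PD, while the facility rating carries LGD (and UGD for commitments). All buckets and numbers below are invented for illustration, not taken from the paper.

```python
# Hypothetical rating tables: obligor rating -> PD bucket,
# facility rating -> loss parameters (LGD, and UGD for undrawn lines).
PD_BUCKETS = {"A": 0.0005, "B": 0.005, "C": 0.03, "D": 0.15}
FACILITY_PARAMS = {
    "senior_secured":   {"lgd": 0.25, "ugd": 0.50},
    "senior_unsecured": {"lgd": 0.45, "ugd": 0.65},
    "subordinated":     {"lgd": 0.70, "ugd": 0.75},
}

def expected_loss(obligor_rating, facility_type, drawn, undrawn):
    """EL = PD * LGD * EAD, with EAD = drawn + UGD * undrawn."""
    pd_ = PD_BUCKETS[obligor_rating]
    params = FACILITY_PARAMS[facility_type]
    exposure_at_default = drawn + params["ugd"] * undrawn
    return pd_ * params["lgd"] * exposure_at_default
```

The point of the separation is that the same obligor can hold facilities with very different loss parameters, so a single borrower-level rating cannot price them all.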

14.
We propose a novel credit default model that takes into account the impact of macroeconomic factors and intergroup contagion on the defaults of obligors. We use a set-valued Markov chain to model the default process, which includes all defaulted obligors in the group. We obtain analytic characterizations for the default process and derive explicit pricing formulas for synthetic collateralized debt obligations (CDOs). Furthermore, we use market data to calibrate the model and conduct numerical studies on the tranche spreads of CDOs. We find evidence that systematic default risk coupled with default contagion could be the leading component of total default risk.

15.
Small and medium-sized enterprises (SMEs) encounter financial constraints when they try to obtain credit from banks. These constraints are particularly severe for innovative SMEs. Thus, developing models for innovative SMEs that provide reliable estimates of their probabilities of default (PD) is important because the PDs can also serve as ratings. We examine the role of innovative assets such as patents in credit risk modelling due to their signaling value. Specifically, we add to a logit model two innovation-related variables in order to account for both the dimension and the value of the patent portfolio. Based on a unique data set of innovative SMEs with default years 2005–2008, we show that, although the value of the patent portfolio always reduces the PD, its dimension reduces the firm’s riskiness only if coupled with an appropriate equity level.

16.
Recent studies find a positive correlation between default rates and loss given default rates of credit portfolios. In response, financial regulators require financial institutions to base their capital on 'Downturn' loss rates given default, also known as Downturn LGDs. This article proposes a concept for the Downturn LGD which incorporates econometric properties of credit risk as well as the information content of default and loss given default models. The concept is compared to an alternative proposal by the Department of the Treasury, the Federal Reserve System and the Federal Deposit Insurance Corporation. An empirical analysis is provided for US corporate bond portfolios of different credit quality, seniority and security.

17.
Measuring the probability of default (PD) is the foundation of a commercial bank's internal rating system and has a fundamental impact on the effectiveness of the entire system. The main shortcoming of existing PD measurement methods is that they ignore time effects. The binary-response panel data model with a random intercept proposed in this paper deepens and refines existing methods. First, it successfully integrates a binary-response model into panel data analysis; second, it explicitly accounts for the time effects arising from differences in observation time by adding a random intercept to the model. Empirical results show that this method has better explanatory and predictive power, making it an ideal model for banks' internal rating work, and it therefore carries strong theoretical and practical value.

18.

Commercial real estate (CRE) loan losses are a recurring contributor to bank failures and financial instability, yet they are not well understood. We examine a unique and proprietary data set of CRE loan defaults at banks that failed and were resolved by the FDIC after the 2008 financial crisis. We build upon an existing literature relating stochastic collateral values to loss given default (LGD). Consistent with model predictions, we show that CRE loans defaulting sooner after origination are more sensitive to declining economic conditions and exhibit LGDs that are more severe. These results are robust to a number of factors, including the declining balance of the loan over time. Our findings point to an inherent fragility associated with high CRE loan growth, even without necessarily a deterioration in lending standards, due to the changing composition of CRE loan seasoning in the industry. This reflects an unexplored risk in the literature concerning rapid and cyclical expansions in CRE credit.


19.
A forward default prediction method based on the discrete-time competing risk hazard model (DCRHM) is proposed. The proposed model is developed from the discrete-time hazard model (DHM) by replacing the binary response data in DHM with multinomial response data, thus allowing firms exiting public markets for different causes to have different effects on forward default prediction. We show that DCRHM is a reliable and efficient model for forward default prediction through maximum likelihood analysis. We use actual panel data-sets to illustrate the proposed methodology. Using an expanding rolling window approach, our empirical results statistically confirm that DCRHM has better and more robust out-of-sample performance than DHM, in the sense of yielding a more accurate predicted number of forward defaults. Thus, DCRHM is a useful alternative for studying forward default losses on portfolios.
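The competing-risk step can be sketched under the (hypothetical here) assumption of a multinomial logit over exit causes: in each period a firm survives, defaults, or exits for another cause, with probabilities given by a softmax over cause-specific linear predictors, survival scored at zero. The function names and scores are ours.

```python
import math

def exit_probabilities(scores):
    """scores: cause -> linear predictor; survival has implicit score 0."""
    exp_s = {cause: math.exp(s) for cause, s in scores.items()}
    denom = 1.0 + sum(exp_s.values())
    probs = {cause: v / denom for cause, v in exp_s.items()}
    probs["survive"] = 1.0 / denom
    return probs

def forward_default_prob(scores_by_period):
    """P(survive every period except the last, then default)."""
    p = 1.0
    for scores in scores_by_period[:-1]:
        p *= exit_probabilities(scores)["survive"]
    return p * exit_probabilities(scores_by_period[-1])["default"]
```

Distinguishing "default" from "other_exit" is what separates this from the binary DHM: a firm delisted by merger no longer inflates the estimated default hazard.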

20.
A fair level of provisions on bad and doubtful loans is an essential input in mark-to-market accounting and in the calculation of bank profitability, capital and solvency. Loan-loss provisioning is directly related to estimates of loss given default (LGD). A literature on the LGD of bank loans is developing but, surprisingly, it has not been exploited to address, at the micro level, the issue of provisioning at the time of default and after the default date. For example, in Portugal the central bank imposes a mandatory provisioning schedule based on the time elapsed since a loan is declared 'non-performing'. This dynamic schedule is 'ad hoc', not based on empirical studies. The purpose of this paper is to present an empirical methodology for calculating a fair level of loan-loss provisions, at the time of default and after the default date. To illustrate, a dynamic provisioning schedule is estimated with micro-data provided by a Portuguese bank on recoveries on non-performing loans. This schedule is then compared to the regulatory provisioning schedule imposed by the central bank.
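The shape of such an empirical schedule can be sketched as follows: if workout data give the expected ultimate recovery conditional on a loan still being non-performing m months after default, the fair provision at month m is one minus that expectation. The recovery figures below are invented, not the paper's estimates.

```python
# Hypothetical workout-data estimates: expected ultimate recovery rate,
# conditional on the loan still being non-performing after m months.
EXPECTED_ULTIMATE_RECOVERY = {0: 0.60, 6: 0.50, 12: 0.40, 24: 0.25, 36: 0.15}

def fair_provision(months_since_default):
    """Provision rate = 1 - conditional expected ultimate recovery,
    read from the most recent schedule point at or before month m."""
    eligible = [m for m in EXPECTED_ULTIMATE_RECOVERY
                if m <= months_since_default]
    return 1.0 - EXPECTED_ULTIMATE_RECOVERY[max(eligible)]
```

Because loans that resolve quickly tend to recover more, the conditional expected recovery falls with time in workout, and the fair provision rises, which is the empirical pattern the paper contrasts with the regulator's ad hoc schedule.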


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号