Similar Documents
20 similar documents found (search time: 99 ms)
1.
2.
The choice of default point has a direct impact on the predictive accuracy of the KMV model. The assumptions of the option pricing model imply that a firm defaults only when its asset value falls to some point between total liabilities and short-term liabilities. From empirical analysis of a large number of default events, KMV Corporation found that the critical point at which defaults occur most frequently is where firm value roughly equals current liabilities plus 50% of long-term liabilities. Because historical default data in China are severely scarce, however, the default point of Chinese listed companies cannot yet be identified through statistical analysis. As a second-best approach, this paper argues that the default point of Chinese listed companies can be determined by comparing, across candidate default points, how significantly the distance to default differs between defaulting and non-defaulting companies. The empirical study concludes that the default point for Chinese listed companies is current liabilities plus 10% of long-term liabilities.
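A minimal sketch of the distance-to-default calculation under the two default-point weights mentioned above (0.5 from KMV's convention, 0.1 from this paper's finding); the firm's asset value, asset volatility, and liability figures are hypothetical.

```python
import numpy as np

def distance_to_default(asset_value, asset_vol, current_liab, long_term_liab, alpha):
    """DD with default point = current liabilities + alpha * long-term liabilities."""
    default_point = current_liab + alpha * long_term_liab
    return (asset_value - default_point) / (asset_value * asset_vol)

# Hypothetical firm (figures are illustrative, in 100M CNY)
V, sigma = 120.0, 0.25          # asset value and asset volatility
cl, ltl = 60.0, 30.0            # current and long-term liabilities

for alpha in (0.5, 0.1):        # KMV's weight vs. this paper's estimate for Chinese firms
    dd = distance_to_default(V, sigma, cl, ltl, alpha)
    print(f"alpha = {alpha}: DD = {dd:.3f}")
```

Lowering alpha lowers the default point, so the same firm shows a larger distance to default, which is why the choice of weight matters for screening accuracy.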

3.
According to KMV model theory, a firm's probability of default is determined by the distance between firm value and the default point, with the empirical formula for the default point given by KMV Corporation. Taking Chinese listed companies as the research object, this paper mines financial statement information to analyze each company's debt structure, re-partitions the debt into three classes, and fits the default point using the three corresponding asset-liability ratios as explanatory variables. This removes differences due to firm size and yields a general formula for computing the default point that is applicable to Chinese listed companies.
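The fitting step can be read as a least-squares regression of the size-normalized default point on the three debt-class ratios; the abstract does not give the exact specification, so the simulated data and coefficients below are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Simulated ratios of the three debt classes to total assets (hypothetical data)
ratios = rng.uniform(0.05, 0.40, size=(n, 3))
true_weights = np.array([1.0, 0.6, 0.1])                   # illustrative per-class weights
dp_ratio = ratios @ true_weights + rng.normal(0, 0.01, n)  # default point / total assets

X = np.column_stack([np.ones(n), ratios])                  # prepend an intercept
coef, _, _, _ = np.linalg.lstsq(X, dp_ratio, rcond=None)
print("intercept and debt-class weights:", np.round(coef, 3))
```

Working in ratios rather than yuan amounts is what removes the firm-size effect the abstract mentions.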

4.
An overlapping generations (OLG) model is used to examine the relationship between the transfer of state-owned capital to the social security fund, pension insurance reform, and demographic change. Assuming the fund holds the transferred shares without liquidating them and draws dividends from the net income of state-owned capital, the results show that under the medium- and high-fertility scenarios, if individual accounts are fully funded, a transfer ratio of 55%-65% achieves maximum social welfare and the target replacement rate while accommodating the turning point of young-age consumption; the current policy transfer ratio of 10% falls well short of this. To safeguard the social welfare and replacement-rate targets, the state-capital transfer ratio should be raised, the system parameters optimized, and complementary measures taken to improve the system's operating environment.

5.
6.
GARCH models are the most widely used approach to studying the volatility of financial assets, and the CARR model recently proposed by Chou has also shown advantages in volatility estimation. Taking Au99.95, the main traded product in China's gold spot market, as the research object, this paper fits the market's volatility with GARCH, GARCH-SKST, CARR, and CARRX models, and compares their forecasting ability using Mincer-Zarnowitz regressions and the Diebold-Mariano test. The conclusion is that the CARR model outperforms the other models in volatility research.
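A sketch of the GARCH vs. GARCH-SKST side of the comparison using the arch and statsmodels packages on simulated fat-tailed returns (the Au99.95 series is not reproduced here); the CARR and CARRX fits, which model the daily price range, would require custom likelihood code and are omitted. In the Mincer-Zarnowitz regression, an unbiased variance forecast should yield an intercept near 0 and a slope near 1.

```python
import numpy as np
import statsmodels.api as sm
from arch import arch_model

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=1500) * 0.8   # fat-tailed stand-in for gold returns

res_norm = arch_model(returns, vol="Garch", p=1, q=1, dist="normal").fit(disp="off")
res_skst = arch_model(returns, vol="Garch", p=1, q=1, dist="skewt").fit(disp="off")

# Mincer-Zarnowitz regression: squared returns (a realized proxy) on fitted variance
for name, res in [("GARCH", res_norm), ("GARCH-SKST", res_skst)]:
    mz = sm.OLS(returns ** 2, sm.add_constant(res.conditional_volatility ** 2)).fit()
    print(name, "M-Z intercept/slope:", np.round(mz.params, 3), "R2:", round(mz.rsquared, 3))
```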

7.
袁周波 《时代金融》2012,(30):317+344
Using 5-minute high-frequency data on the CSI 300 index, this paper constructs a daily return series and a daily realized range volatility (RRV) series, builds an R-GARCH model and a HAR model, and measures the two models' volatility forecasting ability using Mincer-Zarnowitz (M-Z) regressions and loss functions as evaluation criteria. The results show that, whether judged by the M-Z regression results or by the loss-function values, the R-GARCH model outperforms the HAR model.
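The HAR side of the comparison can be sketched as ordinary least squares on daily, weekly, and monthly lagged averages (Corsi's standard specification); the realized-range series below is simulated, not the CSI 300 data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
rrv = np.abs(rng.normal(1.0, 0.3, 1000))    # stand-in daily realized range volatility

rows, y = [], []
for t in range(22, len(rrv)):
    rows.append([rrv[t - 1],                # daily lag
                 rrv[t - 5:t].mean(),       # weekly average
                 rrv[t - 22:t].mean()])     # monthly average
    y.append(rrv[t])

har = sm.OLS(np.array(y), sm.add_constant(np.array(rows))).fit()
print(har.params)   # intercept, daily, weekly, monthly coefficients
```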

8.
The KMV model occupies an important place in commercial banks' measurement of listed companies' credit quality, but because it was developed for the US market, it suits developed Western capital markets better, and its introduction in China calls for adjustment to Chinese conditions. This paper argues that the industry a listed company belongs to affects the choice of its default point. Testing samples from the basic materials, industrials, and consumer sectors of the ICB industry classification, it finds that the KMV model discriminates listed companies' credit quality best when the default point is set to current liabilities plus 40%, 70%, and 10% of long-term liabilities, respectively. It therefore concludes that commercial banks should measure listed companies' credit quality separately by industry.

9.
In credit risk management, because Chinese commercial banks' loan data satisfy only the requirements of the Logistic model, predicting the default probability of an individual credit asset must center on Logistic modeling. Using loan data from a Chinese commercial bank over 1999-2005 as the sample, the empirical analysis finds that four groups of factors, covering the firm itself, the macroeconomy, the region, and the industry, are significantly related to firms' default probabilities. Judging from the results of Logistic models built on these four factor groups to predict the default probabilities of individual credit assets in the electric power, highway, and urban construction industries, the four-factor model offers reference value for Chinese commercial banks' credit risk management.
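A hedged sketch of a four-factor Logistic default model; the factor proxies and the simulated sample are hypothetical stand-ins for the bank's 1999-2005 loan records.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
X = np.column_stack([
    rng.normal(0, 1, n),   # firm-level factor, e.g. standardized leverage
    rng.normal(0, 1, n),   # macroeconomic factor, e.g. GDP growth
    rng.normal(0, 1, n),   # regional factor
    rng.normal(0, 1, n),   # industry factor
])
logit_p = -2.0 + X @ np.array([0.8, -0.5, 0.3, 0.4])
default = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = sm.Logit(default, sm.add_constant(X)).fit(disp=0)
pd_hat = model.predict(sm.add_constant(X))   # estimated default probabilities
print(model.summary())
```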

10.
China Construction Bank's credit risk model system comprises dozens of models, including rating models for non-retail customers, scoring and pooling models for retail customers, and economic capital measurement models. With limited human resources, how can all these models be monitored, analyzed, validated, and redeveloped in a timely manner? The traditional model development and management approach faces a challenge.

11.
A Study of the Sensitivity of Default Risk Models to the Definition of Default
Based on the four-category and five-category loan credit risk classifications, this paper defines three notions of default and uses commercial banks' accounting data on borrower firms to analyze the sensitivity of the Logistic default risk model to the definition of default. The empirical study shows that under the three default definitions the default models have similar structures, but the selected variables and their significance levels differ, so the default model is sensitive to the definition of default.

12.
Measuring the probability of default (PD) is the foundation of a commercial bank's internal rating system and has a fundamental effect on the system's overall performance. The biggest flaw of current PD measurement methods is that they ignore time effects. The binary-response panel data model with a random intercept proposed in this paper deepens and refines the existing methods: first, it successfully embeds a binary-response model within panel data analysis; second, it explicitly accounts for the time effects that arise because observations are made at different times, adding a random intercept to the model accordingly. The empirical results show that this method has better explanatory power and forecasting performance, making it an ideal model for banks' internal rating work, and it therefore has strong theoretical and practical value.
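One possible way to fit such a random-intercept binary-response panel model is statsmodels' variational-Bayes mixed GLM, sketched below on simulated firm-year data; the paper's own estimator may differ, and all variable names are hypothetical.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(4)
n_firms, n_years = 300, 6
year = np.repeat(np.arange(n_years), n_firms)
x = rng.normal(0, 1, n_firms * n_years)           # a firm-level covariate
time_effect = rng.normal(0, 0.7, n_years)[year]   # the time effect the model targets
p = 1 / (1 + np.exp(-(-1.5 + 0.9 * x + time_effect)))
df = pd.DataFrame({"default": rng.binomial(1, p), "x": x, "year": year})

# Random intercept by observation year captures the time effect
model = BinomialBayesMixedGLM.from_formula("default ~ x", {"year": "0 + C(year)"}, df)
result = model.fit_vb()
print(result.summary())
```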

13.
The discounted dividends model advanced by Dempsey (1996) is extended to provide a weighted average cost of capital (WACC) assessment of investment opportunities with irregular cash flows. Thereafter, the framework is extended to an assessment of the implications of government tax policy for the firm's investment behaviour. The developed framework is consistent with the empirical evidence of Poterba and Summers (1985), which, over the period of UK tax history 1950–1983 encompassing four major reforms of taxes on equity, observes how the related dividend and investment policies of UK firms appear to be influenced by the level of dividend taxes.

14.
The Journal of Real Estate Finance and Economics - We use neighborhood boundaries, in addition to a concentric rings approach, to examine single family foreclosure spillover effects on residential...

15.
The stock price reaction to straight debt announcements is examined by differentiating firms on the basis of any subsequent change in their overall default risk. Results indicate that firms that undergo debt rating downgrades within six months of a straight debt announcement experience significant negative abnormal stock returns at the time of the announcement, while firms whose bond ratings are later upgraded exhibit significant positive abnormal returns. Multiple regression analysis shows these results to be robust to the influence of filing size, tax shield effects, relative pre-announcement long-term debt levels, and subordination effects.
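The abnormal-return methodology behind these results can be sketched with a standard market-model event study; the returns are simulated, and the 250-day estimation window and 11-day event window are conventional choices rather than the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 261                                      # 250-day estimation + 11-day event window
market = rng.normal(0.0004, 0.010, n)
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.012, n)

est, event = slice(0, 250), slice(250, 261)  # event window: announcement day +/- 5
beta, alpha = np.polyfit(market[est], stock[est], 1)   # market model, estimation window
abnormal = stock[event] - (alpha + beta * market[event])
car = abnormal.sum()
t_stat = car / (abnormal.std(ddof=1) * np.sqrt(abnormal.size))
print(f"CAR = {car:.4f}, t = {t_stat:.2f}")
```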

16.
17.
The authors develop a new way to measure the cost of capital, called the empirical average cost of capital (or “EACC”), which is consistent with existing methods of calculating the weighted average cost of capital, but uses information from the firm's financial statements and requires fewer and less subjective inputs. The authors’ model relies on the concept of economic profit while using data from the period 1990–2012 on net operating profits and total capital to estimate the EACC at both the individual company and industry-wide levels. Estimates of the EACC and rolling quarterly forecasts of future net operating profits for a single company, McDonald's, for its related industry, and for 57 other U.S. industries are compared to five conventional “textbook” estimates of the weighted average cost of capital published by Ibbotson Associates. The authors find that the EACC yields forecasts of future net operating profit after taxes that compare favorably to those of the five published measures of the weighted average cost of capital, as well as the average and median of these measures.
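One reading of the EACC estimation, offered here as an assumption rather than the authors' exact procedure, is that the cost of capital is recovered as the slope of a regression of net operating profit after taxes (NOPAT) on total capital, in line with the economic-profit identity NOPAT = EACC × capital + economic profit; the quarterly figures below are simulated, not McDonald's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
quarters = 92                                              # roughly 1990-2012, quarterly
capital = 5000 + np.cumsum(rng.normal(50, 10, quarters))   # simulated total capital
nopat = 0.085 * capital + rng.normal(0, 40, quarters)      # simulated NOPAT, true EACC = 8.5%

fit = sm.OLS(nopat, sm.add_constant(capital)).fit()
print(f"estimated EACC = {fit.params[1]:.4f}")             # slope ~ empirical cost of capital
```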

18.
We verify the existence of a relation between loss given default rate (LGDR) and macroeconomic conditions by examining 11,649 bank loans concerning the Italian market. Using both the univariate and multivariate analyses, we pinpoint diverse macroeconomic explanatory variables for LGDR on loans to households and SMEs. For households, LGDR is more sensitive to the default-to-loan ratio, the unemployment rate, and household consumption. For SMEs, LGDR is influenced by the total number of employed people and the GDP growth rate. These findings corroborate the Basel Committee’s provision that LGDR quantification process must identify distinct downturn conditions for each supervisory asset class.
Francesca Querci (Corresponding author)
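The multivariate analysis can be sketched as an OLS regression of LGDR on the macro covariates named above; the simulated household-loan sample stands in for the 11,649 Italian bank loans, and the coefficients are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 1000
X = np.column_stack([
    rng.normal(0.02, 0.01, n),   # default-to-loan ratio
    rng.normal(0.08, 0.02, n),   # unemployment rate
    rng.normal(0.01, 0.02, n),   # household consumption growth
])
lgdr = np.clip(0.4 + 3.0 * X[:, 0] + 1.5 * X[:, 1] - 1.0 * X[:, 2]
               + rng.normal(0, 0.05, n), 0, 1)

fit = sm.OLS(lgdr, sm.add_constant(X)).fit()
print(fit.summary())
```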

19.
This article deals with the optimal design of insurance contracts when the insurer faces administrative costs. While the literature provides many analyses of risk sharing with such costs, these costs are often assumed to be linear. Furthermore, mathematical tools and initial conditions differ from one paper to another. We propose here a unified framework in which the problem is presented and solved as an infinite dimensional optimization program on a functional vector space equipped with an original norm. This general approach leads to the optimality of contracts lying on the frontier of the indemnity functions set. This frontier includes, in particular, contracts with a deductible, contracts with total insurance, and the null vector. Hence, we unify the existing results and point out some extensions.

20.
This article proposes a new loss reserving approach, inspired from the collective model of risk theory. According to the collective paradigm, we do not relate payments to specific claims or policies, but we work within a frequency-severity setting, with a number of payments in every cell of the run-off triangle, together with the corresponding paid amounts. Compared to the Tweedie reserving model, which can be seen as a compound sum with Poisson-distributed number of terms and Gamma-distributed summands, we allow here for more general severity distributions, typically mixture models combining a light-tailed component with a heavier-tailed one, including inflation effects. The severity model is fitted to individual observations and not to aggregated data displayed in run-off triangles with a single value in every cell. In that respect, the modeling approach appears to be a powerful alternative to both the crude traditional aggregated approach based on triangles and the extremely detailed individual reserving approach developing each and every claim separately. A case study based on a motor third-party liability insurance portfolio observed over 2004–2014 is used to illustrate the relevance of the proposed approach.
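A simulation sketch of one run-off cell under the collective paradigm described above: a Poisson number of payments with a two-component mixture severity combining a light-tailed lognormal body and a heavier-tailed Pareto component; all parameter values are illustrative assumptions, not fitted to any portfolio.

```python
import numpy as np

rng = np.random.default_rng(8)

def cell_total(lam=4.0, p_heavy=0.1, n_sims=20_000):
    """Simulate the total paid amount in one run-off cell."""
    totals = np.empty(n_sims)
    for i in range(n_sims):
        n = rng.poisson(lam)                               # number of payments in the cell
        heavy = rng.random(n) < p_heavy                    # payments drawn from the heavy tail
        sev = np.where(heavy,
                       (rng.pareto(2.5, n) + 1) * 10_000,  # heavier-tailed Pareto component
                       rng.lognormal(7.0, 0.8, n))         # light-tailed lognormal body
        totals[i] = sev.sum()
    return totals

totals = cell_total()
print(f"mean = {totals.mean():,.0f}, 99.5% quantile = {np.quantile(totals, 0.995):,.0f}")
```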
