Similar Articles
20 similar articles found
1.
Journal of Banking & Finance, 2006, 30(10): 2605-2634
This paper conducts an event study analysis of the impact of operational loss events on the market values of banks and insurance companies, using the OpVar database. We focus on financial institutions because of the increased market and regulatory scrutiny of operational losses in these industries. The analysis covers all publicly reported banking and insurance operational risk events affecting publicly traded US institutions from 1978 to 2003 that caused operational losses of at least $10 million – a total of 403 bank events and 89 insurance company events. The results reveal a strong, statistically significant negative stock price reaction to announcements of operational loss events. On average, the market value response is larger for insurers than for banks. Moreover, the market value loss significantly exceeds the amount of the operational loss reported, implying that such losses convey adverse implications about future cash flows. Losses are proportionately larger for institutions with higher Tobin’s Q ratios, implying that operational loss events are more costly in market value terms for firms with strong growth prospects.
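
The workhorse calculation behind an event study of this kind is the cumulative abnormal return (CAR) around the announcement date. Below is a minimal sketch on synthetic data using a market-model benchmark; the estimation window, event window, and test statistic are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily returns: a market index and one bank's stock (illustrative only).
T = 300
market = rng.normal(0.0003, 0.01, T)
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.015, T)
event_day = 275               # index of the loss announcement in the series

# 1. Fit the market model r_stock = alpha + beta * r_market on the estimation window.
est = slice(0, 250)
beta, alpha = np.polyfit(market[est], stock[est], 1)

# 2. Abnormal returns over a (-10, +10) event window around the announcement.
win = slice(event_day - 10, event_day + 11)
abnormal = stock[win] - (alpha + beta * market[win])
car = abnormal.cumsum()       # cumulative abnormal return path

# 3. Simple t-statistic for the CAR, using the estimation-window residual variance.
resid_var = np.var(stock[est] - (alpha + beta * market[est]), ddof=2)
t_stat = car[-1] / np.sqrt(len(abnormal) * resid_var)
print(f"CAR(-10,+10) = {car[-1]:.4f}, t = {t_stat:.2f}")
```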

2.
In recent years, general risk measures have come to play an important role in risk management in both the finance and insurance industries. As a consequence, there is a growing body of research on optimal reinsurance problems that uses risk measures beyond the classical expected utility framework. In this paper, we first show that stop-loss reinsurance is an optimal contract under law-invariant convex risk measures via a new, simple geometric argument. A similar approach is then used to tackle the same optimal reinsurance problem under Value at Risk and Conditional Tail Expectation; interestingly, insurance layers, rather than stop-loss contracts, emerge as the optimal solution. These two results highlight that law-invariant convex risk measures are more robust than the more commonly used Value at Risk and Conditional Tail Expectation, in the sense that the corresponding optimal reinsurance still provides protection against extreme losses even if their probability of occurrence increases. Several illustrative examples are provided.
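
The contrast between the two optimal contract shapes is easy to see numerically: a stop-loss contract cedes the entire tail above a retention, while a layer caps the reinsurer's payment and leaves the extreme tail with the insured. A small sketch under a hypothetical retention and limit and a synthetic heavy-tailed loss distribution (none of these numbers come from the paper):

```python
import numpy as np

# Ceded-loss functions for the two contract shapes discussed above.
def stop_loss(x, d):
    """Stop-loss: the reinsurer pays everything above the retention d."""
    return np.maximum(x - d, 0.0)

def layer(x, d, u):
    """Insurance layer: the reinsurer pays only losses between d and u."""
    return np.minimum(np.maximum(x - d, 0.0), u - d)

# Illustrative loss distribution and contract parameters (hypothetical numbers).
rng = np.random.default_rng(1)
losses = rng.pareto(2.5, 100_000) * 10.0      # heavy-tailed ground-up losses
d, u = 10.0, 40.0

for name, ceded in [("stop-loss", stop_loss(losses, d)),
                    ("layer",     layer(losses, d, u))]:
    retained = losses - ceded
    var99 = np.quantile(retained, 0.99)          # VaR of the retained loss
    cte99 = retained[retained >= var99].mean()   # CTE beyond that VaR
    print(f"{name:9s}  E[ceded]={ceded.mean():6.2f}  "
          f"retained VaR99={var99:6.2f}  retained CTE99={cte99:6.2f}")
```

Under the stop-loss contract the retained loss is capped at the retention, whereas under the layer the retained tail beyond the limit reappears in the retained CTE, which is the robustness point made in the abstract.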

3.
This paper argues that in the fundamental subject of financial risk analysis, some valuable lessons may be drawn from insurance. The probability of ruin, defined as a first passage time, carries a dynamic element whose absence in Value at Risk is one liability, among others. Extreme value theory, which was successfully applied to insurance shortly after it was introduced in probability theory, may offer a coherent framework for analyzing extreme moves such as those observed in recent foreign exchange and financial crises. Lastly, we show that the genuine hazards generated by global capital markets, illustrated by the events of summer 1998, create a market incompleteness that existing models of defaultable bonds do not fully address. However, the long experience of risk premium analysis in the insurance and reinsurance industry, as well as the existence of historical data on natural disasters, render the valuation of catastrophe bonds less perilous than that of defaultable bonds.

4.

In this paper we present an overview of the standard risk sharing model of insurance. We discuss and characterize a competitive equilibrium, Pareto optimality, and representative agent pricing, including its implications for insurance premiums. We only touch upon the existence problem of a competitive equilibrium, primarily by presenting several examples. Risk tolerance and aggregation are the subject of one section. Risk adjustment of the probability measure is another topic, as is the insurance version of the capital asset pricing model. The competitive paradigm may be a little demanding in practice, so we alternatively present a game-theoretic view of risk sharing, where solutions end up in the core. Properly interpreted, this may give rise to a range of prices for each risk, often visualized in practice by an ask price and a bid price. The nice aspect of this is that these price ranges can be explained from "first principles", without relying on transaction costs or other frictions. We also include a short discussion of moral hazard in risk sharing between an insurer and a prospective insurance buyer. We end the paper by indicating the implications of our results for a pure stock market. In particular, we find it advantageous to discuss the concept of incomplete markets in this general setting, where results for closed, convex subspaces of an L^2-space can be used to address optimal risk allocation problems in incomplete financial markets.

5.
Geman, Helyette. Review of Finance, 1999, 2(2): 113-124
This paper argues that in the fundamental subject of financial risk analysis, some valuable lessons may be drawn from insurance. The probability of ruin, defined as a first passage time, carries a dynamic element whose absence in Value at Risk is one liability, among others. Extreme value theory, which was successfully applied to insurance shortly after it was introduced in probability theory, may offer a coherent framework for analyzing extreme moves such as those observed in recent foreign exchange and financial crises. Lastly, we show that the genuine hazards generated by global capital markets, illustrated by the events of summer 1998, create a market incompleteness that existing models of defaultable bonds do not fully address. In contrast, the long experience of risk premium analysis in the insurance and reinsurance industry, as well as the existence of historical data on natural disasters, render the valuation of catastrophe bonds less perilous than that of defaultable bonds.

6.
In this paper, we impose the insurer's Value at Risk (VaR) constraint on Arrow's optimal insurance model. The insured aims to maximize his expected utility of terminal wealth, subject to the constraint that the insurer keeps the VaR of its terminal wealth below a prespecified level. It is shown that when the insurer's VaR constraint is binding, the optimal contract is not a linear contract but a piecewise-linear deductible, and the insured's optimal expected utility increases as the insurer becomes more risk-tolerant. Basak and Shapiro (2001) showed that VaR risk managers often choose larger exposures to risky assets, and we draw a similar conclusion here: when the insured has an exponential utility function, optimal insurance under the VaR constraint causes the insurer to suffer larger losses than optimal insurance without the risk constraint.

7.
We examine the predictive value of risk perceptions, measured by the gold-to-silver and gold-to-platinum price ratios, for stock-market tail risks and their connectedness in eight major industrialized economies, using monthly data for the periods 1916:02–2020:10 and 1968:01–2020:10. We use four variants of the popular Conditional Autoregressive Value at Risk (CAViaR) framework to estimate the tail risks at both the 1% and 5% VaR levels. Our findings for the short sample period show that the gold-to-silver price ratio resembles the gold-to-platinum price ratio in that it is a useful proxy for global risk. Our findings for the long sample period show, despite some heterogeneity across economies, that the gold-to-silver price ratio often helps to forecast, out of sample, both the 1% and 5% stock-market tail risks, particularly when a forecaster suffers a higher loss from underestimating tail risks than from a corresponding overestimation of the same absolute size. We also find that using the gold-to-silver price ratio to forecast the total connectedness of stock markets is beneficial for an investor who suffers a higher loss from underestimating total connectedness (i.e., an investor who would otherwise overestimate the benefits of portfolio diversification) than from a comparable overestimation.
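
For reference, the symmetric absolute value specification is one of the standard CAViaR variants; its parameters are estimated by minimizing the quantile ("tick") loss. A minimal sketch on synthetic returns follows; the starting values, optimizer, and data are illustrative assumptions, and the paper's exact variants and predictors (such as the gold-to-silver ratio) are not modelled here.

```python
import numpy as np
from scipy.optimize import minimize

def caviar_sav(params, returns, q):
    """Symmetric absolute value CAViaR: VaR_t = b0 + b1*VaR_{t-1} + b2*|r_{t-1}|."""
    b0, b1, b2 = params
    var = np.empty_like(returns)
    var[0] = np.quantile(returns[:50], q)          # start from an empirical quantile
    for t in range(1, len(returns)):
        var[t] = b0 + b1 * var[t - 1] + b2 * abs(returns[t - 1])
    return var

def tick_loss(params, returns, q):
    """Quantile (pinball) loss; minimizing it estimates the q-quantile dynamics."""
    var = caviar_sav(params, returns, q)
    u = returns - var
    return np.mean((q - (u < 0)) * u)

rng = np.random.default_rng(2)
returns = rng.standard_t(df=5, size=2000) * 0.02    # synthetic fat-tailed returns
q = 0.05                                            # 5% VaR level

res = minimize(tick_loss, x0=[-0.01, 0.8, -0.2], args=(returns, q),
               method="Nelder-Mead")
var_path = caviar_sav(res.x, returns, q)
print("estimated (b0, b1, b2):", np.round(res.x, 3))
print("share of days with return below VaR:", np.mean(returns < var_path))
```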

8.
Catastrophes are by definition rare, which makes it difficult to project future losses using historical loss information. Catastrophe modelers have developed alternative methodologies based on sophisticated techniques that combine physics, meteorology, engineering, statistics, actuarial sciences, and other disciplines to provide estimates of the likelihood of losses from extreme events. This article discusses the basics of catastrophe modeling and describes an approach to implementing a lesson on catastrophe modeling in the insurance curriculum.

9.
What is the catastrophe risk a life insurance company faces? What is the correct price of a catastrophe cover? During a review of the current standard model, due to Strickler, we found that it has some serious shortcomings. We therefore present a new model for the pricing of catastrophe excess of loss cover (Cat XL). The new model for annual claim cost C is based on a compound Poisson process of catastrophe costs. To evaluate the distribution of the cost of each catastrophe, we use the Peaks Over Threshold model for the total number of lost lives in each catastrophe and the beta-binomial model for the proportion of these corresponding to customers of the insurance company. To be able to estimate the parameters of the model, international and Swedish data were collected and compiled, listing accidents claiming at least twenty lives and at least four lives, respectively. Fitting the new model to the data, we find the fit to be good. Finally, we give the price of a Cat XL contract and perform a sensitivity analysis of how some of the parameters affect the expected value and standard deviation of the cost and thus the price.
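
The compound structure described here lends itself to Monte Carlo pricing: a Poisson number of catastrophes per year, a generalized Pareto (Peaks Over Threshold) draw for lives lost per catastrophe, a beta-binomial draw for the insured share, and a per-catastrophe layer. The sketch below uses entirely hypothetical parameter values; the paper estimates them from the international and Swedish accident data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical parameters; the paper estimates these from accident data.
lam = 0.8                              # expected catastrophes per year (Poisson)
xi, sigma, thresh = 0.4, 15.0, 20.0    # GPD for lives lost above the threshold
a, b = 0.5, 200.0                      # beta-binomial parameters for the insured share
benefit = 0.5                          # sum paid per insured life lost (illustrative units)
retention, limit = 5.0, 50.0           # Cat XL layer per catastrophe

def one_year():
    ceded = 0.0
    for _ in range(rng.poisson(lam)):
        lives = thresh + stats.genpareto.rvs(xi, scale=sigma, random_state=rng)
        p = rng.beta(a, b)                        # catastrophe-specific insured proportion
        insured = rng.binomial(int(lives), p)     # insured lives lost (beta-binomial)
        cost = benefit * insured
        ceded += min(max(cost - retention, 0.0), limit - retention)
    return ceded

annual = np.array([one_year() for _ in range(50_000)])
print(f"expected ceded cost: {annual.mean():.3f}")
print(f"std of ceded cost:   {annual.std():.3f}")
print(f"price with a 20% std loading (illustrative): {annual.mean() + 0.2 * annual.std():.3f}")
```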

10.
The vast literature on stochastic loss reserving concentrates on data aggregated in run-off triangles. However, a triangle is a summary of an underlying data-set with the development of individual claims. We refer to this data-set as ‘micro-level’ data. Using the framework of Position Dependent Marked Poisson Processes and statistical tools for recurrent events, we analyze a data-set of liability claims from a European insurance company. We use detailed information on the time of occurrence of the claim, the delay between occurrence and reporting to the insurance company, the occurrences of payments and their sizes, and the final settlement. Our specifications are (semi)parametric and our approach is likelihood based. We calibrate our model to historical data and use it to project the future development of open claims. An out-of-sample prediction exercise shows that we obtain detailed and valuable reserve calculations. For the case study developed in this paper, the micro-level model outperforms the results obtained with traditional loss reserving methods for aggregate data.

11.
Recent research has found a number of scaling law relationships in foreign exchange data. These relationships, estimated using simple ordinary least squares, can be used to forecast losses in foreign exchange time series from as little as one month’s tick data. We compare the loss forecasts from a new scaling law against six parametric Value at Risk models. Compared to these models, the new scaling law is easier to fit, provides more stable forecasts and is very accurate.
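
Scaling laws of this kind relate a loss quantity to the time horizon through a power law that becomes a straight line in log-log coordinates, so it can be fitted by ordinary least squares and extrapolated. The sketch below illustrates the idea on synthetic tick data; the specific loss definition, horizons, and extrapolation target are assumptions for illustration, not the paper's new scaling law.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "tick" returns standing in for one month of FX data (illustrative).
ticks = rng.normal(0, 1e-4, 500_000)

# Empirical scaling: mean maximal cumulative loss over non-overlapping windows
# of increasing length, then an OLS fit of log(loss) on log(window length).
horizons = np.array([50, 100, 250, 500, 1000, 2500])
mean_max_loss = []
for h in horizons:
    n = len(ticks) // h
    paths = ticks[: n * h].reshape(n, h).cumsum(axis=1)
    mean_max_loss.append(-paths.min(axis=1).mean())   # average worst cumulative loss

slope, intercept = np.polyfit(np.log(horizons), np.log(mean_max_loss), 1)
print(f"fitted scaling exponent: {slope:.3f}")        # about 0.5 for i.i.d. Gaussian ticks

# Extrapolate the expected maximal loss to a longer horizon (e.g. 10,000 ticks).
h_new = 10_000
forecast = np.exp(intercept + slope * np.log(h_new))
print(f"forecast maximal loss over {h_new} ticks: {forecast:.5f}")
```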

12.
We develop a regime-switching framework to compute the Value at Risk and Expected Shortfall measures. Although Value at Risk has been criticized by some researchers for its lack of subadditivity, it remains a central tool in banking regulation and internal risk management in the finance industry. In contrast, Expected Shortfall is coherent and convex, so it is a better measure of risk than Value at Risk. Expected Shortfall is widely used in the insurance industry and has the potential to replace Value at Risk as a standard risk measure in the near future. We propose regime-switching models to measure Value at Risk and Expected Shortfall for a single financial asset as well as for financial portfolios. Our models capture the volatility clustering phenomenon and variance-independent variation in the higher moments by assuming the returns follow Student-t distributions.
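
A simple way to see how regime switching with Student-t returns produces clustered volatility and fat-tailed VaR/ES is to simulate a two-regime chain and read the risk measures off the simulated distribution. The transition matrix, regime parameters, and simulation-based evaluation below are hypothetical; the paper estimates its regime-switching models from data rather than assuming them.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical two-regime model: a calm and a turbulent regime with Student-t returns.
P = np.array([[0.97, 0.03],            # regime transition probabilities
              [0.10, 0.90]])
mu    = np.array([0.0005, -0.001])     # regime means
scale = np.array([0.008,   0.025])     # regime scales
df    = np.array([8.0,     4.0])       # Student-t degrees of freedom per regime

# Simulate a long return path, switching regimes according to P.
T, state = 100_000, 0
returns = np.empty(T)
for t in range(T):
    state = rng.choice(2, p=P[state])
    returns[t] = mu[state] + scale[state] * rng.standard_t(df[state])

# Empirical VaR and Expected Shortfall of the loss L = -return at the 99% level.
losses = -returns
var99 = np.quantile(losses, 0.99)
es99 = losses[losses >= var99].mean()
print(f"99% VaR: {var99:.4f}   99% ES: {es99:.4f}")
```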

13.
Can a Coherent Risk Measure Be Too Subadditive?
We consider the problem of determining appropriate solvency capital requirements for an insurance company or a financial institution. We demonstrate that the subadditivity condition that is often imposed on solvency capital principles can lead to the undesirable situation where the shortfall risk increases as a result of a merger. We propose to complement the subadditivity condition by a regulator's condition. We find that for an explicitly specified confidence level, the Value-at-Risk satisfies the regulator's condition and is the “most efficient” capital requirement in the sense that it minimizes some reasonable cost function. Within the class of concave distortion risk measures, whose elements, in contrast to the Value-at-Risk, exhibit the subadditivity property, we find that, again for an explicitly specified confidence level, the Tail-Value-at-Risk is the optimal capital requirement satisfying the regulator's condition.
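
The subadditivity issue at stake is easy to reproduce numerically: for sufficiently skewed losses, the VaR of a merged position can exceed the sum of the stand-alone VaRs, while the Tail-Value-at-Risk remains subadditive. The distribution below is purely illustrative and is not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

def var_tvar(x, level):
    """Empirical VaR and Tail-Value-at-Risk (average of the worst (1 - level) share)."""
    x_sorted = np.sort(x)
    k = int(np.ceil(level * len(x)))
    return x_sorted[k - 1], x_sorted[k - 1:].mean()

# Two i.i.d. losses: 0 with probability 0.96, a Pareto-type amount with probability 0.04.
n = 1_000_000
def sample():
    hit = rng.random(n) < 0.04
    return hit * (1.0 + rng.pareto(2.0, n))

X, Y = sample(), sample()
level = 0.95                        # confidence level just below the no-loss probability

var_x, tvar_x = var_tvar(X, level)
var_y, tvar_y = var_tvar(Y, level)
var_s, tvar_s = var_tvar(X + Y, level)

print(f"VaR:  X={var_x:.3f}  Y={var_y:.3f}  X+Y={var_s:.3f}  sum={var_x + var_y:.3f}")
print(f"TVaR: X={tvar_x:.3f}  Y={tvar_y:.3f}  X+Y={tvar_s:.3f}  sum={tvar_x + tvar_y:.3f}")
# Here VaR(X+Y) > VaR(X) + VaR(Y) (the merger looks worse under VaR),
# while TVaR(X+Y) <= TVaR(X) + TVaR(Y), consistent with TVaR's subadditivity.
```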

14.
This paper extends the existing literature on deposit insurance by proposing a new approach for estimating the loss distribution of a Deposit Insurance Scheme (DIS) based on the Basel 2 regulatory framework. In particular, we generate the distribution of banks’ losses following the Basel 2 theoretical approach and focus on the part of this distribution that is not covered by capital (tail risk). We also refine our approach by considering two major sources of systemic risk: the correlation between banks’ assets and interbank lending contagion. The application of our model to 2007 data for a sample of Italian banks shows that the target size of the Italian deposit insurance system covers up to 98.96% of its potential losses. Furthermore, it emerges that the introduction of bank contagion via the interbank lending market could lead to the collapse of the entire Italian banking system. Our analysis points out that the existing Italian deposit insurance system can be assessed as adequate only in normal times and not in bad market conditions with substantial contagion between banks. Overall, we argue that policy makers should explicitly consider the following when estimating DIS loss distributions: first, the regulatory framework within which banks operate, such as Basel 2 capital requirements; and, second, potential sources of systemic risk such as the correlation between banks’ assets and the risk of interbank contagion.
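
Basel 2's credit risk charges rest on the Vasicek single-risk-factor model, and a DIS absorbs the part of the loss distribution beyond a bank's capital. The rough sketch below computes that uncovered tail for a single bank with hypothetical PD, LGD, asset correlation, and exposure; the paper works bank by bank with real data and also adds interbank contagion, which is not modelled here.

```python
import numpy as np
from scipy.stats import norm

# Vasicek single-factor loss-rate distribution underlying the Basel 2 formulas.
def loss_rate_quantile(pd, rho, q):
    """q-quantile of the default rate for default probability pd and asset correlation rho."""
    return norm.cdf((norm.ppf(pd) + np.sqrt(rho) * norm.ppf(q)) / np.sqrt(1 - rho))

# Hypothetical inputs for one bank's loan portfolio (exposure in EUR millions).
pd, lgd, rho, exposure = 0.02, 0.45, 0.15, 1_000.0

# Basel-style capital: unexpected loss at the 99.9% confidence level.
capital = lgd * (loss_rate_quantile(pd, rho, 0.999) - pd) * exposure

# Simulate systematic-factor losses and keep only the tail not covered by capital,
# which is what a deposit insurance scheme would have to absorb.
rng = np.random.default_rng(7)
z = rng.standard_normal(500_000)
cond_pd = norm.cdf((norm.ppf(pd) - np.sqrt(rho) * z) / np.sqrt(1 - rho))
losses = lgd * cond_pd * exposure
expected_loss = lgd * pd * exposure
shortfall = np.maximum(losses - expected_loss - capital, 0.0)

print(f"capital (99.9% unexpected loss): {capital:.1f}")
print(f"probability of loss beyond capital: {np.mean(shortfall > 0):.4%}")
print(f"expected tail loss borne by the DIS: {shortfall.mean():.3f}")
```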

15.

This paper presents a general probabilistic model, including stochastic discounting, for life insurance contracts, either a single policy or a portfolio of policies. In § 4 we define prospective reserves and annual losses in terms of our model and we show that these are generalisations of the corresponding concepts in conventional life insurance mathematics. Our main results are in § 5 where we use the martingale property of the loss process to derive upper bounds for the probability of ruin for the portfolio.  These results are illustrated by two numerical examples in § 6.
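
As background, the prototype of such a martingale-based ruin bound is Lundberg's inequality; the bounds in § 5 generalize the idea to stochastic discounting and portfolios of policies, so the classical discrete-time statement below is only the textbook special case, not the paper's result.

```latex
% Classical Lundberg bound (discrete time), the prototype of the martingale argument.
% Surplus after year n: U_n = u + nc - S_n, with premium c per year and aggregate losses S_n.
% If the adjustment coefficient R > 0 solves E[ e^{R (S_1 - c)} ] = 1, then
% { e^{-R U_n} } is a martingale, and optional stopping at the ruin time yields
\psi(u) \;=\; \Pr\!\Big(\inf_{n \ge 0} U_n < 0\Big) \;\le\; e^{-R u}.
```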

16.
In this paper we propose a framework for measuring and stress testing the systemic risk of a group of major financial institutions. The systemic risk is measured by the price of insurance against financial distress, which is based on ex ante measures of default probabilities of individual banks and forecasted asset return correlations. Importantly, using realized correlations estimated from high-frequency equity return data can significantly improve the accuracy of forecasted correlations. Our stress testing methodology, using an integrated micro–macro model, takes into account dynamic linkages between the health of major US banks and macro-financial conditions. Our results suggest that the theoretical insurance premium that would be charged to protect against losses that equal or exceed 15% of total liabilities of 12 major US financial firms stood at $110 billion in March 2008 and had a projected upper bound of $250 billion in July 2008.
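
One way to see the mechanics of a "distress insurance premium" of this kind is to simulate joint defaults from individual default probabilities and an asset-return correlation matrix, then price the loss in excess of a liability threshold. The sketch below uses a Gaussian copula and entirely hypothetical liabilities, default probabilities, LGD, and correlations; the paper instead uses 12 major US firms, market-implied default probabilities, and high-frequency correlation forecasts, and its exact payoff definition may differ.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)

# Hypothetical inputs for a small group of banks.
liabilities = np.array([900.0, 700.0, 500.0, 400.0])     # USD billions
pd_1y       = np.array([0.02, 0.03, 0.05, 0.04])          # one-year default probabilities
lgd         = 0.55                                         # loss given default
corr        = 0.5 * np.ones((4, 4)) + 0.5 * np.eye(4)      # asset-return correlations

# Gaussian-copula simulation of joint defaults.
n_sim = 200_000
chol = np.linalg.cholesky(corr)
z = rng.standard_normal((n_sim, 4)) @ chol.T
defaults = norm.cdf(z) < pd_1y                             # default indicators per scenario

# Aggregate loss and a distress insurance premium: expected loss in excess of
# 15% of total liabilities (the threshold quoted in the abstract).
losses = (defaults * liabilities * lgd).sum(axis=1)
threshold = 0.15 * liabilities.sum()
premium = np.mean(np.maximum(losses - threshold, 0.0))
print(f"P(loss >= 15% of liabilities): {np.mean(losses >= threshold):.4%}")
print(f"distress insurance premium:    {premium:.2f} (USD bn)")
```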

17.
The cash-flow valuation method (CFVM) has been developed in Canada for the valuation of insurance company annuity products. Its range of application is expected to be extended shortly to the valuation of most other life insurance company products. The CFVM is based on the use of “best-guess” assumptions, supplemented by specific provisions for adverse deviations. In this paper, special attention is paid to the calculation of the provision for adverse deviations with respect to the interest rate risk. We show that determining this provision for life insurance and annuity policy liabilities is the analogue of banks' calculation of Value at Risk (VaR) for portfolios of securities held for trading.

18.
Cashflow-at-Risk (C-FaR) is an attempt to create an analogue to Value at Risk (VaR) that can be used by non-financial firms to quantify various kinds of risk exposures, including interest rate, exchange rate, and commodity price risks. There are two basic ways to attack this problem. One is from the "bottom up," which involves building a detailed model of all of a company's specific exposures. The C-FaR approach presented here is a "top-down" method of comparables that looks directly at the ultimate item of interest: the company's cashflows. The fundamental challenge facing the top-down strategy is that, for any one company, there is not enough data on its own cashflows to make precise statements about the likelihood of rare events. To get around this problem, the authors match a target company with a large set of comparable companies that are expected to have similar cashflow volatility. The comparables are chosen to be close to the target company on four dimensions: (1) market cap; (2) profitability; (3) industry risk; and (4) stock price volatility.
C-FaR can be useful to managers addressing a variety of corporate finance decisions. For example, by providing estimates of the probability of financial distress, the C-FaR method can be used in conjunction with capital structure data to help formulate debt-equity tradeoffs in a more precise, quantifiable fashion. It can also be used to evaluate a firm's overall risk management strategy, including the expected benefits of using derivatives to hedge commodity-price exposures or of purchasing insurance policies. Moreover, C-FaR may even have a use in investor relations: by disclosing the results of a comparables-based C-FaR analysis ahead of time, a company may be able to cushion earnings shocks by furnishing investors or analysts with credible, objective estimates of what is likely to happen to their cash flows under different economic scenarios.
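
The comparables idea translates into a short procedure: standardize the four matching characteristics, find the nearest firms in that space, pool their historical cash-flow shocks, and read off an empirical lower quantile. The sketch below runs on synthetic data; the number of comparables, the quarterly horizon, and the scaling of cash-flow shocks by assets are assumptions for illustration, not the method's published calibration.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic universe of firms: four matching characteristics per firm
# (market cap, profitability, industry risk, stock volatility) plus a history
# of quarterly cash-flow shocks scaled by assets. All numbers are illustrative.
n_firms, n_quarters = 2000, 20
features = rng.normal(size=(n_firms, 4))
cf_shocks = rng.standard_t(df=5, size=(n_firms, n_quarters)) * 0.03

# Target firm: its characteristics, standardized against the universe.
target = np.array([0.4, -0.2, 1.0, 0.8])
zscores = (features - features.mean(axis=0)) / features.std(axis=0)
target_z = (target - features.mean(axis=0)) / features.std(axis=0)

# Pick the 100 closest comparables in the standardized feature space.
dist = np.linalg.norm(zscores - target_z, axis=1)
comparables = np.argsort(dist)[:100]

# Pool the comparables' cash-flow shocks and take the empirical 5th percentile:
# the one-quarter Cashflow-at-Risk at the 95% confidence level.
pooled = cf_shocks[comparables].ravel()
cfar_95 = np.quantile(pooled, 0.05)
print(f"C-FaR (95%, one quarter, as a share of assets): {cfar_95:.3f}")
```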

19.
Conditional VaR using EVT - Towards a planned margin scheme
This paper constructs a robust Value-at-Risk (VaR) measure for the Indian stock markets by combining two well-known facts about equity return time series: dynamic volatility, resulting in the well-recognized phenomenon of volatility clustering, and non-normality, giving rise to fat tails of the return distribution. While volatility dynamics have been extensively studied using the GARCH model and its many relatives, the application of Extreme Value Theory (EVT) to tracking extreme losses in risk measurement is relatively recent. EVT has been applied to estimate the unexpected losses due to extreme events, and hence to modify the current VaR methodology, and it has been used to analyze financial data showing clearly non-normal behavior. We combine the two methodologies to arrive at a robust model with much enhanced predictive ability. A robust model would obviate the need for the regulator to impose special ad hoc margins in times of extreme volatility. A rule-based margin system would increase the efficiency of the price discovery process as well as market integrity, with the regulator no longer seen as managing volatility.
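
A common way to combine the two methodologies is to filter returns with a GARCH(1,1) model, fit a generalized Pareto tail to the standardized residuals via peaks over threshold, and scale the EVT quantile by the volatility forecast. The sketch below does this on synthetic data with a simple quasi-maximum-likelihood fit; the threshold choice, starting values, and data are assumptions, and the paper's exact specification may differ.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genpareto

rng = np.random.default_rng(10)

# Synthetic fat-tailed return series standing in for an equity index.
returns = rng.standard_t(df=4, size=3000) * 0.012

def garch_filter(params, r):
    """Conditional variance path of a GARCH(1,1) model."""
    omega, alpha, beta = params
    h = np.empty_like(r)
    h[0] = r.var()
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return h

def neg_loglik(params, r):
    """Gaussian quasi-likelihood; penalize invalid parameter values."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return 1e10
    h = garch_filter(params, r)
    return 0.5 * np.sum(np.log(h) + r ** 2 / h)

# 1. Fit GARCH(1,1) by quasi-maximum likelihood and standardize the returns.
res = minimize(neg_loglik, x0=[1e-5, 0.05, 0.90], args=(returns,), method="Nelder-Mead")
h = garch_filter(res.x, returns)
z = returns / np.sqrt(h)

# 2. Peaks-over-threshold fit of a generalized Pareto tail to the residual losses.
losses = -z
u = np.quantile(losses, 0.95)                 # tail threshold
exceed = losses[losses > u] - u
xi, _, sigma = genpareto.fit(exceed, floc=0)

# 3. Conditional 99% VaR for the next day: EVT quantile times the volatility forecast.
p_tail = np.mean(losses > u)                  # share of observations beyond the threshold
q = 0.99
z_var = u + (sigma / xi) * (((1 - q) / p_tail) ** (-xi) - 1)
omega, alpha, beta = res.x
h_next = omega + alpha * returns[-1] ** 2 + beta * h[-1]
print(f"conditional 99% VaR (loss): {np.sqrt(h_next) * z_var:.4f}")
```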

20.
Secondary life insurance markets are growing rapidly. From nearly no transactions in 1980, this market has developed a wide variety of related products, including viatical settlements, accelerated death benefits, and life settlements; as the population ages, these markets will become increasingly popular. Eight state governments, in a bid to guarantee sellers a “fair” price, have passed regulations setting a price floor on secondary life insurance market transactions, and more are considering doing the same. Using data from a unique random sample of HIV+ patients, we estimate welfare losses from transactions prevented by binding price floors in the viatical settlements market (an important segment of the secondary life insurance market). We find that price floors bind for HIV patients with more than four years of life expectancy. Furthermore, HIV patients from states with price floors are significantly less likely to viaticate than similarly healthy HIV patients from other states. If price floors were adopted nationwide, they would rule out transactions worth $119 million per year. We find that the magnitude of welfare loss from these blocked transactions would be highest for consumers who are relatively poor, have weak bequest motives, and have a high rate of time preference.
