Similar Literature
 Found 20 similar documents (search time: 218 ms)
1.
We specify a general methodological framework for systemic risk measures via multidimensional acceptance sets and aggregation functions. Existing systemic risk measures can usually be interpreted as the minimal amount of cash needed to secure the system after aggregating individual risks. In contrast, our approach also includes systemic risk measures that can be interpreted as the minimal amount of cash that secures the aggregated system by allocating capital to the single institutions before aggregating the individual risks. An important feature of our approach is the possibility of allocating cash according to the future state of the system (scenario‐dependent allocation). We also provide conditions that ensure monotonicity, convexity, or quasi‐convexity of our systemic risk measures.
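The "cash after aggregation" versus "cash before aggregation" distinction can be illustrated in a toy discrete setting. This is a minimal sketch, assuming a loss-only aggregation function Λ(x) = Σᵢ min(xᵢ, 0), an expectation-based acceptance criterion E[Λ] ≥ 0, and deterministic (scenario-independent) allocations; the paper's framework is far more general.

```python
import numpy as np
from scipy.optimize import linprog

def rho_after(X, p):
    """Cash added to the already-aggregated position: rho = -E[Lambda(X)]."""
    agg = np.minimum(X, 0.0).sum(axis=0)          # Lambda(X) in each scenario
    return max(-(p * agg).sum(), 0.0)

def rho_before(X, p):
    """Minimal total cash allocated to institutions before aggregation.
    LP: minimise sum(k) s.t. E[sum_i u_i] >= 0, u_ij <= x_ij + k_i, u_ij <= 0."""
    n, s = X.shape
    c = np.concatenate([np.ones(n), np.zeros(n * s)])   # objective: sum of k_i
    rows, b = [], []
    for i in range(n):
        for j in range(s):
            r = np.zeros(n + n * s)
            r[i] = -1.0                  # -k_i
            r[n + i * s + j] = 1.0       # +u_ij   (u_ij - k_i <= x_ij)
            rows.append(r); b.append(X[i, j])
    r = np.zeros(n + n * s)
    for i in range(n):
        for j in range(s):
            r[n + i * s + j] = -p[j]     # -E[sum u] <= 0
    rows.append(r); b.append(0.0)
    bounds = [(0, None)] * n + [(None, 0)] * (n * s)
    res = linprog(c, A_ub=np.array(rows), b_ub=np.array(b), bounds=bounds)
    return res.fun

# two institutions, two equally likely scenarios (hypothetical numbers)
X = np.array([[-1.0, 2.0],
              [ 3.0, -2.0]])            # rows: institutions, cols: scenarios
p = np.array([0.5, 0.5])
print(rho_after(X, p), rho_before(X, p))   # 1.5  3.0
```

Allocating before aggregation costs more here (3.0 vs. 1.5) because cash injected into an institution that is already solvent in a scenario is "wasted" by the loss-only aggregation, whereas cash added after aggregation offsets the expected aggregate loss one for one.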

2.
We propose to interpret distribution model risk as sensitivity of expected loss to changes in the risk factor distribution, and to measure the distribution model risk of a portfolio by the maximum expected loss over a set of plausible distributions defined in terms of some divergence from an estimated distribution. The divergence may be relative entropy or another f‐divergence or Bregman distance. We use the theory of minimizing convex integral functionals under moment constraints to give formulae for the calculation of distribution model risk and to explicitly determine the worst case distribution from the set of plausible distributions. We also evaluate related risk measures describing divergence preferences.
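For the relative-entropy case, the worst-case expected loss over a KL ball admits a well-known one-dimensional convex dual, sup over {Q : KL(Q‖P) ≤ η} of E_Q[ℓ] equals inf over θ > 0 of θ log E_P[exp(ℓ/θ)] + θη. A minimal numerical sketch (the sample size, bandwidth of the θ search, and Gaussian test distribution are illustrative assumptions, not the paper's setup):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def worst_case_mean_kl(losses, eta):
    """sup_{KL(Q||P)<=eta} E_Q[loss], computed via the dual
       inf_{theta>0} theta*log E_P[exp(loss/theta)] + theta*eta."""
    losses = np.asarray(losses, float)
    def dual(log_theta):
        theta = np.exp(log_theta)        # optimise over log(theta) > 0
        m = losses.max()                 # log-sum-exp trick for stability
        return theta * (m / theta
                        + np.log(np.mean(np.exp((losses - m) / theta)))) + theta * eta
    res = minimize_scalar(dual, bounds=(-10, 10), method="bounded")
    return res.fun

rng = np.random.default_rng(0)
losses = rng.normal(0.0, 1.0, 50_000)    # stand-in for the estimated distribution
for eta in (0.0, 0.01, 0.1):
    print(eta, worst_case_mean_kl(losses, eta))
```

For a Gaussian baseline the worst case is approximately μ + σ√(2η), which the empirical values reproduce; larger η (a larger set of plausible distributions) yields a larger worst-case loss, as expected.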

3.
This paper provides a conceptual and empirical framework for evaluating the effect of capital controls on long‐term economic growth. In a small open economy which relies on successful investment projects to provide capital goods, taking out short‐term loans has two contradictory impacts: (i) it reduces the interest costs of financing investment projects; and (ii) it also leads to larger asset losses in the event of a short‐term debt run. In this work, we hypothesise that private financing decisions made by domestic investors are distorted towards excessive risk‐taking, leading to ineffective capital formation. Thus, capital control policies, particularly regulations on short‐term loans, can be socially beneficial as they alter the debt composition, promote capital formation and achieve a higher output level. Using a panel data set covering 77 countries from 1995 to 2009, we employ a system generalised method of moments (GMM) estimator to sequentially test three hypotheses and find strong empirical evidence that supports our theory.

4.
Past research concerning decision framing has found that buyers' choices tend to be risk averse for decisions framed as a gain and risk seeking for choices framed as a loss. Attribution theory suggests that women may be less likely to take risks than men when faced with similar decision‐making problems. The present study sought to determine whether there would be differences between men and women with respect to their supplier choices based on how the purchasing decision was framed. Two experiments were conducted: one using a price‐based purchasing scenario and the other using a price‐based and a quality‐based purchasing scenario. For the price‐based scenario women tended to make more risky supplier choices than men when the purchasing decision was framed as a loss, and less risky supplier choices than men when the purchasing decision was framed as a gain. No differences between the sexes were found for the quality‐based scenario. © 1999 John Wiley & Sons, Inc.

5.
We study risk‐minimizing hedging strategies for derivatives in a model where the asset price follows a marked point process with stochastic jump intensity, which depends on some unobservable state‐variable process. This model reflects stylized facts that are typical for high frequency data. We assume that agents in our model are restricted to observing past asset prices. This poses some problems for the computation of risk‐minimizing hedging strategies as the current value of the state variable is unobservable for our agents. We overcome this difficulty by a two‐step procedure, which is based on a projection result of Schweizer, and show that in our context the computation of risk‐minimizing strategies leads to a filtering problem that has received some attention in the nonlinear filtering literature.

6.
Using a suitable change of probability measure, we obtain a Poisson series representation for the arbitrage‐free price process of vulnerable contingent claims in a regime‐switching market driven by an underlying continuous‐time Markov process. As a result of this representation, along with a short‐time asymptotic expansion of the claim's price process, we develop an efficient novel method for pricing claims whose payoffs may depend on the full path of the underlying Markov chain. The proposed approach is applied to price not only simple European claims such as defaultable bonds, but also a new type of path‐dependent claims that we term self‐decomposable, as well as the important class of vulnerable call and put options on a stock. We provide a detailed error analysis and illustrate the accuracy and computational complexity of our method on several market traded instruments, such as defaultable bond prices, barrier options, and vulnerable call options. Using again our Poisson series representation, we show differentiability in time of the predefault price function of European vulnerable claims, which enables us to rigorously deduce Feynman‐Kac representations for the predefault pricing function and new semimartingale representations for the price process of the vulnerable claim under both risk‐neutral and objective probability measures.

7.
We propose a stable nonparametric algorithm for the calibration of “top‐down” pricing models for portfolio credit derivatives: given a set of observations of market spreads for collateralized debt obligation (CDO) tranches, we construct a risk‐neutral default intensity process for the portfolio underlying the CDO which matches these observations, by looking for the risk‐neutral loss process “closest” to a prior loss process, verifying the calibration constraints. We formalize the problem in terms of minimization of relative entropy with respect to the prior under calibration constraints and use convex duality methods to solve the problem: the dual problem is shown to be an intensity control problem, characterized in terms of a Hamilton–Jacobi system of differential equations, for which we present an analytical solution. Given a set of observed CDO tranche spreads, our method allows us to construct a default intensity process which leads to tranche spreads consistent with the observations. We illustrate our method on iTraxx index data: our results reveal strong evidence for the dependence of loss transition rates on the previous number of defaults, and offer quantitative evidence for contagion effects in the (risk‐neutral) loss process.
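The entropy-minimization step can be illustrated in a toy discrete setting (assumed here for illustration; the paper works in continuous time via intensity control). Minimizing KL(q‖p) subject to a linear moment constraint yields an exponential tilt of the prior, qᵢ ∝ pᵢ exp(λ fᵢ), with λ fixed by root-finding:

```python
import numpy as np
from scipy.optimize import brentq

def entropy_calibrate(p, f, target):
    """Minimise KL(q||p) subject to E_q[f] = target.
    The minimiser is the exponential tilt q_i ∝ p_i * exp(lam * f_i)."""
    def moment(lam):
        w = p * np.exp(lam * (f - f.mean()))   # centre f for numerical stability
        q = w / w.sum()
        return q @ f - target
    lam = brentq(moment, -50.0, 50.0)          # moment gap is monotone in lam
    w = p * np.exp(lam * (f - f.mean()))
    return w / w.sum()

p = np.full(5, 0.2)            # uniform prior over 5 default-count states
f = np.arange(5.0)             # e.g. number of defaults in each state
q = entropy_calibrate(p, f, target=3.0)
print(q, q @ f)                # tilted law whose mean number of defaults is 3
```

The same exponential-family structure underlies the paper's dual: calibration constraints enter as Lagrange multipliers, and the closest loss process under relative entropy is an intensity-tilted version of the prior.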

8.
(Christoph Böhringer and Andreas Löschel) International climate policy has assigned the leading role in emissions abatement to the industrialised countries while developing countries remain uncommitted to binding emission reduction targets. However, cooperation between the industrialised and the developing world through joint implementation of emission abatement promises substantial economic gains to both parties. In this context, the policy debate on joint implementation has addressed the question of how investment risks to project‐based emission crediting between industrialised countries and developing countries affect the magnitude and distribution of such gains. In our quantitative analysis, we find that the incorporation of country‐specific investment risks induces rather small changes vis‐à‐vis a situation where investment risks are neglected. Only if investors go for high safety of returns is there a noticeable decline in the overall volume of emission crediting and the associated total economic benefits. While the welfare effects of risk incorporation for industrialised countries are unequivocally negative, the implications across developing countries are ambiguous. Whereas low‐risk developing countries attract higher project volumes and benefit from higher effective prices per emission credit compared to a reference scenario without risk, the opposite applies to high‐risk countries. The – politically undesired – shift in comparative advantage of emission abatement against high‐risk, typically least‐developed, countries may become larger if risk‐averse investors perceive large differences in project‐based risks across countries. In this case, only very cheap mitigation projects in high‐risk countries will be realised, driving down the respective country's benefits from emission crediting to the advantage of low‐risk developing countries.

9.
We propose a tractable framework for quantifying the impact of loss‐triggered fire sales on portfolio risk, in a multi‐asset setting. We derive analytical expressions for the impact of fire sales on the realized volatility and correlations of asset returns in a fire sales scenario and show that our results provide a quantitative explanation for the spikes in volatility and correlations observed during such deleveraging episodes. These results are then used to develop an econometric framework for the forensic analysis of fire sales episodes, using observations of market prices. We give conditions for the identifiability of model parameters from time series of asset prices, propose a statistical test for the presence of fire sales, and an estimator for the magnitude of fire sales in each asset class. Pathwise consistency and large sample properties of the estimator are studied in the high‐frequency asymptotic regime. We illustrate our methodology by applying it to the forensic analysis of two recent deleveraging episodes: the Quant Crash of August 2007 and the Great Deleveraging following the default of Lehman Brothers in Fall 2008.

10.
We consider an asset whose risk‐neutral dynamics are described by a general class of local‐stochastic volatility models and derive a family of asymptotic expansions for European‐style option prices and implied volatilities. We also establish rigorous error estimates for these quantities. Our implied volatility expansions are explicit; they do not require any special functions nor do they require numerical integration. To illustrate the accuracy and versatility of our method, we implement it under four different model dynamics: constant elasticity of variance local volatility, Heston stochastic volatility, three‐halves stochastic volatility, and SABR local‐stochastic volatility.

11.
A credit valuation adjustment (CVA) is an adjustment applied to the value of a derivative contract or a portfolio of derivatives to account for counterparty credit risk. Measuring CVA requires combining models of market and credit risk to estimate a counterparty's risk of default together with the market value of exposure to the counterparty at default. Wrong‐way risk refers to the possibility that a counterparty's likelihood of default increases with the market value of the exposure. We develop a method for bounding wrong‐way risk, holding fixed marginal models for market and credit risk and varying the dependence between them. Given simulated paths of the two models, a linear program computes the worst‐case CVA. We analyze properties of the solution and prove convergence of the estimated bound as the number of paths increases. The worst case can be overly pessimistic, so we extend the procedure by constraining the deviation of the joint model from a baseline reference model. Measuring the deviation through relative entropy leads to a tractable convex optimization problem that can be solved through the iterative proportional fitting procedure. Here, too, we prove convergence of the resulting estimate of the penalized worst‐case CVA and the joint distribution that attains it. We consider extensions with additional constraints and illustrate the method with examples.
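The entropy-penalized worst case with fixed marginals has the structure of an entropic optimal-transport problem, so iterative proportional fitting reduces to Sinkhorn-style matrix scaling. A minimal sketch on a two-state default margin (the exposure grid, marginals, and penalty θ below are illustrative assumptions, not the paper's examples):

```python
import numpy as np

def ipf_worst_case(reward, a, b, theta, iters=500):
    """Maximise E_q[reward] - theta*KL(q || a⊗b) over joint laws q with fixed
    marginals a, b. The optimiser is q = diag(u) K diag(v) with
    K = (a⊗b)*exp(reward/theta), found by iterative proportional fitting."""
    K = np.outer(a, b) * np.exp(reward / theta)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(iters):
        u = a / (K @ v)        # rescale rows to match marginal a
        v = b / (K.T @ u)      # rescale columns to match marginal b
    return u[:, None] * K * v[None, :]

exposure = np.array([0.0, 1.0, 4.0])      # market-exposure scenarios
a = np.array([0.5, 0.3, 0.2])             # marginal law of exposure
b = np.array([0.95, 0.05])                # counterparty survives / defaults
reward = np.outer(exposure, [0.0, 1.0])   # loss realised only on default
q = ipf_worst_case(reward, a, b, theta=0.5)
print(q.sum(axis=1), q.sum(axis=0))       # marginals are preserved
print((q * reward).sum())                 # penalised worst-case CVA estimate
```

Under the independent coupling the CVA here is 1.1 × 0.05 = 0.055; the penalized worst case exceeds it because the fitted joint law shifts default probability toward the high-exposure scenarios, which is exactly wrong-way dependence.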

12.
Using tools from spectral analysis, singular and regular perturbation theory, we develop a systematic method for analytically computing the approximate price of a large class of derivative assets. The payoff of the derivative assets may be path‐dependent. In addition, the process underlying the derivatives may exhibit killing (i.e., jump to default) as well as combined local/nonlocal stochastic volatility. The nonlocal component of volatility may be multiscale, in the sense that it may be driven by one fast‐varying and one slow‐varying factor. The flexibility of our modeling framework is contrasted with the simplicity of our method. We reduce the derivative pricing problem to that of solving a single eigenvalue equation. Once the eigenvalue equation is solved, the approximate price of a derivative can be calculated formulaically. To illustrate our method, we calculate the approximate price of three derivative assets: a vanilla option on a defaultable stock, a path‐dependent option on a nondefaultable stock, and a bond in a short‐rate model.

13.
After reviewing the notion of Systemically Important Financial Institution, we propose a first principles way to compute the price of the implicit put option that the State gives to such an institution. Our method is based on important results from extreme value theory, one for the aggregation of heavy‐tailed distributions and the other for the tail behavior of the value at risk versus the tail value at risk. We show that the value of the put option is proportional to the value at risk of the institution and thus would provide the wrong incentive to banks who are qualified as Systemically Important Financial Institutions. This wrong incentive exists even if the guarantee is not explicitly granted. We conclude with a proposal to make the institution pay the price of this option to a fund, whose task would be to guarantee the orderly bankruptcy of such an institution. This fund would function like an insurer selling cover to clients.

14.
Accounting for model uncertainty in risk management and option pricing leads to infinite‐dimensional optimization problems that are both analytically and numerically intractable. In this article, we study when this hurdle can be overcome for the so‐called optimized certainty equivalent (OCE) risk measure—including the average value‐at‐risk as a special case. First, we focus on the case where the uncertainty is modeled by a nonlinear expectation that penalizes distributions that are “far” in terms of optimal‐transport distance (e.g. Wasserstein distance) from a given baseline distribution. It turns out that the computation of the robust OCE reduces to a finite‐dimensional problem, which in some cases can even be solved explicitly. This principle also applies to the shortfall risk measure as well as to the pricing of European options. Further, we derive convex dual representations of the robust OCE for measurable claims without any assumptions on the set of distributions. Finally, we give conditions on the latter set under which the robust average value‐at‐risk is a tail risk measure.
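The average value-at-risk special case already shows the finite-dimensional structure: as an OCE it is a one-dimensional minimization, AVaR_α(L) = inf over t of { t + E[(L − t)⁺]/α } (the Rockafellar–Uryasev form). A minimal sketch of the nominal (non-robust) problem; the robust version in the paper reduces to a similarly finite-dimensional program:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def avar(losses, alpha):
    """Average value-at-risk via its OCE representation
       AVaR_alpha(L) = inf_t { t + E[(L - t)^+] / alpha }."""
    losses = np.asarray(losses, float)
    obj = lambda t: t + np.maximum(losses - t, 0.0).mean() / alpha
    res = minimize_scalar(obj, bounds=(losses.min(), losses.max()),
                          method="bounded")
    return res.fun

losses = np.array([0.0, 1.0, 2.0, 10.0])   # four equally likely loss scenarios
print(avar(losses, alpha=0.25))            # mean of worst 25% of outcomes = 10
print(avar(losses, alpha=0.50))            # mean of worst 50% of outcomes = 6
```

The infinite-dimensional optimization over distributions has collapsed to a scalar search over t; the paper shows that adding Wasserstein ambiguity preserves this kind of finite-dimensional reduction.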

15.
We introduce a general model for the balance‐sheet consistent valuation of interbank claims within an interconnected financial system. Our model represents an extension of clearing models of interdependent liabilities to account for the presence of uncertainty on banks' external assets. At the same time, it also provides a natural extension of classic structural credit risk models to the case of an interconnected system. We characterize the existence and uniqueness of a valuation that maximizes individual and total equity values for all banks. We apply our model to the assessment of systemic risk and in particular for the case of stress testing. Further, we provide a fixed‐point algorithm to carry out the network valuation and the conditions for its convergence.
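The fixed-point structure of network valuation can be illustrated with a deterministic Eisenberg–Noe style clearing iteration (an assumption for illustration; the paper's valuation map additionally handles uncertain external assets):

```python
import numpy as np

def clearing_payments(L, external, iters=200):
    """Picard iteration for a clearing payment vector: bank i pays
    min(total liabilities, external assets + interbank recoveries)."""
    pbar = L.sum(axis=1)                              # nominal total liabilities
    with np.errstate(invalid="ignore", divide="ignore"):
        Pi = np.where(pbar[:, None] > 0, L / pbar[:, None], 0.0)  # relative liabilities
    p = pbar.copy()
    for _ in range(iters):
        p = np.minimum(pbar, external + Pi.T @ p)     # valuation map, iterated
    return p

# hypothetical 3-bank system: bank 0 owes 1, bank 1 owes 2, bank 2 owes nobody
L = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
external = np.array([0.4, 0.5, 1.0])
p = clearing_payments(L, external)
print(p)   # [0.4, 0.9, 0.0]: bank 0's shortfall propagates to bank 1
```

Bank 0 can only pay 0.4 of its liability of 1, so bank 1 receives 0.4 instead of 1 and can in turn only pay 0.9: the iteration makes this contagion channel and its fixed point explicit, which is what the paper's algorithm generalizes to uncertain asset values.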

16.
This paper considers the problem of risk sharing, where a coalition of homogeneous agents, each bearing a random cost, aggregates their costs, and shares the value‐at‐risk of such a risky position. Due to limited distributional information in practice, the joint distribution of agents' random costs is difficult to acquire. The coalition, being aware of the distributional ambiguity, thus evaluates the worst‐case value‐at‐risk within a commonly agreed ambiguity set of the possible joint distributions. Through the lens of cooperative game theory, we show that this coalitional worst‐case value‐at‐risk is subadditive for the popular ambiguity sets in the distributionally robust optimization literature that are based on (i) convex moments or (ii) Wasserstein distance to some reference distributions. In addition, we propose easy‐to‐compute core allocation schemes to share the worst‐case value‐at‐risk. Our results can be readily extended to sharing the worst‐case conditional value‐at‐risk under distributional ambiguity.
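For the moment-based case the subadditivity is easy to check numerically: over all laws with a given mean and standard deviation, the worst-case VaR is the Cantelli bound μ + σ√((1 − α)/α), and since the coalition's standard deviation is at most the sum of individual standard deviations, pooling can only help. A minimal sketch (the two-agent mean/covariance numbers are illustrative assumptions):

```python
import numpy as np

def wc_var(mu, sigma, alpha):
    """Worst-case VaR over all laws with given mean and std (Cantelli bound):
       WVaR_alpha = mu + sigma * sqrt((1 - alpha) / alpha)."""
    return mu + sigma * np.sqrt((1 - alpha) / alpha)

alpha = 0.05
mu = np.array([1.0, 2.0])                    # agents' mean costs
cov = np.array([[4.0, 1.0],
                [1.0, 9.0]])                 # assumed cost covariance
sd = np.sqrt(np.diag(cov))
coalition = wc_var(mu.sum(), np.sqrt(np.ones(2) @ cov @ np.ones(2)), alpha)
standalone = wc_var(mu, sd, alpha).sum()
print(coalition, standalone)                 # coalition <= sum of standalone
```

Here the coalition's worst-case VaR (≈ 19.9) is below the sum of the agents' standalone worst-case VaRs (≈ 24.8), consistent with the subadditivity the paper establishes for moment and Wasserstein ambiguity sets.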

17.
In this paper, we study the aggregate risk of inhomogeneous risks with dependence uncertainty, evaluated by a generic risk measure. We say that a pair of risk measures is asymptotically equivalent if the ratio of the worst‐case values of the two risk measures is almost one for the sum of a large number of risks with unknown dependence structure. The study of asymptotic equivalence is particularly important for a pair of a noncoherent risk measure and a coherent risk measure, as the worst‐case value of a noncoherent risk measure under dependence uncertainty is typically difficult to obtain. The main contribution of this paper is to establish general asymptotic equivalence results for the classes of distortion risk measures and convex risk measures under different mild conditions. The results implicitly suggest that it is only reasonable to implement a coherent risk measure for the aggregation of a large number of risks with uncertainty in the dependence structure, a relevant situation for risk management practice.

18.
In this paper, we propose an aggregation procedure, which includes two steps. In the first one, each of the group's members individually elaborates a preorder (total or partial) on the set of alternatives. These preorders can be obtained, for example, by applying a multicriterion aggregation procedure such as ELECTRE III, PROMETHEE I or II. In the second step, these individual preorders (total or partial) are aggregated into a collective preorder (total or partial). For this purpose, we first build a distance measure between pairs of binary relations in order to quantify the divergence between two preorders. Then we determine, for each pair of alternatives, the relative importance coefficients of the group's members by using a method based on subjective and objective components. These coefficients and the distance measure are finally exploited by an iterative algorithm in order to elaborate a collective preorder synthesizing the individual preorders.
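The pairwise core of such a procedure can be sketched as follows. This is a simplified stand-in: the per-pair distance values and the pair-by-pair minimization below are assumptions for illustration, not the paper's exact metric or iterative algorithm (which also has to reconcile the per-pair choices into a transitive preorder).

```python
from itertools import combinations

# relation codes on a pair (x, y):
# 'P' x preferred, 'p' y preferred, 'I' indifferent, 'R' incomparable
def pair_distance(r1, r2):
    """Assumed per-pair divergence: 0 if identical, 1 if a strict
    preference is reversed, 0.5 for any other disagreement."""
    if r1 == r2:
        return 0.0
    if {r1, r2} == {'P', 'p'}:
        return 1.0
    return 0.5

def aggregate(preorders, weights):
    """For each pair of alternatives, choose the collective relation that
    minimises the weighted sum of distances to the individual preorders."""
    alts = sorted({a for po in preorders for pair in po for a in pair})
    collective = {}
    for pair in combinations(alts, 2):
        collective[pair] = min('PpIR', key=lambda r: sum(
            w * pair_distance(r, po[pair]) for w, po in zip(weights, preorders)))
    return collective

# three equally weighted members comparing alternatives a and b (hypothetical)
po1 = {('a', 'b'): 'P'}
po2 = {('a', 'b'): 'P'}
po3 = {('a', 'b'): 'p'}
print(aggregate([po1, po2, po3], weights=[1.0, 1.0, 1.0]))
```

With two members preferring a and one preferring b, the minimum-distance collective relation on (a, b) is 'P'; the paper's iterative algorithm plays this role while also enforcing consistency across pairs.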

19.
This study examines the impacts of unexpected exchange rate movements of the Chinese renminbi (CNY/USD) on stock returns of Standard & Poor's 500 corporations. To gain better insight into the long‐standing exposure puzzle concerning the divergence between theoretical and empirical investigation and the critical issue regarding the determinants of exposure, the time variation and asymmetry property associated with foreign exchange exposure are further taken into account. We also broaden the scope of analysis to investigate unexpected risk exposure across 10 economic sectors in compliance with the Global Industry Classification Standard. Unlike those from the static analysis or dynamic ones with the ordinary least squares model, the empirical findings provide additional concrete evidence that unexpected exchange rate exposure varies substantially across time and market circumstances when using the quantile regression with a rolling method. Furthermore, our dynamic analysis not only supports the existence of industrial effects across different economic sectors, but also provides strong evidence that financial factors, such as size, debt ratio and book to market ratio, have a significant impact on exposure.

20.
We propose a nonparametric kernel estimation method (KEM) that determines the optimal hedge ratio by minimizing the downside risk of a hedged portfolio, measured by conditional value‐at‐risk (CVaR). We also demonstrate that the KEM minimum‐CVaR hedge model is a convex optimization. The simulation results show that our KEM provides more accurate estimations and the empirical results suggest that, compared to other conventional methods, our KEM yields higher effectiveness in hedging the downside risk in the weather‐sensitive markets.  
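A minimum-CVaR hedge with kernel smoothing can be sketched as follows. This uses the Rockafellar–Uryasev form of CVaR with a Gaussian-kernel-smoothed hinge function; the bandwidth, the simulated return model, and this particular smoothing are illustrative assumptions, not necessarily the paper's KEM construction.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def smoothed_cvar_hedge(spot, fut, alpha=0.05, bandwidth=0.002):
    """Hedge ratio minimising a kernel-smoothed CVaR of spot - h*futures,
    via CVaR_alpha = min_t { t + E[(loss - t)^+] / alpha }."""
    def smooth_plus(x):                  # E[(x + b*Z)^+] for Z ~ N(0, 1):
        z = x / bandwidth                # a Gaussian-smoothed hinge function
        return x * norm.cdf(z) + bandwidth * norm.pdf(z)
    def objective(theta):
        h, t = theta
        loss = -(spot - h * fut)         # hedged-portfolio loss
        return t + smooth_plus(loss - t).mean() / alpha
    res = minimize(objective, x0=[1.0, 0.01], method="Nelder-Mead")
    return res.x[0], res.fun             # hedge ratio and minimised CVaR

rng = np.random.default_rng(1)
fut = rng.normal(0.0, 0.02, 20_000)                 # simulated futures returns
spot = 0.8 * fut + rng.normal(0.0, 0.005, 20_000)   # true exposure beta = 0.8
h, cvar = smoothed_cvar_hedge(spot, fut)
print(h, cvar)
```

With jointly Gaussian returns the minimum-CVaR hedge coincides with the minimum-variance hedge, so the estimated ratio lands near the true beta of 0.8; the smoothed hinge keeps the objective differentiable, consistent with the convexity the paper establishes.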


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号