Similar Literature
A total of 20 similar articles were retrieved.
1.
This paper examines financial contagion, that is, whether the cross-market linkages in financial markets increase after a shock to a country. We use a new measure of local dependence (introduced by Tjøstheim and Hufthammer (2013)) to study the contagion effect. The central idea of the new approach is to approximate an arbitrary bivariate return distribution by a family of Gaussian bivariate distributions. At each point of the return distribution there is a Gaussian distribution that gives a good approximation at that point. The correlation of the approximating Gaussian distribution is taken as the local correlation in that neighbourhood. By examining the local Gaussian correlation before the shock (in a stable period) and after the shock (in the crisis period), we are able to test, via a bootstrap testing procedure, whether contagion has occurred. The use of local Gaussian correlation is compared to other methods of studying contagion. Further, the bootstrap test is examined in a Monte Carlo study and shows good level and power properties. We illustrate our approach by re-examining the Mexican crisis of 1994, the Asian crisis of 1997–1998 and the financial crisis of 2007–2009. We find evidence of contagion based on our new procedure and are able to describe the nonlinear dependence structure of these crises.
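The bootstrap logic of such a contagion test can be sketched in a few lines. The example below is a simplified illustration on synthetic data: it compares ordinary Pearson correlation before and after a shock rather than the paper's local Gaussian correlation, and all data and function names are hypothetical.

```python
# Minimal sketch of a bootstrap contagion test, assuming synthetic return series.
# Uses plain Pearson correlation instead of local Gaussian correlation; only the
# resampling logic is illustrated.
import numpy as np

rng = np.random.default_rng(0)

def corr(x, y):
    return np.corrcoef(x, y)[0, 1]

def contagion_bootstrap_test(stable, crisis, n_boot=2000):
    """One-sided test of H0: correlation does not increase in the crisis period."""
    obs_diff = corr(*crisis.T) - corr(*stable.T)
    pooled = np.vstack([stable, crisis])
    n_s, n_c = len(stable), len(crisis)
    count = 0
    for _ in range(n_boot):
        # Resample pairs with replacement under the null of a common distribution.
        idx = rng.integers(0, len(pooled), size=n_s + n_c)
        sample = pooled[idx]
        diff = corr(*sample[n_s:].T) - corr(*sample[:n_s].T)
        count += diff >= obs_diff
    return obs_diff, count / n_boot   # observed increase and bootstrap p-value

# Synthetic example: dependence strengthens after the shock.
stable = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=500)
crisis = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=250)
diff, p = contagion_bootstrap_test(stable, crisis)
print(f"correlation increase: {diff:.3f}, bootstrap p-value: {p:.3f}")
```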

2.
The Markowitz critical line method for mean–variance portfolio construction has remained highly influential since its introduction to the finance world six decades ago. The Markowitz algorithm is so versatile and computationally efficient that it can accommodate any number of linear constraints in addition to full allocation of investment funds and disallowance of short sales. For the Markowitz algorithm to work, the covariance matrix of returns, which is positive semi-definite, need not be positive definite. As a positive semi-definite matrix may not be invertible, it is intriguing that the Markowitz algorithm always works, although matrix inversion is required in each step of the iterative procedure involved. By examining some relevant algebraic features in the Markowitz algorithm, this paper is able to identify and explain intuitively the consequences of relaxing the positive definiteness requirement, and to draw some implications from the perspective of portfolio diversification. For the examination, the sample covariance matrix is based on insufficient return observations and is thus positive semi-definite but not positive definite. The results of the examination can facilitate a better understanding of the inner workings of the highly sophisticated Markowitz approach by the many investors who use it as a tool to assist portfolio decisions and by the many students who are introduced pedagogically to its special cases.
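The singularity issue described above is easy to reproduce numerically. The snippet below (not from the paper; dimensions and data are illustrative) shows that a sample covariance matrix estimated from fewer observations than assets is positive semi-definite but rank-deficient, so ordinary inversion fails even though a pseudo-inverse still exists.

```python
# Illustration: sample covariance from T < N observations is PSD but singular.
import numpy as np

rng = np.random.default_rng(1)
n_assets, n_obs = 10, 6                      # fewer return observations than assets
returns = rng.normal(size=(n_obs, n_assets))

cov = np.cov(returns, rowvar=False)          # 10 x 10 sample covariance matrix
eigvals = np.linalg.eigvalsh(cov)

print("smallest eigenvalues:", np.round(eigvals[:5], 6))
print("rank:", np.linalg.matrix_rank(cov), "out of", n_assets)
# Rank is at most n_obs - 1 = 5, so several eigenvalues are (numerically) zero
# and a plain inverse does not exist; a Moore-Penrose pseudo-inverse does.
pinv = np.linalg.pinv(cov)
```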

3.
This paper studies the extreme risk and dependence of returns on the CSI 300 index and its simulated index-futures contracts. A DCC-GARCH model is used to describe the dynamic conditional correlation between the index futures and the spot index, and four commonly used copula functions are fitted with extreme-value distributions as the marginals; the Frank copula provides the best fit, followed by the Clayton copula. On this basis, the VaR and CVaR of different portfolios are measured, and the relationship between portfolio weights and risk is found to be U-shaped, which also offers a new approach to hedging with stock-index futures.
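The weight-versus-risk inspection at the end of the abstract can be sketched without the full DCC-GARCH/copula machinery. The example below is a simplified historical-simulation version on synthetic spot and futures returns (all parameters are illustrative assumptions): it computes VaR and CVaR over a grid of portfolio weights, which is how a U-shaped weight-risk profile would be detected.

```python
# Simplified sketch: historical VaR/CVaR of a spot/futures portfolio over a weight grid.
# Synthetic, highly correlated return series stand in for the index and its futures.
import numpy as np

rng = np.random.default_rng(2)
n = 2000
spot = rng.standard_t(df=5, size=n) * 0.01
futures = 0.9 * spot + rng.standard_t(df=5, size=n) * 0.004

def var_cvar(pnl, alpha=0.95):
    """Historical-simulation VaR and CVaR (expected shortfall) at level alpha."""
    losses = -pnl
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

for w in np.linspace(0.0, 1.0, 6):          # weight on the spot leg
    port = w * spot - (1 - w) * futures     # short futures leg acts as the hedge
    var, cvar = var_cvar(port)
    print(f"w_spot={w:.1f}  VaR95={var:.4f}  CVaR95={cvar:.4f}")
```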

4.
In this paper, we apply tools from random matrix theory (RMT) to estimates of correlations across the volatility of various assets in the S&P 500. The volatility inputs are estimated by modelling price fluctuations as a GARCH(1,1) process. The corresponding volatility correlation matrix is then constructed. It is found that the distribution of a significant number of eigenvalues of the volatility correlation matrix matches the analytical result from RMT. Furthermore, the empirical estimates of short- and long-range correlations amongst eigenvalues, which are within RMT bounds, match the analytical results for the Gaussian Orthogonal Ensemble of RMT. To understand the information content of the largest eigenvectors, we estimate the contribution of the Global Industry Classification Standard industry groups to each eigenvector. In comparison with the eigenvectors of the correlation matrix for price fluctuations, only a few of the largest eigenvectors of the volatility correlation matrix are dominated by a single industry group. We also study correlations between ‘volatility returns’ and log-volatility to find similar results.
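The basic RMT comparison is the Marchenko–Pastur band: eigenvalues of a purely random correlation matrix fall inside known bounds, so eigenvalues outside the band carry genuine structure. The sketch below uses simulated data as a stand-in for the GARCH volatility series; sizes and names are assumptions.

```python
# Sketch: compare eigenvalues of an empirical correlation matrix with the
# Marchenko-Pastur bounds for a purely random matrix.
import numpy as np

rng = np.random.default_rng(3)
n_assets, n_obs = 100, 500
q = n_assets / n_obs

x = rng.normal(size=(n_obs, n_assets))       # stand-in for volatility series
corr = np.corrcoef(x, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)

# Marchenko-Pastur support for a random correlation matrix with ratio q = N/T.
lam_min = (1 - np.sqrt(q)) ** 2
lam_max = (1 + np.sqrt(q)) ** 2
inside = np.mean((eigvals >= lam_min) & (eigvals <= lam_max))

print(f"MP bounds: [{lam_min:.3f}, {lam_max:.3f}]")
print(f"fraction of eigenvalues inside the RMT band: {inside:.2%}")
# Eigenvalues above lam_max reflect real structure (market mode, industry groups);
# those inside the band are statistically indistinguishable from noise.
```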

5.
The manner in which a group of insurance risks are interrelated is commonly presented via a correlation matrix. Actuarial risk correlation matrices are often constructed using output from disparate modeling sources and can be subjectively adjusted, for example, increasing the estimated correlation between two risk sources to confer reserving prudence. Hence, while individual elements still obey the assumptions of correlation values, the overall matrix is often not mathematically valid (not positive semidefinite). This can prove problematic in using the matrix in statistical models. The first objective of this article is to review existing techniques that address the nearest positive semidefinite matrix problem in a very general setting. The chief approaches studied are Semidefinite Programming (SDP) and the Alternating Projections Method (APM). The second objective is to finesse the original problem specification to consider imposition of a block structure on the initial risk correlation matrix. This commonly employed technique identifies off-diagonal subsets of the matrix where values can or should be set equal to some constant. This may be due to similarity of the underlying risks and/or with the goal of increasing computational efficiency for processes involving large matrices. Implementation of further linear constraints of this nature requires adaptation of the standard SDP and APM algorithms. In addition, a new Shrinking Method is proposed to provide an alternative solution in the context of this increased complexity. “Nearness” is primarily considered in terms of two summary measures for differences between matrices: the Chebychev norm (maximum element distance) and the Frobenius norm (sum of squared element distances). Among the existing methods, adapted to function appropriately for actuarial risk matrices, APM is extremely efficient in producing solutions that are optimal in the Frobenius norm. An efficient algorithm that would return a positive semidefinite matrix that is optimal in the Chebychev norm is currently unknown. However, APM is used to highlight the existence of matrices close to such an optimum and exploited, via the Shrinking Method, to find high-quality solutions. All methods are shown to work well both on artificial and real actuarial risk matrices provided under collaboration with Tokio Marine Kiln (TMK). Convergence speeds are calculated and compared, and sample data and MATLAB code are provided. Ultimately the APM is identified as being superior in Frobenius distance and convergence speed. The Shrinking Method, building on the output of the APM algorithm, is demonstrated to provide excellent results at low computational cost for minimizing Chebychev distance.
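A minimal version of the Alternating Projections Method (Higham-style, with Dykstra's correction) fits in a short function. The sketch below targets the plain nearest-correlation-matrix problem in the Frobenius norm; the block-structure constraints and Chebychev-norm refinements discussed in the paper are not included, and the input matrix is an invented example.

```python
# Minimal APM sketch for the nearest correlation matrix in the Frobenius norm.
import numpy as np

def nearest_correlation_apm(a, max_iter=200, tol=1e-8):
    """Alternate projections onto the PSD cone and the unit-diagonal set."""
    y = a.copy()
    ds = np.zeros_like(a)                      # Dykstra correction term
    for _ in range(max_iter):
        r = y - ds
        # Projection onto the positive semidefinite cone (clip negative eigenvalues).
        w, v = np.linalg.eigh(r)
        x = (v * np.clip(w, 0, None)) @ v.T
        ds = x - r
        # Projection onto symmetric matrices with unit diagonal.
        y_new = x.copy()
        np.fill_diagonal(y_new, 1.0)
        if np.linalg.norm(y_new - y, ord='fro') < tol:
            return y_new
        y = y_new
    return y

# A subjectively adjusted (and therefore invalid) risk correlation matrix.
bad = np.array([[1.0, 0.9, 0.2],
                [0.9, 1.0, 0.9],
                [0.2, 0.9, 1.0]])
fixed = nearest_correlation_apm(bad)
print(np.round(fixed, 4))
print("min eigenvalue:", np.linalg.eigvalsh(fixed).min())
```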

6.
Extreme Correlation of International Equity Markets
Testing the hypothesis that international equity market correlation increases in volatile times is a difficult exercise and misleading results have often been reported in the past because of a spurious relationship between correlation and volatility. Using "extreme value theory" to model the multivariate distribution tails, we derive the distribution of extreme correlation for a wide class of return distributions. Empirically, we reject the null hypothesis of multivariate normality for the negative tail, but not for the positive tail. We also find that correlation is not related to market volatility per se but to the market trend. Correlation increases in bear markets, but not in bull markets.
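A simple empirical counterpart to the quantities studied above is the exceedance correlation: the correlation computed only on observations where both series fall in the same tail. The sketch below uses simulated bivariate normal data and does not reproduce the paper's extreme-value-theory derivation of the asymptotic distribution.

```python
# Simplified exceedance-correlation sketch on simulated bivariate normal returns.
import numpy as np

rng = np.random.default_rng(4)
cov = [[1.0, 0.5], [0.5, 1.0]]
x, y = rng.multivariate_normal([0, 0], cov, size=5000).T

def exceedance_corr(x, y, q, lower=True):
    """Correlation conditional on both series being in the same tail."""
    tx, ty = np.quantile(x, q), np.quantile(y, q)
    mask = (x <= tx) & (y <= ty) if lower else (x >= tx) & (y >= ty)
    if mask.sum() < 10:
        return np.nan
    return np.corrcoef(x[mask], y[mask])[0, 1]

for q in (0.05, 0.10, 0.25):
    lo = exceedance_corr(x, y, q, lower=True)
    hi = exceedance_corr(x, y, 1 - q, lower=False)
    print(f"q={q:.2f}  lower-tail corr={lo:.3f}  upper-tail corr={hi:.3f}")
# Under bivariate normality both tail correlations shrink toward zero at equal
# rates; asymmetry between negative and positive tails in real data is the
# pattern the paper tests for.
```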

7.
Financial time series have two features which, in many cases, prevent the use of conventional estimators of volatilities and correlations: leptokurtic distributions and contamination of data with outliers. Other techniques are required to achieve stable and accurate results. In this paper, we review robust estimators for volatilities and correlations and identify those best suited for use in risk management. The selection criteria were that the estimator should be stable both to fractionally small departures for all data points (fat tails) and to fractionally large departures for a small number of data points (outliers). Since risk management typically deals with thousands of time series at once, another major requirement was that the approach be independent of any manual correction or data pre-processing. We recommend using volatility t-estimators, for which we derived the estimation error formula for the case when the exact shape of the data distribution is unknown. A convenient robust estimator for correlations is Kendall's tau, whose drawback is that it does not guarantee the positivity of the correlation matrix. We chose to use geometric optimization that overcomes this problem by finding the closest correlation matrix to a given matrix in terms of the Hadamard norm. We propose the weights for the norm and demonstrate the efficiency of the algorithm on large-scale problems.
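The correlation part of this pipeline can be sketched as follows: pairwise Kendall's tau, the Gaussian mapping rho = sin(pi*tau/2), and a repair step for positive semidefiniteness. The repair here is a plain eigenvalue clipping, not the paper's Hadamard-norm geometric optimization, and the data are synthetic.

```python
# Robust correlation sketch: Kendall's tau -> Pearson-equivalent -> PSD repair.
import numpy as np
from scipy.stats import kendalltau

def robust_correlation_matrix(returns):
    n = returns.shape[1]
    rho = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            tau, _ = kendalltau(returns[:, i], returns[:, j])
            rho[i, j] = rho[j, i] = np.sin(0.5 * np.pi * tau)   # Gaussian relation
    # Eigenvalue clipping to restore positive semidefiniteness (simple stand-in
    # for the Hadamard-norm geometric optimization used in the paper).
    w, v = np.linalg.eigh(rho)
    rho_psd = (v * np.clip(w, 1e-10, None)) @ v.T
    d = np.sqrt(np.diag(rho_psd))
    return rho_psd / np.outer(d, d)            # rescale to unit diagonal

rng = np.random.default_rng(5)
data = rng.standard_t(df=3, size=(300, 5))     # fat-tailed synthetic returns
print(np.round(robust_correlation_matrix(data), 3))
```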

8.
Estimating market risk is conditioned by the fat tail of the distribution of returns, but the tail index depends on the threshold chosen for that fat tail. We propose a methodology based on decomposing the series into positive outliers, a Gaussian central part and negative outliers, and use the latter to estimate this cutoff point. Additionally, from this decomposition we estimate an extreme-dependence correlation matrix, which is used in the measurement of portfolio risk. For a sample consisting of six assets (Bitcoin, Gold, Brent, Standard & Poor's 500, Nasdaq and a Real Estate index), we find that our methodology presents better results, in terms of normality and volatility of the tail index, than the Kolmogorov–Smirnov distance, and its unnecessary capital consumption is lower. Also, in the measurement of the risk of a portfolio, the results of our proposal improve on those of a t-Student copula and allow us to estimate the extreme dependence and the corresponding indexes while avoiding the implicit restrictions of the elliptical and Archimedean copulas.
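The threshold-sensitivity problem motivating the paper is easy to see with the Hill estimator. The sketch below computes the tail index of the negative tail for several cutoffs on synthetic Student-t returns; it does not implement the paper's outlier-decomposition rule for choosing the cutoff, only the instability that rule is meant to reduce.

```python
# Hill tail-index estimates for a range of cutoffs on the negative tail.
import numpy as np

rng = np.random.default_rng(6)
returns = rng.standard_t(df=4, size=5000) * 0.01   # fat-tailed synthetic returns
losses = np.sort(-returns[returns < 0])[::-1]      # negative tail as descending losses

def hill_tail_index(losses_desc, k):
    """Hill estimator using the k largest losses (input sorted descending)."""
    top = losses_desc[:k]
    threshold = losses_desc[k]
    return 1.0 / np.mean(np.log(top / threshold))

for k in (50, 100, 200, 400):
    print(f"k={k:4d}  tail index ~ {hill_tail_index(losses, k):.2f}")
# For Student-t(4) data the true tail index is 4; estimates drift as the cutoff
# moves, which is the instability the proposed decomposition aims to reduce.
```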

9.
The positive correlation (PC) test is the standard procedure used in the empirical literature to detect the existence of asymmetric information in insurance markets. This article describes a new tool to implement an extension of the PC test based on a new family of regression models, the multivariate ordered logit, designed to study how the joint distribution of two or more ordered response variables depends on exogenous covariates. We present an application of our proposed extension of the PC test to the Medigap health insurance market in the United States. Results reveal that the risk–coverage association is not homogeneous across coverage and risk categories, and depends on individual socioeconomic and risk preference characteristics.

10.
We propose a novel methodology to define, analyze and forecast market states. In our approach, market states are identified by a reference sparse precision matrix and a vector of expectation values. In our procedure, each multivariate observation is associated to a given market state according to the minimization of a penalized Mahalanobis distance. The procedure is made computationally very efficient and can be used with a large number of assets. We demonstrate that this procedure is successful at clustering different states of the markets in an unsupervised manner. In particular, we describe an experiment with one hundred log-returns and two states in which the methodology automatically associates states prevalently to pre- and post-crisis periods, with one state gathering periods with average positive returns and the other state periods with average negative returns, therefore discovering spontaneously the common classification of ‘bull’ and ‘bear’ markets. In another experiment, again with one hundred log-returns and two states, we demonstrate that this procedure can be efficiently used to forecast out-of-sample future market states with significant prediction accuracy. This methodology opens the way to a range of applications in risk management and trading strategies in the context where the correlation structure plays a central role.
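The state-assignment step can be sketched compactly, assuming two reference states have already been estimated. In the sketch below, sklearn's GraphicalLasso stands in for the sparse precision estimation, the clustering/refit loop is omitted, and the two synthetic regimes, penalty term and all parameter values are assumptions for illustration only.

```python
# Sketch: assign an observation to the market state minimizing a penalized
# Mahalanobis distance, with sparse precision matrices from graphical lasso.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(7)
n_assets = 20
bull = rng.normal(0.1, 1.0, size=(300, n_assets))     # positive-mean regime (returns in %)
bear = rng.normal(-0.2, 2.0, size=(150, n_assets))    # negative-mean, high-volatility regime

states = []
for sample in (bull, bear):
    gl = GraphicalLasso(alpha=0.01).fit(sample)
    states.append((sample.mean(axis=0), gl.precision_))

def assign_state(x, states):
    """Index of the state with the smaller Gaussian-likelihood-based distance."""
    dists = []
    for mu, prec in states:
        d = (x - mu) @ prec @ (x - mu)
        dists.append(d - np.linalg.slogdet(prec)[1])   # Mahalanobis term minus log det(P)
    return int(np.argmin(dists))

new_obs = rng.normal(-0.2, 2.0, size=n_assets)         # a bear-like observation
print("assigned state:", assign_state(new_obs, states))  # expect 1
```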

11.
We used survey forecasts of the dollar/euro and yen/dollar exchange rates to analyze the correlation of the skewness of the distribution of heterogeneous forecasts with exchange-rate movements. Using various measures of skewness, we found a negative correlation between the skewness of 1-month-ahead forecasts and exchange-rate movements. In contrast, the correlation between the skewness of 12-month-ahead forecasts and exchange-rate movements is positive. The negative correlation arising for 1-month-ahead forecasts is consistent with expected mean reversion in exchange rates. The positive correlation arising for longer-term forecasts, in turn, is consistent with longer-term bandwagon effects.

12.
In risk management, modelling large numbers of assets and their variances and covariances in a unified framework is often important. In such multivariate frameworks, it is difficult to incorporate GARCH models and thus a new member of the ARCH family, Orthogonal GARCH, has been suggested as a remedy to inherent estimation problems in multivariate ARCH modelling. Orthogonal GARCH creates positive definite covariance matrices of any size but builds on assumptions that partly break down during stress scenarios. This article therefore assesses the stress performance of the model by looking at four Nordic stock indices and covariance matrix forecasts during the highly volatile years of 1997 and 1998. Overall, Orthogonal GARCH is found to perform significantly better than traditional historical variance and moving average methods. Out-of-sample evaluation measures include symmetric loss functions (RMSE), asymmetric loss functions, operational methods suggested by the Basle Committee on Banking Supervision, as well as a forecast evaluation methodology based on pricing of simulated ‘rainbow options’.
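The mechanics of Orthogonal GARCH (rotate returns onto principal components, forecast a univariate GARCH(1,1) variance for each component, rotate back into a covariance matrix) can be sketched as below. This is a stylized illustration: GARCH parameters are fixed by hand rather than estimated by maximum likelihood, and the input data are synthetic stand-ins for the four Nordic indices.

```python
# Stylized Orthogonal GARCH sketch: PCA + univariate GARCH(1,1) per component.
import numpy as np

def garch11_forecast(x, omega=1e-6, alpha=0.08, beta=0.90):
    """One-step-ahead GARCH(1,1) variance forecast with fixed parameters."""
    h = np.var(x)
    for r in x:
        h = omega + alpha * r**2 + beta * h
    return h

def orthogonal_garch_covariance(returns, n_factors=3):
    demeaned = returns - returns.mean(axis=0)
    cov = np.cov(demeaned, rowvar=False)
    w, v = np.linalg.eigh(cov)
    load = v[:, ::-1][:, :n_factors]               # leading principal components
    pcs = demeaned @ load                           # component scores
    h = np.array([garch11_forecast(pcs[:, k]) for k in range(n_factors)])
    return load @ np.diag(h) @ load.T               # PSD covariance forecast by construction

rng = np.random.default_rng(8)
rets = rng.normal(0, 0.01, size=(500, 4))            # stand-in for four Nordic indices
print(np.round(orthogonal_garch_covariance(rets), 8))
```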

13.
Revisiting the framework of Barillas and Shanken (2018, Comparing asset pricing models, The Journal of Finance 73, 715–754), henceforth BS, we show that the Bayesian marginal likelihood-based model comparison method in that paper is unsound: the priors on the nuisance parameters across models must satisfy a change-of-variable property for densities that is violated by the Jeffreys priors used in the BS method. Extensive simulation exercises confirm that the BS method performs unsatisfactorily. We derive a new class of improper priors on the nuisance parameters, starting from a single improper prior, which leads to valid marginal likelihoods and model comparisons. The performance of our marginal likelihoods is significantly better, allowing for reliable Bayesian inference about which factors are risk factors in asset pricing models.

14.
We set up a new kind of model to price multi-asset options. A square-root process fluctuating around its mean value is introduced to describe the random evolution of the correlation between two assets. In this stochastic correlation model with a mean-reversion term, the correlation is a random walk within the region from −1 to 1 and is centered around its equilibrium value. The trading strategy to hedge the correlation risk is discussed. Since a solution of the high-dimensional partial differential equation may be impossible to obtain, Quasi-Monte Carlo and Monte Carlo methods are introduced to compute the multi-asset option price as well. Taking a better-of-two-assets rainbow option as an example, we compare our results with the price obtained from the Black–Scholes model with constant correlation.
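A Monte Carlo sketch of this kind of pricing problem is given below: two geometric Brownian motions whose instantaneous correlation follows a mean-reverting process kept inside (−1, 1), used to price a better-of-two-assets rainbow payoff. The correlation dynamics and all parameter values here are illustrative assumptions, not the paper's specification.

```python
# Monte Carlo sketch: better-of rainbow option under a stochastic correlation.
import numpy as np

def price_better_of_rainbow(s0=(100.0, 100.0), r=0.02, sigma=(0.2, 0.3),
                            rho0=0.3, rho_bar=0.3, kappa=2.0, xi=0.3,
                            T=1.0, steps=252, n_paths=50_000, seed=9):
    rng = np.random.default_rng(seed)
    dt = T / steps
    s1 = np.full(n_paths, s0[0])
    s2 = np.full(n_paths, s0[1])
    rho = np.full(n_paths, rho0)
    for _ in range(steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rng.standard_normal(n_paths)
        zr = rng.standard_normal(n_paths)
        w2 = rho * z1 + np.sqrt(1 - rho**2) * z2        # correlated driver for asset 2
        s1 *= np.exp((r - 0.5 * sigma[0]**2) * dt + sigma[0] * np.sqrt(dt) * z1)
        s2 *= np.exp((r - 0.5 * sigma[1]**2) * dt + sigma[1] * np.sqrt(dt) * w2)
        # Mean-reverting, square-root-type dynamics keeping the correlation in (-1, 1).
        rho += kappa * (rho_bar - rho) * dt \
               + xi * np.sqrt(np.clip(1 - rho**2, 0, None)) * np.sqrt(dt) * zr
        rho = np.clip(rho, -0.999, 0.999)
    payoff = np.maximum(s1, s2)                         # better-of-two-assets payoff
    return np.exp(-r * T) * payoff.mean()

print(f"better-of rainbow price: {price_better_of_rainbow():.3f}")
```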

15.
When observed stock returns are obtained from trades subject to friction, it is known that an individual stock's beta and covariance are measured with error. Univariate models of additive error adjustment are available and are often applied simultaneously to more than one stock. Unfortunately, these multivariate adjustments produce non-positive definite covariance and correlation matrices, unless the return sample sizes are very large. To prevent this, restrictions on the adjustment matrix are developed and a correction is proposed, which dominates the uncorrected estimator. The estimators are illustrated with asset opportunity set estimates where daily returns have trading frictions.

16.
Finance Research Letters, 2014, 11(4): 375–384
We propose a new method to assess sovereign risk in Eurozone countries using an approach that relies on consistent tests for stochastic dominance efficiency. The test statistics and the estimators are computed using mixed integer programming methods. Our analysis is based on macroeconomic fundamentals and their importance in accounting for sovereign risk. The results suggest that net international investment position/GDP and public debt/GDP are the main contributors to country risk in the Eurozone. We also conduct ranking analysis of countries for fiscal and external trade risk. We find a positive correlation between our rankings of the most vulnerable countries and the S&P’s ratings, whereas the correlation for other countries is weaker.

17.
Measuring volatility spillover effects is very important for dynamic portfolio allocation and risk management. Existing studies assume that volatility across financial markets is linearly related, but linear correlation cannot describe the nonlinear relationships between markets. This paper uses copula techniques to describe the nonlinear dependence between stock markets and a stochastic volatility (SV) model to characterize the marginal distributions of stock-market data, and introduces a volatility structural-change analysis to identify volatility spillover; the empirical analysis verifies that the method is feasible.

18.
We develop an endogenous growth model with elastic labor supply, in which agents differ in their initial endowments of physical capital. In this context, the growth rate and the distribution of income are jointly determined. We then examine the distributional impact of different ways of financing an investment subsidy. Policies aimed at increasing the growth rate result in a more unequal distribution of pre-tax income, consistent with the positive correlation between income inequality and growth observed in the recent empirical literature. However, there is no conflict between efficiency and equity if inequality is measured in terms of the distribution of welfare.

19.
In this paper, a result for bivariate normal distributions from statistics is transferred into a financial-asset context in order to build a tool that can translate a correlation matrix into an equivalent probability matrix and vice versa. In this way, the correlation coefficient becomes more interpretable in terms of the joint probability of two stocks’ returns, and much more useful in terms of the information it provides. We validate our result empirically for a sample covering the three market-capitalization categories in the S&P 500 index over a ten-year period. Finally, the accuracy of this new tool is measured theoretically and some applications from the practitioners’ point of view are offered. Such applications include, for instance, the calculation of the number of trading days in a year in which two stocks have same-sign returns and how to split the average return of weighted stocks into four orthants.
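The core correlation-to-probability mapping can be written down directly for zero-mean bivariate normal returns: the probability that both returns have the same sign is p = 1/2 + arcsin(rho)/pi, which inverts to rho = sin(pi*(p − 1/2)). The sketch below (the zero-mean assumption and the 252-day year are simplifying assumptions) shows the round trip and the trading-days application mentioned in the abstract.

```python
# Correlation <-> same-sign probability under a zero-mean bivariate normal.
import numpy as np

def corr_to_same_sign_prob(rho):
    """P(both returns positive or both negative) under bivariate normality."""
    return 0.5 + np.arcsin(rho) / np.pi

def same_sign_prob_to_corr(p):
    """Inverse mapping: implied correlation from the same-sign probability."""
    return np.sin(np.pi * (p - 0.5))

rho = 0.6
p = corr_to_same_sign_prob(rho)
print(f"rho={rho}: same-sign probability={p:.3f}, "
      f"~{252 * p:.0f} trading days per year with same-sign returns")
print(f"round trip: {same_sign_prob_to_corr(p):.3f}")
```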

20.
We test whether the low-risk effect is driven by leverage constraints (in which case risk should be measured by beta) or by behavioral effects (in which case risk should be measured by idiosyncratic risk). Beta depends on volatility and correlation, with only volatility related to idiosyncratic risk. We introduce a new betting against correlation (BAC) factor that is particularly suited to differentiating between leverage-constraint and behavioral explanations. BAC produces strong performance in the US and internationally, supporting leverage-constraint theories. Similarly, we construct the new factor SMAX to isolate lottery demand, which also produces positive returns. Consistent with both leverage and lottery theories contributing to the low-risk effect, we find that BAC is related to margin debt while idiosyncratic risk factors are related to sentiment.
