A total of 20 similar documents were found (search time: 0 ms)
1.
Insurers face the challenge of estimating the future reserves needed to handle historic and outstanding claims that are not yet fully settled. A well-known and widely used technique is the chain-ladder method, a deterministic algorithm. To include a stochastic component, one may apply generalized linear models to run-off triangles of past claims data. Analytical expressions for the standard deviation of the resulting reserve estimates are typically difficult to derive, so a popular alternative is to obtain inference via the bootstrap. However, the standard procedures are very sensitive to the possible presence of outliers. These atypical observations, deviating from the pattern of the majority of the data, may either inflate or deflate traditional reserve estimates and the corresponding inference, such as standard errors. Even when paired with a robust chain-ladder method, classical bootstrap inference may break down. We therefore discuss and implement several robust bootstrap procedures in the claims-reserving framework and investigate and compare their performance on both simulated and real data. We also illustrate their use for obtaining the distribution of one-year risk measures.
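As an illustration of the deterministic chain-ladder projection this abstract builds on, here is a minimal sketch on a made-up cumulative run-off triangle (the `triangle` values and four-period layout are assumptions for illustration, not the paper's data):

```python
import numpy as np

# Cumulative run-off triangle: rows = accident years, columns = development
# periods; NaN marks cells not yet observed.
triangle = np.array([
    [100.0, 150.0, 170.0, 180.0],
    [110.0, 168.0, 190.0, np.nan],
    [120.0, 180.0, np.nan, np.nan],
    [130.0, np.nan, np.nan, np.nan],
])

def chain_ladder(tri):
    n = tri.shape[1]
    full = tri.copy()
    for j in range(n - 1):
        mask = ~np.isnan(tri[:, j + 1])                   # rows observed in both columns
        f = tri[mask, j + 1].sum() / tri[mask, j].sum()   # development factor
        missing = np.isnan(full[:, j + 1])
        full[missing, j + 1] = full[missing, j] * f       # project missing cells forward
    # reserve = projected ultimate claims minus the latest observed diagonal
    latest = np.array([row[~np.isnan(row)][-1] for row in tri])
    return full[:, -1] - latest

reserves = chain_ladder(triangle)
```

Later accident years have more unobserved development and therefore larger projected reserves; the robust bootstrap procedures the abstract discusses would resample around such point estimates.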
2.
Nicholas M. Kiefer 《Journal of Empirical Finance》2009,16(1):164-173
Risk managers at financial institutions are concerned with estimating default probabilities for asset groups, both for internal risk-control procedures and for regulatory compliance. Low-default assets pose an estimation problem that has attracted recent concern: there is little relevant historical data, and no amount of data processing can fix this. More information is required, and incorporating expert opinion formally is an attractive option. A probability-based (Bayesian) approach is proposed, its feasibility demonstrated, and its relation to supervisory requirements discussed.
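A common textbook illustration of the Bayesian idea for low-default portfolios is a beta-binomial update; the prior parameters and default counts below are invented for illustration and are not from the paper:

```python
# Expert prior: PD believed to be around 1% -> Beta(a=1, b=99).
# Observed data: 2 defaults among 500 obligors (sparse, low-default sample).
a_prior, b_prior = 1.0, 99.0
defaults, n = 2, 500

# Conjugate beta-binomial update of the prior with the observed defaults.
a_post = a_prior + defaults
b_post = b_prior + n - defaults
pd_posterior_mean = a_post / (a_post + b_post)
```

The posterior mean sits between the raw default rate (0.4%) and the expert prior mean (1%), which is exactly the shrinkage behavior that makes the Bayesian route attractive when historical data are scarce.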
3.
Value-at-risk (VaR) has served as a standard risk measure since its introduction. In practice, the delta-normal approach is usually adopted to approximate the VaR of portfolios with option positions. Its effectiveness, however, diminishes substantially when the portfolios involve a high-dimensional set of derivative positions with nonlinear payoffs; the lack of closed-form pricing solutions for these potentially highly correlated, American-style derivatives complicates the problem further. This paper proposes a generic simulation-based algorithm for VaR estimation that can be easily applied to any existing procedure. Our proposal leverages cross-sectional information and applies variable-selection techniques to simplify the existing simulation framework. Asymptotic properties of the new approach demonstrate faster convergence due to the additional model-selection component. We also present numerical results that verify the effectiveness of our approach in comparison with some existing strategies.
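For context, the delta-normal approximation the abstract criticizes linearizes each option through its delta and applies a normal quantile; a minimal sketch with made-up position numbers (delta, spot, volatility are assumptions):

```python
# Hypothetical single option position, linearized through its delta.
delta = 0.6               # option delta (assumed)
spot = 100.0              # underlying price (assumed)
sigma_daily = 0.02        # daily volatility of the underlying return (assumed)
z99 = 2.3263478740408408  # 99% standard normal quantile

exposure = delta * spot                  # delta-equivalent dollar exposure
var_99 = z99 * sigma_daily * exposure    # one-day 99% delta-normal VaR
```

The linearization is exactly what breaks down for strongly nonlinear payoffs, which motivates the simulation-based alternative the paper proposes.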
4.
The dynamic betas for ten Canadian sector portfolios are estimated herein using the Kalman filter approach and are found to be best described by a mix of random-walk (trend) and mean-reverting (cycle) processes. The relative importance of the trend and cycle components of sector betas is related to the different sensitivities of the corresponding sectors to business cycles. Dynamic betas significantly increase the explanatory power of the market model, particularly for the utilities sector. A dynamic hedging strategy using the one-step-ahead beta forecasts as hedge ratios produces smaller hedging errors for every sector than hedge ratios calculated from the alternative beta specifications.
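The random-walk component of such a model can be filtered with a scalar Kalman recursion; this sketch uses simulated returns with a constant true beta and assumed noise variances, not the Canadian sector data:

```python
import numpy as np

def kalman_beta(r, rm, q=1e-8, h=2.5e-5, beta0=1.0, p0=1.0):
    """Filtered beta for r_t = beta_t * rm_t + e_t, with beta_t a random walk."""
    betas = []
    beta, p = beta0, p0
    for rt, rmt in zip(r, rm):
        p = p + q                               # predict: state variance grows by q
        k = p * rmt / (rmt * rmt * p + h)       # Kalman gain (observation "matrix" is rm_t)
        beta = beta + k * (rt - beta * rmt)     # update with the one-step prediction error
        p = (1.0 - k * rmt) * p                 # posterior state variance
        betas.append(beta)
    return np.array(betas)

rng = np.random.default_rng(0)
rm = rng.normal(0.0, 0.01, 500)             # simulated market returns
r = 1.5 * rm + rng.normal(0.0, 0.005, 500)  # sector returns, true beta = 1.5
est = kalman_beta(r, rm)
```

With the state-noise variance `q` near zero the filter behaves like recursive least squares; a larger `q` lets the beta drift, which is the random-walk (trend) behavior the paper finds.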
5.
Recently a number of mathematical programming models have been developed to assist banks in their portfolio (balance sheet) management decision making. Generally, the model structures used may be classified as either linear, linear goal, or two-stage linear programming. Of these, linear programming models are the most common. The purpose of this paper is to discuss the optimal bank portfolio management solutions produced by each of the above programming structures. In addition, a new model structure, two-stage linear goal programming, is developed and compared to the other structures. From a decision-making perspective, this new model structure is found to provide additional useful information.
6.
Kurtay Ogunc 《International Review of Financial Analysis》2008,17(4):716-727
The purpose of this paper is to incorporate behavioral issues as they relate to the active currency hedging of international portfolios within the traditional expected-utility-maximization approach. The approach is unique in that separate risk-aversion parameters are introduced for the asset and currency markets. The paper is similar in spirit to Black (Black, F., 1989. Universal hedging. Financial Analysts Journal (July–August), 16–22), who argued that a portion of foreign equity investments should be permanently unhedged, which essentially amounts to taking a buy-and-hold position in currency with a fraction of the capital. The behavioral twist added to the traditional expected-utility-maximization approach results in lower hedge ratios, ceteris paribus, partly due to the asymmetric nature of the compensation structure of currency managers. Since the asymmetric incentive schemes of asset and currency managers dictate how one optimizes the investment portfolio of a pension or endowment fund, the unusual behavior of a given institutional fund manager should not be called "irrational" merely because the optimal currency hedging level deviates from the one derived under rational expectations. This justifies the use of different hedging strategies by various institutional investors. We describe in detail how the level of hedging should be revised downward because of behavioral factors. Conclusions are framed in terms of what one would expect to see in the market if certain investors behave in an "irrational" way.
7.
Peter Jagers 《Scandinavian actuarial journal》2013,2013(2):187-190
Abstract The purpose of this short note is to demonstrate the power of very straightforward branching-process methods outside their traditional realm of application. We consider an insurance scheme where claims do not necessarily arise as a stationary process. Indeed, the number of policy-holders is changing, with each of them generating a random number of new insurants. Each of these makes claims of random size at random instants, independently but with the same distribution across individuals. Premiums are assumed equal for all policy-holders. It is proved that, for an expanding portfolio, there is only one premium size that is fair, in the sense that if the premium is larger, the insurer's profit grows infinite with time, whereas a smaller premium leads to his inevitable ruin. (Branching-process models for the development of the portfolio may seem unrealistic. However, they include the classical theory, where independent and identically distributed claims arise at the points of a renewal process.)
8.
A major impediment to measuring portfolio performance under stochastic dominance has been the lack of test statistics for orders of stochastic dominance above first degree. In this article, the bootstrap method, introduced by Efron (1979), is used to estimate critical values for distance statistics in order to test the null hypothesis of no dominance, under second- and third-degree stochastic dominance, for several samples of stock returns. These test statistics, suggested by Whitmore (1978), are analogous to the Kolmogorov-Smirnov distance statistics that can be used to test for first-degree stochastic dominance. Stochastic dominance is shown to accurately assess portfolio performance of sample distributions when the population distributions are controlled and bootstrap statistics are employed in the analysis. In addition, second- and third-degree stochastic dominance analysis of the small-firm January anomaly indicates that, over the 23-year period 1964 to 1986, small firms statistically dominate a diversified market index in only one calendar year.
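A sample second-degree dominance comparison can be sketched by integrating empirical CDFs on a common grid; this is an illustration of the SSD criterion itself, not Whitmore's distance statistic or the bootstrap critical values the article computes:

```python
import numpy as np

def ssd_dominates(x, y, grid_size=200):
    """True if sample x second-degree stochastically dominates sample y (grid check)."""
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), grid_size)
    Fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)   # ECDF of x
    Fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)   # ECDF of y
    step = grid[1] - grid[0]
    # SSD: the integrated ECDF of x never exceeds that of y at any grid point
    return bool(np.all(np.cumsum(Fx) * step <= np.cumsum(Fy) * step + 1e-12))

x = np.array([1.0, 2.0, 3.0, 4.0])   # toy return sample
y = x - 0.5                          # the same sample shifted down, hence dominated
```

A bootstrap test of "no dominance" would resample `x` and `y` and ask how often the integrated-CDF gap this size arises by chance.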
9.
We propose a methodology for modelling the value at risk of a complex portfolio, based on an extension of the Ho, Stapleton and Subrahmanyam technique. We model the variance-covariance structure of up to seven variables. These could represent four country indices and three exchange rates, for example. In addition, the effect of an arbitrary number of orthogonal factors can be analysed. The system is illustrated by estimating the value at risk for a portfolio of international stocks where the factors are stock market indices and exchange rates, a portfolio of international bonds where the factors are interest rates as well as exchange rates, and a portfolio of interest rate derivatives in different currencies. In this last case, we model a two-factor term structure of interest rates in each of the currencies, valuing the derivatives at a future date using these term structures and the Black model. The model is applied for different fineness of the binomial density and computational accuracy and efficiency are estimated.
G13, G15, G21
10.
We propose a methodology that can efficiently measure the Value-at-Risk (VaR) of large portfolios with time-varying volatility and correlations by bringing together the established historical simulation framework and recent contributions to the dynamic factor models literature. We find that the proposed methodology performs well relative to widely used VaR methodologies, and is a significant improvement from a computational point of view.
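The historical-simulation baseline that the paper extends is just an empirical quantile of past portfolio returns; this sketch uses simulated returns in place of an observed history and omits the paper's volatility and factor adjustments:

```python
import numpy as np

rng = np.random.default_rng(42)
returns = rng.normal(0.0, 0.01, 1000)   # stand-in for 1000 days of portfolio returns

def hist_var(returns, level=0.99):
    """Historical-simulation VaR: the loss at the (1 - level) empirical quantile."""
    return -np.quantile(returns, 1.0 - level)

var99 = hist_var(returns)
```

Because the quantile is taken over the raw history, plain historical simulation reacts slowly to volatility changes, which is the gap the dynamic-factor extension addresses.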
11.
Bezalel Gavish 《Journal of Banking & Finance》1977,1(2):143-150
Stochastic dominance methods which have been developed in recent years are generally more valid than mean-variance (EV) and higher moment methods for selecting a portfolio from a given finite set of possible portfolios. One of the limitations of these methods is the lack of procedures for building portfolios from a given set of securities and the probability distribution of their returns. Markowitz has developed an algorithm based on the restriction method in linear programming to build undominated portfolios. In this paper a more efficient method based on the relaxation method of linear programming is developed and tested for efficiency. Computational results justify its use as a practical tool for portfolio building.
12.
13.
Yuming Li 《Financial Markets and Portfolio Management》2017,31(3):289-315
Rational asset pricing implies a positive relation between the expected risk-adjusted return and the volatility of a factor-mimicking portfolio. The relation for the momentum portfolio is weak after its return is adjusted for the risks associated with the market return, the size factor, and the book-to-market factor. However, the relation is significantly positive and captures most of the average return on the momentum portfolio after the return is adjusted for the market return and the risk associated with the short-term reversal portfolio return. The result supports the hypothesis that there is a common factor underlying both momentum and short-term reversal. The dynamics of the factor loadings and the correlation structure of the underlying factors have important implications for the risk prices associated with the factor-mimicking portfolios and the risk–return trade-off for momentum and reversal portfolios.
14.
Recent theoretical work suggests that definitions of market efficiency that allow for the possibility of time-varying risk premia will generally lead to return-sign predictability. Consistent with this theory, we show that a logit model based on the lagged value of the market risk premium is useful for predicting the return sign for CRSP small-decile portfolio returns, but not large ones. We additionally employ this model in market-timing simulations of micro-cap mutual funds in which investment can actually be made. The results indicate that a market-timing strategy based on our return-sign forecasting model outperforms a buy-and-hold strategy for 13 of the 14 micro-cap funds studied. The buy-and-hold strategy produces an average compound return of 11.98% per annum versus 16.60% for the market-timing strategy. Nevertheless, trading restrictions make the return-sign forecasting model more practical for the micro-cap fund portfolio manager than for the individual fund investor.
Bruce G. Resnick
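A return-sign logit of the kind described can be fitted by gradient ascent on the logistic likelihood; the data below are simulated with known coefficients purely to illustrate the model form, not the CRSP sample:

```python
import numpy as np

def fit_logit(x, y, lr=0.1, steps=3000):
    """Gradient ascent on the logistic log-likelihood for P(y=1) = sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))
        w += lr * np.mean((y - p) * x)   # score with respect to the slope
        b += lr * np.mean(y - p)         # score with respect to the intercept
    return w, b

rng = np.random.default_rng(1)
lagged_premium = rng.normal(0.0, 1.0, 2000)            # lagged market risk premium (simulated)
p_up = 1.0 / (1.0 + np.exp(-(1.5 * lagged_premium + 0.2)))
sign_up = (rng.random(2000) < p_up).astype(float)      # 1 if the next-period return is positive
w, b = fit_logit(lagged_premium, sign_up)
```

A timing rule would then hold the fund when the fitted probability of a positive return exceeds some threshold and hold cash otherwise.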
15.
The present study investigates the stock characteristic preferences of institutional Australian equity managers. In aggregate we find that active managers exhibit preferences for stocks exhibiting high‐price variance, large market capitalization, low transaction costs, value‐oriented characteristics, greater levels of analyst coverage and lower variability in analyst earnings forecasts. We observe stronger preferences for higher volatility, value stocks and wider analyst coverage among smaller stocks. We also find that smaller investment managers prefer securities with higher market capitalization and analyst coverage (including low variation in the forecasts of these analysts). We also document that industry effects play an important role in portfolio construction.
16.
This paper proposes a methodology to analyse the risk and return of large loan portfolios in a joint setting. I propose a tractable model to obtain the distribution of loan returns from observed interest rates and default frequencies. I follow a sectoral approach that captures the heterogeneous cyclical features of different kinds of loans and yields moments in closed form. I investigate the validity of mean–variance analysis with a value at risk constraint and study its relationship with utility maximisation. Finally, I study the efficiency of corporate and household loan portfolios in an empirical application to the Spanish banking system.
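The mean-variance building block behind such an efficiency analysis has a closed form; as a sketch, the global minimum-variance weights for a hypothetical three-sector loan covariance matrix (the matrix is invented, and the paper's VaR constraint is omitted):

```python
import numpy as np

# Hypothetical covariance matrix of returns for three loan sectors.
cov = np.array([
    [0.04, 0.01, 0.00],
    [0.01, 0.09, 0.02],
    [0.00, 0.02, 0.16],
])
ones = np.ones(3)
inv = np.linalg.inv(cov)
w = inv @ ones / (ones @ inv @ ones)   # global minimum-variance weights, summing to 1
port_var = w @ cov @ w                 # variance of the resulting portfolio
```

By construction no other fully invested portfolio has lower variance, so `port_var` is a lower bound against which the efficiency of observed loan portfolios can be judged.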
17.
We obtain an explicit formula for the bilateral counterparty valuation adjustment of a credit default swaps portfolio referencing an asymptotically large number of entities. We perform the analysis under a doubly stochastic intensity framework, allowing default correlation through a common jump process. The key insight behind our approach is an explicit characterization of the portfolio exposure as the weak limit of measure-valued processes associated with survival indicators of portfolio names. We validate our theoretical predictions by means of a numerical analysis, showing that counterparty adjustments are highly sensitive to portfolio credit risk volatility as well as to the intensity of the common jump process.
18.
Astrid Ayala 《European Journal of Finance》2013,19(18):1861-1884
ABSTRACT The precise measurement of the association between asset returns is important for financial investors and risk managers. In this paper, we focus on a recent class of association models: Dynamic Conditional Score (DCS) copula models. Our contributions are the following: (i) We compare the statistical performance of several DCS copulas for several portfolios. We study the Clayton, rotated Clayton, Frank, Gaussian, Gumbel, rotated Gumbel, Plackett and Student's t copulas. We find that the DCS model with the Student's t copula is the most parsimonious model. (ii) We demonstrate that the copula score function discounts extreme observations. (iii) We jointly estimate the marginal distributions and the copula, by using the Maximum Likelihood method. We use DCS models for mean, volatility and association of asset returns. (iv) We estimate robust DCS copula models, for which the probability of a zero return observation is not necessarily zero. (v) We compare different patterns of association in different regions of the distribution for different DCS copulas, by using density contour plots and Monte Carlo (MC) experiments. (vi) We undertake a portfolio performance study with the estimation and backtesting of MC Value-at-Risk for the DCS model with the Student's t copula.
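One of the static copulas in the comparison set, the Clayton, can be sampled by conditional inversion; this sketch illustrates the copula family only (the parameter `theta` is assumed, and the paper's dynamic DCS score updating is not modelled):

```python
import numpy as np

def sample_clayton(n, theta, seed=0):
    """Draw n pairs from a Clayton copula via conditional inversion."""
    rng = np.random.default_rng(seed)
    u1 = rng.random(n)
    v = rng.random(n)
    # invert the conditional copula C(u2 | u1) = v
    u2 = (u1 ** (-theta) * (v ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u1, u2

u1, u2 = sample_clayton(5000, theta=2.0)   # theta = 2 gives Kendall's tau of 0.5
```

The Clayton copula concentrates dependence in the lower tail (joint crashes), which is one of the association patterns the density contour plots in the paper compare across families.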
19.
Donald D. Hester 《Journal of Banking & Finance》1982,6(1):89-112
This paper examines changes in portfolios of domestic offices of large banks in the United States. Empirical evidence indicates that representative portfolios changed by statistically significant amounts because of improved cash management, broadening of the federal funds market, more sophisticated tax avoidance, and new financial instruments. Changes over the years 1970–1976 were analyzed by examining the distribution of mean portfolio shares, performing a canonical correlation study, and examining the stationarity of a portfolio model that is described in The Econometrics of Panel Data, Annales de l'insee, April–September, 1978, pp. 297–329. The final section interprets the findings for the design of monetary policy.
20.
This article argues that, because interest rates and stock prices share the same trend and volatility properties, classical stock-price volatility models are of research value for modelling interest rates. By introducing a classical volatility model and combining it with maximum likelihood estimation, the paper explores optimal investment schemes for risk-free bonds and applies the results in simulated empirical investments in the world's major government bond markets. The results show that the model achieves good excess returns across all of these markets.
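The maximum-likelihood step in such a volatility model reduces, in the simplest lognormal case, to the sample standard deviation of log returns; the simulated return series below stands in for a bond-index history and is not the article's data:

```python
import numpy as np

rng = np.random.default_rng(7)
log_returns = rng.normal(0.0002, 0.012, 250)   # one year of simulated daily log returns

mu_hat = log_returns.mean()
# ML estimate of daily volatility (divides by n, i.e. the biased estimator)
sigma_hat = np.sqrt(np.mean((log_returns - mu_hat) ** 2))
sigma_annual = sigma_hat * np.sqrt(250.0)      # annualized volatility
```

Richer volatility models (e.g. with time variation) replace the constant `sigma_hat` with a filtered sequence but keep the same likelihood-maximization principle.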