Similar Articles: 20 results found
1.
This study examines the asymmetric adjustments to the long-run equilibrium for credit default swap (CDS) sector indexes of three financial sectors – banking, financial services and insurance – in the presence of a threshold effect. The results of the momentum-threshold autoregression (M-TAR) models demonstrate that asymmetric cointegration exists for all pairs comprised of those three CDS indexes. The speeds of adjustment in the long run are much higher for adjustments from below the threshold than from above for all the pairs. The estimates of the M-TAR vector error-correction (M-TAR-VEC) models suggest that both CDS index returns in each sector pair participate in the adjustment to equilibrium when the short and long run are taken together, but in the long run alone only one of the two spreads in each pair participates. Policy implications are also provided.
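A minimal sketch of the M-TAR adjustment regression described above, using plain least squares on a residual series (illustrative only; the function and variable names are ours, and the paper's full specification includes further lags and tests):

```python
import numpy as np

def mtar_adjustment(u, tau=0.0):
    """Estimate M-TAR speeds of adjustment for a residual series u:
    Delta u_t = rho_a * I_t * u_{t-1} + rho_b * (1 - I_t) * u_{t-1} + e_t,
    with I_t = 1 when Delta u_{t-1} >= tau (the "momentum" indicator)."""
    u = np.asarray(u, dtype=float)
    du = np.diff(u)                              # Delta u_t
    indicator = (du[:-1] >= tau).astype(float)   # I_t from the lagged change
    lag = u[1:-1]                                # u_{t-1} aligned with du[1:]
    X = np.column_stack([indicator * lag, (1.0 - indicator) * lag])
    rho, *_ = np.linalg.lstsq(X, du[1:], rcond=None)
    return rho                                   # [rho_above, rho_below]
```

For a mean-reverting series both estimated speeds are negative; asymmetry shows up as a gap between the two.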

2.
Recent literature suggests identifying house price hedonic regressions by using instrumental variables, spatial statistics, the borders approach, panel data, and other techniques. We present an empirical application of a mixed index model, first proposed by Bowden [Bowden, R.J., 1992. Competitive selection and market data: the mixed-index problem. The Review of Economic Studies 59(3), 625–633], to identify hedonic price regressions. We compare the performance of the mixed index model to a traditional hedonic model and to a hedonic model that includes characteristics of the buyer of each house. We find the mixed index model outperforms the other models based on bootstrap distributions of predicted housing values, prediction variance, and predicted policy effects. The mixed index model distributions are less skewed and kurtotic than those of the other models, suggesting it more closely satisfies the classical linear regression assumption of normally distributed errors. Compared to the mixed index model, the traditional hedonic model overstates the importance of lot size and school quality to house price and understates the importance of environmental quality.

3.
The Basel II and III Accords propose estimating the credit conversion factor (CCF) to model exposure at default (EAD) for credit cards and other forms of revolving credit. Alternatively, recent work has suggested it may be beneficial to predict the EAD directly, i.e. modelling the balance as a function of a series of risk drivers. In this paper, we propose a novel approach combining two ideas proposed in the literature and test its effectiveness using a large dataset of credit card defaults not previously used in the EAD literature. We predict EAD by fitting a regression model using the generalised additive model for location, scale, and shape (GAMLSS) framework. We conjecture that the EAD level and risk drivers of its mean and dispersion parameters could substantially differ between the debtors who hit the credit limit (i.e. “maxed out” their cards) prior to default and those who did not, and thus implement a mixture model conditioning on these two respective scenarios. In addition to identifying the most significant explanatory variables for each model component, our analysis suggests that predictive accuracy is improved, both by using GAMLSS (and its ability to incorporate non-linear effects) as well as by introducing the mixture component.
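The mixture idea, conditioning on whether a debtor maxed out the limit before default, can be sketched as follows. This is an assumption-laden illustration: the paper fits GAMLSS components, for which plain least squares stands in here, and all names are ours:

```python
import numpy as np

def fit_ead_mixture(X, ead, balance, limit):
    """Split debtors into those who hit the credit limit before default
    ("maxed out") and those who did not, then fit a separate linear EAD
    model to each group. (OLS stands in for the paper's GAMLSS fits.)"""
    maxed = balance >= limit                     # "maxed out" indicator
    Z = np.column_stack([np.ones(len(ead)), X])  # add an intercept
    coefs = {}
    for name, mask in (("maxed", maxed), ("not_maxed", ~maxed)):
        coefs[name], *_ = np.linalg.lstsq(Z[mask], ead[mask], rcond=None)
    return maxed, coefs

def predict_ead(X, maxed, coefs):
    """Predict EAD with the component matching each debtor's scenario."""
    Z = np.column_stack([np.ones(len(X)), X])
    return np.where(maxed, Z @ coefs["maxed"], Z @ coefs["not_maxed"])
```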

4.
This paper provides a discrete time algorithm, in the framework of the Cox–Ross–Rubinstein (1979) analysis, to evaluate both Parisian options with a flat barrier and Parisian options with an exponential boundary. The algorithm is based on a combinatorial tool for counting the number of paths of a particle performing a random walk that remains continuously beyond a barrier for a period strictly smaller than a pre-specified time interval. As a result, a binomial evaluation model is derived that is very easy to implement and that produces highly accurate prices. Received: 19 March 2001 / Accepted: 17 March 2002. The author thanks Prof. Ivar Massabó for helpful comments and discussions. This research has been partially supported by MIUR (research on “Modelli per la Finanza Matematica”).
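A direct (non-combinatorial) binomial sketch of a Parisian knock-out illustrates the path-dependence involved: the excursion length below the barrier is carried as an extra state variable. This is our own simplified stand-in for the paper's flat-barrier case, not its combinatorial algorithm:

```python
import math

def parisian_down_out_call(S0, K, L, T, r, sigma, n, window):
    """CRR binomial sketch of a Parisian down-and-out call: the option is
    knocked out once the asset stays below the barrier L for `window`
    consecutive steps. State = (up-moves j, excursion length k); assumes
    S0 >= L so the excursion counter starts at 0."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)
    disc = math.exp(-r * dt)
    # Terminal values: any surviving path pays the call payoff.
    value = {(j, k): max(S0 * u**j * d**(n - j) - K, 0.0)
             for j in range(n + 1) for k in range(window)}
    for i in range(n - 1, -1, -1):               # backward induction
        new = {}
        for j in range(i + 1):
            for k in range(window):
                total = 0.0
                for jj, prob in ((j + 1, p), (j, 1.0 - p)):
                    Sc = S0 * u**jj * d**(i + 1 - jj)
                    kc = k + 1 if Sc < L else 0  # extend or reset excursion
                    if kc < window:              # kc == window -> knocked out
                        total += prob * value[(jj, kc)]
                new[(j, k)] = disc * total
        value = new
    return value[(0, 0)]
```

Shrinking the allowed excursion window can only remove surviving paths, so the price is weakly increasing in `window` and bounded above by the vanilla call.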

5.
The paper examines how collateral affects the probability of default for small firms. We present a stylized theoretical model to derive the relationship between the level of collateral and subsequent loan default. We find that the probability of default is negatively correlated with the level of collateral, which is intuitive. We then test this relationship using a proprietary database of collateralized loans to small Brazilian enterprises.

6.
Based on UK data for major retail credit cards, we build several models of Loss Given Default (LGD) on account-level data, including Tobit, a decision tree model, and Beta and fractional logit transformations. We find that Ordinary Least Squares models with macroeconomic variables perform best for forecasting LGD at the account and portfolio levels on independent hold-out data sets. The inclusion of macroeconomic conditions in the model is important, since it provides a means to model LGD in downturn conditions, as required by Basel II, and enables stress testing. We find that bank interest rates and the unemployment level significantly affect LGD.
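The winning specification above is simple enough to sketch: an account-level OLS with macroeconomic covariates appended, and predictions clipped to the unit interval since LGD is a loss fraction. Function and variable names are ours:

```python
import numpy as np

def fit_lgd_ols(X_account, X_macro, lgd):
    """OLS LGD model on account features plus macroeconomic columns
    (e.g. unemployment rate, bank interest rate)."""
    Z = np.column_stack([np.ones(len(lgd)), X_account, X_macro])
    beta, *_ = np.linalg.lstsq(Z, lgd, rcond=None)
    return beta

def predict_lgd(beta, X_account, X_macro):
    """Predict LGD; clip to [0, 1] since it is a loss fraction."""
    Z = np.column_stack([np.ones(len(X_account)), X_account, X_macro])
    return np.clip(Z @ beta, 0.0, 1.0)
```

Stress testing then amounts to re-predicting with the macro columns set to downturn values.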

7.
For managing credit risk, commercial banks use various scoring methodologies to evaluate the financial performance of client firms. This paper upgrades the quantitative analysis used in the financial performance modules of state-of-the-art credit scoring methodologies. This innovation should help lending officers at the branch level filter out poor-risk applicants. A Data Envelopment Analysis (DEA)-based methodology was applied to current data for 82 industrial/manufacturing firms comprising the credit portfolio of one of Turkey's largest commercial banks. Using financial ratios, the DEA synthesizes a firm's overall performance into a single financial efficiency score, the “credibility score”. Results were validated by various supporting (regression and discriminant) analyses and, most importantly, by expert judgments based on data or on current knowledge of the firms.
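The standard CCR multiplier form of DEA reduces each firm's score to a small linear program, which conveys how financial ratios collapse into one efficiency number. A sketch under our own assumptions (the paper does not specify which DEA variant it uses):

```python
import numpy as np
from scipy.optimize import linprog

def dea_scores(X, Y):
    """Input-oriented CCR efficiency scores (multiplier form), a stand-in
    for the paper's DEA-based "credibility score".
    X: inputs (n_firms x m), Y: outputs (n_firms x s); rows are firms."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Variables z = [u (output weights), v (input weights)], all >= 0.
        c = np.concatenate([-Y[o], np.zeros(m)])       # maximise u . y_o
        A_eq = [np.concatenate([np.zeros(s), X[o]])]   # v . x_o = 1
        A_ub = np.hstack([Y, -X])                      # u . y_j <= v . x_j
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        scores.append(-res.fun)                        # efficiency in (0, 1]
    return np.array(scores)
```

With one input and one output this reproduces the intuitive ratio rule: each firm's output/input ratio divided by the best ratio in the portfolio.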

8.
This paper proves existence of an ergodic Markov equilibrium for a class of general equilibrium economies with infinite horizon, incomplete markets, and default. Agents may choose to deny their liabilities and face trading constraints that depend on the adjusted amount of past default on each asset. These constraints replace the usual utility penalties and exploit intertemporal tie-ins that appear in dynamic economies. The equilibrium prices and solvency rates present the stationarity properties that are usually required in econometric models of credit risk.

9.
In this article, we revisit the impact of the voluntary central clearing scheme on the CDS market. In order to address the endogeneity problem, we use a robust methodology that relies on dynamic propensity-score matching combined with generalized difference-in-differences. Our empirical findings show that central clearing results in a small increase in CDS spreads (ranging from 14 to 19 bps), while there is no evidence of an associated improvement in CDS market liquidity and trading activity or of a deterioration in the default risk of the underlying bond. These results suggest that the increase in CDS spreads can be mainly attributed to a reduction in CDS counterparty risk.

10.
To categorize credit applications into defaulters or non-defaulters, most credit evaluation models have employed binary classification methods based on default probabilities. However, while some loan applications can be directly accepted or rejected, there are others on which immediate accurate credit status decisions cannot be made using existing information. To resolve these issues, this study developed an optimized sequential three-way decision model. First, an information gain objective function was built for the three-way decision, after which a genetic algorithm (GA) was applied to determine the optimal decision thresholds. Then, appropriate accept or reject decisions for some applicants were made using basic credit information, with the remaining applicants, whose credit status was difficult to determine, being divided into a boundary region (BND). Supplementary information was then added to reevaluate the credit applicants in the BND, and a sequential optimization process was employed to ensure more accurate predictions. Therefore, the model's predictive abilities were improved and the information acquisition costs controlled. The empirical results demonstrated that the proposed model outperformed other benchmark credit models on the performance indicators.
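The core of a three-way decision is a pair of thresholds on the default probability: accept below one, reject above the other, defer (boundary region) in between. A sketch of choosing those thresholds by information gain, with exhaustive grid search standing in for the paper's genetic algorithm:

```python
import numpy as np

def entropy(y):
    """Shannon entropy (bits) of a binary label vector."""
    p = np.bincount(y, minlength=2) / len(y)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def three_way_thresholds(prob, y, grid=np.linspace(0.05, 0.95, 19)):
    """Pick (beta, alpha), beta < alpha, splitting applicants into accept
    (p <= beta), boundary (beta < p < alpha) and reject (p >= alpha) so
    the split's information gain on the default label y is maximal.
    (Grid search stands in for the paper's GA optimisation.)"""
    base = entropy(y)
    best, best_gain = None, -np.inf
    for bi, beta in enumerate(grid):
        for alpha in grid[bi + 1:]:
            regions = [prob <= beta,
                       (prob > beta) & (prob < alpha),
                       prob >= alpha]
            gain = base - sum(m.mean() * entropy(y[m])
                              for m in regions if m.any())
            if gain > best_gain:
                best_gain, best = gain, (beta, alpha)
    return best
```

Applicants landing in the boundary region would then be re-scored once the supplementary information is acquired, repeating the same threshold search sequentially.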

11.
This paper explores an alternative method of ‘solving’ the problem of recurring time variable demands in a public utility context. It views the utility's load curve as a series of horizontal layers or ‘slices’ of varying lengths, rather than as a series of vertical slices as in the traditional approach. Several cases are examined, and traditional time-of-day pricing is shown to be inefficient or inapplicable in some of them, while ‘demand-layer’ pricing, based on horizontal slicing, is efficient. In still other cases, neither method of pricing is efficient.

12.
Decisions in Economics and Finance - We propose a new model for the pricing of wind power futures written on the wind power production index. Our approach is based on an arithmetic multi-factor...

13.
We construct a new measure of mortgage credit availability using a technique developed for production frontier estimation. The resulting “loan frontier” describes the maximum amount obtainable by a borrower of given characteristics. We estimate this frontier using mortgage originations data from 2001 to 2014. We find a substantial expansion of mortgage credit for all borrowers during the housing boom, not only for low-score or low-income borrowers. The subsequent contraction in credit was most pronounced for low-score borrowers. Using variation in the frontier across metropolitan areas over time, we show that borrowing constraints played an important role in the recent housing cycle.

14.
We propose a Markov chain model for credit rating changes. We do not use any distributional assumptions on the asset values of the rated companies but directly model the rating transition process. The parameters of the model are estimated by a maximum likelihood approach using historical rating transitions and heuristic global optimization techniques. We benchmark the model against a GLMM model in the context of bond portfolio risk management. The proposed model yields stronger dependencies and higher risks than the GLMM model. As a result, the risk-optimal portfolios are more conservative than the decisions resulting from the benchmark model.
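For the simplest time-homogeneous case, the maximum-likelihood estimate of a rating-migration matrix from observed transitions is just the row-normalised count matrix, which can be sketched directly (the paper's full model adds dependence structure and heuristic optimisation on top of this):

```python
import numpy as np

def estimate_transition_matrix(transitions, n_states):
    """MLE of a rating-migration matrix from observed (from_state,
    to_state) pairs: row-normalised transition counts."""
    counts = np.zeros((n_states, n_states))
    for i, j in transitions:
        counts[i, j] += 1
    rows = counts.sum(axis=1, keepdims=True)
    with np.errstate(invalid="ignore", divide="ignore"):
        P = np.where(rows > 0, counts / rows, 0.0)  # unseen states: zero row
    return P
```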

15.
Parisian options are path-dependent options whose payoff depends on whether the underlying asset's price remains continuously at or above a given barrier over a given time interval. Costabile's (Decis Econ Finance 25(2):111–125, 2002b) algorithm for pricing Parisian options, based on a combinatorial approach in a binomial tree, has a time complexity of O(n^3). We improve that algorithm to yield one with a time complexity of only O(n^2).

16.
17.
Credit risk is one of the main risks a bank faces in providing financial products and services to clients. To evaluate the financial performance of clients, several scoring methodologies have been proposed, most of them based on quantitative indicators. This paper highlights the relevance of both quantitative and qualitative features of applicants and proposes a new methodology based on mixed-data clustering techniques. Indeed, cluster analysis may prove particularly useful in the estimation of credit risk. Traditionally, clustering concentrates on either quantitative or qualitative data at a time; however, since credit applicants are characterized by mixed personal features, a cluster analysis specific to mixed data can help discover particularly informative patterns for estimating the risk associated with credit granting.
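Clustering mixed data typically starts from a dissimilarity that handles both feature types; Gower distance is one common choice (our illustration, as the abstract does not name the specific technique used):

```python
import numpy as np

def gower_distance(num, cat):
    """Gower distance for mixed data: range-scaled absolute differences
    for numeric columns, simple matching for categorical columns.
    num: (n, p) numeric array; cat: (n, q) array of categorical codes."""
    n = len(num)
    ranges = num.max(axis=0) - num.min(axis=0)
    ranges[ranges == 0] = 1.0                    # avoid division by zero
    D = np.zeros((n, n))
    for i in range(n):
        d_num = np.abs(num - num[i]) / ranges    # scaled numeric part
        d_cat = (cat != cat[i]).astype(float)    # mismatch indicator
        D[i] = np.hstack([d_num, d_cat]).mean(axis=1)
    return D
```

The resulting matrix can feed any distance-based clustering method (e.g. k-medoids or hierarchical clustering) to group applicants by combined quantitative and qualitative profile.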

18.
The objective of this paper is to compare the mispricing of option valuation models when alternative techniques are applied to the volatility estimation. Akgiray (1989) shows that out-of-sample forecasts of return variances of stock indices based on a GARCH model are superior predictors of the actual ex-post variances in comparison to forecasts generated using standard rolling regression methods. A second objective of this study is to examine whether Akgiray's results carry over to option valuation. Although we find that the implied volatility technique results in the least mispricing, within the class of forecasts using only historic returns data, the use of GARCH models also significantly reduces model mispricing.
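The GARCH(1,1) variance forecast that such comparisons rely on is a one-line recursion, sketched below with the parameters taken as given (in practice they come from maximum likelihood estimation; the function name is ours):

```python
import numpy as np

def garch11_forecast(returns, omega, alpha, beta):
    """One-step-ahead GARCH(1,1) variance forecast:
    sigma2_{t+1} = omega + alpha * r_t^2 + beta * sigma2_t."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.var(r)                   # initialise at the sample variance
    for rt in r:
        sigma2 = omega + alpha * rt**2 + beta * sigma2
    return sigma2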

19.
A broad class of generalized linear mixed models, e.g. variance components models for binary data, percentages or count data, will be introduced by incorporating additional random effects into the linear predictor of a generalized linear model structure. Parameters are estimated by a combination of quasi-likelihood and iterated MINQUE (minimum norm quadratic unbiased estimation), the latter being numerically equivalent to REML (restricted, or residual, maximum likelihood). First, conditional upon the additional random effects, observations on a working variable and weights are derived by quasi-likelihood, using iteratively re-weighted least squares. Second, a linear mixed model is fitted to the working variable, employing the weights for the residual error terms, by iterated MINQUE. The latter may be regarded as a least squares procedure applied to squared and product terms of error contrasts derived from the working variable. No full distributional assumptions are needed for estimation. The model may be fitted with standardly available software for weighted regression and REML.
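The first step described above, deriving the working variable and weights by quasi-likelihood, can be sketched for the binary/logit case (our notation; the paper covers general link functions):

```python
import numpy as np

def working_response(y, eta):
    """Working variable z and weights w for a binary GLM with logit link,
    the quantities fed into the linear mixed model step:
    mu = logistic(eta), z = eta + (y - mu) / (mu (1 - mu)), w = mu (1 - mu)."""
    mu = 1.0 / (1.0 + np.exp(-eta))
    w = mu * (1.0 - mu)                  # variance function of the binomial
    z = eta + (y - mu) / w               # linearised response
    return z, w
```

The second step then fits a weighted linear mixed model to `z` with weights `w`, iterating until the linear predictor `eta` converges.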

20.
We analyze the classical investment and pricing problem of a dominant firm faced with competition from substitute industries or marginal firms in the same field. The firm owns a finite level of a resource (e.g. the stock of an exhaustible one), the consumption of which is to be divided optimally over a finite planning horizon. The competitors' measures affect the demand for the resource faced by the dominant firm. Rising crude oil prices and investments in forms of alternative energy are representative examples of the strategic questions that involve competitive and contradictory interests among firms within an industry. The investment and pricing problem can be solved analytically only under strong, simplifying assumptions. To make the analysis simpler and to relax these restrictions, we combine a series of numerical tools, computerize them, and build a user-oriented, computerized decision aid, which we call a ‘computerized approach’. We solve the problem under different sets of theoretical assumptions. This chosen incremental theory building allows us to study the theoretical sensitivity of the original problem.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号