20 similar documents found.
1.
Passive index investing involves investing in a fund that replicates a market index. Enhanced indexation uses the returns of an index as a reference point and aims to outperform that index; the motivation is that the indices and portfolios available to academics and practitioners for asset pricing and benchmarking are generally inefficient and thus susceptible to enhancement. In this paper we propose a novel technique based on the concept of cumulative utility area ratios and the Analytic Hierarchy Process (AHP) to construct enhanced indices from the DJIA and S&P 500. Four main conclusions emerge. First, the technique, called the utility enhanced tracking technique (UETT), is computationally parsimonious and applicable to all return distributions. Second, cardinality constraints, if desired, are simple and computationally cheap to add. Third, the technique requires only infrequent rebalancing, monthly at most. Finally, the UETT portfolios generate consistently higher out-of-sample utility profiles and after-cost returns, both for the fully enhanced portfolios and for those adjusted for cardinality constraints. These results are robust to varying market conditions and a range of utility functions.
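The abstract does not spell out the UETT computation, but its AHP ingredient is standard: priority weights are obtained as the normalized principal eigenvector of a pairwise comparison matrix. A minimal numpy sketch with an illustrative three-criterion matrix (values are hypothetical, not from the paper):

```python
import numpy as np

# A[i, j] = judged importance of criterion i relative to criterion j
# (illustrative values; in UETT the criteria would relate to the
# cumulative utility area ratios, which the abstract does not detail).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()   # AHP priority weights

# Consistency index CI = (lambda_max - n) / (n - 1); values near zero
# indicate nearly consistent pairwise judgements.
n = A.shape[0]
CI = (eigvals.real.max() - n) / (n - 1)
print(weights.round(3), round(CI, 4))
```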
2.
Launching Stock Index Futures Trading: A Practical Choice for the Development of China's Securities Market
As an innovative financial derivative, stock index futures enrich the range of traded instruments, and the two-way trading mechanism they introduce can hedge systematic risk, which supports the stable operation of the capital market. The launch of stock index futures trading, however, must be grounded in a country's specific conditions. Starting from the comparative advantages of stock index futures and taking China's current situation into account, this paper analyzes the introduction of stock index futures trading in China.
3.
We consider the problem of index tracking, whose goal is to construct a portfolio that minimizes the tracking error between the returns of a benchmark index and those of the tracking portfolio. The problem is important in financial economics because the tracking portfolio serves as a parsimonious index that offers a practical means to trade the benchmark, and it has accordingly attracted extensive study from optimization- and machine-learning-based approaches. In this paper, we solve the problem using recent developments in deep learning. Specifically, we associate a deep latent representation of asset returns, obtained through a stacked autoencoder, with the benchmark index's return in order to identify the assets to include in the tracking portfolio. Empirical results indicate that, to improve on previously proposed deep-learning-based index tracking, the deep latent representation needs to be learned in a strictly hierarchical manner and the relationship between the returns of the index and the assets should be quantified by statistical measures. Various deep-learning-based strategies are tested on the S&P 500, FTSE 100 and HSI stock market indices, and our proposed methodology generates the best index tracking performance.
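The deep-learning step is specific to the paper, but the surrounding select-then-weight logic can be sketched. A minimal numpy/scipy illustration on synthetic data, with plain correlation standing in for the paper's statistical measure on deep latent representations:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
T, N, k = 500, 100, 10            # periods, candidate assets, portfolio size
R = rng.normal(0, 0.01, (T, N))   # asset returns (stand-in data)
r_index = R @ np.full(N, 1 / N)   # index return (equal-weight stand-in)

# 1. Score each asset by absolute correlation with the index return.
scores = np.abs([np.corrcoef(R[:, j], r_index)[0, 1] for j in range(N)])
selected = np.argsort(scores)[-k:]          # keep the top-k assets

# 2. Long-only least squares for the tracking weights, renormalized
#    to sum to one (full investment).
w, _ = nnls(R[:, selected], r_index)
w /= w.sum()

tracking_error = np.sqrt(np.mean((R[:, selected] @ w - r_index) ** 2))
print(selected, w.round(3), tracking_error)
```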
4.
This study develops an early warning system for financial crises with a focus on small open economies. We contribute to the literature by developing macro-financial dynamic factor models that extract useful information from a rich but unbalanced mixed-frequency data set covering a range of global and domestic economic and financial indicators. The framework is applied to several Asian countries: Thailand, South Korea, Singapore, Malaysia, the Philippines and Indonesia. Logit regression models that use the extracted factors together with other leading indicators have significant power in predicting systemic events. In-sample and out-of-sample test results indicate that the extracted factors improve predictive power over a model that uses only indicators with sufficiently long histories. Importantly, models that include the dynamic factors yield consistently better out-of-sample crisis predictions on key performance measures such as a usefulness index, the noise-to-signal ratio, and the AUROC.
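The two-stage design (factor extraction, then a logit crisis model scored by AUROC) can be sketched in a few lines of sklearn. Plain PCA stands in here for the paper's mixed-frequency dynamic factor model, and all data are synthetic:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
T, N = 300, 40                      # months, macro-financial indicators
X = rng.normal(size=(T, N))         # standardized indicator panel (stand-in)
crisis = (rng.random(T) < 0.1).astype(int)    # binary systemic-event flag

factors = PCA(n_components=3).fit_transform(X)   # extracted common factors

train, test = slice(0, 200), slice(200, T)       # simple out-of-sample split
model = LogisticRegression().fit(factors[train], crisis[train])
prob = model.predict_proba(factors[test])[:, 1]
print("out-of-sample AUROC:", roc_auc_score(crisis[test], prob))
```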
5.
6.
When a portfolio consists of a large number of assets, it generally incorporates too many small and illiquid positions and requires a large amount of rebalancing, which can involve substantial transaction costs. For financial index tracking, it is desirable to avoid such atomized, unstable portfolios, which are difficult to realize and manage. A natural way to achieve this is to build a tracking portfolio that is sparse, holding only a small number of assets. The cardinality constraint approach, which directly restricts the number of assets held in the tracking portfolio, is a natural idea, but it requires pre-specifying the maximum number of assets selected, which is rarely practicable; moreover, the cardinality-constrained optimization problem is NP-hard, so solving it is computationally expensive, especially in high-dimensional settings. Motivated by this, the paper employs a regularization approach based on the adaptive elastic-net (Aenet) model for high-dimensional index tracking. The proposed method represents a family of convex regularization methods that nests the traditional Lasso, the adaptive Lasso (Alasso), and the elastic-net (Enet) as special cases. To make the formulation more practical and general, we also take the full-investment condition and turnover restrictions (or transaction costs) into account. An efficient algorithm based on coordinate descent with closed-form updates is derived to tackle the resulting optimization problem. Empirical results show that the proposed method is computationally efficient and has competitive out-of-sample performance, especially in high-dimensional settings.
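For reference, a generic adaptive elastic-net tracking objective consistent with the abstract (our notation; the paper additionally handles turnover restrictions) is

$$
\min_{w}\ \lVert r^{I} - Rw \rVert_2^2 \;+\; \lambda_1 \sum_{j=1}^{N} \hat{v}_j \lvert w_j \rvert \;+\; \lambda_2 \lVert w \rVert_2^2
\quad \text{s.t.}\ \mathbf{1}^{\top} w = 1,
$$

where $r^{I}$ is the index return vector, $R$ the asset return matrix, and $\hat{v}_j$ are adaptive weights built from a pilot fit. Setting $\lambda_2 = 0$ recovers the adaptive Lasso, $\hat{v}_j \equiv 1$ the elastic-net, and both together the Lasso, matching the nesting described above.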
7.
The concept of aggregate financing to the real economy (the social financing scale) was proposed in China in 2010 and has been continuously refined since; in February 2014 the central bank published statistics on regional social financing for the first time. As a monetary policy intermediate indicator, the social financing scale is more comprehensive than new loans and can reveal problems that RMB lending figures alone cannot, but its component structure has long been debated. This paper conducts an index-based study of China's social financing data from January 2002 to December 2013, derives weights for the component indicators, and constructs a social financing index so that the measure can be used more effectively in research and analysis.
8.
9.
We study out-of-sample predictability of the equity premium by extracting the information contained in a large number of macroeconomic predictors via large-dimensional factor models. We compare the well-known factor model with a static representation of the common components to the Generalized Dynamic Factor Model, which accounts for time-series dependence in the common components. Using statistical and economic evaluation criteria, we show empirically that the Generalized Dynamic Factor Model helps predict the equity premium. Exploiting the link between the business cycle and return predictability, we also obtain accurate predictions by combining rolling and recursive forecasts in real time.
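Schematically (our notation, a sketch of the standard formulations), the static representation loads only contemporaneous factors, $x_t = \Lambda F_t + \xi_t$, whereas the Generalized Dynamic Factor Model lets each series load current and lagged common shocks,

$$
x_{it} = b_i(L)\, u_t + \xi_{it},
$$

with the estimated common component then entering a predictive regression for the equity premium, $r_{t+1} = \alpha + \beta^{\top} \hat{F}_t + \varepsilon_{t+1}$.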
10.
In this article, we examine short-term persistence in mutual fund performance in the main European markets between January 1990 and December 2022. The mutual fund industry in Europe has grown significantly in recent years as a consequence of the integration of its markets, yet it remains an underexplored area of research, with only a small number of significant studies compared to the US industry. Using a sample of daily, survivorship-bias-free data on the five most important European mutual fund countries, comprising 2734 mutual funds in total, we find statistically significant persistence in the post-ranking quarter across different performance models for all countries. This evidence is present across all deciles, including the top-decile and bottom-decile funds. Finally, we extend our analysis to high-inflation periods.
11.
Gabriela De Raaij, European Journal of Finance, 2013, 19(2): 151-166
Density forecasts have become important in finance and play a key role in modern risk management. Using a flexible density forecast evaluation framework that extends the Berkowitz likelihood ratio test, this paper evaluates in-sample and out-of-sample density forecasts of daily returns on the DAX, ATX and S&P 500 stock market indices from models of financial returns currently in wide use in the financial industry. The results indicate that GARCH-t models produce good in-sample forecasts, while no model considered in this study delivers fully acceptable out-of-sample forecasts. The empirical findings emphasize that proper distributional assumptions, combined with an adequate specification of the relevant conditional higher moments, are necessary to obtain good density forecasts.
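For context, the Berkowitz framework transforms each realization by the forecast CDF and maps it to a normal score, $z_t = \Phi^{-1}(\hat{F}_t(y_t))$; under a correct density forecast the $z_t$ are i.i.d. $N(0,1)$. The test fits

$$
z_t - \mu = \rho\,(z_{t-1} - \mu) + \varepsilon_t, \qquad \varepsilon_t \sim N(0, \sigma^2),
$$

and rejects via a likelihood ratio statistic for the joint restriction $\mu = 0$, $\sigma^2 = 1$, $\rho = 0$, asymptotically $\chi^2(3)$.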
12.
Dementia is a leading cause of death in the developed world. A number of modifiable risk factors for dementia have been identified, yet lay knowledge of dementia risk is limited. A mental models approach was used to compare lay and expert knowledge of risk in order to identify target areas for lay education; this method assumes that experts and laypeople use different cognitive representations to make sense of phenomena. Semi-structured interviews were conducted with 8 experts and 15 laypeople to construct mental models of dementia risk. Interviews were transcribed verbatim and entered into NVivo qualitative data analysis software for coding. Findings indicated that laypeople had some knowledge of modifiable risk factors but, in contrast to the expert mental model, poorly understood the causal links between these factors and dementia risk. Lay participants were unsure of the interaction between modifiable and non-modifiable risk factors and primarily attributed dementia risk to ‘bad luck’. It is suggested that future dementia education could benefit from building upon a general appreciation of healthy behaviour, with particular focus on explaining the causal pathways to dementia risk. Additionally, it may be productive to inform laypeople that it is possible to ‘change one’s luck’ by engaging in protective behaviours.
13.
Some lay people confronted with a new base station project fear serious health consequences from the high-frequency radiation, while experts consider exposure within the current international standards unproblematic. These conflicting assessments may be attributed to the different mental models of lay people and experts. Little is known about lay people’s knowledge of mobile communication and their intuitive understanding of the associated health risks. An adaptation of the ‘Mental Models Approach’ was used to reveal lay people’s beliefs about mobile communication and to learn more about their information requirements, potential knowledge gaps, and misconceptions. Through open interviews with Swiss experts (N = 16), lay people (N = 16), and base station opponents (N = 15), different mental models were constructed and evaluated. Comparisons between the expert and lay groups showed several qualitative differences in all identified knowledge domains. Both lay groups showed knowledge gaps regarding the changing exposure magnitudes that arise from the interaction of cell phones and base stations, as well as misconceptions about regulation issues and scientific processes. In addition, lack of trust in the responsible actors and dissatisfaction with base station siting processes were mentioned. These qualitative insights may be useful for improving future risk communication tools.
14.
The pilot program for establishing bank-affiliated fund management companies in China has now entered the substantive operational stage. Allowing banks to establish or invest in fund management companies helps them improve their trust and wealth-management business and adds new sources of profit, but it also places higher demands on commercial banks’ risk management; controlling cross-market risk, in particular, is the key to a commercial bank running a fund management company well. Under current conditions, the main cross-market risks facing Chinese commercial banks that open fund management companies are guarantee risk, liquidity risk, and interest rate competition risk. Measures must therefore be taken to address and guard against these risks.
15.
Kingsley Y. L. Fong, David R. Gallagher, Peter A. Gardner, Peter L. Swan, Accounting & Finance, 2011, 51(3): 684-710
When fund managers trade sequentially in the same direction, the information confirmation hypothesis predicts that the long-term profitability of the leader trade increases in the number of subsequent trades, whereas the information cascade hypothesis predicts a non-positive relationship. Using active equity funds’ daily trading data, we document a transition from information confirmation to information cascades as the number of followers increases. We find that highly disguised multiple-broker packages exhibit higher market impact and higher long-term returns and are associated with fewer followers. Our study also documents that lead fund managers face portfolio risk constraints in trading on private information.
16.
In commercial banking, various statistical models for corporate credit rating have been theoretically promoted and applied to bank-specific credit portfolios. In this paper, we empirically compare and test the performance of a wide range of parametric and nonparametric credit rating model approaches in a statistically coherent way, based on a ‘real-world’ data set. We repetitively (k times) split a large sample of industrial firms’ default data into disjoint training and validation subsamples. For all model types, we estimate k out-of-sample discriminatory power measures, allowing us to compare the models coherently. We observe that more complex and nonparametric approaches, such as random forests, neural networks, and generalized additive models, perform best in-sample. However, comparing the k out-of-sample cross-validation results, these models overfit and lose some of their predictive power. Rather than improving discriminatory power, we see their major contribution as their usefulness as diagnostic tools for selecting rating factors and developing simpler, parametric models.
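The validation design (k repeated disjoint training/validation splits, with out-of-sample discriminatory power compared across model classes) can be sketched in a few lines of sklearn; the data here are synthetic and AUC stands in for the paper's discriminatory power measures:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 8))     # firm-level rating factors (stand-in)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 1.5).astype(int)

models = {"logit": LogisticRegression(max_iter=1000),
          "random forest": RandomForestClassifier(n_estimators=200)}
k = 10                             # number of repeated splits
for name, model in models.items():
    aucs = []
    for i in range(k):
        X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3,
                                                  random_state=i)
        p = model.fit(X_tr, y_tr).predict_proba(X_va)[:, 1]
        aucs.append(roc_auc_score(y_va, p))
    print(f"{name}: mean out-of-sample AUC = {np.mean(aucs):.3f}")
```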
17.
Approximate factor models and their extensions are widely used in economic analysis and forecasting because of their ability to extract useful information from a large number of relevant variables. In these models, candidate predictors are typically subject to some common components. In this paper we propose a new method for robustly estimating approximate factor models and use it in risk assessment. We consider a class of approximate factor models in which the candidate predictors are additionally subject to idiosyncratic, large, uncommon components such as jumps or outliers. By assuming that occurrences of the uncommon components are rare, we develop an estimation procedure that simultaneously disentangles and estimates the common and uncommon components. We then use the proposed method to investigate whether risks from the latent factors are priced in the expected returns of the Fama and French 100 size and book-to-market portfolios. We find that while the risk from the common factor is priced for the 100 portfolios, the risks from the idiosyncratic factors are not. However, model uncertainty risks of the idiosyncratic factors are priced, suggesting that effective diversification can reduce only the predictable idiosyncratic risks, while the unpredictable ones may remain. We also illustrate how the proposed method can be applied to evaluating value at risk (VaR) and find that it delivers results comparable to conventional VaR evaluation methods.
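A schematic form consistent with the abstract (our notation; the paper's exact specification may differ) is

$$
x_t = \Lambda f_t + s_t + e_t,
$$

where $\Lambda f_t$ is the common component, $e_t$ ordinary idiosyncratic noise, and $s_t$ a vector of large but rare uncommon components (jumps or outliers); it is the assumed rarity of $s_t$ that allows the common and uncommon parts to be disentangled.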
18.
This paper presents a new framework to model and calibrate the process of firm value evolution when an unanticipated exogenous event impacting one firm can contagiously affect other firms. How such contagion propagates is determined by the underlying connections between firms, which can adversely affect the tail risks of firm value and hence of the securities the firm issues. The paper combines insights from existing firm-value models and historical events into a structural model of contagion flow among firms using a network-based approach. Rather than using stylized networks, we develop a data-driven approach to network construction, defining and calibrating several contagion variables to model the spread of contagion. The framework is applied to assessing firm-level risk under downside risk measures. Using actual data, our model illustrates how connections between firms can lead to the heavy-tailed default distributions and default clustering observed in practice.
19.
Our paper contributes to the nascent literature on forecasting the Chinese macroeconomy in a data-rich environment. We perform a horse race among a large set of traditional models and two classes of factor models, using 251 monthly and 34 quarterly macroeconomic variables over the period from January 2002 to June 2018. We find evidence that mixed-frequency factor models provide superior forecasts of the CPI, RPI, investment, and consumption relative to the simple benchmark. During the Global Financial Crisis, however, we find little evidence that the factor models outperform the simple benchmark AR(p) model.
20.
Under the Solvency II directive, each insurance company may determine its solvency capital requirements using its own tailor-made internal model, which highlights the need for fast numerical tools based on practically oriented mathematical models. From the Solvency II perspective, a discrete-time framework seems the most relevant. In this paper, we propose a number of fast and accurate approximations of ruin probabilities involving an integral operator and examine them along strictly theoretical as well as numerical lines. For a few claim distributions the approximations are shown to be exact. In general, we prove that they converge at an exponential rate to the exact ruin probabilities without any restrictive assumptions on the claim distribution. A fast algorithm to approximate ruin probabilities by a numerical fixed point of the involved integral operator is given. As an application, ruin probabilities for, e.g., normally and Weibull-distributed claims are computed. Comparisons with discrete-time counterparts of some continuous-time approximation methods are also carried out. Numerical studies show that our approximations are precise for both small and large values of the initial surplus u. In contrast, the empirical De Vylder-type approximations depend strongly on the claim distribution and are less precise for small and medium values of u.
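In a standard discrete-time risk model (our notation, a sketch consistent with the abstract: initial surplus $u$, premium $c$ per period, i.i.d. claims with distribution $F$), the ruin probability $\psi$ is a fixed point of an integral operator obtained by conditioning on the first period,

$$
\psi(u) = 1 - F(u + c) + \int_{0}^{u+c} \psi(u + c - x)\, dF(x),
$$

and iterating this operator on a grid of $u$ values yields the kind of fast numerical approximation the paper describes.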