Similar Literature
20 similar documents found.
1.
We consider an extension of the Markowitz mean–variance optimization framework to multiple return and risk scenarios. It is well known that asset return forecasts and risk estimates are inherently inaccurate. The proposed method provides a means of considering rival representations of the future. The optimal portfolio is computed simultaneously with the worst case so as to take account of all rival scenarios. This is a min–max strategy which is essentially equivalent to a robust pooling of the scenarios. Robustness is ensured by the noninferiority of min–max: for example, a basic worst-case optimal return is guaranteed in view of multiple return scenarios. If robustness turns out to be too costly, the min–max pooling can also guide the exploration of other pooling alternatives. A min–max algorithm is used to solve the problem and to illustrate the robust character of min–max with return and risk scenarios. We study the properties of the min–max risk–return frontier and compare it with the potentially suboptimal case in which the investment strategy and the worst case are computed separately.
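A minimal sketch of the min–max idea described above, not the authors' algorithm: the weights are chosen by direct numerical minimization of the worst case, over rival scenarios, of a mean–variance objective. The scenario means, covariances and the risk-aversion parameter `lam` are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative rival scenarios: (mean vector, covariance matrix) pairs.
scenarios = [
    (np.array([0.08, 0.05, 0.03]), np.diag([0.04, 0.02, 0.01])),
    (np.array([0.02, 0.06, 0.04]), np.diag([0.09, 0.03, 0.01])),
]
lam = 3.0  # assumed risk-aversion weight

def objective(w, mu, cov):
    # Mean-variance objective to be minimised: risk penalty minus expected return.
    return lam * w @ cov @ w - mu @ w

def worst_case(w):
    # Worst (largest) objective value over all rival scenarios.
    return max(objective(w, mu, cov) for mu, cov in scenarios)

n_assets = 3
constraints = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * n_assets
res = minimize(worst_case, np.full(n_assets, 1.0 / n_assets),
               bounds=bounds, constraints=constraints)
print("min-max weights:", res.x.round(3))
```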

2.
Data weaknesses (such as collinearity) reduce the quality of least-squares estimates by inflating parameter variances. Standard regression diagnostics and statistical tests of hypotheses are unable to indicate such variance inflation and hence cannot detect data weaknesses. In this paper we therefore consider a different means of determining the presence of weak data, based on a test for signal-to-noise in which the size of the parameter variance (noise) is assessed relative to the magnitude of the parameter (signal). This test is combined with other collinearity diagnostics to provide a test for the presence of harmful collinearity and/or short data. The entire procedure is illustrated with an equation from the Michigan Quarterly Econometric Model. Tables of critical values for the test are provided in an appendix.
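A hedged sketch of the signal-to-noise idea alongside a standard collinearity diagnostic (variance inflation factors). This is not the paper's exact test; the simulated data and any thresholds one might apply to the ratios are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)              # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

# OLS estimates and their standard errors.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

# "Signal" (parameter magnitude) relative to "noise" (its standard error).
signal_to_noise = np.abs(beta) / se

def vif(X, j):
    # Variance inflation factor of column j: regress it on the remaining columns.
    others = np.delete(X, j, axis=1)
    b, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    ss_res = np.sum((X[:, j] - others @ b) ** 2)
    ss_tot = np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return ss_tot / ss_res                        # = 1 / (1 - R^2)

print("signal-to-noise:", signal_to_noise.round(2))
print("VIFs:", [round(vif(X, j), 1) for j in (1, 2)])
```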

3.
The concepts of robust parameter design (RPD) are based on the philosophy of Genichi Taguchi, who introduced them after several years of research. Robust engineering (RE), also referred to as Taguchi Methods (TM), evolved systematically from the 1950s onward and aims to provide industry with a cost-effective methodology for enhancing its competitive position in the global market. In this study, RE methods have been used to improve the quality of oil pump housing production at the Iranian diesel engine manufacturing (IDEM) Company, where an improvement of 134 over the current condition was achieved. Notably, this result was obtained under a wide range of production and operating conditions, while increasing production capacity and without requiring any additional investment in factors of production.

4.
The latest development in the asset pricing literature is the emergence of empirical asset pricing models comprising q-factors (profitability and investment factors) in conjunction with other factors. However, as with the older empirical models, there is scepticism regarding the application of these newer factor models because of the debate surrounding their explanatory power. This review attempts to synthesize studies pertaining to four alternative explanations of the asset pricing models comprising the q-factors (profitability and investment): the data-snooping hypothesis, the risk-based explanation, the irrational investor behaviour explanation, and the interpretation that the combination of the risk-free asset and the factors comprising the model spans the mean–variance efficient tangency portfolio that prices the universe of assets.

5.
Multicollinearity is one of the most important issues in regression analysis, as it produces unstable coefficient estimates and severely inflates the standard errors. Regression theory is based on specific assumptions concerning the set of error random variables. In particular, when the errors are uncorrelated and have constant variance, the ordinary least squares estimator produces the best estimates among all linear estimators. If, as often happens in practice, these assumptions are not met, other methods may give more efficient estimates and their use is therefore recommended. In this paper, after reviewing and briefly describing the salient features of the methods proposed in the literature to detect and address the multicollinearity problem, we introduce the Lpmin method, based on Lp-norm estimation, an adaptive robust procedure that is used when the residual distribution deviates from normality. The major advantage of this approach is that it produces more efficient estimates of the model parameters, for different degrees of multicollinearity, than those generated by the ordinary least squares method. A simulation study and a real-data application are also presented to show the better results provided by the Lpmin method in the presence of multicollinearity.
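A minimal sketch of Lp-norm regression under heavy-tailed errors and collinear regressors, minimizing the sum of absolute residuals raised to a power p. The fixed choice p = 1.5 and the simulated data are assumptions for illustration; the adaptive selection of p that defines the Lpmin procedure is not implemented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)        # strongly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.standard_t(df=3, size=n)   # heavy-tailed errors

def lp_loss(beta, p):
    # Sum of absolute residuals raised to the power p.
    return np.sum(np.abs(y - X @ beta) ** p)

beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
p = 1.5                                            # illustrative; Lpmin would choose p adaptively
res = minimize(lp_loss, beta_ols, args=(p,), method="Nelder-Mead")
print("OLS estimate:", beta_ols.round(3))
print(f"Lp estimate (p={p}):", res.x.round(3))
```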

6.
The process capability index $C_{pm}$, which considers the process variance and departure of the process mean from the target value, is important in the manufacturing industry to measure process potential and performance. This paper extends its applications to calculate the process capability index $\tilde{C}_{pm}$ of fuzzy numbers. In this paper, the α-cuts of fuzzy observations are first derived based on various values of α. The membership function of the fuzzy process capability index $\tilde{C}_{pm}$ is then constructed based on the α-cuts of fuzzy observations. An example is presented to demonstrate how the fuzzy process capability index $\tilde{C}_{pm}$ is interpreted. When the quality characteristic cannot be precisely determined, the proposed method provides the most possible value and spread of the fuzzy process capability index $\tilde{C}_{pm}$. With crisp data, the proposed method reduces to the classical method of the process capability index $C_{pm}$.
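A hedged numerical sketch using the standard crisp definition $C_{pm} = (USL - LSL)/(6\sqrt{\sigma^2 + (\mu - T)^2})$. The specification limits, target, data and the brute-force approximation of an α-cut of the fuzzy index (by sampling observation vectors inside the observations' α-cuts) are illustrative assumptions, not the paper's analytical membership-function construction.

```python
import numpy as np

USL, LSL, T = 10.0, 4.0, 7.0         # specification limits and target (illustrative)
x = np.array([6.8, 7.2, 7.1, 6.9, 7.4, 6.6, 7.0, 7.3])   # centres of the fuzzy observations
delta = 0.2                           # assumed spread of symmetric triangular fuzzy numbers

def cpm(data):
    # Crisp index: (USL - LSL) / (6 * sqrt(variance + (mean - target)^2)).
    mu, var = data.mean(), data.var(ddof=1)
    return (USL - LSL) / (6.0 * np.sqrt(var + (mu - T) ** 2))

alpha = 0.5
half_width = (1.0 - alpha) * delta    # alpha-cut of observation i: [x_i - hw, x_i + hw]

# Crude approximation of the alpha-cut of the fuzzy index: sample observation
# vectors inside the observations' alpha-cuts and record the range of C_pm.
rng = np.random.default_rng(0)
samples = x + rng.uniform(-half_width, half_width, size=(20000, x.size))
values = np.apply_along_axis(cpm, 1, samples)
print("crisp C_pm:", round(cpm(x), 3))
print(f"approx. alpha={alpha} cut of fuzzy C_pm: [{values.min():.3f}, {values.max():.3f}]")
```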

7.
This paper develops a new approach to variance trading. We show that the discretely sampled realized variance can be robustly replicated under very general conditions, including when the price can jump. The replication strategy specifies the exact timing for rebalancing in the underlying; deviations from the optimal schedule can lead to surprisingly large hedging errors. In the empirical application, we synthesize the prices of the variance contract on the S&P 500 index over the period from 01/1990 to 12/2009. We find that market variance risk is priced, and that its risk premium is negative and economically very large. The variance risk premium cannot be explained by the known risk factors and option returns.
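A minimal sketch of the payoff quantity involved: the discretely sampled realized variance, computed here as an annualized sum of squared log returns. The replication strategy, its rebalancing schedule and the pricing of the variance contract are not implemented; the price series and annualization factor are illustrative.

```python
import numpy as np

def realized_variance(prices, periods_per_year=252):
    # Discretely sampled realized variance: annualized sum of squared log returns.
    log_returns = np.diff(np.log(np.asarray(prices, dtype=float)))
    return periods_per_year / len(log_returns) * np.sum(log_returns ** 2)

# Illustrative daily closing prices of the underlying index.
prices = [100.0, 101.2, 100.5, 99.8, 102.0, 101.1]
print(realized_variance(prices))
```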

8.
In dynamic panel regression, when the variance ratio of individual effects to the disturbance is large, the system-GMM estimator has large asymptotic variance and poor finite-sample performance. To deal with this variance ratio problem, we propose a residual-based instrumental variables (RIV) estimator, which uses the residual from regressing Δy_{i,t-1} on … as the instrument for the level equation. The proposed RIV estimator is consistent and asymptotically normal under general assumptions. More importantly, its asymptotic variance is almost unaffected by the variance ratio of individual effects to the disturbance. Monte Carlo simulations show that the RIV estimator has better finite-sample performance than alternative estimators. The RIV estimator generates less finite-sample bias than the difference-GMM, system-GMM, collapsing-GMM and Level-IV estimators in most cases. Under RIV estimation, the variance ratio problem is well controlled, and the empirical distribution of its t-statistic is close to the standard normal distribution for moderate sample sizes.

9.
Research on Integrated Design Principles and Methods Based on QFD, TRIZ and the Taguchi Method
The design quality of a product determines its inherent quality. Quality function deployment (QFD) is customer-driven: it establishes the links between customer requirements and design requirements and identifies the key factors affecting product quality. The theory of inventive problem solving (TRIZ) is a theory devoted specifically to innovation and conceptual design, which obtains domain-specific solutions through general-purpose tools. The Taguchi method can translate a design concept into a set of practically feasible design parameters. This article compares and analyses these three design methods and discusses their integrated application.

10.
The objective of this paper is to produce distributional forecasts of asset price volatility and its associated risk premia using a non-linear state space approach. Option and spot market information on the latent variance process is captured by using dual 'model-free' variance measures to define a bivariate observation equation in the state space model. The premium for variance diffusive risk is defined as linear in the latent variance (in the usual fashion), whilst the premium for variance jump risk is specified as a conditionally deterministic dynamic process driven by a function of past measurements. The inferential approach adopted is Bayesian, implemented via a Markov chain Monte Carlo algorithm that caters for the multiple sources of non-linearity in the model and for the bivariate measure. The method is applied to spot and option price data on the S&P 500 index from 1999 to 2008, with conclusions drawn about investors' required compensation for variance risk during the recent financial turmoil. The accuracy of the probabilistic forecasts of the observable variance measures is demonstrated and compared with that of forecasts yielded by alternative methods. To illustrate the benefits of the approach, it is used to produce forecasts of prices of derivatives on volatility itself. In addition, the posterior distribution is augmented with information on daily returns to produce value-at-risk predictions. Linking the variance risk premia to the risk aversion parameter in a representative agent model, probabilistic forecasts of (approximate) relative risk aversion are also produced.

11.
The etiology of many human diseases is complex and very likely involves a combination of genetic and environmental risk factors. A popular strategy for detecting genetic risk factors is to perform a systematic screening of the genome in search of linkage. The power of such an approach depends very much on the unknown characteristics of the genetic factors, and the main difficulty is to establish a good trade-off between false positives and false negatives. Moreover, a precise localisation of the risk factor will generally not be obtained, so a candidate gene strategy must be set up to go further in identifying genetic factors. It is likely that for multifactorial diseases the only genetic risk factors that can be detected are those with a fairly strong effect. Even in that case, it is important to design strategies that increase the power of detection and provide a better evaluation of the associated risks.

12.
This paper discusses pooling versus model selection for nowcasting with large datasets in the presence of model uncertainty. In practice, nowcasting a low-frequency variable with a large number of high-frequency indicators should account for at least two data irregularities: (i) unbalanced data with missing observations at the end of the sample due to publication delays; and (ii) different sampling frequencies of the data. Two model classes suited to this context are factor models based on large datasets and mixed-data sampling (MIDAS) regressions with few predictors. The specification of these models requires several choices related to, amongst other things, the factor estimation method, the number of factors, the lag length and the indicator selection. There are thus many sources of misspecification when selecting a particular model, and an alternative would be pooling over a large set of different model specifications. We evaluate the relative performance of pooling and model selection for nowcasting quarterly GDP for six large industrialized countries. We find that the nowcast performance of single models varies considerably over time, in line with the forecasting literature. Model selection based on sequential application of information criteria can outperform benchmarks, but the results depend strongly on the selection method chosen. In contrast, pooling of nowcast models provides very stable nowcast performance over time.
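A hedged toy contrast between the two approaches discussed above: pick the single specification with the best information criterion, or pool (here, equal-weight average) the nowcasts from all candidate specifications. The synthetic indicators, the BIC-based selection and the tiny specification set are assumptions, not the paper's factor or MIDAS setup.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 80
indicators = rng.normal(size=(T, 4))                      # candidate indicators (illustrative)
y = indicators[:, :2] @ np.array([0.6, 0.3]) + 0.5 * rng.normal(size=T)   # target variable

def fit_and_nowcast(cols):
    # Estimate on the first T-1 periods, then nowcast period T from its indicator values.
    X = np.column_stack([np.ones(T - 1), indicators[:-1, cols]])
    beta, *_ = np.linalg.lstsq(X, y[:-1], rcond=None)
    resid = y[:-1] - X @ beta
    k = X.shape[1]
    bic = (T - 1) * np.log(resid @ resid / (T - 1)) + k * np.log(T - 1)
    nowcast = np.r_[1.0, indicators[-1, cols]] @ beta
    return bic, nowcast

specs = [[0], [1], [0, 1], [0, 1, 2], [0, 1, 2, 3]]        # candidate indicator sets
results = [fit_and_nowcast(c) for c in specs]
selected = min(results, key=lambda r: r[0])[1]             # model selection by BIC
pooled = np.mean([r[1] for r in results])                  # equal-weight pooling of nowcasts
print("selected-model nowcast:", round(selected, 3), " pooled nowcast:", round(pooled, 3))
```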

13.
We explicitly compute closed-form formulas for the minimal-variance hedging strategy, in discrete time, of a European option, and for the variance of the corresponding hedging error, under the hypothesis that the underlying asset is a martingale following a geometric Brownian motion. The formulas are easy to implement, so the optimal hedge ratio can be employed as a valid substitute for the standard Black–Scholes delta, and knowledge of the variance of the total error is a useful tool for measuring and managing the hedging risk.
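A hedged Monte Carlo sketch, not the paper's closed formulas: for a single hedging period of a driftless (martingale) geometric Brownian motion, the minimal-variance hedge ratio is estimated as Cov(ΔC, ΔS)/Var(ΔS) and compared with the Black–Scholes delta. The option parameters and the weekly rebalancing interval are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, sigma, r=0.0):
    # Black-Scholes price of a European call.
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

S0, K, sigma, T, dt = 100.0, 100.0, 0.2, 0.25, 1.0 / 52.0   # weekly rebalancing (illustrative)
rng = np.random.default_rng(3)
z = rng.normal(size=200_000)

# One hedging period of a driftless geometric Brownian motion (martingale underlying).
S1 = S0 * np.exp(-0.5 * sigma ** 2 * dt + sigma * np.sqrt(dt) * z)
dS = S1 - S0
dC = bs_call(S1, K, T - dt, sigma) - bs_call(S0, K, T, sigma)

cm = np.cov(dC, dS)
mv_hedge_ratio = cm[0, 1] / cm[1, 1]                        # Cov(dC, dS) / Var(dS)
bs_delta = norm.cdf((np.log(S0 / K) + 0.5 * sigma ** 2 * T) / (sigma * np.sqrt(T)))
print("minimal-variance hedge ratio:", round(mv_hedge_ratio, 4), " BS delta:", round(bs_delta, 4))
```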

14.
Introduction: On-line quality engineering, also known as quality engineering at the manufacturing stage, is a family of theories and methods proposed by the renowned Japanese quality management expert Dr. Genichi Taguchi for process control and product control on the shop floor. Its core is the quality loss function based on Taguchi's view of quality.
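A minimal sketch of the quadratic quality loss function at the core of this approach, L(y) = k (y - m)^2, where m is the target and k a loss coefficient. The target, tolerance and loss-at-tolerance values below are illustrative assumptions.

```python
def taguchi_loss(y, target, k):
    # Quadratic quality loss: L(y) = k * (y - target)^2.
    return k * (y - target) ** 2

# Illustrative calibration: a +/-0.5 mm tolerance with a loss of 10 units at the
# tolerance limit gives k = 10 / 0.5**2 = 40 loss units per mm^2.
k = 10.0 / 0.5 ** 2
for y in (10.0, 10.2, 10.5):
    print(y, taguchi_loss(y, target=10.0, k=k))
```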

15.
Lin Lin, Wu Chengfeng. 《价值工程》 (Value Engineering), 2007, 26(7): 96-98.
This paper introduces a dual response surface modeling (DRSM) approach in which, through an appropriately designed experiment, the experimental data are used to approximate a mean model and a variance model of a product performance characteristic. Considering the Taguchi quality loss function model together with the tolerance-cost function model, an SEA model design method based on the Taguchi quality loss function is proposed, which can improve the robustness of the quality system and the efficiency of its optimization. Finally, the effectiveness of the method is demonstrated with an example.
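A hedged sketch of the dual response surface idea described above: fit separate first-order regression models to the replicate means and log-variances observed at each design point. The 2^2 factorial design, the replicate data and the choice of first-order models are illustrative assumptions; the Taguchi loss and tolerance-cost combination (the SEA step) is not implemented.

```python
import numpy as np

# Replicated 2^2 factorial design in coded levels (illustrative data).
design = np.array([[-1.0, -1.0], [-1.0, 1.0], [1.0, -1.0], [1.0, 1.0]])
replicates = np.array([
    [10.1,  9.8, 10.0],
    [12.2, 12.5, 12.0],
    [ 9.0,  9.4,  9.2],
    [13.1, 12.7, 13.3],
])

y_mean = replicates.mean(axis=1)                       # response surface target: mean
y_logvar = np.log(replicates.var(axis=1, ddof=1))      # response surface target: log-variance

# First-order response surfaces fitted separately to the mean and the log-variance.
X = np.column_stack([np.ones(len(design)), design])
beta_mean, *_ = np.linalg.lstsq(X, y_mean, rcond=None)
beta_logvar, *_ = np.linalg.lstsq(X, y_logvar, rcond=None)
print("mean model coefficients:        ", beta_mean.round(3))
print("log-variance model coefficients:", beta_logvar.round(3))
```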

16.
The role of uniformity measured by the centered L2-discrepancy (Hickernell, 1998a) has been studied in fractional factorial designs. The issue of a lower bound for the centered L2-discrepancy is crucial in the construction of uniform designs. Fang and Mukerjee (2000) and Fang et al. (2002, 2003b) derived lower bounds for fractions of two- and three-level factorials. In this paper we report some new lower bounds on the centered L2-discrepancy for a set of asymmetric fractional factorials. Using these lower bounds helps to measure the uniformity of a given design. In addition, as an application of these lower bounds, we propose a method to construct uniform designs or nearly uniform designs with asymmetric factorials.
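A hedged sketch of the centered L2-discrepancy itself, using Hickernell's standard analytical formula for points in the unit cube; the lower bounds derived in the paper are not computed. The example design and its scaling of factor levels to [0, 1] are illustrative assumptions.

```python
import numpy as np

def centered_l2_discrepancy(X):
    # Centered L2-discrepancy (Hickernell, 1998) of n points in [0, 1]^s.
    X = np.asarray(X, dtype=float)
    n, s = X.shape
    d = np.abs(X - 0.5)
    term1 = (13.0 / 12.0) ** s
    term2 = (2.0 / n) * np.sum(np.prod(1.0 + 0.5 * d - 0.5 * d ** 2, axis=1))
    prod = np.ones((n, n))
    for k in range(s):
        dk = d[:, k]
        xk = X[:, k]
        prod *= (1.0 + 0.5 * dk[:, None] + 0.5 * dk[None, :]
                 - 0.5 * np.abs(xk[:, None] - xk[None, :]))
    term3 = np.sum(prod) / n ** 2
    return np.sqrt(term1 - term2 + term3)

# Illustrative three-level fraction in two factors, levels 0..2 mapped to (level + 0.5) / 3.
design = (np.array([[0, 0], [1, 2], [2, 1], [0, 2], [1, 1], [2, 0]]) + 0.5) / 3.0
print(centered_l2_discrepancy(design))
```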

17.
In situations where a regression model is subject to one or more breaks, it is shown that it can be optimal to use pre-break data to estimate the parameters of the model used to compute out-of-sample forecasts. The issue of how best to exploit the trade-off that might exist between bias and forecast error variance is explored and illustrated for the multivariate regression model under the assumption of strictly exogenous regressors. In practice, when this assumption cannot be maintained and both the timing and the size of the breaks are unknown, the optimal choice of the observation window is subject to further uncertainties that make exploiting the bias–variance trade-off difficult. To that end, we propose a new set of cross-validation methods for the selection of a single estimation window, and weighting or pooling methods for the combination of forecasts based on estimation windows of different lengths. Monte Carlo simulations are used to show when these procedures work well compared with methods that ignore the presence of breaks.
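A hedged toy sketch of choosing an estimation window by pseudo-out-of-sample cross-validation: for each candidate window length, the model is re-estimated on that window at every evaluation date and scored by one-step-ahead squared errors. The simulated break, the univariate through-the-origin regression and the window grid are illustrative assumptions, not the paper's specific procedures.

```python
import numpy as np

rng = np.random.default_rng(4)
T, break_point = 200, 120
x = rng.normal(size=T)
beta = np.where(np.arange(T) < break_point, 1.0, 2.5)   # structural break in the slope
y = beta * x + rng.normal(size=T)

holdout = range(160, T)                                  # pseudo-out-of-sample evaluation dates
window_grid = [20, 40, 60, 80, 120, 159]                 # candidate estimation window lengths

def cv_score(w):
    errors = []
    for t in holdout:
        xw, yw = x[t - w:t], y[t - w:t]
        b = (xw @ yw) / (xw @ xw)                        # OLS through the origin on the window
        errors.append(y[t] - b * x[t])
    return np.mean(np.square(errors))

scores = {w: cv_score(w) for w in window_grid}
best = min(scores, key=scores.get)
print({w: round(s, 3) for w, s in scores.items()}, " chosen window:", best)
```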

18.
This paper examines seasonality and momentum jointly across national equity markets at the index level. We find that seasonality and momentum are almost uncorrelated and appear to arise from different global or local risk factors, rather than from different loadings on the same risk factors. Employing a trading strategy that integrates seasonality and momentum parametrically, we confirm our conclusion about the relationship between seasonality and momentum: while the pure seasonality and momentum strategies individually generate sizable and significant returns, the combination strategy significantly outperforms the pure strategies in a way that is quantitatively consistent with their lack of correlation.

19.
This paper extends the jump detection method based on bipower variation to identify realized jumps in financial markets and to estimate parametrically the jump intensity, mean, and variance. Finite-sample evidence suggests that the jump parameters can be accurately estimated and that the statistical inferences are reliable under the assumption that jumps are rare and large. Applications to equity market, Treasury bond, and exchange rate data reveal important differences in jump frequencies and volatilities across asset classes over time. For investment-grade bond spread indices, the estimated jump volatility has more forecasting power than interest rate factors and volatility factors, including option-implied volatility, after controlling for systematic risk factors. The jump volatility risk factor seems to capture the low-frequency movements in credit spreads and comoves countercyclically with the price–dividend ratio and the corporate default rate.
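A hedged sketch of the realized variance / bipower variation decomposition that underlies such jump detection: bipower variation is robust to jumps, so max(RV - BV, 0) estimates the jump contribution. The simulated intraday returns are illustrative, and the formal test statistic and the parametric estimation of jump intensity, mean and variance are not shown.

```python
import numpy as np

def rv_bv_jump(returns):
    # Realized variance, bipower variation, and the implied jump component.
    r = np.asarray(returns, dtype=float)
    rv = np.sum(r ** 2)
    bv = (np.pi / 2.0) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))   # robust to jumps
    jump = max(rv - bv, 0.0)
    return rv, bv, jump

rng = np.random.default_rng(5)
r = 0.01 * rng.normal(size=78)        # illustrative intraday returns
r[40] += 0.03                         # inject a single jump
print(rv_bv_jump(r))
```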

20.