Similar documents
14 similar documents found (search time: 0 ms)
1.
We provide a correction to Proposition 1 in "Optimal and robust combination of forecasts via constrained optimization and shrinkage", published in the International Journal of Forecasting 38(1):97-116 (2021). The correction has no impact on any other result, theoretical or empirical, in that paper.

2.
Despite the clear success of forecast combination in many economic environments, several important issues remain incompletely resolved. The issues relate to the selection of the set of forecasts to combine, and whether some form of additional regularization (e.g., shrinkage) is desirable. Against this background, and also considering the frequently found good performance of simple-average combinations, we propose a LASSO-based procedure that sets some combining weights to zero and shrinks the survivors toward equality ("partially-egalitarian LASSO"). Ex post analysis reveals that the optimal solution has a very simple form: the vast majority of forecasters should be discarded, and the remainder should be averaged. We therefore propose and explore direct subset-averaging procedures that are motivated by the structure of partially-egalitarian LASSO and the lessons learned, which, unlike LASSO, do not require the choice of a tuning parameter. Intriguingly, in an application to the European Central Bank Survey of Professional Forecasters, our procedures outperform simple average and median forecasts; indeed, they perform approximately as well as the ex post best forecaster.
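The subset-averaging recipe (discard most forecasters, average the survivors) can be sketched in a few lines. The forecast values, the error history, and the choice `n_keep=2` below are illustrative assumptions, not the paper's calibration:

```python
def subset_average(forecasts, past_mse, n_keep=2):
    """Average the forecasts of the n_keep forecasters with the
    lowest historical mean squared error; discard the rest."""
    ranked = sorted(range(len(forecasts)), key=lambda i: past_mse[i])
    survivors = ranked[:n_keep]
    return sum(forecasts[i] for i in survivors) / n_keep

# Five forecasters; the two with the smallest past MSE survive.
forecasts = [2.1, 1.9, 3.5, 2.0, 5.0]
past_mse  = [0.4, 0.2, 1.1, 0.3, 2.6]
combined = subset_average(forecasts, past_mse, n_keep=2)  # average of 1.9 and 2.0
```

Note that no tuning parameter beyond the subset size is needed, which is the practical appeal of the approach.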

3.
In predicting conditional covariance matrices of financial portfolios, practitioners are required to choose among several alternative options, facing a number of different sources of uncertainty. A first source is related to the frequency at which prices are observed, either daily or intradaily. Using prices sampled at higher frequency inevitably poses additional sources of uncertainty related to the selection of the optimal intradaily sampling frequency and to the construction of the best realized estimator. Likewise, the choices of model structure and estimation method also play a critical role. In order to alleviate the impact of these sources of uncertainty, we propose a forecast combination strategy based on the Model Confidence Set (MCS) to adaptively identify the set of most accurate predictors. The combined predictor is shown to achieve superior performance with respect to the whole model universe plus three additional competitors, independently of the MCS or portfolio settings.
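The selection step can be caricatured with a much simpler rule: retain every model whose average loss is close to the best one, then equal-weight the survivors. This is only a crude stand-in for the bootstrap-based MCS test; the loss histories and the tolerance below are invented for illustration:

```python
import statistics

def simple_confidence_set(loss_histories, tol=0.10):
    """Keep every model whose mean loss is within tol of the best
    mean loss (a naive stand-in for the bootstrap MCS test)."""
    means = [statistics.fmean(h) for h in loss_histories]
    best = min(means)
    return [i for i, m in enumerate(means) if m - best <= tol]

# Model 2 is clearly worse and is eliminated from the set.
losses = [[0.20, 0.22], [0.21, 0.23], [0.60, 0.70]]
kept = simple_confidence_set(losses)
```

The actual MCS procedure replaces the fixed tolerance with a sequential hypothesis test, so the retained set carries a confidence-level interpretation.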

4.
This article introduces the winning method at the M5 Accuracy competition. The method simply averages the results of multiple base forecasting models constructed via partial pooling of multi-level data. All base models, which adopt either direct or recursive multi-step forecasting, are trained with the machine learning technique LightGBM on three different levels of data pools. In the competition, this simple averaging of multiple direct and recursive forecasting models, called DRFAM, exploited the complementary effects between direct and recursive multi-step forecasting of multi-level product sales to improve both accuracy and robustness.
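The final combination step is just an equal-weight average across base models at each horizon. A minimal sketch, with made-up forecast paths standing in for the LightGBM base models:

```python
def average_forecasts(model_paths):
    """Equal-weight average of several multi-step forecast paths,
    horizon by horizon (the DRFAM-style combination step)."""
    return [sum(step) / len(step) for step in zip(*model_paths)]

# Made-up three-step paths from a direct and a recursive base model.
direct_path    = [10.0, 9.5, 9.1]
recursive_path = [10.2, 9.9, 9.7]
combined = average_forecasts([direct_path, recursive_path])
```

The diversity comes from how the paths are produced: the direct strategy fits one model per horizon, while the recursive strategy iterates a one-step model forward.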

5.
This paper examines the out-of-sample forecasting properties of six different economic uncertainty variables for the growth of the real M2 and real M4 Divisia money series for the U.S. using monthly data. The core contention is that information on economic uncertainty improves forecasting accuracy. We estimate vector autoregressive models using an iterated rolling-window forecasting scheme, combined with modern regularisation techniques from the field of machine learning. Applying the Hansen-Lunde-Nason model confidence set approach under two different loss functions reveals strong evidence that uncertainty variables related to financial markets, the state of the macroeconomy, or economic policy provide additional informational content when forecasting monetary dynamics. The use of regularisation techniques improves forecast accuracy substantially.
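Regularisation in this setting means shrinking VAR coefficients toward zero. In the one-regressor case the ridge estimator makes the shrinkage explicit; the data and the penalty below are assumptions for illustration, not the paper's setup:

```python
def ridge_slope(x, y, lam):
    """Ridge estimate of the slope in y = beta * x + noise:
    beta_hat = x'y / (x'x + lam), shrinking toward zero as lam grows."""
    xy = sum(a * b for a, b in zip(x, y))
    xx = sum(a * a for a in x)
    return xy / (xx + lam)

x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]                # true slope is 2
ols    = ridge_slope(x, y, 0.0)    # unpenalised least squares
shrunk = ridge_slope(x, y, 14.0)   # penalised toward zero
```

With many noisy predictors, as in a large VAR, this bias-for-variance trade is what delivers the forecast gains.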

6.
The empirical literature on stock market predictability mainly suffers from model uncertainty and parameter instability. To meet this challenge, we propose a novel approach that combines dimensionality reduction, regime-switching models, and forecast combination to predict excess returns on the S&P 500. First, we aggregate the weekly information of 146 popular macroeconomic and financial variables using different principal component analysis techniques. Second, we estimate Markov-switching models with time-varying transition probabilities using the principal components as predictors. Third, we pool the models in forecast clusters to hedge against model risk and to evaluate the usefulness of different specifications. Our weekly forecasts respond to regime changes in a timely manner, participating in recoveries and limiting losses. This is also reflected in an improvement of risk-adjusted performance measures compared to several benchmarks. However, when considering stock market returns, our forecasts do not outperform common benchmarks. Nevertheless, they do add statistical and, in particular, economic value during recessions or in declining markets.

7.
8.
This paper describes the methods used by Team Cassandra, a joint effort between IBM Research Australia and the University of Melbourne, in the GEFCom2017 load forecasting competition. An important first phase in the forecasting effort involved a deep exploration of the underlying dataset. Several data visualisation techniques were applied to help us better understand the nature and size of gaps, outliers, the relationships between different entities in the dataset, and the relevance of custom date ranges. Improved, cleaned data were then used to train multiple probabilistic forecasting models. These included a number of standard and well-known approaches, as well as a neural-network-based quantile forecast model that was developed specifically for this dataset. Finally, model selection and forecast combination were used to choose a custom forecasting model for every entity in the dataset.
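Quantile forecast models of this kind are usually trained with the pinball (quantile) loss, which scores a predicted quantile asymmetrically. A minimal sketch (not Team Cassandra's actual model):

```python
def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss: penalises under-prediction with
    weight q and over-prediction with weight 1 - q."""
    diff = y_true - y_pred
    return q * diff if diff >= 0 else (q - 1) * diff

# Under-predicting the 0.9 quantile is heavily penalised...
high = pinball_loss(100.0, 80.0, 0.9)
# ...while over-predicting it by the same amount is cheap.
low = pinball_loss(100.0, 120.0, 0.9)
```

Minimising this loss over a training set pushes the prediction toward the q-th conditional quantile, which is what makes it the standard objective in probabilistic load forecasting competitions.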

9.
We propose a new way of selecting among model forms in automated exponential smoothing routines, consequently enhancing their predictive power. The procedure, referred to here as treating, operates by selectively subsetting the ensemble of competing models based on information from their prediction intervals. By the same token, we set forth a pruning strategy to improve the accuracy of both point forecasts and prediction intervals in forecast combination methods. The proposed approaches are respectively applied to automated exponential smoothing routines and Bagging algorithms, to demonstrate their potential. An empirical experiment is conducted on a wide range of series from the M-Competitions. The results attest that the proposed approaches are simple and add little computational cost, yet are capable of substantially improving forecasting accuracy for both point forecasts and prediction intervals, outperforming important benchmarks and recently developed forecast combination methods.
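One way to subset an ensemble on prediction-interval information is to drop models whose intervals are unusually wide and average the rest. The rule below is a hypothetical stand-in for the paper's actual treating procedure, with invented forecasts:

```python
import statistics

def treat(models, width_factor=1.5):
    """Toy subsetting rule: discard models whose prediction-interval
    width is far above the median width, then average the survivors.
    models: list of (point_forecast, lower, upper) tuples."""
    widths = [u - l for _, l, u in models]
    cutoff = width_factor * statistics.median(widths)
    kept = [p for (p, l, u), w in zip(models, widths) if w <= cutoff]
    return statistics.fmean(kept)

models = [(10.0, 9.0, 11.0),   # interval width 2.0
          (10.4, 9.5, 11.3),   # interval width 1.8
          (13.0, 5.0, 21.0)]   # width 16.0: discarded as an outlier
point = treat(models)          # average of the two surviving forecasts
```

The intuition is that an implausibly wide interval signals a poorly specified model, so removing it sharpens both the combined point forecast and the combined interval.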

10.
Researchers from various scientific disciplines have attempted to forecast the spread of coronavirus disease 2019 (COVID-19). The proposed epidemic prediction methods range from basic curve fitting methods and traffic interaction models to machine-learning approaches. Combining these approaches yields the Network Inference-based Prediction Algorithm (NIPA). In this paper, we analyse a diverse set of COVID-19 forecast algorithms, including several modifications of NIPA. Among the algorithms that we evaluated, the original NIPA performed best at forecasting the spread of COVID-19 in Hubei, China, and in the Netherlands. In particular, we show that network-based forecasting is superior to the other forecasting algorithms evaluated.

11.
Since the introduction of the Basel II Accord, and given its huge implications for credit risk management, the modeling and prediction of the loss given default (LGD) have become increasingly important tasks. Institutions which use their own LGD estimates can employ either simpler or more complex methods. Simpler methods are easier to implement and more interpretable, but more complex methods promise higher prediction accuracies. Using a proprietary data set of 1,184 defaulted corporate leases in Germany, this study explores different parametric, semi-parametric and non-parametric approaches that attempt to predict the LGD. By conducting the analyses for different information sets, we study how the prediction accuracy changes depending on the set of information that is available. Furthermore, we use a variable importance measure to identify the input variables that have the greatest effects on the LGD prediction accuracy for each method. In this regard, we provide new insights on the characteristics of leasing LGDs. We find that (1) more sophisticated methods, especially the random forest, lead to remarkable increases in the prediction accuracy; (2) updating information improves the prediction accuracy considerably; and (3) the outstanding exposure at default, an internal rating, asset types and lessor industries turn out to be important drivers of accurate LGD predictions.

12.
Establishing a robust facility location and assignment plan to improve the efficiency of the decontamination process is critical to alleviating the physical impact of the radiation leakage that occurs in a nuclear accident. This study develops an approach for optimizing the locations of decontamination facilities and the assignments of affected villages. The approach is a robust optimization model that optimizes worst-case performance. A system dynamics model is integrated into the robust optimization model to simulate the decontamination process and compute the decontamination time. A case study is conducted of the Plume Emergency Planning Zone in China. The results indicate that (1) a decontamination site location plan can be obtained in which each site is located in a different direction, (2) no evacuee will be allowed to travel across the downwind area in an assignment plan, and (3) a larger financial investment does not imply an increased decontamination efficiency. An appropriate budget exists that can balance the decontamination time and cost. The proposed model can assist decision makers in (i) better understanding the effects of decontamination site location and village assignment and (ii) deciding which location and assignment plans should be applied to cope with disruptive nuclear accidents.

13.
Forecast combination is a well-established and well-tested approach for improving forecasting accuracy. One beneficial strategy is to use constituent forecasts that carry diverse information. In this paper we consider diversity achieved through different time aggregations. For example, we could create a yearly time series from a monthly time series, produce forecasts for both, and then combine the forecasts. These forecasts would each be tracking the dynamics of different time scales, and would therefore add diverse types of information. A comparison of several forecast combination methods, performed in the context of this setup, shows that this is indeed a beneficial strategy and generally provides a forecasting performance that is better than the performances of the individual forecasts that are combined. As a case study, we consider the problem of forecasting monthly tourism numbers for inbound tourism to Egypt. Specifically, we consider 33 individual source countries, as well as the aggregate. Here, too, the novel combination strategy generally improves forecasting accuracy.
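The aggregate-then-combine idea can be sketched with a naive base forecaster. The series, the last-value forecast rule, and the equal weights below are illustrative assumptions, not the paper's models:

```python
def aggregate(monthly, k=12):
    """Aggregate a monthly series into non-overlapping yearly sums."""
    return [sum(monthly[i:i + k]) for i in range(0, len(monthly) - k + 1, k)]

def naive_forecast(series):
    """Toy base forecast: repeat the last observation."""
    return series[-1]

def combined_monthly_forecast(monthly, k=12):
    """Average a monthly-level forecast with a disaggregated
    yearly-level forecast of the same series."""
    f_monthly = naive_forecast(monthly)
    f_yearly = naive_forecast(aggregate(monthly, k)) / k  # spread over months
    return (f_monthly + f_yearly) / 2

# Two years of (made-up) monthly data.
combined = combined_monthly_forecast(list(range(1, 25)))
```

The yearly-level forecast smooths out monthly noise, while the monthly-level forecast tracks short-run dynamics; the combination inherits a bit of both.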

14.
We study the behaviours of the Betfair betting market and the sterling/dollar exchange rate (futures price) during 24 June 2016, the night of the EU referendum. We investigate how the two markets responded to the announcement of the voting results by employing a Bayesian updating methodology to update prior opinion about the likelihood of the final outcome of the vote. We then relate the voting model to the real-time evolution of the market-determined prices as the results were announced. We find that, although both markets appear to be inefficient in absorbing the new information contained in the vote outcomes, the betting market seems less inefficient than the FX market. The different rates of convergence to the fundamental value between the two markets lead to highly profitable arbitrage opportunities.
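The real-time updating can be illustrated with a Beta-Binomial model for the Leave vote share, using a normal approximation to the Beta posterior. This is a simplification of the paper's methodology, and the vote counts below are invented:

```python
import math

def prob_leave_wins(leave, remain, a0=1.0, b0=1.0):
    """Posterior probability that the final Leave share exceeds 50%,
    under a Beta(a0, b0) prior on the Leave share and a normal
    approximation to the Beta(a0 + leave, b0 + remain) posterior."""
    a, b = a0 + leave, b0 + remain
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    z = (0.5 - mean) / math.sqrt(var)
    return 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))  # 1 - Phi(z)

# The same 52:48 lead becomes far more conclusive as counts accumulate.
early = prob_leave_wins(52, 48)        # first small batch of results
later = prob_leave_wins(5_200, 4_800)  # same lead, 100x the data
```

Comparing this evolving posterior with the implied probabilities in betting odds and FX prices is what lets one ask which market absorbed the results faster.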


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)