Similar articles
11 similar articles found
1.
This paper describes the methods used by Team Cassandra, a joint effort between IBM Research Australia and the University of Melbourne, in the GEFCom2017 load forecasting competition. An important first phase in the forecasting effort involved a deep exploration of the underlying dataset. Several data visualisation techniques were applied to help us better understand the nature and size of gaps, outliers, the relationships between different entities in the dataset, and the relevance of custom date ranges. Improved, cleaned data were then used to train multiple probabilistic forecasting models. These included a number of standard and well-known approaches, as well as a neural-network based quantile forecast model that was developed specifically for this dataset. Finally, model selection and forecast combination were used to choose a custom forecasting model for every entity in the dataset.
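Quantile forecast models of the kind described above are typically trained by minimising the pinball (quantile) loss. A minimal sketch with illustrative numbers; this is the standard loss definition, not the team's actual model:

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Average pinball (quantile) loss at quantile level q in (0, 1)."""
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

# Illustrative values: a flat forecast of 11 against three observed loads.
y = np.array([10.0, 12.0, 9.0])
f = np.array([11.0, 11.0, 11.0])
loss_q90 = pinball_loss(y, f, 0.9)  # high quantiles penalise under-forecasting
loss_q10 = pinball_loss(y, f, 0.1)  # low quantiles penalise over-forecasting
```

Averaging this loss over a grid of quantile levels scores a full predictive distribution rather than a point forecast.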

2.
Despite the clear success of forecast combination in many economic environments, several important issues remain incompletely resolved. The issues relate to the selection of the set of forecasts to combine, and whether some form of additional regularization (e.g., shrinkage) is desirable. Against this background, and also considering the frequently-found good performance of simple-average combinations, we propose a LASSO-based procedure that sets some combining weights to zero and shrinks the survivors toward equality (“partially-egalitarian LASSO”). Ex post analysis reveals that the optimal solution has a very simple form: the vast majority of forecasters should be discarded, and the remainder should be averaged. We therefore propose and explore direct subset-averaging procedures that are motivated by the structure of partially-egalitarian LASSO and the lessons learned, which, unlike LASSO, do not require the choice of a tuning parameter. Intriguingly, in an application to the European Central Bank Survey of Professional Forecasters, our procedures outperform simple average and median forecasts; indeed, they perform approximately as well as the ex post best forecaster.
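The "discard most, average the rest" structure can be sketched directly. This is an illustrative subset-averaging sketch in the spirit of the abstract, with made-up data, not the paper's exact procedure:

```python
import numpy as np

def subset_average(past_errors, new_forecasts, k):
    """Subset-averaging motivated by partially-egalitarian LASSO:
    discard all but the k historically most accurate forecasters,
    then take a simple (equal-weight) average of the survivors."""
    mse = np.mean(np.asarray(past_errors) ** 2, axis=0)
    keep = np.argsort(mse)[:k]              # k lowest-MSE forecasters
    return np.asarray(new_forecasts)[keep].mean()

# Illustrative data: forecaster 2 is clearly the least accurate.
past_errors = [[0.1, 0.5, 2.0],
               [-0.2, 0.4, -1.5],
               [0.1, -0.6, 2.5]]
combined = subset_average(past_errors, [2.0, 2.4, 5.0], k=2)
```

Note there is no tuning parameter beyond the subset size k, which can itself be chosen by averaging over several values.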

3.
We propose an automated method for obtaining weighted forecast combinations using time series features. The proposed approach involves two phases. First, we use a collection of time series to train a meta-model for assigning weights to various possible forecasting methods with the goal of minimizing the average forecasting loss obtained from a weighted forecast combination. The inputs to the meta-model are features that are extracted from each series. Then, in the second phase, we forecast new series using a weighted forecast combination, where the weights are obtained from our previously trained meta-model. Our method outperforms a simple forecast combination, as well as all of the most popular individual methods in the time series forecasting literature. The approach achieved second position in the M4 competition.
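The second phase — features in, combination weights out — can be sketched as follows. The coefficient matrix `W`, the feature values, and the softmax link are all hypothetical stand-ins for a trained meta-model, not the authors' actual learner:

```python
import numpy as np

def combine_with_meta_weights(features, W, method_forecasts):
    """Map a series' feature vector to per-method combination weights
    via a (hypothetical, already-trained) linear meta-model + softmax,
    then return the weighted forecast combination."""
    scores = W @ features                  # one score per forecasting method
    w = np.exp(scores - scores.max())
    w /= w.sum()                           # softmax: weights are positive, sum to 1
    return w @ method_forecasts, w

features = np.array([1.0, 0.2])            # e.g. [trend strength, seasonal strength]
W = np.array([[0.5, 1.0], [1.0, -0.5]])    # illustrative trained coefficients
fcs = np.array([100.0, 110.0])             # forecasts from two candidate methods
combined, weights = combine_with_meta_weights(features, W, fcs)
```

Training the meta-model amounts to fitting `W` (in practice a richer learner such as gradient boosting) to minimise the combined forecast loss over a large collection of series.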

4.
US yield curve dynamics are subject to time-variation, but there is ambiguity about its precise form. This paper develops a vector autoregressive (VAR) model with time-varying parameters and stochastic volatility, which treats the nature of parameter dynamics as unknown. Coefficients can evolve according to a random walk, a Markov switching process, observed predictors, or depend on a mixture of these. To decide which form is supported by the data and to carry out model selection, we adopt Bayesian shrinkage priors. Our framework is applied to model the US yield curve. We show that the model forecasts well, and focus on selected in-sample features to analyze determinants of structural breaks in US yield curve dynamics.
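The simplest of the coefficient dynamics the framework nests is a random walk. A minimal illustrative simulation of a univariate AR(1) with a random-walk coefficient (all function names and parameter values are assumptions, not the paper's specification):

```python
import numpy as np

def simulate_tvp_ar1(T, phi0, state_sd, obs_sd, seed=0):
    """Simulate y_t = phi_t * y_{t-1} + e_t, where the coefficient phi_t
    itself follows a random walk -- the simplest time-varying-parameter
    dynamic the VAR framework nests (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    phi = np.empty(T)
    y = np.zeros(T)
    phi[0] = phi0
    for t in range(1, T):
        phi[t] = phi[t - 1] + rng.normal(scale=state_sd)  # random-walk coefficient
        y[t] = phi[t] * y[t - 1] + rng.normal(scale=obs_sd)
    return y, phi

y, phi = simulate_tvp_ar1(T=200, phi0=0.9, state_sd=0.01, obs_sd=0.1)
```

Markov switching or predictor-driven coefficients replace the random-walk update line; the shrinkage priors decide, in estimation, how much each mechanism contributes.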

5.
We develop a Bayesian random compressed multivariate heterogeneous autoregressive (BRC-MHAR) model to forecast the realized covariance matrices of stock returns. The proposed model randomly compresses the predictors and reduces the number of parameters. We also construct several competing multivariate volatility models that use alternative shrinkage methods to compress the parameter dimensions. We compare the forecast performance of the proposed models with that of the competing models based on both statistical and economic evaluations. The results of the statistical evaluation suggest that the BRC-MHAR models achieve better forecast precision than the competing models over the short-term horizon. The results of the economic evaluation suggest that the BRC-MHAR models are superior to the competing models in terms of average return, the Sharpe ratio, and economic value.
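The core dimension-reduction step — random compression — projects a large predictor set through a random matrix before estimation. An illustrative sketch (the projection scale and dimensions are assumptions, not the exact BRC-MHAR construction):

```python
import numpy as np

def random_compress(X, k, seed=0):
    """Randomly compress N predictors down to k via a Gaussian projection,
    so only k coefficients need to be estimated downstream."""
    rng = np.random.default_rng(seed)
    Phi = rng.normal(size=(X.shape[1], k)) / np.sqrt(k)  # random projection matrix
    return X @ Phi, Phi

X = np.arange(200 * 50, dtype=float).reshape(200, 50)   # 50 lagged predictors (toy data)
Z, Phi = random_compress(X, k=5)
```

The compressed regressors `Z` replace `X` in the autoregression; averaging over several random projections (or treating the projection as part of the Bayesian model) hedges against an unlucky draw of `Phi`.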

6.
In this paper, we propose a novel approach to econometric forecasting of stationary and ergodic time series within a panel-data framework. Our key element is to employ the (feasible) bias-corrected average forecast. Using panel-data sequential asymptotics we show that it is potentially superior to other techniques in several contexts. In particular, it is asymptotically equivalent to the conditional expectation, i.e., has an optimal limiting mean-squared error. We also develop a zero-mean test for the average bias and discuss the forecast-combination puzzle in small and large samples. Monte-Carlo simulations are conducted to evaluate the performance of the feasible bias-corrected average forecast in finite samples. An empirical exercise, based upon data from a well-known survey, is also presented. Overall, these results show promise for the feasible bias-corrected average forecast.
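The feasible bias-corrected average forecast admits a very short sketch: estimate each forecaster's mean bias on past data, subtract it, then average. The numbers below are illustrative, not from the paper:

```python
import numpy as np

def bias_corrected_average(past_forecasts, past_actuals, new_forecasts):
    """Feasible bias-corrected average forecast (sketch): estimate each
    forecaster's mean bias from past data, remove it, then average."""
    bias = (past_forecasts - past_actuals[:, None]).mean(axis=0)  # per-forecaster bias
    return (new_forecasts - bias).mean()

# Forecaster 0 over-forecasts by 1 on average; forecaster 1 under-forecasts by 2.
pf = np.array([[11.0, 8.0], [12.0, 9.0], [13.0, 10.0]])
actual = np.array([10.0, 11.0, 12.0])
corrected = bias_corrected_average(pf, actual, np.array([15.0, 12.0]))
```

Removing individual biases before averaging is what lets the simple average approach the conditional expectation asymptotically.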

7.
A complete procedure for calculating the joint predictive distribution of future observations based on the cointegrated vector autoregression is presented. The large degree of uncertainty in the choice of cointegration vectors is incorporated into the analysis via the prior distribution. This prior has the effect of weighting the predictive distributions based on the models with different cointegration vectors into an overall predictive distribution. The ideas of Litterman [Mimeo, Massachusetts Institute of Technology, 1980] are adopted for the prior on the short run dynamics of the process, resulting in a prior which only depends on a few hyperparameters. A straightforward numerical evaluation of the predictive distribution based on Gibbs sampling is proposed. The prediction procedure is applied to a seven-variable system with a focus on forecasting Swedish inflation.
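Simulation-based evaluation of a predictive distribution works by propagating each posterior draw forward. A generic illustrative sketch for a VAR(1) — the draws below are fabricated placeholders standing in for actual Gibbs output:

```python
import numpy as np

def predictive_paths(posterior_draws, y_last, h, seed=0):
    """Joint predictive distribution by simulation: for each posterior
    (Gibbs) draw of the VAR(1) coefficient matrix A and error covariance
    Sigma, simulate one future path of length h."""
    rng = np.random.default_rng(seed)
    n = len(y_last)
    paths = []
    for A, Sigma in posterior_draws:
        y = y_last.copy()
        path = []
        for _ in range(h):
            y = A @ y + rng.multivariate_normal(np.zeros(n), Sigma)
            path.append(y.copy())
        paths.append(path)
    return np.array(paths)                 # shape: (n_draws, h, n_vars)

# Placeholder "posterior": 50 identical draws of (A, Sigma) for a 2-variable system.
draws = [(np.eye(2) * 0.9, np.eye(2) * 0.1) for _ in range(50)]
paths = predictive_paths(draws, y_last=np.array([1.0, 0.5]), h=8)
```

Pooling paths across draws from models with different cointegration vectors, in proportion to their prior/posterior weight, yields the overall predictive distribution described in the abstract.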

8.
Predicting the evolution of mortality rates plays a central role for life insurance and pension funds. Various stochastic frameworks have been developed to model mortality patterns by taking into account the main stylized facts driving these patterns. However, relying on the prediction of one specific model can be too restrictive and can lead to some well-documented drawbacks, including model misspecification, parameter uncertainty, and overfitting. To address these issues we first consider mortality modeling in a Bayesian negative-binomial framework to account for overdispersion and the uncertainty about the parameter estimates in a natural and coherent way. Model averaging techniques are then considered as a response to model misspecifications. In this paper, we propose two methods based on leave-future-out validation and compare them to standard Bayesian model averaging (BMA) based on marginal likelihood. An intensive numerical study is carried out over a large range of simulation setups to compare the performances of the proposed methodologies. An illustration is then proposed on real-life mortality datasets, along with a sensitivity analysis to a Covid-type scenario. Overall, we found that both methods based on an out-of-sample criterion outperform the standard BMA approach in terms of prediction performance and robustness.
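Weighting models by out-of-sample predictive performance can be sketched with pseudo-BMA-style weights built from leave-future-out log scores. The log-density values below are fabricated for illustration and the weighting rule is a generic stacking-style device, not necessarily either of the paper's two methods:

```python
import numpy as np

def lfo_weights(log_pred_densities):
    """Model weights from leave-future-out log predictive densities:
    each model's held-out log scores are summed, exponentiated, and
    normalised (a pseudo-BMA-style sketch)."""
    elpd = np.asarray(log_pred_densities).sum(axis=1)
    w = np.exp(elpd - elpd.max())          # subtract max for numerical stability
    return w / w.sum()

# Illustrative log scores for two mortality models on 3 held-out future years.
lp = [[-1.0, -1.2, -0.9],
      [-2.0, -2.5, -1.8]]
w = lfo_weights(lp)
```

Because the criterion is computed on future hold-out points rather than in-sample marginal likelihood, it directly rewards forecasting ability, which is the motivation for preferring it to standard BMA here.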

9.
The safety stock calculation requires a measure of the forecast error uncertainty. Such errors are usually assumed to be Gaussian iid (independently and identically distributed). However, deviations from iid lead to a deterioration in the performance of the supply chain. Recent research has shown that, contrary to theoretical approaches, empirical techniques that do not rely on the aforementioned assumptions can enhance the calculation of safety stocks. In particular, GARCH models cope with time-varying heteroscedastic forecast errors, and kernel density estimation does not need to rely on a particular distribution. However, if the forecast errors are time-varying heteroscedastic and do not follow a particular distribution, the previous approaches are inadequate. We overcome this by proposing an optimal combination of the empirical methods that minimizes the asymmetric piecewise linear loss function, also known as the tick loss. The results show that combining quantile forecasts yields safety stocks with a lower cost. The methodology is illustrated with simulations and real data experiments for different lead times.
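Choosing the combination of two quantile forecasts that minimises the tick loss can be sketched with a simple grid search. The data and the two candidate quantile forecasts below are illustrative, not the paper's GARCH and kernel estimates:

```python
import numpy as np

def tick_loss(y, q_hat, tau):
    """Asymmetric piecewise-linear (tick) loss at quantile level tau."""
    d = y - q_hat
    return np.mean(np.maximum(tau * d, (tau - 1) * d))

def best_convex_combo(y, q1, q2, tau):
    """Grid-search the convex combination lam*q1 + (1-lam)*q2 of two
    quantile forecasts that minimises the tick loss (illustrative sketch)."""
    grid = np.linspace(0.0, 1.0, 101)
    losses = [tick_loss(y, lam * q1 + (1 - lam) * q2, tau) for lam in grid]
    return grid[int(np.argmin(losses))]

# Demand over the lead time; q1 sits too low and q2 too high for the 95% quantile.
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
lam = best_convex_combo(y, q1=2.0, q2=6.0, tau=0.95)
```

The optimised combination is itself a quantile forecast, so the resulting safety stock targets the desired service level directly rather than through a distributional assumption.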

10.
Forecast combination is a well-established and well-tested approach for improving forecasting accuracy. One beneficial strategy is to use constituent forecasts that have diverse information. In this paper we consider the idea of diversity being accomplished by using different time aggregations. For example, we could create a yearly time series from a monthly time series, produce forecasts for both, and then combine the forecasts. These forecasts would each be tracking the dynamics of different time scales, and would therefore add diverse types of information. A comparison of several forecast combination methods, performed in the context of this setup, shows that this is indeed a beneficial strategy and generally provides a forecasting performance that is better than the performances of the individual forecasts that are combined. As a case study, we consider the problem of forecasting monthly tourism numbers for inbound tourism to Egypt. Specifically, we consider 33 individual source countries, as well as the aggregate. The novel combination strategy also produces a generally improved forecasting accuracy.
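The monthly-versus-yearly idea can be sketched end-to-end with two deliberately simple base forecasters. The seasonal-naive and naive-with-drift methods, the equal weight, and the toy data are all assumptions for illustration, not the methods compared in the paper:

```python
import numpy as np

def aggregate_to_yearly(monthly):
    """Sum a monthly series (length a multiple of 12) into yearly totals."""
    return np.asarray(monthly, dtype=float).reshape(-1, 12).sum(axis=1)

def combined_yearly_forecast(monthly, w=0.5):
    """Combine next-year forecasts built on two time scales:
    (a) a seasonal-naive monthly forecast summed up to a year, and
    (b) a naive-with-drift forecast of the yearly aggregate itself."""
    monthly = np.asarray(monthly, dtype=float)
    yearly = aggregate_to_yearly(monthly)
    fc_monthly_scale = monthly[-12:].sum()                     # repeat last year's months
    drift = (yearly[-1] - yearly[0]) / max(len(yearly) - 1, 1)
    fc_yearly_scale = yearly[-1] + drift                       # yearly-scale dynamics
    return w * fc_monthly_scale + (1 - w) * fc_yearly_scale

# Two years of monthly arrivals: 10 per month, then 12 per month.
fc = combined_yearly_forecast([10.0] * 12 + [12.0] * 12)
```

Here the monthly-scale forecast captures the within-year pattern while the yearly-scale forecast captures the trend, so the combination blends information from both time scales.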

11.
We extend the recently introduced latent threshold dynamic models to include dependencies among the dynamic latent factors which underlie multivariate volatility. With an ability to induce time-varying sparsity in factor loadings, these models now also allow time-varying correlations among factors, which may be exploited in order to improve volatility forecasts. We couple multi-period, out-of-sample forecasting with portfolio analysis using standard and novel benchmark neutral portfolios. Detailed studies of stock index and FX time series include: multi-period, out-of-sample forecasting, statistical model comparisons, and portfolio performance testing using raw returns, risk-adjusted returns and portfolio volatility. We find uniform improvements on all measures relative to standard dynamic factor models. This is due to the parsimony of latent threshold models and their ability to exploit between-factor correlations so as to improve the characterization and prediction of volatility. These advances will be of interest to financial analysts, investors and practitioners, as well as to modeling researchers.
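The latent-thresholding mechanism that induces time-varying sparsity can be sketched in one line: a loading is shrunk exactly to zero whenever its latent value falls below a threshold. The values and threshold below are illustrative:

```python
import numpy as np

def latent_threshold(beta_t, d):
    """Latent-threshold sparsification: a time-varying factor loading is
    set exactly to zero whenever its latent value is smaller in magnitude
    than the threshold d (illustrative sketch of the mechanism)."""
    beta_t = np.asarray(beta_t, dtype=float)
    return np.where(np.abs(beta_t) >= d, beta_t, 0.0)

# Four latent loadings at one time point; two fall below the threshold.
sparse = latent_threshold([0.8, 0.05, -0.3, -0.02], d=0.1)
```

Because the latent loadings evolve over time, which loadings are zeroed changes from period to period, giving the dynamic sparsity the abstract refers to.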


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号