Similar Documents
20 similar documents found.
1.
This article proposes a new technique for estimating trend and multiplicative seasonality in time series data. The technique is computationally quite straightforward and gives better forecasts (in a sense described below) than other commonly used methods. Like many other methods, the one presented here is basically a decomposition technique, that is, it attempts to isolate and estimate the several subcomponents in the time series. It draws primarily on regression analysis for its power and has some of the computational advantages of exponential smoothing. In particular, old estimates of base, trend, and seasonality may be smoothed with new data as they occur. The basic technique was developed originally as a way to generate initial parameter values for a Winters exponential smoothing model [4], but it proved to be a useful forecasting method in itself.

The objective in all decomposition methods is to somehow separate the effects of trend and seasonality in the data, so that the two may be estimated independently. When seasonality is modeled with an additive form (Datum = Base + Trend + Seasonal Factor), techniques such as regression analysis with dummy variables or ratio-to-moving-average techniques accomplish this task well. It is more common, however, to model seasonality as a multiplicative form (as in the Winters model, for example, where Datum = [Base + Trend] * Seasonal Factor). In this case, it can be shown that neither of the techniques above achieves a proper separation of the trend and seasonal effects, and in some instances they may give highly misleading results. The technique described in this article attempts to deal properly with multiplicative seasonality, while remaining computationally tractable.

The technique is built on a set of simple regression models, one for each period in the seasonal cycle. These models are used to estimate individual seasonal effects and are then pooled to estimate the base and trend. As new data occur, they are smoothed into the least-squares formulas with computations that are quite similar to those used in ordinary exponential smoothing. Thus, the full least-squares computations are done only once, when the forecasting process is first initiated. Although the technique is demonstrated here under the assumption that trend is linear, the trend may, in fact, assume any form for which curve-fitting tools are available (exponential, polynomial, etc.).

The method has proved to be easy to program and execute, and computational experience has been quite favorable. It is faster than the ratio-to-moving-average (RTMA) method or regression with dummy variables (which requires a multiple regression routine), and it is competitive with, although a bit slower than, ordinary triple exponential smoothing.
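As a rough illustration of the per-season regression idea described above, the following Python sketch fits one simple regression per seasonal period and pools the results into base, trend, and multiplicative seasonal factors. The reference-time comparison, the pooling by simple averaging, and all names are assumptions made for illustration, not the authors' exact formulas, and the exponential-smoothing updates for new data are not shown.

import numpy as np

def seasonal_regression_decompose(y, m):
    # Sketch: assumed model  y_t = (base + trend * t) * S[t mod m]
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y))
    t_ref = t.mean()                                     # common reference time for comparing seasons
    level_at_ref = np.empty(m)
    for s in range(m):
        idx = (t % m == s)                               # observations belonging to season s
        slope, intercept = np.polyfit(t[idx], y[idx], 1) # one simple regression per season
        level_at_ref[s] = intercept + slope * t_ref
    seasonal = level_at_ref / level_at_ref.mean()        # multiplicative factors, mean 1
    # Pool: deseasonalize and fit a single line to estimate base and trend.
    trend, base = np.polyfit(t, y / seasonal[t % m], 1)
    return base, trend, seasonal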

2.
It is a common practice to complement a forecasting method such as simple exponential smoothing with a monitoring scheme to detect those situations where forecasts have failed to adapt to structural change. It will be suggested in this paper that the equations for simple exponential smoothing can be augmented by a common monitoring statistic to provide a method that automatically adapts to structural change without human intervention. The resulting method, which turns out to be a restricted form of damped trend corrected exponential smoothing, is compared with related methods on the annual data from the M3 competition. It is shown to be better than simple exponential smoothing and more consistent than traditional damped trend exponential smoothing.
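The abstract does not state which monitoring statistic is used, so the sketch below should not be read as the paper's method. It simply shows how a smoothed-error tracking signal of the Trigg–Leach type can be carried along with the simple exponential smoothing recursion; all parameter values are illustrative assumptions.

def ses_with_tracking(y, alpha=0.2, delta=0.1):
    # Simple exponential smoothing plus a Trigg-Leach-style tracking signal.
    # The signal (smoothed error / smoothed absolute error) approaches +/-1
    # when forecasts are persistently biased, flagging structural change.
    level = y[0]
    smoothed_err, smoothed_abs_err = 0.0, 1e-8
    signals = []
    for obs in y[1:]:
        err = obs - level                          # one-step-ahead forecast error
        smoothed_err = delta * err + (1 - delta) * smoothed_err
        smoothed_abs_err = delta * abs(err) + (1 - delta) * smoothed_abs_err
        signals.append(smoothed_err / smoothed_abs_err)
        level = level + alpha * err                # standard SES update
    return level, signals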

3.
Forecasting the term structure of government bond yields
Despite powerful advances in yield curve modeling in the last 20 years, comparatively little attention has been paid to the key practical problem of forecasting the yield curve. In this paper we do so. We use neither the no-arbitrage approach nor the equilibrium approach. Instead, we use variations on the Nelson–Siegel exponential components framework to model the entire yield curve, period-by-period, as a three-dimensional parameter evolving dynamically. We show that the three time-varying parameters may be interpreted as factors corresponding to level, slope and curvature, and that they may be estimated with high efficiency. We propose and estimate autoregressive models for the factors, and we show that our models are consistent with a variety of stylized facts regarding the yield curve. We use our models to produce term-structure forecasts at both short and long horizons, with encouraging results. In particular, our forecasts appear much more accurate at long horizons than various standard benchmark forecasts.
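For reference, the Nelson–Siegel curve underlying this framework is commonly written with three factors and a single decay parameter. A small Python helper in the Diebold–Li parameterisation (with their fixed decay value for maturities measured in months) looks like this:

import numpy as np

def nelson_siegel(tau, beta1, beta2, beta3, lam=0.0609):
    # Yield at maturity tau: beta1 acts as level, beta2 as slope, beta3 as curvature.
    # lam = 0.0609 is the decay value used by Diebold and Li for maturities in months.
    tau = np.asarray(tau, dtype=float)
    loading = (1 - np.exp(-lam * tau)) / (lam * tau)
    return beta1 + beta2 * loading + beta3 * (loading - np.exp(-lam * tau))

# e.g. nelson_siegel([3, 12, 60, 120], 6.0, -2.0, 1.5) evaluates a fitted curve at those maturities.

In the paper, the three betas are estimated period by period and then forecast with autoregressive models.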

4.
This paper considers the problem of forecasting under continuous and discrete structural breaks and proposes weighting observations to obtain optimal forecasts in the MSFE sense. We derive optimal weights for one-step-ahead forecasts. Under continuous breaks, our approach largely recovers exponential smoothing weights. Under discrete breaks, we provide analytical expressions for optimal weights in models with a single regressor, and asymptotically valid weights for models with more than one regressor. It is shown that in these cases the optimal weight is the same across observations within a given regime and differs only across regimes. In practice, where information on structural breaks is uncertain, a forecasting procedure based on robust optimal weights is proposed. The relative performance of our proposed approach is investigated using Monte Carlo experiments and an empirical application to forecasting real GDP using the yield curve across nine industrial economies.
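The abstract notes that, under continuous breaks, the optimal weights largely recover exponential smoothing weights. The sketch below only illustrates that weighting pattern, a one-step forecast formed as a geometrically declining, normalised weighted average of past observations; it is not the paper's optimal-weight formula.

import numpy as np

def exponential_weights_forecast(y, alpha=0.3):
    # Weight on an observation of age k is proportional to (1 - alpha)^k,
    # normalised so the weights sum to one; the forecast is the weighted mean.
    y = np.asarray(y, dtype=float)
    ages = np.arange(len(y))[::-1]        # most recent observation has age 0
    w = (1 - alpha) ** ages
    w /= w.sum()
    return float(np.dot(w, y))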

5.
We propose a new way of selecting among model forms in automated exponential smoothing routines, consequently enhancing their predictive power. The procedure, referred to here as treating, operates by selectively subsetting the ensemble of competing models based on information from their prediction intervals. By the same token, we set forth a pruning strategy to improve the accuracy of both point forecasts and prediction intervals in forecast combination methods. The proposed approaches are applied to automated exponential smoothing routines and Bagging algorithms, respectively, to demonstrate their potential. An empirical experiment is conducted on a wide range of series from the M-Competitions. The results attest that the proposed approaches are simple, without requiring much additional computational cost, but capable of substantially improving forecasting accuracy for both point forecasts and prediction intervals, outperforming important benchmarks and recently developed forecast combination methods.

6.
In the present paper, we attempt a critical evaluation of macroeconomic forecasting in Austria. For this purpose, we calculate conventional magnitude measures of accuracy as well as probabilities of correctly predicting directional change for the forecasts made by two Austrian institutions (WIFO and IHS) and by the OECD. ARIMA models and Holt-Winters exponential smoothing serve as benchmarks for comparison.

7.
For many companies, automatic forecasting has come to be an essential part of business analytics applications. The large amounts of data available, the short life-cycle of the analysis and the acceleration of business operations make traditional manual data analysis unfeasible in such environments. In this paper, an automatic forecasting support system that comprises several methods and models is developed in a general state space framework built in the SSpace toolbox written for Matlab. Some of the models included are well-known, such as exponential smoothing and ARIMA, but we also propose a new model family that has been used only very rarely in this context, namely unobserved components models. Additional novelties include the use of unobserved components models in an automatic identification environment and the comparison of their forecasting performances with those of exponential smoothing and ARIMA models estimated using different software packages. The new system is tested empirically on a daily dataset of all of the products sold by a franchise chain in Spain (166 products over a period of 517 days). The system works well in practice and the proposed automatic unobserved components models compare very favorably with other methods and other well-known software packages in forecasting terms.

8.
Research on port logistics demand forecasting and development models for the Guangxi Beibu Gulf Port
As a combination port, the Guangxi Beibu Gulf Port faces steadily growing port logistics demand. To capture the development trend of its port logistics, this paper builds a port logistics demand forecasting model for the Guangxi Beibu Gulf Port using triple exponential smoothing. Based on port throughput over the ten years from 2000 to 2009, it forecasts port throughput for 2010–2012 and offers recommendations for the sound development of port logistics.
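Triple exponential smoothing as named in the abstract is commonly implemented in Brown's form, with three smoothing passes and a quadratic forecast function. A minimal sketch follows; the smoothing constant and the crude initialization are arbitrary assumptions, since the paper's settings are not given here.

def brown_triple_smoothing_forecast(y, alpha=0.3, horizon=3):
    # Three smoothing passes, then a quadratic forecast function in the horizon m.
    s1 = s2 = s3 = y[0]                   # crude initialization from the first observation
    for obs in y[1:]:
        s1 = alpha * obs + (1 - alpha) * s1
        s2 = alpha * s1 + (1 - alpha) * s2
        s3 = alpha * s2 + (1 - alpha) * s3
    a = 3 * s1 - 3 * s2 + s3
    b = (alpha / (2 * (1 - alpha) ** 2)) * (
        (6 - 5 * alpha) * s1 - 2 * (5 - 4 * alpha) * s2 + (4 - 3 * alpha) * s3)
    c = (alpha ** 2 / (1 - alpha) ** 2) * (s1 - 2 * s2 + s3)
    return [a + b * m + 0.5 * c * m ** 2 for m in range(1, horizon + 1)]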

9.
The well-developed ETS (ExponenTial Smoothing, or Error, Trend, Seasonality) method incorporates a family of exponential smoothing models in state space representation and is widely used for automatic forecasting. The existing ETS method uses information criteria for model selection by choosing an optimal model with the smallest information criterion among all models fitted to a given time series. The ETS method under such a model selection scheme suffers from computational complexity when applied to large-scale time series data. To tackle this issue, we propose an efficient approach to ETS model selection by training classifiers on simulated data to predict appropriate model component forms for a given time series. We provide a simulation study to show the model selection ability of the proposed approach on simulated data. We evaluate our approach on the widely used M4 forecasting competition dataset in terms of both point forecasts and prediction intervals. To demonstrate the practical value of our method, we showcase the performance improvements from our approach on a monthly hospital dataset.

10.
Information criteria (IC) are often used to decide between forecasting models. Commonly used criteria include Akaike's IC and Schwarz's Bayesian IC. They involve the sum of two terms: the model's log likelihood and a penalty for the number of model parameters. The likelihood is calculated with equal weight being given to all observations. We propose that greater weight should be put on more recent observations in order to reflect more recent accuracy. This seems particularly pertinent when selecting among exponential smoothing methods, as they are based on an exponential weighting principle. In this paper, we use exponential weighting within the calculation of the log likelihood for the IC. Our empirical analysis uses supermarket sales and call centre arrivals data. The results show that basing model selection on the new exponentially weighted IC can outperform individual models and selection based on the standard IC.
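A minimal sketch of the idea, assuming Gaussian one-step-ahead errors and a simple geometric decay of the observation weights (the paper's exact weighting and scaling may differ), is:

import numpy as np

def exponentially_weighted_aic(residuals, n_params, lam=0.05):
    # Observation weights decay geometrically with age and are rescaled to sum
    # to the sample size, so the criterion stays on the same scale as the usual AIC.
    e = np.asarray(residuals, dtype=float)
    n = len(e)
    ages = np.arange(n)[::-1]                      # 0 = most recent residual
    w = (1 - lam) ** ages
    w *= n / w.sum()
    sigma2 = np.sum(w * e ** 2) / n                # weighted error variance
    weighted_loglik = -0.5 * np.sum(w * (np.log(2 * np.pi * sigma2) + e ** 2 / sigma2))
    return -2 * weighted_loglik + 2 * n_params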

11.
This paper describes the approach that we implemented for producing the point forecasts and prediction intervals for our M4-competition submission. The proposed simple combination of univariate models (SCUM) is a median combination of the point forecasts and prediction intervals of four models, namely exponential smoothing, complex exponential smoothing, automatic autoregressive integrated moving average and dynamic optimised theta. Our submission performed very well in the M4-competition, being ranked 6th for both the point forecasts (with only a small difference from the 2nd-ranked submission) and the prediction intervals, and 2nd and 3rd for the point forecasts of the weekly and quarterly data, respectively.
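The combination step itself is simple. The sketch below takes the element-wise median across the four models' forecasts; the same operation would be applied to the lower and upper prediction-interval bounds.

import numpy as np

def median_combination(forecasts):
    # forecasts: dict mapping model name -> array of h point forecasts
    # (e.g. ETS, complex exponential smoothing, ARIMA, dynamic optimised Theta).
    stacked = np.vstack(list(forecasts.values()))  # shape (n_models, h)
    return np.median(stacked, axis=0)              # element-wise median across models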

12.
Looking ahead thirty years is a difficult task, but it is not impossible. In this paper we illustrate how to evaluate such long-term forecasts. Long-term forecasting is likely to be dominated by trend curves, particularly the simple linear and exponential trends. However, there will certainly be breaks in their parameter values at some unknown points, so that eventually the forecasts will be unsatisfactory. We investigate whether or not simple methods of long-run forecasting can ever be successful, once one takes into account the uncertainty level associated with the forecasts.

13.
With trend inflation now widely understood to be important to the accuracy of longer-term inflation forecasts, this paper assesses alternative models of trend inflation. Reflecting the models which are common in reduced-form inflation modeling and forecasting, we specify a range of models of inflation that incorporate different trend specifications. We compare the models on the basis of their out-of-sample forecast accuracy, both point and density. Our results show that it is difficult to say that any one model of trend inflation is the best. Several different trend specifications seem to be about equally accurate, and the relative accuracy is somewhat prone to instabilities over time.

14.
Combination methods have performed well in time series forecast competitions. This study proposes a simple but general methodology for combining time series forecast methods. Weights are calculated using a cross-validation scheme that assigns greater weights to methods with more accurate in-sample predictions. The methodology was used to combine forecasts from the Theta, exponential smoothing, and ARIMA models, and placed fifth in the M4 Competition for both point and interval forecasting.
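The abstract does not give the exact weighting function, so the following sketch uses inverse cross-validated errors, normalised to sum to one, as a plausible stand-in for "greater weights to methods with more accurate in-sample predictions".

def cv_accuracy_weights(cv_errors):
    # cv_errors: dict mapping method name -> cross-validated error on the history.
    # Smaller error -> larger weight; weights are normalised to sum to one.
    inverse = {name: 1.0 / max(err, 1e-12) for name, err in cv_errors.items()}
    total = sum(inverse.values())
    return {name: v / total for name, v in inverse.items()}

# A combined forecast would then be sum(weights[name] * forecast[name] for name in weights).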

15.
In Winters' seasonal exponential smoothing methods, a time series is decomposed into level, trend and seasonal components that change over time. The seasonal factors are initialized so that their average is 0 in the additive version or 1 in the multiplicative version. Usually, only one seasonal factor is updated each period, so the average of the seasonal factors is no longer 0 or 1 and the 'seasonal factors' no longer carry the usual meaning of seasonal factors. We provide an equivalent reformulation of previous equations for renormalizing the components in the additive version. This form of the renormalization equations is then adapted to new renormalization formulas for the multiplicative Winters' method. For both the standard and renormalized equations we make a minor change to the seasonal equation. Predictions from our renormalized smoothing values are the same as those from the original smoothed values. The formulas can be applied every period or only when required; however, we recommend renormalization every period. We show that, in the multiplicative version, the level and trend should be adjusted along with the seasonal component.
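A minimal sketch of the renormalization idea, assuming the usual Winters forecast functions (the paper's exact recursions are not reproduced), is shown below. In the multiplicative case the level and trend are rescaled by the mean seasonal factor so that forecasts are unchanged; in the additive case the mean is absorbed into the level.

def renormalize_multiplicative(level, trend, seasonals):
    # Divide the seasonal factors by their mean and rescale level and trend by
    # that mean, so (level + k * trend) * seasonal is unchanged at every horizon k.
    mean_s = sum(seasonals) / len(seasonals)
    return level * mean_s, trend * mean_s, [s / mean_s for s in seasonals]

def renormalize_additive(level, seasonals):
    # Shift the seasonal factors to average 0 and absorb the shift into the level,
    # so level + k * trend + seasonal is unchanged.
    mean_s = sum(seasonals) / len(seasonals)
    return level + mean_s, [s - mean_s for s in seasonals]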

16.
Forecasting the real estate market is an important part of studying the Chinese economy. Most existing methods have strict requirements on input variables and are complex in parameter estimation. To obtain better prediction results, a modified Holt's exponential smoothing (MHES) method is proposed to predict housing prices from historical data. Unlike the traditional exponential smoothing models, MHES sets different weights on historical data, and its smoothing parameters depend on the sample size. Meanwhile, the proposed MHES incorporates the whale optimization algorithm (WOA) to obtain the optimal parameters. Housing price data from Kunming, Changchun, Xuzhou and Handan were used to test the performance of the model. The results for the four cities indicate that the proposed method has smaller prediction errors and shorter computation times than traditional models. Therefore, WOA-MHES can be applied efficiently to housing price forecasting and can be a reliable tool for market investors and policy makers.
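For orientation only, the sketch below shows standard Holt's linear exponential smoothing, the baseline that MHES modifies, together with a plain grid search over the smoothing constants as a stand-in for the WOA step. MHES's sample-size-dependent parameters and weighting are not reproduced here.

import numpy as np

def holt_forecast(y, alpha, beta, horizon=1):
    # Standard Holt's linear (two-parameter) exponential smoothing.
    level, trend = y[0], y[1] - y[0]
    for obs in y[1:]:
        previous_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - previous_level) + (1 - beta) * trend
    return level + horizon * trend

def fit_smoothing_constants(y, grid=np.linspace(0.05, 0.95, 19)):
    # Grid search over (alpha, beta) minimising one-step in-sample squared error;
    # used here only as a simple stand-in for the WOA search described in the abstract.
    best = None
    for a in grid:
        for b in grid:
            sse = sum((y[t] - holt_forecast(y[:t], a, b)) ** 2 for t in range(2, len(y)))
            if best is None or sse < best[0]:
                best = (sse, a, b)
    return best[1], best[2]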

17.
Global forecasting models (GFMs) that are trained across a set of multiple time series have shown superior results in many forecasting competitions and real-world applications compared with univariate forecasting approaches. One aspect of the popularity of statistical forecasting models such as ETS and ARIMA is their relative simplicity and interpretability (in terms of relevant lags, trend, seasonality, and other attributes), while GFMs typically lack interpretability, especially with respect to particular time series. This reduces the trust and confidence of stakeholders when making decisions based on the forecasts, as they cannot understand the predictions. To mitigate this problem, we propose a novel local model-agnostic interpretability approach to explain the forecasts from GFMs. We train simpler univariate surrogate models that are considered interpretable (e.g., ETS) on the predictions of the GFM, either on samples within a neighbourhood obtained through bootstrapping or directly on the one-step-ahead forecasts of the global black-box model for the time series that needs to be explained. We then evaluate the explanations for the forecasts of the global models in both qualitative and quantitative aspects such as accuracy, fidelity, stability, and comprehensibility, and show the benefits of our approach.

18.
In this study we analyze existing and improved methods for forecasting incoming calls to telemarketing centers for the purposes of planning and budgeting. We analyze the use of additive and multiplicative versions of Holt–Winters (HW) exponentially weighted moving average models and compare them with Box–Jenkins (ARIMA) modeling with intervention analysis. We determine the forecasting accuracy of HW and ARIMA models for samples of telemarketing data. Although there is much evidence in the recent literature that "simple models" such as Holt–Winters perform as well as or better than more complex models, we find that ARIMA models with intervention analysis perform better for the time series studied.

19.
This paper evaluates the performances of prediction intervals generated from alternative time series models, in the context of tourism forecasting. The forecasting methods considered include the autoregressive (AR) model, the AR model using the bias-corrected bootstrap, seasonal ARIMA models, innovations state space models for exponential smoothing, and Harvey's structural time series models. We use thirteen monthly time series for the number of tourist arrivals to Hong Kong and Australia. The mean coverage rates and widths of the alternative prediction intervals are evaluated in an empirical setting. It is found that all models produce satisfactory prediction intervals, except for the autoregressive model. In particular, those based on the bias-corrected bootstrap perform best in general, providing tight intervals with accurate coverage rates, especially when the forecast horizon is long.

20.
Standard selection criteria for forecasting models focus on information that is calculated for each series independently, disregarding the general tendencies and performance of the candidate models. In this paper, we propose a new way to perform statistical model selection and model combination that incorporates the base rates of the candidate forecasting models, which are then revised so that the per-series information is taken into account. We examine two schemes that are based on the precision and sensitivity information from the contingency table of the base rates. We apply our approach on pools of either exponential smoothing or ARMA models, considering both simulated and real time series, and show that our schemes work better than standard statistical benchmarks. We test the significance and sensitivity of our results, discuss the connection of our approach to other cross-learning approaches, and offer insights regarding implications for theory and practice.

