Similar Documents
20 similar documents found (search time: 15 ms)
1.
Popular goodness-of-fit tests like the famous Pearson test compare the estimated probability mass function with the corresponding hypothetical one. If the resulting divergence value is too large, the null hypothesis is rejected. For i.i.d. data, the required critical values can be computed from well-known asymptotic approximations, e.g., from an appropriate \(\chi ^2\)-distribution in the case of the Pearson statistic. This article presents an approach for deriving an asymptotic approximation when dealing with time series of autocorrelated counts. Solutions are presented both for a fully specified null model and for the case where parameters have to be estimated. The proposed approaches are exemplified for, among others, different types of CLAR(1) models, INAR(p) models, discrete ARMA models and hidden Markov models.
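The classical i.i.d. version of the Pearson statistic that the article generalizes can be sketched as follows. This is a minimal illustration with a hypothetical Poisson(2) null; the variable names are ours, and the \(\chi^2\) critical value shown is only valid without autocorrelation, which is precisely the limitation the article addresses.

```python
import numpy as np
from math import exp, factorial

# Simulate i.i.d. counts under a hypothetical Poisson(2) null.
rng = np.random.default_rng(42)
x = rng.poisson(2.0, size=500)
n = len(x)

# Hypothetical pmf over the cells {0, 1, 2, 3, 4, 5+} (tail mass pooled).
p0 = np.array([exp(-2.0) * 2.0**k / factorial(k) for k in range(5)])
p0 = np.append(p0, 1.0 - p0.sum())

obs = np.bincount(np.minimum(x, 5), minlength=6)   # n * estimated pmf
expd = n * p0                                      # n * hypothetical pmf

X2 = ((obs - expd) ** 2 / expd).sum()              # Pearson divergence
crit = 11.0705                                     # chi^2_{0.95}, df = 6 - 1
reject = X2 > crit                                 # valid only for i.i.d. data
```

For autocorrelated counts (e.g., an INAR(1) process), this critical value is no longer appropriate; corrected asymptotic approximations of the kind the article derives are then needed.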

2.
This paper reports the results of the NN3 competition, which is a replication of the M3 competition with an extension towards neural network (NN) and computational intelligence (CI) methods, in order to assess what progress has been made in the 10 years since the M3 competition. Two masked subsets of the M3 monthly industry data, containing 111 and 11 empirical time series respectively, were chosen, controlling for multiple data conditions of time series length (short/long), data patterns (seasonal/non-seasonal) and forecasting horizons (short/medium/long). The relative forecasting accuracy was assessed using the metrics from the M3, together with later extensions of scaled measures, and non-parametric statistical tests. The NN3 competition attracted 59 submissions from the NN, CI and statistics communities, making it the largest CI competition on time series data. Its main findings include: (a) only one NN outperformed the damped trend using the sMAPE, but more contenders outperformed the AutomatANN of the M3; (b) ensembles of CI approaches performed very well, better than combinations of statistical methods; (c) a novel, complex statistical method outperformed all statistical and CI benchmarks; and (d) for the most difficult subset of short and seasonal series, a methodology employing echo state neural networks outperformed all others. The NN3 results highlight the ability of NN to handle complex data, including short and seasonal time series, beyond prior expectations, and thus identify multiple avenues for future research.
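The sMAPE accuracy metric used for the rankings can be computed as below. This is the standard M3-style definition; the exact variant used in NN3 may differ slightly.

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent."""
    a = np.asarray(actual, dtype=float)
    f = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(2.0 * np.abs(f - a) / (np.abs(a) + np.abs(f)))

# Example: a two-step-ahead forecast evaluated against the actuals.
err = smape([100.0, 200.0], [110.0, 190.0])   # roughly 7.33%
```

Being bounded and scale-free, sMAPE allows averaging across the 111 series, which is why the M3-family competitions rank methods with it.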

3.
This paper analytically investigates the unit root tests proposed by Dickey and Fuller (DF) and their rank counterpart suggested by Breitung and Gouriéroux (J Econom 81(1): 7–27, 1997) (BG) under additive outlier (AO) contamination. The results show that the limiting distribution of the former test is outlier dependent, while that of the latter is outlier free. The finite-sample size properties of these tests are also investigated under different scenarios of testing contaminated unit root processes. The empirical study also considers the alternative DF rank test suggested by Granger and Hallman (J Time Ser Anal 12(3): 207–224, 1991) (GH). Fotopoulos and Ahn (J Time Ser Anal 24(6): 647–662, 2003) analytically and empirically investigated these unit root rank tests and compared them to the DF test, but for outlier-free processes. The results provided in this paper thus complement those earlier studies in the context of time series with additive outliers. Like the DF and GH unit root tests, the BG test proves to be sensitive to AO contamination, but less severely. The general conclusion is that, in practical situations where additive outliers are suspected, the DF and GH unit root tests should be avoided, whereas the BG approach can still be used.
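The sensitivity of the DF statistic to an additive outlier can be illustrated with a minimal sketch. This is our own toy construction, not the paper's experiment; the regression is the simplest no-constant DF form.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
y = np.cumsum(rng.standard_normal(n))   # random walk: a true unit root
y_ao = y.copy()
y_ao[n // 2] += 10.0                    # a single additive outlier (AO)

def df_tstat(series):
    """t-statistic of rho in: Delta y_t = rho * y_{t-1} + e_t (no constant)."""
    dy, ylag = np.diff(series), series[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    s2 = (resid @ resid) / (len(dy) - 1)
    return rho / np.sqrt(s2 / (ylag @ ylag))

t_clean, t_ao = df_tstat(y), df_tstat(y_ao)   # the AO shifts the statistic
```

The BG rank counterpart applies the analogous regression to the ranks of the series rather than its levels, which is the intuition behind its outlier-free limiting distribution.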

6.
The introduction of the Lee–Carter (LC) method marked a breakthrough in mortality forecasting, providing a simple yet powerful data-driven stochastic approach. The method has the merit of capturing the dynamics of mortality change by a single time index that is almost invariably linear. This thirtieth anniversary review of its 1992 publication examines the LC method and the large body of research that it has since spawned. We first describe the method and present a 30-year ex post evaluation of the original LC forecast for U.S. mortality. We then review the most prominent extensions of the LC method in relation to the limitations that they sought to address. With a focus on the efficacy of the various extensions, we review existing evaluations and comparisons. To conclude, we juxtapose the two main statistical approaches used, discuss further issues, and identify several potential avenues for future research.
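The basic LC model, log m(x,t) = a_x + b_x k_t, is typically fitted by a rank-1 SVD. Below is a minimal sketch of that first stage only, omitting the second-stage adjustment of k_t and the random-walk-with-drift forecasting step the method also involves; function and variable names are ours.

```python
import numpy as np

def lee_carter(log_m):
    """Fit log m_{x,t} = a_x + b_x * k_t by rank-1 SVD.

    log_m: (ages x years) matrix of log death rates.
    Identification: sum(b) = 1 and sum(k) = 0 (the usual LC constraints;
    the rows of the centered matrix sum to zero, so sum(k) = 0 automatically).
    """
    a = log_m.mean(axis=1)                            # a_x: average age profile
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    c = U[:, 0].sum()
    b = U[:, 0] / c                                   # b_x, normalized to sum to 1
    k = s[0] * c * Vt[0]                              # k_t time index
    return a, b, k

# k_t is then usually forecast as a random walk with drift, e.g.
#   drift = (k[-1] - k[0]) / (len(k) - 1)
```

The "single time index that is almost invariably linear" in the abstract refers to this k_t, whose near-linearity is what makes the drift extrapolation work so well.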

8.
When questions in business surveys about the direction of change have three reply options, “up”, “down”, and “unchanged”, a common practice is to release the results as balance indices. These are linear combinations of the response shares, i.e., the percentage share of the respondents who answered “up” minus the percentage share of those who answered “down”. Forecasters traditionally use these indices for short-term business cycle forecasting. Survey response shares can also be combined non-linearly into alternative indices, using the Carlson–Parkin method. Using IFO and ISM data, this paper tests the relative performance of Carlson–Parkin type indices versus balance indices for the short-term forecasting of industrial production growth. The main finding is that the two types of indices show no difference in forecasting performance during the Great Moderation. However, the Carlson–Parkin type indices outperform the balance indices during periods with higher output volatilities, such as before and after the Great Moderation.
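Both index types can be sketched in a few lines. The Carlson–Parkin parameterization below is one common textbook form, assuming normally distributed latent responses with a symmetric indifference band of half-width delta; the exact scaling used in the paper may differ.

```python
from statistics import NormalDist

inv_phi = NormalDist().inv_cdf   # standard normal quantile function

def balance_index(up, down, unchanged):
    """Linear balance: percentage 'up' minus percentage 'down'."""
    total = up + down + unchanged
    return 100.0 * (up - down) / total

def carlson_parkin(up_share, down_share, delta=1.0):
    """Mean of an assumed N(mu, sigma^2) latent response distribution,
    where 'up' means the latent change exceeds +delta and 'down'
    means it falls below -delta."""
    u = inv_phi(1.0 - up_share)   # equals (delta - mu) / sigma
    d = inv_phi(down_share)       # equals (-delta - mu) / sigma
    return -delta * (u + d) / (u - d)
```

With 60% "up", 10% "down" and 30% "unchanged", the balance is +50 while the Carlson–Parkin index is about 1.49 times delta; the nonlinearity matters most when the shares become extreme, consistent with the finding that the two index types diverge in high-volatility periods.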

10.
This paper offers a conceptual and partly empirical decomposition of the trends in U.S. consumer expenditures on five communications and nine transportation subcategories between 1984 and 2002. We find that inflation nearly always increases unit prices. Income effects are positive for all categories, meaning that these are all normal goods, not inferior ones. We speculate that taste changes have contributed to increasing expenditures in most categories, with the exception of out-of-town lodging, the public transit component of the public transportation category, and the old communication media categories of postage and reading. We suggest that production and technological changes have led to decreased unit prices in most categories. In the private vehicle operations categories, technological improvements dominate, so that expenditure shares have been decreasing despite increasing demand. Conversely, in the new media categories, taste changes dominate, so that expenditure shares are increasing despite technological improvements which lower prices. The decomposition explored here enhances our understanding of these important expenditure categories, and provides a useful methodology with which to examine trends in other categories as well.

11.
We propose a categorical time-varying coefficient translog cost function, where each coefficient is expressed as a nonparametric function of a categorical time variable, thereby allowing each time period to have its own set of coefficients. Our application to U.S. electricity firms reveals that this model offers two major advantages over the traditional time trend representation of technical change: (1) it is capable of producing estimates of productivity growth that closely track those obtained using the Törnqvist approximation to the Divisia index; and (2) it can solve a well-known problem commonly referred to as “the problem of trending elasticities”.
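A sketch of the functional form being described, in notation that is ours rather than the paper's: in a standard translog cost function with input prices \(p_{jt}\) and output \(y_t\), every coefficient is made a nonparametric function of the categorical time variable \(t \in \{1,\dots,T\}\),

```latex
\ln C_t \;=\; \beta_0(t) \;+\; \sum_j \beta_j(t)\,\ln p_{jt} \;+\; \beta_y(t)\,\ln y_t
\;+\; \tfrac{1}{2}\sum_j \sum_l \beta_{jl}(t)\,\ln p_{jt}\,\ln p_{lt} \;+\; \cdots
```

so that each period carries its own coefficient vector, in contrast to the usual specification in which technical change enters only through a linear or quadratic time trend.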

14.
We study the extremes of the residuals of two lifetimes and the residual of the extremes of those lifetimes. For the generalized Marshall–Olkin model and the total time transformed exponential model, we first present sufficient conditions for the extremes of the residuals to be stochastically larger than the residual of the corresponding extremes, and then investigate the stochastic order of the residual of the extremes of the two lifetimes based on the majorization of the age vector of the residuals.
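In notation commonly used for such comparisons (ours, not necessarily the paper's): for lifetimes \(X_1, X_2\) with extreme \(X_1 \vee X_2 = \max(X_1, X_2)\) and a common age \(t\), the two objects being compared are

```latex
\max\bigl( (X_1 - t \mid X_1 > t),\; (X_2 - t \mid X_2 > t) \bigr)
\quad \text{vs.} \quad
\bigl( X_1 \vee X_2 - t \mid X_1 \vee X_2 > t \bigr),
```

where "stochastically larger" refers to the usual stochastic order, \(X \ge_{\mathrm{st}} Y \iff P(X > s) \ge P(Y > s)\) for all \(s\); the majorization condition concerns the case where the two residuals are taken at possibly different ages.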
