Similar Documents
20 similar documents found.
1.
Analysis, model selection and forecasting in univariate time series models can be routinely carried out for models in which the model order is relatively small. Under an ARMA assumption, classical estimation, model selection and forecasting can be routinely implemented with the Box–Jenkins time domain representation. However, this approach becomes at best prohibitive and at worst impossible when the model order is high. In particular, the standard assumption of stationarity imposes constraints on the parameter space that are increasingly complex. One solution within the pure AR domain is the latent root factorization in which the characteristic polynomial of the AR model is factorized in the complex domain, and where inference questions of interest and their solution are expressed in terms of the implied (reciprocal) complex roots; by allowing for unit roots, this factorization can identify any sustained periodic components. In this paper, as an alternative to identifying periodic behaviour, we concentrate on frequency domain inference and parameterize the spectrum in terms of the reciprocal roots, and, in addition, incorporate Gegenbauer components. We discuss a Bayesian solution to the various inference problems associated with model selection involving a Markov chain Monte Carlo (MCMC) analysis. One key development presented is a new approach to forecasting that utilizes a Metropolis step to obtain predictions in the time domain even though inference is being carried out in the frequency domain. This approach provides a more complete Bayesian solution to forecasting for ARMA models than the traditional approach that truncates the infinite AR representation, and extends naturally to Gegenbauer ARMA and fractionally differenced models.
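The latent-root factorization described above is easy to reproduce numerically. A minimal sketch (plain NumPy, not the paper's Bayesian MCMC machinery) that recovers the reciprocal roots of an AR polynomial and checks stationarity:

```python
import numpy as np

def ar_reciprocal_roots(phi):
    """Reciprocal roots of the AR characteristic polynomial
    1 - phi_1*B - ... - phi_p*B^p, i.e. the roots of
    z^p - phi_1*z^(p-1) - ... - phi_p. The process is stationary
    iff every reciprocal root lies strictly inside the unit circle."""
    coeffs = np.concatenate(([1.0], -np.asarray(phi, dtype=float)))
    return np.roots(coeffs)

# AR(2) with a damped stochastic cycle: phi = (1.2, -0.8)
roots = ar_reciprocal_roots([1.2, -0.8])
print(np.abs(roots))  # both moduli equal sqrt(0.8) < 1, so stationary
```

Complex-conjugate reciprocal roots of this kind correspond to the sustained or damped periodic components the factorization is designed to expose.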

2.
This paper re-examines whether the time series properties of aggregate consumption, real wages, and asset returns can be explained by a neoclassical model. Previous empirical rejections of the model have suggested that the optimal labour contract model might be appropriate for understanding the time series properties of the real wage rate and consumption. We show that an optimal contract model restricts the long-run relation of the real wage rate and consumption. We exploit this long-run restriction (cointegration restriction) for estimating and testing the model, using Ogaki and Park's (1989) cointegration approach. This long-run restriction involves a parameter that we call the long-run intertemporal elasticity of substitution (IES) for non-durable consumption but does not involve the IES for leisure. This allows us to estimate the long-run IES for non-durable consumption from a cointegrating regression. Tests for the null of cointegration do not reject our model. As a further analysis, our estimates of the long-run IES for non-durable consumption are used to estimate the discount factor and a coefficient of time-nonseparability using Hansen's (1982) Generalized Method of Moments. We form a specification test for our model à la Hausman (1978) from these two steps. This specification test does not reject our model. © 1996 John Wiley & Sons, Ltd.
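The first-stage cointegrating regression behind this kind of analysis can be sketched in a few lines; the variable names and the simple OLS step here are illustrative stand-ins, not the authors' exact specification:

```python
def cointegrating_regression(c, w):
    """First-stage OLS of (log) consumption on the (log) real wage.
    Under cointegration, the slope estimates the long-run elasticity
    and the residual series should be stationary."""
    n = len(c)
    wbar, cbar = sum(w) / n, sum(c) / n
    sxx = sum((wi - wbar) ** 2 for wi in w)
    sxy = sum((wi - wbar) * (ci - cbar) for wi, ci in zip(w, c))
    beta = sxy / sxx
    alpha = cbar - beta * wbar
    resid = [ci - alpha - beta * wi for ci, wi in zip(c, w)]
    return alpha, beta, resid

# toy data satisfying an exact long-run relation c = 1 + 0.6*w
w = [1.0, 2.0, 3.0, 4.0]
c = [1.0 + 0.6 * wi for wi in w]
alpha, beta, resid = cointegrating_regression(c, w)
print(alpha, beta)
```

In practice the residuals would then be fed into a stationarity/cointegration test, as in the Ogaki–Park approach cited above.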

3.
A method is presented to improve the precision of timely data, which are published when final data are not yet available. Explicit statistical formulae, equivalent to Kalman filtering, are derived to combine historical with preliminary information. The application of these formulae is validated by the data, through a statistical test of compatibility between sources of information. A measure of the share of precision of each source of information is also derived. An empirical example with Mexican economic data serves to illustrate the procedure.
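The core of such a combination formula, in its simplest scalar form, is the precision-weighted (Kalman-type) update; the numbers below are made up for illustration:

```python
def combine_sources(prelim, var_prelim, hist, var_hist):
    """Precision-weighted combination of a preliminary figure with a
    historical (model-based) prediction, i.e. the scalar Kalman update.
    Returns the combined estimate, its variance, and the preliminary
    source's share of the total precision."""
    w_p = 1.0 / var_prelim
    w_h = 1.0 / var_hist
    est = (w_p * prelim + w_h * hist) / (w_p + w_h)
    var = 1.0 / (w_p + w_h)
    share_prelim = w_p / (w_p + w_h)
    return est, var, share_prelim

est, var, share = combine_sources(prelim=102.0, var_prelim=4.0,
                                  hist=100.0, var_hist=1.0)
print(est, var, share)  # 100.4, 0.8, 0.2
```

The combined variance is always smaller than either source's variance, which is the sense in which the timely figure gains precision.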

4.
Personnel appointment is a very important and fundamental problem in any society. Because of information asymmetry, a personnel department can rarely fully observe a candidate's quality and is therefore hard-pressed to make appropriate appointments. This paper analyses the problem using the basic principles of game theory and information economics. The analysis shows that personnel appointment is in fact a dynamic signalling game with incomplete information: provided the personnel department designs signals carrying sufficient information, a separating equilibrium arises in the game. The department can then identify each candidate's type from the signals the candidate sends, make appropriate appointments, and ultimately achieve an optimal allocation of human resources.

5.
The paper focuses on the time series aggregate consumption function for the Hungarian economy. The empirical econometric analysis presented produces several new results. First, it shows that the income and consumption variables used in this type of model by previous studies are I(2) variables. Consequently, error correction models formulated in terms of their first differences are mis-specified. Second, it provides strong empirical evidence supporting the view that consumption (and thus saving) was (real) interest rate elastic during the period under investigation, affecting both the long-run and the short-run relationships between income and consumption. Third, it provides empirical evidence on choosing the proper income variable in the consumption function. The model selection results clearly support the model with the unadjusted total real money income variable. Fourth, it shows that for the period 1960–1986 a correctly specified and stable error correction model can be established. Finally, the analysis shows that when used for the period beyond 1986, this model suffers from a structural break.

6.
7.
Forecasting compositional time series
Compositional data sets occur in many disciplines and give rise to some interesting statistical considerations. In recent years, the modelling and forecasting of compositional time series has seen some important developments, although this approach does not seem to be widely known. This paper represents a modest step towards rectifying this. After briefly setting out the basic structure of compositional data sets and outlining the implications for forecasting compositional time series, it illustrates the techniques using three examples: modelling and forecasting expenditure shares in the U.K. economy; forecasting trends in obesity in England; and examining shifts in the proportions of English first-class cricketers born during particular quarters of the year.
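A standard device for forecasting compositional series (not necessarily the exact one used in this paper) is to map shares to unconstrained log-ratio coordinates, forecast there, and invert; a minimal sketch using the additive log-ratio transform:

```python
import math

def alr(shares):
    """Additive log-ratio transform: maps a composition (positive
    shares summing to 1) to unconstrained coordinates, taking the
    last share as the reference component."""
    return [math.log(s / shares[-1]) for s in shares[:-1]]

def alr_inv(z):
    """Inverse transform: back from log-ratio space to shares."""
    e = [math.exp(v) for v in z] + [1.0]
    total = sum(e)
    return [v / total for v in e]

comp = [0.5, 0.3, 0.2]      # e.g. three expenditure shares
z = alr(comp)               # forecast models operate on z
print(alr_inv(z))           # inverting recovers the composition
```

Forecasts made in the transformed space automatically respect the positivity and sum-to-one constraints once inverted.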

8.
Many time series are asymptotically unstable and intrinsically nonstationary, i.e. satisfy difference equations with roots greater than one (in modulus) and with time-varying parameters. Models developed by Box–Jenkins solve these problems by imposing on data two transformations: differencing (unit-roots) and exponential (Box–Cox). Owing to Jensen's inequality, these techniques are not optimal for forecasting and sometimes may be arbitrary. This paper develops a method for modeling time series with unstable roots and changing parameters. In particular, the effectiveness of recursive estimators in tracking time-varying unstable parameters is shown with applications to the Box–Jenkins data sets. The method is useful for forecasting time series with trends and cycles whose pattern changes over time.
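A simple way to see how a recursive estimator tracks a changing parameter is recursive least squares with a forgetting factor; this sketch is illustrative and is not the authors' exact estimator:

```python
def rls_ar1(y, lam=0.95):
    """Recursive least squares with forgetting factor lam for the
    time-varying AR(1) model y_t = phi_t*y_{t-1} + e_t. Geometric
    discounting of old data lets the estimate track a drifting,
    possibly unstable (|phi| > 1), autoregressive root."""
    phi, P = 0.0, 1e6          # diffuse start
    path = []
    for t in range(1, len(y)):
        x = y[t - 1]
        k = P * x / (lam + P * x * x)       # gain
        phi = phi + k * (y[t] - phi * x)    # prediction-error update
        P = (P - k * x * P) / lam
        path.append(phi)
    return path

# noise-free AR(1) with phi = 0.9: the estimate converges to 0.9
series = [1.0]
for _ in range(30):
    series.append(0.9 * series[-1])
print(rls_ar1(series)[-1])
```

With lam < 1 the same recursion keeps adapting when phi itself drifts, which is the point of the tracking argument in the abstract.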

9.
We consider a stock market model where prices satisfy a stochastic differential equation with a stochastic drift process. The investor’s objective is to maximize the expected utility of consumption and terminal wealth under partial information; the latter meaning that investment decisions are based on the knowledge of the stock prices only. We derive explicit representations of optimal consumption and trading strategies using Malliavin calculus. The results apply to both classical models for the drift process, a mean-reverting Ornstein-Uhlenbeck process and a continuous time Markov chain. The model can be transformed into a complete market model with full information. This allows us to use results on optimization under convex constraints, which are applied in the numerical part to implement more stable strategies. Supported by the Austrian Science Fund FWF, project P17947-N12. We thank two anonymous referees for their comments which led to a considerable improvement of the paper.
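For intuition only: in the full-information benchmark with constant coefficients and power utility, the optimal stock fraction reduces to the classical Merton ratio; under partial information the unobserved drift is replaced by its filter estimate. The parameter values below are arbitrary:

```python
def merton_fraction(mu, r, sigma, gamma):
    """Merton's optimal stock fraction for power utility with risk
    aversion gamma, risk-free rate r, drift mu and volatility sigma.
    Under partial information, mu would be the filtered drift."""
    return (mu - r) / (gamma * sigma ** 2)

# illustrative numbers: 8% drift, 2% risk-free rate, 20% vol, gamma = 2
frac = merton_fraction(0.08, 0.02, 0.2, 2.0)
print(frac)  # 0.75: invest 75% of wealth in the stock
```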

10.
In this paper, sequential procedures are proposed for jointly monitoring all elements of the covariance matrix at lag 0 of a multivariate time series. All control charts are based on exponential smoothing. The Mahalanobis distance is used as a measure of the distance between the target values and the actual values. We distinguish between residual control schemes and modified control schemes. Several properties of these charts are proved under the assumption that the target process is a stationary Gaussian process. In an extensive Monte Carlo study, all procedures are compared with each other. The average run length is used as a measure of the performance of a control chart. An empirical example on Eastern European stock markets illustrates how the autocovariance and cross-covariance structure of financial assets can be monitored by these methods.
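The basic smoothing-plus-Mahalanobis idea can be sketched for a mean target; the paper monitors covariance elements instead, which works analogously on smoothed cross-products, so this is only an illustration of the mechanism:

```python
import numpy as np

def ewma_chart(xs, target, cov_inv, lam=0.2):
    """EWMA-smoothed observation vectors scored by their Mahalanobis
    distance from the target vector; large values signal a change."""
    target = np.asarray(target, dtype=float)
    z = target.copy()                       # start at the target
    stats = []
    for x in xs:
        z = lam * np.asarray(x, dtype=float) + (1 - lam) * z
        d = z - target
        stats.append(float(d @ cov_inv @ d))
    return stats

on_target = ewma_chart([[0.0, 0.0]] * 5, [0.0, 0.0], np.eye(2))
shifted = ewma_chart([[1.0, 1.0]] * 5, [0.0, 0.0], np.eye(2))
print(on_target[-1], shifted[-1])  # stays at 0 vs. climbs after the shift
```

A control limit on the chart statistic would turn this into an actual stopping rule, with the average run length measuring its performance.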

11.
The concept of Granger-causality is formulated for a finite-dimensional multiple time series. Special attention is given to causality patterns in autoregressive series, and it is shown how these patterns can be tested under quite general assumptions using a χ2 statistic. The power of the test is discussed, and it is shown that the χ2 statistic results from a Lagrange multiplier test in the Gaussian case. The causality test is tried both on artificial data and some economic time series. Finally we consider the problem of constrained estimation in models with a known causality structure.
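A textbook version of such a causality test regresses the target series on its own lags, with and without lags of the candidate cause, and compares residual sums of squares. This sketch uses a likelihood-ratio form of the χ2 statistic, not necessarily the Lagrange multiplier form analysed in the paper:

```python
import numpy as np

def granger_lr_stat(y, x, p=1):
    """Statistic for the null 'x does not Granger-cause y':
    T * log(RSS_restricted / RSS_full), where the full regression adds
    p lags of x to p lags of y; asymptotically chi-squared with p
    degrees of freedom under the null."""
    y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)

    def lags(z):
        return np.column_stack([z[p - j:len(z) - j] for j in range(1, p + 1)])

    Y = y[p:]
    X_r = np.column_stack([np.ones(len(Y)), lags(y)])   # restricted
    X_f = np.column_stack([X_r, lags(x)])               # full

    def rss(X):
        beta = np.linalg.lstsq(X, Y, rcond=None)[0]
        return float(np.sum((Y - X @ beta) ** 2))

    return len(Y) * np.log(rss(X_r) / rss(X_f))

# artificial data: x leads y by one period, so the statistic is large
rng = np.random.default_rng(0)
x = rng.standard_normal(300)
y = np.roll(x, 1) + 0.1 * rng.standard_normal(300)
stat = granger_lr_stat(y, x, p=1)
print(stat)
```

Comparing the statistic with a χ2(p) critical value (3.84 at the 5% level for p = 1) gives the test decision.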

12.
13.
14.
Mean monthly flows from thirty rivers in North and South America are used to test the short-term forecasting ability of seasonal ARIMA, deseasonalized ARMA, and periodic autoregressive models. The series were split into two sections and models were calibrated to the first portion of the data. The models were then used to generate one-step-ahead forecasts for the second portion of the data. The forecast performance is compared using various measures of accuracy. The results suggest that a periodic autoregressive model, identified by using the partial autocorrelation function, provided the most accurate forecasts.
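Accuracy measures of the kind used in such hold-out comparisons are straightforward to compute; the numbers below are illustrative only:

```python
def forecast_accuracy(actual, forecast):
    """Standard one-step-ahead accuracy measures used to compare
    competing models on a hold-out sample."""
    errors = [a - f for a, f in zip(actual, forecast)]
    n = len(errors)
    mae = sum(abs(e) for e in errors) / n
    rmse = (sum(e * e for e in errors) / n) ** 0.5
    mape = 100.0 * sum(abs(e / a) for e, a in zip(errors, actual)) / n
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape}

res = forecast_accuracy([100.0, 110.0, 105.0], [98.0, 112.0, 105.0])
print(res)
```

Each model's one-step-ahead forecasts over the second portion of the data would be scored this way, and the model with the smallest errors preferred.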

15.
A normality assumption is usually made for the discrimination between two stationary time series processes. A nonparametric approach is desirable whenever there is doubt concerning the validity of this normality assumption. In this paper a nonparametric approach is suggested based on kernel density estimation, first on (p+1) sample autocorrelations and second on (p+1) consecutive observations. A numerical comparison is made between Fisher's linear discrimination based on sample autocorrelations and kernel density discrimination for AR and MA processes with and without Gaussian noise. The methods are applied to some seismological data.
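A minimal version of kernel density discrimination, with a fixed bandwidth and made-up one-dimensional training samples, assigns a new point to the class with the higher estimated density:

```python
import math

def kde(train, h):
    """Gaussian kernel density estimate built from a training sample."""
    def f(x):
        return sum(math.exp(-0.5 * ((x - t) / h) ** 2) for t in train) / (
            len(train) * h * math.sqrt(2 * math.pi))
    return f

def classify(x, f0, f1):
    """Assign x to the class whose estimated density is higher."""
    return 0 if f0(x) >= f1(x) else 1

f0 = kde([-1.1, -0.9, -1.0, -1.2], h=0.3)   # class 0 centred near -1
f1 = kde([0.9, 1.1, 1.0, 1.3], h=0.3)       # class 1 centred near +1
print(classify(-0.8, f0, f1), classify(1.05, f0, f1))
```

In the paper's setting, the inputs would be vectors of (p+1) sample autocorrelations or consecutive observations rather than scalars, but the decision rule is the same.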

16.
M. Roubens, Metrika, 1972, 19(1), 178–184
Summary. In the following, a causal pattern buried in autocorrelated noise is considered. The causal pattern may be described by models such as trends, polynomial trajectories, and growing sines. Based on a new criterion, called expolynomial, estimators of the coefficients of a polynomial model are obtained. Characteristic functions of the estimators are derived and the first two moments calculated. Continuous time series are briefly studied to show similarities between discrete and continuous observations. Popular exponential smoothing is a special case of expolynomial smoothing.
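For reference, simple exponential smoothing, the special case mentioned in the summary, in a few lines:

```python
def exp_smooth(y, alpha=0.3):
    """Simple exponential smoothing: each smoothed value is a
    convex combination of the new observation and the previous
    smoothed value, with smoothing constant alpha."""
    s = y[0]
    out = [s]
    for v in y[1:]:
        s = alpha * v + (1 - alpha) * s
        out.append(s)
    return out

print(exp_smooth([10.0, 12.0, 11.0, 13.0], alpha=0.5))
```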

17.
This paper examines the momentum effect for twenty cryptocurrencies compared to the US stock market. For this purpose, we implement a dynamic modeling approach to define and test momentum periods that follow a formation period for interday and various intraday price levels. We find evidence that large proportions of the asset classes’ formation periods are followed by momentum periods, strongly supporting the momentum effect. In particular, cryptocurrencies have significantly larger and longer momentum periods at all frequencies, which we attribute to the lower derivability of their intrinsic value, leading to a higher degree of noise traders in the market. A momentum trading strategy based on the identical approach outperforms a buy-and-hold strategy for both asset classes, while only cryptocurrencies have higher risk-adjusted returns and lower downside risks than a passive investment. We also find critical price levels during structural elements of the momentum period where volatility increases briefly but sharply and consequently initiates a price impulse in the direction of the momentum.
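A toy momentum rule (go long when the trailing return is positive, stay flat otherwise) illustrates the general idea; it is far simpler than the paper's dynamic modeling approach:

```python
def momentum_returns(prices, lookback=2):
    """Per-period returns of a toy momentum rule: hold the asset over
    the next period only if the trailing return over `lookback`
    periods is positive, otherwise earn zero (stay in cash)."""
    rets = []
    for t in range(lookback, len(prices) - 1):
        signal = prices[t] > prices[t - lookback]
        r = prices[t + 1] / prices[t] - 1.0
        rets.append(r if signal else 0.0)
    return rets

print(momentum_returns([1.0, 2.0, 3.0, 2.0, 1.0, 2.0]))
```

A real backtest would add transaction costs, risk adjustment and the downside-risk measures used in the comparison above.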

18.
Economic and financial data often take the form of a collection of curves observed consecutively over time. Examples include intraday price curves, yield and term structure curves, and intraday volatility curves. Such curves can be viewed as a time series of functions. A fundamental issue that must be addressed, before an attempt is made to statistically model such data, is whether these curves, perhaps suitably transformed, form a stationary functional time series. This paper formalizes the assumption of stationarity in the context of functional time series and proposes several procedures to test the null hypothesis of stationarity. The tests are nontrivial extensions of the broadly used tests in the KPSS family. The properties of the tests under several alternatives, including change-point and I(1), are studied, and new insights, present only in the functional setting, are uncovered. The theory is illustrated by a small simulation study and an application to intraday price curves.
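The flavour of a KPSS-type statistic can be conveyed in the scalar case: cumulate the demeaned series and normalise. This sketch omits the long-run variance correction of the real test and, of course, the functional extension that is the point of the paper:

```python
import random

def kpss_level_stat(y):
    """KPSS-type statistic for the null of level stationarity: the
    normalised sum of squared partial sums of the demeaned series.
    (The long-run variance correction of the real test is omitted.)
    Large values point towards non-stationarity, e.g. an I(1) series."""
    T = len(y)
    ybar = sum(y) / T
    e = [v - ybar for v in y]
    s2 = sum(v * v for v in e) / T
    S, acc = 0.0, 0.0
    for v in e:
        S += v
        acc += S * S
    return acc / (T * T * s2)

random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(500)]   # stationary
rw, s = [], 0.0
for v in noise:                                        # random walk: I(1)
    s += v
    rw.append(s)
print(kpss_level_stat(noise), kpss_level_stat(rw))
```

The random-walk series produces a much larger statistic than the white noise, which is exactly the behaviour the functional tests generalise to curves.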

19.
In this paper several cumulative sum (CUSUM) charts for the mean of a multivariate time series are introduced. We extend the control schemes for independent multivariate observations of Crosier [Technometrics (1988) Vol. 30, pp. 187–194], Pignatiello and Runger [Journal of Quality Technology (1990) Vol. 22, pp. 173–186], and Ngai and Zhang [Statistica Sinica (2001) Vol. 11, pp. 747–766] to multivariate time series by taking into account the probability structure of the underlying stochastic process. We consider modified charts and residual schemes as well. It is analyzed under which conditions these charts are directionally invariant. In an extensive Monte Carlo study these charts are compared with the CUSUM scheme of Theodossiou [Journal of the American Statistical Association (1993) Vol. 88, pp. 441–448], the multivariate exponentially weighted moving-average (EWMA) chart of Kramer and Schmid [Sequential Analysis (1997) Vol. 16, pp. 131–154], and the control procedures of Bodnar and Schmid [Frontiers of Statistical Process Control (2006) Physica, Heidelberg]. As a measure of the performance, the maximum expected delay is used.
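A univariate two-sided CUSUM conveys the basic mechanism that the multivariate schemes generalise; k and h are the usual reference value and decision limit, here in units of the process standard deviation:

```python
def cusum(series, target=0.0, k=0.5, h=4.0):
    """Two-sided CUSUM chart: returns the first index at which either
    the upper or the lower cumulative sum exceeds the decision limit
    h, or None if the chart never signals."""
    s_hi = s_lo = 0.0
    for t, x in enumerate(series):
        s_hi = max(0.0, s_hi + (x - target) - k)
        s_lo = max(0.0, s_lo - (x - target) - k)
        if s_hi > h or s_lo > h:
            return t
    return None

in_control = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2]
shifted = in_control + [1.5] * 6        # mean shifts up by 1.5 sigma
print(cusum(in_control), cusum(shifted))
```

The delay between the shift (index 6) and the signal is what the maximum expected delay criterion in the abstract measures.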

20.
The M4 competition identified innovative forecasting methods, advancing the theory and practice of forecasting. One of the most promising innovations of M4 was the utilization of cross-learning approaches that allow models to learn from multiple series how to accurately predict individual ones. In this paper, we investigate the potential of cross-learning by developing various neural network models that adopt such an approach, and we compare their accuracy to that of traditional models that are trained in a series-by-series fashion. Our empirical evaluation, which is based on the M4 monthly data, confirms that cross-learning is a promising alternative to traditional forecasting, at least when appropriate strategies for extracting information from large, diverse time series data sets are considered. Ways of combining traditional with cross-learning methods are also examined in order to initiate further research in the field.
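Cross-learning can be illustrated in a deliberately trivial setting: estimating a drift either per series or pooled across all series. Real cross-learning models, such as the neural networks studied here, are far richer; this only shows the pooling idea:

```python
def naive_cross_learning(series_list):
    """Contrast series-by-series and cross-learned forecasting in the
    simplest possible way: predict each series' next value as its last
    value plus a drift estimated either from that series alone or
    pooled across all series."""
    diffs_all = [b - a for s in series_list for a, b in zip(s, s[1:])]
    pooled_drift = sum(diffs_all) / len(diffs_all)
    forecasts = []
    for s in series_list:
        own = [b - a for a, b in zip(s, s[1:])]
        own_drift = sum(own) / len(own)
        forecasts.append({"series-by-series": s[-1] + own_drift,
                          "cross-learned": s[-1] + pooled_drift})
    return forecasts

print(naive_cross_learning([[0.0, 1.0, 2.0], [10.0, 11.0, 12.0]]))
```

The pooled estimate gains when individual series are short or noisy, which is the intuition behind cross-learning on large, diverse data sets.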


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号