Similar Documents
20 similar documents retrieved (search time: 15 ms)
1.
This paper discusses estimation of US inflation volatility using time-varying parameter models, in particular whether it should be modelled as a stationary or random walk stochastic process. Specifying inflation volatility as an unbounded process, as implied by the random walk, conflicts with prior beliefs, yet a stationary process cannot capture the low-frequency behaviour commonly observed in estimates of volatility. We therefore propose an alternative model with a change-point process in the volatility that allows for switches between stationary models to capture changes in the level and dynamics over the past 40 years. To accommodate the stationarity restriction, we develop a new representation that is equivalent to our model but is computationally more efficient. All models produce effectively identical estimates of volatility, but the change-point model provides more information on the level and persistence of volatility and the probabilities of changes. For example, we find a few well-defined switches in the volatility process and, interestingly, these switches line up well with economic slowdowns or changes of the Federal Reserve Chair. Moreover, a decomposition of inflation shocks into permanent and transitory components shows that a spike in volatility in the late 2000s was entirely on the transitory side and characterized by a rise above its long-run mean level during a period of higher persistence. Copyright © 2015 John Wiley & Sons, Ltd.
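For orientation, the two volatility laws the abstract contrasts can be written as competing log-volatility processes. The sketch below is a generic stochastic-volatility formulation, not the paper's exact specification; under the change-point idea the stationary parameters are allowed to switch at unknown dates.

```latex
% Random-walk (unbounded) log-volatility
\log\sigma_t^2 = \log\sigma_{t-1}^2 + \eta_t
% Stationary log-volatility; a change-point process lets (\mu,\phi,\sigma_\eta^2) switch
\log\sigma_t^2 = \mu + \phi\,(\log\sigma_{t-1}^2 - \mu) + \eta_t,
\qquad |\phi|<1,\quad \eta_t \sim \mathcal{N}(0,\sigma_\eta^2)
```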

2.
This paper proposes a new approach to handle nonparametric stochastic frontier (SF) models. It is based on local maximum likelihood techniques. The model is presented as encompassing some anchorage parametric model in a nonparametric way. First, we derive asymptotic properties of the estimator for the general case (local linear approximations). Then the results are tailored to a SF model where the convoluted error term (efficiency plus noise) is the sum of a half normal and a normal random variable. The parametric anchorage model is a linear production function with a homoscedastic error term. The local approximation is linear for both the production function and the parameters of the error terms. The performance of our estimator is then established in finite samples using simulated data sets as well as cross-sectional data on US commercial banks. The methods appear to be robust, numerically stable and particularly useful for investigating a production process and the derived efficiency scores.
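For reference, the normal–half-normal convolution mentioned in the abstract has the standard closed-form density for the composed error $\varepsilon = v - u$ (a textbook expression, not specific to the local-likelihood estimator, which re-estimates its parameters locally around each evaluation point):

```latex
f(\varepsilon) = \frac{2}{\sigma}\,\phi\!\left(\frac{\varepsilon}{\sigma}\right)
\Phi\!\left(-\frac{\lambda\,\varepsilon}{\sigma}\right),
\qquad \sigma^2=\sigma_u^2+\sigma_v^2,\quad \lambda=\frac{\sigma_u}{\sigma_v},
```

where $u\sim|\mathcal{N}(0,\sigma_u^2)|$ is the inefficiency term and $v\sim\mathcal{N}(0,\sigma_v^2)$ is noise.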

3.
This paper evaluates the performances of prediction intervals generated from alternative time series models, in the context of tourism forecasting. The forecasting methods considered include the autoregressive (AR) model, the AR model using the bias-corrected bootstrap, seasonal ARIMA models, innovations state space models for exponential smoothing, and Harvey’s structural time series models. We use thirteen monthly time series for the number of tourist arrivals to Hong Kong and Australia. The mean coverage rates and widths of the alternative prediction intervals are evaluated in an empirical setting. It is found that all models produce satisfactory prediction intervals, except for the autoregressive model. In particular, those based on the bias-corrected bootstrap perform best in general, providing tight intervals with accurate coverage rates, especially when the forecast horizon is long.
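A minimal Python sketch of residual-bootstrap prediction intervals for an autoregression is given below. It uses a plain AR(1) and omits the bias correction of the coefficients that the paper applies; function and variable names are illustrative.

```python
import numpy as np

def ar1_bootstrap_pi(y, horizon=12, n_boot=2000, alpha=0.05, seed=0):
    """Residual-bootstrap prediction intervals for an OLS-fitted AR(1) (sketch)."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    beta = np.linalg.lstsq(X, y[1:], rcond=None)[0]   # (intercept, phi)
    resid = y[1:] - X @ beta
    resid = resid - resid.mean()                      # centre the residuals

    paths = np.empty((n_boot, horizon))
    for b in range(n_boot):
        last = y[-1]
        for h in range(horizon):
            last = beta[0] + beta[1] * last + rng.choice(resid)  # resample shocks
            paths[b, h] = last
    lower = np.percentile(paths, 100 * alpha / 2, axis=0)
    upper = np.percentile(paths, 100 * (1 - alpha / 2), axis=0)
    return lower, upper
```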

4.
The Wold Theorem plays a fundamental role in the decomposition of weakly stationary time series. It provides a moving average representation of the process under consideration in terms of uncorrelated innovations, whatever the nature of the process is. From an empirical point of view, this result makes it possible to identify orthogonal shocks, for instance in macroeconomic and financial time series. More theoretically, the decomposition of weakly stationary stochastic processes can be seen as a special case of the Abstract Wold Theorem, which allows Hilbert spaces to be decomposed using isometric operators. In this work we explain this link in detail, employing the Hilbert space spanned by a weakly stationary time series and the lag operator as isometry. In particular, we characterize the innovation subspace by exploiting the adjoint operator. We also show that the isometry of the lag operator is equivalent to weak stationarity. Our methodology, fully based on operator theory, provides novel tools for discovering new Wold-type decompositions of stochastic processes, in which the involved isometry is no longer the lag operator. In such decompositions the orthogonality of innovations is ensured by construction since they are derived from the Abstract Wold Theorem.
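The decomposition in question, for a weakly stationary process $X_t$, reads:

```latex
X_t = \sum_{j=0}^{\infty} \psi_j\,\varepsilon_{t-j} + V_t,
\qquad \psi_0 = 1,\quad \sum_{j=0}^{\infty}\psi_j^2 < \infty,
```

where $\{\varepsilon_t\}$ is the uncorrelated (white-noise) innovation sequence and $V_t$ is the linearly deterministic component; in the Hilbert-space reading, the lag operator acts as an isometry and the innovations span the orthogonal complement of its image, the innovation subspace characterized via the adjoint.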

5.
华颖 《价值工程》2013,32(1):288-289
The grey prediction model is an important part of grey system theory and a widely used forecasting method. To improve prediction accuracy, the original data sequence needs to be pre-processed to increase its smoothness. This paper studies applying function transformations to the original data of the grey GM(1,1) prediction model and compares the prediction accuracy obtained after the function transformation.
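A minimal sketch of the baseline GM(1,1) recursion the abstract builds on is given below; the function transform studied in the paper would be applied to the raw sequence before fitting, and names are illustrative.

```python
import numpy as np

def gm11_forecast(x0, steps=5):
    """Basic GM(1,1) grey forecast (sketch; x0 is the original positive sequence)."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                 # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # develop coefficient a, grey input b

    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
    x0_hat = np.r_[x1_hat[0], np.diff(x1_hat)]         # inverse AGO
    return x0_hat[n:]                                  # out-of-sample forecasts
```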

6.
Beveridge and Nelson [Beveridge, Stephen, Nelson, Charles R., 1981. A new approach to decomposition of economic time series into permanent and transitory components with particular attention to measurement of the ‘business cycle’. Journal of Monetary Economics 7, 151–174] proposed that the long-run forecast is a measure of trend for time series such as GDP that do not follow a deterministic path in the long run. They showed that if the series is stationary in first differences, then the estimated trend is a random walk with drift that accounts for growth, and the cycle is stationary. In contrast to linear de-trending, the smoother of Hodrick and Prescott (1981) [Hodrick, Robert, Prescott, Edward C., 1997. Post-war US business cycles: An empirical investigation. Journal of Money Credit and Banking 29 (1), 1–16] and the unobserved components models of Harvey [Harvey, A.C., 1985. Trends and cycles in macroeconomic time series. Journal of Business and Economic Statistics 3, 216–227], Watson [Watson, Mark W., 1986. Univariate detrending methods with stochastic trends. Journal of Monetary Economics 18, 49–75] and Clark [Clark, Peter K., 1987. The cyclical component of US economic activity. The Quarterly Journal of Economics 102 (4), 797–814], the BN decomposition attributes most variation in GDP to trend shocks, while the cycles are short-lived. Since each is an estimate of the transitory part of GDP that will die out, it seems natural to compare cycle measures by their ability to forecast future growth. The results presented here suggest that cycle measures contain little if any information beyond the short-term momentum captured by BN.
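As a concrete illustration (the simplest AR(1)-in-differences case, not the specification used in the paper), the BN trend and cycle are:

```latex
\Delta y_t = \mu + \phi\,(\Delta y_{t-1} - \mu) + \varepsilon_t,\ |\phi|<1
\;\Longrightarrow\;
\tau_t = y_t + \frac{\phi}{1-\phi}\,(\Delta y_t - \mu),
\qquad
c_t = y_t - \tau_t = -\frac{\phi}{1-\phi}\,(\Delta y_t - \mu),
```

so the trend is a random walk with drift $\mu$ and the cycle is a stationary, short-lived function of current growth.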

7.
In a structural vector-error correction (VEC) model, it is possible to decompose the shocks into those with permanent and transitory effects on the levels of the variables. Pagan and Pesaran derive the restrictions which the permanent–transitory decomposition of the shocks imposes on the structural VEC model. This paper shows that these restrictions are equivalent to a set of restrictions that are applied in the methods of Gonzalo and Ng and King et al. (KPSW). Using this result, it is shown that the Pagan and Pesaran method recovers structural shocks with permanent effects identical to those from the Gonzalo and Ng and KPSW methods. The former case is illustrated in the context of Lettau and Ludvigson's consumption model, and the latter in KPSW's six-variable model. Two other methods for which the Pagan and Pesaran approach delivers identical permanent shocks are also discussed.

8.
Forecasting economic time series with unconditional time-varying variance
The classical forecasting theory of stationary time series exploits the second-order structure (variance, autocovariance, and spectral density) of an observed process in order to construct prediction intervals. However, some economic time series show a time-varying unconditional second-order structure. This article focuses on a simple and meaningful model allowing this nonstationary behaviour. We show that this model satisfactorily explains the nonstationary behaviour of several economic data sets, including U.S. stock returns and exchange rates. The question of how to forecast these processes is addressed and evaluated on the data sets.
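One simple specification consistent with this description (an illustrative form only, not necessarily the article's exact model) is

```latex
X_t = \sigma\!\left(\tfrac{t}{T}\right)\varepsilon_t,\qquad t=1,\dots,T,
```

where $\sigma(\cdot)$ is a smooth deterministic function on $[0,1]$ and $\{\varepsilon_t\}$ is a stationary, unit-variance process, so the unconditional variance $\operatorname{Var}(X_t)=\sigma^2(t/T)$ drifts over time even though the conditional dynamics are stable.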

9.
Time series analysts have long been concerned with distinguishing stationary "generating processes" from processes for which differencing is required to induce stationarity. In practical applications, this issue is addressed almost invariably through formal hypothesis testing. In this paper, we explore some aspects of the Bayesian approach to the problem, leading to the calculation of posterior odds ratios. Interesting features arise in the simplest possible variant of the problem, where a choice has to be made between a random walk and a stationary first-order autoregressive model. We discuss in detail the analysis of this case, and also indicate how our approach extends to the more general comparison of an ARIMA model with a stationary competitor.
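In the random walk versus stationary AR(1) comparison, the posterior odds ratio takes the generic Bayesian form (notation illustrative):

```latex
\frac{P(M_{RW}\mid y)}{P(M_{AR}\mid y)}
= \frac{P(M_{RW})}{P(M_{AR})}\times
\frac{\int f(y\mid\theta,M_{RW})\,\pi(\theta\mid M_{RW})\,d\theta}
{\int f(y\mid\theta,M_{AR})\,\pi(\theta\mid M_{AR})\,d\theta},
```

with $M_{RW}\!: y_t = y_{t-1}+\varepsilon_t$ and $M_{AR}\!: y_t=\mu+\phi\,y_{t-1}+\varepsilon_t,\ |\phi|<1$; the second factor is the Bayes factor built from the marginal likelihoods.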

10.
"This paper presents a stochastic version of the demographic cohort-component method of forecasting future population. In this model the sizes of future age-sex groups are non-linear functions of random future vital rates. An approximation to their joint distribution can be obtained using linear approximations or simulation. A stochastic formulation points to the need for new empirical work on both the autocorrelations and the cross-correlations of the vital rates. Problems of forecasting declining mortality and fluctuating fertility are contrasted. A volatility measure for fertility is presented. The model can be used to calculate approximate prediction intervals for births using data from deterministic cohort-component forecasts. The paper compares the use of expert opinion in mortality forecasting with simple extrapolation techniques to see how useful each approach has been in the past. Data from the United States suggest that expert opinion may have caused systematic bias in the forecasts."  相似文献   

11.
Computational intelligence approaches to multiple-step-ahead forecasting rely on either iterated one-step-ahead predictors or direct predictors. In both cases the predictions are obtained by means of multi-input single-output modeling techniques. This paper discusses the limitations of single-output approaches when the predictor is expected to return a long series of future values, and presents a multi-output approach to long-term prediction. The motivation for this work is that, when predicting multiple steps ahead, the forecasted sequence should preserve the stochastic properties of the training series. However, this may not be the case, for instance in direct approaches where predictions for different horizons are produced independently. We discuss here a multi-output extension of conventional local modeling approaches, and present and compare three distinct criteria for performing conditionally dependent model selection. In order to assess the effectiveness of the different selection strategies, we carry out an extensive experimental session based on the 111 series in the NN5 competition.
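A minimal sketch of the multi-output idea with a local (nearest-neighbour) model is shown below: the whole forecast window is predicted jointly, so the returned sequence inherits the dependence structure of the training windows. It omits the paper's conditionally dependent model-selection criteria, and names are illustrative.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def embed(y, lags, horizon):
    """Build (lagged-input, multi-step-output) training pairs from a series."""
    X, Y = [], []
    for t in range(lags, len(y) - horizon + 1):
        X.append(y[t - lags:t])
        Y.append(y[t:t + horizon])
    return np.array(X), np.array(Y)

def multioutput_knn_forecast(y, lags=7, horizon=14, k=5):
    """Forecast the next `horizon` values jointly with a k-NN local model (sketch)."""
    y = np.asarray(y, dtype=float)
    X, Y = embed(y, lags, horizon)
    model = KNeighborsRegressor(n_neighbors=k).fit(X, Y)   # multi-output regression
    return model.predict(y[-lags:].reshape(1, -1))[0]
```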

12.
This paper considers joint estimation of long run equilibrium coefficients and parameters governing the short run dynamics of a fully parametric Gaussian cointegrated system formulated in continuous time. The model allows the stationary disturbances to be generated by a stochastic differential equation system and for the variables to be a mixture of stocks and flows. We derive a precise form for the exact discrete analogue of the continuous time model in triangular error correction form, which acts as the basis for frequency domain estimation of the unknown parameters using discrete time data. We formally establish the order of consistency and the asymptotic sampling properties of such an estimator. The estimator of the cointegrating parameters is shown to converge at the rate of the sample size to a mixed normal distribution, while that of the short run parameters converges at the rate of the square root of the sample size to a limiting normal distribution.

13.
This article has three objectives: (a) to describe the method of automatic ARIMA modeling (AAM), with and without intervention analysis, that has been used in the analysis; (b) to comment on the results; and (c) to comment on the M3 Competition in general. Starting with a computer program for fitting an ARIMA model and a methodology for building univariate ARIMA models, an expert system has been built that tries to avoid the pitfalls of most existing software packages. A software package called Time Series Expert (TSE-AX) is used to build a univariate ARIMA model with or without an intervention analysis. The characteristics of TSE-AX are summarized, and more especially its automatic ARIMA modeling method. The motivation for taking part in the M3 Competition is also outlined. The methodology is described mainly in three technical appendices: (Appendix A) choice of differences and of a transformation, and use of intervention analysis; (Appendix B) available specification procedures; (Appendix C) adequacy, model checking and new specification. The problems raised by outliers are discussed, in particular how close they are to the forecast origin. Several series that are difficult to deal with from that point of view are mentioned, and one of them is shown. In the last section, we comment on contextual information, the idea of an e-M Competition, prediction intervals and the possible use of other forecasting methods within Time Series Expert.
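The sketch below is not the TSE-AX expert system; it only illustrates the generic idea of automatic ARIMA order selection via an information-criterion search, using statsmodels.

```python
import itertools
import warnings
from statsmodels.tsa.arima.model import ARIMA

def auto_arima_aic(y, max_p=3, max_d=2, max_q=3):
    """Naive automatic ARIMA order selection by AIC grid search (sketch).

    Note: comparing AIC across different d is only a rough heuristic,
    since differencing changes the effective sample."""
    best_order, best_aic = None, float("inf")
    for p, d, q in itertools.product(range(max_p + 1),
                                     range(max_d + 1),
                                     range(max_q + 1)):
        try:
            with warnings.catch_warnings():
                warnings.simplefilter("ignore")
                res = ARIMA(y, order=(p, d, q)).fit()
        except Exception:
            continue  # skip orders that fail to converge
        if res.aic < best_aic:
            best_order, best_aic = (p, d, q), res.aic
    return best_order, best_aic
```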

14.
In most empirical studies, once the best model has been selected according to a certain criterion, subsequent analysis is conducted conditionally on the chosen model. In other words, the uncertainty of model selection is ignored once the best model has been chosen. However, the true data-generating process is in general unknown and may not be consistent with the chosen model. In the analysis of productivity and technical efficiencies in the stochastic frontier setting, if the estimated parameters or the predicted efficiencies differ across competing models, then it is risky to base the prediction on the selected model. Buckland et al. (Biometrics 53:603–618, 1997) have shown that if model selection uncertainty is ignored, the precision of the estimate is likely to be overestimated, the estimated confidence intervals of the parameters are often below the nominal level, and consequently, the prediction may be less accurate than expected. In this paper, we suggest using the model-averaged estimator based on multimodel inference to estimate stochastic frontier models. The potential advantages of the proposed approach are twofold: incorporating the model selection uncertainty into statistical inference, and reducing the model selection bias and variance of the frontier and technical efficiency estimators. The approach is demonstrated empirically via the estimation of an Indian farm data set.
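The model-averaged estimator can be written with information-criterion weights in the spirit of Buckland et al.; AIC weights are shown here as one common choice:

```latex
\hat{\theta}_{\mathrm{avg}} = \sum_{i=1}^{M} w_i\,\hat{\theta}_i,
\qquad
w_i = \frac{\exp\!\left(-\tfrac{1}{2}\,\mathrm{AIC}_i\right)}
{\sum_{j=1}^{M}\exp\!\left(-\tfrac{1}{2}\,\mathrm{AIC}_j\right)},
```

so every candidate frontier model contributes in proportion to its information-criterion support instead of being discarded after selection.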

15.
In two recent articles, Sims (1988) and Sims and Uhlig (1988/1991) question the value of much of the ongoing literature on unit roots and stochastic trends. They characterize the seeds of this literature as ‘sterile ideas’, the application of nonstationary limit theory as ‘wrongheaded and unenlightening’, and the use of classical methods of inference as ‘unreasonable’ and ‘logically unsound’. They advocate in place of classical methods an explicit Bayesian approach to inference that utilizes a flat prior on the autoregressive coefficient. DeJong and Whiteman adopt a related Bayesian approach in a group of papers (1989a,b,c) that seek to re-evaluate the empirical evidence from historical economic time series. Their results appear to be conclusive in turning around the earlier, influential conclusions of Nelson and Plosser (1982) that most aggregate economic time series have stochastic trends. So far these criticisms of unit root econometrics have gone unanswered; the assertions about the impropriety of classical methods and the superiority of flat prior Bayesian methods have been unchallenged; and the empirical re-evaluation of evidence in support of stochastic trends has been left without comment. This paper breaks that silence and offers a new perspective. We challenge the methods, the assertions, and the conclusions of these articles on the Bayesian analysis of unit roots. Our approach is also Bayesian but we employ what are known in the statistical literature as objective ignorance priors in our analysis. These are developed in the paper to accommodate explicitly time series models in which no stationarity assumption is made. Ignorance priors are intended to represent a state of ignorance about the value of a parameter and in many models are very different from flat priors. We demonstrate that in time series models flat priors do not represent ignorance but are actually informative (sic) precisely because they neglect generically available information about how autoregressive coefficients influence observed time series characteristics. Contrary to their apparent intent, flat priors unwittingly bias inferences towards stationary and i.i.d. alternatives where they do represent ignorance, as in the linear regression model. This bias helps to explain the outcome of the simulation experiments in Sims and Uhlig and some of the empirical results of DeJong and Whiteman. Under both flat priors and ignorance priors this paper derives posterior distributions for the parameters in autoregressive models with a deterministic trend and an arbitrary number of lags. Marginal posterior distributions are obtained by using the Laplace approximation for multivariate integrals along the lines suggested by the author (Phillips, 1983) in some earlier work. The bias towards stationary models that arises from the use of flat priors is shown in our simulations to be substantial; and we conclude that it is unacceptably large in models with a fitted deterministic trend, for which the expected posterior probability of a stochastic trend is found to be negligible even though the true data generating mechanism has a unit root. Under ignorance priors, Bayesian inference is shown to accord more closely with the results of classical methods. An interesting outcome of our simulations and our empirical work is the bimodal Bayesian posterior, which demonstrates that Bayesian confidence sets can be disjoint, just like classical confidence intervals that are based on asymptotic theory. 
The paper concludes with an empirical application of our Bayesian methodology to the Nelson-Plosser series. Seven of the 14 series show evidence of stochastic trends under ignorance priors, whereas under flat priors on the coefficients all but three of the series appear trend stationary. The latter result corresponds closely with the conclusion reached by DeJong and Whiteman (1989b) (based on truncated flat priors). We argue that the DeJong-Whiteman inferences are biased towards trend stationarity through the use of flat priors on the autoregressive coefficients, and that their inferences for some of the series (especially stock prices) are fragile (i.e. not robust) not only to the prior but also to the lag length chosen in the time series specification.

16.
In this paper I present a general method for constructing confidence intervals for predictions from the generalized linear model in sociological research. I demonstrate that the method used for constructing confidence intervals for predictions in classical linear models is indeed a special case of the method for generalized linear models. I examine four such models – the binary logit, the binary probit, the ordinal logit, and the Poisson regression model – to construct confidence intervals for predicted values in the form of probability, odds, Z score, or event count. The estimated confidence interval for an event prediction, when applied judiciously, can give the researcher useful information and an estimated measure of precision for the prediction so that interpretation of estimates from the generalized linear model becomes easier.
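A minimal sketch of the recipe for the binary-logit case: build the interval on the linear-predictor scale and map it through the inverse link. It uses statsmodels as an assumed tool; variable and function names are illustrative.

```python
import numpy as np
import statsmodels.api as sm
from scipy.special import expit
from scipy.stats import norm

def logit_prediction_ci(X, y, x_new, alpha=0.05):
    """Confidence interval for a predicted probability from a binary logit (sketch)."""
    X = sm.add_constant(np.asarray(X, dtype=float))           # add intercept column
    res = sm.GLM(np.asarray(y), X, family=sm.families.Binomial()).fit()

    x_new = np.r_[1.0, np.asarray(x_new, dtype=float)]        # prepend intercept
    eta = float(x_new @ res.params)                           # linear predictor
    se = float(np.sqrt(x_new @ res.cov_params() @ x_new))     # delta-method std. error
    z = norm.ppf(1 - alpha / 2)
    lo, hi = eta - z * se, eta + z * se
    return expit(eta), (expit(lo), expit(hi))                 # back on the probability scale
```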

17.
We propose a beta spatial linear mixed model with variable dispersion using Monte Carlo maximum likelihood. The proposed method is useful for those situations where the response variable is a rate or a proportion. An approach to spatial generalized linear mixed models using the Box–Cox transformation in the precision model is presented. Thus, the parameter optimization process is developed for both the spatial mean model and the spatial variable dispersion model. All the parameters are estimated using Markov chain Monte Carlo maximum likelihood. Statistical inference on the parameters is performed using approximations obtained from the asymptotic normality of the maximum likelihood estimator. Diagnostics and prediction of a new observation are also developed. The method is illustrated with the analysis of one simulated case and two studies: clay and magnesium contents. In the clay study, 147 soil profile observations were taken from the research area of the Tropenbos Cameroon Programme, with explanatory variables: elevation in metres above sea level, agro-ecological zone, reference soil group and land cover type. In the magnesium study, soil samples were taken from the 0- to 20-cm-depth layer at each of the 178 locations, and the response variable is related to the spatial locations, altitude and sub-region.

18.
Forecasting researchers, with few exceptions, have ignored the current major forecasting controversy: global warming and the role of climate modelling in resolving this challenging topic. In this paper, we take a forecaster’s perspective in reviewing established principles for validating the atmospheric-ocean general circulation models (AOGCMs) used in most climate forecasting, and in particular by the Intergovernmental Panel on Climate Change (IPCC). Such models should reproduce the behaviours characterising key model outputs, such as global and regional temperature changes. We develop various time series models and compare them with forecasts based on one well-established AOGCM from the UK Hadley Centre. Time series models perform strongly, and structural deficiencies in the AOGCM forecasts are identified using encompassing tests. Regional forecasts from various GCMs had even more deficiencies. We conclude that combining standard time series methods with the structure of AOGCMs may result in higher forecasting accuracy. The methodology described here has implications for improving AOGCMs and for the effectiveness of environmental control policies which are focussed on carbon dioxide emissions alone. Critically, forecast accuracy in decadal prediction has important consequences for environmental planning, so its improvement through this multiple modelling approach should be a priority.

19.
This paper shows that the properties of nonlinear transformations of a fractionally integrated process strongly depend on whether the initial series is stationary or not. Transforming a stationary Gaussian I(d) process with d>0 leads to a long-memory process with the same or a smaller long-memory parameter, depending on the Hermite rank of the transformation. Any nonlinear transformation of an antipersistent Gaussian I(d) process is I(0). For non-stationary I(d) processes, every polynomial transformation is non-stationary and exhibits a stochastic trend in mean and in variance. In particular, the square of a non-stationary Gaussian I(d) process still has long memory with parameter d, whereas the square of a stationary Gaussian I(d) process shows less dependence than the initial process. Simulation results for other transformations are also discussed.
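For orientation, the dependence-reduction effect in the stationary case is typically quantified through the Hermite rank m of the transformation; a result of this form (stated here as a reference point from the related literature, not a quotation from the paper) is

```latex
d_G \;=\; \max\!\Big\{\big(d-\tfrac{1}{2}\big)\,m + \tfrac{1}{2},\; 0\Big\},
\qquad 0 < d < \tfrac{1}{2},
```

so a transformation with Hermite rank $m=1$ preserves the memory parameter, while $m\ge 2$ reduces it, consistent with the abstract's statement.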

20.
Sir Francis Galton introduced median regression and the use of the quantile function to describe distributions. Very early on, the tradition moved to mean regression and the universal use of the Normal distribution, either as the natural ‘error’ distribution or as one forced by transformation. Though the introduction of ‘quantile regression’ refocused attention on the shape of the variability about the line, it uses nonparametric approaches and so ignores the actual distribution of the ‘error’ term. This paper seeks to show how Galton's approach enables the complete regression model, deterministic and stochastic elements, to be modelled, fitted and investigated. The emphasis is on the range of models that can be used for the stochastic element. It is noted that just as the deterministic terms can be built up from components, so too, using quantile functions, can the stochastic element. The model may thus be treated in both modelling and fitting as a unity. Some evidence is presented to justify the use of a much wider range of distributional models than is usually considered and to emphasize their flexibility in extending regression models.
