Similar Documents
20 similar documents found (search time: 31 ms)
1.
Sequential Data Assimilation Techniques in Oceanography   (cited 8 times: 0 self-citations, 8 by others)
We review recent developments of sequential data assimilation techniques used in oceanography to integrate spatio-temporal observations into numerical models describing physical and ecological dynamics. Theoretical aspects from the simple case of linear dynamics to the general case of nonlinear dynamics are described from a geostatistical point of view. Current methods derived from the Kalman filter are presented from the least complex to the most general, and perspectives for nonlinear estimation by sequential importance resampling filters are discussed. Furthermore, an extension of the ensemble Kalman filter to transformed Gaussian variables is presented and illustrated using a simplified ecological model. The described methods are designed for predicting over geographical regions at high spatial resolution under the practical constraint of keeping computing time low enough to obtain the prediction before the fact. Therefore, the paper focuses on widely used and computationally efficient methods.
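
The ensemble Kalman filter extension mentioned above can be sketched compactly. Below is a generic, illustrative analysis step with perturbed observations; the toy two-state setup, ensemble size, and all variable names are my own assumptions, not the paper's model:

```python
import numpy as np

def enkf_analysis(ensemble, y, H, R, rng):
    """One ensemble Kalman filter analysis step with perturbed observations.

    ensemble : (n_members, n_state) forecast ensemble
    y        : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    R        : (n_obs, n_obs) observation error covariance
    """
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)              # ensemble anomalies
    Pf = X.T @ X / (n - 1)                            # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)    # Kalman gain
    # Each member assimilates its own perturbed copy of the observation,
    # which keeps the analysis spread statistically consistent.
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n)
    return ensemble + (y_pert - ensemble @ H.T) @ K.T

rng = np.random.default_rng(0)
ens = rng.normal(size=(100, 2))                       # toy 2-state, 100-member ensemble
H = np.array([[1.0, 0.0]])                            # observe the first component only
updated = enkf_analysis(ens, np.array([0.5]), H, np.array([[0.1]]), rng)
```

The update shrinks the ensemble spread in the observed direction while leaving the unobserved component adjusted only through its sample correlation with the observed one.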

2.
This paper compares two methods for undertaking likelihood‐based inference in dynamic equilibrium economies: a sequential Monte Carlo filter and the Kalman filter. The sequential Monte Carlo filter exploits the nonlinear structure of the economy and evaluates the likelihood function of the model by simulation methods. The Kalman filter estimates a linearization of the economy around the steady state. We report two main results. First, both for simulated and for real data, the sequential Monte Carlo filter delivers a substantially better fit of the model to the data as measured by the marginal likelihood. This is true even for a nearly linear case. Second, the differences in terms of point estimates, although relatively small in absolute values, have important effects on the moments of the model. We conclude that the nonlinear filter is a superior procedure for taking models to the data.
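
The Kalman filter side of such comparisons rests on evaluating the likelihood through the prediction-error decomposition. A minimal scalar sketch, where the AR(1)-plus-noise model and all parameter values are illustrative assumptions rather than the paper's economy:

```python
import numpy as np

def kalman_loglik(y, phi, q, r):
    """Log-likelihood of the scalar state-space model
    x_t = phi*x_{t-1} + w_t (var q),  y_t = x_t + v_t (var r),
    computed via the Kalman filter's prediction-error decomposition."""
    x, P, ll = 0.0, q / (1 - phi**2), 0.0   # stationary prior
    for yt in y:
        x_pred, P_pred = phi * x, phi**2 * P + q
        S = P_pred + r                      # innovation variance
        v = yt - x_pred                     # innovation (prediction error)
        ll += -0.5 * (np.log(2 * np.pi * S) + v**2 / S)
        K = P_pred / S                      # Kalman gain
        x, P = x_pred + K * v, (1 - K) * P_pred
    return ll

rng = np.random.default_rng(1)
x, ys = 0.0, []
for _ in range(200):                        # simulate with true phi = 0.8
    x = 0.8 * x + rng.normal(scale=1.0)
    ys.append(x + rng.normal(scale=0.5))
lls = {phi: kalman_loglik(ys, phi, 1.0, 0.25) for phi in (0.2, 0.8)}
```

As expected, the likelihood evaluated at the true persistence parameter dominates the misspecified one.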

3.
Application of the Particle Filter (PF) Algorithm in a Driver-Fatigue Detection System   (cited 1 time: 0 self-citations, 1 by others)
Many algorithms are currently available for localization and tracking, such as the Kalman filter, the extended Kalman filter, and the particle filter. Since real-world systems are essentially nonlinear, this paper studies the basic principles and concrete applications of the particle filter (PF), an algorithm designed specifically for tracking in nonlinear, non-Gaussian systems.
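
A bootstrap (sampling-importance-resampling) particle filter can be sketched in a few lines. The benchmark univariate growth model below is a standard textbook test case, not the paper's fatigue-detection system, and the particle count and noise variances are my assumptions:

```python
import numpy as np

def bootstrap_pf(ys, n_particles, rng):
    """Bootstrap (SIR) particle filter for the classic nonlinear test model
    x_t = 0.5*x_{t-1} + 25*x_{t-1}/(1+x_{t-1}^2) + w_t,  y_t = x_t^2/20 + v_t."""
    x = rng.normal(scale=2.0, size=n_particles)
    estimates = []
    for yt in ys:
        # Propagate each particle through the nonlinear transition
        x = 0.5 * x + 25 * x / (1 + x**2) + rng.normal(scale=np.sqrt(10), size=n_particles)
        # Weight by the observation likelihood (log-space for numerical safety)
        logw = -0.5 * (yt - x**2 / 20) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        estimates.append(np.sum(w * x))     # weighted posterior-mean estimate
        x = rng.choice(x, size=n_particles, p=w)  # multinomial resampling
    return estimates

rng = np.random.default_rng(2)
xt, ys = 0.1, []
for _ in range(50):                          # simulate the same model
    xt = 0.5 * xt + 25 * xt / (1 + xt**2) + rng.normal(scale=np.sqrt(10))
    ys.append(xt**2 / 20 + rng.normal())
est = bootstrap_pf(ys, 500, rng)
```

Note that because the observation depends on x² the posterior is bimodal in the sign of the state, which is exactly the kind of non-Gaussian feature a Kalman-type filter cannot represent.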

4.
This paper considers the problem of forecasting realized variance measures. These measures are highly persistent estimates of the underlying integrated variance, but are also noisy. Bollerslev, Patton and Quaedvlieg (2016, Journal of Econometrics 192(1), 1–18) exploited this to extend the commonly used heterogeneous autoregressive (HAR) model by letting the model parameters vary over time depending on the estimated measurement error variances. We propose an alternative specification that allows the autoregressive parameters of HAR models to be driven by a latent Gaussian autoregressive process that may also depend on the estimated measurement error variance. The model parameters are estimated by maximum likelihood using the Kalman filter. Our empirical analysis considers the realized variances of 40 stocks from the S&P 500. Our model based on log variances shows the best overall performance and generates superior forecasts both in terms of a range of different loss functions and for various subsamples of the forecasting period.
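
The baseline HAR regression that such papers extend is simple to write down: today's realized variance is regressed on yesterday's value and its 5-day and 22-day averages. A hedged sketch on simulated data (the persistent log-variance simulation is mine, not the paper's S&P 500 sample):

```python
import numpy as np

def har_design(rv):
    """HAR regressors: daily lag plus 5-day and 22-day averages of realized
    variance, aligned so that row t predicts rv[t]."""
    rv = np.asarray(rv)
    d = rv[21:-1]                                            # daily lag rv[t-1]
    w = np.array([rv[t - 5:t].mean() for t in range(22, len(rv))])   # weekly
    m = np.array([rv[t - 22:t].mean() for t in range(22, len(rv))])  # monthly
    return np.column_stack([np.ones_like(d), d, w, m]), rv[22:]

rng = np.random.default_rng(3)
lrv = np.zeros(600)
for t in range(1, 600):                      # persistent log-variance process
    lrv[t] = 0.97 * lrv[t - 1] + rng.normal(scale=0.3)
rv = np.exp(lrv)                             # positive "realized variance" series

X, y = har_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
```

The paper's contribution is precisely to let the coefficients in `beta` evolve over time via a latent Gaussian process estimated with the Kalman filter, rather than being constant as in this static OLS fit.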

5.
Liu Lili, Value Engineering (《价值工程》), 2010, 29(34): 190–191
Filtering for nonlinear stochastic dynamic systems is a frequently encountered practical problem. This paper analyzes the basic principles, characteristics, and applicability conditions of three nonlinear filtering algorithms: the extended Kalman filter (EKF), the unscented Kalman filter (UKF), and the particle filter (PF). The performance of each algorithm is verified through a simulation of a strongly nonlinear system.
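
The core difference between the EKF and the UKF is how a Gaussian is pushed through a nonlinearity: the EKF linearizes at the mean, while the UKF propagates deterministic sigma points. A simplified scalar unscented transform (without the usual β covariance correction; parameter values are illustrative):

```python
import numpy as np

def unscented_transform(mean, var, f, alpha=1.0, kappa=2.0):
    """Scalar unscented transform: propagate N(mean, var) through f
    using 3 sigma points (the building block of the UKF)."""
    lam = alpha**2 * (1 + kappa) - 1
    s = np.sqrt((1 + lam) * var)
    pts = np.array([mean, mean + s, mean - s])       # sigma points
    wm = np.array([lam / (1 + lam), 0.5 / (1 + lam), 0.5 / (1 + lam)])
    y = f(pts)
    y_mean = wm @ y
    y_var = wm @ (y - y_mean) ** 2
    return y_mean, y_var

# For f(x) = x^2 with x ~ N(0, 1): the true mean is E[x^2] = 1 and the true
# variance is 2, while an EKF linearization at the mean would give 0 for both.
m, v = unscented_transform(0.0, 1.0, lambda x: x**2)
```

For this quadratic the sigma points recover the mean and variance exactly, which illustrates why the UKF typically outperforms the EKF on strongly nonlinear systems without requiring any Jacobians.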

6.
Notions of cause and effect are fundamental to economic explanation. Although concepts such as price effects are intuitive, rigorous foundations justifying causal discourse across the wide range of economic settings remain lacking. We illustrate this deficiency using an N-bidder private-value auction, posing causal questions that cannot be addressed within existing frameworks. We extend the frameworks of Pearl (2000) and White and Chalak (2009) to introduce topological settable systems (TSS), a causal framework capable of delivering the missing answers. In particular, TSS accommodate choices belonging to general function spaces. Our analysis suggests how TSS enable causal discourse in various areas of economics.

7.
Recently, there has been considerable work on stochastic time-varying coefficient models as vehicles for modelling structural change in the macroeconomy with a focus on the estimation of the unobserved paths of random coefficient processes. The dominant estimation methods, in this context, are based on various filters, such as the Kalman filter, that are applicable when the models are cast in state space representations. This paper introduces a new class of autoregressive bounded processes that decompose a time series into a persistent random attractor, a time-varying autoregressive component, and martingale difference errors. The paper examines, rigorously, alternative kernel-based, nonparametric estimation approaches for such models and derives their basic properties. These estimators have long been studied in the context of deterministic structural change, but their use in the presence of stochastic time variation is novel. The proposed inference methods have desirable properties such as consistency and asymptotic normality and allow a tractable studentization. In extensive Monte Carlo and empirical studies, we find that the methods exhibit very good small sample properties and can shed light on important empirical issues such as the evolution of inflation persistence and the purchasing power parity (PPP) hypothesis.
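
The flavor of kernel-based estimation of a time-varying autoregressive coefficient can be conveyed with a locally weighted least-squares sketch. This is a generic stand-in, not the authors' exact estimator; the Gaussian kernel, bandwidth, and the mid-sample break design are all my assumptions:

```python
import numpy as np

def kernel_tvp_ar1(y, h=0.1):
    """Kernel estimate of a time-varying AR(1) coefficient: at each
    rescaled date t/T, regress y_t on y_{t-1} with Gaussian kernel
    weights in rescaled time."""
    y = np.asarray(y)
    T = len(y) - 1
    u = np.arange(1, T + 1) / T              # rescaled dates
    x, z = y[:-1], y[1:]
    phi_hat = np.empty(T)
    for i in range(T):
        w = np.exp(-0.5 * ((u - u[i]) / h) ** 2)
        phi_hat[i] = (w * x * z).sum() / (w * x * x).sum()
    return phi_hat

rng = np.random.default_rng(5)
y = np.zeros(2001)
for t in range(1, 2001):                     # AR coefficient breaks at mid-sample
    phi = 0.2 if t <= 1000 else 0.7
    y[t] = phi * y[t - 1] + rng.normal()
phi_hat = kernel_tvp_ar1(y)
```

Away from the break, the local estimates recover the two regimes; near the break the kernel averages across regimes, which is the smoothing bias the paper's asymptotic analysis has to handle.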

8.
Seasonality is one of the most important features of economic time series. The possibility of abstracting from seasonality for the assessment of economic conditions is a widely debated issue. In this paper we propose a strategy for assessing the role of seasonal adjustment (SA) on business cycle measurement. In particular, we provide a method for quantifying the contribution of SA to the unreliability of cycles extracted by popular filters, such as Baxter–King and Hodrick–Prescott (HP). The main conclusion is that the contribution is larger around the turning points of the series and at the extremes of the sample period; moreover, it is much more sizeable for highpass filters, like the HP filter, which retain to a great extent the high‐frequency fluctuations in a time series, these being the fluctuations most affected by SA. If a bandpass component is considered, the effect is smaller. Finally, we discuss the role of forecast extensions and the prediction of the cycle. For the time series of industrial production considered in the illustration, it is not possible to provide a reliable estimate of the cycle at the end of the sample.

9.
A simulation-based non-linear filter is developed for prediction and smoothing in non-linear and/or non-normal structural time-series models. Recursive algorithms of weighting functions are derived by applying Monte Carlo integration. Through Monte Carlo experiments, it is shown that (1) for a small number of random draws (or nodes) our simulation-based density estimator using Monte Carlo integration (SDE) performs better than Kitagawa's numerical integration procedure (KNI), and (2) SDE and KNI give less biased parameter estimates than the extended Kalman filter (EKF). Finally, an estimation of per capita final consumption data is taken as an application to the non-linear filtering problem.

10.
Filters used to estimate unobserved components in time series are often designed on a priori grounds, so as to capture the frequencies associated with the component. A limitation of these filters is that they may yield spurious results. The danger can be avoided if the so-called ARIMA-model-based (AMB) procedure is used to derive the filter. However, parsimony of ARIMA models typically implies little resolution in terms of the detection of hidden components. It would be desirable to combine a higher resolution with consistency of the structure of the observed series. We show first that for a large class of a priori designed filters, an AMB interpretation is always possible. Using this result, proper convolution of AMB filters can produce richer decompositions of the series that incorporate a priori desired features of the components and fully respect the ARIMA model for the observed series (hence no additional parameter needs to be estimated). The procedure is discussed in detail in the context of business-cycle estimation by means of the Hodrick-Prescott filter applied to a seasonally adjusted series or a trend–cycle component.
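
The Hodrick-Prescott filter discussed here has a simple closed form: the trend minimizes a fit-versus-smoothness penalty and solves a linear system. A minimal sketch (λ = 1600 is the conventional quarterly value; the linear-series check is my own illustration):

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend: minimize ||y - t||^2 + lam * ||D2 t||^2,
    solved in closed form as t = (I + lam * D2'D2)^{-1} y."""
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)     # (n-2, n) second-difference matrix
    trend = np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)
    return trend, y - trend                  # trend and cycle

y = np.arange(50.0)                          # a purely linear series
trend, cycle = hp_filter(y)                  # should pass through unchanged
```

A linear series incurs zero smoothness penalty, so the filter returns it as pure trend with a zero cycle, which is a quick sanity check on any implementation.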

11.
We propose and study the finite‐sample properties of a modified version of the self‐perturbed Kalman filter of Park and Jun (Electronics Letters 1992; 28: 558–559) for the online estimation of models subject to parameter instability. The perturbation term in the updating equation of the state covariance matrix is weighted by the estimate of the measurement error variance. This avoids the calibration of a design parameter, as the perturbation term is scaled by the amount of uncertainty in the data. It is shown by Monte Carlo simulations that this perturbation method is associated with good tracking of the dynamics of the parameters compared to other online algorithms and to classical and Bayesian methods. The standardized self‐perturbed Kalman filter is adopted to forecast the equity premium on the S&P 500 index under several model specifications, and to determine the extent to which realized variance can be used to predict excess returns.
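
The perturbation idea can be illustrated on a scalar time-varying regression: after each update, the state variance is inflated by a term scaled by the measurement-error variance, so the filter never stops adapting. This is an illustrative sketch only; the model, the constant `gamma`, and the break design are my assumptions, not the authors' exact algorithm:

```python
import numpy as np

def self_perturbed_kf(ys, xs, r_hat, gamma=0.1):
    """Online tracking of beta_t in y_t = beta_t * x_t + e_t.
    After each Kalman update, the state variance P is perturbed by
    gamma * r_hat, i.e. the perturbation is scaled by the estimated
    measurement-error variance rather than a hand-tuned constant."""
    beta, P, path = 0.0, 1.0, []
    for yt, xt in zip(ys, xs):
        S = xt * P * xt + r_hat
        K = P * xt / S
        beta = beta + K * (yt - xt * beta)
        P = (1 - K * xt) * P + gamma * r_hat   # self-perturbation keeps P > 0
        path.append(beta)
    return np.array(path)

rng = np.random.default_rng(6)
xs = rng.normal(size=200)
betas = np.where(np.arange(200) < 100, 1.0, -1.0)   # coefficient break at t=100
ys = betas * xs + rng.normal(scale=0.1, size=200)
path = self_perturbed_kf(ys, xs, r_hat=0.01)
```

Without the perturbation term, P would shrink toward zero and the filter would effectively freeze, missing the break; the inflation keeps the gain bounded away from zero so the estimate re-converges after the shift.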

12.
The paper considers a version of the question of how to define treatment and control groups in a dynamic setting where treatments can occur at any time (but only once). The version considered pre‐supposes that treatments as well as outcomes can be conceptualised as events occurring in temporal locations of a discrete time axis. It is proposed to think of effects as being dependent on both the time when and the time since the treatment occurred. The paper develops corresponding definitions of treatment and control groups, and proposes a notion of ‘comprehensive treatment effect’ that takes into account how treatment and control groups are generated. Based on this notion, the paper discusses causal interpretations that do not pre‐suppose a potential outcomes framework.

13.
In this paper we show how the Kalman filter, which is a recursive estimation procedure, can be applied to the standard linear regression model. The resulting "Kalman estimator" is compared with the classical least-squares estimator.
The applicability and (dis)advantages of the filter are illustrated by means of a case study that consists of two parts. In the first part we apply the filter to a regression model with constant parameters; in the second, the filter is applied to a regression model with time-varying stochastic parameters. The predictive powers of various "Kalman predictors" are compared with those of "least-squares predictors" using Theil's prediction-error coefficient U.
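
The constant-parameter case described above has a well-known property: with zero transition noise and a diffuse prior, the "Kalman estimator" reproduces recursive least squares and converges to the OLS solution. A sketch under those assumptions (the simulated regression and prior scale are mine):

```python
import numpy as np

def kalman_regression(ys, X, r=1.0):
    """Recursive 'Kalman estimator' for y_t = x_t' beta + e_t with constant
    beta: zero transition noise makes the filter equivalent to recursive
    least squares."""
    k = X.shape[1]
    beta = np.zeros(k)
    P = 1e6 * np.eye(k)                  # diffuse prior on the coefficients
    for yt, xt in zip(ys, X):
        S = xt @ P @ xt + r              # innovation variance
        K = P @ xt / S                   # gain vector
        beta = beta + K * (yt - xt @ beta)
        P = P - np.outer(K, xt) @ P      # covariance update
    return beta

rng = np.random.default_rng(4)
X = np.column_stack([np.ones(300), rng.normal(size=300)])
y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.5, size=300)
b_kf = kalman_regression(y, X)
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The recursive estimate agrees with the batch least-squares solution up to the (negligible) shrinkage induced by the finite prior; adding transition noise to the state equation turns the same loop into the time-varying-parameter case of the paper's second part.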

14.
Sun Qiuyun, Value Engineering (《价值工程》), 2011, 30(19): 51
The Kalman filter and its refinements are widely applied to the state estimation of moving vehicles, with good results. This paper gives a comprehensive survey of applications of the adaptive Kalman filter and the unscented Kalman filter.

15.
M. Roubens, Metrika, 1972, 19(1): 178–184
In the following, a causal pattern buried in autocorrelated noise is considered. The causal pattern may be described by models such as trends, polynomial trajectories, and growing sines. Based on a new criterion, called expolynomial, estimators of the coefficients of a polynomial model are obtained. Characteristic functions of the estimators are derived and the first two moments calculated. Continuous time series are briefly studied to show similarities between discrete and continuous observations. Popular exponential smoothing is a special case of expolynomial smoothing.
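
For reference, the exponential smoothing that the abstract identifies as the degree-zero special case is a one-line recursion (the smoothing constant here is an arbitrary illustrative choice):

```python
def ses(ys, alpha=0.3):
    """Simple exponential smoothing: level_t = alpha*y_t + (1-alpha)*level_{t-1},
    the special case of expolynomial smoothing for a constant (degree-zero)
    causal pattern."""
    level = ys[0]
    out = []
    for y in ys:
        level = alpha * y + (1 - alpha) * level
        out.append(level)
    return out

smoothed = ses([5.0] * 10)                   # a flat series is left unchanged
```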

16.
Journal of Econometrics, 2005, 127(2): 165–178
This paper is concerned with the specification for modelling financial leverage effect in the context of stochastic volatility (SV) models. Two alternative specifications co-exist in the literature. One is the Euler approximation to the well-known continuous time SV model with leverage effect and the other is the discrete time SV model of Jacquier et al. (J. Econometrics 122 (2004) 185). Using a Gaussian nonlinear state space form with uncorrelated measurement and transition errors, I show that it is easy to interpret the leverage effect in the conventional model whereas it is not clear how to obtain and interpret the leverage effect in the model of Jacquier et al. Empirical comparisons of these two models via Bayesian Markov chain Monte Carlo (MCMC) methods further reveal that the specification of Jacquier et al. is inferior. Simulation experiments are conducted to study the sampling properties of Bayes MCMC for the conventional model.

17.
Wang Wei, Logistics Sci-Tech (《物流科技》), 2009, 32(2): 137–139
This paper studies the joint forecasting process in Collaborative Planning, Forecasting and Replenishment (CPFR) and builds the corresponding forecasting model. In the modelling, a state-space formulation describes both actual market demand and observed market demand (sales volume), and a Kalman filter is used to predict the retailer's next-period sales; combined with the retailer's inventory policy, the retailer's next-period order quantity is then forecast.

18.
While using the Kalman filter innovation sequence for leak detection, a strong tracking filter (STF) performance index is introduced to locate change points in the signal. The pipeline pressure signal is passed through a Kalman filter to produce an innovation sequence, and a sequential probability ratio test (SPRT) applied to the innovations detects pipeline leaks. When the strong tracking filter tracks a stationary signal, the innovations are mutually orthogonal; after an abrupt change caused by a leak, tracking error arises and this orthogonality is destroyed. An orthogonality index for the innovation sequence is therefore established as an indicator of the leak change point, which at the same time monitors the filter's own performance.
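
The innovation-plus-SPRT idea can be sketched with a random-walk Kalman filter and a one-sided CUSUM-type likelihood-ratio test on the standardized innovations. This is a generic stand-in for the paper's method; the pressure simulation, noise variances, shift size, and threshold are all my assumptions:

```python
import numpy as np

def innovation_cusum(ys, mu1=-1.0, threshold=8.0):
    """Random-walk Kalman filter on a pressure-like signal; a one-sided
    CUSUM/SPRT on the standardized innovations flags a sustained downward
    shift (a crude stand-in for leak detection)."""
    x, P, q, r = ys[0], 1.0, 0.01, 0.1
    stat, alarm_at = 0.0, None
    for t, yt in enumerate(ys):
        P_pred = P + q
        S = P_pred + r
        v = yt - x                           # innovation
        z = v / np.sqrt(S)                   # standardized innovation
        # Log-likelihood-ratio increment for a shift of mu1 standard
        # deviations in the innovations, reset at zero (CUSUM form)
        stat = max(0.0, stat + mu1 * z - 0.5 * mu1**2)
        if alarm_at is None and stat > threshold:
            alarm_at = t
        K = P_pred / S
        x, P = x + K * v, (1 - K) * P_pred
    return alarm_at

rng = np.random.default_rng(7)
pressure = 10.0 + rng.normal(scale=np.sqrt(0.1), size=300)
pressure[100:] -= 0.5 * np.arange(1, 201)    # leak: pressure ramps down after t=100
alarm = innovation_cusum(pressure)
```

Under no leak the standardized innovations behave like white noise and the statistic hovers near zero; a sustained pressure drop makes them systematically negative, so the statistic grows roughly linearly and crosses the threshold within a few samples of the onset.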

19.
The purpose of this paper is (1) to present a new dynamic integrated performance measurement system (IPMS) based on a managerial view, and (2) to present preliminary empirical evidence on the importance of performance measures in small Finnish technology companies using the IPMS as the framework for the survey. The aim is to develop a useful managerial tool for measuring and improving performance in business firms. The system is intended to include a comprehensive set of relevant factors and dimensions, which together form an integrated managerial system of performance measurement. The proposed IPMS is linked to the idea of activity-based costing (ABC). It consists of seven main factors and the causal chain connecting these factors. The factors are classified as two external factors (financial performance and competitiveness) and five internal factors (costs, production factors, activities, products, and revenues). The main idea of the IPMS is to follow the use (transformation) of resources from the point of the very first (elementary) resource allocation to the point when the results of the allocation are realized as revenues. In the causal chain, the factor at any point along the chain is regarded as a determinant of the factor that succeeds it. Moreover, the next resource allocation decision is dynamically affected by the results of the former decisions, thus allowing for learning-by-doing. The IPMS is also used as a framework for a postal questionnaire completed by 93 small Finnish technology firms. These companies put great emphasis on the importance of employee motivation (production factors dimension), customer satisfaction (products), product profitability (revenues), and company profitability, liquidity, and capital structure (financial performance) in the measurement of performance. Factor analysis is used to classify the companies into three groups on the basis of performance measurement.

20.
We explore the time variation of factor loadings and abnormal returns in the context of a four-factor model. Our methodology, based on an application of the Kalman filter and on endogenous uncertainty, overcomes several limitations of competing approaches used in the literature. Besides taking learning into account, it does not rely on any conditioning information, and it only imposes minimal assumptions on the time variation of the parameters. Our estimates capture both short- and long-term fluctuations of risk loadings and abnormal returns, also showing marked variation across US industry portfolios. The results from mean-variance spanning tests indicate that our baseline model yields accurate predictions and can therefore improve pricing and performance measurement.
