Similar Literature
20 similar records found.
1.
Stochastic demographic forecasting
"This paper describes a particular approach to stochastic population forecasting, which is implemented for the U.S.A. through 2065. Statistical time series methods are combined with demographic models to produce plausible long run forecasts of vital rates, with probability distributions. The resulting mortality forecasts imply gains in future life expectancy that are roughly twice as large as those forecast by the Office of the Social Security Actuary....Resulting stochastic forecasts of the elderly population, elderly dependency ratios, and payroll tax rates for health, education and pensions are presented."  相似文献   

2.
3.
Journal of Econometrics, 1986, 32(3): 297-332
The specifications of multi-market disequilibrium econometric models are clouded with different notions of effective demand. This paper points out that the specification of such models for econometric analysis can be achieved from the basic concept of fixed-price equilibrium and without the use of the concepts of effective demand. The specifications of Ito and Gourieroux, Laffont and Monfort are justified within this framework. With proper stochastic elements introduced in the system, the derived likelihood function from our approach does not involve multiple integrals and is computationally tractable for models with many disequilibrium markets.
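For intuition, here is the classic single-market min-condition likelihood with normal errors, whose closed form shows why no numerical integration is needed in the one-market case; the paper's multi-market construction is more involved, and all symbols below are generic placeholders.

```python
import numpy as np
from scipy.stats import norm

def min_condition_loglik(q, xd, xs, beta_d, beta_s, sd, ss):
    """Log-likelihood of observed quantity q = min(demand, supply) for a
    single market with normal errors; the density has a closed form, so
    no numerical integration is required."""
    mu_d, mu_s = xd @ beta_d, xs @ beta_s
    f_d = norm.pdf(q, mu_d, sd)        # density of demand at q
    f_s = norm.pdf(q, mu_s, ss)        # density of supply at q
    S_d = norm.sf(q, mu_d, sd)         # P(demand > q)
    S_s = norm.sf(q, mu_s, ss)         # P(supply > q)
    return np.sum(np.log(f_d * S_s + f_s * S_d))
```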

4.
Estimation and prediction in high dimensional multivariate factor stochastic volatility models is an important and active research area, because such models allow a parsimonious representation of multivariate stochastic volatility. Bayesian inference for factor stochastic volatility models is usually done by Markov chain Monte Carlo methods (often by particle Markov chain Monte Carlo methods), which are usually slow for high dimensional or long time series because of the large number of parameters and latent states involved. Our article makes two contributions. The first is to propose a fast and accurate variational Bayes method to approximate the posterior distribution of the states and parameters in factor stochastic volatility models. The second is to extend this batch methodology to develop fast sequential variational updates for prediction as new observations arrive. The methods are applied to simulated and real datasets, and shown to produce good approximate inference and prediction compared to the latest particle Markov chain Monte Carlo approaches, but are much faster.
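A minimal sketch of Gaussian variational Bayes with reparameterization gradients, applied to a one-parameter toy volatility model rather than the paper's factor stochastic volatility model; the data, prior, and step sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(0, np.exp(0.4 / 2), size=200)   # toy data, true log-variance 0.4
T, ssq = len(y), np.sum(y**2)

def dlogp_dh(h):
    # d/dh of [log p(y | h) + log p(h)] for y_t ~ N(0, e^h), prior h ~ N(0, 1)
    return -T / 2 + 0.5 * np.exp(-h) * ssq - h

mu, log_sig = 0.0, -1.0
for step in range(4000):                 # stochastic gradient ascent on ELBO
    z = rng.standard_normal(32)
    h = mu + np.exp(log_sig) * z         # reparameterization trick
    g = dlogp_dh(h)
    grad_mu = g.mean()
    grad_ls = (g * (h - mu)).mean() + 1.0   # +1 from the Gaussian entropy term
    mu += 0.005 * grad_mu
    log_sig += 0.005 * grad_ls

print(f"q(h) = N({mu:.3f}, {np.exp(log_sig):.3f}^2)")
```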

5.
Adding multivariate stochastic volatility of a flexible form to large vector autoregressions (VARs) involving over 100 variables has proved challenging owing to computational considerations and overparametrization concerns. The existing literature works with either homoskedastic models or smaller models with restrictive forms for the stochastic volatility. In this paper, we develop composite likelihood methods for large VARs with multivariate stochastic volatility. These involve estimating large numbers of parsimonious models and then taking a weighted average across these models. We discuss various schemes for choosing the weights. In our empirical work involving VARs of up to 196 variables, we show that composite likelihood methods forecast much better than the most popular large VAR approach, which is computationally practical in very high dimensions: the homoskedastic VAR with Minnesota prior. We also compare our methods to various popular approaches that allow for stochastic volatility using medium and small VARs involving up to 20 variables. We find our methods to forecast appreciably better than these as well.
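A cartoon of the composite idea, assuming equal weights: fit every bivariate VAR(1) submodel by OLS and average the one-step forecasts. The paper's weighting schemes and stochastic volatility components are not reproduced here.

```python
import numpy as np
from itertools import combinations

def var1_ols(Y):
    """OLS fit of a VAR(1): Y_t = c + A Y_{t-1} + e_t."""
    X = np.hstack([np.ones((len(Y) - 1, 1)), Y[:-1]])
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return B  # rows: intercept, then lag coefficients

def composite_forecast(Y):
    """Equal-weight average of one-step forecasts from all bivariate
    VAR(1) submodels -- a simple stand-in for the weighting schemes
    discussed in the paper."""
    n = Y.shape[1]
    fcasts, counts = np.zeros(n), np.zeros(n)
    for i, j in combinations(range(n), 2):
        B = var1_ols(Y[:, [i, j]])
        f = np.array([1.0, *Y[-1, [i, j]]]) @ B
        fcasts[[i, j]] += f
        counts[[i, j]] += 1
    return fcasts / counts

rng = np.random.default_rng(2)
Y = rng.standard_normal((120, 8)).cumsum(axis=0) * 0.1  # toy panel
print(composite_forecast(Y))
```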

6.
Pareto-Koopmans efficiency in Data Envelopment Analysis (DEA) is extended to stochastic inputs and outputs via probabilistic input-output vector comparisons in a given empirical production (possibility) set. In contrast to other approaches which have used Chance Constrained Programming formulations in DEA, the emphasis here is on joint chance constraints. An assumption of arbitrary but known probability distributions leads to the P-Model of chance constrained programming. A necessary condition for a DMU to be stochastically efficient and a sufficient condition for a DMU to be non-stochastically efficient are provided. Deterministic equivalents using the zero order decision rules of chance constrained programming and multivariate normal distributions take the form of an extended version of the additive model of DEA. Contacts are also maintained with all of the other presently available deterministic DEA models in the form of easily identified extensions which can be used to formalize the treatment of efficiency when stochastic elements are present.
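For reference, the deterministic additive DEA model that the chance-constrained version extends can be solved as a small linear program; the data below are invented, and a DMU is efficient exactly when its optimal slack sum is zero.

```python
import numpy as np
from scipy.optimize import linprog

def additive_dea(X, Y, o):
    """Deterministic additive DEA model for DMU o (the non-stochastic
    base case the paper's chance-constrained version extends).
    X: inputs (m x n), Y: outputs (k x n); efficient iff all slacks are 0."""
    m, n = X.shape
    k = Y.shape[0]
    c = -np.concatenate([np.zeros(n), np.ones(m + k)])  # maximize slack sum
    A_eq = np.block([
        [X, np.eye(m), np.zeros((m, k))],          # X lam + s_in  = x_o
        [Y, np.zeros((k, m)), -np.eye(k)],         # Y lam - s_out = y_o
        [np.ones((1, n)), np.zeros((1, m + k))],   # convexity: sum(lam) = 1
    ])
    b_eq = np.concatenate([X[:, o], Y[:, o], [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return -res.fun  # 0  =>  DMU o is (deterministically) efficient

X = np.array([[4.0, 7.0, 8.0, 4.0], [3.0, 3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
for o in range(4):
    print(o, round(additive_dea(X, Y, o), 3))
```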

7.
Linkage errors can occur when probability‐based methods are used to link records from two distinct data sets corresponding to the same target population. Current approaches to modifying standard methods of regression analysis to allow for these errors only deal with the case of two linked data sets and assume that the linkage process is complete, that is, all records on the two data sets are linked. This study extends these ideas to accommodate the situation when more than two data sets are probabilistically linked and the linkage is incomplete.
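A sketch of the exchangeable-linkage bias adjustment in the two-dataset, complete-linkage case that the paper generalizes, assuming the correct-match probability `lam` is known; all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)
n, lam = 500, 0.8          # lam: probability a link is correct (assumed known)

x = rng.uniform(0, 10, n)
y = 2.0 + 1.5 * x + rng.normal(0, 1, n)

# Simulate incorrect links: each record keeps its own y with prob lam,
# otherwise it is matched to a uniformly chosen record.
wrong = rng.random(n) > lam
z = y.copy()
z[wrong] = y[rng.integers(0, n, wrong.sum())]

X = np.column_stack([np.ones(n), x])

# Naive OLS on the linked data is attenuated toward zero.
beta_naive = np.linalg.lstsq(X, z, rcond=None)[0]

# Exchangeable-linkage adjustment: E[z | X] = B X beta with
# B = lam*I + (1-lam)/(n-1) * (J - I); regress z on BX instead of X.
BX = lam * X + (1 - lam) * (X.sum(axis=0) - X) / (n - 1)
beta_adj = np.linalg.lstsq(BX, z, rcond=None)[0]

print("naive:", beta_naive, " adjusted:", beta_adj)
```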

8.
Effective linkage detection and gene mapping requires analysis of data jointly on members of extended pedigrees, jointly at multiple genetic markers. Exact likelihood computation is then often infeasible, but Markov chain Monte Carlo (MCMC) methods permit estimation of posterior probabilities of genome sharing among relatives, conditional upon marker data. In principle, MCMC also permits estimation of linkage analysis location score curves, but in practice effective MCMC samplers are hard to find. Although the whole-meiosis Gibbs sampler (M-sampler) performs well in some cases, for extended pedigrees and tightly linked markers better samplers are needed. However, using the M-sampler as a proposal distribution in a Metropolis-Hastings algorithm does allow genetic interference to be incorporated into the analysis.
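As a generic illustration of the last point, here is a Metropolis-Hastings sampler with an independence proposal on a toy bimodal target; the heavy-tailed t proposal merely stands in for the role the M-sampler plays in the paper.

```python
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(4)

# Toy target: a bimodal density we can evaluate but not sample directly.
def log_target(x):
    return np.logaddexp(norm.logpdf(x, -2, 0.7), norm.logpdf(x, 2, 0.7))

# Independence proposal (plays the role the M-sampler plays in the paper).
prop = t(df=3, scale=3)

x, chain = 0.0, []
for _ in range(20_000):
    xp = prop.rvs(random_state=rng)
    # Metropolis-Hastings ratio for an independence proposal.
    log_alpha = (log_target(xp) - log_target(x)
                 + prop.logpdf(x) - prop.logpdf(xp))
    if np.log(rng.random()) < log_alpha:
        x = xp
    chain.append(x)

print("posterior mean estimate:", np.mean(chain))
```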

9.
Recently, there has been considerable work on stochastic time-varying coefficient models as vehicles for modelling structural change in the macroeconomy with a focus on the estimation of the unobserved paths of random coefficient processes. The dominant estimation methods, in this context, are based on various filters, such as the Kalman filter, that are applicable when the models are cast in state space representations. This paper introduces a new class of autoregressive bounded processes that decompose a time series into a persistent random attractor, a time varying autoregressive component, and martingale difference errors. The paper examines, rigorously, alternative kernel based, nonparametric estimation approaches for such models and derives their basic properties. These estimators have long been studied in the context of deterministic structural change, but their use in the presence of stochastic time variation is novel. The proposed inference methods have desirable properties such as consistency and asymptotic normality and allow a tractable studentization. In extensive Monte Carlo and empirical studies, we find that the methods exhibit very good small sample properties and can shed light on important empirical issues such as the evolution of inflation persistence and the purchasing power parity (PPP) hypothesis.
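A minimal sketch of kernel-based estimation of a time-varying AR(1) coefficient by locally weighted least squares; the Gaussian kernel, bandwidth, and simulated coefficient path are assumptions, and the paper's studentized inference is not shown.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 600
beta_true = 0.3 + 0.4 * np.sin(np.linspace(0, np.pi, T))  # smooth path
y = np.zeros(T)
for t in range(1, T):
    y[t] = beta_true[t] * y[t - 1] + rng.standard_normal()

def local_ar1(y, h):
    """Kernel-weighted least squares estimate of a time-varying AR(1)
    coefficient: at each point, regress y_t on y_{t-1} with Gaussian
    weights in rescaled time; h is the bandwidth."""
    T = len(y)
    grid = np.arange(1, T)
    est = np.empty(T - 1)
    for i, t in enumerate(grid):
        w = np.exp(-0.5 * ((grid - t) / (h * T)) ** 2)
        num = np.sum(w * y[grid] * y[grid - 1])
        den = np.sum(w * y[grid - 1] ** 2)
        est[i] = num / den
    return est

beta_hat = local_ar1(y, h=0.08)
print("max abs error:", np.abs(beta_hat - beta_true[1:]).max().round(2))
```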

10.
We introduce a new family of network models, called hierarchical network models, that allow us to represent in an explicit manner the stochastic dependence among the dyads (random ties) of the network. In particular, each member of this family can be associated with a graphical model defining conditional independence clauses among the dyads of the network, called the dependency graph. Every network model with dyadic independence assumption can be generalized to construct members of this new family. Using this new framework, we generalize the Erdős–Rényi and the β models to create hierarchical Erdős–Rényi and β models. We describe various methods for parameter estimation, as well as simulation studies for models with sparse dependency graphs.
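One plausible toy reading of dyadic dependence via shared latent variables (not the paper's dependency-graph construction): dyads whose endpoints share a block draw their tie probability from a common block-level effect, inducing dependence among those dyads.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(6)

def hierarchical_er(n_nodes, n_blocks, a=2.0, b=8.0):
    """Toy hierarchical Erdos-Renyi-style simulation: dyads in the same
    block share a latent tie probability (dependence); dyads spanning
    blocks use the average effect and stay independent."""
    block_p = rng.beta(a, b, size=n_blocks)     # latent block-level effects
    node_block = rng.integers(0, n_blocks, size=n_nodes)
    edges = set()
    for i, j in combinations(range(n_nodes), 2):
        same = node_block[i] == node_block[j]
        p = block_p[node_block[i]] if same else block_p.mean()
        if rng.random() < p:
            edges.add((i, j))
    return edges

print(len(hierarchical_er(50, 4)), "ties")
```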

11.
Most disequilibrium and shortage models of centrally planned economies fall into three categories: testable excess demand (the Portes school), disequilibrium indicators, and Kornai's economics of shortage. These models have generated numerous controversies and conflicting empirical results. However, this paper argues that some disputes are not caused by the theoretical features of the models but rather by the utilisation of different estimation methods that are not directly comparable. This suggests that several controversies are more apparent than real and can be resolved through the improvement of computational techniques and statistical hypothesis testing theory.

12.
This paper applies Bayesian nonlinear hierarchical models to multivariate claims reserving across different lines of business. A suitable model structure is designed that combines nonlinear hierarchical models with Bayesian methods; the WinBUGS software is used to model classical run-off triangle data from actuarial practice, and the MCMC method yields the full predictive distribution of the claims reserve. This approach extends and goes beyond the scope of existing multivariate reserving methods, which focus on best estimates and prediction mean squared errors. Inference based on the posterior distribution within a Bayesian framework plays an important role in the solvency supervision of non-life insurers and in industry decision making.
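A crude sketch of producing a predictive reserve distribution from a run-off triangle: deterministic chain-ladder development factors plus assumed multiplicative lognormal noise. The paper's Bayesian nonlinear hierarchical model (fit in WinBUGS by MCMC) is far richer; the triangle and the coefficient of variation below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# A small cumulative run-off triangle (rows: accident years, cols: dev years).
tri = np.array([
    [1001.0, 1855.0, 2423.0, 2988.0],
    [1113.0, 2103.0, 2774.0, np.nan],
    [1265.0, 2433.0, np.nan, np.nan],
    [1490.0, np.nan, np.nan, np.nan],
])

# Chain-ladder development factors from the observed part of the triangle.
f = []
for j in range(tri.shape[1] - 1):
    obs = ~np.isnan(tri[:, j + 1])
    f.append(tri[obs, j + 1].sum() / tri[obs, j].sum())

# Crude predictive distribution: apply each factor with multiplicative
# lognormal noise (cv is an assumed coefficient of variation, not estimated).
n_sims, cv = 10_000, 0.05
reserves = np.zeros(n_sims)
for s in range(n_sims):
    total = 0.0
    for i in range(1, tri.shape[0]):
        last_j = tri.shape[1] - 1 - i          # last observed dev year
        c = tri[i, last_j]
        for j in range(last_j, tri.shape[1] - 1):
            c *= f[j] * rng.lognormal(-0.5 * cv**2, cv)  # mean-one noise
        total += c - tri[i, last_j]            # ultimate minus paid-to-date
    reserves[s] = total

print(np.percentile(reserves, [5, 50, 95]).round(0))
```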

13.
One of the pressing problems of mechanical reliability still requiring a satisfactory solution is that of ensuring the guaranteed fatigue life of a component or structure subject to random dynamic loading. In the past, this problem has generally been solved in technical practice by the choice of a sufficiently large "safety factor" when dimensioning critically stressed parts of a complex structure.
Application of probability and statistical methods now offers the possibility of developing a theory of reliability of mechanical systems, where the risk of failure can be expressed as a probability, taking into account effects of random loading processes, which characterize either the functioning of the system itself or external environmental operating conditions.
In the following paper we describe one approach to the solution of this problem. The solution consists of a combination of the Weibull–Freudenthal–Gumbel theory of fatigue estimation using (S, N, P) relations, the Palmgren–Miner hypothesis of linear accumulation of damage and the theory of stationary random processes having a given autocorrelation function or spectral density. Several other stochastic models are discussed in [1]. The subject of this paper was chosen in acknowledgement of the fact that H. C. Hamaker in his applied theoretical work also dealt with a related problem concerning the breaking strength of glass [2].
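The Palmgren–Miner step is easy to make concrete. Assuming a power-law S-N curve and Rayleigh-distributed stress ranges (a standard stand-in for narrow-band Gaussian random loading), damage accumulates linearly and failure is predicted when the damage sum reaches one; all constants below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)

# Assumed S-N curve: N(S) = C * S**(-m) cycles to failure at stress range S.
C, m = 1e12, 3.0
n_to_failure = lambda s: C * s**(-m)

# Stand-in for a stationary random loading process: Rayleigh-distributed
# stress ranges, as arise for narrow-band Gaussian stress histories.
cycles = rng.rayleigh(scale=40.0, size=100_000)

# Palmgren-Miner linear damage accumulation: D = sum(n_i / N_i),
# with failure predicted when D reaches 1.
damage_per_block = np.sum(1.0 / n_to_failure(cycles))
blocks_to_failure = 1.0 / damage_per_block
print(f"predicted life: {blocks_to_failure * len(cycles):.3e} cycles")
```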

14.
In this paper, a method is introduced for approximating the likelihood for the unknown parameters of a state space model. The approximation converges to the true likelihood as the simulation size goes to infinity. In addition, the approximating likelihood is continuous as a function of the unknown parameters under rather general conditions. The approach advocated is fast and robust, and it avoids many of the pitfalls associated with current techniques based upon importance sampling. We assess the performance of the method by considering a linear state space model, comparing the results with the Kalman filter, which delivers the true likelihood. We also apply the method to a non-Gaussian state space model, the stochastic volatility model, finding that the approach is efficient and effective. Applications to continuous time finance models and latent panel data models are considered. Two different multivariate approaches are proposed. The neoclassical growth model is considered as an application.
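For the benchmark case mentioned in the abstract, the Kalman filter delivers the exact likelihood of a linear Gaussian state space model. A minimal scalar version with invented parameters:

```python
import numpy as np

def kalman_loglik(y, phi, sig_state, sig_obs):
    """Exact log-likelihood via the Kalman filter for
        state: a_{t+1} = phi * a_t + eta_t,  eta ~ N(0, sig_state^2)
        obs:   y_t     = a_t + eps_t,        eps ~ N(0, sig_obs^2)
    """
    a, P = 0.0, sig_state**2 / max(1e-12, 1 - phi**2)  # stationary init
    ll = 0.0
    for yt in y:
        F = P + sig_obs**2                 # prediction variance of y_t
        v = yt - a                         # one-step prediction error
        ll += -0.5 * (np.log(2 * np.pi * F) + v**2 / F)
        K = P / F                          # Kalman gain
        a, P = a + K * v, P * (1 - K)      # measurement update
        a, P = phi * a, phi**2 * P + sig_state**2  # time update
    return ll

rng = np.random.default_rng(9)
a, y = 0.0, []
for _ in range(300):
    a = 0.95 * a + 0.3 * rng.standard_normal()
    y.append(a + 0.5 * rng.standard_normal())
print(kalman_loglik(np.array(y), 0.95, 0.3, 0.5))
```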

15.
In econometric models of disequilibrium, observations are generated by one of two regimes. This paper discusses alternative methods of classifying observations into the two regimes and provides a Monte Carlo assessment of the most widely applicable classification rule.
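One widely used rule assigns each observation to the regime with the larger posterior probability under the min condition; whether this is the specific rule assessed in the paper cannot be told from the abstract. A sketch with normal errors:

```python
import numpy as np
from scipy.stats import norm

def prob_demand_regime(q, mu_d, mu_s, sd, ss):
    """Posterior probability that an observed quantity q = min(D, S)
    came from the demand side (i.e. D < S), for normal demand and
    supply equations with means mu_d, mu_s and s.d.'s sd, ss."""
    num = norm.pdf(q, mu_d, sd) * norm.sf(q, mu_s, ss)
    den = num + norm.pdf(q, mu_s, ss) * norm.sf(q, mu_d, sd)
    return num / den

# Classify as demand-constrained when the posterior probability > 0.5.
print(prob_demand_regime(q=1.0, mu_d=1.2, mu_s=2.0, sd=0.5, ss=0.5))
```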

16.
This paper uses nonlinear error correction models to study yield movements in the US Treasury Bill Market. Nonlinear error correction arises because portfolio adjustment is an ‘on-off’ process, which occurs only when disequilibrium in the bill market is large enough to induce investors to incur the transaction costs associated with buying/selling bills. This, together with heterogeneity of transaction costs, implies that the strength of aggregate error correction depends on both the distribution of costs and the extent of disequilibrium in the market. Smooth transition models are used to describe an aggregate adjustment process which is strong when the market is distant from equilibrium, but becomes weaker as the market approaches equilibrium. Linearity tests indicate that the types of nonlinearities that would be induced by transactions costs are statistically significant, and estimated models which incorporate these nonlinearities outperform their linear counterparts, both in sample and out of sample.
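A toy ESTAR-style simulation of the mechanism described: the strength of error correction is negligible near equilibrium (where transaction costs bind) and approaches full strength for large deviations; all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(10)

# ESTAR-style smooth transition error correction: the adjustment weight
# is ~0 for small deviations and approaches 1 for large ones.
theta, gamma, T = 0.5, 2.0, 500
z = np.zeros(T)               # deviation from equilibrium
for t in range(1, T):
    weight = 1.0 - np.exp(-gamma * z[t - 1] ** 2)   # 0 near eq., ~1 far away
    z[t] = z[t - 1] - theta * weight * z[t - 1] + 0.2 * rng.standard_normal()

print("mean |deviation|:", np.abs(z).mean().round(3))
```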

17.
This paper develops methods for estimating and forecasting in Bayesian panel vector autoregressions of large dimensions with time‐varying parameters and stochastic volatility. We exploit a hierarchical prior that takes into account possible pooling restrictions involving both VAR coefficients and the error covariance matrix, and propose a Bayesian dynamic learning procedure that controls for various sources of model uncertainty. We tackle computational concerns by means of a simulation‐free algorithm that relies on analytical approximations to the posterior. We use our methods to forecast inflation rates in the eurozone and show that these forecasts are superior to alternative methods for large vector autoregressions.
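A cartoon of the pooling idea only, assuming a fixed shrinkage weight w: unit-specific AR(1) estimates are pulled toward the pooled estimate. The paper's hierarchical prior, time variation, and simulation-free posterior approximation are not attempted here.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy panel: AR(1) coefficients for N units, shrunk toward the pooled
# estimate with an assumed fixed weight w instead of a learned one.
N, T, w = 12, 80, 0.6
betas_ols, pooled_num, pooled_den = [], 0.0, 0.0
for _ in range(N):
    b = rng.uniform(0.3, 0.9)          # each unit's true coefficient
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = b * y[t - 1] + rng.standard_normal()
    num, den = np.sum(y[1:] * y[:-1]), np.sum(y[:-1] ** 2)
    betas_ols.append(num / den)
    pooled_num += num
    pooled_den += den

beta_pooled = pooled_num / pooled_den
betas_shrunk = [w * beta_pooled + (1 - w) * b for b in betas_ols]
print(np.round(betas_shrunk, 2))
```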

18.
The past decade has seen a number of advances in modelling disequilibrium dynamics. This paper draws on separate approaches to disequilibrium dynamics to demonstrate a Keynesian result concerning the formal relevance of “animal spirits” in production economies. Specifically, it is shown that a parameter that can be associated with the “animal spirits” of firms is crucial to the stability of full employment equilibrium in a production economy. This approach to “animal spirits” is different to that taken by recent New Keynesian DSGE-type models, but similar in spirit to “Old Keynesian” approaches, including that of the General Theory. The corollary of the main conclusion is that price flexibility is not a sufficient condition for convergence on full employment equilibrium.

19.
This paper analyzes a general class of non-normal density functions (dubbed Sargan densities) in the context of the ordinary regression model and the simple one-market disequilibrium model. Use of the normal density in disequilibrium models is unwieldy, especially for multimarket models, since the application of maximum likelihood methods requires numerical evaluation of multiple integrals. These difficulties are avoided with the Sargan densities, and based on both asymptotic results and limited sampling experiments, these densities appear to offer a promising alternative to the normal in disequilibrium models.
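One member of the Sargan family makes the computational point concrete: both the density and the distribution function have closed forms, so likelihoods built from them need no numerical integration.

```python
import numpy as np

def sargan_pdf(u, a):
    """A first-order Sargan density: f(u) = (a/4)(1 + a|u|) exp(-a|u|)."""
    return 0.25 * a * (1 + a * np.abs(u)) * np.exp(-a * np.abs(u))

def sargan_cdf(u, a):
    """Closed-form CDF -- the property that lets disequilibrium
    likelihoods avoid numerical integration."""
    u = np.asarray(u, dtype=float)
    lower = (0.5 - 0.25 * a * u) * np.exp(a * u)          # for u <= 0
    upper = 1.0 - (0.5 + 0.25 * a * u) * np.exp(-a * u)   # for u >= 0
    return np.where(u <= 0, lower, upper)

# Sanity checks: pdf integrates to ~1 and CDF(0) = 0.5.
grid = np.linspace(-20, 20, 200_001)
du = grid[1] - grid[0]
print((sargan_pdf(grid, 1.5) * du).sum().round(6), sargan_cdf(0.0, 1.5))
```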

20.
Two popular approaches for efficiency measurement are a non-stochastic approach called data envelopment analysis (DEA) and a parametric approach called stochastic frontier analysis (SFA). Both approaches have modeling difficulty, particularly for ranking firm efficiencies. In this paper, a new parametric approach using quantile statistics is developed. The quantile statistic relies less on the stochastic model than SFA methods, and accounts for a firm's relationship to the other firms in the study by acknowledging the firm's influence on the empirical model, and its relationship, in terms of similarity of input levels, to the other firms.
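One simple reading of a peer-relative quantile score (the paper's exact statistic is not given in the abstract): rank each firm's output within the set of firms having the most similar input levels.

```python
import numpy as np

rng = np.random.default_rng(12)

# Toy single-input, single-output data.
x = rng.uniform(1, 10, 60)
y = 2 * x ** 0.5 * rng.uniform(0.6, 1.0, 60)

def quantile_efficiency(x, y, k=10):
    """Score each firm by the empirical quantile of its output among the
    k firms with the most similar input levels -- one simple way to make
    an efficiency ranking depend on comparable peers."""
    scores = np.empty(len(x))
    for i in range(len(x)):
        peers = np.argsort(np.abs(x - x[i]))[:k]   # k nearest inputs (incl. self)
        scores[i] = np.mean(y[peers] <= y[i])      # quantile within peer group
    return scores

print(np.round(quantile_efficiency(x, y), 2))
```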
