Similar articles
 20 similar articles found.
1.
Real yields and expected inflation are two unobserved components of the nominal yield curve. The primary objectives of this study are to decompose nominal yields into their expected real yield and inflation components and to examine their behaviour using state-space and regime-switching frameworks. The dynamic yield-curve models capture three well-known latent factors – level, slope, and curvature – that accurately aggregate the information in the nominal yields and in the expected real and inflation components for all maturities. The nominal yield curve is found to slope upward slightly, by about 120 basis points, while the real yield curve slopes upward by about 20 basis points and the expected inflation curve is virtually flat at slightly above 2 per cent. The regime-switching estimations reveal that the nominal yield, real yield and expected inflation curves have shifted down significantly since 1999.

2.
Forecasting economic time series using targeted predictors
This paper studies two refinements to the method of factor forecasting. First, we consider the method of quadratic principal components, which allows the link function between the predictors and the factors to be non-linear. Second, the factors used in the forecasting equation are estimated in a way that takes into account that the goal is to forecast a specific series. This is accomplished by applying the method of principal components to ‘targeted predictors’ selected using hard and soft thresholding rules. Our three main findings can be summarized as follows. First, we find improvements at all forecast horizons over the current diffusion index forecasts by estimating the factors using fewer but informative predictors. Allowing for non-linearity often leads to additional gains. Second, forecasting volatile one-month-ahead inflation warrants a high degree of targeting to screen out the noisy predictors. A handful of variables, notably relating to housing starts and interest rates, are found to have systematic predictive power for inflation at all horizons. Third, the targeted predictors selected by both soft and hard thresholding change with the forecast horizon and the sample period. Holding the set of predictors fixed, as is current practice in factor forecasting, is unnecessarily restrictive.
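As a hedged illustration of the targeted-predictors idea described above, the sketch below applies a hard-thresholding screen followed by principal components and a diffusion-index regression. The t-statistic screen, the variable layout, and the function name are assumptions for exposition, not the authors' code.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA

def targeted_pc_forecast(X, y, h=1, t_threshold=1.65, n_factors=3):
    """Illustrative diffusion-index forecast from hard-thresholded ("targeted") predictors.

    X : (T, N) array of standardized candidate predictors
    y : (T,) series to forecast h steps ahead
    """
    T = len(y)
    y_lead = y[h:]  # y_{t+h}
    # Hard thresholding: keep predictor j if its t-statistic exceeds the
    # threshold in a regression of y_{t+h} on x_{j,t} and y_t.
    keep = []
    for j in range(X.shape[1]):
        Z = sm.add_constant(np.column_stack([X[:T - h, j], y[:T - h]]))
        if abs(sm.OLS(y_lead, Z).fit().tvalues[1]) > t_threshold:
            keep.append(j)
    # Principal components of the targeted predictors only
    # (assumes enough predictors survive the screen).
    factors = PCA(n_components=n_factors).fit_transform(X[:, keep])
    # Forecasting regression: y_{t+h} on a constant, factors_t, and y_t.
    W = sm.add_constant(np.column_stack([factors[:T - h], y[:T - h]]))
    fit = sm.OLS(y_lead, W).fit()
    w_latest = np.concatenate(([1.0], factors[-1], [y[-1]]))
    return float(fit.params @ w_latest)  # h-step-ahead point forecast
```

A soft-thresholding variant would instead rank predictors by a shrinkage criterion (e.g. lasso coefficients) and retain the highest-ranked ones rather than applying a fixed t-statistic cutoff.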

3.
This paper discusses a factor model for short-term forecasting of GDP growth using a large number of monthly and quarterly time series in real time. To take into account the different periodicities of the data and missing observations at the end of the sample, the factors are estimated by applying an EM algorithm combined with a principal components estimator. We discuss some in-sample properties of the estimator in a real-time environment and propose alternative methods for forecasting quarterly GDP with monthly factors. In the empirical application, we use a novel real-time dataset for the German economy. Employing a recursive forecast experiment, we evaluate the forecast accuracy of the factor model with respect to German GDP. Furthermore, we investigate the role of revisions in forecast accuracy and assess the contribution of timely monthly observations to the forecast performance. Finally, we compare the performance of the mixed-frequency model with that of a factor model based on time-aggregated quarterly data.
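The EM-plus-principal-components step for a panel with a ragged edge can be sketched roughly as below. This is a generic illustration of that algorithm class with assumed variable names, not the authors' implementation, and it omits the mixed-frequency bridging to quarterly GDP.

```python
import numpy as np

def em_pca_factors(X, n_factors=2, n_iter=50):
    """Estimate factors from a standardized (T x N) panel X containing NaNs
    (e.g. missing observations at the end of the sample) by alternating
    between filling missing cells with their common-component fit and
    re-estimating factors and loadings by principal components."""
    mask = np.isnan(X)
    X_fill = np.where(mask, 0.0, X)  # start missing cells at the (zero) mean
    for _ in range(n_iter):
        cov = X_fill.T @ X_fill / X_fill.shape[0]
        eigval, eigvec = np.linalg.eigh(cov)
        loadings = eigvec[:, ::-1][:, :n_factors]  # eigenvectors of the largest eigenvalues
        factors = X_fill @ loadings
        X_fill = np.where(mask, factors @ loadings.T, X)  # E-step: refill missing cells
    return factors, loadings
```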

4.
The empirical analysis of monetary policy requires the construction of instruments for future expected inflation. Dynamic factor models have been applied rather successfully to inflation forecasting. In fact, two competing methods have recently been developed to estimate large-scale dynamic factor models based, respectively, on static and dynamic principal components. This paper combines the econometric literature on dynamic principal components and the empirical analysis of monetary policy. We assess the two competing methods for extracting factors on the basis of their success in instrumenting future expected inflation in the empirical analysis of monetary policy. We use two large data sets of macroeconomic variables for the USA and for the Euro area. Our results show that estimated factors do provide a useful parsimonious summary of the information used in designing monetary policy.

5.
An important issue in interest rate modeling is the number and nature of the random factors driving the evolution of the yield curve. This paper uses principal component analysis to examine (1) the inherent dimension of historical yield curve changes indicated by the significance of the eigenvalues of the covariance matrix, (2) the practical dimension determined by a variance threshold, (3) the shape of the yield curve change associated with the first principal component, and (4) the persistence of this shape over time. We find that although the first two components explain 93% of the sample variation within a 90% confidence interval, the remaining components make statistically significant contributions to the covariance matrix. Consequently, we can establish a practical limit on the dimension only if we are willing to designate a threshold error variance. Further, our results on the persistence of the shape of the yield curve shift associated with the first component depend upon this threshold. If all components are included, the hypothesis that the shape persists between two sample time periods is rejected. On the other hand, if all but the first six components are eliminated, the hypothesis is not rejected.
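A minimal sketch of the eigen-decomposition underlying such an analysis, assuming a matrix of yield-curve changes with one column per maturity; the paper's significance tests for the higher-order components are not reproduced here.

```python
import numpy as np

def yield_curve_pca(dY, variance_threshold=0.95):
    """dY: (T, M) array of yield changes across M maturities.
    Returns the eigenvalues of the covariance matrix, the variance shares,
    the number of components needed to reach the threshold, and the
    loading pattern (shape) of the first principal component."""
    cov = np.cov(dY, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1]          # sort by variance explained
    eigval, eigvec = eigval[order], eigvec[:, order]
    share = eigval / eigval.sum()
    k = int(np.searchsorted(np.cumsum(share), variance_threshold) + 1)
    first_pc_shape = eigvec[:, 0]             # typically a near-parallel "level" shift
    return eigval, share, k, first_pc_shape
```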

6.
The purpose of this paper is to contrast four different multivariate methods: multiple regression using principal components, factor analysis, discriminant analysis involving another use of principal components, and canonical correlation. The method adopted to compare and contrast the techniques is to make careful note of the assumptions involved in each model and to apply each form of analysis to variables drawn from the same set of data.

7.
Forecast combination through dimension reduction techniques
This paper considers several methods of producing a single forecast from several individual ones. We compare “standard” but hard-to-beat combination schemes (such as the average of forecasts at each period, the consensus forecast, and OLS-based combination schemes) with more sophisticated alternatives that involve dimension reduction techniques. Specifically, we consider principal components, dynamic factor models, partial least squares and sliced inverse regression. Our source of forecasts is the Survey of Professional Forecasters, which provides forecasts for the main US macroeconomic aggregates. The forecasting results show that partial least squares, principal component regression and factor analysis have similar performance (better than the usual benchmark models), but sliced inverse regression shows extreme behavior (it performs either very well or very poorly).
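For concreteness, a hedged sketch of one of the dimension-reduction combinations, principal component regression on a panel of individual forecasts, is given below. The data handling is simplified, the names are invented, and the example numbers are randomly generated purely to show the shapes involved.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

def pcr_combination(F_train, y_train, F_new, n_components=2):
    """F_train: (T, K) past forecasts from K forecasters; y_train: (T,) realizations;
    F_new: (1, K) current forecasts. Combines by regressing realizations on the
    principal components of the individual forecasts."""
    model = make_pipeline(PCA(n_components=n_components), LinearRegression())
    model.fit(F_train, y_train)
    return model.predict(F_new)

# Shape illustration with synthetic data (10 noisy forecasters of one series).
rng = np.random.default_rng(0)
truth = rng.normal(2.0, 1.0, size=40)
panel = truth[:, None] + rng.normal(0.0, 0.5, size=(40, 10))
print(pcr_combination(panel[:-1], truth[:-1], panel[-1:]))
```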

8.
Changing time series properties of US inflation and economic activity, measured as marginal costs, are modeled within a set of extended New Keynesian Phillips curve (NKPC) models. It is shown that mechanical removal or modeling of simple low-frequency movements in the data may yield poor predictive results, which depend on the model specification used. Basic NKPC models are extended to include structural time series models that describe typical time-varying patterns in levels and volatilities. Forward- and backward-looking expectation components for inflation are incorporated and their relative importance is evaluated. Survey data on expected inflation are introduced to strengthen the information in the likelihood. Use is made of simulation-based Bayesian techniques for the empirical analysis. No credible evidence is found of endogeneity and long-run stability between inflation and marginal costs. Backward-looking inflation appears stronger than forward-looking inflation. Levels and volatilities of inflation are estimated more precisely using rich NKPC models. The extended NKPC structures compare favorably with existing basic Bayesian vector autoregressive and stochastic volatility models in terms of fit and prediction. Tails of the complete predictive distributions indicate an increase in the probability of deflation in recent years.

9.
We use a macro-finance model, incorporating macroeconomic and financial factors, to study the term premium in the US bond market. Estimating the model using Bayesian techniques, we find that a single factor explains most of the variation in bond risk premiums. Furthermore, the model-implied risk premiums account for up to 40% of the variability of one- and two-year excess returns. Using the model to decompose yield spreads into an expectations and a term premium component, we find that, although this decomposition does not seem important to forecast economic activity, it is crucial to forecast inflation for most forecasting horizons.

10.
Hou Qing. 价值工程 (Value Engineering), 2010, 29(2): 40-42
Using a threshold regression model, this paper analyses the relationship between inflation and economic growth in China. The empirical results show that the effect of inflation on economic growth in China is not of a single form but exhibits non-linear features. There exists an optimal inflation interval for the Chinese economy of 1.4%–5.1%; when inflation is above 5.1% or below 1.4%, inflation (through its effect on factors of production) has a negative impact on economic growth.
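A minimal grid-search sketch of the threshold-regression idea (growth on inflation with regime-dependent intercepts and slopes) follows. The single-threshold simplification and variable names are assumptions for illustration; the paper itself estimates an inflation interval, i.e. two thresholds.

```python
import numpy as np
import statsmodels.api as sm

def threshold_regression(inflation, growth, grid=None):
    """Search for the inflation threshold that minimizes the sum of squared
    residuals of a two-regime regression of growth on inflation."""
    if grid is None:
        # Trimmed grid of candidate thresholds between the 15th and 85th percentiles.
        grid = np.quantile(inflation, np.linspace(0.15, 0.85, 50))
    best = None
    for tau in grid:
        low = (inflation <= tau).astype(float)
        # Regime-specific intercepts and slopes.
        X = np.column_stack([low, 1 - low, inflation * low, inflation * (1 - low)])
        ssr = sm.OLS(growth, X).fit().ssr
        if best is None or ssr < best[1]:
            best = (tau, ssr)
    return best  # (estimated threshold, minimized SSR)
```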

11.
This paper estimates a sticky price macro model with US macro and term structure data using Bayesian methods. The model is solved by a nonlinear method. The posterior distribution of the parameters in the model is found to be bi-modal. The degree of nominal rigidity is high at one mode (“sticky price mode”) but is low at the other mode (“flexible price mode”). I find that the degree of nominal rigidity is important for identifying macro shocks that affect the yield curve. When prices are more flexible, a slowly varying inflation target of the central bank is the main driver of the overall level of the yield curve by changing long-run inflation expectations. In contrast, when prices are more sticky, a highly persistent markup shock is the main driver. The posterior probability of each mode is sensitive to the use of observed proxies for inflation expectations. Ignoring additional information from survey data on inflation expectations significantly reduces the posterior probability of the flexible price mode. Incorporating this additional information suggests that yield curve fluctuations can be better understood by focusing on the flexible price mode. Considering nonlinearities of the model solution also increases the posterior probability of the flexible price mode, although to a lesser degree than using survey data information.

12.
A number of recent studies in the economics literature have focused on the usefulness of factor models in the context of prediction using “big data” (see Bai and Ng, 2008; Dufour and Stevanovic, 2010; Forni, Hallin, Lippi, & Reichlin, 2000; Forni et al., 2005; Kim and Swanson, 2014a; Stock and Watson, 2002b, 2006, 2012, and the references cited therein). We add to this literature by analyzing whether “big data” are useful for modelling low-frequency macroeconomic variables, such as unemployment, inflation and GDP. In particular, we analyze the predictive benefits associated with the use of principal component analysis (PCA), independent component analysis (ICA), and sparse principal component analysis (SPCA). We also evaluate machine learning, variable selection and shrinkage methods, including bagging, boosting, ridge regression, least angle regression, the elastic net, and the non-negative garotte. Our approach is to carry out a forecasting “horse race” using prediction models that are constructed based on a variety of model specification approaches, factor estimation methods, and data windowing methods, in the context of predicting 11 macroeconomic variables that are relevant to monetary policy assessment. In many instances, we find that several of our benchmark models, including autoregressive (AR) models, AR models with exogenous variables, and (Bayesian) model averaging, do not dominate specifications based on factor-type dimension reduction combined with various machine learning, variable selection, and shrinkage methods (called “combination” models). We find that forecast combination methods are mean square forecast error (MSFE) “best” for only three of the 11 variables at a forecast horizon of h=1, and for four variables when h=3 or 12. In addition, non-PCA factor estimation methods yield MSFE-best predictions for nine of the 11 variables at h=1, although PCA dominates at longer horizons. Interestingly, we also find evidence of the usefulness of combination models for approximately half of our variables when h>1. Most importantly, we present strong new evidence of the usefulness of factor-based dimension reduction when utilizing “big data” for macroeconometric forecasting.
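A hedged sketch of one such "combination" specification, PCA-based dimension reduction followed by a cross-validated elastic-net forecasting regression with autoregressive lags, is shown below. It is a generic illustration of the model class, not the paper's exact setup; predictors are assumed standardized and the names are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import ElasticNetCV

def factor_elastic_net_forecast(X, y, h=1, n_factors=8, n_lags=4):
    """X: (T, N) standardized predictor panel; y: (T,) target series.
    Extracts principal-component factors, then lets a cross-validated
    elastic net weight the factors and AR lags for an h-step forecast."""
    T = len(y)
    factors = PCA(n_components=n_factors).fit_transform(X)
    rows = np.arange(n_lags - 1, T - h)
    # Regressors dated t: factors_t and y_t, ..., y_{t-n_lags+1}; target: y_{t+h}.
    Z = np.array([np.concatenate([factors[t], y[t - n_lags + 1:t + 1][::-1]])
                  for t in rows])
    model = ElasticNetCV(cv=5).fit(Z, y[rows + h])
    z_latest = np.concatenate([factors[T - 1], y[T - n_lags:][::-1]])
    return float(model.predict(z_latest.reshape(1, -1))[0])
```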

13.
This paper examines the theoretical and empirical properties of a supervised factor model based on combining forecasts using principal components (CFPC), in comparison with two other supervised factor models (partial least squares regression, PLS, and principal covariate regression, PCovR) and with unsupervised principal component regression, PCR. The supervision refers to training the predictors on the variable to be forecast. We compare the performance of the three supervised factor models and the unsupervised factor model in forecasting U.S. CPI inflation. The main finding is that the predictive ability of the supervised factor models is much better than that of the unsupervised factor model. The computation of the factors can be doubly supervised together with variable selection, which can further improve the forecasting performance of the supervised factor models. Among the three supervised factor models, CFPC performs best and is also the most stable. While PCovR also performs well and is stable, the performance of PLS is less stable over different out-of-sample forecasting periods. The effect of supervision becomes even larger as the forecast horizon increases. Supervision helps to reduce the number of factors and lags needed in modelling the economic structure, achieving greater parsimony.
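A rough sketch of the combining-forecasts-using-principal-components idea as described above: the principal components are taken over a matrix of individual forecasts of the target rather than over the raw predictors. The in-sample fitting and the names below are simplifying assumptions, not the authors' procedure.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA

def cfpc_regression(X, y, h=1, n_factors=2):
    """X: (T, N) predictors; y: (T,) target. Each predictor first produces its
    own forecast of y_{t+h} from a one-variable regression; the principal
    components of that forecast matrix then enter the final regression,
    which supervises the factors toward the forecast target."""
    T, N = X.shape
    F = np.empty((T - h, N))
    for j in range(N):
        Zj = sm.add_constant(X[:T - h, j])
        F[:, j] = sm.OLS(y[h:], Zj).fit().predict(Zj)  # individual forecasts of y_{t+h}
    pcs = PCA(n_components=n_factors).fit_transform(F)
    return sm.OLS(y[h:], sm.add_constant(pcs)).fit()   # combining regression
```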

14.
This paper employs a zero lower bound (ZLB) consistent shadow-rate model to decompose UK nominal yields into expectation and term premium components. Compared to a standard affine term structure model, it performs relatively better in a ZLB setting by capturing the stylized facts of the yield curve. The ZLB model is then exploited to estimate inflation expectations and risk premiums. This entails jointly pricing and decomposing nominal and real UK yields. We find evidence that medium- and long-term inflation expectations are contained within narrower bounds since the early 1990s, suggesting monetary policy credibility improved after the introduction of inflation targeting.

15.
This paper contributes to the nascent literature on nowcasting and forecasting GDP in emerging market economies using big data methods. This is done by analyzing the usefulness of various dimension-reduction, machine learning and shrinkage methods, including sparse principal component analysis (SPCA), the elastic net, the least absolute shrinkage and selection operator (lasso), and least angle regression, when constructing predictions using latent global macroeconomic and financial factors (diffusion indexes) in a dynamic factor model (DFM). We also utilize a judgmental dimension-reduction method called the Bloomberg Relevance Index (BRI), an index that assigns a measure of importance to each variable in a dataset depending on the variable’s usage by market participants. Our empirical analysis shows that, when specified using dimension-reduction methods (particularly BRI and SPCA), DFMs yield superior predictions relative to both benchmark linear econometric models and simple DFMs. Moreover, global financial and macroeconomic (business cycle) diffusion indexes constructed using targeted predictors are found to be important in four of the five emerging market economies that we study (Brazil, Mexico, South Africa, and Turkey). These findings point to the importance of spillover effects across emerging market economies, and underscore the significance of characterizing such linkages parsimoniously when utilizing high-dimensional global datasets.

16.
An article by Chan et al. (2013) published in the Journal of Business and Economic Statistics introduces a new model for trend inflation. They allow the trend inflation to evolve according to a bounded random walk. In order to draw the latent states from their respective conditional posteriors, they use accept–reject Metropolis–Hastings procedures. We reproduce their results using particle Markov chain Monte Carlo (PMCMC), which approaches drawing the latent states from a different technical point of view by combining Markov chain Monte Carlo and sequential Monte Carlo methods. To conclude: we are able to reproduce the results of Chan et al. (2013).

17.
We present the sparse estimation of one-sided dynamic principal components (ODPCs) to forecast high-dimensional time series. The forecast can be made directly with the ODPCs or by using them as estimates of the factors in a generalized dynamic factor model. It is shown that a large reduction in the number of parameters estimated for the ODPCs can be achieved without affecting their forecasting performance.

18.
By using a dynamic factor model, we can substantially improve the reliability of real-time output gap estimates for the U.S. economy. First, we use a factor model to extract a series for the common component in GDP from a large panel of monthly real-time macroeconomic variables. This series is immune to revisions to the extent that revisions are due to unbiased measurement errors or idiosyncratic news. Second, our model is able to handle the unbalanced arrival of the data. This yields favorable nowcasting properties and thus starting conditions for the filtering of data into a trend and deviations from a trend. Combined with the method of augmenting data with forecasts prior to filtering, this greatly reduces the end-of-sample imprecision in the gap estimate. The increased precision has economic importance for real-time policy decisions and improves real-time inflation forecasts.
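As an illustration of the "augment with forecasts before filtering" device mentioned above, the sketch below pads a series with AR forecasts before applying a Hodrick-Prescott filter and then discards the padded tail. The HP filter and the parameter choices are stand-ins for exposition, not the paper's trend-gap decomposition.

```python
import numpy as np
import statsmodels.api as sm

def padded_hp_gap(y, lamb=1600, n_pad=8, ar_order=4):
    """Reduce end-of-sample imprecision of a two-sided filter by appending
    AR(p) forecasts to the series before filtering, then keeping only the
    trend/gap estimates for the observed sample."""
    ar = sm.tsa.AutoReg(y, lags=ar_order, trend="c").fit()
    pad = ar.predict(start=len(y), end=len(y) + n_pad - 1)   # out-of-sample AR forecasts
    cycle, trend = sm.tsa.filters.hpfilter(np.concatenate([y, pad]), lamb=lamb)
    return cycle[:len(y)], trend[:len(y)]
```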

19.
This paper examines whether the explanatory power of exchange rate models can be improved by allowing for cross-country asymmetries and non-linear effects of fundamentals. Both appear to be crucial. The samples include the USD versus the pound and the yen from 1982:10 to 2013:10, and automated model selection is conducted with indicator saturation. Several non-linear effects are significant at the 1% level. Further, many of the indicators present in the linear models are eliminated once non-linearities are allowed for, suggesting that some of the structural breaks found in previous work were an artifact of the misspecified linear functional form. These conclusions are robust to estimation using principal components.

20.
US monetary policy is investigated using a regime-switching no-arbitrage term structure model that relies on inflation, output, and the short interest rate as factors. The model is complemented with a set of assumptions that allow the dynamics of the private sector to be separated from monetary policy. The monetary policy regimes cannot be estimated if the yield curve is ignored during estimation. Counterfactual analysis evaluates the importance of regimes in policy and shocks for the Great Moderation. The low-volatility regime of exogenous shocks plays an important role. Monetary policy contributes by trading off asymmetric responses of output and inflation under different regimes.

