Similar Articles
20 similar articles found
1.
This paper proposes a common and tractable framework for analyzing fixed and random effects models, in particular constant-slope variable-intercept designs. It is shown that, regardless of whether effects (i) are treated as parameters or as an error term, (ii) are estimated in different stages of a hierarchical model, or (iii) are allowed to be correlated with the regressors, when the same prior information on idiosyncratic parameters is introduced into all estimation methods, the resulting common slope estimator is the same across methods. These results are illustrated using the Grünfeld investment data with different prior distributions. Random effects estimates are shown to be more efficient than fixed effects estimates. This efficiency gain, however, comes at the cost of neglecting information obtained in the computation of the prior unknown variance of the idiosyncratic parameters.
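A minimal sketch of the two designs compared above, using the Grünfeld data as shipped with statsmodels (not the paper's prior-based estimator): fixed effects as firm-dummy parameters versus random effects as an error component.

```python
# Sketch: constant-slope, variable-intercept designs on the Grunfeld data.
# Fixed effects = firm dummies; random effects = random intercepts (MixedLM).
# This illustrates the designs the paper unifies, not its prior-based estimator.
import statsmodels.formula.api as smf
from statsmodels.datasets import grunfeld

data = grunfeld.load_pandas().data

# Fixed effects: idiosyncratic intercepts treated as parameters (firm dummies).
fe = smf.ols("invest ~ value + capital + C(firm)", data=data).fit()

# Random effects: intercepts treated as an error component (random intercepts).
re = smf.mixedlm("invest ~ value + capital", data=data, groups=data["firm"]).fit()

print("FE slopes:", fe.params[["value", "capital"]].round(3).to_dict())
print("RE slopes:", re.params[["value", "capital"]].round(3).to_dict())
```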

2.
In most democracies, at least two out of any three individuals vote for the same party in sequential elections. This paper presents a model in which vote persistence is partly due to the dependence of current utility on the previous voting decision; this dependence is termed 'habit formation'. The model and its implications are supported by individual-level panel data on the US presidential elections of 1972 and 1976. For example, the voting probability is found to be a function of the lagged choice variable, even when the endogeneity of the lagged variable is accounted for, and the tendency to vote for different parties in sequential elections decreases with the age of the voter. Furthermore, the effect of habit is estimated structurally, allowing for unobserved differences among respondents. The structural habit parameter implies that the effect of previous votes on the current decision is quite strong. The habit model fits the data better than the traditional 'party identification' model.
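A sketch of the reduced-form habit effect on simulated data with hypothetical variable names: a logit of the current vote on the lagged choice plus a persistent taste control. It ignores the endogeneity correction and the structural estimation the paper performs.

```python
# Sketch of reduced-form 'habit': P(vote_t = 1) as a function of the lagged
# choice. Simulated data with hypothetical names; the taste control proxies
# the unobserved heterogeneity the paper handles structurally.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
ideology = rng.normal(size=n)                     # persistent taste for one party
vote_72 = (ideology + rng.logistic(size=n) > 0).astype(float)
# True habit effect: the previous vote shifts this period's utility by 1.0.
vote_76 = (0.8 * ideology + 1.0 * vote_72 + rng.logistic(size=n) > 0).astype(float)

X = sm.add_constant(np.column_stack([vote_72, ideology]))
fit = sm.Logit(vote_76, X).fit(disp=0)
print(fit.params)   # coefficient on vote_72 picks up the state dependence
```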

3.
The paper asks how state-of-the-art DSGE models that account for the conditional response of hours worked to a positive neutral technology shock compare in a marginal likelihood race. To that end we construct and estimate several competing small-scale DSGE models that extend the standard real business cycle model. In particular, we identify from the literature six different hypotheses that generate the empirically observed decline in hours worked after a positive technology shock. These models alternatively exhibit (i) sticky prices; (ii) firm entry and exit with time to build; (iii) habit in consumption and costly adjustment of investment; (iv) persistence in the permanent technology shocks; (v) labor market frictions with procyclical hiring costs; and (vi) a Leontief production function with labor-saving technology shocks. In terms of model posterior probabilities, impulse responses, and autocorrelations, the favored model is the one with habit formation in consumption and investment adjustment costs. A robustness test shows that the sticky price model becomes as competitive as the habit formation model with costly investment adjustment when sticky wages are included.
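The "marginal likelihood race" amounts to comparing posterior model probabilities. A sketch with made-up log marginal likelihood values (not the paper's estimates) and equal prior model probabilities:

```python
# Posterior model probabilities from log marginal likelihoods (equal priors).
# The log-ML values below are made up for illustration only.
import numpy as np

log_ml = {"sticky prices": -912.4,
          "habit + inv. adj. costs": -907.1,
          "firm entry/exit": -915.0}
vals = np.array(list(log_ml.values()))
probs = np.exp(vals - vals.max())      # subtract the max for numerical stability
probs /= probs.sum()
for model, p in zip(log_ml, probs):
    print(f"{model:25s} {p:.4f}")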

4.
This paper studies the ability of a general class of habit-based asset pricing models to match the conditional moment restrictions implied by asset pricing theory. We treat the functional form of the habit as unknown and estimate it along with the rest of the model's finite-dimensional parameters. Using quarterly data on consumption growth, asset returns and instruments, our empirical results indicate that the estimated habit function is nonlinear, that habit formation is better described as internal rather than external, and that the estimated time-preference and power utility parameters are sensible. In addition, the estimated habit function generates a positive stochastic discount factor (SDF) proxy and performs well in explaining cross-sectional stock return data. We find that an internal habit SDF proxy can explain a cross-section of size and book-to-market sorted portfolio equity returns better than (i) the Fama and French (1993) three-factor model, (ii) the Lettau and Ludvigson (2001b) scaled consumption CAPM, (iii) an external habit SDF proxy, (iv) the classic CAPM, and (v) the classic consumption CAPM.
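A sketch of estimating such conditional moment restrictions by GMM in the no-habit special case (power utility), on simulated data; the Euler equation moments use a constant and lagged consumption growth as instruments. All numbers are illustrative.

```python
# GMM on the Euler equation E[(delta * g_{t+1}^(-gamma) * R_{t+1} - 1) z_t] = 0,
# the power-utility special case of the habit models above, identity weighting.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T, delta0, gamma0, rho = 2000, 0.98, 3.0, 0.8
lg = np.zeros(T)
for t in range(1, T):                               # persistent log consumption growth
    lg[t] = 0.005 * (1 - rho) + rho * lg[t - 1] + rng.normal(0, 0.05)
g = np.exp(lg)
R = g**gamma0 / delta0 * np.exp(rng.normal(0, 0.02, T))  # returns obeying the Euler equation

def gmm_obj(theta):
    delta, gamma = theta
    e = delta * g[1:]**(-gamma) * R[1:] - 1.0       # pricing errors for periods t+1
    z = np.column_stack([np.ones(T - 1), g[:-1]])   # instruments known at time t
    gbar = z.T @ e / (T - 1)                        # sample moment conditions
    return gbar @ gbar                              # identity-weighted GMM objective

fit = minimize(gmm_obj, x0=[0.95, 2.0], method="Nelder-Mead")
print("delta, gamma estimates:", fit.x.round(3))    # should land near (0.98, 3.0)
```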

5.
This paper examines how and to what extent parameter estimates can be biased in a dynamic stochastic general equilibrium (DSGE) model that omits the zero lower bound (ZLB) constraint on the nominal interest rate. Our Monte Carlo experiments using a standard sticky-price DSGE model show that, when ZLB episodes are rare and short, no significant bias is detected in parameter estimates and the estimated impulse response functions are quite similar to the true ones. However, as the frequency of being at the ZLB or the duration of ZLB spells increases, the parameter bias becomes larger and leads to substantial differences between the estimated and true impulse responses. It is also demonstrated that the model missing the ZLB produces biased estimates of the structural shocks even when the parameters themselves are virtually unbiased.

6.
This paper discusses estimation of US inflation volatility using time-varying parameter models, in particular whether volatility should be modelled as a stationary or a random walk stochastic process. Specifying inflation volatility as an unbounded process, as implied by the random walk, conflicts with prior beliefs, yet a stationary process cannot capture the low-frequency behaviour commonly observed in estimates of volatility. We therefore propose an alternative model with a change-point process in the volatility that allows for switches between stationary models, capturing changes in the level and dynamics of volatility over the past 40 years. To accommodate the stationarity restriction, we develop a new representation that is equivalent to our model but is computationally more efficient. All models produce effectively identical estimates of volatility, but the change-point model provides more information on the level and persistence of volatility and on the probabilities of changes. For example, we find a few well-defined switches in the volatility process and, interestingly, these switches line up well with economic slowdowns or changes of the Federal Reserve Chair. Moreover, a decomposition of inflation shocks into permanent and transitory components shows that a spike in volatility in the late 2000s was entirely on the transitory side, characterized by a rise above the long-run mean level during a period of higher persistence.
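A minimal simulation contrasting the two laws of motion discussed above; the parameter values are illustrative only.

```python
# Contrast the two candidate processes for log inflation volatility:
# a random walk (unbounded in the long run) vs. a stationary AR(1).
import numpy as np

rng = np.random.default_rng(2)
T, sigma_eta = 160, 0.15          # roughly 40 years of quarterly data
rw = np.cumsum(rng.normal(0, sigma_eta, T))     # random-walk log volatility
ar = np.zeros(T)
for t in range(1, T):                            # stationary AR(1), mean zero
    ar[t] = 0.95 * ar[t - 1] + rng.normal(0, sigma_eta)

print("RW  log-vol range:", rw.min().round(2), rw.max().round(2))
print("AR1 log-vol range:", ar.min().round(2), ar.max().round(2))
# The change-point model keeps each regime stationary like `ar`, but allows
# discrete switches in the level and persistence across regimes.
```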

7.
This paper evaluates the effects of high-frequency uncertainty shocks on a set of low-frequency macroeconomic variables representative of the US economy. Rather than estimating models at a common low frequency, we use recently developed econometric models that can handle data sampled at different frequencies. We find that credit and labor market variables react the most to uncertainty shocks, exhibiting a prolonged negative response. When looking at detailed investment subcategories, our estimates suggest that the most irreversible investment projects are the most affected by uncertainty shocks. We also find that the responses of macroeconomic variables to uncertainty shocks are relatively similar across single-frequency and mixed-frequency data models, suggesting that temporal aggregation bias is not acute in this context.
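One common device for combining sampling frequencies in such models is a MIDAS weighting polynomial; the paper's exact specification may differ, but the sketch below shows the standard exponential Almon scheme.

```python
# A standard mixed-frequency device: MIDAS weighting, which collapses many
# high-frequency lags into a single low-frequency regressor.
import numpy as np

def exp_almon_weights(theta1, theta2, n_lags):
    """Normalized exponential Almon lag weights."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

# Aggregate 60 daily uncertainty readings into one quarterly regressor
# (the daily series and weight parameters here are made up).
daily_uncertainty = np.random.default_rng(3).gamma(2.0, 1.0, 60)
w = exp_almon_weights(0.05, -0.01, 60)
quarterly_regressor = w @ daily_uncertainty
print(round(quarterly_regressor, 3))
```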

8.
This paper studies the empirical performance of stochastic volatility models on twenty years of weekly exchange rate data for four major currencies. We concentrate on the effects of the distribution of the exchange rate innovations on both parameter estimates and estimates of the latent volatility series. The density of the log of squared exchange rate innovations is modelled as a flexible mixture of normals. We use three different estimation techniques: quasi-maximum likelihood, simulated EM, and a Bayesian procedure. The estimated models are applied to pricing currency options. The major findings are that: (1) explicitly incorporating fat-tailed innovations increases the estimated persistence of the volatility dynamics; (2) the estimation error of the volatility time series is very large; (3) this in turn makes the standard errors on calculated option prices so large that these prices are rarely significantly different from those of a constant-volatility model.
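A sketch of the quasi-maximum likelihood route: linearize the SV model as log y_t² = h_t + ξ_t and approximate ξ_t = log ε_t² (for Gaussian ε_t) by a normal with mean -1.2704 and variance π²/2, so a Kalman filter delivers the likelihood. The paper refines exactly this approximation with a flexible mixture of normals; parameter values below are illustrative.

```python
# QML for a basic SV model h_t = mu + phi*(h_{t-1} - mu) + eta_t via the
# linearization log(y_t^2) = h_t + xi_t, xi_t approx N(-1.2704, pi^2/2).
import numpy as np
from scipy.optimize import minimize

def sv_qml_negloglik(params, ly2):
    mu, phi, s2 = params[0], np.tanh(params[1]), np.exp(params[2])
    a, p = mu, s2 / (1 - phi**2)              # stationary initial state
    ll = 0.0
    for y in ly2:
        f = p + np.pi**2 / 2                  # prediction-error variance
        v = y + 1.2704 - a                    # prediction error (E[log eps^2] = -1.2704)
        ll -= 0.5 * (np.log(2 * np.pi * f) + v**2 / f)
        k = p / f                             # Kalman gain
        a, p = a + k * v, p * (1 - k)         # filtering update
        a, p = mu + phi * (a - mu), phi**2 * p + s2   # one-step state prediction
    return -ll

rng = np.random.default_rng(4)
T, mu0, phi0, se0 = 1500, -1.0, 0.95, 0.2
h = np.empty(T); h[0] = mu0
for t in range(1, T):
    h[t] = mu0 + phi0 * (h[t - 1] - mu0) + rng.normal(0, se0)
y = np.exp(h / 2) * rng.normal(size=T)
fit = minimize(sv_qml_negloglik, x0=[-0.5, 1.5, -3.0],
               args=(np.log(y**2),), method="Nelder-Mead")
print("mu, phi, sigma_eta:",
      fit.x[0].round(2), np.tanh(fit.x[1]).round(2), np.exp(fit.x[2] / 2).round(2))
```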

9.
In practice, inventory decisions depend heavily on demand forecasts, but the literature typically assumes that demand distributions are known; estimates are then substituted directly for the unknown parameters, leading to insufficient safety stocks, stock-outs, low service, and high costs. We propose a framework for addressing this estimation uncertainty that is applicable to any inventory model, demand distribution, and parameter estimator. The estimation errors are modeled and a predictive lead time demand distribution is obtained, which is then substituted into the inventory model. We illustrate this framework for several different demand models. When the estimates are based on ten observations, the relative savings are typically between 10% and 30% for mean-stationary demand. However, the savings are larger when the estimates are based on fewer observations, when backorders are costlier, or when the lead time is longer. In the presence of a trend, the savings are between 50% and 80% for several scenarios.
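A concrete special case of the framework, with hypothetical numbers: normal demand estimated from ten observations, where the predictive lead-time demand distribution is a scaled Student-t that inflates the plug-in variance for estimation error.

```python
# Plug-in vs. predictive base stock for normal demand with mean and sd
# estimated from n = 10 observations and lead time L. The predictive approach
# uses Var(sum of L future demands - L*mu_hat) = sigma^2 * (L + L^2/n) and a
# t_{n-1} quantile, so the base stock (safety stock) is higher.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, L, service = 10, 4, 0.95            # observations, lead time, service level
sample = rng.normal(100, 20, n)        # demand history (illustrative)
mu, s = sample.mean(), sample.std(ddof=1)

plug_in = L * mu + np.sqrt(L) * s * stats.norm.ppf(service)
predictive = L * mu + np.sqrt(L + L**2 / n) * s * stats.t.ppf(service, df=n - 1)
print(f"plug-in base stock:    {plug_in:.1f}")
print(f"predictive base stock: {predictive:.1f}")
```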

10.
The objective of this paper is to investigate the cyclical behaviour of mark-ups, using a panel of Spanish manufacturing firms over the period 1990–1998. Margins are estimated from the optimality conditions of the firm's optimisation problem, which assumes that labour inputs are subject to adjustment costs. A number of results emerge from the estimations. First, we find positive and asymmetric adjustment costs for permanent labour inputs. Second, price-cost margins are markedly procyclical. Our estimates suggest that labour adjustment costs more than double the variability of average margins relative to Lerner indexes. Third, we find differences in the parameters of the adjustment technology across industries, which make the mark-ups of intermediate and production good industries more cyclical than those of consumer good industries. Finally, industry-specific price-cost margins are higher in more concentrated industries.

11.
This paper derives a method for estimating and testing the Linear Quadratic Adjustment Cost (LQAC) model when the target variable and some of the forcing variables follow I(2) processes. Based on a forward-looking error-correction formulation of the model, it is shown how to obtain strongly consistent estimates of the structural parameters from both a linear and a non-linear cointegrating regression in which first differences of the I(2) variables are included as regressors (multicointegration). Further, based on the estimated parameter values, it is shown how to test and evaluate the LQAC model using a VAR approach, and a simple, easily interpretable metric for measuring the model fit is suggested. In an empirical application using UK money demand data, the non-linear multicointegrating regression delivers an economically plausible estimate of the adjustment cost parameter. However, the restrictions implied by the exact LQAC model under rational expectations are strongly rejected, and the metric for model fit indicates a substantial noise component in the model.
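A sketch of the linear multicointegrating regression described above, on simulated data: the I(2) variable and its first difference both enter the cointegrating relation as regressors.

```python
# Multicointegrating regression: y and x are I(2), and the first difference
# of x enters the relation alongside its level. All coefficients are made up.
import numpy as np

rng = np.random.default_rng(11)
T = 500
x = np.cumsum(np.cumsum(rng.normal(size=T)))        # I(2) forcing variable
dx = np.diff(x, prepend=0.0)                        # its first difference, I(1)
y = 2.0 + 0.5 * x - 1.5 * dx + rng.normal(size=T)   # multicointegrating relation
X = np.column_stack([np.ones(T), x, dx])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print("const, x, dx:", beta.round(2))               # close to (2.0, 0.5, -1.5)
```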

12.
13.
We propose a new dynamic copula model in which the parameter characterizing dependence follows an autoregressive process. As this model class includes the Gaussian copula with a stochastic correlation process, it can be viewed as a generalization of multivariate stochastic volatility models. Despite the complexity of the model, the decoupling of the marginals from the dependence parameters facilitates estimation: the parameters of the marginal distributions are estimated in a first step, and those of the copula in a second. Parameters of the latent processes (volatilities and dependence) are estimated using efficient importance sampling. We discuss goodness-of-fit tests and ways to forecast the dependence parameter. For two bivariate stock index series, we show that the proposed model outperforms standard competing models.
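A sketch of the model's core mechanism on simulated data: a latent AR(1) process drives the copula parameter, here a Gaussian-copula correlation mapped through tanh (the paper's exact link function and the efficient importance sampling estimation are not reproduced).

```python
# A latent AR(1) process lambda_t drives the dependence parameter of a
# Gaussian copula, rho_t = tanh(lambda_t), which keeps rho_t in (-1, 1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
T, c, a, s = 1000, 0.02, 0.97, 0.08    # illustrative AR(1) parameters
lam = np.zeros(T)
for t in range(1, T):
    lam[t] = c + a * lam[t - 1] + rng.normal(0, s)   # latent dependence process
rho = np.tanh(lam)

u = np.empty((T, 2))                                  # copula data on [0,1]^2
for t in range(T):
    z = rng.multivariate_normal([0, 0], [[1, rho[t]], [rho[t], 1]])
    u[t] = stats.norm.cdf(z)
print("rho range:", rho.min().round(2), rho.max().round(2))
```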

14.
It is argued that univariate long memory estimates based on ex post data tend to underestimate the persistence of ex ante variables (and, hence, that of the ex post variables themselves) because of the presence of unanticipated shocks whose short-run volatility masks the degree of long-range dependence in the data. Empirical estimates of long-range dependence in the Fisher equation are shown to manifest this problem, leading to an apparent imbalance in the memory characteristics of the variables in the Fisher equation. Evidence in support of this typical underestimation is provided by results obtained with inflation forecast survey data and by direct calculation of the finite sample biases. To address the problem of bias, the paper introduces a bivariate exact Whittle (BEW) estimator that explicitly allows for the presence of short memory noise in the data. The new procedure enhances the empirical capacity to separate low-frequency behaviour from high-frequency fluctuations, and it produces estimates of long-range dependence that are much less biased when the data are contaminated by noise. Empirical estimates from the BEW method suggest that the three Fisher variables are integrated of the same order, with memory parameter in the range (0.75, 1). Since the integration orders are balanced, the ex ante real rate has the same degree of persistence as expected inflation, furnishing evidence against the existence of a (fractional) cointegrating relation among the Fisher variables and, correspondingly, showing little support for a long-run form of the Fisher hypothesis.
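For reference, a standard univariate benchmark that suffers the bias described above is the log-periodogram (GPH) regression; the sketch below shows how additive short-memory noise drags the estimate of d toward zero, the bias the BEW estimator is designed to remove.

```python
# GPH log-periodogram regression: d is (minus) the slope of log I(lambda_j)
# on log(4 sin^2(lambda_j / 2)) over the first m Fourier frequencies.
import numpy as np

def gph(x, m):
    T = len(x)
    I = np.abs(np.fft.fft(x - x.mean()))**2 / (2 * np.pi * T)   # periodogram
    lam = 2 * np.pi * np.arange(1, m + 1) / T
    X = np.log(4 * np.sin(lam / 2)**2)
    Y = np.log(I[1:m + 1])
    return -np.polyfit(X, Y, 1)[0]

rng = np.random.default_rng(7)
x = np.cumsum(rng.normal(size=4000))            # pure I(1) series: d = 1
print("d_hat, clean:", gph(x, m=60).round(2))   # roughly 1
print("d_hat, noisy:",                          # short-memory noise: biased down
      gph(x + rng.normal(0, 10, 4000), m=60).round(2))
```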

15.
This paper examines price adjustment behaviour in the magazine industry. In a frequently cited study, Cecchetti (1986) constructs a reduced-form (S, s) model for firms. Cecchetti assumes that a firm's pricing rules are fixed for non-overlapping three-year intervals and estimates the model using a conditional logit specification from Chamberlain (1980). The estimates are inconsistent, however, due to the duration-dependent specification of the model. Two alternative specifications are used to obtain consistent estimates. The consistent estimates continue to provide strong evidence in favour of state-dependent pricing models, but only weak evidence on the behaviour of price adjustment costs.

16.
This study examined the performance of two alternative estimation approaches in structural equation modeling for ordinal data under different levels of model misspecification, score skewness, sample size, and model size. Both approaches involve analyzing a polychoric correlation matrix and adjusting standard error estimates and the model chi-square statistic, but one estimates model parameters with maximum likelihood and the other with robust weighted least squares. Relative bias in parameter estimates and standard error estimates, Type I error rate, and empirical power of the model test, where appropriate, were evaluated through Monte Carlo simulations. These alternative approaches generally provided unbiased parameter estimates when the model was correctly specified. They also provided unbiased standard error estimates and adequate Type I error control in general, unless sample size was small and the measured variables were moderately skewed. Differences between the methods in convergence problems and on the evaluation criteria, especially under small sample and skewed variable conditions, are discussed.
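Both approaches start from a polychoric correlation matrix. A sketch of the two-step computation for a single pair of ordinal variables: thresholds from the marginal proportions, then ρ by maximizing the bivariate-normal likelihood of the contingency table.

```python
# Polychoric correlation: (1) thresholds from marginal cumulative proportions,
# (2) rho by maximizing the bivariate-normal contingency-table likelihood.
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

def polychoric(x, y, n_cat=4):
    tab = np.zeros((n_cat, n_cat))
    for i, j in zip(x, y):
        tab[i, j] += 1
    # Step 1: thresholds from marginal proportions; +/-10 stands in for +/-inf.
    tx = stats.norm.ppf(np.cumsum(tab.sum(1)) / tab.sum())[:-1]
    ty = stats.norm.ppf(np.cumsum(tab.sum(0)) / tab.sum())[:-1]
    ax = np.concatenate([[-10.0], tx, [10.0]])
    ay = np.concatenate([[-10.0], ty, [10.0]])

    def negll(rho):
        bvn = stats.multivariate_normal([0, 0], [[1, rho], [rho, 1]])
        F = lambda a, b: bvn.cdf([a, b])       # bivariate normal CDF
        ll = 0.0
        for i in range(n_cat):
            for j in range(n_cat):             # rectangle probability of cell (i, j)
                p = (F(ax[i + 1], ay[j + 1]) - F(ax[i], ay[j + 1])
                     - F(ax[i + 1], ay[j]) + F(ax[i], ay[j]))
                ll += tab[i, j] * np.log(max(p, 1e-12))
        return -ll

    return minimize_scalar(negll, bounds=(-0.99, 0.99), method="bounded").x

# Ordinal data cut from a latent bivariate normal with rho = 0.6.
rng = np.random.default_rng(8)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=2000)
cuts = [-0.8, 0.0, 0.8]
x, y = np.digitize(z[:, 0], cuts), np.digitize(z[:, 1], cuts)
print("polychoric rho:", round(polychoric(x, y), 3))   # close to 0.6
```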

17.
Pooling of data is often carried out to protect privacy or to save cost, with the claimed advantage that it does not lead to much loss of efficiency. We argue that this does not give the complete picture, as the estimation of different parameters is affected to different degrees by pooling. We establish a ladder of efficiency loss, in powers of the pool size, for estimating the mean, variance, skewness and kurtosis, and more generally multivariate joint cumulants. The asymptotic efficiency of the pooled-data non-parametric/parametric maximum likelihood estimator relative to the corresponding unpooled-data estimator is reduced by a factor equal to the pool size whenever the order of the cumulant to be estimated increases by one. The implications of this result are demonstrated in case-control genetic association studies with interactions between genes. Our findings provide a guideline for the discriminating use of data pooling in practice and for assessing its relative efficiency. As exact maximum likelihood estimates are difficult to obtain when the pool size is large, we briefly address how to obtain computationally efficient estimates from pooled data, suggesting Gaussian estimation and non-parametric maximum likelihood as two feasible methods.
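A Monte Carlo sketch of the efficiency ladder: with pools of size k, the mean is estimated with no efficiency loss, while the sampling variance of the variance estimator grows by roughly a factor of k.

```python
# Efficiency ladder under pooling: observe only sums over pools of size k.
# Mean estimation loses nothing; variance estimation loses a factor ~k.
import numpy as np

rng = np.random.default_rng(9)
k, n_pools, reps = 5, 200, 20000
N = k * n_pools
mu_hats = np.empty((reps, 2)); var_hats = np.empty((reps, 2))
for r in range(reps):
    x = rng.normal(0, 1, N)
    pools = x.reshape(n_pools, k).sum(axis=1)       # only pooled sums observed
    mu_hats[r] = [x.mean(), pools.mean() / k]
    var_hats[r] = [x.var(ddof=1), pools.var(ddof=1) / k]

print("mean: pooled/unpooled variance ratio",
      (mu_hats[:, 1].var() / mu_hats[:, 0].var()).round(2))    # ~1
print("var:  pooled/unpooled variance ratio",
      (var_hats[:, 1].var() / var_hats[:, 0].var()).round(2))  # ~k
```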

18.
We employ a neoclassical business-cycle model to study two sources of business-cycle fluctuations: marginal efficiency of investment shocks, and total factor productivity shocks. The parameters of the model are estimated using a Bayesian procedure that accommodates prior uncertainty about their magnitudes; from these estimates, posterior distributions of the two shocks are obtained. The postwar US experience suggests that both shocks are important in understanding fluctuations, but that total factor productivity shocks are primarily responsible for beginning and ending recessions.

19.
We develop new tests of the capital asset pricing model (CAPM) that take account of, and are valid under, the assumption that the distribution generating returns is elliptically symmetric; this assumption is necessary and sufficient for the validity of the CAPM. Our test is based on semiparametric efficient estimation procedures for a seemingly unrelated regression model in which the multivariate error density is elliptically symmetric but otherwise unrestricted. The elliptical symmetry assumption allows us to avoid the curse of dimensionality that typically arises in multivariate semiparametric estimation, because the multivariate elliptically symmetric density can be written as a function of a scalar transformation of the observed multivariate data. The elliptically symmetric family includes a number of thick-tailed distributions, and so is potentially relevant in financial applications. Our estimated betas are lower than the OLS estimates, and our parameter estimates are much less consistent with the CAPM restrictions than the corresponding OLS estimates.
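For contrast with the paper's semiparametric procedure, a sketch of the Gaussian baseline it generalizes: time-series CAPM regressions where the restriction is a zero intercept for every asset, run here on simulated fat-tailed returns.

```python
# Baseline CAPM time-series regressions r_i = alpha_i + beta_i * r_m + e_i;
# the CAPM restriction is alpha_i = 0 for every asset. Simulated returns.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
T = 600
r_m = rng.normal(0.005, 0.04, T)                  # market excess return
X = sm.add_constant(r_m)
for beta, alpha in [(0.8, 0.0), (1.2, 0.003)]:    # second asset violates the CAPM
    r_i = alpha + beta * r_m + rng.standard_t(5, T) * 0.02   # fat-tailed errors
    fit = sm.OLS(r_i, X).fit()
    print(f"alpha_hat={fit.params[0]:+.4f}  t={fit.tvalues[0]:.2f}  "
          f"beta_hat={fit.params[1]:.2f}")
```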

20.
Cost control in railway projects is exercised through the budget estimate: the reviewed and filed budget estimate issued by the design institute is compiled on the basis of the preliminary design, whereas the investment verification serves only as an assessment of the design rather than as a control benchmark for project cost. As a result, because the depth of design varies, the estimated cost may fail to reflect the final construction conditions. To avoid penalties when the investment verification threatens to exceed the budget estimate, design institutes resort to artificially lowering cost standards or omitting the costs of items that will in fact be incurred, so as to keep the verified amount within the budget. This makes the management of budget adjustments and claims in railway projects, and the pursuit of maximum returns through project operation, all the more challenging.
