Similar literature
20 similar records found.
1.
This paper proposes a new method for estimating a structural model of labour supply in which hours of work depend on (log) wages and the wage rate is considered endogenous. The main innovation with respect to other related estimation procedures is that a nonparametric additive structure in the hours of work equation is permitted. Though the focus of the paper is on this particular application, a three-step methodology for estimating models in the presence of the above econometric problems is described. In the first step the reduced form parameters of the participation equation are estimated by a maximum likelihood procedure adapted for estimation of an additive nonparametric function. In the second step the structural parameters of the wage equation are estimated after obtaining the selection-corrected conditional mean function. Finally, in the third step the structural parameters of the labour supply equation are estimated using local maximum likelihood estimation techniques. The paper concludes with an application to illustrate the feasibility, performance and possible gain of using this method. Copyright © 2005 John Wiley & Sons, Ltd.
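A minimal sketch of the three-step idea is given below, with the paper's nonparametric additive components replaced by simple parametric stand-ins: a probit participation equation, a selection-corrected (inverse Mills ratio) wage equation, and a kernel-weighted hours equation. The variable names and simulated data are purely illustrative assumptions, not the paper's specification.

```python
# Simplified three-step sketch: (1) participation, (2) selection-corrected wage,
# (3) hours estimated locally.  Probit and kernel regression stand in for the
# paper's additive nonparametric and local-likelihood steps.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 2000
educ = rng.normal(12, 2, n)                     # hypothetical covariates
kids = rng.integers(0, 3, n)
u = rng.normal(size=(n, 3))

log_wage = 0.5 + 0.08 * educ + u[:, 0]          # latent wage
participate = (1.0 + 0.1 * educ - 0.5 * kids + u[:, 1] > 0).astype(int)
hours = 20 + 8 * log_wage + u[:, 2]             # structural hours equation
hours[participate == 0] = np.nan                # hours observed only for workers

# Step 1: reduced-form participation equation (probit approximation).
Z = sm.add_constant(np.column_stack([educ, kids]))
probit = sm.Probit(participate, Z).fit(disp=0)
zb = Z @ probit.params
mills = norm.pdf(zb) / norm.cdf(zb)             # inverse Mills ratio

# Step 2: selection-corrected wage equation for participants.
w = participate == 1
Xw = sm.add_constant(np.column_stack([educ[w], mills[w]]))
wage_fit = sm.OLS(log_wage[w], Xw).fit()
log_wage_hat = wage_fit.fittedvalues

# Step 3: hours on predicted log wage, by kernel-weighted (local constant) fit.
def local_mean(x0, x, y, h=0.2):
    k = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(k * y) / np.sum(k)

grid = np.linspace(log_wage_hat.min(), log_wage_hat.max(), 5)
hours_curve = [local_mean(g, log_wage_hat, hours[w]) for g in grid]
print(np.round(hours_curve, 2))
```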

2.
We use frequency domain techniques to estimate a medium‐scale dynamic stochastic general equilibrium (DSGE) model on different frequency bands. We show that goodness of fit, forecasting performance and parameter estimates vary substantially with the frequency bands over which the model is estimated. Estimates obtained using subsets of frequencies are characterized by significantly different parameters, an indication that the model cannot match all frequencies with one set of parameters. In particular, we find that: (i) the low‐frequency properties of the data strongly affect parameter estimates obtained in the time domain; (ii) the importance of economic frictions in the model changes when different subsets of frequencies are used in estimation. This is particularly true for the investment adjustment cost and habit persistence: when low frequencies are present in the estimation, the investment adjustment cost and habit persistence are estimated to be higher than when low frequencies are absent. Copyright © 2014 John Wiley & Sons, Ltd.  相似文献   
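To convey the idea of estimating a model on a subset of frequencies, here is a hedged toy sketch using a Whittle-type objective for an AR(1), fitted on all, low, and high Fourier frequencies; the cutoff, the AR(1) model and the simulated data are illustrative assumptions and not the paper's DSGE application.

```python
# Frequency-band estimation in miniature: minimise a Whittle-type objective
# for an AR(1) over different subsets of Fourier frequencies.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
T, rho_true = 1000, 0.8
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho_true * y[t - 1] + rng.normal()

# Periodogram at Fourier frequencies (frequency zero excluded).
freqs = 2 * np.pi * np.arange(1, T // 2) / T
I = np.abs(np.fft.fft(y)[1:T // 2]) ** 2 / (2 * np.pi * T)

def whittle(rho, band):
    lam = freqs[band]
    f = 1.0 / (2 * np.pi * np.abs(1 - rho * np.exp(-1j * lam)) ** 2)  # AR(1) spectrum
    return np.sum(np.log(f) + I[band] / f)

low = freqs < np.pi / 8          # illustrative "low frequency" cutoff
for name, band in [("all", np.ones_like(low)), ("low", low), ("high", ~low)]:
    fit = minimize_scalar(lambda r: whittle(r, band), bounds=(-0.99, 0.99), method="bounded")
    print(name, round(fit.x, 3))
```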

3.
Forecasting earthquakes and earthquake risk (total citations: 1; self-citations: 0; citations by others: 1)
This paper reviews issues, models, and methodologies arising out of the problems of predicting earthquakes and forecasting earthquake risk. The emphasis is on statistical methods which attempt to quantify the probability of an earthquake occurring within specified time, space, and magnitude windows. One recurring theme is that such probabilities are best developed from models which specify a time-varying conditional intensity (conditional probability per unit time, area or volume, and magnitude interval) for every point in the region under study. The paper comprises three introductory sections and three substantive sections. The former outline the current state of earthquake prediction, earthquakes and their parameters, and the point process background. The latter cover the estimation of background risk, the estimation of time-varying risk, and some specific examples of models and prediction algorithms. The paper concludes with some brief comments on the links between forecasting earthquakes and other forecasting problems.
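As a worked illustration of how a conditional intensity translates into a forecast probability: if λ(t) is the intensity of events above a magnitude threshold in a fixed region, then (treating λ as known over the window) the probability of at least one event in [t0, t1] is 1 − exp(−∫ λ dt). The intensity function below is invented purely for illustration.

```python
# Toy conversion of a time-varying conditional intensity into a window probability.
import numpy as np
from scipy.integrate import quad

def lam(t):
    # background rate plus an Omori-style decaying term after an event at t = 0
    return 0.02 + 0.5 / (t + 1.0) ** 1.1      # events per day (made-up values)

for t0, t1 in [(0, 7), (30, 37), (365, 372)]:
    total, _ = quad(lam, t0, t1)
    p = 1 - np.exp(-total)
    print(f"P(at least one event in days {t0}-{t1}) = {p:.3f}")
```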

4.
We consider the problem of estimating a varying coefficient regression model when regressors include a time trend. We show that the commonly used local constant kernel estimation method leads to an inconsistent estimation result, while a local polynomial estimator yields a consistent estimation result. We establish the asymptotic normality result for the proposed estimator. We also provide asymptotic analysis of the data-driven (least squares cross validation) method of selecting the smoothing parameters. In addition, we consider a partially linear time trend model and establish the asymptotic distribution of our proposed estimator. Two test statistics are proposed to test the null hypotheses of a linear and of a partially linear time trend model. Simulations are reported to examine the finite-sample performance of the proposed estimators and the test statistics.
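A small sketch contrasting the local constant (Nadaraya-Watson) and local linear estimators of a coefficient that varies with a scaled time trend. The simulated design, bandwidth and Gaussian kernel are illustrative assumptions; the point is only that the local linear fit tracks the trend while the local constant fit is biased, especially near the boundaries.

```python
# Local constant vs local linear estimation of a time-varying slope.
import numpy as np

rng = np.random.default_rng(2)
n = 400
tau = np.arange(1, n + 1) / n                  # scaled time trend
x = rng.normal(size=n)
beta = 1 + 2 * tau                             # smoothly varying coefficient
y = beta * x + 0.5 * rng.normal(size=n)

def kern(u):
    return np.exp(-0.5 * u ** 2)

def local_constant(t0, h=0.1):
    w = kern((tau - t0) / h)
    return np.sum(w * x * y) / np.sum(w * x ** 2)          # weighted LS, slope only

def local_linear(t0, h=0.1):
    w = kern((tau - t0) / h)
    X = np.column_stack([x, x * (tau - t0)])               # slope and its local derivative
    WX = X * w[:, None]
    return np.linalg.solve(X.T @ WX, WX.T @ y)[0]

for t0 in [0.05, 0.5, 0.95]:
    print(t0, round(local_constant(t0), 2), round(local_linear(t0), 2), round(1 + 2 * t0, 2))
```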

5.
This paper is concerned with statistical inference on seemingly unrelated varying coefficient partially linear models. By combining the local polynomial and profile least squares techniques, and estimating the contemporaneous correlation, we propose a class of weighted profile least squares estimators (WPLSEs) for the parametric components. It is shown that the WPLSEs achieve the semiparametric efficiency bound and are asymptotically normal. For the non-parametric components, by applying the undersmoothing technique, and taking the contemporaneous correlation into account, we propose an efficient local polynomial estimation. The resulting estimators are shown to have mean-squared errors smaller than those of estimators that neglect the contemporaneous correlation. In addition, a class of variable selection procedures is developed for simultaneously selecting significant variables and estimating unknown parameters, based on the non-concave penalized and weighted profile least squares techniques. With a proper choice of regularization parameters and penalty functions, the proposed variable selection procedures perform as efficiently as if one knew the true submodels. The proposed methods are evaluated using extensive simulation studies and applied to a set of real data.

6.
In situations where a regression model is subject to one or more breaks, it is shown that it can be optimal to use pre-break data to estimate the parameters of the model used to compute out-of-sample forecasts. The issue of how best to exploit the trade-off that might exist between bias and forecast error variance is explored and illustrated for the multivariate regression model under the assumption of strictly exogenous regressors. In practice, when this assumption cannot be maintained and both the time and size of the breaks are unknown, the optimal choice of the observation window will be subject to further uncertainties that make exploiting the bias–variance trade-off difficult. To that end we propose a new set of cross-validation methods for selection of a single estimation window and weighting or pooling methods for combination of forecasts based on estimation windows of different lengths. Monte Carlo simulations are used to show when these procedures work well compared with methods that ignore the presence of breaks.
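A rough sketch of the single-window cross-validation idea: for each candidate window length, form pseudo-out-of-sample forecasts over a hold-out segment and keep the window with the smallest mean squared error. The AR(1)-with-a-break data, the hold-out segment and the grid of windows are illustrative assumptions and do not reproduce the paper's procedures.

```python
# Choosing an estimation window by cross-validation under a structural break.
import numpy as np

rng = np.random.default_rng(3)
T, T_break = 300, 200
y = np.zeros(T)
for t in range(1, T):
    phi = 0.3 if t < T_break else 0.8          # break in the AR coefficient
    y[t] = phi * y[t - 1] + rng.normal()

def forecast(train, x_last):
    phi_hat = np.dot(train[:-1], train[1:]) / np.dot(train[:-1], train[:-1])
    return phi_hat * x_last

eval_start = 260                                # hold-out segment used for CV
for w in [20, 40, 80, 160, 250]:
    errs = [y[t] - forecast(y[t - w:t], y[t - 1]) for t in range(eval_start, T)]
    print(w, round(np.mean(np.square(errs)), 3))
```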

7.
In this paper we introduce a nonparametric estimation method for a large Vector Autoregression (VAR) with time-varying parameters. The estimators and their asymptotic distributions are available in closed form. This makes the method computationally efficient and capable of handling information sets as large as those typically handled by factor models and Factor Augmented VARs. When applied to the problem of forecasting key macroeconomic variables, the method outperforms constant parameter benchmarks and compares well with large (parametric) Bayesian VARs with time-varying parameters. The tool can also be used for structural analysis. As an example, we study the time-varying effects of oil price shocks on sectoral U.S. industrial output. According to our results, the increased role of global demand in shaping oil price fluctuations largely explains the diminished recessionary effects of global energy price increases.
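The core mechanic can be sketched as kernel-weighted least squares: at each target date the VAR coefficients are estimated with weights that decline with distance in time. The bivariate VAR(1), bandwidth and simulated data below are illustrative assumptions; the paper's estimator, dimensions and asymptotics are considerably more general.

```python
# Kernel-weighted (nonparametric) estimation of a small time-varying-parameter VAR(1).
import numpy as np

rng = np.random.default_rng(4)
T, k = 400, 2
Y = np.zeros((T, k))
for t in range(1, T):
    a = 0.2 + 0.5 * t / T                      # slowly drifting own-lag coefficient
    A = np.array([[a, 0.1], [0.0, 0.5]])
    Y[t] = A @ Y[t - 1] + rng.normal(size=k)

def tvp_var_at(t0, h=0.1):
    times = np.arange(1, T) / T
    w = np.exp(-0.5 * ((times - t0 / T) / h) ** 2)
    X, Z = Y[:-1], Y[1:]
    WX = X * w[:, None]
    return np.linalg.solve(X.T @ WX, WX.T @ Z).T            # k x k coefficient matrix at t0

for t0 in [100, 200, 300]:
    print(t0, np.round(tvp_var_at(t0)[0, 0], 2), round(0.2 + 0.5 * t0 / T, 2))
```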

8.
This paper reviews research issues in modeling panels of time series. Examples of this type of data are annually observed macroeconomic indicators for all countries in the world, daily returns on the individual stocks listed in the S&P500, and the sales records of all items in a retail store. A panel of time series concerns the case where the cross-sectional dimension and the time dimension are large. Often, there is no a priori reason to select a few series or to aggregate the series over the cross-sectional dimension. The use of, for example, a vector autoregression or other types of multivariate models then becomes cumbersome. Panel models and associated estimation techniques are more useful. Due to the large time dimension, however, one should incorporate the time-series features. And the models should not have too many parameters, to facilitate interpretation. This paper discusses representation, estimation and inference of relevant models and discusses recently proposed modeling approaches that explicitly aim to meet these requirements. The paper concludes with some reflections on the usefulness of large data sets. These concern sample selection issues and the notion that more detail also requires more complex models.

9.
This paper provides a systematic review of the weak-form market efficiency literature that examines return predictability from past price changes, with an exclusive focus on the stock markets. Our survey shows that the bulk of the empirical studies examine whether the stock market under study is or is not weak-form efficient in the absolute sense, assuming that the level of market efficiency remains unchanged throughout the estimation period. However, the possibility of time-varying weak-form market efficiency has received increasing attention in recent years. We categorize these emerging studies based on the research framework adopted, namely non-overlapping sub-period analysis, time-varying parameter model and rolling estimation window. An encouraging development is that the documented empirical evidence of evolving stock return predictability can be rationalized within the framework of the adaptive markets hypothesis.

10.
We develop a mathematical theory needed for moment estimation of the parameters in a general shifting level process (SLP) treating, in particular, the finite state space case geometric finite normal (GFN) SLP. For the SLP, we give expressions for the moment estimators together with asymptotic (co)variances, following, completing, and correcting Cline (Journal of Applied Probability 20, 1983, 322–337); formulae are then made more explicit for the GFN-SLP. To illustrate the potential uses, we then apply the moment estimation method to a GFN-SLP model of array comparative genomic hybridization data. We obtain encouraging results in the sense that a segmentation based on the estimated parameters turns out to be faster than with other currently available methods, while being comparable in terms of sensitivity and specificity.

11.
Two alternative robust estimation methods often employed by National Statistical Institutes in business surveys are two-sided M-estimation and one-sided Winsorisation, which can be regarded as an approximate implementation of one-sided M-estimation. We review these methods and evaluate their performance in a simulation of a repeated rotating business survey based on data from the Retail Sales Inquiry conducted by the UK Office for National Statistics. One-sided and two-sided M-estimation are found to have very similar performance, with a slight edge for the former for positive variables. Both methods considerably improve both level and movement estimators. Approaches for setting tuning parameters are evaluated for both methods, and this is a more important issue than the difference between the two approaches. M-estimation works best when tuning parameters are estimated using historical data but is serviceable even when only live data is available. Confidence interval coverage is much improved by the use of a bootstrap percentile confidence interval.
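A minimal sketch of one-sided Winsorisation for a weighted survey total: sampled values above a cut-off K are replaced by K plus a shrunken excess before weighting up, so only unusually large values are modified. The cut-off, design weights and skewed values below are arbitrary illustrations, not the ONS tuning rules discussed in the paper.

```python
# One-sided Winsorisation of a weighted survey total (toy data).
import numpy as np

rng = np.random.default_rng(5)
y = np.exp(rng.normal(4, 1, 200))              # skewed "retail sales" values
y[:3] *= 20                                    # a few extreme outliers
w = np.full_like(y, 50.0)                      # design weights

def winsorised_total(y, w, K):
    y_star = np.where(y > K, K + (y - K) / w, y)   # shrink only the excess over K
    return np.sum(w * y_star)

print("unadjusted total :", round(np.sum(w * y)))
print("winsorised total :", round(winsorised_total(y, w, K=np.quantile(y, 0.99))))
```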

12.
Many industrial and engineering applications are built on the basis of differential equations. In some cases, parameters of these equations are not known and are estimated from measurements, leading to an inverse problem. Unlike many other papers, we suggest constructing new designs in an adaptive fashion 'on the go' using the A-optimality criterion. This approach is demonstrated on determination of optimal locations of measurements and temperature sensors in several engineering applications: (1) determination of the optimal location to measure the height of a hanging wire in order to estimate the sagging parameter with minimum variance (toy example), (2) adaptive determination of optimal locations of temperature sensors in a one-dimensional inverse heat transfer problem and (3) adaptive design in the framework of a one-dimensional diffusion problem when the solution is found numerically using the finite difference approach. In all these problems, statistical criteria for parameter identification and optimal design of experiments are applied. Statistical simulations confirm that estimates derived from the adaptive optimal design converge to the true parameter values with minimum sum of variances when the number of measurements increases. We deliberately chose technically uncomplicated industrial problems to transparently introduce the principal ideas of statistical adaptive design.
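A toy version of the adaptive A-optimal idea: for a simple response model, each new measurement location is chosen from a candidate grid to minimise the trace of the inverse information matrix (the sum of parameter variances) given the points already taken. The quadratic model and candidate grid are assumptions for illustration; the engineering applications in the paper apply the same criterion with ODE/PDE-based sensitivities.

```python
# Greedy A-optimal selection of measurement locations for a quadratic model.
import numpy as np

def design_row(x):
    return np.array([1.0, x, x ** 2])          # sensitivities of a quadratic model

candidates = np.linspace(0, 1, 51)
chosen = [0.0, 1.0]                            # two starting measurements

def a_criterion(points):
    X = np.array([design_row(x) for x in points])
    # small ridge keeps the inverse finite while the design is still singular
    return np.trace(np.linalg.inv(X.T @ X + 1e-9 * np.eye(3)))

for _ in range(4):                             # pick four more points adaptively
    scores = [a_criterion(chosen + [c]) for c in candidates]
    chosen.append(candidates[int(np.argmin(scores))])

print(np.round(chosen, 2))
```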

13.
Both the theoretical and empirical literature on the estimation of allocative and technical inefficiency has grown enormously. To minimize aggregation bias, ideally one should estimate firm and input-specific parameters describing allocative inefficiency. However, identifying these parameters has often proven difficult. For a panel of Chilean hydroelectric power plants, we obtain a full set of such parameters using Gibbs sampling, which draws sequentially from conditional generalized method of moments (GMM) estimates obtained via instrumental variables estimation. We find an economically significant range of firm-specific efficiency estimates with differing degrees of precision. The standard GMM approach estimates virtually no allocative inefficiency for industry-wide parameters. Copyright © 2009 John Wiley & Sons, Ltd.

14.
Barry M. Rubin 《Socio》1985,19(6):387-398
Research into wage determination and inflation at the level of the urban labor market has generally followed a Phillips curve adaptive expectations framework. This paper explores the accuracy of such specifications when national and intermarket linkages are ignored, and extends such specifications to incorporate these linkages. The present research also addresses the impact of serial correlation problems and time series aggregation bias on the ability to identify the local wage determination and inflation mechanism. The estimation results for both annual and quarterly specifications indicate that there is virtually no support for a Phillips curve adaptive expectations hypothesis when external linkages are included in the equations. It is demonstrated that specification errors and serial correlation problems are probably responsible for many of the contradictory and inconclusive results obtained in previous studies.

15.
Nonparametric estimation and inference of conditional distribution functions with longitudinal data have important applications in biomedical studies. We propose in this paper an estimation approach based on time-varying parametric models. Our model assumes that the conditional distribution of the outcome variable at each given time point can be approximated by a parametric model, but the parameters are smooth functions of time. Our estimation is based on a two-step smoothing method, in which we first obtain the raw estimators of the conditional distribution functions at a set of disjoint time points, and then compute the final estimators at any time by smoothing the raw estimators. Asymptotic properties, including the asymptotic biases, variances and mean squared errors, are derived for the local polynomial smoothed estimators. Applicability of our two-step estimation method is demonstrated through a large epidemiological study of childhood growth and blood pressure. Finite sample properties of our procedures are investigated through a simulation study.
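A hedged sketch of the two-step idea: fit a parametric model separately at each time point (the "raw" estimates), then smooth those raw estimates over time with a local-linear kernel fit. The logistic model, simulated data and bandwidth are illustrative stand-ins for the longitudinal conditional-distribution setting of the paper.

```python
# Two-step estimation: raw time-point-by-time-point fits, then kernel smoothing.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
time_points = np.linspace(0, 1, 20)
raw = []
for t in time_points:
    x = rng.normal(size=300)
    p = 1 / (1 + np.exp(-(0.5 + 2 * t) * x))   # slope increases smoothly with time
    yb = rng.binomial(1, p)
    fit = sm.Logit(yb, sm.add_constant(x)).fit(disp=0)
    raw.append(fit.params[1])                  # step 1: raw slope estimate at time t
raw = np.array(raw)

def smooth(t0, h=0.15):                        # step 2: local-linear smoothing of raw estimates
    w = np.exp(-0.5 * ((time_points - t0) / h) ** 2)
    X = np.column_stack([np.ones_like(time_points), time_points - t0])
    WX = X * w[:, None]
    return np.linalg.solve(X.T @ WX, WX.T @ raw)[0]

for t0 in [0.1, 0.5, 0.9]:
    print(t0, round(smooth(t0), 2), round(0.5 + 2 * t0, 2))
```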

16.
Macroeconomic forecasting in China is essential for the government to take proper policy decisions on government expenditure and money supply, among other matters. The existing literature on forecasting China's macroeconomic variables is unclear on the crucial issue of how to choose an optimal window to estimate parameters with rolling out-of-sample forecasts. This study fills this gap in forecasting economic growth and inflation in China by using rolling weighted least squares (WLS) with the practically feasible cross-validation (CV) procedure of Hong et al. (2018) to choose an optimal estimation window. We undertake an empirical analysis of monthly data on up to 30 candidate indicators (mainly asset prices) for a span of 17 years (2000–2017). It is documented that the forecasting performance of rolling estimation is sensitive to the selection of rolling windows. The empirical analysis shows that the rolling WLS with the CV-based rolling window outperforms other rolling methods on univariate regressions in most cases. One possible explanation for this is that these macroeconomic variables often suffer from structural changes due to institutional reforms, policies, crises, and other factors. Furthermore, we find that, in most cases, asset prices are key variables for forecasting macroeconomic variables, especially the output growth rate.

17.
This paper considers the effects on multi-step prediction of using semiparametric local Whittle estimators rather than MLE for long memory ARFIMA models. We consider various representations of the minimum MSE predictor with known parameters. We then conduct a detailed simulation study for when the true parameters are replaced with estimates. The predictor based on MLE is found to be superior, in the MSE sense, to the predictor based on the two-step local Whittle estimation. The “optimal” bandwidth local Whittle estimator produces worse predictions than the local Whittle using an agnostic bandwidth of the square root of the sample size.
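For context, below is a compact sketch of the local Whittle estimator of the memory parameter d, using m = sqrt(T) Fourier frequencies in the spirit of the "agnostic" bandwidth mentioned above. The truncated MA(∞) simulation of fractional noise is a crude stand-in for an ARFIMA series and is an assumption for illustration only.

```python
# Local Whittle estimation of the long-memory parameter d.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
T, d_true = 2048, 0.3
# crude fractionally integrated noise via a truncated MA(inf) expansion
psi = np.cumprod(np.r_[1.0, (d_true + np.arange(500)) / (np.arange(500) + 1)])
eps = rng.normal(size=T + 500)
y = np.array([psi @ eps[t:t + 501][::-1] for t in range(T)])

m = int(np.sqrt(T))                            # agnostic bandwidth
lam = 2 * np.pi * np.arange(1, m + 1) / T
I = np.abs(np.fft.fft(y)[1:m + 1]) ** 2 / (2 * np.pi * T)

def R(d):                                      # local Whittle objective
    g = np.mean(lam ** (2 * d) * I)
    return np.log(g) - 2 * d * np.mean(np.log(lam))

fit = minimize_scalar(R, bounds=(-0.49, 0.99), method="bounded")
print(round(fit.x, 3))
```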

18.
Conventional employment functions with partial adjustment to output fitted to quarterly data tend to have positively autocorrelated residuals, to imply implausibly high returns to scale and almost always fail tests for parameter stability. The hypothesis of this paper is that mis-specified expectations are the main cause of these findings, and rational and adaptive expectations models are compared. Further, employment is conditioned not on output but on variables which firms can more reasonably take as exogenous. 'Disequilibrium' features of labour markets are introduced by making adjustment costs depend upon current and expected labour market tightness. One of the implications of rational expectations is that the revision between points in time t and t-1 in the expected value of any variable should be independent of any information available before t and serially uncorrelated. Given a model of a forward looking firm whose hiring decisions are subject to quadratic adjustment costs, an appropriately transformed employment equation can be derived which has a very similar structure to the Koyck transformed employment equation which corresponds to adaptive expectations. Maximum likelihood estimation of the adaptive expectations form gives parameter estimates for quarterly British data for the manufacturing sector which are so unreasonable that this hypothesis can be rejected. Maximum likelihood estimation of the rational expectations form would involve modelling the stochastic processes of all the driving variables. However, conditional upon one parameter, consistent estimates of the remaining parameters can be obtained by OLS and these accord well with economic theory. This is the direct evidence in favour of the rational expectations hypothesis. However, it can also explain why the adaptive expectations form gives such poor results and why conventional employment functions give the unsatisfactory results referred to above. Further, rational expectations provides an explanation for the common finding, particularly in the context of employment and the demand for durable goods, of implausibly low or wrong-signed levels effects in more general quarterly time series models with lagged dependent variables.

19.
This paper considers the estimation of frontier production functions in panel data models. It proposes a multi-stage method to obtain estimates of (1) the parameters of a flexible input requirement function and (2) technical inefficiency decomposed into time-invariant (firm-specific), time-varying, and residual components. The proposed method is used to analyse labour-use efficiency of Swedish local social insurance offices on the basis of a large panel of observations during the time period 1974–84. Empirical results show: (1) substantial variations in labour-use efficiency among these offices, with the mean efficiencies declining over time; (2) presence of economies of scale, implying that most of the offices were of suboptimal size.
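As a simplified sketch of recovering time-invariant technical inefficiency from a panel frontier, the snippet below estimates firm fixed effects and measures each firm's inefficiency relative to the best firm (a Schmidt-Sickles-style device, not the multi-stage decomposition actually used in the paper). The production data are simulated assumptions.

```python
# Time-invariant inefficiency from panel fixed effects (Schmidt-Sickles-style sketch).
import numpy as np

rng = np.random.default_rng(8)
n_firms, n_years = 30, 11
u_i = np.abs(rng.normal(0, 0.3, n_firms))      # time-invariant inefficiency
x = rng.normal(5, 1, (n_firms, n_years))       # log input
y = 1.0 + 0.7 * x - u_i[:, None] + 0.1 * rng.normal(size=(n_firms, n_years))

# within (fixed effects) estimation of the slope
xd = x - x.mean(axis=1, keepdims=True)
yd = y - y.mean(axis=1, keepdims=True)
beta = np.sum(xd * yd) / np.sum(xd ** 2)

alpha_i = y.mean(axis=1) - beta * x.mean(axis=1)   # firm effects
ineff = alpha_i.max() - alpha_i                    # distance to the frontier firm
print("corr(true, estimated inefficiency):", round(np.corrcoef(u_i, ineff)[0, 1], 2))
```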

20.
This paper proposes a common and tractable framework for analyzing fixed and random effects models, in particular constant-slope variable-intercept designs. It is shown that, regardless of whether effects (i) are treated as parameters or as an error term, (ii) are estimated in different stages of a hierarchical model, or whether (iii) correlation between effects and regressors is allowed, when the same prior information on idiosyncratic parameters is introduced into all estimation methods, the resulting common slope estimator is also the same across methods. These results are illustrated using the Grünfeld investment data with different prior distributions. Random effects estimates are shown to be more efficient than fixed effects estimates. This efficiency gain, however, comes at the cost of neglecting information obtained in the computation of the prior unknown variance of idiosyncratic parameters.
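A minimal numerical comparison of the fixed-effects (within) and random-effects (quasi-demeaned GLS) slope estimators for a constant-slope, variable-intercept panel, under simulated data rather than the Grünfeld series; the variance-component construction is a rough Swamy-Arora-style approximation and the intercept is suppressed for brevity, so the numbers are only illustrative of the efficiency comparison discussed above.

```python
# Fixed effects (within) vs random effects (quasi-demeaned) slope on a toy panel.
import numpy as np

rng = np.random.default_rng(9)
N, T, beta = 10, 20, 1.5
alpha = rng.normal(0, 2, N)                     # firm-specific intercepts
x = rng.normal(size=(N, T))
y = alpha[:, None] + beta * x + rng.normal(size=(N, T))

# fixed effects: within transformation
xd = x - x.mean(axis=1, keepdims=True)
yd = y - y.mean(axis=1, keepdims=True)
b_fe = np.sum(xd * yd) / np.sum(xd ** 2)

# random effects: quasi-demeaning with theta built from crude variance components
s2_e = np.sum((yd - b_fe * xd) ** 2) / (N * (T - 1) - 1)
b_between = (np.cov(x.mean(axis=1), y.mean(axis=1))[0, 1]
             / np.var(x.mean(axis=1), ddof=1))
resid_b = y.mean(axis=1) - y.mean() - b_between * (x.mean(axis=1) - x.mean())
s2_a = max(np.sum(resid_b ** 2) / (N - 2) - s2_e / T, 0.0)
theta = 1 - np.sqrt(s2_e / (s2_e + T * s2_a))
xq = x - theta * x.mean(axis=1, keepdims=True)
yq = y - theta * y.mean(axis=1, keepdims=True)
b_re = np.sum(xq * yq) / np.sum(xq ** 2)

print("within (FE) slope      :", round(b_fe, 3))
print("quasi-demeaned RE slope:", round(b_re, 3))
```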
