Similar Documents
20 similar documents found.
1.
Evidence of lengthy half-lives for real exchange rates in the presence of a high degree of exchange rate volatility is considered one of the most puzzling empirical regularities in international macroeconomics. This paper argues that the half-life measure used in the literature may be problematic and proposes alternative measures with desirable properties. Their focus on the cumulative effects of shocks distinguishes them from the measures used in the literature. An empirical analysis of bilateral US dollar real exchange rates employing the alternative half-life measures produces results consistent with theory and indicates that the PPP puzzle is less pronounced than initially thought. Copyright © 2012 John Wiley & Sons, Ltd.
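To make the half-life notion concrete, here is a minimal sketch (the editor's illustration, not the paper's proposed measure; function names are hypothetical) of the conventional AR(1) half-life, ln(0.5)/ln(ρ), alongside an impulse-response-based variant that reports the first horizon at which the response to a unit shock falls to one half:

```python
import numpy as np

def ar1_half_life(rho):
    # Conventional half-life of an AR(1) with persistence rho:
    # the horizon h solving rho**h = 0.5.
    return np.log(0.5) / np.log(rho)

def irf_half_life(ar_coefs, max_h=10000):
    # Half-life read off the impulse response of an AR(p):
    # first horizon at which the response to a unit shock is <= 0.5.
    p = len(ar_coefs)
    irf = [1.0]
    for h in range(1, max_h):
        recent = irf[max(0, h - p):][::-1]   # irf[h-1], ..., irf[h-p]
        irf.append(sum(a * v for a, v in zip(ar_coefs, recent)))
        if irf[-1] <= 0.5:
            return h
    return float("inf")
```

For ρ = 0.95 the two agree closely (about 13.5 versus 14 periods); for richer dynamics the impulse-response reading can differ substantially from the AR(1) formula, which is the kind of discrepancy that motivates cumulative-effect measures.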

2.
In a seminal paper, Mak (Journal of the Royal Statistical Society B, 55, 1993, 945) derived an efficient algorithm for solving non-linear unbiased estimation equations. In this paper, we show that when Mak's algorithm is applied to biased estimation equations, it yields the estimates that would come from solving a bias-corrected estimation equation, making it a consistent estimator if regularity conditions hold. In addition, the properties that Mak established for his algorithm also apply in the case of biased estimation equations, but for estimates from the bias-corrected equations. The marginal likelihood estimator is obtained when the approach is applied to both maximum likelihood and least squares estimation of the covariance matrix parameters in the general linear regression model. The approach yields two new estimators when applied to the profile and marginal likelihood functions for estimating the lagged dependent variable coefficient in the dynamic linear regression model. Monte Carlo simulation results show that the new approach leads to a better estimator when applied to the standard profile likelihood. It is therefore recommended for situations in which standard estimators are known to be biased.

3.
Understanding the effects of operational conditions and practices on productive efficiency can provide valuable economic and managerial insights. The conventional approach is a two-stage method in which the efficiency estimates are regressed on contextual variables representing the operational conditions. The main problem of the two-stage approach is that it ignores the correlations between the inputs and the contextual variables. To address this shortcoming, we build on the recently developed regression interpretation of data envelopment analysis (DEA) to develop a new one-stage semi-nonparametric estimator that combines the nonparametric DEA-style frontier with a regression model of the contextual variables. The new method is referred to as stochastic semi-nonparametric envelopment of z variables data (StoNEZD). The StoNEZD estimator for the contextual variables is shown to be statistically consistent under less restrictive assumptions than those required by the two-stage DEA estimator. Further, the StoNEZD estimator is shown to be unbiased, asymptotically efficient, asymptotically normally distributed, and to converge at the standard parametric rate of order n^(-1/2). Therefore, conventional methods of statistical testing and confidence intervals apply for asymptotic inference. The finite sample performance of the proposed estimator is examined through Monte Carlo simulations.

4.
《Statistica Neerlandica》2018,72(2):126-156
In this paper, we study the application of Le Cam's one-step method to parameter estimation in ordinary differential equation models. This computationally simple technique can serve as an alternative to numerical evaluation of the popular non-linear least squares estimator, which typically requires a multistep iterative algorithm and repeated numerical integration of the ordinary differential equation system. The one-step method starts from a preliminary √n-consistent estimator of the parameter of interest and then turns it into an asymptotic (as the sample size n → ∞) equivalent of the least squares estimator through a numerically straightforward procedure. We demonstrate the performance of the one-step estimator via extensive simulations and real data examples. The method enables the researcher to obtain both point and interval estimates. The preliminary √n-consistent estimator that we use depends on non-parametric smoothing, and we provide a data-driven methodology for choosing its tuning parameter, supported by theory. An easy implementation scheme of the one-step method for practical use is pointed out.
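The general one-step idea can be illustrated in a much simpler i.i.d. setting than the paper's ODE models (this toy Cauchy-location example and its function name are the editor's, not the paper's): start from a √n-consistent preliminary estimate and take a single Newton step on the log-likelihood.

```python
import numpy as np

def one_step_cauchy_location(x):
    # One-step sketch for the Cauchy location model: the preliminary
    # sqrt(n)-consistent estimator is the sample median; one Newton step
    # uses the score and the Fisher information (1/2 per observation).
    t0 = np.median(x)
    z = x - t0
    score = np.sum(2.0 * z / (1.0 + z ** 2))
    info = len(x) / 2.0
    return t0 + score / info
```

The single update inherits the asymptotic efficiency of the MLE while avoiding full iterative maximization, which is the computational point the paper exploits for ODE systems.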

5.
It is argued that univariate long memory estimates based on ex post data tend to underestimate the persistence of ex ante variables (and, hence, that of the ex post variables themselves) because of the presence of unanticipated shocks whose short-run volatility masks the degree of long-range dependence in the data. Empirical estimates of long-range dependence in the Fisher equation are shown to manifest this problem and lead to an apparent imbalance in the memory characteristics of the variables in the Fisher equation. Evidence in support of this typical underestimation is provided by results obtained with inflation forecast survey data and by direct calculation of the finite sample biases. To address the problem of bias, the paper introduces a bivariate exact Whittle (BEW) estimator that explicitly allows for the presence of short memory noise in the data. The new procedure enhances the empirical capacity to separate low-frequency behaviour from high-frequency fluctuations, and it produces estimates of long-range dependence that are much less biased when the data are contaminated by noise. Empirical estimates from the BEW method suggest that the three Fisher variables are integrated of the same order, with memory parameter in the range (0.75, 1). Since the integration orders are balanced, the ex ante real rate has the same degree of persistence as expected inflation, thereby furnishing evidence against the existence of a (fractional) cointegrating relation among the Fisher variables and, correspondingly, showing little support for a long-run form of the Fisher hypothesis. Copyright © 2004 John Wiley & Sons, Ltd.

6.
Consider a linear regression model and suppose that our aim is to find a confidence interval for a specified linear combination of the regression parameters. In practice, it is common to perform a Durbin–Watson pretest of the null hypothesis of zero first‐order autocorrelation of the random errors against the alternative hypothesis of positive first‐order autocorrelation. If this null hypothesis is accepted then the confidence interval centered on the ordinary least squares estimator is used; otherwise the confidence interval centered on the feasible generalized least squares estimator is used. For any given design matrix and parameter of interest, we compare the confidence interval resulting from this two‐stage procedure and the confidence interval that is always centered on the feasible generalized least squares estimator, as follows. First, we compare the coverage probability functions of these confidence intervals. Second, we compute the scaled expected length of the confidence interval resulting from the two‐stage procedure, where the scaling is with respect to the expected length of the confidence interval centered on the feasible generalized least squares estimator, with the same minimum coverage probability. These comparisons are used to choose the better confidence interval, prior to any examination of the observed response vector.
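The two-stage procedure can be sketched as follows. This is a simplified illustration: the Durbin-Watson cutoff, the Cochrane-Orcutt-style feasible GLS step, and the normal critical value are placeholder choices by the editor, not the paper's exact setup.

```python
import numpy as np

def ols(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def durbin_watson(e):
    # DW statistic: near 2 under no autocorrelation, below 2 under positive AR(1).
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

def two_stage_interval(X, y, c, dw_cutoff=1.5, z=1.96):
    # Pretest interval for c'beta: accept "no positive autocorrelation" when
    # DW exceeds the (placeholder) cutoff and use the OLS interval; otherwise
    # fall back to a Cochrane-Orcutt-style feasible GLS interval.
    n, k = X.shape
    b = ols(X, y)
    e = y - X @ b
    if durbin_watson(e) > dw_cutoff:
        s2 = e @ e / (n - k)
        V = s2 * np.linalg.inv(X.T @ X)
    else:
        rho = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])   # estimated AR(1) coefficient
        Xs = X[1:] - rho * X[:-1]                    # quasi-differenced data
        ys = y[1:] - rho * y[:-1]
        b = ols(Xs, ys)
        u = ys - Xs @ b
        s2 = u @ u / (len(ys) - k)
        V = s2 * np.linalg.inv(Xs.T @ Xs)
    est = c @ b
    se = np.sqrt(c @ V @ c)
    return est - z * se, est + z * se
```

The paper's point is that the coverage and expected length of the interval produced by this data-dependent switch can be evaluated before any data are observed, for a given design matrix.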

7.
This paper is an empirical study of the uncertainty associated with technical efficiency estimates from stochastic frontier models. We show how to construct confidence intervals for estimates of technical efficiency levels under different sets of assumptions ranging from the very strong to the relatively weak. We demonstrate empirically how the degree of uncertainty associated with these estimates relates to the strength of the assumptions made and to various features of the data.

8.
Standard jackknife confidence intervals for a quantile Q_y(β) are usually preferred to confidence intervals based on analytical variance estimators because of their operational simplicity. However, standard jackknife confidence intervals can give undesirable coverage probabilities for small sample sizes and for large or small values of β. In this paper, confidence intervals for a population quantile based on several existing quantile estimators are derived. These intervals are based on an approximation to the cumulative distribution function of a studentized quantile estimator. The confidence intervals are empirically evaluated using real data, and some applications are illustrated. Results from simulation studies show that the proposed confidence intervals are narrower than confidence intervals based on the standard jackknife technique, which relies on a normal approximation. The proposed confidence intervals also achieve coverage probabilities at or above their nominal level. The study indicates that the proposed method can be an alternative to asymptotic confidence intervals, which can be unavailable in practice, and to standard jackknife confidence intervals, which can have poor coverage probabilities and give wider intervals.
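For reference, the benchmark the proposed intervals are compared against, a delete-1 jackknife interval with a normal approximation, can be sketched as follows (illustrative implementation by the editor):

```python
import numpy as np

def jackknife_quantile_ci(y, beta, z=1.96):
    # Delete-1 jackknife variance of the sample beta-quantile,
    # turned into a normal-approximation confidence interval.
    n = len(y)
    q_hat = np.quantile(y, beta)
    q_jack = np.array([np.quantile(np.delete(y, i), beta) for i in range(n)])
    var_jack = (n - 1) / n * np.sum((q_jack - q_jack.mean()) ** 2)
    se = np.sqrt(var_jack)
    return q_hat - z * se, q_hat + z * se
```

The known weakness of this benchmark, which motivates the paper, is that the normal approximation can be poor for small n and for β near 0 or 1.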

9.
In this paper, using the pivotal quantity method, new shortest-length confidence intervals and uniformly minimum variance unbiased (UMVU) estimators are constructed for families of distributions involving truncation parameters, where two independent random samples are available. In the one-sample case, we also give, for some uniform distributions, confidence intervals that are the shortest among all known confidence intervals.
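A classical one-sample example of the pivotal quantity method: for X_1, ..., X_n i.i.d. U(0, θ), the pivot W = max(X_i)/θ has cdf w^n on (0, 1), and among intervals of the form [max(x)/b, max(x)/a] the shortest takes b = 1 and a = α^(1/n). This sketch is the editor's illustration of the method, not taken from the paper:

```python
def shortest_ci_uniform_theta(x, alpha=0.05):
    # Shortest pivotal interval for theta in U(0, theta): the pivot
    # W = max(x)/theta has cdf w**n, and interval length is minimized
    # by [max(x), max(x) / alpha**(1/n)].
    n = len(x)
    m = max(x)
    return m, m / alpha ** (1.0 / n)
```

The lower endpoint equals the sample maximum because θ can never be smaller than any observation, which is what makes the one-sided-looking interval the shortest exact one.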

10.
We study the generalized bootstrap technique under general sampling designs. We focus mainly on bootstrap variance estimation but we also investigate the empirical properties of bootstrap confidence intervals obtained using the percentile method. Generalized bootstrap consists of randomly generating bootstrap weights so that the first two (or more) design moments of the sampling error are tracked by the corresponding bootstrap moments. Most bootstrap methods in the literature can be viewed as special cases. We discuss issues such as the choice of the distribution used to generate bootstrap weights, the choice of the number of bootstrap replicates, and the potential occurrence of negative bootstrap weights. We first describe the generalized bootstrap for the linear Horvitz‐Thompson estimator and then consider non‐linear estimators such as those defined through estimating equations. We also develop two ways of bootstrapping the generalized regression estimator of a population total. We study in greater depth the case of Poisson sampling, which is often used to select samples in price index surveys conducted by national statistical agencies around the world. For Poisson sampling, we consider a pseudo‐population approach and show that the resulting bootstrap weights capture the first three design moments of the sampling error. A simulation study and an example with real survey data are used to illustrate the theory.
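The weight-generation idea can be sketched for the Horvitz-Thompson total under Poisson sampling: choosing i.i.d. weights with E[w_i] = 1 and Var(w_i) = 1 − π_i makes the bootstrap variance track the usual design-variance estimator. Normal weights are used here purely for illustration (such weights can be negative, one of the issues the paper discusses); the function name is the editor's.

```python
import numpy as np

def generalized_bootstrap_var(y, pi, B=2000, rng=None):
    # Generalized bootstrap variance of the Horvitz-Thompson total under
    # Poisson sampling. Each replicate perturbs the design weights with
    # random w_i having mean 1 and variance 1 - pi_i, so the first two
    # design moments of the sampling error are tracked.
    rng = np.random.default_rng(rng)
    d = y / pi                       # Horvitz-Thompson contributions
    sd = np.sqrt(1.0 - pi)
    t_star = np.empty(B)
    for b in range(B):
        w = 1.0 + sd * rng.standard_normal(len(y))
        t_star[b] = np.sum(w * d)
    return np.var(t_star, ddof=1)
```

With equal inclusion probabilities π_i = 0.5 and y_i = 1, the bootstrap variance converges (in B) to Σ (1 − π_i) y_i²/π_i², the standard Poisson-sampling variance estimator.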

11.
A new method to derive confidence intervals for medians in a finite population is presented. This method uses multi-auxiliary information through a multivariate regression type estimator of the population distribution function. A simulation study based on four real populations shows its behaviour versus other known methods.

12.
Simultaneous optimal estimation in linear mixed models is considered. A necessary and sufficient condition is presented for the least squares estimator of the fixed effects and the analysis of variance estimator of the variance components to be of uniformly minimum variance simultaneously in a general variance components model: the matrix obtained by orthogonally projecting the covariance matrix onto the orthogonal complement of the column space of the design matrix must be symmetric, each of its eigenvalues must be a linear combination of the variance components, and the number of its distinct eigenvalues must equal the number of variance components. Under this condition, uniformly optimal unbiased tests and uniformly most accurate unbiased confidence intervals are constructed for the parameters of interest. A necessary and sufficient condition is also given for the equivalence of several common estimators of variance components. Two examples of the application of these results are given.

13.
We investigate the empirical relevance of structural breaks for GARCH models of exchange rate volatility using both in‐sample and out‐of‐sample tests. We find significant evidence of structural breaks in the unconditional variance of seven of eight US dollar exchange rate return series over the 1980–2005 period—implying unstable GARCH processes for these exchange rates—and GARCH(1,1) parameter estimates often vary substantially across the subsamples defined by the structural breaks. We also find that it almost always pays to allow for structural breaks when forecasting exchange rate return volatility in real time. Combining forecasts from different models that accommodate structural breaks in volatility in various ways appears to offer a reliable method for improving volatility forecast accuracy given the uncertainty surrounding the timing and size of the structural breaks. Copyright © 2008 John Wiley & Sons, Ltd.
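A common diagnostic for breaks in unconditional variance of the kind detected here is an Inclán-Tiao-style cusum-of-squares statistic; a minimal sketch follows (the editor's illustration, not necessarily the paper's exact test):

```python
import numpy as np

def cusum_of_squares(r):
    # Inclan-Tiao style centred cusum of squared returns:
    # D_k = C_k/C_n - k/n, with C_k the cumulative sum of squares.
    # Large values of sqrt(n/2) * max|D_k| signal a variance break,
    # and the argmax is a natural candidate break date.
    c = np.cumsum(r ** 2)
    n = len(r)
    k = np.arange(1, n + 1)
    d = c / c[-1] - k / n
    stat = np.sqrt(n / 2.0) * np.max(np.abs(d))
    k_hat = int(np.argmax(np.abs(d))) + 1
    return stat, k_hat
```

Under no break the statistic behaves like the supremum of a Brownian bridge (5% asymptotic critical value about 1.358); applied to subsample boundaries it motivates re-estimating GARCH(1,1) parameters on each regime, as the paper does.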

14.
We propose a Bayesian framework in which the uncertainty about the half-life of deviations from purchasing power parity can be quantified. Based on the responses to a survey study, we propose a prior probability distribution for the half-life under the recent float intended to capture widely held views among economists. We derive the posterior probability distribution of the half-life under this consensus prior and confirm the presence of substantial uncertainty about the half-life. We provide for the first time a comprehensive formal evaluation of several nonnested hypotheses of economic interest, including Rogoff's (1996) claim that the half-life is contained in the range of 3 to 5 years. We find that no hypothesis receives strong support from the data. Copyright © 2002 John Wiley & Sons, Ltd.
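The flavour of the exercise can be conveyed by a grid-approximation sketch in a toy AR(1) setting (the editor's illustration with a unit innovation variance assumed; the paper's model and prior are richer): place a prior on the persistence ρ, compute the posterior, and map it to the half-life ln(0.5)/ln(ρ).

```python
import numpy as np

def half_life_posterior(x, rho_grid, prior):
    # Grid approximation to the posterior of the AR(1) half-life
    # h = log(0.5)/log(rho) under a user-supplied prior on rho.
    # Gaussian likelihood conditional on the first observation,
    # with unit innovation variance assumed for the sketch.
    loglik = np.array([
        -0.5 * np.sum((x[1:] - r * x[:-1]) ** 2) for r in rho_grid
    ])
    post = prior * np.exp(loglik - loglik.max())
    post /= post.sum()
    half_life = np.log(0.5) / np.log(rho_grid)
    return half_life, post
```

Because h is a strongly nonlinear function of ρ near 1, even a tight posterior on ρ can translate into the substantial half-life uncertainty the paper documents.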

15.
In this article, we propose a new identifiability condition using logarithmic calibration for distortion measurement error models, in which neither the response variable nor the covariates can be observed directly but are measured with multiplicative measurement errors. Under the logarithmic calibration, direct plug-in estimators of the parameters and empirical-likelihood-based confidence intervals are proposed, and we study the asymptotic properties of the proposed estimators. For hypothesis testing of the parameters, a restricted estimator under the null hypothesis and a test statistic are proposed, and their asymptotic properties are established. Simulation studies demonstrate the performance of the proposed procedure, and a real example is analyzed to illustrate its practical usage.

16.
Subsampling high frequency data
The main contribution of this paper is to propose a novel way of conducting inference for an important general class of estimators that includes many estimators of integrated volatility. A subsampling scheme is introduced that consistently estimates the asymptotic variance for an estimator, thereby facilitating inference and the construction of valid confidence intervals. The new method does not rely on the exact form of the asymptotic variance, which is useful when the latter is of complicated form. The method is applied to the volatility estimator of Aït-Sahalia et al. (2011) in the presence of autocorrelated and heteroscedastic market microstructure noise.

17.
Wu, Jong-Wuu; Lu, Hai-Lin; Chen, Chong-Hong; Yang, Chien-Hui. Quality and Quantity, 2004, 38(2): 217-233
In reliability research on products, the results of life testing serve as the basis for evaluating and improving reliability. During life testing, however, one often wishes to predict a future observation in an ordered sample, so as to show how long a sample of units might run until all fail. We therefore propose five new pivotal quantities to obtain prediction intervals of future order statistics based on right type II censored samples from the Pareto distribution with known shape parameter, and then compare the lengths of these prediction intervals with those obtained using the pivotal quantity of Ouyang and Wu (1994), which is based on the best linear unbiased estimator (BLUE) of the scale parameter. An advantage of the five proposed pivotal quantities is that they are easier to calculate, since they do not require the tables of coefficients of the BLUE of the scale parameter.

18.
This paper deals with the estimation of P[Y < X] when X and Y are two independent generalized exponential random variables with different shape parameters but the same scale parameter. The maximum likelihood estimator and its asymptotic distribution are obtained, and the asymptotic distribution is used to construct an asymptotic confidence interval for P[Y < X]. Assuming that the common scale parameter is known, the maximum likelihood estimator, the uniformly minimum variance unbiased estimator and the Bayes estimator of P[Y < X] are obtained, and different confidence intervals are proposed. Monte Carlo simulations are performed to compare the proposed methods, and the analysis of a simulated data set is presented for illustrative purposes. Part of the work was supported by a grant from the Natural Sciences and Engineering Research Council.
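When the common scale λ is known, the closed forms are simple: with cdf F(x) = (1 − e^{−λx})^α, the shape MLE is α̂ = −n/Σ log(1 − e^{−λx_i}) (and similarly β̂ for Y), and R = P[Y < X] = α/(α + β), so R̂ = α̂/(α̂ + β̂). A sketch (the function name is the editor's):

```python
import numpy as np

def stress_strength_mle(x, y, lam=1.0):
    # MLE of R = P(Y < X) for X ~ GE(alpha, lam), Y ~ GE(beta, lam)
    # with a known common scale lam. Uses the closed-form shape MLEs
    # and the identity R = alpha / (alpha + beta).
    a_hat = -len(x) / np.sum(np.log1p(-np.exp(-lam * np.asarray(x))))
    b_hat = -len(y) / np.sum(np.log1p(-np.exp(-lam * np.asarray(y))))
    return a_hat / (a_hat + b_hat)
```

The identity follows from E[(1 − e^{−λX})^β] = ∫₀¹ α u^{α+β−1} du = α/(α + β) after the substitution u = 1 − e^{−λx}.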

19.
This study constructs the industry-specific real effective exchange rate (I-REER) as a new measure of export competitiveness by industry. By aggregating I-REERs to country-level I-REERs for nine Asian economies, we assess the effect of REER appreciation on real exports, employing both the static common-correlated effects (CCE) estimator and the cross-sectionally augmented distributed lag (CS-DL) estimator (a dynamic version of the CCE estimator) to control for heterogeneity in the impact of unobservable common factors. The negative effect of REER appreciation is found to have declined in recent years, which may imply that growing global value chains (GVCs) tend to mitigate the negative effect of REER appreciation on exports. Since Asia is characterized by active regional trade and investment through GVCs, further regional integration would leave Asian economies with less concern about policy coordination for regional exchange rate stability.

20.
Potential output plays a central role in monetary policy and short-term macroeconomic policy making. Yet characterizing the output gap involves a trend-cycle decomposition, and unobserved component estimates are typically subject to large uncertainty at the sample end. An important consequence is that output gap estimates can be quite inaccurate in real time, as recently highlighted by Orphanides and van Norden (2002), and this causes a serious problem for policy makers. For the cases of the US, the EU-11 and two EU countries, we evaluate the benefits of using inflation data for improving the accuracy of real-time estimates. Copyright © 2004 John Wiley & Sons, Ltd.
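A trend-cycle decomposition of the kind underlying output gap estimates can be sketched with the Hodrick-Prescott filter, used here as a stand-in for the paper's unobserved components approach (not its actual method):

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    # Hodrick-Prescott trend-cycle decomposition: the trend solves
    # (I + lam * D'D) tau = y, where D is the second-difference matrix.
    # lam = 1600 is the conventional choice for quarterly data.
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)          # (n-2) x n, rows [1, -2, 1]
    trend = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return trend, y - trend
```

The end-of-sample unreliability the paper addresses is visible here too: the trend at the last few observations is pinned down only from one side, so revisions arrive as new data extend the sample.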

