Similar Documents
20 similar documents found (search time: 15 ms)
1.
Modeling conditional distributions in time series has attracted increasing attention in economics and finance. We develop a new class of generalized Cramér–von Mises (GCM) specification tests for time series conditional distribution models using a novel approach, which embeds the empirical distribution function in a spectral framework. Our tests check a large number of lags and are therefore expected to be powerful against neglected dynamics at higher-order lags, which is particularly useful for non-Markovian processes. Despite using a large number of lags, our tests do not suffer much from the loss of a large number of degrees of freedom, because our approach naturally downweights higher-order lags, which is consistent with the stylized fact that economic or financial markets are more affected by recent past events than by remote past events. Unlike the existing methods in the literature, the proposed GCM tests cover both univariate and multivariate conditional distribution models in a unified framework. They exploit the information in the joint conditional distribution of the underlying economic processes. Moreover, a class of easy-to-interpret diagnostic procedures is supplemented to gauge possible sources of model misspecification. Distinct from conventional CM and Kolmogorov–Smirnov (KS) tests, which are also based on the empirical distribution function, our GCM test statistics follow a convenient asymptotic N(0,1) distribution and enjoy the appealing “nuisance parameter free” property that parameter estimation uncertainty has no impact on the asymptotic distribution of the test statistics. Simulation studies show that the tests provide reliable inference for sample sizes often encountered in economics and finance.
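For orientation, the sketch below computes the classical Cramér–von Mises statistic applied to probability integral transforms of a fitted conditional distribution. This is only the textbook CM building block that the abstract contrasts with, not the spectral GCM statistic of the paper; the function name and interface are illustrative.

```python
import numpy as np

def cramer_von_mises_uniform(u):
    """Classical Cramer-von Mises statistic for H0: u_1, ..., u_n are iid U(0,1).

    u : probability integral transforms F(Y_t | I_{t-1}; theta_hat) of the data
        under the fitted conditional distribution model.
    Returns W^2 = 1/(12n) + sum_i ((2i-1)/(2n) - u_(i))^2 (larger values indicate
    a worse fit of the uniform null).
    """
    u = np.sort(np.asarray(u, dtype=float))
    n = u.size
    i = np.arange(1, n + 1)
    return 1.0 / (12 * n) + np.sum(((2 * i - 1) / (2 * n) - u) ** 2)
```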

2.
This paper reviews the recent literature on conditional duration modeling in high-frequency finance. These conditional duration models are associated with the time intervals between trades, price changes, and volume changes of stocks traded in a financial market. An earlier review by Pacurar provides an exhaustive survey of the first-generation and some of the second-generation conditional duration models. We consider almost all of the third-generation and some of the second-generation conditional duration models. Notable applications of these models and related empirical studies are discussed. The paper may be seen as an extension of Pacurar's review.
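As background for the models surveyed here, a small simulation sketch of the basic ACD(1,1) specification of Engle and Russell (1998), the first-generation model on which the later generations build; the parameter values and the unit-mean exponential innovation are illustrative choices.

```python
import numpy as np

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=None):
    """Simulate n durations from a basic ACD(1,1) model:
        x_i = psi_i * eps_i,   psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1},
    with eps_i iid unit-mean exponential and alpha + beta < 1 for stationarity."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    psi = omega / (1.0 - alpha - beta)   # start at the unconditional mean duration
    x_prev = psi
    for i in range(n):
        psi = omega + alpha * x_prev + beta * psi
        x[i] = psi * rng.exponential(1.0)
        x_prev = x[i]
    return x
```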

3.
Forecasts of key interest rates set by central banks are of paramount concern for investors and policy makers. Recently it has been shown that forecasts of the federal funds rate target, the most anticipated indicator of the Federal Reserve Bank's monetary policy stance, can be improved considerably when its evolution is modeled as a marked point process (MPP). This is due to the fact that target changes occur in discrete time with discrete increments, have an autoregressive nature and are usually in the same direction. We propose a model which is able to account for these dynamic features of the data. In particular, we combine Hamilton and Jordà's [2002. A model for the federal funds rate target. Journal of Political Economy 110(5), 1135–1167] autoregressive conditional hazard (ACH) model and Russell and Engle's [2005. A discrete-state continuous-time model of financial transactions prices and times: the autoregressive conditional multinomial–autoregressive conditional duration model. Journal of Business and Economic Statistics 23(2), 166–180] autoregressive conditional multinomial (ACM) model. The paper also puts forth a methodology to evaluate probability function forecasts of MPP models. By improving goodness of fit and point forecasts of the target, the ACH–ACM model qualifies as a sensible modeling framework. Furthermore, our results show that MPP models deliver useful probability function forecasts at short and medium-term horizons.

4.
We review some first-order and higher-order asymptotic techniques for M-estimators, and we study their stability in the presence of data contamination. We show that the estimating function (ψ) and its derivative with respect to the parameter play a central role. We discuss in detail the first-order Gaussian density approximation, the saddlepoint density approximation, the saddlepoint test, the tail area approximation via the Lugannani–Rice formula and the empirical saddlepoint density approximation (a technique related to the empirical likelihood method). For all these asymptotics, we show that a bounded ψ (in the Euclidean norm) and a bounded derivative of ψ with respect to the parameter (e.g. in the Frobenius norm) yield stable inference in the presence of data contamination. We motivate and illustrate our findings by theoretical and numerical examples for the benchmark case of the one-dimensional location model.
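In standard M-estimation notation (not specific to this paper's higher-order expansions), the estimating equation and the influence function below make the role of ψ explicit: the influence function is proportional to ψ, so a bounded ψ bounds the worst-case local bias under contamination.

```latex
\sum_{i=1}^{n} \psi(x_i;\hat\theta_n) = 0,
\qquad
\mathrm{IF}(x;\psi,F_\theta) = M(\psi,F_\theta)^{-1}\,\psi(x;\theta),
\qquad
M(\psi,F_\theta) = -\,\mathbb{E}_{F_\theta}\!\left[\frac{\partial}{\partial\theta^{\top}}\,\psi(X;\theta)\right].
```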

5.
6.
We propose new summary statistics for intensity-reweighted moment stationary point processes, that is, point processes with translation-invariant n-point correlation functions for all n, that generalise the well-known J-function, empty space function, and spherical Palm contact distribution function. We represent these statistics in terms of generating functionals and relate the inhomogeneous J-function to the inhomogeneous reduced second moment function. Extensions to space–time and marked point processes are briefly discussed.
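For reference, in the stationary case the classical J-function combines the nearest-neighbour distance distribution G and the empty space function F; the inhomogeneous version proposed in the paper generalises this ratio to intensity-reweighted moment stationary processes.

```latex
J(r) \;=\; \frac{1 - G(r)}{1 - F(r)}, \qquad r \ge 0 \ \text{ with } F(r) < 1 .
```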

7.
8.
D. A. Ioannides, Metrika, 1999, 50(1): 19–35
Let {(X_i, Y_i)}, i ≥ 1, be a strictly stationary process from noisy observations. We examine the effect of the noise in the response Y and the covariates X on the nonparametric estimation of the conditional mode function. To estimate this function we use deconvoluting kernel estimators. The asymptotic behavior of these estimators depends on the smoothness of the noise distribution, which is classified as either ordinary smooth or super smooth. Uniform convergence with almost sure convergence rates is established for strongly mixing stochastic processes when the noise distribution is ordinary smooth.
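A rough numerical sketch of the deconvoluting kernel idea for the simpler density (rather than conditional mode) problem, assuming Laplace measurement error, an ordinary smooth case, and a sinc kernel whose Fourier transform is 1 on [-1, 1]; the function name, bandwidth handling and the trapezoidal Fourier inversion are illustrative choices, not the estimator analysed in the paper.

```python
import numpy as np

def deconv_kernel_density(x_grid, w, h, sigma_noise, n_t=501):
    """Deconvoluting kernel density estimate of f_X from noisy data W = X + eps,
    assuming Laplace(0, sigma_noise) errors with characteristic function
    phi_eps(t) = 1 / (1 + sigma^2 t^2):
        f_hat(x) = (1/(n h)) * sum_j Kn((x - W_j) / h),
        Kn(u)    = (1/(2 pi)) * int_{-1}^{1} cos(t u) * phi_K(t) / phi_eps(t/h) dt.
    """
    x_grid = np.asarray(x_grid, dtype=float)
    w = np.asarray(w, dtype=float)
    n = w.size
    t = np.linspace(-1.0, 1.0, n_t)
    ratio = 1.0 + (sigma_noise * t / h) ** 2          # phi_K(t) / phi_eps(t/h)
    u = (x_grid[:, None] - w[None, :]) / h            # shape (len(x_grid), n)
    # Fourier inversion by trapezoidal rule (integrand is even, so cosine suffices)
    kn = np.trapz(np.cos(u[..., None] * t) * ratio, t, axis=-1) / (2.0 * np.pi)
    return kn.sum(axis=1) / (n * h)
```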

9.
A random variable Y is right tail increasing (RTI) in X if the failure rate of the conditional distribution of X given Y > y is uniformly smaller than that of the marginal distribution of X for every y ≥ 0. This concept of positive dependence is not symmetric in X and Y and is stronger than the notion of positive quadrant dependence. In this paper we consider the problem of testing for independence against the alternative that Y is RTI in X. We propose two distribution-free tests and obtain their limiting null distributions. The proposed tests are compared to Kendall's and Spearman's tests in terms of Pitman asymptotic relative efficiency. We have also conducted a Monte Carlo study to compare the powers of these tests.
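The benchmark competitors mentioned here, Kendall's and Spearman's tests of independence, are easy to run; the toy comparison below uses purely illustrative simulated data and does not implement the paper's proposed RTI tests.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.standard_normal(300)
y = x + rng.standard_normal(300)          # a positively dependent pair

tau, p_tau = stats.kendalltau(x, y)       # Kendall's tau test of independence
rho, p_rho = stats.spearmanr(x, y)        # Spearman's rho test of independence
print(f"Kendall tau={tau:.3f} (p={p_tau:.3g}), Spearman rho={rho:.3f} (p={p_rho:.3g})")
```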

10.
This paper considers the difference-in-differences (DID) method when the data come from repeated cross-sections and the treatment status is observed either before or after the implementation of a program. We propose a new method that point-identifies the average treatment effect on the treated (ATT) via a DID method when there is at least one proxy variable for the latent treatment. Key assumptions are the stationarity of the propensity score conditional on the proxy and an exclusion restriction that the proxy must satisfy with respect to the change in average outcomes over time conditional on the true treatment status. We propose a generalized method of moments estimator for the ATT and we show that the associated overidentification test can be used to test our key assumptions. The method is used to evaluate JUNTOS, a Peruvian conditional cash transfer program. We find that the program significantly increased the demand for health inputs among children and women of reproductive age.
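As a point of reference, the sketch below computes the textbook two-period DID estimate from repeated cross-sections when the treatment status is fully observed; the paper's contribution is precisely the harder case where treatment is latent and only a proxy is available, which this sketch does not cover. The column names are hypothetical 0/1 indicators.

```python
import pandas as pd

def did_att(df, outcome="y", treated="d", post="post"):
    """Textbook difference-in-differences estimate of the ATT from repeated
    cross-sections with observed treatment status:
    (E[y|d=1,post=1] - E[y|d=1,post=0]) - (E[y|d=0,post=1] - E[y|d=0,post=0])."""
    m = df.groupby([treated, post])[outcome].mean()
    return (m.loc[(1, 1)] - m.loc[(1, 0)]) - (m.loc[(0, 1)] - m.loc[(0, 0)])
```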

11.
One of the main purposes of the frontier literature is to estimate inefficiency. Given this objective, it is unfortunate that the issue of estimating firm-specific inefficiency in a cross-sectional context has not received much attention. To estimate firm-specific (technical) inefficiency, the standard procedure is to use the mean of the inefficiency term conditional on the entire composed error, as suggested by Jondrow, Lovell, Materov and Schmidt (1982). This conditional mean can be viewed as the average loss of output (return). It is also quite natural to consider the conditional variance, which provides a measure of production uncertainty or risk. Once we have the conditional mean and variance, we can report standard errors and construct confidence intervals for firm-level technical inefficiency; we can also perform hypothesis tests. We postulate that when a firm attempts to move towards the frontier it not only increases its efficiency but also reduces its production uncertainty, which leads to shorter confidence intervals. Analytical expressions for production uncertainty under different distributional assumptions are provided, and it is shown that the technical inefficiency as defined by Jondrow et al. (1982) and the production uncertainty are monotonic functions of the entire composed error term. Interestingly, this monotonicity result holds under different distributional assumptions on the inefficiency term. Furthermore, some alternative measures of production uncertainty are proposed, and the concept of production uncertainty is generalized to panel data models. Finally, our theoretical results are illustrated with an empirical example.
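A sketch of the Jondrow et al. (1982) conditional mean together with the conditional variance discussed here, under the common normal/half-normal assumption with composed error ε = v − u; σ_u and σ_v are treated as known, whereas in practice they are replaced by estimates.

```python
import numpy as np
from scipy.stats import norm

def jlms_normal_halfnormal(eps, sigma_u, sigma_v):
    """Conditional mean and variance of inefficiency u given eps = v - u for the
    normal / half-normal stochastic frontier model: u | eps is N(mu_star, sigma_star^2)
    truncated below at zero (Jondrow, Lovell, Materov & Schmidt, 1982)."""
    eps = np.asarray(eps, dtype=float)
    sigma2 = sigma_u**2 + sigma_v**2
    mu_star = -eps * sigma_u**2 / sigma2
    sigma_star = sigma_u * sigma_v / np.sqrt(sigma2)
    z = mu_star / sigma_star
    lam = norm.pdf(z) / norm.cdf(z)                  # inverse Mills ratio
    cond_mean = mu_star + sigma_star * lam           # E[u | eps]
    cond_var = sigma_star**2 * (1.0 - z * lam - lam**2)  # Var[u | eps], production uncertainty
    return cond_mean, cond_var
```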

12.
This paper considers semiparametric efficient estimation of conditional moment models with possibly nonsmooth residuals in unknown parametric components (θ) and unknown functions (h) of endogenous variables. We show that: (1) the penalized sieve minimum distance (PSMD) estimator can simultaneously achieve root-n asymptotic normality of the estimator of θ and the optimal nonparametric convergence rate of the estimator of h, allowing for noncompact function parameter spaces; (2) a simple weighted bootstrap procedure consistently estimates the limiting distribution of the PSMD estimator of θ; (3) the semiparametric efficiency bound formula of [Ai, C., Chen, X., 2003. Efficient estimation of models with conditional moment restrictions containing unknown functions. Econometrica 71, 1795–1843] remains valid for conditional models with nonsmooth residuals, and the optimally weighted PSMD estimator achieves the bound; (4) the centered, profiled optimally weighted PSMD criterion is asymptotically chi-square distributed. We illustrate our theories using a partially linear quantile instrumental variables (IV) regression, a Monte Carlo study, and an empirical estimation of the shape-invariant quantile IV Engel curves.

13.
In this paper, the authors evaluate the effectiveness of Statement of Cash Flows measures in the classification and prediction of bankruptcy. The problems of biased estimators of bankruptcy probabilities and of optimal cut-off rates are addressed by using a large random sample and conditional marginal probability density functions. It is found that cash flow variables provide statistically better classification and prediction rates when used with traditional accounting variables, when bankruptcy is defined as a Chapter 11 filing.

14.
For estimating an unknown scale parameter of the Gamma distribution, we introduce an asymmetric scale-invariant loss function reflecting precision of estimation. This loss belongs to the class of precautionary loss functions. The problem of estimating the scale parameter of a Gamma distribution arises in several theoretical and applied problems. Explicit forms of the risk-unbiased, minimum-risk scale-invariant, Bayes, generalized Bayes and minimax estimators are derived. We characterize the admissibility and inadmissibility of the class of linear estimators of the form cX + d, when X ~ Γ(α, η). In the context of Bayesian statistical inference, any statistical problem should be treated under a given loss function by specifying a prior distribution over the parameter space; hence, the arbitrariness of a single prior distribution is a critical and persistent question. To overcome this issue, we consider robust Bayesian analysis and study Gamma-minimax, conditional Gamma-minimax, stable, and posterior regret Gamma-minimax estimation of the unknown scale parameter under the asymmetric scale-invariant loss function in detail.
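As orientation only (the paper works with an asymmetric, precautionary-type scale-invariant loss, not squared error): for a single observation X ~ Γ(α, η) with known shape α, so that E[X] = αη and E[X²] = α(α+1)η², the risk of a linear estimator cX under ordinary squared-error loss is easily minimised, which indicates why estimators of the form cX (+ d) are the natural class to study.

```latex
R(c) \;=\; \mathbb{E}\big[(cX-\eta)^2\big]
      \;=\; \eta^2\big(c^2\,\alpha(\alpha+1) - 2c\,\alpha + 1\big),
\qquad
c^{*} \;=\; \frac{1}{\alpha+1}.
```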

15.
A multivariate measurement error model AX ≈ B is considered. The errors in [A, B] are rowwise independent, but within each row the errors may be correlated. Some of the columns are observed without errors, and in addition the error covariance matrices may differ from row to row. The total covariance structure of the errors is supposed to be known up to a scalar factor. The fully weighted total least squares estimator of X is studied, which in the case of normal errors coincides with the maximum likelihood estimator. We give mild conditions for weak and strong consistency of the estimator as the number of rows in A increases. The results generalize the conditions of Gallo given for a univariate homoscedastic model (where B is a vector), and extend the conditions of Gleser given for the multivariate homoscedastic model. We derive the objective function for the estimator and propose an iteratively reweighted numerical procedure.
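For context, the classical unweighted total least squares solution of AX ≈ B via the SVD of the compound matrix [A B] (Golub–Van Loan); the fully weighted, row-wise heteroscedastic estimator studied in the paper has no such closed form and requires the iteratively reweighted procedure the authors propose. The function name is illustrative.

```python
import numpy as np

def tls(A, B):
    """Classical (unweighted, homoscedastic) total least squares solution of A X ~ B:
    take the SVD of [A B] and read X off the right singular vectors belonging to
    the smallest singular values, X = -V12 @ inv(V22)."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    if B.ndim == 1:
        B = B[:, None]
    m, d = A.shape[1], B.shape[1]
    _, _, Vt = np.linalg.svd(np.hstack([A, B]), full_matrices=True)
    V = Vt.T
    V12 = V[:m, m:m + d]
    V22 = V[m:, m:m + d]
    return -V12 @ np.linalg.inv(V22)
```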

16.
We propose an estimator of the conditional distribution of X_t given X_{t−1}, X_{t−2}, …, and of the corresponding regression function E[X_t | X_{t−1}, X_{t−2}, …], where the conditioning set is of infinite order. We establish consistency of our estimator under stationarity and ergodicity conditions plus a mild smoothness condition.

17.
In this paper, we obtain recurrence relations for moment and conditional moment generating functions of generalized order statistics (gos) based on random samples drawn from a population whose distribution belongs to a doubly truncated class of distributions. Members of the class are characterized in Section 2 based on recurrence relations for moment generating functions (moments) of gos. In Section 3, we characterize members of the class based on recurrence relations for conditional moment generating functions (conditional moments) of gos. These results are specialized to the left-truncated, right-truncated and non-truncated cases. Ordinary order statistics and ordinary record values are also obtained as special cases of the gos. Characterizations of some members of the class, such as the Weibull, compound Weibull, Pareto, power function (beta is a special case), Gompertz and compound Gompertz distributions, are given as illustrative examples.

18.
19.
EuroMInd-D is a density estimate of monthly gross domestic product (GDP) constructed according to a bottom-up approach, pooling the density estimates of 11 GDP components, by output and expenditure type. The components' density estimates are obtained from a medium-size dynamic factor model handling mixed frequencies of observation and ragged-edge data structures. They reflect both parameter and filtering uncertainty and are obtained by implementing a bootstrap algorithm for simulating from the distribution of the maximum likelihood estimators of the model parameters, and conditional simulation filters for simulating from the predictive distribution of GDP. Both algorithms process the data sequentially as they become available in real time. The GDP density estimates for the output and expenditure approaches are combined using alternative weighting schemes and evaluated with different tests based on the probability integral transform and by applying scoring rules.
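A generic sketch of the two evaluation devices named here, the probability integral transform (PIT) and a scoring rule (the average log score), for Gaussian density forecasts; this is standard density-forecast evaluation machinery, not the EuroMInd-D implementation, and the Kolmogorov–Smirnov uniformity check is just one convenient diagnostic.

```python
import numpy as np
from scipy.stats import norm, kstest

def evaluate_density_forecasts(y, mean, sd):
    """Evaluate Gaussian density forecasts N(mean_t, sd_t^2) of realizations y_t:
    PITs should look iid U(0,1) under correct calibration, and a higher average
    log score indicates sharper, better-calibrated densities."""
    pit = norm.cdf(y, loc=mean, scale=sd)
    ks_stat, ks_pvalue = kstest(pit, "uniform")      # crude uniformity check on the PITs
    avg_log_score = np.mean(norm.logpdf(y, loc=mean, scale=sd))
    return pit, ks_stat, ks_pvalue, avg_log_score
```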

20.

In stochastic frontier analysis, the conventional estimation of unit inefficiency is based on the mean/mode of the inefficiency, conditioned on the composite error. It is known that the conditional mean of inefficiency shrinks towards the mean rather than towards the unit inefficiency. In this paper, we analytically prove that the conditional mode cannot accurately estimate unit inefficiency, either. We propose regularized estimators of unit inefficiency that restrict the unit inefficiency estimators to satisfy some a priori assumptions, and derive the closed form regularized conditional mode estimators for the three most commonly used inefficiency densities. Extensive simulations show that, under common empirical situations, e.g., regarding sample size and signal-to-noise ratio, the regularized estimators outperform the conventional (unregularized) estimators when the inefficiency is greater than its mean/mode. Based on real data from the electricity distribution sector in Sweden, we demonstrate that the conventional conditional estimators and our regularized conditional estimators provide substantially different results for highly inefficient companies.
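Under the normal/half-normal assumption, the conventional conditional mode that the paper argues is also inaccurate has a simple closed form (the mode of the truncated normal u | ε); the sketch below gives only that unregularized baseline, not the regularized estimators derived in the paper, and treats σ_u, σ_v as known.

```python
import numpy as np

def conditional_mode_normal_halfnormal(eps, sigma_u, sigma_v):
    """Conventional (unregularized) conditional mode estimator of inefficiency for
    the normal / half-normal model with composed error eps = v - u: u | eps is
    N(mu_star, sigma_star^2) truncated at zero, whose mode is max(mu_star, 0)
    with mu_star = -eps * sigma_u^2 / (sigma_u^2 + sigma_v^2)."""
    mu_star = -np.asarray(eps, dtype=float) * sigma_u**2 / (sigma_u**2 + sigma_v**2)
    return np.maximum(mu_star, 0.0)
```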

