Similar Articles
 20 similar articles found (search time: 21 ms)
1.
A test statistic is developed for making inference about a block‐diagonal structure of the covariance matrix when the dimensionality p exceeds n, where n = N − 1 and N denotes the sample size. The suggested procedure extends the complete independence results. Because the classical hypothesis testing methods based on the likelihood ratio degenerate when p > n, the main idea is to turn instead to a distance function between the null and alternative hypotheses. The test statistic is then constructed using a consistent estimator of this function, where consistency is considered in an asymptotic framework that allows p to grow together with n. The suggested statistic is also shown to be asymptotically normal under the null hypothesis. Some auxiliary results on the moments of products of multivariate normal random vectors and higher‐order moments of Wishart matrices, which are important for our evaluation of the test statistic, are derived. We perform an empirical power analysis for a number of alternative covariance structures.
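The core idea — a distance between the covariance matrix and its block‐diagonal counterpart — can be illustrated with a naive plug‐in version of that distance. A minimal Python sketch; the function name, the block specification and the use of the raw sample covariance are illustrative only (the paper's statistic is built on a bias‐corrected, consistent estimator of the distance, which matters precisely because p > n):

    import numpy as np

    def offblock_distance(S, block_sizes):
        # Squared Frobenius norm of the entries of S lying outside the diagonal
        # blocks; it is zero exactly when S is block-diagonal for this partition.
        p = S.shape[0]
        mask = np.zeros((p, p), dtype=bool)
        start = 0
        for b in block_sizes:
            mask[start:start + b, start:start + b] = True
            start += b
        return np.sum(S[~mask] ** 2)

    # toy use: N = 30 observations, p = 50 > n, hypothesised blocks of sizes 20 and 30
    rng = np.random.default_rng(0)
    X = rng.standard_normal((30, 50))
    S = np.cov(X, rowvar=False)        # sample covariance (biased plug-in when p > n)
    d_hat = offblock_distance(S, [20, 30])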

2.
A general parametric framework based on the generalized Student t‐distribution is developed for pricing S&P500 options. Higher order moments in stock returns as well as time‐varying volatility are priced. An important computational advantage of the proposed framework over Monte Carlo‐based pricing methods is that options can be priced using one‐dimensional quadrature integration. The empirical application is based on S&P500 options traded on select days in April 1995, a total sample of over 100,000 observations. A range of performance criteria are used to evaluate the proposed model, as well as a number of alternative models. The empirical results show that pricing higher order moments and time‐varying volatility yields improvements in the pricing of options, as well as correcting the volatility skew associated with the Black–Scholes model. Copyright © 2004 John Wiley & Sons, Ltd.
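To show what one‐dimensional quadrature pricing looks like, here is a minimal Python sketch that prices a European call by integrating the payoff against a Student‐t density for the simple T‐period return. The density, its scaling and the drift adjustment are simplifying assumptions of this sketch, not the paper's generalized Student‐t specification with time‐varying volatility:

    import numpy as np
    from scipy import integrate, stats

    def call_price_tquad(S0, K, r, T, sigma, df):
        # Assumes the simple return over [0, T] is Student-t with df > 2 degrees of
        # freedom, scaled to variance sigma**2 * T and centred at the riskless return.
        scale = sigma * np.sqrt(T) * np.sqrt((df - 2.0) / df)   # unit-variance t scaling
        loc = r * T                                             # crude risk-neutral drift
        lower = K / S0 - 1.0                                    # payoff is zero below this return
        def integrand(x):
            return (S0 * (1.0 + x) - K) * stats.t.pdf(x, df, loc=loc, scale=scale)
        price, _ = integrate.quad(integrand, lower, np.inf)
        return np.exp(-r * T) * price

    # rough sanity check: close to the Black-Scholes price for large df
    print(call_price_tquad(S0=100, K=100, r=0.05, T=0.25, sigma=0.2, df=30))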

3.
We consider an M/G/1 queueing system where the customers may leave the queue if their services do not commence before an exponentially distributed random time. The (conditional) offered waiting time distribution is approximated by a gamma distribution via matching the first and second moments of the actual waiting time. A simulation study is conducted to assess the accuracy of the approximation and it reveals that the approximation performs satisfactorily under general conditions on service time distributions.
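Matching a gamma distribution to the first two moments is a one‐liner. A minimal Python sketch; the function name and the numbers in the toy example are illustrative, not taken from the paper:

    import numpy as np

    def gamma_from_moments(mean, second_moment):
        # Gamma(shape k, scale theta) whose first two raw moments match the inputs:
        # mean = k*theta and second_moment = k*(k+1)*theta**2.
        var = second_moment - mean ** 2
        shape = mean ** 2 / var
        scale = var / mean
        return shape, scale

    # toy example: offered waiting time with mean 2.0 and E[W^2] = 7.0
    k, theta = gamma_from_moments(2.0, 7.0)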

4.
5.
Holger Dette, Metrika, 1997, 46(1): 71–82
In his book, Pukelsheim [8] pointed out that designs supported at the arcsin points are very efficient for statistical inference in a polynomial regression model. In this note we determine the canonical moments of a class of distributions which have nearly equal weights at the arcsin points. The class contains the D-optimal arcsin support design and the D1-optimal design for a polynomial regression. The results allow explicit representations of D- and D1-efficiencies of these designs in all polynomial models with a degree less than the number of support points of the design.

6.
Christodoulakis and Mamatzakis (2009, Journal of Applied Econometrics 24, pp. 583–606) estimate the EU Commission loss preferences for selected economic forecasts of 12 EU Member States. They employ the generalized method of moments (GMM) estimation procedure proposed by Elliott et al. (2005, Review of Economic Studies 72, pp. 1107–1125) and find the forecasts to be somewhat optimistic on average. However, this note shows the GMM estimator to possess nonstandard limiting distributions when some of the instruments are highly persistent, which is the case with one of the instruments employed by Christodoulakis and Mamatzakis. Standard distributions are recovered in some interesting particular cases which are relevant in practice. A reexamination of the EU Commission loss preferences using methods robust to persistence and a dataset extended to 2017 reveals that, while the conclusions of the original study are, by and large, still justified, the EU Commission loss preferences have become more symmetric over the whole studied period.

7.
The purpose of this paper is to present a closed formula to compute the moments of a general function from the knowledge of its bivariate survival function. The result is derived by utilizing an integration by parts formula for two variables, which is not readily available in the literature. Many of the existing results are obtained as special cases. Finally, two examples are presented to illustrate the results. In both the examples, mixed moments as well as moments for the series system and parallel system are obtained. The integration by parts formula in two variables, derived here, is of interest in its own right and we hope that it will be useful in other investigations. The integration by parts formula in two variables is derived as a special case of a general formula in n variables.
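A standard special case of such a result — a minimal illustration for nonnegative random variables X and Y with joint survival function \bar F(x,y) = P(X > x, Y > y), not the paper's general n-variable formula — reads, whenever the moment exists:

\[
E\left[X^{r}Y^{s}\right] \;=\; r\,s\int_{0}^{\infty}\!\int_{0}^{\infty} x^{\,r-1}\,y^{\,s-1}\,\bar F(x,y)\,\mathrm{d}y\,\mathrm{d}x .
\]

With r = s = 1 this gives E[XY] = \int_0^\infty \int_0^\infty \bar F(x,y)\,\mathrm{d}y\,\mathrm{d}x, the bivariate analogue of E[X] = \int_0^\infty P(X > x)\,\mathrm{d}x.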

8.
The difference and system generalized method of moments (GMM) estimators are growing in popularity. As implemented in popular software, the estimators easily generate instruments that are numerous and, in system GMM, potentially suspect. A large instrument collection overfits endogenous variables even as it weakens the Hansen test of the instruments’ joint validity. This paper reviews the evidence on the effects of instrument proliferation, and describes and simulates simple ways to control it. It illustrates the dangers by replicating Forbes [American Economic Review (2000) Vol. 90, pp. 869–887] on income inequality and Levine et al. [Journal of Monetary Economics (2000) Vol. 46, pp. 31–77] on financial sector development. Results in both papers appear driven by previously undetected endogeneity.

9.
Some recent specifications for GARCH error processes explicitly assume a conditional variance that is generated by a mixture of normal components, albeit with some parameter restrictions. This paper analyses the general normal mixture GARCH(1,1) model which can capture time variation in both conditional skewness and kurtosis. A main focus of the paper is to provide evidence that, for modelling exchange rates, generalized two‐component normal mixture GARCH(1,1) models perform better than those with three or more components, and better than symmetric and skewed Student's t‐GARCH models. In addition to the extensive empirical results based on simulation and on historical data on three US dollar foreign exchange rates (British pound, euro and Japanese yen), we derive: expressions for the conditional and unconditional moments of all models; parameter conditions to ensure that the second and fourth conditional and unconditional moments are positive and finite; and analytic derivatives for the maximum likelihood estimation of the model parameters and standard errors of the estimates. Copyright © 2006 John Wiley & Sons, Ltd.
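A minimal Python sketch of the likelihood filter for a two‐component normal mixture GARCH(1,1). The parameter layout (per‐component omega, alpha, beta arrays, component means tied so the overall conditional mean is zero) is an illustrative choice, not necessarily the paper's parameterization:

    import numpy as np

    def nm2_garch_loglik(eps, p, mu1, omega, alpha, beta):
        # eps: residual series; p: weight of component 1; omega, alpha, beta: length-2 arrays.
        # Each component variance follows its own GARCH(1,1) recursion driven by the
        # common shock eps[t-1]; means are (mu1, mu2) with p*mu1 + (1-p)*mu2 = 0.
        mu = np.array([mu1, -p * mu1 / (1.0 - p)])
        w = np.array([p, 1.0 - p])
        h = np.full(2, np.var(eps))               # initialise component variances
        loglik = 0.0
        for t in range(len(eps)):
            dens = w / np.sqrt(2 * np.pi * h) * np.exp(-(eps[t] - mu) ** 2 / (2 * h))
            loglik += np.log(dens.sum())          # mixture density of eps[t]
            h = omega + alpha * eps[t] ** 2 + beta * h   # per-component GARCH(1,1) update
        return loglik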

10.
M. C. Jones, Metrika, 2002, 54(3): 215–231
Relationships between F, skew t and beta distributions in the univariate case are in this paper extended in a natural way to the multivariate case. The result is two new distributions: a multivariate t/skew t distribution (on ℜ^m) and a multivariate beta distribution (on (0,1)^m). A special case of the former distribution is a new multivariate symmetric t distribution. The new distributions have a natural relationship to the standard multivariate F distribution (on (ℜ_+)^m) and many of their properties run in parallel. We look at: joint distributions, mathematically and graphically; marginal and conditional distributions; moments; correlations; local dependence; and some limiting cases.

11.
A generalization of the linear-quadratic-Gaussian problem is discussed. This provides a family of control rules, which result in different combinations of moments of the quadratic payoff. A method of approximating higher moments of the payoff is given. A recursive formula for calculating the second moment is derived. A dynamic optimal tariff example illustrates the methods.

12.
Autoregressive conditional volatility, skewness and kurtosis
This paper proposes a GARCH-type model allowing for time-varying volatility, skewness and kurtosis. The model is estimated assuming a Gram–Charlier (GC) series expansion of the normal density function for the error term, which is easier to estimate than the non-central t distribution proposed by Harvey and Siddique [Harvey, C. R. & Siddique, A. (1999). Autoregressive Conditional Skewness. Journal of Financial and Quantitative Analysis 34, 465–487]. Moreover, this approach accounts for time-varying skewness and kurtosis, while the approach of Harvey and Siddique only accounts for non-normal skewness. We apply this method to daily returns of a variety of stock indices and exchange rates. Our results indicate a significant presence of conditional skewness and kurtosis. It is also found that specifications allowing for time-varying skewness and kurtosis outperform specifications with constant third and fourth moments.
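A minimal Python sketch of the Gram–Charlier density used for the standardised residual in models of this kind. It is only the density building block, not the full GARCH likelihood, and the function name is illustrative:

    import numpy as np

    def gram_charlier_logpdf(z, skew, kurt):
        # Gram-Charlier expansion of the standard normal density evaluated at the
        # standardised residual z, with third and fourth moments (skew, kurt).
        # Note: the raw expansion can turn negative for extreme (skew, kurt);
        # estimation papers typically square-and-renormalise it, omitted here.
        h3 = z ** 3 - 3 * z                      # third Hermite polynomial
        h4 = z ** 4 - 6 * z ** 2 + 3             # fourth Hermite polynomial
        psi = 1.0 + skew / 6.0 * h3 + (kurt - 3.0) / 24.0 * h4
        phi = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
        return np.log(phi * psi)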

13.
D. Dyer, Metrika, 1977, 24(1): 99–105
Due to the physical nature of certain periodic data, the harmonic dial points (the Fourier coefficients obtained from harmonic analysis of the data) are sometimes restricted to circular regions in the dial plane. It is proposed that a circular normal distribution (CND) truncated outside a circular region be used to describe the probabilistic behavior of the random phenomena. Recurrence relations for the population moments of a CND truncated outside a circular region are derived. These recurrence relations are used to obtain consistent asymptotically (jointly) normal estimators of the unknown parameters of the distribution. A numerical example based on the harmonic dial points representing the 27-day recurrence tendency of the daily international magnetic character-figure C_i is given to illustrate the theory.

14.
A quasi-maximum likelihood procedure for estimating the parameters of multi-dimensional diffusions is developed in which the transitional density is a multivariate Gaussian density with first and second moments approximating the true moments of the unknown density. For affine drift and diffusion functions, the moments are exactly those of the true transitional density and for nonlinear drift and diffusion functions the approximation is extremely good and is as effective as alternative methods based on likelihood approximations. The estimation procedure generalises to models with latent factors. A conditioning procedure is developed that allows parameter estimation in the absence of proxies.
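A one‐dimensional affine illustration of the approach: for an Ornstein–Uhlenbeck diffusion the Gaussian transition moments are exact, so the quasi‐likelihood coincides with the true likelihood. A minimal Python sketch (function name and parameter layout are illustrative; the paper treats multivariate and nonlinear drift/diffusion functions):

    import numpy as np

    def ou_quasi_loglik(x, dt, kappa, theta, sigma):
        # Gaussian quasi-likelihood for dX = kappa*(theta - X) dt + sigma dW,
        # observed at spacing dt.  Exact transition moments for this affine model:
        #   mean = theta + (x_t - theta) * exp(-kappa * dt)
        #   var  = sigma**2 * (1 - exp(-2 * kappa * dt)) / (2 * kappa)
        m = theta + (x[:-1] - theta) * np.exp(-kappa * dt)
        v = sigma ** 2 * (1.0 - np.exp(-2.0 * kappa * dt)) / (2.0 * kappa)
        resid = x[1:] - m
        return -0.5 * np.sum(np.log(2 * np.pi * v) + resid ** 2 / v)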

15.
A well-known difficulty in estimating conditional moment restrictions is that the parameters of interest need not be globally identified by the implied unconditional moments. In this paper, we propose an approach to constructing a continuum of unconditional moments that can ensure parameter identifiability. These unconditional moments depend on the “instruments” generated from a “generically comprehensively revealing” function, and they are further projected along the exponential Fourier series. The objective function is based on the resulting Fourier coefficients, from which an estimator can be easily computed. A novel feature of our method is that the full continuum of unconditional moments is incorporated into each Fourier coefficient. We show that, when the number of Fourier coefficients in the objective function grows at a proper rate, the proposed estimator is consistent and asymptotically normally distributed. An efficient estimator is also readily obtained via the conventional two-step GMM method. Our simulations confirm that the proposed estimator compares favorably with that of Domínguez and Lobato (2004, Econometrica) in terms of bias, standard error, and mean squared error.

16.
Engineering Process Controllers (EPC) are frequently based on parametrized models. If process conditions change, the parameter estimates used by the controllers may become biased, and the quality characteristics will be affected. To detect such changes it is adequate to use Statistical Process Control (SPC) methods. The run length statistic is commonly used to describe the performance of an SPC chart. This paper develops approximations for the first two moments of the run length distribution of a one-sided Shewhart chart used to detect two types of process changes in a system that is regulated by a given EPC scheme: i) changes in the level parameter; ii) changes in the drift parameter. If the drift parameter shifts, it is further assumed that the form of the drift process changes from a linear trend under white noise (the in-control drift model) into a random walk with drift model. Two different approximations for the run length moments are presented and their accuracy is numerically analyzed.
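For orientation, the run‐length moments are simple in the memoryless benchmark case of i.i.d. normal data with a constant level shift, where the run length is geometric. A minimal Python sketch of that benchmark (the paper treats the harder EPC‐regulated, autocorrelated case, for which the moments are only approximated; the function name is illustrative):

    import numpy as np
    from scipy import stats

    def shewhart_run_length_moments(shift, L=3.0):
        # One-sided Shewhart chart with upper control limit at L sigma, monitoring
        # i.i.d. N(shift, 1) data.  Run length is geometric with p = P(X > L), so
        # ARL = 1/p and Var(RL) = (1 - p) / p**2.
        p = stats.norm.sf(L - shift)
        arl = 1.0 / p
        var_rl = (1.0 - p) / p ** 2
        return arl, var_rl

    # in-control (shift = 0, ARL about 740 for L = 3) vs. a one-sigma level shift
    print(shewhart_run_length_moments(0.0))
    print(shewhart_run_length_moments(1.0))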

17.
A Simple Derivation of Moments of the Exponentiated Weibull Distribution
Amit Choudhury, Metrika, 2005, 62(1): 17–22
The Exponentiated Weibull family is an extension of the Weibull family obtained by adding an additional shape parameter. The beauty and importance of this distribution lies in its ability to model monotone as well as non-monotone failure rates which are quite common in reliability and biological studies. As with any other distribution, many of its interesting characteristics and features can be studied through moments. Presently, moments of this distribution are available only under certain restrictions. In this paper, a general derivation of moments without any restriction whatsoever is proposed. A compact expression for moments is presented.
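Moments of the exponentiated Weibull are easy to check numerically against any closed‐form expression. A minimal Python sketch using quadrature of the density with cdf F(x) = [1 − exp(−(x/σ)^α)]^θ; the function name and example values are illustrative, not the paper's derivation:

    import numpy as np
    from scipy import integrate

    def exp_weibull_moment(r, alpha, theta, sigma=1.0):
        # r-th raw moment of the exponentiated Weibull distribution, computed by
        # numerical quadrature of x**r times the density on (0, infinity).
        def pdf(x):
            u = np.exp(-(x / sigma) ** alpha)
            return (alpha * theta / sigma) * (x / sigma) ** (alpha - 1) * u * (1 - u) ** (theta - 1)
        val, _ = integrate.quad(lambda x: x ** r * pdf(x), 0, np.inf)
        return val

    # theta = 1 reduces to the ordinary Weibull, whose mean is sigma * Gamma(1 + 1/alpha)
    print(exp_weibull_moment(1, alpha=2.0, theta=1.0))   # ~ 0.8862 = Gamma(1.5)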

18.
The present paper obtains the nonnull distribution of the product moment correlation coefficient r when the sample is drawn from a mixture of two bivariate Gaussian distributions. The moments of 1 − r^2 have been used to derive the nonnull density of r.
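A quick numerical companion to such a nonnull density is Monte Carlo simulation of r under a mixture model. A minimal Python sketch; for simplicity both mixture components here are standard bivariate normals differing only in correlation (the paper's mixture components are more general), and all names are illustrative:

    import numpy as np

    rng = np.random.default_rng(1)

    def sample_r(n, p, rho1, rho2, size=10_000):
        # Monte Carlo draws of the sample correlation coefficient r when each of the
        # n observations comes from component 1 (correlation rho1) with probability p
        # and from component 2 (correlation rho2) otherwise.
        out = np.empty(size)
        for s in range(size):
            comp = rng.random(n) < p
            rho = np.where(comp, rho1, rho2)
            x = rng.standard_normal(n)
            y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal(n)
            out[s] = np.corrcoef(x, y)[0, 1]
        return out

    r_draws = sample_r(n=25, p=0.7, rho1=0.8, rho2=0.0)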

19.
This paper studies efficient designs for simultaneous model discrimination among polynomial regression models up to degree k. Based on an optimality criterion proposed by Dette (Ann Stat 22:890–903, 1994), a maximin optimal discriminating design is derived in terms of canonical moments. Theoretical and numerical results show that the proposed design performs well for model discrimination in most of the considered models.

20.
We examine the asymptotic behavior of two strategyproof mechanisms discussed by Moulin for public goods – the conservative equal costs rule (CER) and the serial cost sharing rule (SCSR) – and compare their performance to that of the pivotal mechanism (PM) from the Clarke–Groves family. Allowing the individuals’ valuations for an excludable public project to be random variables, we show under very general assumptions that expected welfare loss generated by the CER, as the size of the population increases, becomes arbitrarily large. However, all moments of the SCSR’s random welfare loss asymptotically converge to zero. The PM does better than the SCSR, with its welfare loss converging even more rapidly to zero.

