Similar Literature
20 similar articles found (search time: 710 ms)
1.
This paper presents a new framework which generalizes the concept of conditional expectation to mean values that are implicitly defined as unique solutions to some functional equation. We call such a mean value an implicit mean. The implicit mean and its very special example, the quasi-linear mean, have been extensively applied in economics and decision theory. This paper provides a procedure for defining the conditional implicit mean and then analyzes its properties. In particular, we show that the conditional implicit mean is in general “biased” in the sense that an analogue of the law of iterated expectations does not hold, and we characterize the quasi-linear mean as the only implicit mean that is “unbiased”.

2.
This paper considers the regression model y = Xβ + ε with all the classical assumptions (including normality) but one, viz. it is assumed that the covariance matrix of the disturbances depends upon a finite number of unknown parameters θ1, …, θm. The paper gives a method to derive simultaneously the maximum likelihood estimates of β and θ. The information matrix is also presented. It is proved that the ML estimator of β is unbiased if its mean exists. Conditions are given under which the maximum likelihood estimates are consistent, asymptotically normal, and asymptotically efficient. Finally, applications are given to the autocorrelated errors model and to Zellner-type regressions.
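As a toy illustration of the estimation problem described above (entirely our own construction, not the paper's model), the sketch below jointly estimates β and a single covariance parameter θ for a model whose disturbance variance is exp(θ·z_i): for each candidate θ, β is profiled out by weighted least squares, and θ is chosen to maximize the Gaussian log-likelihood over a grid.

```python
import math

# Hypothetical one-regressor model y_i = beta * x_i + eps_i with
# var(eps_i) = exp(theta * z_i); names and setup are illustrative.

def wls_beta(x, y, w):
    """Weighted least squares slope for the no-intercept model."""
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    return sxy / sxx

def log_lik(theta, x, y, z):
    """Gaussian log-likelihood with beta concentrated out by WLS."""
    var = [math.exp(theta * zi) for zi in z]
    w = [1.0 / v for v in var]
    beta = wls_beta(x, y, w)
    resid = [yi - beta * xi for xi, yi in zip(x, y)]
    return -0.5 * sum(math.log(2 * math.pi * v) + r * r / v
                      for v, r in zip(var, resid))

def fit(x, y, z, grid):
    """Grid-search theta, then recompute beta at the chosen theta."""
    theta_hat = max(grid, key=lambda t: log_lik(t, x, y, z))
    w = [math.exp(-theta_hat * zi) for zi in z]
    return wls_beta(x, y, w), theta_hat

# Toy data generated roughly from beta = 2:
x = [1, 2, 3, 4, 5, 6]
z = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 12.0]
grid = [i / 10 for i in range(-20, 21)]
beta_hat, theta_hat = fit(x, y, z, grid)
```

A real implementation would use a proper optimizer and a full (non-diagonal) covariance matrix rather than a grid search over a single variance parameter.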

3.
Michael Kohler, Metrika 1998, 47(1): 147-163
Let (X, Y) be a pair of random variables with supp(X) ⊆ [0,1]^l and EY² &lt; ∞. Let m* be the best approximation of the regression function of (X, Y) by sums of functions of at most d variables (1 ≤ d ≤ l). Estimation of m* from i.i.d. data is considered. For the estimation, interaction least squares splines, defined as sums of polynomial tensor product splines of at most d variables, are used. The knot sequences of the tensor product splines are chosen equidistant. Complexity regularization is used to choose the number of knots and the degree of the splines automatically, using only the given data. Without any additional condition on the distribution of (X, Y), the weak and strong L²-consistency of the estimate is shown. Furthermore, for every p ≥ 1 and every distribution of (X, Y) with supp(X) ⊆ [0,1]^l, Y bounded and m* p-smooth, the integrated squared error of the estimate achieves, up to a logarithmic factor, the optimal rate of convergence.

4.
This paper considers tests of the effectiveness of a policy intervention, defined as a change in the parameters of a policy rule, in the context of a macroeconometric dynamic stochastic general equilibrium (DSGE) model. We consider two types of intervention: first, the standard case of a parameter change that does not alter the steady state, and second, one that does alter the steady state, e.g. the target rate of inflation. We consider two types of test: a multi-horizon test, where the post-intervention policy horizon, H, is small and fixed, and a mean policy effect test, where H is allowed to increase without bound. The multi-horizon test requires Gaussian errors, but the mean policy effect test does not. It is shown that neither of these tests is consistent, in the sense that the power of the tests does not tend to unity as H→∞, unless the intervention alters the steady state. This follows directly from the fact that DSGE variables are measured as deviations from the steady state, and the effects of a policy change on target variables decay exponentially fast. We investigate the size and power of the proposed mean effect test by simulating a standard three-equation New Keynesian DSGE model. The simulation results are in line with our theoretical findings and show that in all applications the tests have the correct size; but unless the intervention alters the steady state, their power does not go to unity with H.

5.
Consider a non-homogeneous Poisson process N(t) with mean value function Λ(t) and intensity function λ(t). A conditional test of the hypothesis that the process is homogeneous, versus alternatives for which Λ(t) is superadditive, was proposed by Hollander and Proschan (1974). Proposing a new test for superadditivity of Λ(t), Kochar and Ramallingam (1989) observed that the Pitman asymptotic relative efficiency of their test with respect to the Hollander-Proschan test is unity. In order to distinguish between these competing tests, we compute the exact Bahadur slopes of both tests for important alternatives and demonstrate that the new test has high Bahadur efficiency relative to the test of Hollander and Proschan.

6.
What makes a fact appear historical to a historian is its preterite nature — its place in past time. What can be more natural than to assume that its position in past time is sufficient not only to locate an historical fact but also to explain it? The argument that post hoc is propter hoc comes naturally to historians (Postan, 1962).

7.
We propose a quasi-Bayesian nonparametric approach to estimating the structural relationship φ among endogenous variables when instruments are available. We show that the posterior distribution of φ is inconsistent in the frequentist sense. We interpret this fact as the ill-posedness of the Bayesian inverse problem defined by the relation that characterizes the structural function φ. To solve this problem, we construct a regularized posterior distribution, based on a Tikhonov regularization of the inverse of the marginal variance of the sample, which is justified by a penalized projection argument. This regularized posterior distribution is consistent in the frequentist sense and its mean can be interpreted as the mean of the exact posterior distribution resulting from a Gaussian prior distribution with a shrinking covariance operator.
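The Tikhonov regularization mentioned above can be illustrated on a deliberately ill-conditioned finite-dimensional system (a generic sketch of the technique, not the paper's infinite-dimensional operator setting): the regularized solution (AᵀA + αI)⁻¹Aᵀb remains stable where the naive inverse would be badly behaved.

```python
# Pure-Python Tikhonov (ridge) regularization on a nearly singular system.

def mat_t(a):
    """Transpose of a matrix stored as a list of rows."""
    return [list(col) for col in zip(*a)]

def mat_mul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def solve(a, b):
    """Gaussian elimination with partial pivoting; b is a flat vector."""
    n = len(a)
    m = [row[:] + [bv] for row, bv in zip(a, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            m[r] = [mr - f * mi for mr, mi in zip(m[r], m[i])]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (m[i][n] - sum(m[i][j] * x[j] for j in range(i + 1, n))) / m[i][i]
    return x

def tikhonov(a, b, alpha):
    """Solve (A'A + alpha I) x = A'b, the Tikhonov-regularized solution."""
    at = mat_t(a)
    ata = mat_mul(at, a)
    for i in range(len(ata)):
        ata[i][i] += alpha
    atb = [sum(r * bv for r, bv in zip(row, b)) for row in at]
    return solve(ata, atb)

# Nearly singular design: the second column almost equals the first.
A = [[1.0, 1.0], [1.0, 1.0001]]
b = [2.0, 2.0001]
x_reg = tikhonov(A, b, alpha=1e-3)
```

The regularization parameter α trades bias for stability, mirroring the shrinking prior covariance in the abstract above.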

8.
Yijun Zuo, Metrika 2000, 52(1): 69-75
Finite sample tail behavior of the Tukey-Donoho halfspace depth based multivariate trimmed mean is investigated with respect to a tail performance measure. It turns out that the tails of the sampling distribution of the α-depth-trimmed mean approach zero at least ⌊αn⌋ times as fast as the tails of the underlying population distribution, and could do so n − ⌊αn⌋ + 1 times as fast. In addition, there is an intimate relationship among the tail behavior, the halfspace depth, and the finite sample breakdown point of the estimator. It is shown that the lower tail performance bound of the depth-based trimmed mean is essentially the same as its halfspace depth and breakdown point. This finding offers a new insight into the notion of halfspace depth and extends the important role of tail behavior as a quantitative assessment of robustness from the regression (He, Jurečková, Koenker and Portnoy (1990)) and univariate location settings (Jurečková (1981)) to the multivariate location setting. Received: 1 July 1999
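In one dimension the halfspace depth simplifies enough that the depth-trimmed mean can be sketched in a few lines (our simplification; the multivariate case requires minimizing over all halfspaces through each point):

```python
# 1-D halfspace depth: min(#{x_j <= x}, #{x_j >= x}) / n.
# The alpha-depth-trimmed mean averages the points of depth >= alpha.

def halfspace_depth_1d(x, data):
    n = len(data)
    below = sum(1 for v in data if v <= x)
    above = sum(1 for v in data if v >= x)
    return min(below, above) / n

def depth_trimmed_mean(data, alpha):
    kept = [x for x in data if halfspace_depth_1d(x, data) >= alpha]
    return sum(kept) / len(kept)

data = [1.0, 2.0, 3.0, 4.0, 5.0, 100.0]   # one gross outlier
m_trim = depth_trimmed_mean(data, alpha=0.3)
```

The gross outlier has depth 1/6 and is discarded, illustrating why the trimmed mean's tails decay faster than those of the underlying distribution.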

9.
Bullying in schools can be defined as a category of aggressive behaviour marked by an imbalance of power, in which the aggression is repeated over time. Bullying is social in nature and takes place in groups. Attacks are mostly unprovoked, and can be physical or verbal, direct or indirect. This paper focuses on modelling the propagation of bullying in the Spanish school population aged 12-18 during the period July 2015-January 2020, and on identifying and quantifying its main drivers. A population dynamics model is built to forecast and quantify the magnitude of the bullying problem in Spain over this period by taking into account qualitative and quantitative factors, e.g., demography, the economy, socio-cultural behaviour, consumption of drugs and alcohol, social contagion, and technology. The study provides recommendations to reduce and prevent the growth of this social problem, and also to mitigate a correlated problem, intimate partner violence among adults. Indeed, one of the main uses of the model is that new policies can be simulated and their effects observed.

10.
Let X, X1, ..., Xk be i.i.d. random variables, and for k ∈ ℕ let Dk(X) = E(X1 ∨ ... ∨ Xk+1) − EX be the kth centralized maximal moment. A sharp lower bound is given for D1(X) in terms of the Lévy concentration Ql(X) = sup{x∈ℝ} P(X ∈ [x, x + l]). This inequality, which is analogous to P. Lévy's concentration-variance inequality, illustrates the fact that maximal moments are a gauge of how spread out the underlying distribution is. It is also shown that the centralized maximal moments are increased under convolution.
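The definition above is easy to check numerically. The sketch below (our illustration) estimates Dk(X) by Monte Carlo for X ~ Uniform(0,1), where the exact value E(X1 ∨ ... ∨ Xk+1) − EX = (k+1)/(k+2) − 1/2 is available for comparison, e.g. D1 = 1/6.

```python
import random

# Monte Carlo estimate of the k-th centralized maximal moment
# D_k(X) = E(X_1 v ... v X_{k+1}) - EX for X ~ Uniform(0,1).

def d_k_mc(k, n_sim=20000, rng=random.Random(1)):
    # Default rng is created once, so repeated calls continue the
    # same deterministic stream.
    total = 0.0
    for _ in range(n_sim):
        total += max(rng.random() for _ in range(k + 1))
    return total / n_sim - 0.5   # EX = 1/2 for Uniform(0,1)

d1 = d_k_mc(1)   # exact value 2/3 - 1/2 = 1/6
d2 = d_k_mc(2)   # exact value 3/4 - 1/2 = 1/4
```

That D2 exceeds D1 reflects the monotonicity of maximal moments: a maximum over more copies is stochastically larger.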

11.
We study competitive interaction between two alternative models of digital content distribution over the Internet: peer-to-peer (p2p) file sharing and centralized client-server distribution. We present microfoundations for a stylized model of p2p file sharing where all peers are endowed with standard preferences and show that the endogenous structure of the network is conducive to sharing by a significant number of peers, even if sharing is costlier than free-riding. We build on this model of p2p to analyze the optimal strategy of a profit-maximizing firm, such as Apple, that offers content available at positive prices. We characterize the size of the p2p network as a function of the firm's pricing strategy, and show that the firm may be better off setting high prices, allowing the network to survive, and that the p2p network may work more efficiently in the presence of the firm than in its absence.

12.
Summary. A new multivariate kernel probability density estimator is introduced and its strong uniform consistency is proved under certain regularity conditions. This result is then applied to a kernel estimator whose mean vector and covariance matrix are μn and Vn, respectively, where μn is an unspecified estimator of the mean vector and Vn is, up to a multiplicative constant, the sample covariance matrix of the probability density to be estimated. Work supported by the Natural Sciences and Engineering Research Council of Canada and by the Fonds F.C.A.R. of the Province of Quebec.
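A one-dimensional sketch of the idea (our simplification of the multivariate estimator) uses a Gaussian kernel whose bandwidth is the sample standard deviation times Silverman's rule-of-thumb constant, so the smoothing is the sample covariance up to a multiplicative constant:

```python
import math

# Gaussian kernel density estimate with a data-driven bandwidth
# proportional to the sample standard deviation.

def kde(data, x):
    n = len(data)
    mean = sum(data) / n
    var = sum((v - mean) ** 2 for v in data) / (n - 1)
    h = math.sqrt(var) * (4.0 / (3.0 * n)) ** 0.2   # Silverman's rule of thumb
    return sum(math.exp(-0.5 * ((x - v) / h) ** 2)
               for v in data) / (n * h * math.sqrt(2 * math.pi))

data = [0.5, 1.1, 1.9, 2.4, 3.0, 3.6, 4.2, 5.0]
```

The estimate is a proper density (it integrates to one) and concentrates mass where the sample does.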

13.
Monitoring the mean and the variance of a stationary process
We deal with the problem of how deviations in the mean or the variance of a time series can be detected. Several simultaneous control charts are introduced which are based on EWMA (exponentially weighted moving average) statistics for the mean and the empirical variance. The combined X-S² EWMA chart is extended to time series, and further simultaneous charts are considered. The comparison of these schemes shows that the residual-based approach should be favored when a variance change is present.
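A minimal version of simultaneous EWMA monitoring of the mean and the variance can be sketched as follows (a textbook-style illustration with arbitrary control limits, not the paper's exact scheme):

```python
# Two EWMA recursions run side by side: one smooths the observations
# (level), one smooths squared deviations from the in-control mean
# (variance).  An alarm fires when either leaves its control band.

def ewma_alarms(xs, mu0=0.0, sigma0sq=1.0, lam=0.2, c_mean=0.8, c_var=1.2):
    zm, zv = mu0, sigma0sq
    alarms = []
    for t, x in enumerate(xs):
        zm = (1 - lam) * zm + lam * x                  # EWMA of the level
        zv = (1 - lam) * zv + lam * (x - mu0) ** 2     # EWMA of squared deviation
        if abs(zm - mu0) > c_mean or abs(zv - sigma0sq) > c_var:
            alarms.append(t)
    return alarms

# In-control data followed by a mean shift of size 3 from t = 10 on:
xs = [0.1, -0.2, 0.3, -0.1, 0.2, 0.0, -0.3, 0.1, 0.2, -0.1] + [3.0] * 10
alarms = ewma_alarms(xs)
```

With these illustrative limits the chart stays silent through the in-control phase and signals shortly after the shift; in practice the limits would be calibrated to a target in-control average run length.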

14.
In this article, asymptotic inference for the mean of i.i.d. observations in the context of heavy-tailed distributions is discussed. While both the standard asymptotic method based on the normal approximation and Efron's bootstrap are inconsistent when the underlying distribution does not possess a second moment, we propose two approaches based on the subsampling idea of Politis and Romano (1994) which give correct answers. The first approach uses the fact that the sample mean, properly standardized, will under some regularity conditions have a limiting stable distribution. The second approach consists of subsampling the usual t-statistic and is somewhat more general. A simulation study compares the small-sample performance of the two methods. Received: December 1998
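The second approach, subsampling the t-statistic, can be sketched as follows (constants such as the subsample size b and the number of subsamples are illustrative choices of ours, not the paper's):

```python
import math
import random

# Approximate the sampling distribution of the t-statistic by recomputing
# it on many random subsamples of size b << n, centered at the full-sample
# mean, then invert the empirical quantiles into a confidence interval.

def t_stat(sample, mu):
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    return (mean - mu) / math.sqrt(var / n)

def subsample_ci(data, b=20, n_sub=500, level=0.95, rng=random.Random(7)):
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    se = math.sqrt(var / n)
    stats = sorted(t_stat(rng.sample(data, b), mean) for _ in range(n_sub))
    lo_q = stats[int(n_sub * (1 - level) / 2)]
    hi_q = stats[int(n_sub * (1 + level) / 2) - 1]
    return mean - hi_q * se, mean - lo_q * se

rng = random.Random(3)
data = [rng.paretovariate(1.5) for _ in range(200)]   # heavy-tailed sample
lo, hi = subsample_ci(data)
```

Unlike the full-resample bootstrap, each subsample statistic is computed from a genuine (smaller) sample from the true distribution, which is what restores consistency under heavy tails.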

15.
Pentecostalism is an American creation that has been exported to various parts of the world. Its message of hope is produced and promoted by churches led by religious entrepreneurial salespeople who create value for customers through marketing techniques that facilitate the blending of the local and the global, the sacred and the profane. Its success in Africa has been attributed to the African penchant for connecting every human experience to the religious. Using the marketing strategic framework of target market and the four Ps as an organizing tool, we draw on a year-long ethnography in Kumasi, Ghana, to suggest that Pentecostalism in Ghana is marketed as an effective route to material success without risk to postmortem salvation. While the effective blend of the sacred and the profane allows for open public discourse, it fails to deliver on its promise of economic salvation for the poor. In fact, it causes the condition of the poor to deteriorate even further through application of its fundamental requirement: give and God will bless you according to the extent of your giving. We conclude that the new African Pentecostalism and its open embrace of contemporary consumption and materialism have arrested local religiosity and imposed an “economic imperialism” that impregnates every human experience with a commercial character. Copyright © 2010 John Wiley & Sons, Ltd.

16.
17.
Meta-analysis has developed into one of the most important tools in evaluation research. Heterogeneity is present in almost any meta-analysis, but its magnitude differs across meta-analyses. In this respect, Higgins' \(I^2\) has emerged as one of the most used and, potentially, most useful measures, as it quantifies the amount of heterogeneity involved in a given meta-analysis. Higgins' \(I^2\) is conventionally interpreted, in the sense of a variance component analysis, as the proportion of total variance due to heterogeneity. However, this interpretation is not entirely justified, as the second part involved in defining the total variation, usually denoted \(s^2\), is not an average of the study-specific variances but some other function of them. We show that \(s^2\) is asymptotically identical to the harmonic mean of the study-specific variances and, for any number of studies, is at least as large as the harmonic mean, with the inequality being sharp if all study-specific variances agree. This justifies, from our point of view, the interpretation of explained variance, at least for meta-analyses with a larger number of component studies or small variation in study-specific variances. These points are illustrated by a number of empirical meta-analyses as well as simulation work.
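The claimed relation between \(s^2\) and the harmonic mean of the study-specific variances is easy to verify numerically (our illustration, using the standard Higgins-Thompson formula for \(s^2\)):

```python
# Higgins' "typical" within-study variance
#   s^2 = (k - 1) * sum(w) / (sum(w)^2 - sum(w^2)),  w_i = 1 / v_i,
# compared against the harmonic mean of the study variances v_i.

def s_squared(variances):
    w = [1.0 / v for v in variances]
    k = len(w)
    sw = sum(w)
    sw2 = sum(x * x for x in w)
    return (k - 1) * sw / (sw * sw - sw2)

def harmonic_mean(variances):
    return len(variances) / sum(1.0 / v for v in variances)

unequal = [0.5, 1.0, 2.0, 4.0]   # heterogeneous study variances
equal = [1.5, 1.5, 1.5]          # all study variances agree
```

For the unequal variances, s² = 11.25 / 8.75 ≈ 1.286 strictly exceeds the harmonic mean 4 / 3.75 ≈ 1.067; when all variances agree, the two coincide exactly, matching the sharpness claim above.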

18.
A Bayesian-like estimator of the process capability index Cpmk
W. L. Pearn, G. H. Lin, Metrika 2003, 57(3): 303-312
Pearn et al. (1992) proposed the capability index Cpmk, and investigated the statistical properties of its natural estimator for stable normal processes with constant mean μ. Chen and Hsu (1995) showed that under general conditions the asymptotic distribution of the natural estimator is normal if μ≠m, and is a linear combination of the normal and folded-normal distributions if μ=m, where m is the mid-point between the upper and lower specification limits. In this paper, we consider a new estimator for stable processes under a different (more realistic) condition on the process mean, namely P(μ≥m)=p, 0≤p≤1. We obtain the exact distribution, the expected value, and the variance of the new estimator under the normality assumption. We show that for P(μ≥m)=0 or 1, the new estimator is the MLE of Cpmk, which is asymptotically efficient. In addition, we show that under general conditions the new estimator is consistent and asymptotically unbiased. We also show that its asymptotic distribution is a mixture of two normal distributions. The research was partially supported by the National Science Council of the Republic of China (NSC-89-2213-E-346-003).
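Using the definition of Cpmk from Pearn et al. (1992), Cpmk = (d − |μ − m|) / (3·sqrt(σ² + (μ − T)²)) with d = (USL − LSL)/2 and m = (USL + LSL)/2, the natural plug-in estimator can be sketched as follows (the data and specification limits are our own toy values; the target T defaults to m, the common choice):

```python
import math

# Plug-in estimator of Cpmk: replace mu and sigma^2 by the sample mean
# and (MLE-style) sample variance in the definition.

def cpmk_hat(data, lsl, usl, target=None):
    m = (usl + lsl) / 2.0          # mid-point of the specification limits
    d = (usl - lsl) / 2.0          # half-width of the specification band
    t = m if target is None else target
    n = len(data)
    xbar = sum(data) / n
    s2 = sum((x - xbar) ** 2 for x in data) / n
    return (d - abs(xbar - m)) / (3.0 * math.sqrt(s2 + (xbar - t) ** 2))

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.9, 10.1]
c = cpmk_hat(data, lsl=9.0, usl=11.0)
```

For this perfectly centered sample (x̄ = 10, s² = 0.015) the estimate is 1 / (3·sqrt(0.015)) ≈ 2.72; moving the target away from the sample mean inflates the denominator and lowers the index, as the definition intends.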

19.
A thorough study of X-party material flow (XPMF), its theory, and its applications is conducted in this research. The XPMF concept is an extension of material flow (MF) theory. To further elucidate that XPMF is a type of MF service model with a PMF (party, material, flow) fractal structure, and to describe the characteristics of XPMF, we develop a three-pyramid synergetic operational model of XPMF. Through the analysis of several cases, we find that OIR (objective relational grade, information sharing grade, and resource complementary grade) is the set of order parameters that controls and determines the formation of the new XPMF structure and its degree of order. This reveals the mechanism of XPMF formation and evolution. We also provide principles and methods for XPMF control.

20.
Even a cursory perusal of the social science literature suffices to show the prevalence of dichotomous thinking. Many of these dichotomies deal with some aspect of the “conceptual versus empirical” distinction. This paper shows that while dichotomies predominate, the actual research process that they are designed to represent deals minimally with three separate and necessary levels. We term these the conceptual level (X), the empirical level (X′), and the operational or indicator level (X″). This minimal three-level model is applied to an analysis of philosophical foundations of measurement, specifically the formulations of Northrop and Bridgman. It is shown that both of these formulations are essentially dichotomous, while the phenomena they deal with are trichotomous. For example, Northrop's “concepts by postulation” and “concepts by intuition” are purportedly separate levels connected by an epistemic correlation. Application of the three-level model reveals that both are true concepts, and thus belong on the same level of analysis (X). Similarly, application of the three-level model to Bridgman's formulation shows that both mental and physical concepts belong on the same level (X). Bridgman's formulation is valuable in pointing out that operations are not restricted to one level of analysis, and in fact we see them to be crucial on all three levels. The three-level model is not a panacea, but does provide an efficacious framework for the difficult but important task of analyzing the philosophical underpinnings of measurement.

