Related Articles
20 similar documents found.
1.
The main purpose of this paper is to give formal theorems and conditions under which Rice's Formula for the intensity of crossings of a fixed level u by a stationary process X(t) extends to provide the conditional (Palm) distribution of an associated process Y(t) at the level-crossing instants. Sections 1 and 2 provide brief background material on Rice's Formula and Palm distributions, and relevant historical comments and basic results are given in Section 3. The main results are stated with some indications of proof in Section 4, and the calculation of the Palm distributions is shown in Section 5. Two cases of importance in applications are considered: (a) where (roughly) X(t), Y(t) and the derivative X'(t) have a joint density, and (b) where Y(t) = X'(t). The resulting Palm distributions are each absolutely continuous with readily calculable densities. These are evaluated for Gaussian processes in Section 6, and applications to two motivating structural safety questions are indicated in Section 7.
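For context, a standard statement of Rice's Formula (the classical result, not the paper's extension) for the mean rate of upcrossings of a level u by a stationary process, together with its zero-mean Gaussian specialisation, reads:

```latex
% Rice's formula for the mean rate of upcrossings of level u by a stationary
% process X(t), with f_{X(0),X'(0)} the joint density of the process and its derivative:
\mu^{+}(u) \;=\; \int_{0}^{\infty} z\, f_{X(0),X'(0)}(u,z)\, dz .
% Zero-mean stationary Gaussian case, with \sigma^2=\operatorname{Var}X(0)
% and \dot\sigma^2=\operatorname{Var}X'(0):
\mu^{+}(u) \;=\; \frac{\dot\sigma}{2\pi\sigma}\,
  \exp\!\Bigl(-\frac{u^{2}}{2\sigma^{2}}\Bigr).
```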

2.
We propose new summary statistics for intensity-reweighted moment stationary point processes, that is, point processes with translation-invariant n-point correlation functions for all orders n, which generalise the well-known J-, empty space, and spherical Palm contact distribution functions. We represent these statistics in terms of generating functionals and relate the inhomogeneous J-function to the inhomogeneous reduced second moment function. Extensions to space-time and marked point processes are briefly discussed.
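For reference, the classical (stationary) J-function that the paper generalises is built from the nearest-neighbour distance distribution G and the empty-space function F; the inhomogeneous version replaces these with intensity-reweighted analogues:

```latex
% Classical J-function of a stationary point process (quoted for context):
J(r) \;=\; \frac{1 - G(r)}{1 - F(r)}, \qquad r \ge 0,\ F(r) < 1,
% where G is the nearest-neighbour distance distribution function and
% F is the empty-space (spherical contact) distribution function.
```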

3.
We review a rich class of point process models, Cox point processes, and illustrate why more than one observed point pattern is generally needed for parameter estimation. Furthermore, we introduce a new Cox point process model by treating the intensity function of the underlying Poisson point process as a random mixture of normal components. The behaviour and performance of the new model are compared with those of popular Cox point process models. The new model is exemplified with an application involving a single point pattern corresponding to earthquake events in California, USA.
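The following sketch illustrates how one realisation of such a Cox process might be simulated: draw a random mixture-of-normals intensity on the unit square, then generate an inhomogeneous Poisson pattern by thinning a dominating homogeneous process. All numerical settings (component priors, window, total mass) are illustrative assumptions, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Draw a random intensity: a mixture of bivariate normal kernels ---
# (number of components, weights, centres, scales and total mass are
#  hypothetical choices for illustration)
K = rng.poisson(5) + 1                      # random number of mixture components
centres = rng.uniform(0, 1, size=(K, 2))    # component centres in [0,1]^2
scales = rng.uniform(0.05, 0.15, size=K)    # component standard deviations
weights = rng.dirichlet(np.ones(K))         # random mixture weights
total_mass = 400.0                          # expected number of points

def intensity(xy):
    """Mixture-of-normals intensity evaluated at an (n, 2) array of locations."""
    lam = np.zeros(len(xy))
    for w, c, s in zip(weights, centres, scales):
        d2 = np.sum((xy - c) ** 2, axis=1)
        lam += w * np.exp(-d2 / (2 * s ** 2)) / (2 * np.pi * s ** 2)
    return total_mass * lam

# --- Given the intensity, simulate an inhomogeneous Poisson pattern by thinning ---
gx, gy = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
lam_max = intensity(np.column_stack([gx.ravel(), gy.ravel()])).max() * 1.05  # approx. bound

n_dom = rng.poisson(lam_max)                # dominating homogeneous process on the unit square
proposals = rng.uniform(0, 1, size=(n_dom, 2))
keep = rng.uniform(0, 1, n_dom) < intensity(proposals) / lam_max
points = proposals[keep]
print(f"one Cox realisation with {len(points)} points")
```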

4.
In the context of stationary point processes, measurements are usually made from a time point chosen at random or from an occurrence chosen at random; that is, either the stationary distribution P or its Palm distribution P° is the ruling probability measure. In this paper an approach is presented to bridge the gap between these distributions. We consider probability measures which assign zero probability to exactly the same events as P° and which have simple relations with P. Relations between P and P° are derived with these intermediate measures as bridges. With the resulting Radon-Nikodym densities several well-known results can be proved easily, and new results are derived. As a corollary of cross-ergodic theorems, a conditional version of the well-known inversion formula is proved. Several approximations of P° are considered, for instance the local characterization of P° as a limit of conditional probability measures P°_N. The total variation distance between P° and P°_N can be expressed in terms of the P-distribution function of the forward recurrence time.
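For context, the well-known inversion formula referred to in the abstract is usually stated as follows (a standard form, quoted here rather than taken from the paper):

```latex
% Inversion formula relating the stationary law P of a point process on the
% line with finite intensity \lambda to its Palm version P^{\circ}; here
% \theta_t denotes the time shift and T_1 the first point after the origin:
P(A) \;=\; \lambda\, \mathsf{E}^{\circ}\!\left[\int_{0}^{T_1}
            \mathbf{1}_{A}\circ\theta_t \, dt\right],
\qquad A \ \text{measurable}.
```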

5.
The use of properties of a Poisson process to study the randomness of stars is traced back to a 1767 paper. The process was used and rediscovered many times, and we mention some of the early scientific areas of application. The name "Poisson process" was first used in print in 1940, and we believe the term was coined in the corridors of Stockholm University some time between 1936 and 1939. We follow the early developments of doubly stochastic processes and cluster processes, and describe different efforts to apply the Markov property to point processes.

6.
Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions for general inhomogeneous spatio-temporal point processes and for inhomogeneous spatio-temporal Cox processes. Assuming spatio-temporal separability of the intensity function, we clarify different meanings of second-order spatio-temporal separability. One is second-order spatio-temporal independence and relates to log-Gaussian Cox processes with an additive covariance structure of the underlying spatio-temporal Gaussian process. Another concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply to simulated and real data.
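As a reading aid (standard notions, paraphrased here rather than quoted from the paper's exact definitions): first-order separability factorises the intensity, while one natural second-order notion factorises the pair correlation function into spatial and temporal parts:

```latex
% First-order spatio-temporal separability of the intensity:
\lambda(u,t) \;=\; \lambda_1(u)\,\lambda_2(t).
% One reading of second-order spatio-temporal separability: the pair
% correlation function factorises into purely spatial and temporal parts,
g\bigl((u,t),(v,s)\bigr) \;=\; g_{\mathrm{space}}(u,v)\,g_{\mathrm{time}}(t,s).
```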

7.
Over the last decade, there has been increasing interest in techniques for monitoring high-quality processes. Based upon the cumulative count of conforming (CCC) items, the geometric distribution is particularly useful in these cases. Nonetheless, in some processes the number of one or more types of defects on a nonconforming observation is also of great importance and must be monitored simultaneously. However, there are usually correlations between these two measures, which necessitates multi-attribute process monitoring. In the literature, assuming independence between the two measures and for cases in which there is only one type of defect on nonconforming items, the generalized Poisson distribution has been proposed to model such a problem, and the simultaneous use of two separate control charts (CCC and C charts) is recommended. In this paper, we propose a new methodology to monitor multi-attribute high-quality processes in which not only is there more than one type of defect on an observed nonconforming item but there is also a dependence structure between the two measures. To do this, we first transform the multi-attribute data so that their marginal probability distributions have almost zero skewness. Then, we estimate the transformed mean vector and covariance matrix and apply the well-known χ² control chart. To illustrate the proposed method and evaluate its performance, we use two simulated numerical examples and compare the results. The results of the simulation studies are encouraging.
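A minimal sketch of the kind of monitoring scheme described (the power-transform family, the way skewness is driven toward zero, and the simulated data are all assumptions for illustration; the paper's exact transformation may differ):

```python
import numpy as np
from scipy import stats, optimize

def near_symmetric_power(x):
    """Pick a power-transform exponent that drives sample skewness toward zero.
    (The transform family is an illustrative assumption; the abstract only says
    marginals are transformed to have almost zero skewness.)"""
    res = optimize.minimize_scalar(
        lambda a: abs(stats.skew(np.power(x + 1.0, a))),
        bounds=(0.05, 1.0), method="bounded")
    return res.x

def chi2_chart(data, alpha=0.0027):
    """Chi-square chart on jointly transformed attributes."""
    data = np.asarray(data, dtype=float)
    exponents = [near_symmetric_power(col) for col in data.T]
    z = np.column_stack([np.power(col + 1.0, a)
                         for col, a in zip(data.T, exponents)])
    mu = z.mean(axis=0)
    inv = np.linalg.inv(np.cov(z, rowvar=False))
    t2 = np.einsum("ij,jk,ik->i", z - mu, inv, z - mu)   # chi-square statistics
    ucl = stats.chi2.ppf(1 - alpha, df=z.shape[1])       # upper control limit
    return t2, ucl

# Hypothetical example: conforming-run counts plus counts of two defect types
rng = np.random.default_rng(0)
sample = np.column_stack([rng.geometric(0.01, 200),      # CCC-type counts
                          rng.poisson(2.0, 200),         # defects of type 1
                          rng.poisson(0.5, 200)])        # defects of type 2
t2, ucl = chi2_chart(sample)
print(f"{np.sum(t2 > ucl)} of {len(t2)} points above the control limit")
```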

8.
This paper introduces a simple method to construct a stationary process on the real line with a Pólya-type covariance function and with any infinitely divisible marginal distribution, by randomising the timescale of the increment of a second-order Lévy process with an appropriate positive random variable. With the construction method extended to the multivariate case, we construct vector stochastic processes with Pólya-type direct covariance functions and with any specified infinitely divisible marginal distributions. This makes available a new class of non-Gaussian vector stochastic processes with flexible correlation structure for use in modelling and simulation.
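For context, "Pólya-type" refers to covariance functions satisfying Pólya's classical sufficient criterion (a known result, not one established in the paper):

```latex
% Pólya's criterion: if \varphi:\mathbb{R}\to\mathbb{R} is even and continuous with
\varphi(0)=1, \qquad \varphi \ \text{convex and non-increasing on } [0,\infty),
\qquad \lim_{t\to\infty}\varphi(t) \ge 0,
% then \varphi is a valid characteristic function and hence a permissible
% correlation (covariance) function of a stationary process on the real line.
```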

9.
In this paper, we investigate the goodness-of-fit of three Lévy-process-based distributions, namely the Variance-Gamma (VG), Normal-Inverse Gaussian (NIG) and Generalized Hyperbolic (GH) distributions, and the probability distribution of the Heston model, to index returns of twenty developed and emerging stock markets. Furthermore, we extend our analysis by applying a Markov regime-switching model to identify normal and turbulent periods. Our findings indicate that the probability distribution of the Heston model performs well for emerging markets under full-sample estimation and retains its goodness of fit for high-volatility periods, as it explicitly accounts for the volatility process. On the other hand, the distributions of the Lévy processes, especially the VG and NIG distributions, generally improve upon the fit of the Heston model, particularly for developed markets and low-volatility periods. Furthermore, some distributions yield significantly large test statistics for some countries even though they fit well to other markets, which suggests that the properties of individual stock markets are crucial in identifying the best distribution for representing empirical returns.

10.
Consider a triangular array of mean-zero Gaussian random variables. Under some weak conditions, this paper proves that the partial sums and the point processes of exceedances formed by the array are asymptotically independent. For a standardized stationary Gaussian sequence, it is shown under some mild conditions that the point process of exceedances formed by the sequence (after centering at the sample mean) converges in distribution to a Poisson process and is asymptotically independent of the partial sums. Finally, the joint limiting distributions of the extreme order statistics and the partial sums are obtained.
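For background, the classical Poisson limit behind the second statement (Leadbetter's theory for Gaussian sequences, quoted here as context rather than as the paper's exact conditions) is:

```latex
% If \{X_i\} is a standardised stationary Gaussian sequence whose correlations
% satisfy Berman's condition r_n \log n \to 0, and the levels u_n satisfy
n\bigl(1-\Phi(u_n)\bigr) \;\longrightarrow\; \tau \in (0,\infty),
% then the point process of exceedance times
N_n(\cdot) \;=\; \sum_{i=1}^{n} \mathbf{1}\{X_i > u_n\}\,\delta_{i/n}(\cdot)
% converges in distribution to a homogeneous Poisson process with intensity \tau on (0,1].
```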

11.
Two-state models (working/failed or alive/dead) are widely used in reliability and survival analysis. In contrast, multi-state stochastic processes provide a richer framework for modeling and analyzing the progression of a process from an initial to a terminal state, allowing incorporation of more details of the process mechanism. We review multi-state models, focusing on time-homogeneous semi-Markov processes (SMPs), and then describe the statistical flowgraph framework, which comprises analysis methods and algorithms for computing quantities of interest such as the distribution of first passage times to a terminal state. These algorithms algebraically combine integral transforms of the waiting time distributions in each state and invert them to get the required results. The estimated transforms may be based on parametric distributions or on empirical distributions of sample transition data, which may be censored. The methods are illustrated with several applications.
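A small hypothetical example of the transform algebra the flowgraph framework relies on (the three-state graph below is illustrative, not one of the paper's applications):

```latex
% From state 0 the process moves to state 1 (waiting-time transform M_{01}(s));
% from state 1 it either reaches the absorbing state 2 with probability p
% (transform M_{12}(s)) or returns to 0 with probability 1-p (transform M_{10}(s)).
% Summing the geometric series of feedback cycles gives the transform of the
% first-passage time from 0 to 2:
M_{02}(s) \;=\; \frac{p\,M_{01}(s)\,M_{12}(s)}
                     {1-(1-p)\,M_{01}(s)\,M_{10}(s)},
% which is then inverted (numerically or symbolically) to recover the
% first-passage distribution.
```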

12.
Spatial marked point processes are models for systems of points that are randomly distributed in space and carry measured quantities called marks. This study deals with marking, that is, methods of constructing marked point processes from unmarked ones. The focus is on density-dependent marking, where the local point intensity affects the mark distribution. This study develops new markings for log-Gaussian Cox processes in which both the mean and the variance of the mark distribution depend on the local intensity. The mean, variance and mark correlation properties are presented for the new markings, and a Bayesian estimation procedure is suggested for statistical inference. The performance of the new approach is studied by means of simulation experiments. As an example, a tropical rainforest data set is modelled.
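A rough simulation sketch of density-dependent marking for a log-Gaussian Cox process (the covariance model, the grid approximation, and the specific mean/variance-versus-intensity forms are assumptions for illustration, not the markings proposed in the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# --- Latent Gaussian field on a coarse grid over [0,1]^2 ---
n_grid = 30
xs = (np.arange(n_grid) + 0.5) / n_grid
gx, gy = np.meshgrid(xs, xs)
cells = np.column_stack([gx.ravel(), gy.ravel()])

# Exponential covariance; parameters are illustrative, not fitted values
sigma2, scale, mu = 1.0, 0.1, np.log(200.0) - 0.5
dists = np.linalg.norm(cells[:, None, :] - cells[None, :, :], axis=-1)
chol = np.linalg.cholesky(sigma2 * np.exp(-dists / scale) + 1e-8 * np.eye(len(cells)))
z = chol @ rng.standard_normal(len(cells))
lam = np.exp(mu + z)                          # random intensity, E[Lambda(u)] = 200

# --- Points: Poisson counts per grid cell, uniform locations within each cell ---
counts = rng.poisson(lam / n_grid ** 2)       # cell area = 1 / n_grid^2
pts, local_lam = [], []
for c, cell, l in zip(counts, cells, lam):
    if c > 0:
        pts.append(cell + rng.uniform(-0.5 / n_grid, 0.5 / n_grid, size=(c, 2)))
        local_lam.append(np.full(c, l))
pts = np.vstack(pts)
local_lam = np.concatenate(local_lam)

# --- Density-dependent marks: both mean and variance depend on local intensity ---
# (the log-linear mean and linear standard deviation are illustrative assumptions)
rel = local_lam / local_lam.mean()
marks = rng.normal(2.0 - 0.5 * np.log(rel), 0.2 + 0.1 * rel)
print(f"{len(pts)} points; mark range {marks.min():.2f} to {marks.max():.2f}")
```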

13.
As well as arising naturally in the study of non-intersecting random paths, random spanning trees, and eigenvalues of random matrices, determinantal point processes (sometimes also called fermionic point processes) are relatively easy to simulate and provide a quite broad class of models that exhibit repulsion between points. The fundamental ingredient used to construct a determinantal point process is a kernel giving the pairwise interactions between points: the joint distribution of any number of points then has a simple expression in terms of determinants of certain matrices defined from this kernel. In this paper we initiate the study of an analogous class of point processes that are defined in terms of a kernel giving the interaction between 2M points for some integer M. The role of matrices is now played by 2M-dimensional “hypercubic” arrays, and the determinant is replaced by a suitable generalization of it to such arrays—Cayley’s first hyperdeterminant. We show that some of the desirable features of determinantal point processes continue to be exhibited by this generalization.
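For context, Cayley's first hyperdeterminant of a d-dimensional hypercubic array (quoted as the standard definition; the paper works with d = 2M) is:

```latex
% Cayley's first hyperdeterminant of a d-dimensional n x ... x n array
% A = (a_{i_1 \cdots i_d}):
\operatorname{Det}(A) \;=\; \frac{1}{n!}
  \sum_{\sigma_1,\dots,\sigma_d \in S_n}
  \Bigl(\prod_{k=1}^{d}\operatorname{sgn}\sigma_k\Bigr)
  \prod_{i=1}^{n} a_{\sigma_1(i)\,\sigma_2(i)\,\cdots\,\sigma_d(i)} .
% For d = 2 this reduces to the ordinary determinant, and it vanishes
% identically for odd d, which is why an even number 2M of indices is used.
```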

14.
The purpose of this paper is to characterize three commonly used double unit root tests in terms of their asymptotic local power. To this end, we study a class of nearly doubly integrated processes which in the limit behave as a weighted integral of a doubly indexed Ornstein-Uhlenbeck process. Based on a numerical examination of the analytical distributions, a comparison of the tests is made via their asymptotic local power functions.

15.
In the "statistical image analysis" project at CWI we have studied some spatial point patterns that originated from biological observations. These observations were the positions of so-called EGF receptors on the surface of human carcinoma cells.
We propose a stochastic model for these point patterns. Since the EGF receptors appear in clusters on the cell surface, we have opted for a Poisson cluster process as the model. We estimated the three parameters of this process by means of a method described by Diggle. We also assessed the statistical reliability of our estimates.
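A simulation sketch of one common three-parameter Poisson cluster model, the Thomas process (the Thomas form and the parameter values are assumptions; the abstract does not specify which cluster process or which variant of Diggle's minimum-contrast estimator was used):

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_thomas(kappa, mu, sigma, window=1.0):
    """Simulate a Thomas (Poisson cluster) process on a square window:
    Poisson parents, Poisson offspring counts, Gaussian displacements."""
    # Parent process: homogeneous Poisson with intensity kappa, simulated on
    # an enlarged window so that edge clusters still contribute offspring.
    pad = 4 * sigma
    side = window + 2 * pad
    n_parents = rng.poisson(kappa * side ** 2)
    parents = rng.uniform(-pad, window + pad, size=(n_parents, 2))

    offspring = []
    for p in parents:
        n_off = rng.poisson(mu)                       # offspring per parent
        if n_off:
            # isotropic Gaussian displacement around the parent
            offspring.append(p + sigma * rng.standard_normal((n_off, 2)))
    pts = np.vstack(offspring) if offspring else np.empty((0, 2))

    # Keep only offspring inside the observation window
    inside = np.all((pts >= 0) & (pts <= window), axis=1)
    return pts[inside]

# Hypothetical parameters: cluster intensity, mean cluster size, cluster spread
pattern = simulate_thomas(kappa=30, mu=8, sigma=0.02)
print(f"{len(pattern)} simulated receptor-like points")
```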

16.
Several authors have proposed stochastic and non-stochastic approximations to the maximum likelihood estimate (MLE) for Gibbs point processes in modelling spatial point patterns with pairwise interactions. The approximations are necessary because of the difficulty of evaluating the normalizing constant. In this paper, we first provide a review of methods which yield crude approximations to the MLE. We also review methods based on Markov chain Monte Carlo techniques for which exact MLE has become feasible. We then present a comparative simulation study of the performance of such methods of estimation based on two simulation techniques, the Gibbs sampler and the Metropolis-Hastings algorithm, carried out for the Strauss model.
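As an illustration of the Metropolis-Hastings route mentioned above, a minimal birth-death sampler for the Strauss process (a standard construction with illustrative parameter values; not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(4)

def strauss_mh(beta, gamma, r, n_iter=20_000, window=1.0):
    """Birth-death Metropolis-Hastings sampler for the Strauss process on a
    square window, with unnormalised density beta^n(x) * gamma^s_r(x)."""
    area = window ** 2
    pts = np.empty((0, 2))

    def t(u, x):
        """Number of points of x within distance r of location u."""
        if len(x) == 0:
            return 0
        return int(np.sum(np.linalg.norm(x - u, axis=1) < r))

    for _ in range(n_iter):
        n = len(pts)
        if rng.random() < 0.5:                        # propose a birth
            u = rng.uniform(0, window, size=2)
            lam = beta * gamma ** t(u, pts)           # Papangelou conditional intensity
            if rng.random() < min(1.0, lam * area / (n + 1)):
                pts = np.vstack([pts, u])
        elif n > 0:                                   # propose a death
            i = rng.integers(n)
            rest = np.delete(pts, i, axis=0)
            lam = beta * gamma ** t(pts[i], rest)
            if rng.random() < min(1.0, n / (lam * area)):
                pts = rest
    return pts

# Moderately inhibitory Strauss pattern (hypothetical parameters)
pattern = strauss_mh(beta=100, gamma=0.3, r=0.05)
print(f"{len(pattern)} points in the sampled Strauss pattern")
```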

17.
The Dirichlet distributions have long been the subject of intense scrutiny in statistics and probability. Despite the enormous interest in, and wide-ranging applications of, these distributions, little appears to be known about their history. In this article we review the development of the Dirichlet distributions and their companions, the Liouville distributions. After reviewing some integral formulas of Dirichlet and Liouville, we survey the theory and applications of these distributions in statistics.
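One of the integral formulas of Dirichlet alluded to above, which gives the normalising constant of the Dirichlet distribution (a standard identity, quoted for context):

```latex
\int_{\mathcal T_n} \prod_{i=1}^{n} x_i^{a_i-1}
   \Bigl(1-\sum_{i=1}^{n} x_i\Bigr)^{a_{n+1}-1} dx_1\cdots dx_n
\;=\; \frac{\prod_{i=1}^{n+1}\Gamma(a_i)}{\Gamma\!\bigl(\sum_{i=1}^{n+1} a_i\bigr)},
\qquad \mathcal T_n=\Bigl\{x_i>0,\ \sum_{i=1}^{n} x_i<1\Bigr\},\ a_i>0 .
```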

18.
19.
In spite of Taguchi's robust parameter design (Introduction to Quality Engineering: Designing Quality into Products and Processes, 1986, Asian Productivity Organization, Tokyo), tolerance design remains important at the design stage of products and processes. Taguchi's proposal and related methods for tolerance design, however, do not efficiently use the information that can be obtained from the parameter design experiment. In this paper, we introduce a new method for tolerance design based on the response surface approach to parameter design. It is a flexible method because non-normal distributions of the noise factors and of the quality characteristic are allowed. Moreover, it is unnecessary to perform a new physical experiment. Essentially, the tolerances of the noise factors are maximized, subject to constraints ensuring that the mean value of the quality characteristic remains on target and that the fraction nonconforming is below a pre-specified maximum. Some aspects of model uncertainty are discussed, and the method is illustrated by means of an example.
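A heavily simplified sketch of the constrained-maximisation idea (the response surface, the uniform noise distributions, the normal approximation to the nonconforming fraction, and all numerical values are assumptions for illustration; the paper's method is more general):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(5)

# Hypothetical response surface fitted in a parameter-design experiment:
# quality characteristic y as a function of two noise factors z1, z2.
def response(z):
    z1, z2 = z[..., 0], z[..., 1]
    return 10.0 + 0.8 * z1 - 0.5 * z2 + 0.3 * z1 * z2

TARGET, LSL, USL, P_MAX = 10.0, 9.0, 11.0, 0.001
U = rng.uniform(-1.0, 1.0, size=(20_000, 2))   # common random numbers

def moments(tol):
    """Monte Carlo mean and std of y when noise factor j varies uniformly in +/- tol_j."""
    y = response(U * tol)
    return y.mean(), y.std()

def frac_nonconforming(tol):
    """Normal approximation to the out-of-spec fraction (a simplification)."""
    m, s = moments(tol)
    return norm.cdf((LSL - m) / s) + norm.sf((USL - m) / s)

res = minimize(
    lambda tol: -np.sum(tol),                  # maximise total noise-factor tolerance
    x0=np.array([0.2, 0.2]),
    bounds=[(0.01, 2.0), (0.01, 2.0)],
    constraints=[
        {"type": "ineq", "fun": lambda tol: 0.05 - abs(moments(tol)[0] - TARGET)},
        {"type": "ineq", "fun": lambda tol: P_MAX - frac_nonconforming(tol)}],
    method="SLSQP")
print("optimised tolerances:", np.round(res.x, 3))
```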

20.
Using frequency distributions of daily closing-price time series of several financial market indices, we investigate whether the bias away from an equiprobable sequence distribution found in the data, as predicted by algorithmic information theory, may account for some of the deviation of financial markets from the log-normal, and if so for how much of that deviation and over what sequence lengths. We do so by comparing the distributions of binary sequences obtained from actual financial time series with those built up by purely algorithmic means. Our discussion is a starting point for further investigation of the market as a rule-based system with an algorithmic component, despite its apparent randomness, and for the use of the theory of algorithmic probability as a new tool in the study of the market price phenomenon. The main discussion is cast in terms of assumptions common to areas of economics that are in agreement with an algorithmic view of the market.
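A toy sketch of the comparison described (the up/down bit encoding, the pattern length, and the synthetic random-walk series are assumptions for illustration; the paper's construction may differ):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(6)

def pattern_distribution(closes, k=4):
    """Empirical distribution of length-k up/down patterns in a price series."""
    bits = (np.diff(closes) > 0).astype(int)
    patterns = ["".join(map(str, bits[i:i + k])) for i in range(len(bits) - k + 1)]
    counts = Counter(patterns)
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}

# Synthetic stand-in for an index series (geometric random walk)
closes = 100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 2500)))
k = 4
dist = pattern_distribution(closes, k)

# Deviation from the equiprobable benchmark 2^-k, as total variation distance
tv = 0.5 * sum(abs(dist.get(format(i, f"0{k}b"), 0.0) - 2 ** -k) for i in range(2 ** k))
print(f"total variation from equiprobable: {tv:.4f}")
```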
