1.
The truncated Poisson regression model is used to arrive at point and interval estimates of the size of two offender populations: drunk drivers and persons who illegally possess firearms. The dependent capture–recapture variables are constructed from Dutch police records and are counts of individual arrests for both violations. The population size estimates are derived under two assumptions: that each count is a realization of a Poisson distribution, and that the Poisson parameters are related to covariates through the truncated Poisson regression model. These assumptions are discussed in detail, and the tenability of the second is assessed by evaluating the marginal residuals and performing tests for overdispersion. For the firearms example the second assumption seems to hold well, but for the drunk-drivers example there is some overdispersion. It is concluded that the method is useful, provided it is used with care.
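The core of the estimator can be sketched without the covariates: fit a zero-truncated Poisson to the arrest counts, then divide the number of observed offenders by the estimated probability of at least one arrest. This is an illustrative sketch only; the paper's model makes the Poisson parameter depend on covariates, and the function names and toy data here are invented:

```python
import math

def fit_truncated_poisson(counts, tol=1e-10):
    """MLE of lambda from zero-truncated Poisson counts (all >= 1),
    solved by Newton's method on the mean equation."""
    ybar = sum(counts) / len(counts)
    lam = ybar  # starting value
    for _ in range(100):
        e = math.exp(-lam)
        f = lam / (1.0 - e) - ybar                # truncated mean minus sample mean
        d = (1.0 - e - lam * e) / (1.0 - e) ** 2  # its derivative in lam
        step = f / d
        lam -= step
        if abs(step) < tol:
            break
    return lam

def population_size(counts):
    """Horvitz-Thompson-style estimate of the total population size,
    including never-arrested individuals."""
    lam = fit_truncated_poisson(counts)
    p_seen = 1.0 - math.exp(-lam)  # P(at least one arrest)
    return len(counts) / p_seen

# invented toy data: arrest counts for 10 observed offenders
counts = [1, 1, 1, 2, 1, 1, 3, 1, 2, 1]
```

With these counts the estimated probability of being observed is about 0.51, so the ten observed offenders are scaled up to a population of roughly twenty.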
2.
3.
4.
5.
The lengths of repeated hypercalcemia-free periods of patients with bone metastases of breast cancer and at least one hypercalcemic event were modelled in two ways: with a generalized linear mixed model formulated in terms of transition probabilities, and with a latent variable model. In the former case the periods were assumed to be lognormally distributed with two variance components (patient and residual). In the latter case the conditional intensity given a patient was assumed to be the intensity of the Weibull distribution, while the random patient effect (frailty) was assumed to be drawn from a gamma distribution. In both cases the selection of only patients with at least one hypercalcemic event was taken into account. In both models the variance of the patient effect turned out to be negligible. For the second and later periods the Weibull model appeared to fit better than the lognormal model. For the first period almost no information was available.
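The latent-variable model can be illustrated by simulating one hypercalcemia-free period: a gamma-distributed patient frailty multiplies a Weibull baseline intensity. A minimal sketch, with parameter names and values chosen arbitrarily for illustration, not taken from the paper:

```python
import random

def frailty_weibull_period(shape_w, scale_w, shape_g, rate_g, rng):
    """Draw one period length: Weibull(shape_w, scale_w) baseline
    intensity multiplied by a gamma(shape_g, rate_g) patient frailty z.
    Inversion of S(t | z) = exp(-z * (t / scale_w)**shape_w)."""
    z = rng.gammavariate(shape_g, 1.0 / rate_g)  # patient frailty
    e = rng.expovariate(1.0)                     # unit exponential
    return scale_w * (e / z) ** (1.0 / shape_w)

# simulate 2000 periods under arbitrary parameter values
rng = random.Random(7)
periods = [frailty_weibull_period(1.5, 1.0, 2.0, 2.0, rng) for _ in range(2000)]
```

Holding z fixed reproduces the conditional Weibull model; averaging over z induces the extra between-patient variation that the frailty term is meant to capture.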
6.
Logistic Regression, a review
A review is given of the development of logistic regression as a multi-purpose statistical tool.
A historical introduction traces several lines of development culminating in the unifying paper of Cox (1966), in which theory developed in the field of bio-assay is shown to be applicable to designs such as discriminant analysis and the case-control study. A review is given of several designs that all lead to the same analysis, and the link is made with the epidemiological literature.
Several optimization criteria that can be used when there are multiple observations per cell are discussed, namely maximum likelihood, minimum chi-square and weighted regression on the observed logits. Recent literature on the goodness-of-fit problem is reviewed, and finally comments are made on the non-parametric approach to logistic regression, which is still developing rapidly.
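Of the criteria mentioned, maximum likelihood is the one in standard use; for a single predictor the Newton-Raphson iteration can be sketched in a few lines. This is illustrative code, not taken from the review, and the toy data are invented:

```python
import math

def fit_logistic(x, y, iters=25):
    """ML logistic regression with one predictor, fitted by
    Newton-Raphson on (intercept b0, slope b1)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            w = p * (1.0 - p)          # IRLS weight
            g0 += yi - p               # score for the intercept
            g1 += (yi - p) * xi        # score for the slope
            h00 += w                   # observed information matrix
            h01 += w * xi
            h11 += w * xi * xi
        det = h00 * h11 - h01 * h01    # invert the 2x2 information
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# invented toy data with overlapping groups, so the MLE exists
x = [0, 1, 2, 3, 4, 5]
y = [0, 0, 1, 0, 1, 1]
```

Note that with perfectly separated groups the likelihood has no maximum and the iteration diverges, which is why the toy data include one overlap.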
7.
This paper gives a short introduction to the empirical Bayes approach. To illustrate the approach, the classical problem of testing a simple hypothesis against a simple alternative is discussed. Some explicit results are given for testing H0: N(−1, 1) against H1: N(1, 1). The paper is a further elaboration of Section 3.1 of the author's dissertation, Van Houwelingen (1973).
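For this testing problem the likelihood ratio is f1(x)/f0(x) = exp(2x), so the Bayes rule accepts H1 when x exceeds log(pi0/pi1)/2; the empirical Bayes step estimates the prior from the data, for instance by the method of moments, since the marginal mean equals 2·pi1 − 1. A sketch of this idea, not the paper's own derivation (function names are ours):

```python
import math
import random

def estimate_prior(xs):
    """Moment estimate of pi1 = P(H1): the marginal mean of the
    mixture pi0*N(-1,1) + pi1*N(1,1) equals 2*pi1 - 1."""
    pi1 = (sum(xs) / len(xs) + 1.0) / 2.0
    return min(max(pi1, 1e-6), 1.0 - 1e-6)  # keep the prior odds finite

def bayes_test(x, pi1):
    """Accept H1 iff the posterior odds favour N(1,1): the likelihood
    ratio is exp(2x), so the cutoff is log(pi0/pi1) / 2."""
    return x > math.log((1.0 - pi1) / pi1) / 2.0

# simulated observations with true pi1 = 0.7
rng = random.Random(3)
xs = [rng.gauss(1.0 if rng.random() < 0.7 else -1.0, 1.0) for _ in range(2000)]
```

With equal priors the cutoff is 0, i.e. the test simply looks at the sign of x; an estimated prior shifts the cutoff accordingly.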
8.
A class of empirical Bayes estimators (EBEs) is proposed for estimating the natural parameter of a one-parameter exponential family. In contrast to related EBEs proposed and investigated so far, the EBEs presented in this paper possess the nice property of being monotone by construction. Based on an arbitrary reasonable estimator of the underlying marginal density, a simple algorithm is given to construct a monotone EBE. Two representations of these EBEs are given, one of which serves as a tool in establishing asymptotic results, while the other, related to isotonic regression, proves useful in the actual computation.
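The role of isotonic regression can be sketched for the Poisson case, where Robbins' estimate (x+1)·m(x+1)/m(x) of the posterior mean need not be monotone in x; pooling adjacent violators restores monotonicity. This is only an illustration of the principle, not the paper's construction, and the data are invented:

```python
from collections import Counter

def pava(y):
    """Pool-adjacent-violators: least-squares non-decreasing fit."""
    blocks = []  # each block: [pooled value, number of pooled points]
    for v in y:
        blocks.append([v, 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, n2 = blocks.pop()
            v1, n1 = blocks.pop()
            blocks.append([(v1 * n1 + v2 * n2) / (n1 + n2), n1 + n2])
    out = []
    for v, n in blocks:
        out.extend([v] * n)
    return out

def monotone_robbins(data):
    """Robbins' empirical Bayes estimate of the Poisson posterior mean,
    (x+1)*m(x+1)/m(x) with m the empirical frequencies, forced to be
    non-decreasing in x by isotonic regression."""
    freq = Counter(data)
    xs = sorted(freq)
    raw = [(x + 1) * freq.get(x + 1, 0) / freq[x] for x in xs]
    return dict(zip(xs, pava(raw)))

# invented count data; the raw Robbins estimate drops to 0 at the maximum
data = [0] * 10 + [1] * 8 + [2] * 4 + [3] * 2 + [4]
```

Here the raw estimate is 0 at the largest observed count (no count of 5 occurs), so pooling pulls the top of the curve back up and yields a monotone estimator.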
9.
A review is given of shrinkage and penalization as tools to improve the predictive accuracy of regression models. The James–Stein estimator is taken as the starting point. Procedures covered are pre-test estimation, the ridge regression of Hoerl and Kennard, the shrinkage estimators of Copas and of Van Houwelingen and Le Cessie, the LASSO of Tibshirani and the garrote of Breiman. An attempt is made to place all these procedures in a unifying framework of semi-Bayesian methodology. Applications are briefly mentioned but not discussed at length.
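The common thread of these procedures is shrinkage of regression coefficients toward zero; ridge regression shows it most transparently, since for a single centred predictor the penalized slope is Sxy/(Sxx + λ). A minimal sketch (function name and data ours):

```python
def ridge_slope(x, y, lam):
    """Ridge estimate of the slope for one centred predictor:
    Sxy / (Sxx + lam); lam = 0 gives ordinary least squares."""
    xbar = sum(x) / len(x)
    ybar = sum(y) / len(y)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return sxy / (sxx + lam)

# invented toy data with true slope 2
x = [1, 2, 3, 4, 5]
y = [2, 4, 6, 8, 10]
```

As λ grows the slope shrinks smoothly toward zero; the LASSO and garrote differ in that their penalties can set coefficients exactly to zero.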
10.