Similar Documents
Found 20 similar documents (search time: 0 ms).
9.
Statistical Surveillance. Optimality and Methods (total citations: 1; self-citations: 0; citations by others: 1)
Different criteria of optimality are used in different subcultures of statistical surveillance. One aim of this review is to bridge the gap between these areas. The shortcomings of some criteria of optimality are demonstrated through their implications. Some commonly used methods are examined in detail with respect to optimality. The examination is made for a standard situation in order to focus on the inferential principles. A uniform presentation of the methods, as expressions of likelihood ratios, facilitates comparisons between them. The correspondences between criteria of optimality and methods are examined, thus determining the situations and parameter values for which some commonly used methods have optimality properties. A linear approximation of the full likelihood ratio method, which satisfies several criteria of optimality, is presented and used to examine when linear methods are approximately optimal. Methods for complicated situations are reviewed with respect to optimality and robustness.
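As a concrete illustration of a surveillance method expressed through likelihood ratios, the sketch below implements a one-sided CUSUM for a shift in a Gaussian mean, one of the commonly used methods such reviews examine. This is a minimal sketch, assuming a known in-control mean mu0, out-of-control mean mu1 and common variance; the threshold and all parameter values are illustrative, not taken from the review.

```python
import numpy as np

def cusum_alarm(x, mu0=0.0, mu1=1.0, sigma=1.0, threshold=5.0):
    """One-sided CUSUM for a shift from mu0 to mu1 in a Gaussian mean.

    The statistic accumulates log-likelihood ratios and is reset at
    zero; an alarm is raised the first time it exceeds `threshold`.
    Returns the alarm time (0-based) or None if no alarm is raised.
    """
    s = 0.0
    for t, xt in enumerate(x):
        # Log-likelihood ratio of N(mu1, sigma^2) against N(mu0, sigma^2) at xt.
        llr = ((mu1 - mu0) * xt - 0.5 * (mu1 ** 2 - mu0 ** 2)) / sigma ** 2
        s = max(0.0, s + llr)
        if s > threshold:
            return t
    return None

# A change in the mean occurs at t = 50; the alarm typically follows shortly after.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(1.0, 1.0, 50)])
print(cusum_alarm(x))
```

Roughly speaking, the CUSUM takes a running maximum over the partial likelihood ratios for each possible change point, whereas the full likelihood ratio method discussed in the abstract weights them together; the linear approximation mentioned there approximates the latter.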

17.
Framework maps of the human genome are an important staging post in the ongoing effort to sequence the entire genome. The existence of high-quality maps is also a prerequisite for studies attempting to determine the location of genes involved in common diseases. The basic experimental approaches to constructing both genetic and physical maps are briefly described, as well as their respective uses. A variety of statistical approaches to map construction are outlined, including parsimony, maximum likelihood and Bayesian methodologies. The most widely used of these, the method of maximum likelihood, is discussed in detail, particularly in the context of physical mapping using radiation hybrids. Finally, current statistical issues and problems in the field of genome mapping are described.
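To make the radiation hybrid setting concrete, the sketch below computes closed-form two-point maximum likelihood estimates under the standard haploid radiation hybrid model, in which each chromosome fragment is retained with probability r and a breakage occurs between two markers with probability theta. This is a minimal sketch of the textbook estimators; the function name and example counts are illustrative, not taken from the paper.

```python
import math

def rh_two_point_mle(n11, n10, n01, n00):
    """Two-point MLEs under the haploid radiation hybrid model.

    n_ab is the number of hybrids retaining marker A (a = 1) and
    marker B (b = 1). Returns (theta_hat, r_hat): the breakage
    probability between the two markers and the retention probability.
    """
    n = n11 + n10 + n01 + n00
    r_hat = (2 * n11 + n10 + n01) / (2 * n)      # marginal retention rate
    theta_hat = (n10 + n01) / (2 * n * r_hat * (1 - r_hat))
    return min(theta_hat, 1.0), r_hat

theta, r = rh_two_point_mle(n11=40, n10=8, n01=6, n00=46)
distance_cR = -100.0 * math.log(1.0 - theta)     # map distance in centirays
print(theta, r, distance_cR)
```

Ordering many markers then amounts to maximizing the joint likelihood over candidate marker orders, which is where the maximum likelihood machinery discussed above comes in.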

18.
This paper reviews common statistical disclosure control (SDC) methods implemented at statistical agencies for standard tabular outputs containing whole-population counts from a census (either enumerated or based on a register). These methods include record swapping on the microdata prior to tabulation and rounding of entries in the tables after they are produced. The approach for assessing SDC methods is based on a disclosure risk–data utility framework and the need to balance managing disclosure risk against maximizing the amount of information released to users while ensuring high-quality outputs. To carry out the analysis, quantitative measures of disclosure risk and data utility are defined and the methods compared. The analysis shows that record swapping as a sole SDC method leaves high probabilities of disclosure. Targeted record swapping lowers the disclosure risk, but distorts distributions more. Small-cell adjustments (rounding) protect census tables by eliminating small cells, but only one set of variables and geographies can be disseminated in order to avoid disclosure by differencing nested tables. Full random rounding offers more protection against disclosure by differencing, but margins are typically rounded separately from the internal cells, so tables are not additive. Compared with record swapping, rounding procedures also protect against the perception of disclosure risk, since no small cells appear in the tables. Combining rounding with record swapping raises the level of protection but increases the loss of utility of census tabular outputs. For some statistical analyses, the combination of record swapping and rounding balances, to some degree, the opposing effects that the two methods have on the utility of the tables.
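To illustrate the rounding side of these methods, here is a minimal sketch of unbiased random rounding of a counts table to base 3, under the assumption that each cell is rounded independently; the function name and example table are illustrative, not from the paper.

```python
import numpy as np

def random_round(counts, base=3, rng=None):
    """Unbiased random rounding of a table of counts to multiples of `base`.

    A cell x with residue r = x % base is rounded up with probability
    r / base and down otherwise, so E[rounded cell] = x and no value
    strictly between 0 and `base` survives in the output.
    """
    rng = rng or np.random.default_rng()
    counts = np.asarray(counts)
    residue = counts % base
    round_up = rng.random(counts.shape) < residue / base
    return counts - residue + base * round_up

table = np.array([[1, 7, 12],
                  [4, 0, 9]])
print(random_round(table, base=3))
```

Because each cell, and any separately published margin, is rounded independently, the rounded internal cells generally no longer sum to the rounded margins; this is the additivity loss noted above.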
