Similar References
20 similar records found (search time: 62 ms)
1.
This paper considers the problem of unbiasedly estimating the population proportion when the study variable is potentially sensitive in nature. In order to protect the respondent's privacy, various techniques of generating a randomized response rather than a direct response are available in the literature. But the theory concerning them is developed only under the hypothesis of completely truthful reporting. In practice, untruthful reporting is a real possibility when dealing with highly sensitive matters such as abortion or socially deviant behaviors. Using Warner's [(1965), Journal of the American Statistical Association, 60: 63–69] randomized response technique as an illustration, we show how unbiased estimation of the population proportion can be extended to cover the case where some respondents may lie.
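Warner's original device is simple enough to sketch in code: each respondent answers the sensitive question with probability p and its negation with probability 1 − p, and the observed proportion of "yes" answers is inverted to give an unbiased estimate. Below is a minimal simulation of the baseline case of fully truthful reporting, which is the case this paper extends; the parameter values are illustrative and not taken from the paper.

```python
import random

def warner_estimate(answers, p):
    """Unbiased Warner (1965) estimator: pi_hat = (lam - (1 - p)) / (2p - 1),
    where lam is the observed proportion of 'yes' answers and p != 0.5."""
    lam = sum(answers) / len(answers)
    return (lam - (1 - p)) / (2 * p - 1)

def simulate(pi_true, p, n, rng):
    answers = []
    for _ in range(n):
        sensitive = rng.random() < pi_true   # respondent's true status
        asked_direct = rng.random() < p      # card shows the direct question
        # Truthful reporting: 'yes' iff the selected statement is true for them.
        answers.append(sensitive if asked_direct else not sensitive)
    return warner_estimate(answers, p)

rng = random.Random(7)
estimates = [simulate(pi_true=0.30, p=0.7, n=2000, rng=rng) for _ in range(200)]
print(sum(estimates) / len(estimates))   # close to 0.30 on average
```

Note that as 2p − 1 approaches zero the estimator's variance explodes: this is the privacy-versus-efficiency trade-off that randomized response designs negotiate.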

2.
This paper considers the problem of procuring truthful responses to estimate the proportion of a qualitative characteristic. In order to protect the respondent's privacy, various techniques of generating a randomized response rather than a direct response are available in the literature. But the theory concerning them is generally developed with no attention to the required level of privacy protection. Illustrating two randomization devices, we show how optimal randomized response designs may be achieved. The optimal designs of the forced response procedure as well as the BHARGAVA and SINGH [Statistica (2000) Vol. 6, pp. 315–321] procedure are shown to be special cases. In addition, the equivalent designs of the optimal WARNER [Journal of the American Statistical Association (1965) Vol. 60, pp. 63–69] procedure are considered as well. It is also shown that stratification with proportional allocation is helpful for improving estimation efficiency.
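For concreteness, a common form of the forced response device has each respondent answer truthfully with probability p_truth, give a forced "yes" with probability p_yes, and a forced "no" otherwise, so that E[λ] = p_truth·π + p_yes and π̂ = (λ − p_yes)/p_truth is unbiased. A small sketch follows; the probabilities are chosen for illustration only and are not the paper's optimal design.

```python
import random

def forced_response_estimate(answers, p_truth, p_yes):
    """Unbiased estimator: E[lam] = p_truth * pi + p_yes,
    hence pi_hat = (lam - p_yes) / p_truth."""
    lam = sum(answers) / len(answers)
    return (lam - p_yes) / p_truth

def respond(sensitive, p_truth, p_yes, rng):
    u = rng.random()
    if u < p_truth:
        return sensitive            # honest answer
    return u < p_truth + p_yes      # forced 'yes'; otherwise forced 'no'

rng = random.Random(1)
pi_true, p_truth, p_yes, n = 0.20, 0.6, 0.2, 20_000
answers = [respond(rng.random() < pi_true, p_truth, p_yes, rng)
           for _ in range(n)]
print(forced_response_estimate(answers, p_truth, p_yes))  # near 0.20
```

Shrinking p_truth protects privacy but inflates the estimator's variance by a factor of 1/p_truth², which is the tension an optimal design balances.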

3.
This article analyzes the impact of the EU on Portugal's economy, how EU rules and regulations have affected the country's policy-making style, and the impact of the EU's transfer of resources. It shows that Portugal has greatly benefited from joining the EU: the rapid exposure to foreign competition forced the modernization of many sectors; the state retreated from direct involvement in economic activities and a number of new export-oriented sectors were created; and EU transfers were effectively applied to modernizing the country's infrastructure. However, increased productivity has not improved equity, either in terms of income distribution or of regional income concentration.

4.
Without accounting for sensitive items in sample surveys, sampled units may not respond (nonignorable nonresponse) or may respond untruthfully. There are several survey designs that address this problem, and we review some of them. In our study, we have binary data from clusters within small areas, obtained from a version of the unrelated-question design, and the sensitive proportion is of interest for each area. A hierarchical Bayesian model is used to capture the variation in the observed binomial counts from the clusters within the small areas and to estimate the sensitive proportions for all areas. Both our example on college cheating and a simulation study show significant reductions in the posterior standard deviations of the sensitive proportions under the small-area model as compared with an analogous individual-area model. The simulation study also demonstrates that the estimates under the small-area model are closer to the truth than the corresponding estimates under the individual-area model. Finally, for small areas, we discuss several extensions to accommodate covariates, finite population sampling, multiple sensitive items and optional designs.

5.
The present study tests the proposition that the normative rational model of decision making influences diversification strategy, which in turn influences the firm's performance. Questionnaires measuring rational decision making were mailed to 441 large U.S. manufacturing firms, with a response rate of 23%. Compustat was used to measure Palepu's entropy measures of diversification: total, related, and unrelated diversification. The results show a significant positive relationship between top management's emphasis on rational decision making and diversification, as well as a significant negative relationship between diversification and firm performance. Thus, the study shows strong support for the role of diversification strategy as a mediator between rational decision making and firm performance.

6.
In the present investigation, a new forced quantitative randomized response (FQRR) model is proposed. Both situations, where the values of the forced quantitative response are known and where they are unknown, are studied. The forced qualitative randomized response models due to Liu and Chow (J Am Stat Assoc 71:72–73, 1976a; Biometrics 32:607–618, 1976b) and Stem and Steinhorst (J Am Stat Assoc 79:555–564, 1984) emerge as special cases when the value of the forced quantitative randomized response is simply replaced by a forced "yes" response. The proposed FQRR model remains more efficient than the recent Bar-Lev et al. (Metrika 60:255–260, 2004) model, hereafter the BBB model. The relative efficiency of the proposed FQRR model with respect to existing competitors, such as the BBB model, is investigated under different situations. The present model should lead to several new developments in the field of randomized response sampling and encourage further work along these lines.

7.
Advances in information technology have not only promoted competition since the 1980s, but have also advanced globalization since the 1990s. The information technology group has lost its momentum since 2000. One of its vital segments is the enterprise software sector. The sector's products, such as ERP and CRM, have spurred the speed and efficiency of scores of U.S. companies and service firms in the past fifty months. In 2000, the majority of enterprise software firms incurred losses. If one assumes that these innovative firms make a real contribution to augmenting client firms' efficiency, the question of how to analyze their performance in the absence of positive net income becomes a Gordian knot. To unravel it, we employ selected non-income-based measures as proxy criteria. We classify twenty-two firms into three subsets: pre-1990 and post-1990 firms, full-line and partial-line firms, and large-capitalized and small-capitalized firms, and perform a comparative analysis on the basis of the proxy measures. While pointing to significant differences in productivity and perceived risks among these subsets, our findings allow us to speculate that, first, the cottage-industry status of the sector has ended, and, second, new beginnings, in the post-September era of anxiety, will crystallize in the form of a consolidated industry comprising a few firms that possess the intrinsic value of leveraging their innovative capabilities.

8.
The ability to design experiments in an appropriate and efficient way is an important skill, but students typically have little opportunity to get that experience. Most textbooks introduce standard general-purpose designs, and then proceed with the analysis of data already collected. In this paper we explore a tool for gaining design experience: computer-based virtual experiments. These are software environments which mimic a real situation of interest and invite the user to collect data to answer a research question. Two prototype environments are described. The first is suitable for a course that deals with screening or response surface designs; the second allows experimenting with block and row-column designs. They are part of a collection we developed called ENV2EXP, and can be freely used over the web. We also describe our experience in using them in several courses over the last few years.

9.
This paper systematically reviews empirical studies looking at the effectiveness of the Delphi technique, and provides a critique of this research. Findings suggest that Delphi groups outperform statistical groups (by 12 studies to two with two ‘ties’) and standard interacting groups (by five studies to one with two ‘ties’), although there is no consistent evidence that the technique outperforms other structured group procedures. However, important differences exist between the typical laboratory version of the technique and the original concept of Delphi, which make generalisations about ‘Delphi’ per se difficult. These differences derive from a lack of control of important group, task, and technique characteristics (such as the relative level of panellist expertise and the nature of feedback used). Indeed, there are theoretical and empirical reasons to believe that a Delphi conducted according to ‘ideal’ specifications might perform better than the standard laboratory interpretations. It is concluded that a different focus of research is required to answer questions on Delphi effectiveness, focusing on an analysis of the process of judgment change within nominal groups.

10.
It is now becoming apparent that the current ‘stock-push’ vehicle supply in the automotive industry, which fulfils the large majority of orders from existing stock, is no longer a viable proposition. Cost pressure from rising stock levels in the market and the high discounts needed to sell these vehicles have forced vehicle manufacturers to rethink their order fulfilment strategy in favour of stock-less ‘build-to-order’ systems. More responsive order fulfilment at the vehicle manufacturer level, however, will not only require flexible and responsive component supply and vehicle assembly, but will also have wide ramifications for all logistics operations in the auto supply chain. Based on findings of the 3DayCar research programme, this paper compares the implications for inbound, outbound and sea transportation logistics, leading to the development of a strategic framework for future automotive logistics operations.

11.
In this study we investigate the question of whether institutional investors enhance or reduce efficiency in the market for corporate control. In particular, given unequivocal evidence that target stockholders gain in successful takeover bids, we investigate the impact of institutional ownership in target firms on the type of antitakeover defense adopted as well as on the outcome of takeover bids. We find that target firms are more likely to adopt value-reducing antitakeover defenses and successfully thwart takeover bids when a higher percentage of target common stock is owned by ‘pressure-indeterminate’ investors (investment counsel firms in particular). On the other hand, the probability of a successful takeover rises with the ownership of both ‘pressure-sensitive’ and ‘pressure-resistant’ investors. These findings support the view that institutional investors do not play a homogeneous role in the market for corporate control.

12.
How do companies in fast-growing industries achieve good customer satisfaction together with efficiency in supply chain management (SCM)? This inductive case study of six customer cases of Nokia Networks, one of the leading providers of mobile telecommunication technology, led to propositions exploring that question. A good relationship between the customer and the supplier contributes to reliable information flows, and reliable demand information flows in turn contribute to high efficiency; these are well-researched issues in other industry environments as well. But in a fast-growing systems business such as the mobile telecommunications industry, the supplier needs to be able to adapt its offering to a wide variety of customer situations and needs. Understanding the customer's situation and needs, together with the right offering, contributes to good co-operation in improving the joint demand chain, which further leads to superior demand chain efficiency and high customer satisfaction.

13.
Planning, implementing and evaluating an intervention program all hinge on time. A program's actions are planned according to a forecast of the time required to achieve certain objectives, and the program's implementation among a group of users is conditioned by the actual time available for its application. Similarly, program evaluation needs to take the time resource into consideration when analysing objectively the extent to which a program's targets have been reached, and when conducting a cost analysis of the program. In limited-resource programs, any disparity between the scheduled time and the real time available can have serious consequences, and can even undermine a program's efficacy. Time management, above all where resources are limited, is therefore the linchpin of the planning, implementation and evaluation of an intervention program. In this study we analyse the utility of PERT and CPM as basic tools for the efficient time management of limited-resource programs.
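The two tools named above are standard, and the core CPM computation is a forward/backward sweep over an activity-on-node network that yields the project duration and the zero-slack (critical) activities. A minimal sketch follows; the task network is a made-up miniature intervention program, not one from the study.

```python
def cpm(tasks):
    """Critical Path Method over an activity-on-node network.
    tasks: {name: (duration, [predecessor names])}.
    Returns (project duration, list of critical activities, i.e. zero slack)."""
    # Forward pass: earliest finish of each task (repeated sweep is fine
    # for small networks; a topological sort would be used at scale).
    ef = {}
    while len(ef) < len(tasks):
        for name, (dur, preds) in tasks.items():
            if name not in ef and all(p in ef for p in preds):
                ef[name] = max((ef[p] for p in preds), default=0) + dur
    duration = max(ef.values())
    # Backward pass: latest finish of each task.
    succs = {n: [m for m, (_, ps) in tasks.items() if n in ps] for n in tasks}
    lf = {}
    while len(lf) < len(tasks):
        for name in tasks:
            if name not in lf and all(s in lf for s in succs[name]):
                lf[name] = min((lf[s] - tasks[s][0] for s in succs[name]),
                               default=duration)
    critical = [n for n in tasks if lf[n] - ef[n] == 0]
    return duration, critical

# Hypothetical mini-program; durations in weeks.
tasks = {
    "plan":      (2, []),
    "recruit":   (3, ["plan"]),
    "materials": (1, ["plan"]),
    "deliver":   (4, ["recruit", "materials"]),
    "evaluate":  (2, ["deliver"]),
}
print(cpm(tasks))  # → (11, ['plan', 'recruit', 'deliver', 'evaluate'])
```

Here "materials" has two weeks of slack, so only the plan–recruit–deliver–evaluate chain constrains the schedule; any delay on that chain delays the whole program.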

14.
Optimal exact designs are notoriously hard to study and only a few of them are known for polynomial models. Using recently obtained optimal exact designs (Imhof, 1997), we show that the efficiency of the frequently used rounded optimal approximate designs can be sensitive to the sample size when the sample size is small. For some criteria, the efficiency of the rounded optimal approximate design can vary by as much as 25% when the sample size is changed by one unit. The paper also discusses lower efficiency bounds and shows that they are sometimes the best possible bounds for the rounded optimal approximate designs.
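The phenomenon is easy to reproduce in miniature. For the quadratic model on [−1, 1], the D-optimal approximate design puts weight 1/3 on each of −1, 0 and 1; restricting attention to that three-point support (a deliberate simplification, not Imhof's computation), the D-efficiency of the best exact n-point allocation relative to the approximate optimum fluctuates with n rather than improving monotonically.

```python
def info_det(counts):
    """Determinant of the normalised information matrix M = (1/n) X'X for the
    quadratic model f(x) = (1, x, x^2) observed at the points -1, 0, 1."""
    n = sum(counts)
    m = [sum(c * p**k for c, p in zip(counts, (-1, 0, 1))) / n
         for k in range(5)]
    M = [[m[0], m[1], m[2]], [m[1], m[2], m[3]], [m[2], m[3], m[4]]]
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def d_eff(counts):
    """D-efficiency relative to the optimal approximate design (1/3, 1/3, 1/3)."""
    return (info_det(counts) / info_det((1, 1, 1))) ** (1 / 3)

results = {}
for n in range(3, 8):
    best = max(((a, b, n - a - b) for a in range(n + 1)
                for b in range(n + 1 - a)), key=d_eff)
    results[n] = d_eff(best)
    print(f"n={n}: best exact D-efficiency = {results[n]:.3f}")
```

The efficiencies come out as 1.000, 0.945, 0.952, 1.000, 0.981 for n = 3, …, 7: exact at multiples of three, and jumping by several percent as n changes by one unit, a milder version of the sensitivity the paper documents.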

15.
In this paper, we propose an alternative approach for pricing and hedging American barrier options. Specifically, we obtain an analytic representation for the value and hedge parameters of barrier options, using the decomposition technique of separating the European option value from the early exercise premium. This allows us to identify some new put-call ‘symmetry’ relations and the homogeneity in price parameters of the optimal exercise boundary. These properties can be utilized to increase the computational efficiency of our method in pricing and hedging American options. Our implementation of the obtained solution indicates that the proposed approach is both efficient and accurate in computing option values and option hedge parameters. Our numerical results also demonstrate that the approach dominates the existing lattice methods in both accuracy and efficiency. In particular, the method is free of the difficulty that existing numerical methods have in dealing with spot prices in the proximity of the barrier, the case where the barrier options are most problematic.

16.
We establish the inferential properties of the mean-difference estimator for the average treatment effect in randomised experiments where each unit in a population is randomised to one of two treatments and then units within treatment groups are randomly sampled. The properties of this estimator are well understood in the experimental design scenario where units are first randomly sampled and treatment is then randomly assigned, but not for the aforementioned scenario where the sampling and treatment assignment stages are reversed. We find that the inferential properties of the mean-difference estimator under this design are identical to those under the more common sample-first-randomise-second design. This finding clarifies the use of sampling-based randomised designs for causal inference, particularly for settings where there is a finite super-population. Finally, we explore to what extent pre-treatment measurements can be used to improve upon the mean-difference estimator for this randomise-first-sample-second design. Unfortunately, we find that pre-treatment measurements are often unhelpful in improving the precision of average treatment effect estimators under this design, unless a large number of pre-treatment measurements that are highly associated with the post-treatment measurements can be obtained. We confirm these results using a simulation study based on a real experiment in nanomaterials.
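The design in question is easy to simulate: randomise every unit in a finite population to an arm first, then sample within arms and take the difference in means. A toy check with a constant unit-level effect of 2 (all numbers are illustrative, not from the paper's nanomaterials experiment):

```python
import random

def mean_diff(treated, control):
    """Mean-difference estimator of the average treatment effect."""
    return sum(treated) / len(treated) - sum(control) / len(control)

rng = random.Random(11)
N, n = 5_000, 200
pop_y0 = [rng.gauss(0.0, 1.0) for _ in range(N)]   # control potential outcomes
TRUE_ATE = 2.0                                     # y1 = y0 + 2 for every unit

def one_replication():
    # Stage 1: randomise every population unit to treatment or control.
    treated, control = [], []
    for y0 in pop_y0:
        if rng.random() < 0.5:
            treated.append(y0 + TRUE_ATE)   # treated potential outcome revealed
        else:
            control.append(y0)
    # Stage 2: randomly sample n units from within each arm.
    return mean_diff(rng.sample(treated, n), rng.sample(control, n))

estimates = [one_replication() for _ in range(300)]
print(sum(estimates) / len(estimates))   # close to TRUE_ATE = 2
```

Because both arms draw from the same finite population, the estimator is unbiased for the ATE replication after replication, which is the behaviour the paper establishes formally.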

17.
In contrast to a posterior analysis given a particular sampling model, posterior model probabilities in the context of model uncertainty are typically rather sensitive to the specification of the prior. In particular, ‘diffuse’ priors on model-specific parameters can lead to quite unexpected consequences. Here we focus on the practically relevant situation where we need to entertain a (large) number of sampling models and we have (or wish to use) little or no subjective prior information. We aim at providing an ‘automatic’ or ‘benchmark’ prior structure that can be used in such cases. We focus on the normal linear regression model with uncertainty in the choice of regressors. We propose a partly non-informative prior structure related to a natural conjugate g-prior specification, where the amount of subjective information requested from the user is limited to the choice of a single scalar hyperparameter g0j. The consequences of different choices for g0j are examined. We investigate theoretical properties, such as consistency of the implied Bayesian procedure. Links with classical information criteria are provided. More importantly, we examine the finite sample implications of several choices of g0j in a simulation study. The use of the MC3 algorithm of Madigan and York (Int. Stat. Rev. 63 (1995) 215), combined with efficient coding in Fortran, makes it feasible to conduct large simulations. In addition to posterior criteria, we shall also compare the predictive performance of different priors. A classic example concerning the economics of crime will also be provided and contrasted with results in the literature. The main findings of the paper will lead us to propose a ‘benchmark’ prior specification in a linear regression context with model uncertainty.

18.
In this paper, a simple survey technique to measure the sensitivity of survey issues is presented. It can be applied to estimate the population proportion as well as the probability that a respondent truthfully states that he or she bears a sensitive characteristic when faced with a direct response survey. An unbiased estimator of the mean square error for the direct response survey is obtainable, making it possible to judge the effect on estimation accuracy. The proposed technique is also found to be more efficient than some traditional techniques. A simple extension for polychotomous situations can be developed as well.

19.
Box-Behnken designs and central composite designs are efficient designs for fitting second-order polynomials to response surfaces, because they use relatively small numbers of observations to estimate the parameters. In this paper we investigate the robustness of Box-Behnken designs to the unavailability of observations, in the sense of finding t_max, the maximum number of arbitrary rows in the design matrix that can be removed while still leaving all of the parameters of interest estimable. The results are compared to the known results for central composite designs found in MacEachern, Notz, Whittinghill & Zhu (1995). The blocked Box-Behnken designs are equally as robust as those that are not blocked.
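The estimability question is concrete enough to brute-force for the smallest case. The sketch below builds the three-factor Box-Behnken design (twelve edge points plus three centre runs), forms the 10-column second-order model matrix, and verifies that deleting any single run still leaves all ten parameters estimable, i.e. t_max ≥ 1 here; the paper's analysis of arbitrary multi-row deletions is the hard part.

```python
from itertools import combinations, product

def model_matrix(points):
    """Full second-order model in 3 factors: 10 parameters per run."""
    return [[1, x1, x2, x3, x1*x1, x2*x2, x3*x3, x1*x2, x1*x3, x2*x3]
            for x1, x2, x3 in points]

def rank(matrix, tol=1e-9):
    """Matrix rank by Gauss-Jordan elimination (entries here are small ints)."""
    m = [row[:] for row in matrix]
    r = 0
    for col in range(len(m[0])):
        pivot = next((i for i in range(r, len(m)) if abs(m[i][col]) > tol), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and abs(m[i][col]) > tol:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Box-Behnken design for 3 factors: (+/-1, +/-1) on each pair of factors
# with the third factor held at 0, plus three centre runs.
points = []
for i, j in combinations(range(3), 2):
    for si, sj in product((-1, 1), repeat=2):
        pt = [0, 0, 0]
        pt[i], pt[j] = si, sj
        points.append(tuple(pt))
points += [(0, 0, 0)] * 3

X = model_matrix(points)
print(rank(X))   # 10: all ten second-order parameters estimable
# Brute-force check: is the design robust to losing any single run?
ok = all(rank(X[:i] + X[i + 1:]) == 10 for i in range(len(X)))
print(ok)        # True
```

The same brute-force idea extends to pairs or larger subsets of rows, at combinatorial cost, which is why the paper's analytical bounds are valuable.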

20.
In this paper, we propose a new unrelated question model for estimating the prevalence of a sensitive characteristic within a population by utilizing two decks of cards. The resultant estimator is then compared with its competitors in terms of efficiency and of respondent protection. A real data application analyzing e-cigarette use among college students is considered.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号