Similar literature
20 similar documents found (search time: 46 ms)
1.
Warner’s (J Am Stat Assoc 60:63–69, 1965) pioneering ‘randomized response’ (RR) technique (RRT), useful for unbiasedly estimating the proportion of people bearing a sensitive characteristic, is based exclusively on simple random sampling (SRS) with replacement (WR). Numerous developments that follow from it also use SRSWR and employ in estimation the mean of the RRs yielded by each person every time he/she is drawn into the sample. We examine how the accuracy of estimation alters when instead we use the mean of the RRs from each distinct person sampled, and also, alternatively, when Horvitz and Thompson’s (HT, J Am Stat Assoc 47:663–685, 1952) method is employed for an SRSWR eliciting only a single RR from each. Arnab’s (1999) approach of using repeated RRs from each distinct person sampled is not taken up here, to avoid algebraic complexity. Pathak’s (Sankhyā 24:287–302, 1962) results for direct response (DR) from SRSWR are modified with the use of such RRs. Through simulations we compare the relative accuracy of estimation under the above three alternative methods.
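Warner’s original design can be illustrated with a short simulation: each respondent answers the sensitive question truthfully with probability p and answers its complement otherwise, and the observed proportion of “yes” answers is inverted to recover an unbiased estimate of the sensitive proportion. This is a minimal sketch of the SRSWR baseline only; the function names and parameter values are illustrative, not taken from the paper.

```python
import random

def warner_estimate(answers, p):
    """Unbiased estimate of the sensitive proportion pi from yes/no
    randomized responses, given design probability p != 0.5.
    P(yes) = p*pi + (1-p)*(1-pi), hence pi = (P(yes) - (1-p)) / (2p - 1)."""
    lam_hat = sum(answers) / len(answers)
    return (lam_hat - (1 - p)) / (2 * p - 1)

def simulate_warner(pi, p, n, rng):
    """Simulate n randomized responses when the true proportion is pi."""
    answers = []
    for _ in range(n):
        sensitive = rng.random() < pi          # respondent's true status
        direct_question = rng.random() < p     # spinner picks the question
        answers.append(sensitive if direct_question else not sensitive)
    return answers

rng = random.Random(0)
answers = simulate_warner(pi=0.3, p=0.7, n=200_000, rng=rng)
est = warner_estimate(answers, p=0.7)
print(round(est, 2))
```

The respondent never reveals which question was answered, yet the inversion recovers the population proportion.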

2.
The work–nonwork supportiveness of an organization may influence applicant decision making among young applicants. This possibility was tested using a phased narrowing decision making task and three organizational attributes (salary, number of work–nonwork supportive policies/benefits and their related culture supportiveness). Data gathered from a sample of 110 graduating college business majors partially supported the hypotheses (p < 0.05), revealing a dynamic influence of the organizational attributes across decision making stages and a differential impact of the attributes depending on their framing as family-friendly or life-friendly. Salary was especially important in initial screening of organizational options, and the organizational culture support of work–nonwork challenges was increasingly influential as the final choice was formed. Implications for young applicant attraction are discussed. The research presented here (a portion of the author’s master’s thesis at Bowling Green State University) was presented at the 2006 Annual Conference of the Society for Industrial and Organizational Psychology in Dallas, TX. Thanks go to Dr. Steve M. Jex, Dr. Scott Highhouse, and Dr. Cathy Stein for their assistance in the development of this study. Thanks are also owed to Dr. Michael Doherty for his thoughtful comments on an earlier draft.

3.
This paper discusses the dispersion of growth patterns of macroeconomic models in thermodynamic limits. More specifically, the paper shows that the coefficients of variation of the total numbers of clusters and of the numbers of clusters of specific sizes behave qualitatively differently in the one- and two-parameter Poisson–Dirichlet models in the thermodynamic limits. The coefficients of variation of the numbers of clusters in the former class of distributions are all self-averaging, while those in the latter class are all non-self-averaging. In other words, dispersions of growth rates about their means do not vanish in the two-parameter version of the model in the thermodynamic limits, while they do in the one-parameter version. The paper ends by pointing out that other models, such as triangular urn models, may converge to Mittag–Leffler distributions, which exhibit non-self-averaging behavior for certain parameter combinations. The author is grateful for the many helpful comments received from H. Yoshikawa and M. Sibuya.

4.
We consider the ability to detect interaction structure from data in a regression context. We derive an asymptotic power function for a likelihood-based test for interaction in a regression model, with possibly misspecified alternative distribution. This allows a general investigation of different types of interactions which are poorly or well detected via data. Principally we contrast pairwise-interaction models with ‘diffuse interaction models’ as introduced in Gustafson et al. (Stat Med 24:2089–2104, 2005).

5.
The paper addresses the problem, arising in agent-based asset pricing models with order-based strategies, that the implied positions of the agents remain indeterminate. To overcome this inconsistency, two easily applicable risk aversion mechanisms are proposed, which modify the original actions of a market maker and the speculative agents, respectively. Here the concepts are incorporated into the classical Beja–Goldman model. For the deterministic version of the thus enhanced model, a four-dimensional mathematical stability analysis is provided. In a stochastic version it is demonstrated that jointly the mechanisms are indeed able to keep the agents’ positions within bounds, provided the corresponding risk aversion coefficients are neither too low nor too high. A similar result holds for the misalignment of the market price. We wish to thank two anonymous referees for their observations and detailed comments. Financial support from EU STREP ComplexMarkets, contract number 516446, is gratefully acknowledged.

6.
We consider nonparametric estimation of multivariate versions of Blomqvist’s beta, also known as the medial correlation coefficient. For a two-dimensional population, the sample version of Blomqvist’s beta describes the proportion of data which fall into the first or third quadrant of a two-way contingency table with cutting points being the sample medians. Asymptotic normality and strong consistency of the estimators are established by means of the empirical copula process, imposing weak conditions on the copula. Though the asymptotic variance takes a complicated form, we are able to derive explicit formulas for large families of copulas. For the copulas of elliptically contoured distributions we obtain a variance stabilizing transformation which is similar to Fisher’s z-transformation. This allows for an explicit construction of asymptotic confidence bands used for hypothesis testing and eases the analysis of asymptotic efficiency. The computational complexity of estimating Blomqvist’s beta corresponds to the sample size n, which is lower than the complexity of most competing dependence measures.
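The bivariate sample version described above has a direct empirical counterpart: count points in the first or third quadrant relative to the sample medians. The sketch below is a naive implementation that simply ignores points lying exactly on a median line; tie handling and the asymptotic theory are where the paper’s actual contribution lies.

```python
import statistics

def blomqvist_beta(xs, ys):
    """Sample Blomqvist beta (medial correlation): the proportion of points
    in the first or third quadrant relative to the sample medians, rescaled
    to [-1, 1].  Points exactly on a median line are ignored here (a simple
    convention for this sketch)."""
    mx, my = statistics.median(xs), statistics.median(ys)
    concordant = discordant = 0
    for x, y in zip(xs, ys):
        s = (x - mx) * (y - my)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)

# Perfectly monotone data: every off-median point falls in the first or
# third quadrant, so the medial correlation is maximal.
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]
print(blomqvist_beta(xs, ys))  # 1.0
```

Reversing the second list yields -1.0, the opposite extreme.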

7.
This paper investigates Black–Scholes call and put option thetas, and derives upper and lower bounds for thetas as a function of underlying asset value. It is well known that the maximum time premium of an option occurs when the underlying asset value equals the exercise price. However, we show that the maximum option theta does not occur at that point, but instead occurs when the asset value is somewhat above the exercise price. We also show that option theta is not monotonic in any of the parameters in the Black–Scholes option-pricing model, including time to maturity. We further explain why the implications of these findings are important for trading and hedging strategies that are affected by the decay in an option’s time premium.
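The paper’s central claim, that the fastest time decay occurs somewhat above the exercise price, is easy to check numerically with the standard Black–Scholes call theta. The parameter values below are illustrative, not from the paper.

```python
from math import log, sqrt, exp, pi, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_pdf(x):
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

def call_theta(S, K, r, sigma, T):
    """Black-Scholes call theta (time decay per unit of calendar time)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return (-S * norm_pdf(d1) * sigma / (2 * sqrt(T))
            - r * K * exp(-r * T) * norm_cdf(d2))

K, r, sigma, T = 100.0, 0.05, 0.2, 0.5
# Scan asset values around the strike and find where the decay |theta| peaks.
grid = [K * (0.8 + 0.001 * i) for i in range(401)]   # S from 80 to 120
S_star = max(grid, key=lambda S: -call_theta(S, K, r, sigma, T))
print(S_star > K)  # True: the fastest decay occurs somewhat above the strike
```

With positive interest rates the second theta term keeps growing in S near the strike, pushing the peak of the decay above K, consistent with the result stated above.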

8.
We examine the asymptotic behavior of two strategyproof mechanisms discussed by Moulin for public goods – the conservative equal costs rule (CER) and the serial cost sharing rule (SCSR) – and compare their performance to that of the pivotal mechanism (PM) from the Clarke–Groves family. Allowing the individuals’ valuations for an excludable public project to be random variables, we show under very general assumptions that expected welfare loss generated by the CER, as the size of the population increases, becomes arbitrarily large. However, all moments of the SCSR’s random welfare loss asymptotically converge to zero. The PM does better than the SCSR, with its welfare loss converging even more rapidly to zero.
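Of the three mechanisms, the serial cost sharing rule is the least familiar; below is a sketch of my reading of the Moulin–Shenker serial rule, with an illustrative quadratic cost function and demand profile that are not from the paper.

```python
def serial_cost_shares(demands, cost):
    """Moulin-Shenker serial cost sharing.  With demands sorted ascending,
    q^j = q_1 + ... + q_j + (n - j) * q_j are intermediate production
    levels; agent i pays an equal split of each incremental cost up to her
    own level, so smaller demanders are shielded from larger ones."""
    q = sorted(demands)
    n = len(q)
    levels = [0.0] + [sum(q[:j]) + (n - j) * q[j - 1] for j in range(1, n + 1)]
    increments = [(cost(levels[j]) - cost(levels[j - 1])) / (n - j + 1)
                  for j in range(1, n + 1)]
    return [sum(increments[:i + 1]) for i in range(n)]

# Quadratic cost, three agents.  Shares are budget balanced: they sum to
# the cost of the total demand, C(6) = 36.
shares = serial_cost_shares([1.0, 2.0, 3.0], cost=lambda y: y * y)
print(shares, sum(shares))  # [3.0, 11.0, 22.0] 36.0
```

Budget balance holds because each incremental cost C(q^j) − C(q^(j−1)) is split equally among exactly the n − j + 1 agents whose shares include it.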

9.
The paper proposes a general framework for modeling multiple categorical latent variables (MCLV). The MCLV models extend latent class analysis and latent transition analysis to allow flexible measurement and structural components between endogenous categorical latent variables and exogenous covariates. Modeling frameworks used in conventional structural equation models, for example CFA and MIMIC models, are therefore feasible in the MCLV setting. Parameter estimation for the MCLV models is performed using a generalized expectation–maximization (EM) algorithm, and the adjusted Bayesian information criterion aids model selection. A substantive study of reading development is analyzed to illustrate the feasibility of MCLV models.

10.
Tao, C. J., Chen, S. C., & Chang, L. Quality and Quantity (2009), 43(4): 677–694
Internet Service Providers (ISPs) now have a profound impact on daily life and the economy: nearly every activity is connected to internet services, and the rapid development of information technology has further lowered ISP costs while improving service speed and quality. This has broadened the ISP customer base and intensified competition within the industry. Because high investment and advanced technology form entry barriers, the ISP industry has become an oligopolistic market, one in which the competitive and cooperative relationships among rivals matter greatly. Moreover, most of the relevant literature focuses on measuring and analyzing a firm’s own customer satisfaction; few studies address how to incorporate the satisfaction of competitors’ customers into the measurement and analysis. As the maxim in Sun Tzu’s Art of War states, “Know yourself and know your opponent, and you will win every battle,” a firm must consider the strengths, weaknesses, opportunities and threats in the customer satisfaction of both itself and its competitors. Motivated by the above, this article applies the DMAIC (Define, Measure, Analyze, Improve, Control) methodology of Six Sigma to customer satisfaction in the ISP industry. We first use a five-point scale to define the network quality items of the ISP services provided by the firm and its competitors, and measure the satisfaction and importance of each quality item. We then use a performance evaluation matrix and a strength–weakness matrix to analyze the strengths, weaknesses, opportunities and threats of the firm and its competitors, focus on the quality items that fall outside the 2-sigma limits, and use a strength–weakness strategic chart to develop improvement strategies. Finally, the strength–weakness matrix chart serves as a control tool to verify and sustain the effectiveness of the improved performance. The complete and easy-to-use measurement and improvement model presented here allows an enterprise to evaluate and analyze the service quality of itself and its competitors quickly and effectively and, at reasonable cost and with the competitive opportunities and threats of the market taken into account, to improve customer service quality, raise overall customer satisfaction, and build strong, high value-added competitiveness in quality.
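As a toy illustration of the measure and analyze steps, the sketch below classifies quality items by their mean satisfaction and importance scores. The item names, scores, and the simple midpoint cutoff are invented for illustration; the article’s own analysis uses a performance evaluation matrix with 2-sigma control limits.

```python
def classify_items(items):
    """Toy strength-weakness classification in the spirit of the
    performance evaluation matrix.  Each item maps to mean
    (satisfaction, importance) scores on a 5-point scale; 3 is used as
    a crude midpoint cutoff for this sketch."""
    result = {}
    for name, (satisfaction, importance) in items.items():
        if importance >= 3 and satisfaction < 3:
            result[name] = "weakness: important but unsatisfying -> improve"
        elif importance >= 3:
            result[name] = "strength: important and satisfying -> sustain"
        elif satisfaction >= 3:
            result[name] = "possible overkill: satisfying but unimportant"
        else:
            result[name] = "low priority"
    return result

items = {
    "connection speed": (2.4, 4.6),
    "billing accuracy": (4.1, 4.2),
    "mail storage":     (4.0, 2.1),
}
for name, verdict in classify_items(items).items():
    print(f"{name}: {verdict}")
```

Running the same classification on a competitor’s scores and comparing the two tables is what turns this into the strength–weakness analysis the article describes.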

11.
This paper generalizes Kunert and Martin’s (Ann Stat 28:1728–1742, 2000) method for finding optimal designs under a fixed interference model, to find optimal designs under a mixed interference model. The results are based on the properties of information matrices in fixed and mixed models given in Markiewicz (J Stat Plan Inference 59:127–137, 1997). The method is applied to find a design which is optimal for any given variances of random neighbor effects. Research partially supported by the KBN Grant Number 5 P03A 041 21.

12.
T. Shiraishi. Metrika (1991), 38(1): 163–178
Summary: In k samples with unequal variances, M-tests for homogeneity of the k location parameters are proposed. The asymptotic χ²-distributions of the test statistics and the robustness of the tests are investigated. Next, M-estimators (MEs) of the parameters are discussed. Furthermore, positive-part shrinkage versions (PSMEs) of the M-estimators for the location parameters are considered, along with a modified James–Stein estimation rule. In terms of asymptotic distributional risks based on a special feasible loss, it is shown that the PSMEs dominate the MEs, as well as the preliminary-test and shrinkage M-versions, for k ≥ 4.

13.
Liu, Shuangzhe, & Neudecker, Heinz. Metrika (1997), 45(1): 53–66
Extending Scheffé’s simplex-centroid design for experiments with mixtures, we introduce a weighted simplex-centroid design for a class of mixture models. Becker’s homogeneous functions of degree one belong to this class. By applying optimal design theory, we obtain A-, D- and I-optimal allocations of observations for Becker’s models.

14.
We use a stochastic frontier model with firm-specific technical inefficiency effects in a panel framework (Battese and Coelli in Empir Econ 20:325–332, 1995) to assess two popular probability of bankruptcy (PB) measures, based on the Merton model (Merton in J Financ 29:449–470, 1974) and the discrete-time hazard model (DHM; Shumway in J Bus 74:101–124, 2001). Three important results emerge from our empirical studies. First, a firm with a higher PB generally has lower technical efficiency. Second, for an ex-post bankrupt firm, its PB tends to increase and its technical efficiency of production tends to decrease as the time to its bankruptcy draws near. Finally, the information content about a firm’s technical inefficiency provided by PB based on the DHM is significantly greater than that based on the Merton model. From the last result and the fact that economic-based efficiency measures are reasonable indicators of the long-term health and prospects of firms (Baek and Pagán in Q J Bus Econ 41:27–41, 2002), we conclude that PB based on the DHM is a better credit risk proxy for firms.
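For reference, a Merton-type PB measure can be sketched as below. This is a stylized version with illustrative inputs; empirical implementations, including the one assessed above, back out asset value and volatility from equity market data rather than observing them directly.

```python
from math import log, sqrt, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_pb(V, D, mu, sigma, T=1.0):
    """Merton-style probability of bankruptcy: the chance that asset value
    V, following geometric Brownian motion with drift mu and volatility
    sigma, ends below the face value of debt D at horizon T.
    PB = N(-DD), where DD is the distance to default."""
    dd = (log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return norm_cdf(-dd)

# A leveraged firm: assets 120, debt 100, 10% drift, 30% asset volatility.
pb = merton_pb(V=120.0, D=100.0, mu=0.10, sigma=0.30)
print(round(pb, 3))
```

As expected, the measure rises with leverage: raising the debt face value while holding assets fixed increases the PB.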

15.
We employ a Bayesian normal hierarchical model to investigate the relationship between intention and behavior as posited by Ajzen and Fishbein’s theory of planned behavior (TPB). The area of application is environmental behavior. Eleven studies reporting correlations between intention and behavior were identified. Our Bayesian hierarchical model yields an expected correlation of r_xy = 0.54 between these variables. This effect size is above the average found in meta-analyses that include other, non-environmental areas of application.

16.
An important issue when conducting stochastic frontier analysis is how to choose a proper parametric model, which involves choices of the functional form of the frontier function, the distributions of the composite errors, and the exogenous variables. In this paper, we extend the likelihood ratio test of Vuong (Econometrica 57(2):307–333, 1989) and Takeuchi’s (Suri-Kagaku (Math Sci) 153:12–18, 1976) model selection criterion to stochastic frontier models. The most attractive feature of this test is that it can not only be used for testing non-nested models, but also remains applicable even when the general model is misspecified. Finally, we demonstrate how to apply this test to the Indian farm data used by Battese and Coelli (J Prod Anal 3:153–169, 1992; Empir Econ 20(2):325–332, 1995) and Alvarez et al. (J Prod Anal 25:201–212, 2006).

17.
Let {X_j} be a strictly stationary sequence of negatively associated random variables with marginal probability density function f(x). The recursive kernel estimator of f(x) is defined by
$$\hat f_n(x) = \frac{1}{n}\sum_{j=1}^{n}\frac{1}{b_j}\,K\!\left(\frac{x-X_j}{b_j}\right),$$
and the Rosenblatt–Parzen kernel estimator of f(x) is defined by
$$\tilde f_n(x) = \frac{1}{n b_n}\sum_{j=1}^{n}K\!\left(\frac{x-X_j}{b_n}\right),$$
where 0 < b_n → 0 are bandwidths and K is a kernel function. In this paper, we study uniform Berry–Esseen bounds for these estimators of f(x); in particular, a suitable choice of the bandwidths yields the corresponding Berry–Esseen rates.
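Assuming the standard textbook forms of the two estimators (a single common bandwidth b_n for Rosenblatt–Parzen, an observation-specific bandwidth b_j for the recursive version, here with a Gaussian kernel), a minimal sketch is:

```python
from math import exp, pi, sqrt
import random

def gauss_kernel(u):
    return exp(-0.5 * u * u) / sqrt(2 * pi)

def rosenblatt_parzen(x, data, bn):
    """Ordinary kernel density estimate: one common bandwidth b_n."""
    n = len(data)
    return sum(gauss_kernel((x - xj) / bn) for xj in data) / (n * bn)

def recursive_kde(x, data, bandwidths):
    """Recursive estimate: observation X_j keeps its own bandwidth b_j,
    so the estimate can be updated online as new data arrive."""
    n = len(data)
    return sum(gauss_kernel((x - xj) / bj) / bj
               for xj, bj in zip(data, bandwidths)) / n

rng = random.Random(1)
data = [rng.gauss(0.0, 1.0) for _ in range(5000)]
bandwidths = [(j + 1) ** -0.2 for j in range(len(data))]   # b_j = j^(-1/5)
f0_rp = rosenblatt_parzen(0.0, data, len(data) ** -0.2)
f0_rec = recursive_kde(0.0, data, bandwidths)
print(round(f0_rp, 2), round(f0_rec, 2))  # both near 1/sqrt(2*pi), i.e. ~0.4
```

The recursive form trades a little extra smoothing bias (early observations keep their large early bandwidths) for the ability to fold in new observations without recomputing the whole sum.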

18.
Dr. J. M. Begun. Metrika (1987), 34(1): 65–82
Summary: For the two-sample problem with proportional hazard functions, we consider estimation of the constant of proportionality, known as relative risk, using complete uncensored data. For this very special case of Cox’s (1972) regression model for survival data, we find a two-step estimate which is asymptotically equivalent to Cox’s partial likelihood estimate, and we show that both estimates are asymptotically optimal (in the sense of minimum asymptotic variance) among all regular rank estimates of relative risk. Supported by NSF grant 81-00748.

19.
This paper uses quarterly price data to examine the transmission of shocks across spatially separated market locations and to identify causality among them. Johansen and Juselius’s (Econ. Stat., 52, 160–210, 1990) multivariate cointegration procedure identified two cointegrating vectors among these locations. Following Toda and Yamamoto (J. Econom., 66, 225–250, 1995), causality tests showed only one bi-directional causal link, between the Peshawar and Hyderabad locations. Faisalabad and Sargodha appeared to be independent (i.e. exogenous) market locations in the price discovery process. The Peshawar market showed the maximum number of significant links (5). The generalized impulse response functions suggested similar (cyclical) patterns of responses across the markets, but their time profiles, which provide insight into the system’s speed of convergence to the long-run equilibrium path, varied in extent and persistence. Responses to shocks originating in consumption markets (i.e. Karachi, Peshawar and Lahore) remained short-lived, whereas shocks stemming from surplus wheat-producing locations (i.e. Multan, Sargodha and Faisalabad) produced longer and more persistent responses.

20.
Public infrastructure investment is an essential part of China’s regional development policy. This raises the question of how much public infrastructure capital matters for labor productivity in China, at the regional level as well as over time. This paper estimates cost function models of production in industrial enterprises, using province-level data from 1993 to 2003. The estimated rate of return in industrial production is 23–25%, and on average public infrastructure contributes 2–3 percentage points to the growth in labor productivity among these enterprises.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号