Similar literature
20 similar documents found (search time: 133 ms)
1.
In longitudinal trials, the number of accrual groups and their sizes should be chosen carefully to ensure a desired power to detect a specified treatment effect. Methods are proposed to obtain a cost-effective combination of the number and size of accrual groups that provides high efficiency at minimal cost. We focus on trials where an event can occur at any point in time but is recorded on a discrete scale. The Weibull survival function is considered for modeling the underlying time to event. Using a cost function, it is shown that both the ratio of the cost of recruiting and treating subjects to the cost of measuring them and the survival pattern strongly influence the optimal combination of the number and size of accrual groups. A maximin approach is further presented to obtain designs that are robust to poor specification of these modeling parameters. We also show the application of the proposed optimal design methodology using real examples.

2.
Recent survey literature shows an increasing interest in survey designs that adapt data collection to characteristics of the survey target population. Given a specified quality objective function, these designs attempt to find an optimal balance between quality and cost. Finding this balance may not be straightforward, as the corresponding optimisation problems are often highly non-linear and non-convex. In this paper, we discuss how to choose strata in such designs and how to allocate these strata in a sequential design with two phases. We use partial R-indicators to build profiles of the data units where more or less attention is required in the data collection. In allocating cases, we look at two extremes: surveys that are run only once or infrequently, and surveys that are run continuously. We demonstrate the impact of the sample size in a simulation study and provide an application to a real survey, the Dutch Crime Victimisation Survey.

3.
Many industrial and engineering applications are built on the basis of differential equations. In some cases, parameters of these equations are not known and are estimated from measurements, leading to an inverse problem. Unlike many other papers, we suggest constructing new designs adaptively, 'on the go', using the A-optimality criterion. This approach is demonstrated on the determination of optimal locations of measurements and temperature sensors in several engineering applications: (1) determining the optimal location to measure the height of a hanging wire in order to estimate the sagging parameter with minimum variance (a toy example), (2) adaptively determining optimal locations of temperature sensors in a one-dimensional inverse heat transfer problem and (3) adaptive design in the framework of a one-dimensional diffusion problem whose solution is found numerically using the finite difference approach. In all these problems, statistical criteria for parameter identification and optimal design of experiments are applied. Statistical simulations confirm that estimates derived from the adaptive optimal design converge to the true parameter values with minimum sum of variances as the number of measurements increases. We deliberately chose technically uncomplicated industrial problems to introduce the principal ideas of statistical adaptive design transparently.
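The adaptive 'on the go' idea can be sketched on a deliberately simple one-parameter problem (an exponential-decay model rather than the paper's wire or heat-transfer examples; the model, noise level, and search grids below are assumptions). With a single parameter, the A-optimality criterion reduces to minimising the estimator's variance, so each new measurement is placed where the model's sensitivity to the parameter is largest at the current estimate:

```python
import math
import random

random.seed(1)
THETA_TRUE = 2.0          # unknown decay rate to be estimated
SIGMA = 0.01              # measurement noise level (assumed)

def model(theta, x):      # one-parameter exponential-decay response
    return math.exp(-theta * x)

def sensitivity(theta, x):  # derivative of the response w.r.t. theta
    return -x * math.exp(-theta * x)

def fit(xs, ys):
    # crude one-dimensional least-squares fit by grid search (enough here)
    grid = [0.1 + 0.001 * k for k in range(5000)]
    return min(grid, key=lambda t: sum((model(t, x) - y) ** 2
                                       for x, y in zip(xs, ys)))

# start with two arbitrary measurement locations
xs = [0.2, 1.5]
ys = [model(THETA_TRUE, x) + random.gauss(0, SIGMA) for x in xs]

for _ in range(20):
    theta_hat = fit(xs, ys)
    # A-optimality for a single parameter: place the next measurement
    # where the squared sensitivity is largest at the current estimate
    cands = [0.05 * k for k in range(1, 100)]
    x_next = max(cands, key=lambda x: sensitivity(theta_hat, x) ** 2)
    xs.append(x_next)
    ys.append(model(THETA_TRUE, x_next) + random.gauss(0, SIGMA))

theta_hat = fit(xs, ys)
print(round(theta_hat, 2))   # should be close to THETA_TRUE
```

For this model the design points accumulate near x = 1/theta, the point of maximum sensitivity, and the estimate tightens as measurements accrue.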

4.
This paper uses semidefinite programming (SDP) to construct Bayesian optimal designs for nonlinear regression models. The setup extends the formulation of the optimal design problem as an SDP problem from linear to nonlinear models. Gaussian quadrature formulas (GQF) are used to compute the expectation in the Bayesian design criterion, such as D-, A- or E-optimality. As an illustrative example, we demonstrate the approach using the power-logistic model and compare our results with those in the literature. Additionally, we investigate how the optimal design is affected by different discretising schemes for the design space, different amounts of uncertainty in the parameter values, different choices of GQF and different prior distributions for the vector of model parameters, including normal priors with and without correlated components. Further applications to find Bayesian D-optimal designs with two regressors for a logistic model and a two-variable generalised linear model with a gamma-distributed response are discussed, and some limitations of our approach are noted.
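As a minimal illustration of how a Gaussian quadrature formula evaluates the expectation in a Bayesian design criterion, the sketch below computes a Bayesian D-criterion for a one-covariate logistic model with a normal prior on the intercept only (the fixed slope, prior, and candidate designs are assumptions; the paper's SDP machinery and power-logistic example are not reproduced):

```python
import math

# 3-point Gauss-Hermite rule: integral of e^{-t^2} g(t) dt ~ sum_i w_i g(t_i)
GH = [(-math.sqrt(1.5), math.sqrt(math.pi) / 6),
      (0.0, 2 * math.sqrt(math.pi) / 3),
      (math.sqrt(1.5), math.sqrt(math.pi) / 6)]

B = 1.0                  # slope of the logistic model, held fixed (assumption)
MU_A, SD_A = 0.0, 0.5    # normal prior on the intercept a (assumption)

def log_det_info(design, a):
    # Fisher information of the logistic model p(x) = 1/(1+exp(-(a + B*x)))
    m11 = m12 = m22 = 0.0
    for x, weight in design:
        p = 1 / (1 + math.exp(-(a + B * x)))
        v = weight * p * (1 - p)     # design weight times Bernoulli variance
        m11 += v
        m12 += v * x
        m22 += v * x * x
    return math.log(m11 * m22 - m12 * m12)

def bayes_d(design):
    # Bayesian D-criterion E_a[log det M(a)], computed by substituting
    # a = MU_A + sqrt(2) * SD_A * t into the Gauss-Hermite rule
    return sum(wq * log_det_info(design, MU_A + math.sqrt(2) * SD_A * t)
               for t, wq in GH) / math.sqrt(math.pi)

wide = [(-1.5, 0.5), (1.5, 0.5)]     # two-point design at x = +/-1.5
tight = [(-0.2, 0.5), (0.2, 0.5)]    # two-point design at x = +/-0.2
print(bayes_d(wide) > bayes_d(tight))   # the wider design wins here
```

The same pattern extends to several parameters by using a product quadrature rule over the prior, which is how the expectation in the criterion stays cheap to evaluate inside an optimiser.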

5.
Response-adaptive designs are being used increasingly in applications, especially in early-phase clinical trials. This paper reviews a particular class of response-adaptive designs that have the property of picking the superior treatment with probability tending to one. This is a desirable property from an ethical point of view in clinical trials. The model underlying such designs is a randomly reinforced urn. This paper provides an overview of results for these designs, starting from the early paper of Durham and Yu (1990) up to the recent work by Flournoy, May, Moler and Plo (2010).
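A randomly reinforced urn can be simulated in a few lines (the success probabilities and the unit reinforcement scheme below are illustrative assumptions): each patient is assigned by drawing from the urn, and a success reinforces the drawn colour, so allocation drifts toward the superior treatment.

```python
import random

random.seed(7)
P = {"A": 0.7, "B": 0.4}       # unknown success probabilities (A superior)
urn = {"A": 1.0, "B": 1.0}     # initial urn composition
assigned = {"A": 0, "B": 0}

for _ in range(20000):
    total = urn["A"] + urn["B"]
    # draw a colour with probability proportional to urn composition
    arm = "A" if random.random() < urn["A"] / total else "B"
    assigned[arm] += 1
    if random.random() < P[arm]:   # success reinforces the drawn colour
        urn[arm] += 1.0

share_A = assigned["A"] / sum(assigned.values())
print(round(share_A, 2))   # allocation share of the superior arm, above 1/2
```

Because the superior arm is reinforced more often on average, the urn proportion of its colour converges to one, which is the "picking the superior treatment with probability tending to one" property the review discusses.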

6.
This paper deals with the finite-sample performance of a set of unit-root tests for cross-correlated panels. Most of the available macroeconomic time series cover short time periods. The lack of information, in terms of time observations, implies that univariate tests are not powerful enough to reject the null of a unit root, while panel tests, by exploiting the large number of cross-sectional units, have been shown to be a promising way of increasing power. We investigate the finite-sample properties of recently proposed panel unit-root tests for cross-sectionally correlated panels. Specifically, the size and power of Choi's [Econometric Theory and Practice: Frontiers of Analysis and Applied Research: Essays in Honor of Peter C. B. Phillips, Cambridge University Press, Cambridge (2001)], Bai and Ng's [Econometrica (2004), Vol. 72, p. 1127], Moon and Perron's [Journal of Econometrics (2004), Vol. 122, p. 81], and Phillips and Sul's [Econometrics Journal (2003), Vol. 6, p. 217] tests are analysed in a Monte Carlo simulation study. In summary, Moon and Perron's tests show good size and power for different values of T and N and for different model specifications. Focusing on Bai and Ng's procedure, the simulation study highlights that the pooled Dickey–Fuller generalized least squares test provides higher power than the pooled augmented Dickey–Fuller test for the analysis of non-stationary properties of the idiosyncratic components. Choi's tests are strongly oversized when the common factor influences the cross-sectional units heterogeneously.

7.
Bayesian experimental design is a fast-growing area of research with many real-world applications. As computational power has increased over the years, so has the development of simulation-based design methods, which involve a number of algorithms, such as Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayes methods, allowing more complex design problems to be solved. The Bayesian framework provides a unified approach for incorporating prior information and/or uncertainties regarding the statistical model with a utility function that describes the experimental aims. In this paper, we provide a general overview of the concepts involved in Bayesian experimental design and focus on describing some of the more commonly used Bayesian utility functions and methods for their estimation, as well as a number of algorithms that are used to search over the design space to find the Bayesian optimal design. We also discuss other computational strategies for further research in Bayesian optimal design.
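One commonly used Bayesian utility is the expected information gain, the expected Kullback-Leibler divergence between posterior and prior, which simulation-based methods typically estimate by nested Monte Carlo. A minimal sketch for a one-parameter logistic model (the prior, model, and sample sizes are assumptions, not from the review):

```python
import math
import random

random.seed(3)

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def lik(y, theta, x):
    # likelihood of one binary outcome under a logistic dose model
    p = sigmoid(theta * x)
    return p if y == 1 else 1 - p

def eig(x, n_outer=4000, n_inner=200):
    # nested Monte Carlo estimate of the expected information gain at
    # design point x, with a N(1, 1) prior on theta (assumed)
    inner = [random.gauss(1.0, 1.0) for _ in range(n_inner)]
    total = 0.0
    for _ in range(n_outer):
        theta = random.gauss(1.0, 1.0)                 # prior draw
        y = 1 if random.random() < sigmoid(theta * x) else 0
        marginal = sum(lik(y, t, x) for t in inner) / n_inner
        total += math.log(lik(y, theta, x)) - math.log(marginal)
    return total / n_outer

eig0, eig1 = eig(0.0), eig(1.0)
# x = 0 makes the response independent of theta, so it is uninformative
print(eig1 > eig0)
```

Searching over candidate x values with this estimator is the simplest instance of the utility-maximisation loop the review describes; the more elaborate algorithms it surveys mainly attack the cost and bias of this nested estimator.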

8.
This article considers processes of urban development within the context of mega-event preparations in Rio de Janeiro. We begin with a brief overview of these development processes, highlighting their connections to political and economic change in recent years. Proponents of these mega-event-led initiatives argue that Rio is undergoing a period of inclusive growth and integration: a perspective we call here a 'post-Third-World city' narrative of urban renewal. Critics, however, contend that urban officials are harnessing mega-events (e.g. the 2014 World Cup and the 2016 Olympic Games) to push forward a neoliberal agenda of socially unjust policies benefiting the interests of capital and marginalizing the city's poor, especially its favelas (i.e. the 'city-of-exception' thesis). In this article we explore the insights of these two perspectives and consider why they have grown popular in recent years. Though we side generally with the city-of-exception thesis, we argue that important geographic and historical particularities must also be accounted for. Without carefully situating analytical perspectives empirically, particularly when theoretical models are drawn from European and North American contexts, urban researchers risk concealing more than they reveal in analyses of rapidly developing countries like Brazil.

9.
In this paper, the problem of optimum experimental design for estimating parameters of multivariate regression functions is considered. We address the question of the conditions under which one can compose the optimal design from partial designs, obtained by considering partial regressions that depend on a reduced number of variables. After briefly reviewing and reinterpreting existing results, we provide some new conditions.

10.
We consider the Case 1 interval censoring approach for right‐censored survival data. An important feature of the model is that right‐censored event times are not observed exactly, but at some inspection times. The model covers as particular cases right‐censored data, current status data, and life table survival data with a single inspection time. We discuss the nonparametric estimation approach and consider three nonparametric estimators for the survival function of failure time: maximum likelihood, pseudolikelihood, and the naïve estimator. We establish strong consistency of the estimators with the L1 rate of convergence. Simulation results confirm consistency of the estimators.
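For current status data, a standard characterization is that the nonparametric maximum likelihood estimator of the distribution function at the ordered inspection times is the isotonic regression of the event indicators, computable by the pool-adjacent-violators algorithm (PAVA). A sketch with toy data (the data values are assumptions):

```python
def pava(values, weights=None):
    # pool-adjacent-violators: weighted isotonic (non-decreasing) regression
    if weights is None:
        weights = [1.0] * len(values)
    blocks = []                     # each block holds [mean, weight]
    for v, w in zip(values, weights):
        blocks.append([v, w])
        # merge adjacent blocks while the monotonicity constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, w2 = blocks.pop()
            v1, w1 = blocks.pop()
            blocks.append([(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2])
    out = []
    for v, w in blocks:
        out.extend([v] * round(w))
    return out

# current status data: at inspection time t_i we only learn whether the
# event has already occurred (delta = 1) or not (delta = 0)
times = [1, 2, 3, 4, 5, 6]            # sorted inspection times
deltas = [0, 1, 0, 1, 1, 1]           # event-by-inspection indicators
F_hat = pava([float(d) for d in deltas])   # NPMLE of F at the t_i
print(F_hat)   # -> [0.0, 0.5, 0.5, 1.0, 1.0, 1.0]
```

The violating pair at times 2 and 3 gets pooled into a common value of 0.5, which is exactly the averaging the maximum likelihood solution performs.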

11.
Non-response is a common source of error in many surveys. Because surveys are often costly instruments, quality-cost trade-offs play a continuing role in the design and analysis of surveys. Advances in telephone, computer, and Internet technology have had, and continue to have, considerable impact on the design of surveys. Recently, a strong focus on methods for survey data collection monitoring and tailoring has emerged as a new paradigm for efficiently reducing non-response error. Paradata and adaptive survey designs are key words in these new developments. Prerequisites to evaluating, comparing, monitoring, and improving quality of survey response are a conceptual framework for representative survey response, indicators to measure deviations thereof, and indicators to identify subpopulations that need increased effort. In this paper, we present an overview of representativeness indicators, or R-indicators, that are fit for these purposes. We give several examples and provide guidelines for their use in practice.
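The basic sample-level R-indicator is one minus twice the standard deviation of the estimated response propensities, so it equals 1 when response is fully representative and decreases as propensities become more contrasted. A minimal sketch (the propensity values are illustrative assumptions):

```python
import math

def r_indicator(propensities, weights=None):
    # R = 1 - 2 * S(rho): one minus twice the (weighted) standard deviation
    # of the estimated response propensities; R = 1 means fully
    # representative response, lower values mean more contrast
    if weights is None:
        weights = [1.0] * len(propensities)
    wsum = sum(weights)
    mean = sum(w * p for w, p in zip(weights, propensities)) / wsum
    var = sum(w * (p - mean) ** 2
              for w, p in zip(weights, propensities)) / wsum
    return 1 - 2 * math.sqrt(var)

uniform = [0.5] * 8                    # everyone equally likely to respond
contrast = [0.1] * 4 + [0.9] * 4       # strongly selective response
print(round(r_indicator(uniform), 2), round(r_indicator(contrast), 2))
```

In practice the propensities are themselves estimated from a response model, and partial R-indicators decompose the same variation by auxiliary variable to show where extra data collection effort should go.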

12.
Norms of citizenship are seen as a precondition for a functioning polity and society. But what determines the importance citizens attach to these norms? Are individual-level features, like education or social embeddedness, relevant? Do system-level features, like the economic situation or quality of governance, matter? Our findings from a multilevel analysis indicate that, paradoxically, a political system's effectiveness and legitimacy undermine the very norms on which it depends for both effectiveness and legitimation. In well-functioning states, citizens' attachment to civic norms declines. As for the effect of welfare policies, there is no "crowding-out" effect: state provision for citizens who are less well off does not reduce solidarity among citizens. Few individual-level characteristics that relate to the public sphere, such as social embeddedness, are found to matter, indicating that norms are perpetuated in the private sphere.

13.
This study investigates the economic impact of the financial regulations aimed at controlling the housing market in Korea under the administration of the late President Ro, which diligently fought the speculative bubble in the Korean real-estate market. We test the validity of the general prediction that financial regulations in the form of loan-to-value (LTV) and debt-to-income (DTI) restrictions would adversely affect the value of firms operating in the mortgage-lending industry. In this event study, we select two critical days as event dates and check whether the stock prices of financial firms react negatively to the announcements of the regulations. Overall, the initial imposition of the DTI restrictions (i.e., the first event) adversely affects those banks that hold a relatively large number of mortgage loans in their asset portfolios. By contrast, banks that hold a small number of mortgage loans appear to benefit from the risk-reducing effect of the DTI regulation. Subsequently, the reinforcement of the LTV and DTI rules (i.e., the second event) has negative impacts on the banks with large mortgage loans, and the degree of this adverse effect is greater in the second event than in the first. The reinforced regulations also unfavorably affect the savings banks with large mortgage loans, but to a lesser degree than their counterparts among the banks. Meanwhile, the reinforcement of the financial regulations has negligible impacts on the banks and the savings banks with smaller mortgage loans.

14.
Estimation of nitrogen response functions has a long history, and yet there is still considerable uncertainty about how much nitrogen to apply to agricultural crops. Nitrogen recommendations are usually based on estimation of agronomic production functions that typically use data from designed experiments; nitrogen experiments, for example, often use equally spaced levels of nitrogen. Past agronomic research is mostly supportive of plateau-type functional forms. The question addressed is: if one is willing to accept a specific plateau-type functional form as the true model, what experimental design is best for estimating the production function? The objective is to minimize the variance of the estimated expected-profit-maximizing level of input. Of particular interest is how well the commonly used equally spaced design performs in comparison with the optimal design. Mixed-effects models for winter wheat (Triticum aestivum L.) yield are estimated for both Mitscherlich and linear plateau functions. With three design points, one should be high enough to be on the plateau and one should be at zero. The choice of the middle design point makes little difference over a wide range of values. The optimal middle design point is lower for the Mitscherlich functional form than for the plateau function. Equally spaced designs with more design points have similar precision, and thus the loss from using a nonoptimal experimental design is small.
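The profit-maximising nitrogen level that the design aims to pin down can be sketched for both functional forms (all parameter and price values below are illustrative assumptions, not the paper's estimates):

```python
import math

# illustrative parameter values: assumptions, not estimates from the paper
YMAX, B0, B1 = 80.0, 40.0, 0.5   # plateau yield, intercept, response slope
C = 0.02                          # Mitscherlich rate parameter
P_GRAIN, P_N = 5.0, 0.5           # output price and nitrogen price

def linear_plateau(n):
    return min(B0 + B1 * n, YMAX)

def mitscherlich(n):
    return YMAX - (YMAX - B0) * math.exp(-C * n)

def optimal_n(yield_fn):
    # expected-profit-maximising nitrogen rate, found on a fine grid
    grid = [0.25 * k for k in range(1201)]   # 0 .. 300 units of N
    return max(grid, key=lambda n: P_GRAIN * yield_fn(n) - P_N * n)

n_lp = optimal_n(linear_plateau)   # plateau form: optimum sits at the knot
n_mit = optimal_n(mitscherlich)    # smooth form: optimum from the margin
print(n_lp, n_mit)
```

For the linear plateau the optimum sits at the knot whenever output is valuable enough, whereas the Mitscherlich optimum comes from equating the marginal yield value to the nitrogen price; this difference in where the curvature matters is one reason the two forms favour different middle design points.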

15.
The Wooldridge method is based on a simple and novel strategy for dealing with the initial values problem in nonlinear dynamic random-effects panel data models. This characteristic makes the method very attractive in empirical applications. However, its finite-sample performance and robustness are not yet fully known. In this paper we investigate the performance and robustness of this method in comparison with an ideal case in which the initial values are known constants, with a worst-case scenario based on an exogenous initial values assumption, and with Heckman's reduced-form approximation method, which is widely used in the literature. The dynamic random-effects probit and Tobit (type I) models are used as working examples. Various designs of the Monte Carlo experiments and two further empirical illustrations are provided. The results suggest that the Wooldridge method works very well only for panels of moderately long duration (longer than 5–8 periods). Heckman's reduced-form approximation is suggested for short panels (shorter than 5 periods). It is also found that all the methods tend to perform equally well for panels of long duration (longer than 15–20 periods). Copyright © 2011 John Wiley & Sons, Ltd.

16.
We study competitive interaction between two alternative models of digital content distribution over the Internet: peer‐to‐peer (p2p) file sharing and centralized client–server distribution. We present microfoundations for a stylized model of p2p file sharing where all peers are endowed with standard preferences and show that the endogenous structure of the network is conducive to sharing by a significant number of peers, even if sharing is costlier than freeriding. We build on this model of p2p to analyze the optimal strategy of a profit‐maximizing firm, such as Apple, that offers content available at positive prices. We characterize the size of the p2p network as a function of the firm's pricing strategy, and show that the firm may be better off setting high prices, allowing the network to survive, and that the p2p network may work more efficiently in the presence of the firm than in its absence.

17.
When outsourcing their logistics operations to transportation companies, manufacturers/retailers need to design a contract under which payment is made either in a lump sum or over time (i.e., per delivery). This paper investigates how the payment method (i.e., the type of contract) affects the transporter's delivery schedule by developing an analytical model based on optimal control theory and game theory. Our findings show that the transporter's delivery schedule depends on the method of payment and that the overall cost of hiring a transporter varies with the type of contract. We provide theoretical explanations for these findings along with managerial implications. Copyright © 2016 John Wiley & Sons, Ltd.

18.
We study an agency model in which an entrepreneur selects a manager from a candidate set. The selected manager's effort improves the project's potential environment and is a hidden action. The realized project environment is the entrepreneur's private information. A manager's utility has two components: (i) loyalty, with which the manager values the organization's profit, and (ii) selfishness, with which the manager values the monetary transfer he receives from the entrepreneur. We find that if the manager's task is easy enough, it is optimal to use a purely loyal manager. Otherwise, it can be optimal to use a manager with a mixture of loyalty and selfishness: the manager's mixed motivation alleviates the entrepreneur's incentive to misrepresent, and as a result the output distortion in the optimal contract can be reduced. In addition, when it is optimal to use a manager with mixed motivations, the entrepreneur selects someone who is more selfish than loyal.

19.
The Optimality of Single-group Designs for Certain Mixed Models
Thomas Schmelter, Metrika (2007), 65(2): 183–193
In this paper, optimal designs for the estimation of the fixed effects (population parameters) in a certain class of mixed models are investigated. Two classes of designs are compared: the class of single-group designs, in which all individuals are observed under the same approximate design, and the class of more-group designs with the same mean number of observations per individual, in which each individual can be observed under a different approximate design. It is shown that any design that is Φ-optimal in the class of single-group designs is also Φ-optimal in the larger class of more-group designs. The optimality criteria considered only have to satisfy mild assumptions, which is the case, for example, for the D-criterion and all linear criteria.

20.
This paper studies minimally supported D-optimal designs for polynomial regression models with logarithmically concave (log-concave) weight functions. Many weight functions commonly used in the design literature are log-concave; for example, weight functions such as exp(−x²) in Theorem 2.3.2 of Fedorov (Theory of Optimal Experiments, 1972) are log-concave. We show that the determinant of the information matrix of a minimally supported design is a log-concave function of the ordered support points and that the D-optimal design is unique. Therefore, D-optimal designs can be constructed numerically and efficiently by a cyclic exchange algorithm.
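A cyclic exchange algorithm for this setting can be sketched directly from the determinant structure: for a minimally supported design with equal weights, det M is proportional to the product of the weights w(x_i) times the squared Vandermonde determinant of the support points, so each point can be improved in turn by a one-dimensional search (the interval, grid, and sweep counts below are assumptions):

```python
import math

def log_det(points):
    # for a minimally supported design (degree+1 equally weighted points),
    # det M is proportional to prod_i w(x_i) * prod_{i<j} (x_j - x_i)^2,
    # here with the log-concave weight w(x) = exp(-x^2)
    s = sum(-x * x for x in points)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            s += 2 * math.log(abs(points[j] - points[i]))
    return s

def cyclic_exchange(degree, lo=-3.0, hi=3.0, steps=601, sweeps=20):
    pts = [lo + (hi - lo) * k / degree for k in range(degree + 1)]
    grid = [lo + (hi - lo) * k / (steps - 1) for k in range(steps)]
    for _ in range(sweeps):
        for i in range(len(pts)):
            others = pts[:i] + pts[i + 1:]
            # improve one support point at a time by grid search,
            # skipping candidates that coincide with another point
            pts[i] = max((g for g in grid if g not in others),
                         key=lambda g: log_det(others + [g]))
        pts.sort()
    return pts

pts = cyclic_exchange(degree=2)   # quadratic regression: 3 support points
print([round(p, 2) for p in pts])
```

For w(x) = exp(−x²) and a quadratic model, setting the derivative of the log-determinant to zero at a symmetric configuration gives support points 0 and ±√1.5, and the exchange iterations settle near these values; the log-concavity result in the abstract is what guarantees such coordinate-wise improvement cannot stall at a spurious local solution.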


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)