Similar Documents
Found 20 similar documents (search time: 93 ms)
1.
This paper uses free-knot and fixed-knot regression splines in a Bayesian context to develop methods for the nonparametric estimation of functions subject to shape constraints in models with log-concave likelihood functions. The shape constraints we consider include monotonicity, convexity and functions with a single minimum. A computationally efficient MCMC sampling algorithm is developed that converges faster than previous methods for non-Gaussian models. Simulation results indicate the monotonically constrained function estimates have good small sample properties relative to (i) unconstrained function estimates, and (ii) function estimates obtained from other constrained estimation methods when such methods exist. Also, asymptotic results show the methodology provides consistent estimates for a large class of smooth functions. Two detailed illustrations exemplify the ideas.
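The paper's estimator is a Bayesian free-knot spline fitted by MCMC; as a minimal stand-in for what the monotonicity constraint does to a least squares fit, here is the classical pool-adjacent-violators algorithm (all data below are illustrative):

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: least squares fit under a
    monotone (non-decreasing) constraint."""
    means, weights = [], []
    for v in np.asarray(y, dtype=float):
        means.append(v)
        weights.append(1.0)
        # merge blocks while the running block means violate monotonicity
        while len(means) > 1 and means[-2] > means[-1]:
            w = weights[-2] + weights[-1]
            m = (weights[-2] * means[-2] + weights[-1] * means[-1]) / w
            means[-2:] = [m]
            weights[-2:] = [w]
    return np.repeat(means, np.asarray(weights, dtype=int))
```

Violating neighbours are pooled into blocks whose common value is the block mean, so the output is the closest non-decreasing sequence in the least squares sense.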

2.
We consider the problem of estimating a relationship nonparametrically using regression splines when there exist both continuous and categorical predictors. We combine the global properties of regression splines with the local properties of categorical kernel functions to handle the presence of categorical predictors rather than resorting to sample splitting as is typically done to accommodate their presence. The resulting estimator possesses substantially better finite-sample performance than either its frequency-based peer or cross-validated local linear kernel regression or even additive regression splines (when additivity does not hold). Theoretical underpinnings are provided, Monte Carlo simulations are undertaken to assess finite-sample behavior, and two illustrative applications are provided. An implementation in R is available; see the R package ‘crs’ for details. Copyright © 2014 John Wiley & Sons, Ltd.
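The reference implementation is the R package ‘crs’; the Python sketch below only illustrates the categorical-kernel idea, substituting a linear basis for the spline basis. Observations from other categories are down-weighted by a bandwidth lam in [0, 1] instead of being discarded by sample splitting (lam = 0 recovers sample splitting, lam = 1 pools all categories):

```python
import numpy as np

def cat_kernel_fit(x, z, y, x0, z0, lam):
    """Weighted linear fit at (x0, z0): observations with z != z0 get
    kernel weight lam rather than being dropped. A linear basis stands
    in for the regression spline basis used in the paper."""
    w = np.where(z == z0, 1.0, lam)
    X = np.vander(np.asarray(x, dtype=float), 2)   # columns: [x, 1]
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ np.asarray(y, dtype=float))
    return beta[0] * x0 + beta[1]
```

In practice lam is chosen by cross-validation, which lets the data decide how much information to borrow across categories.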

3.
Penalized splines are used in various types of regression analyses, including non-parametric quantile, robust and the usual mean regression. In this paper, we focus on the penalized spline estimator with general convex loss functions. By specifying the loss function, we can obtain the mean estimator, quantile estimator and robust estimator. We first study the asymptotic properties of penalized splines. Specifically, we show the asymptotic bias and variance as well as the asymptotic normality of the estimator. Next, we discuss smoothing parameter selection for the minimization of the mean integrated squared error. The new smoothing parameter can be expressed uniquely using the asymptotic bias and variance of the penalized spline estimator. To validate the new smoothing parameter selection method, we provide a simulation study. The simulation results confirm the consistency of the estimator with the proposed smoothing parameter selection method and show that the proposed estimator behaves better than the estimator with generalized approximate cross-validation. A real data example is also addressed.
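A minimal sketch of the squared-error member of this convex-loss family (not the paper's implementation): a truncated cubic power basis with a ridge penalty on the knot coefficients stands in for the usual B-spline basis with a difference penalty. Swapping the squared error for the check loss or Huber loss gives the quantile and robust members, at the cost of an iterative solver:

```python
import numpy as np

def pspline_fit(x, y, knots, lam):
    """Penalized least squares: cubic trend plus truncated cubic terms
    at each knot; only the knot coefficients are penalized, so the
    cubic trend is left unshrunk."""
    X = np.column_stack([x ** p for p in range(4)] +
                        [np.clip(x - k, 0.0, None) ** 3 for k in knots])
    D = np.diag([0.0] * 4 + [1.0] * len(knots))
    beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
    return X @ beta
```

Increasing lam trades fidelity for smoothness, which is exactly the trade-off the paper's asymptotic bias and variance expressions quantify.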

4.
《Statistica Neerlandica》2018,72(2):109-125
Consider the standard nonparametric regression model and take as estimator the penalized least squares function. In this article, we study the trade-off between closeness to the true function and complexity penalization of the estimator, where complexity is described by a seminorm on a class of functions. First, we present an exponential concentration inequality revealing the concentration behavior of the trade-off of the penalized least squares estimator around a nonrandom quantity that depends on the problem under consideration. Then, under some conditions and for the proper choice of the tuning parameter, we obtain bounds for this nonrandom quantity. We illustrate our results with some examples that include the smoothing splines estimator.

5.
The economic theory of option pricing imposes constraints on the structure of call functions and state price densities. Except in a few polar cases, it does not prescribe functional forms. This paper proposes a nonparametric estimator of option pricing models which incorporates various restrictions (such as monotonicity and convexity) within a single least squares procedure. The bootstrap is used to produce confidence intervals for the call function and its first two derivatives and to calibrate a residual regression test of shape constraints. We apply the techniques to option pricing data on the DAX.
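One way to sketch such a shape-constrained least squares fit (a simplified stand-in for the paper's procedure, with an illustrative grid): write the call function as a non-negative mixture of hockey-stick payoffs max(s − K, 0) over a grid of terminal prices s. Non-negative weights automatically make the fit decreasing and convex in the strike K, and the weights themselves estimate a discrete state price density:

```python
import numpy as np
from scipy.optimize import nnls

def fit_call_function(strikes, prices, grid):
    """Non-negative least squares fit of observed call prices; the
    mixture representation enforces monotonicity and convexity in K."""
    A = np.maximum(grid[None, :] - strikes[:, None], 0.0)
    w, _ = nnls(A, prices)
    return A @ w, w
```

The bootstrap confidence intervals and the residual regression test in the paper would be layered on top of this fitting step.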

6.
This study analyses how minority employees engage with control in organizations. Unlike most critical studies of diversity management, which focus on how minority employees are discursively controlled, we approach (diversity) management as a constellation of both identity-regulating discourses and bureaucratic controls. We assume that minority employees are agents who actively resist and/or comply with the constellation of controls they are subject to. Based on qualitative data collected in a technical drawing company and a hospital, the specific constellation of controls in each organization is first reconstructed. Four interviews with minority employees are then analysed in depth, showing how their engagement with material and discursive controls creates both constraints and possibilities of micro-emancipation.

7.
In this paper, we study an estimation problem where the variables of interest are subject to both right censoring and measurement error. In this context, we propose a nonparametric estimation strategy for the hazard rate, based on a regression contrast minimized in a finite-dimensional functional space generated by spline bases. We prove a risk bound for the estimator in terms of integrated mean square error and discuss the rate of convergence when the dimension of the projection space is adequately chosen. Then we define a data-driven criterion of model selection and prove that the resulting estimator performs an adequate compromise. The method is illustrated via simulation experiments that show that the strategy is successful.

8.
The flow of natural gas within a gas transmission network is studied with the aim to optimize such networks. The analysis of real data provides a deeper insight into the behaviour of gas in- and outflow. Several models for describing dependence between the maximal daily gas flow and the temperature on network exits are proposed. A modified sigmoidal regression is chosen from the class of parametric models. As an alternative, a semi-parametric regression model based on penalized splines is considered. The comparison of models and the forecast of gas loads for very low temperatures based on both approaches is included. The application of the obtained results is discussed.
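The parametric branch of this comparison can be sketched as a sigmoidal fit of daily maximal load on temperature. All numbers below are synthetic and purely illustrative, and the functional form is a generic sigmoid, not necessarily the paper's modified one:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(t, top, slope, t0, base):
    """Gas load saturates at low temperatures and flattens
    towards a base load at high temperatures."""
    return base + top / (1.0 + np.exp(slope * (t - t0)))

# synthetic illustration: noise-free load curve, then recover the parameters
temps = np.linspace(-20.0, 30.0, 60)
load = sigmoid(temps, 500.0, 0.3, 8.0, 50.0)
popt, _ = curve_fit(sigmoid, temps, load, p0=[400.0, 0.2, 5.0, 30.0])
```

Extrapolating such a fit to very low temperatures, as the paper does for load forecasting, leans entirely on the assumed saturation shape; the penalized-spline alternative avoids that assumption but extrapolates less confidently.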

9.
Copulas are distributions with uniform marginals. Non-parametric copula estimates may violate the uniformity condition in finite samples. We look at whether it is possible to obtain valid piecewise linear copula densities by triangulation. The copula property imposes strict constraints on design points, making an equi-spaced grid a natural starting point. However, the mixed-integer nature of the problem makes a pure triangulation approach impractical on fine grids. As an alternative, we study ways of approximating copula densities with triangular functions in a way that guarantees the estimator is a valid copula density. The family of resulting estimators can be viewed as a non-parametric MLE of B-spline coefficients on possibly non-equally spaced grids under simple linear constraints. As such, it can be easily solved using standard convex optimization tools and allows for a degree of localization. A simulation study shows an attractive performance of the estimator in small samples and compares it with some of the leading alternatives. We demonstrate the empirical relevance of our approach using three applications. In the first application, we investigate how the body mass index of children depends on that of parents. In the second application, we construct a bivariate copula underlying the Gibson paradox from macroeconomics. In the third application, we show the benefit of using our approach in testing the null of independence against the alternative of an arbitrary dependence pattern.

10.
We investigate a novel database of 10,217 extreme operational losses from the Italian bank UniCredit. Our goal is to shed light on the dependence between the severity distribution of these losses and a set of macroeconomic, financial, and firm-specific factors. To do so, we use generalized Pareto regression techniques, where both the scale and shape parameters are assumed to be functions of these explanatory variables. We perform the selection of the relevant covariates with a state-of-the-art penalized-likelihood estimation procedure relying on L1-penalty terms. A simulation study indicates that this approach efficiently selects covariates of interest and tackles spurious regression issues encountered when dealing with integrated time series. Lastly, we illustrate the impact of different economic scenarios on the requested capital for operational risk. Our results have important implications in terms of risk management and regulatory policy.
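The unconditional building block of this model is a generalized Pareto fit to loss severities; the paper then lets the scale and shape depend on covariates via an L1-penalized likelihood, which the sketch below (synthetic data, fixed parameters) deliberately omits:

```python
import numpy as np
from scipy.stats import genpareto

# synthetic exceedances over a threshold of zero; parameters are illustrative
rng = np.random.default_rng(42)
losses = genpareto.rvs(c=0.3, scale=2.0, size=20000, random_state=rng)

# fit shape and scale by maximum likelihood, holding the location at the threshold
shape_hat, _, scale_hat = genpareto.fit(losses, floc=0.0)
```

In the covariate version, log-scale (and possibly shape) become linear functions of the regressors and the likelihood is maximized with an L1 penalty to zero out irrelevant ones.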

11.
In what follows, primal and dual optimal control problems are formulated. The distinction between state and control variables, state equations and control-variable constraints is discarded. A general treatment is achieved by the introduction of a matrix F. The derivation of the dual from the primal approximately follows the approach of J.B. Rosen. The corresponding weak duality theorem is deduced. It is then indicated how typical optimal control formulations may be obtained by specialising the matrices and vectors and substituting them. The authors gratefully acknowledge friendly advice and assistance given by Professor G.B. Dantzig, Stanford University, USA.

12.
The article presents an algorithm for linear regression computations subject to linear parametric equality constraints, linear parametric inequality constraints, or a mixture of the two. No rank conditions are imposed on the regression specification or the constraint specification. The algorithm requires a full Moore-Penrose g-inverse which entails extra computational effort relative to other orthonormalization type algorithms. In exchange, auxiliary statistical information is generated: feasibility of a set of constraints may be checked, estimability of a linear parametric function may be checked, and bias and variance may be decomposed by source.
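For the equality-constrained case, the textbook restricted least squares formula can be written entirely in terms of Moore-Penrose pseudoinverses, in the spirit of the article's g-inverse approach (this sketch is not the article's algorithm and assumes the constrained problem is well posed):

```python
import numpy as np

def constrained_ls(X, y, R, r):
    """Minimize ||y - X b||^2 subject to R b = r, using pseudoinverses
    in place of ordinary inverses."""
    G = np.linalg.pinv(X.T @ X)              # g-inverse of the normal matrix
    b = G @ X.T @ y                          # unconstrained least squares
    A = np.linalg.pinv(R @ G @ R.T)
    return b - G @ R.T @ A @ (R @ b - r)     # project onto the constraint set
```

Inequality constraints, feasibility checks and the bias/variance decomposition the article describes require the fuller machinery.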

13.
Covariate information is often available for each subject in randomised clinical trials prior to treatment assignment; it is commonly used to adjust for baseline characteristics predictive of the outcome, increasing precision and improving power to detect a treatment effect. Motivated by a nonparametric covariance analysis, we study a projection approach to making objective covariate adjustment in randomised clinical trials on the basis of two unbiased estimating functions that decouple the outcome and covariate data. The proposed projection approach extends a weighted least-squares procedure by projecting one of the estimating functions onto the linear subspace spanned by the other estimating function that is E-ancillary for the average treatment effect. Compared with the weighted least-squares method, the projection method allows for objective inference on the average treatment effect by exploiting the treatment-specific covariate–outcome associations. The resulting projection-based estimator of the average treatment effect is asymptotically efficient when the treatment-specific working regression models are correctly specified and is asymptotically more efficient than other existing competitors when the treatment-specific working regression models are misspecified. The proposed projection method is illustrated by an analysis of data from an HIV clinical trial. In a simulation study, we show that the proposed projection method compares favourably with its competitors in finite samples.
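A simpler relative of this idea, shown here as a hedged stand-in rather than the paper's projection estimator, is the g-computation form of covariate adjustment: fit a working regression of outcome on covariates within each arm, predict for every subject, and average the difference of predictions:

```python
import numpy as np

def adjusted_ate(t, y, x):
    """Covariate-adjusted average treatment effect with per-arm linear
    working models; t is the 0/1 treatment indicator."""
    X = np.column_stack([np.ones_like(x, dtype=float), x])
    mu = {}
    for arm in (0, 1):
        beta, *_ = np.linalg.lstsq(X[t == arm], y[t == arm], rcond=None)
        mu[arm] = X @ beta                   # predicted outcome for everyone
    return np.mean(mu[1] - mu[0])
```

Like the projection estimator, this exploits the treatment-specific covariate–outcome associations; under randomisation it remains consistent even when the linear working models are misspecified.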

14.
Macro-integration is the process of combining data from several sources at an aggregate level. We review a Bayesian approach to macro-integration with special emphasis on the inclusion of inequality constraints. In particular, an approximate method of dealing with inequality constraints within the linear macro-integration framework is proposed. This method is based on a normal approximation to the truncated multivariate normal distribution. The framework is then applied to the integration of international trade statistics and transport statistics. By combining these data sources, transit flows can be derived as differences between specific transport and trade flows. Two methods of imposing the inequality restrictions that transit flows must be non-negative are compared. Moreover, the figures are improved by imposing the equality constraints that aggregates of incoming and outgoing transit flows must be equal.
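A deterministic miniature of this reconciliation problem (a least-squares stand-in for the paper's Bayesian machinery, with a hypothetical identity matrix A): adjust preliminary figures as little as possible so that the accounting identities hold and all flows stay non-negative:

```python
import numpy as np
from scipy.optimize import minimize

def reconcile(x0, A, b):
    """Minimize ||x - x0||^2 subject to A x = b and x >= 0."""
    res = minimize(lambda x: np.sum((x - x0) ** 2),
                   np.clip(x0, 0.0, None),          # feasible-ish start
                   method="SLSQP",
                   bounds=[(0.0, None)] * len(x0),
                   constraints=[{"type": "eq", "fun": lambda x: A @ x - b}])
    return res.x
```

The equality rows of A encode constraints such as "incoming and outgoing transit aggregates must be equal"; the bounds encode the non-negativity of transit flows.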

15.
The goal of this article is to develop a flexible Bayesian analysis of regression models for continuous and categorical outcomes. In the models we study, covariate (or regression) effects are modeled additively by cubic splines, and the error distribution (that of the latent outcomes in the case of categorical data) is modeled as a Dirichlet process mixture. We employ a relatively unexplored but attractive basis in which the spline coefficients are the unknown function ordinates at the knots. We exploit this feature to develop a proper prior distribution on the coefficients that involves the first and second differences of the ordinates, quantities about which one may have prior knowledge. We also discuss the problem of comparing models with different numbers of knots or different error distributions through marginal likelihoods and Bayes factors which are computed within the framework of Chib (1995) as extended to DPM models by Basu and Chib (2003). The techniques are illustrated with simulated and real data.

16.
The paper extends the Robbins and Pearce (1992) two-stage turnaround response model to include governance factors. In addition to retrenchment and recovery, the paper proposes the addition of a realignment stage, referring specifically to the realignment of expectations of principal and agent groups. The realignment stage imposes a threshold that must be crossed before the retrenchment and hence recovery stage can be entered. Crossing this threshold is problematic to the extent that the interests of governance-stakeholder groups diverge in a crisis situation. The severity of the crisis impacts on the bases of strategy contingent asset valuation, leading to the fragmentation of stakeholder interests. In some cases the consequence may be that management are prevented from carrying out turnarounds by governance constraints. The paper uses a case study to illustrate these dynamics, and like the Robbins and Pearce study, it focuses on the textile industry. A longitudinal approach is used to show the impact of the removal of governance constraints. The empirical evidence suggests that such financial constraints become less serious to the extent that there is a functioning market for corporate control. Building on governance research and turnaround literature, the paper also outlines general necessary and sufficient conditions for successful turnarounds.

17.
The traditional fuzzy regression model involves two solving processes. First, the extension principle is used to derive the membership function of extrapolated values, and then, attempts are made to include every collected value with a membership degree of at least h in the fuzzy regression interval. However, the membership function of extrapolated values is sometimes highly complex, and it is difficult to determine the h value, i.e., the degree of fit between the input values and the extrapolative fuzzy output values, when the information obtained from the collected data is insufficient. To solve this problem, we proposed a simplified fuzzy regression equation based on Carlsson and Fullér’s possibilistic mean and variance method and used it for modeling the constraints and objective function of a fuzzy regression model without determining the membership function of extrapolative values and the value of h. Finally, we demonstrated the application of our model in forecasting pneumonia mortality. Thus, we verified the effectiveness of the proposed model and confirmed the potential benefits of our approach, in which the forecasting error is very small.

18.
This article develops a context-sensitive approach to analyse how and why voice operates in small- to medium-sized enterprises (SMEs), an area that remains under-theorised and under-researched. By building on a priori frameworks with proven ability to unpack complexity and take account of the wider context of SMEs, this article explores how resources (human and social capital) and constraints (product market, labour market and strategic orientation) interact to shape voice practices. The article finds significant differences between ‘reported’ compared with ‘actual’ practices in situ, and identifies different types of firms (‘strategic market regulation’, ‘strategic market-led’ and ‘non-strategic market-led’) along with the factors that influence the form and practice of voice. Overall, the article argues that researchers should further pursue research that appreciates the layered nature of ontology and the role played by firm context to explain complex organisational phenomena, if we are to advance our understanding of voice practices in SMEs and beyond.

19.
Inference in the inequality constrained normal linear regression model is approached as a problem in Bayesian inference, using a prior that is the product of a conventional uninformative distribution and an indicator function representing the inequality constraints. The posterior distribution is calculated using Monte Carlo numerical integration, which leads directly to the evaluation of expected values of functions of interest. This approach is compared with others that have been proposed. Three empirical examples illustrate the utility of the proposed methods using an inexpensive 32-bit microcomputer.
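The Monte Carlo scheme can be sketched directly: draw from the unrestricted posterior of the coefficients, keep only the draws satisfying the constraint indicator, and average functions of interest over the retained draws. (For brevity this sketch uses a normal approximation where the exact posterior under the uninformative prior is multivariate-t.)

```python
import numpy as np

def constrained_posterior_mean(X, y, constraint, n_draws=20000, seed=1):
    """Posterior mean of beta under inequality constraints, via
    accept-reject from the unrestricted posterior; also returns the
    acceptance rate (posterior probability of the constraint set)."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    V = np.linalg.inv(X.T @ X)
    b = V @ X.T @ y
    s2 = np.sum((y - X @ b) ** 2) / (n - k)
    draws = rng.multivariate_normal(b, s2 * V, size=n_draws)
    keep = np.array([constraint(d) for d in draws])
    return draws[keep].mean(axis=0), keep.mean()
```

Any expectation of interest, not just the mean, can be computed by averaging the corresponding function over the retained draws; the acceptance rate becomes very small when the data conflict with the constraints, which is the main practical limitation of plain accept-reject.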

20.
Single-index models are popular regression models that are more flexible than linear models and still maintain more structure than purely nonparametric models. We consider the problem of estimating the regression parameters under a monotonicity constraint on the unknown link function. In contrast to the standard approach of using smoothing techniques, we review different “non-smooth” estimators that avoid the difficult smoothing parameter selection. For about 30 years, it has been conjectured that the profile least squares estimator is a √n-consistent estimator of the regression parameter, but the only non-smooth argmin/argmax estimators that are actually known to achieve this √n-rate are not based on the nonparametric least squares estimator of the link function. However, solving a score equation corresponding to the least squares approach results in √n-consistent estimators. We illustrate the good behavior of the score approach via simulations. The connection with the binary choice and current status linear regression models is also discussed.
