Similar Literature
20 similar documents found.
1.
This paper presents a design for compensation systems for green strategy implementation based on parametric and non‐parametric approaches. The purpose of the analysis is to use formal modeling to explain the issues that arise with the multi‐task problem of implementing an environmental strategy in addition to an already existing profit‐oriented strategy. For the first class of compensation systems (parametric), a multi‐task model is used as a basis. For the second class of compensation systems (non‐parametric), data envelopment analysis is applied. Copyright © 2003 John Wiley & Sons, Ltd. and ERP Environment

2.
I examine the effects of insurance status and managed care on hospitalization spells, and develop a new approach for sample selection problems in parametric duration models. MLE of the Flexible Parametric Selection (FPS) model does not require numerical integration or simulation techniques. I discuss application to the exponential, Weibull, log‐logistic and gamma duration models. Applying the model to the hospitalization data indicates that the FPS model may be preferred even in cases in which other parametric approaches are available. Copyright © 2002 John Wiley & Sons, Ltd.

3.
We consider the estimation of the conditional mode function when the covariates take values in some abstract function space. The main goal of this paper is to establish the almost complete convergence and the asymptotic normality of the kernel estimator of the conditional mode when the process is assumed to be strongly mixing and under a concentration property on the functional regressors. Some applications are given. This approach can be applied in time‐series analysis to prediction and to the construction of confidence bands. We illustrate our methodology using El Niño data.
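As a rough illustration of the kernel conditional-mode idea described in this abstract (not the authors' implementation), the sketch below estimates the conditional mode for discretised functional covariates. The Gaussian kernel, the L2 semi-metric, the bandwidths h_x and h_y, and the simulated curves are all illustrative assumptions.

```python
import numpy as np

def conditional_mode(X, y, x0, h_x, h_y, y_grid):
    """Kernel estimate of the conditional mode argmax_y f(y | X = x0).

    X       : (n, p) curves discretised on a common grid (functional covariates)
    y       : (n,) scalar responses
    x0      : (p,) query curve
    h_x,h_y : bandwidths in the covariate and response directions
    y_grid  : candidate response values over which the density is maximised
    """
    d = np.sqrt(((X - x0) ** 2).mean(axis=1))      # L2 semi-metric between curves
    w = np.exp(-0.5 * (d / h_x) ** 2)              # kernel weights in functional space
    # double-kernel estimate of the conditional density, evaluated on y_grid
    dens = np.array([np.sum(w * np.exp(-0.5 * ((y - yg) / h_y) ** 2)) for yg in y_grid])
    return y_grid[np.argmax(dens)]                 # normalising constants do not change the argmax

# toy illustration with purely synthetic curves
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
X = np.sin(2 * np.pi * np.outer(rng.uniform(0.5, 2.0, 200), t))
y = X[:, :25].mean(axis=1) + 0.1 * rng.standard_normal(200)
y_grid = np.linspace(y.min(), y.max(), 201)
print(conditional_mode(X, y, X[0], h_x=0.2, h_y=0.1, y_grid=y_grid))
```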

4.
This article is concerned with inference on seemingly unrelated non‐parametric regression models with serially correlated errors. Based on an initial estimator of the mean functions, we first construct an efficient estimator of the autoregressive parameters of the errors. Then, by applying an undersmoothing technique, and taking both the contemporaneous correlation among equations and the serial correlation into account, we propose an efficient two‐stage local polynomial estimator of the unknown mean functions. It is shown that the resulting estimator has the same bias as estimators that neglect the contemporaneous and/or serial correlation, but a smaller asymptotic variance. The asymptotic normality of the resulting estimator is also established. In addition, we develop a wild block bootstrap test for the goodness‐of‐fit of the models. The finite sample performance of our procedures is investigated in a simulation study, the results of which are strongly supportive, and a real data set is analysed to illustrate the usefulness of our procedures.

5.
Stochastic frontier models are often employed to estimate fishing vessel technical efficiency. Under certain assumptions, these models yield efficiency measures that are means of truncated normal distributions. We argue that these measures are flawed, and use the results of Horrace (2005) to estimate efficiency for 39 vessels in the Northeast Atlantic herring fleet, based on each vessel's probability of being efficient. We develop a subset selection technique to identify groups of efficient vessels at pre‐specified probability levels. When homogeneous production is assumed, inferential inconsistencies exist between our methods and the methods of ranking the means of the technical inefficiency distributions for each vessel. When production is allowed to be heterogeneous, these inconsistencies are mitigated. Copyright © 2007 John Wiley & Sons, Ltd.

6.
As a result of the increasing adoption of private sector firms' values and concepts, non‐profit organizations (NPOs) are becoming more and more aware of the importance of intangible assets for achieving competitive advantages. Even though reputation can be considered an organization's central intangible asset, there is still no appropriate approach for measuring reputation in this context. In this paper, we identify the dimensions of NPO reputation and develop indices to measure these components. We develop a model by means of a qualitative inquiry and a quantitative study using a large‐scale sample from the German general public. We find support for a two‐dimensional measurement approach comprising an affective and a cognitive component as well as four antecedent constructs (“quality,” “performance,” “organizational social responsibility (OSR),” and “attractiveness”). The results of a second quantitative study, in which we examine the relationship of NPO reputation with important outcome variables such as willingness to donate or to work as an honorary member, provide support for the stability and criterion validity of the measurement approach. Furthermore, the results reveal the importance of the affective dimension in positively influencing donor behavior. Copyright © 2009 John Wiley & Sons, Ltd.

7.
For business and environmental reasons, increased understanding of green consumer behavior is essential. This paper addresses consumer adoption and non‐adoption of a high involvement eco‐innovation (the alternative fuel vehicle, AFV). The purpose is to integrate two research streams to explore factors driving and hindering adoption. The factors are rooted in environmental psychology research and the diffusion of innovation literature. Survey results on Swedish car owners are reported. The results indicate that adopters and non‐adopters differ on norms, attitudes, novelty seeking and on how innovation attributes are perceived. Furthermore, the results show that the groups rank car attributes such as fuel consumption and carbon dioxide emissions differently. The main contribution of the paper is the integration of norms and attitudes together with consumer adoption factors in analyzing green consumer behavior in relation to a high involvement product. The implications for business and marketing strategy and for environmental policy are discussed. Copyright © 2010 John Wiley & Sons, Ltd and ERP Environment.

8.
This paper analyzes the effect of public R&D subsidies on firms' private R&D investment per employee and new product sales in German manufacturing. Parametric and semiparametric two‐step selection models are applied to this evaluation problem. The results show that the average treatment effect on the treated firms' R&D intensity is positive. The estimated effects are robust with respect to the different selection models. Further results show that publicly induced R&D spending is as productive as private R&D investment in generating new product sales. Copyright © 2008 John Wiley & Sons, Ltd.
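The abstract does not spell out the estimator, so the sketch below shows only a generic parametric two-step (Heckman-type) selection correction as a point of reference, not the authors' parametric or semiparametric models. The variable names (subsidy indicator d, outcome y, covariates X, selection covariates Z) and the synthetic data are assumptions.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def two_step_selection(y, d, X, Z):
    """Parametric two-step (Heckman-type) selection correction.

    d : binary subsidy-receipt indicator, y : outcome (e.g. R&D intensity),
    X : outcome-equation covariates, Z : selection-equation covariates.
    """
    # step 1: probit for the selection (treatment) equation
    probit = sm.Probit(d, sm.add_constant(Z)).fit(disp=False)
    index = sm.add_constant(Z) @ probit.params
    imr = norm.pdf(index) / norm.cdf(index)          # inverse Mills ratio
    # step 2: outcome regression on the treated, augmented with the control term
    treated = d == 1
    X2 = sm.add_constant(np.column_stack([X[treated], imr[treated]]))
    return sm.OLS(y[treated], X2).fit()

# synthetic illustration: selection on the unobservable u induces the bias being corrected
rng = np.random.default_rng(3)
n = 2000
Z = rng.standard_normal((n, 2))
X = Z[:, :1] + 0.5 * rng.standard_normal((n, 1))
u, e = rng.standard_normal(n), rng.standard_normal(n)
d = (Z @ np.array([1.0, -0.5]) + u > 0).astype(int)
y = 1.0 + 2.0 * X[:, 0] + 0.8 * u + e
print(two_step_selection(y, d, X, Z).params.round(2))
```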

9.
10.
Many new statistical models may enjoy better interpretability and numerical stability than traditional models in survival data analysis. Specifically, the threshold regression (TR) technique based on the inverse Gaussian distribution is a useful alternative to the Cox proportional hazards model for analysing lifetime data. In this article we consider a semi‐parametric modelling approach for TR and contribute implementation and theoretical details for model fitting and statistical inference. Extensive simulations are carried out to examine the finite sample performance of the parametric and non‐parametric estimates. A real example is analysed to illustrate our methods, along with a careful diagnosis of model assumptions.
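For readers unfamiliar with threshold regression, the sketch below fits a standard first-hitting-time model in which lifetimes follow an inverse Gaussian law. The link functions ln(y0) = Xβ and μ = Xγ, the unit process variance, and the simulated data are assumptions for illustration; they need not match the semi-parametric approach of the article.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import invgauss

def neg_loglik(params, t, X):
    """Negative log-likelihood of a first-hitting-time (threshold regression) model:
    T is the time a Wiener process with drift mu and unit variance, started at the
    latent level y0 > 0, first reaches zero; ln(y0) = X @ beta and mu = X @ gamma."""
    k = X.shape[1]
    beta, gamma = params[:k], params[k:]
    y0, mu = np.exp(X @ beta), X @ gamma
    logf = np.log(y0) - 0.5 * np.log(2 * np.pi * t ** 3) - (y0 + mu * t) ** 2 / (2 * t)
    return -np.sum(logf)

# simulate lifetimes from the implied inverse Gaussian law (mean y0/|mu|, shape y0^2)
rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
beta_true, gamma_true = np.array([0.5, 0.3]), np.array([-1.0, 0.2])
y0, mu = np.exp(X @ beta_true), X @ gamma_true
t = invgauss.rvs(mu=1.0 / (-mu * y0), scale=y0 ** 2, random_state=rng)

fit = minimize(neg_loglik, x0=np.zeros(4), args=(t, X), method="BFGS")
print(fit.x.round(2))   # recovers (beta, gamma) up to sampling error
```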

11.
Economic theory does not always specify the functional relationship between dependent and explanatory variables, or even isolate a particular set of covariates. This means that model uncertainty is pervasive in empirical economics. In this paper, we indicate how Bayesian semi‐parametric regression methods in combination with stochastic search variable selection can be used to address two model uncertainties simultaneously: (i) the uncertainty with respect to the variables which should be included in the model and (ii) the uncertainty with respect to the functional form of their effects. The presented approach enables the simultaneous identification of robust linear and nonlinear effects. The additional insights gained are illustrated on applications in empirical economics, namely willingness to pay for housing, and cross‐country growth regression.

12.
Consumption‐based equivalence scales are estimated by applying the extended partially linear model (EPLM) to the 1998 German Income and Consumption Survey (EVS). In this model the equivalence scales are identified from nonlinearities in household demand. The econometric framework should therefore not impose strong restrictions on the functional forms of household expenditure shares. The chosen semi‐parametric specification meets this requirement: it is flexible, it yields √n‐consistent parameter estimates and it is consistent with consumer theory. Estimated equivalence scales are below or in the range of the expert equivalence scales of the German social benefits system. Copyright © 2006 John Wiley & Sons, Ltd.

13.
14.
This paper proposes a method of data analysis founded on the philosophy and understanding of uncertain knowledge developed by Bruno de Finetti. Specifically, the paper investigates the informational content of interest rates for the prediction of M1. This empirical application replicates those of Cooley and LeRoy (1981) and McAleer, Pagan, and Volker (1985), but the procedures and their interpretation follow the operational subjective approach. The issue of an autocorrelated error structure is recast in the operational subjective context. Methods are developed to assess the interest sensitivity of the demand for money in this context. © 1996 John Wiley & Sons, Ltd.

15.
We provide a partial ordering view of horizontal inequity (HI), based on the Lorenz criterion, associated with different post‐tax income distributions and a (bistochastic) non‐parametric estimated benchmark distribution. As a consequence, several measures consistent with the Lorenz criterion can be rationalized. In addition, we establish the so‐called HI transfer principle, which imposes a normative minimum requirement that any HI measure must satisfy. Our proposed HI ordering is consistent with this principle. Moreover, we adopt a cardinal view to decompose the total effect of a tax system into a welfare gain caused by HI‐free income redistribution and a welfare loss caused by HI, without any additive decomposable restriction on the indices. Hence, more robust tests can be applied. Other decompositions in the literature are seen as particular cases.
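For equally sized samples, the Lorenz-criterion comparisons underlying such an ordering reduce to comparing cumulative income shares. The minimal sketch below only checks Lorenz dominance between a post-tax distribution and a hypothetical benchmark; it is a building block, not the authors' HI measure, which additionally involves a bistochastic non-parametric benchmark distribution.

```python
import numpy as np

def lorenz(x):
    """Lorenz curve ordinates L(k/n), k = 0..n, for an income vector."""
    x = np.sort(np.asarray(x, dtype=float))
    return np.insert(np.cumsum(x) / x.sum(), 0, 0.0)

def lorenz_dominates(x, y):
    """True if x Lorenz-dominates y: its curve lies weakly above y's everywhere.
    Assumes equal sample sizes so both curves share the same abscissae."""
    return bool(np.all(lorenz(x) >= lorenz(y)))

post_tax = np.array([12, 18, 25, 40, 55], dtype=float)
benchmark = np.array([10, 15, 25, 45, 55], dtype=float)   # hypothetical HI-free benchmark
print(lorenz_dominates(post_tax, benchmark))
```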

16.
The retail business is multiproduct by nature. In a recent issue of this journal, Richards (2006) introduces a multiproduct model for perishable food items. Richards concludes that depth and breadth of sales are complementary marketing tools for increasing store sales, and he provides empirical support for this hypothesis. We show in the following that Richards' model does not suggest a complementarity between depth and breadth of sales; instead, these tools are substitutes in his model. Therefore, the interpretation of his empirical result has to be reconsidered. Copyright © 2009 John Wiley & Sons, Ltd.

17.
This paper derives a procedure for simulating continuous non‐normal distributions with specified L‐moments and L‐correlations in the context of power method polynomials of order three. It is demonstrated that the proposed procedure has computational advantages over the traditional product‐moment procedure in terms of solving for intermediate correlations. Simulation results also demonstrate that the proposed L‐moment‐based procedure is an attractive alternative to the traditional procedure when distributions with more severe departures from normality are considered. Specifically, estimates of L‐skew and L‐kurtosis are superior to the conventional estimates of skew and kurtosis in terms of both relative bias and relative standard error. Further, the L‐correlation is also demonstrated to be less biased and more stable than the Pearson correlation. It is also shown how the proposed L‐moment‐based procedure can be extended to the larger class of power method distributions associated with polynomials of order five.
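For reference, the sample L-moments and L-moment ratios mentioned above can be computed from probability-weighted moments as in the sketch below; the simulated normal data are purely illustrative (for the normal distribution, L-skew is 0 and L-kurtosis is about 0.123).

```python
import numpy as np

def sample_lmoments(x):
    """Unbiased sample L-moments l1..l4 plus L-skew (tau3) and L-kurtosis (tau4)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    # probability-weighted moments b0..b3
    b0 = x.mean()
    b1 = np.sum((i - 1) * x) / (n * (n - 1))
    b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    # convert to L-moments (Hosking's relations)
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2

rng = np.random.default_rng(2)
print(sample_lmoments(rng.standard_normal(10_000)))
```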

18.
Despite certain advances in non‐randomized response (NRR) techniques over the past six years, the existing non‐randomized crosswise and triangular models have several limitations in practice. In this paper, I propose a new NRR model, called the parallel model, which has a wider application range. Asymptotic properties of the maximum likelihood estimator (and its modified version) for the proportion of interest are explored. Theoretical comparisons with the crosswise and triangular models show that the parallel model is more efficient than the two existing NRR models over most of the possible parameter range. Bayesian methods for analyzing survey data from the parallel model are developed. A case study on college students' premarital sexual behavior in Wuhan and a case study on plagiarism at the University of Hong Kong are conducted and used to illustrate the proposed methods. © 2014 The Authors. Statistica Neerlandica © 2014 VVS.

19.
A two‐part process is employed to analyse the role of efficiency in merger and acquisition (M&A) activity in Australian credit unions during the period 1993–1997. The measures of efficiency are derived using the non‐parametric technique of data envelopment analysis. The first part uses panel data in the probit model to relate pure technical efficiency, along with other managerial, regulatory and financial factors, to the probability of merger activity, either as an acquiring or acquired entity. The results indicate that loan portfolio diversification, management ability, earnings and asset size are a significant influence on the probability of acquisition, though the primary determinant of being acquired is smaller asset size. The second part uses a tobit model adapted to a panel framework to analyse post‐merger efficiency. Mergers appear to have improved both pure technical efficiency and scale efficiency in the credit union industry. Copyright © 2001 John Wiley & Sons, Ltd.
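As a concrete illustration of the first-stage efficiency measurement, the sketch below solves an input-oriented, constant-returns-to-scale DEA program with an off-the-shelf LP solver. The credit-union inputs and outputs are hypothetical, and the study itself uses pure technical efficiency, which corresponds to a variable-returns formulation obtained by adding the convexity constraint Σλ = 1.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR (constant returns to scale) DEA efficiency scores.
    X: (n, m) inputs, Y: (n, s) outputs; returns theta in (0, 1] for each unit."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # decision variables: theta, lambda_1..lambda_n; minimise theta
        c = np.r_[1.0, np.zeros(n)]
        # inputs:  sum_j lambda_j x_j <= theta x_o   ->  -theta x_o + X'lambda <= 0
        A_in = np.hstack([-X[o][:, None], X.T])
        b_in = np.zeros(m)
        # outputs: sum_j lambda_j y_j >= y_o         ->  -Y'lambda <= -y_o
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores[o] = res.x[0]
    return scores

# toy example: 5 credit unions, 2 inputs (staff, operating cost), 1 output (loans)
X = np.array([[20, 300], [15, 200], [40, 500], [25, 350], [30, 280]], dtype=float)
Y = np.array([[100], [90], [160], [110], [120]], dtype=float)
print(dea_ccr_input(X, Y).round(3))
```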

20.
A rich theory of production and analysis of productive efficiency has developed since the pioneering work of Tjalling C. Koopmans and Gerard Debreu. Michael J. Farrell published the first empirical study, and it appeared in a statistical journal (Journal of the Royal Statistical Society), even though the article provided no statistical theory. The literature in econometrics, management science, operations research and mathematical statistics has since been enriched by hundreds of papers seeking to develop or implement new tools for analysing the productivity and efficiency of firms. Both parametric and non‐parametric approaches have been proposed. The mathematical challenge is to derive estimators of production, cost, revenue or profit frontiers, which represent, in the case of production frontiers, the optimal loci of combinations of inputs (such as labour, energy and capital) and outputs (the products or services produced by the firms). Optimality is defined in terms of various economic considerations, and the efficiency of a particular unit is then measured by its distance to the estimated frontier. The statistical problem can be viewed as that of estimating the support of a multivariate random variable, subject to shape constraints, in multiple dimensions. These techniques are applied in thousands of papers in the economics and business literature. This ‘guided tour’ reviews the development of various non‐parametric approaches since the early work of Farrell, and describes remaining challenges and open issues in this area. © 2014 The Authors. International Statistical Review © 2014 International Statistical Institute
