Similar Documents
20 similar documents found.
1.
We consider situations where the a priori guidance provided by theoretical considerations indicates only that the function linking the endogenous and exogenous variables is monotone and concave (or convex). We present methods to evaluate the adequacy of a parametric functional form to represent the relationship given the minimal maintained assumption of monotonicity and concavity (or convexity). We evaluate the adequacy of an assumed parametric form by comparing the deviations of the fitted parametric form from the observed data with the corresponding deviations estimated under DEA. We illustrate the application of our proposed methods using data collected from school districts in Texas. Specifically, we examine whether the Cobb–Douglas and translog specifications commonly employed in studies of education production are appropriate characterizations. Our tests reject the hypothesis that either the Cobb–Douglas or the translog specification is an adequate approximation to the general monotone and concave production function for the Texas school districts.
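For reference, the two parametric specifications under test have the following standard single-output forms (textbook statements; the paper's exact parameterization of the education production relationship may differ):

```latex
% Cobb--Douglas: log-linear in the inputs
\ln y_i \;=\; \alpha_0 + \sum_{k} \alpha_k \ln x_{ki} + \varepsilon_i
% Translog: adds symmetric second-order terms and nests Cobb--Douglas (\beta_{kl}=0)
\ln y_i \;=\; \alpha_0 + \sum_{k} \alpha_k \ln x_{ki}
      + \tfrac{1}{2}\sum_{k}\sum_{l} \beta_{kl} \ln x_{ki}\,\ln x_{li} + \varepsilon_i,
\qquad \beta_{kl} = \beta_{lk}
```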

2.
It is shown geometrically that a monotone concave preference order can be approximated by orders representable by a concave utility function. This is applied to proving that preferences with ‘desirable’ properties (such as inducing smooth excess demand functions, analyticity, strict convexity) are dense.

3.
A procedure is given for the construction of a monotone estimator that dominates a given estimator for a class of discrete distributions with monotone likelihood ratio. This procedure is applied to some empirical Bayes estimators. Monte Carlo results are given that demonstrate the usefulness of monotonizing.

4.
In the paradigm of von Neumann and Morgenstern (1947), a representation of affine preferences in terms of an expected utility can be obtained under the assumption of weak continuity. Since the weak topology is coarse, this requirement is a priori far from being negligible. In this work, we replace the assumption of weak continuity by monotonicity. More precisely, on the space of lotteries on an interval of the real line, it is shown that any affine preference order which is monotone with respect to the first stochastic order admits a representation in terms of an expected utility for some nondecreasing utility function. As a consequence, any affine preference order on the subset of lotteries with compact support, which is monotone with respect to the second stochastic order, can be represented in terms of an expected utility for some nondecreasing concave utility function. We also provide such representations for affine preference orders on the subset of those lotteries which fulfill some integrability conditions. The subtleties of the weak topology are illustrated by some examples.
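Written out as formulas, the two representation results described above take the following schematic form (a restatement, not the paper's exact theorem statements):

```latex
% Affine + monotone w.r.t. first-order stochastic dominance:
\mu \succeq \nu \;\Longleftrightarrow\; \int u \,\mathrm{d}\mu \;\ge\; \int u \,\mathrm{d}\nu
\quad \text{for some nondecreasing } u.
% Affine + monotone w.r.t. second-order stochastic dominance (compactly supported lotteries):
% the same representation holds with u nondecreasing and concave.
```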

5.

This paper proposes a semiparametric smooth-varying coefficient input distance frontier model with multiple outputs and multiple inputs, panel data, and determinants of technical inefficiency for the Indonesian banking industry during the period 2000 to 2015. The technology parameters are unknown functions of a set of environmental factors that shift the input distance frontier non-neutrally. The computationally simple constraint weighted bootstrapping method is employed to impose the regularity constraints on the distance function. As a by-product, total factor productivity (TFP) growth is estimated and decomposed into technical change, scale component, and efficiency change. The distance elasticities, marginal effects of the environmental factors on the distance elasticities, temporal behavior of technical efficiency, and also TFP growth and its components are investigated.


6.
In this paper we deal with the problem of classifying a p-dimensional random vector into one of two elliptically contoured populations with unknown and distinct mean vectors and a common, but unknown, scale matrix. The classification procedure is based on two-step monotone training samples, one from each population, with the same monotone pattern. Our aim is to extend the classification procedure recently proposed by Chung and Han (Ann Inst Stat Math 52:544–556, 2000). This procedure is a linear combination of two discriminant functions, one based on the complete samples and the other on the incomplete samples. The performance of the proposed classification rule is compared with that of the plug-in method, i.e., the classification rule obtained by substituting the estimators of the unknown parameters into the usual classification rule. In order to apply the plug-in method, the MLEs of the location parameters and of the common scale matrix of g ≥ 2 elliptically contoured populations are obtained analytically on the basis of two-step monotone training samples.
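For context, the "usual classification rule" into which the estimators are plugged is of the familiar linear-discriminant type sketched below; this is a generic two-population rule, not the paper's exact rule for two-step monotone samples:

```latex
% Assign the observation x to population 1 if and only if
(\bar{x}_1 - \bar{x}_2)^{\top}\, \widehat{\Sigma}^{-1}
\Big( x - \tfrac{1}{2}(\bar{x}_1 + \bar{x}_2) \Big) \;\ge\; 0,
% where \bar{x}_1, \bar{x}_2 estimate the two mean vectors and \widehat{\Sigma}
% estimates the common scale matrix.
```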

7.
In this paper, we propose a general approach to find the closest targets for a given unit according to a previously specified criterion of similarity. The idea behind this approach is that closer targets determine less demanding levels of operation for the inputs and outputs of the inefficient units to perform efficiently. Similarity can be interpreted as closeness between the inputs and outputs of the assessed unit and the proposed targets, and this closeness can be measured by using either different distance functions or different efficiency measures. Depending on how closeness is measured, we develop several mathematical programming problems that can be easily solved and are guaranteed to reach the closest projection point on the Pareto-efficient frontier. Thus, our approach leads to the closest targets by means of a single-stage procedure, which is easier to handle than those based on algorithms aimed at identifying all the facets of the efficient frontier.
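Schematically, the closest-target problem described above can be stated as follows (a generic formulation; the paper develops several concrete single-stage programs depending on the distance function or efficiency measure chosen):

```latex
\min_{(x,\,y)} \; d\big((x_0, y_0),\,(x, y)\big)
\qquad \text{s.t.} \qquad (x, y) \in \partial^{P}(T),
% where (x_0, y_0) are the inputs and outputs of the assessed unit, T is the
% production possibility set, \partial^{P}(T) its Pareto-efficient frontier,
% and d the chosen measure of closeness.
```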

8.
It is commonly accepted that some financial data may exhibit long-range dependence, while other financial data exhibit intermediate-range dependence or short-range dependence. These behaviours may be fitted to a continuous-time fractional stochastic model. The estimation procedure proposed in this paper is based on a continuous-time version of the Gauss–Whittle objective function to find the parameter estimates that minimize the discrepancy between the spectral density and the data periodogram. As a special case, the proposed estimation procedure is applied to a class of fractional stochastic volatility models to estimate the drift, standard deviation and memory parameters of the volatility process under consideration. As an application, the volatility of the Dow Jones, S&P 500, CAC 40, DAX 30, FTSE 100 and NIKKEI 225 is estimated.
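As an illustration of the Whittle-type idea (minimizing the discrepancy between a model spectral density and the data periodogram), here is a minimal discrete-time sketch; the spectral density used (a fractional-noise form with memory parameter d and scale c) and all function names are illustrative assumptions, not the paper's continuous-time specification.

```python
import numpy as np
from scipy.optimize import minimize

def periodogram(x):
    """Periodogram at the Fourier frequencies (excluding frequency zero)."""
    n = len(x)
    freqs = 2 * np.pi * np.arange(1, n // 2 + 1) / n
    dft = np.fft.fft(x - x.mean())[1:n // 2 + 1]
    return freqs, (np.abs(dft) ** 2) / (2 * np.pi * n)

def whittle_objective(params, freqs, I):
    """Whittle criterion sum(log f + I/f) for an assumed fractional spectral density
    f(w) = c * |2 sin(w/2)|^(-2d)  (long memory for 0 < d < 1/2)."""
    log_c, d = params
    f = np.exp(log_c) * np.abs(2 * np.sin(freqs / 2)) ** (-2 * d)
    return np.sum(np.log(f) + I / f)

def fit_whittle(x):
    freqs, I = periodogram(x)
    res = minimize(whittle_objective, x0=[0.0, 0.1], args=(freqs, I),
                   bounds=[(-10, 10), (-0.49, 0.49)], method="L-BFGS-B")
    log_c, d = res.x
    return {"scale": np.exp(log_c), "memory_d": d}

# Usage on simulated white noise (the estimated memory parameter d should be near 0):
rng = np.random.default_rng(0)
print(fit_whittle(rng.standard_normal(2048)))
```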

9.
In this work, we analyze the performance of production units using the directional distance function, which makes it possible to measure the distance to the frontier of the production set along any direction in the input–output space. We show that this distance can be expressed as a simple transformation of the radial or hyperbolic distance. This formulation allows us to define robust directional distances along the lines of α-quantile or order-m partial frontiers, as well as conditional directional distance functions, conditioned on environmental factors. We propose simple methods of estimation and derive the asymptotic properties of our estimators.
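For reference, the directional distance function in direction g = (g_x, g_y) is defined as follows (the standard definition; notation may differ from the paper's):

```latex
D(x, y;\, g_x, g_y) \;=\; \sup\big\{ \beta \ge 0 : (x - \beta g_x,\; y + \beta g_y) \in \Psi \big\},
% where \Psi is the production set. For instance, in the output direction g = (0, y) it is a
% simple transformation of the radial (Farrell-type) output distance:
% D(x, y; 0, y) = \lambda(x, y) - 1, with \lambda(x, y) = \sup\{\lambda : (x, \lambda y) \in \Psi\}.
```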

10.
In industry sectors where market prices for goods and services are unavailable, it is common to use estimated output and input distance functions to estimate rates of productivity change. It is also possible, but less common, to use estimated distance functions to estimate the normalised support (or efficient) prices of individual inputs and outputs. A problem that arises in the econometric estimation of these functions is that more than one variable in the estimating equation may be endogenous. In such cases, maximum likelihood estimation can lead to biased and inconsistent parameter estimates. To solve the problem, we use linear programming to construct a quantity index. The distance function is then written in the form of a conventional stochastic frontier model where the explanatory variables are unambiguously exogenous. We use this approach to estimate productivity indexes, measures of environmental change, levels of efficiency, and support prices for a sample of Australian public hospitals.
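The "conventional stochastic frontier model" referred to above has the generic textbook form below, with the linear-programming quantity index serving as the (exogenously constructed) dependent variable; the paper's exact specification is not reproduced here:

```latex
\ln Q_{it} \;=\; x_{it}^{\top}\beta + v_{it} - u_{it},
\qquad v_{it} \sim N(0, \sigma_v^{2}), \quad u_{it} \ge 0,
% where v_{it} is two-sided statistical noise and u_{it} a one-sided inefficiency term.
```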

11.
In this note we study some properties of pseudo P-convex functions, a class of generalized convex functions recently introduced by Hackman and Passy, defined on finite-dimensional product spaces. In particular, we introduce some generalized monotone maps and study their relationships with the gradient of differentiable pseudo P-convex functions and with a class of continuous P-connected functions that, in a differentiable setting, belongs to the class of pseudo P-convex functions.
Summary. In this note we study some properties of pseudo P-convex functions, a class of generalized convex functions recently introduced by Hackman and Passy, defined on the Cartesian product of Euclidean spaces. In particular, we study some monotonicity properties of the gradient of differentiable pseudo P-convex functions and point out some connections with continuous P-connected functions.


This research is partially supported by the Italian Ministry of University and Scientific Research.

12.
In this paper, a particular class of bicriteria maximization problems over a compact polyhedron is considered. The first component of the objective function is the ratio of powers of affine functions and the second one is linear. Several theoretical properties are provided, such as the pseudoconcavity of the first criterion of the objective function, and the connectedness and compactness of both the efficient frontier and the set of efficient points. The obtained results allow us to propose a new simplex-like solution method for generating the whole efficient frontier; to illustrate the use of the suggested algorithm, several examples are described and the results of a computational test are presented.

13.
The object of the present paper is to discuss some applications of an inequality involving conditional and unconditional expectations of a monotone function. The fields of application are accident statistics, personnel selection and Bessel functions.

14.
We present short proofs of some basic results from isotonic regression theory. A straightforward argument is given to show that the left-continuous slope of the concave majorant of the empirical distribution function maximizes the likelihood function f ↦ f(X_1)⋯f(X_n) within the class of non-increasing densities. Similarly, it is shown that the nonparametric maximum likelihood estimator (NPMLE) of the distribution function of interval censored data has an interpretation in terms of the left derivative of a convex minorant. Finally, a short proof is given to show that the number of vertices of the concave majorant of the uniform empirical distribution function is asymptotically normal with asymptotic mean and variance both equal to log n.
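A minimal numerical sketch of the first result (the Grenander estimator as the left-continuous slope of the least concave majorant of the empirical distribution function); the function name and the stack-based majorant construction are illustrative, not from the paper.

```python
import numpy as np

def grenander(x):
    """Grenander estimator of a non-increasing density on [0, inf):
    left-continuous slope of the least concave majorant of the ECDF.
    Returns the knots and the density value on each interval (knots[j-1], knots[j]]."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    pts_x = np.concatenate(([0.0], x))        # ECDF evaluation points
    pts_y = np.arange(n + 1) / n
    # Build the upper convex hull (= least concave majorant) with a stack.
    hull = [0]
    for i in range(1, n + 1):
        while len(hull) >= 2:
            i1, i2 = hull[-2], hull[-1]
            # Keep i2 only if it lies strictly above the chord from i1 to i.
            cross = (pts_x[i2] - pts_x[i1]) * (pts_y[i] - pts_y[i1]) - \
                    (pts_y[i2] - pts_y[i1]) * (pts_x[i] - pts_x[i1])
            if cross >= 0:            # i2 is on or below the chord -> not a vertex
                hull.pop()
            else:
                break
        hull.append(i)
    knots = pts_x[hull]
    cdf = pts_y[hull]
    density = np.diff(cdf) / np.diff(knots)   # slopes: a non-increasing step function
    return knots, density

# Usage with data from a standard exponential (a non-increasing density):
rng = np.random.default_rng(1)
knots, dens = grenander(rng.exponential(size=500))
print(dens[:5])   # estimated density values on the first few intervals
```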

15.
The ability of a production unit to transform inputs into outputs is influenced by its technical efficiency and external operating environment. This paper introduces a nonparametric, linear programming, frontier procedure for obtaining a measure of managerial efficiency that controls for exogenous features of the operating environment. The approach also provides statistical tests of the effects of external conditions on the efficient use of each individual input (for an input oriented model) or of each individual output (for an output oriented model). The procedure is illustrated for a sample of nursing homes.

16.
The ability to quantify tradeoffs involved in the process of reducing harmful emissions is essential to successful policy-making in the environmental planning area. The approach by Färe et al. (J Econom 126: 469–492, 2005) to computing point estimates of the marginal abatement costs (MACs) of reducing pollution by estimating the directional output distance function has been gaining popularity in recent years. The contribution of this study is to compute MACs as slopes of the iterated parametric production possibilities frontier (PPF), estimated on the basis of the set of efficient projections of observable output combinations obtained from the parameters of the directional output distance function. Policy makers are thus provided with the general shape of the production possibilities set for a polluting technology rather than with a set of point estimates of the MACs. We apply our methodology to a balanced panel of seven Korean manufacturing sectors spanning the period between 1999 and 2009, obtaining theoretically consistent concave PPFs based on a large set of directional output distance vectors. Finally, we estimate the parameters of a directional output distance function corresponding to the iterated PPF.
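The link between the estimated distance-function parameters and the MAC can be sketched via the implicit function theorem (a standard shadow-price argument; the paper's iterated-PPF construction adds further steps on top of this):

```latex
% On the frontier the directional output distance function satisfies D(x, y, b; g) = 0, so the
% slope of the PPF in (b, y)-space, i.e. the rate at which good output y trades off against
% the bad output b (the MAC), is
\frac{\mathrm{d} y}{\mathrm{d} b}\bigg|_{D = 0} \;=\; -\,\frac{\partial D / \partial b}{\partial D / \partial y}.
```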

17.
We consider Grenander-type estimators for a monotone function λ, obtained as the slope of a concave (convex) estimate of the primitive of λ. Our main result is a central limit theorem for the Hellinger loss, which applies to estimation of a probability density, a regression function or a failure rate. In the case of density estimation, the limiting variance of the Hellinger loss turns out to be independent of λ.

18.
There are two main methods for measuring the efficiency of decision-making units (DMUs): data envelopment analysis (DEA) and stochastic frontier analysis (SFA). Each of these methods has advantages and disadvantages. DEA is more popular in the literature due to its simplicity: it requires no distributional pre-assumptions and can be used to measure the efficiency of DMUs with multiple inputs and multiple outputs, whereas SFA is a parametric approach that handles multiple inputs but only a single output. Since many applied studies feature multiple output variables, SFA cannot be used directly in such cases. In this research, a new method to transform multiple outputs into a virtual single output is proposed. The efficiency scores obtained from the virtual single output calculated by the proposed method are close to (or, depending on the targeted parameters, even identical to, at the expense of computation time and resources) the efficiency scores obtained from DEA with multiple outputs. This enables the use of SFA with a virtual single output. The proposed method is validated in a simulation study, and its usefulness is demonstrated in a real application using a hospital dataset from Turkey.
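For reference, a minimal sketch of the standard input-oriented, constant-returns-to-scale DEA model (the CCR efficiency score solved as a linear program) that yields the multi-output efficiency scores discussed above; this is the textbook formulation, not the authors' output-aggregation procedure.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR (constant returns to scale) DEA efficiency scores.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). Returns a score in (0, 1] per DMU."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        c = np.zeros(1 + n)
        c[0] = 1.0                                   # minimize theta
        # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        scores[o] = res.fun
    return scores

# Usage with toy data: 5 DMUs, 2 inputs, 2 outputs
rng = np.random.default_rng(2)
X = rng.uniform(1, 10, size=(5, 2))
Y = rng.uniform(1, 10, size=(5, 2))
print(np.round(dea_ccr_input(X, Y), 3))
```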

19.
Multi-input multi-output production technologies can be represented using distance functions. Econometric estimation of these functions typically involves factoring out one of the outputs or inputs and estimating the resulting equation using maximum likelihood methods. A problem with this approach is that the outputs or inputs that are not factored out may be correlated with the composite error term. Fernandez et al. (J Econ 98:47–79, 2000) show how to solve this so-called ‘endogeneity problem’ using Bayesian methods. In this paper I use the approach to estimate an output distance function and an associated index of total factor productivity (TFP) change. The TFP index is a new index that satisfies most, if not all, economically-relevant axioms from index number theory. It can also be exhaustively decomposed into a measure of technical change and various measures of efficiency change. I illustrate the methodology using state-level data on U.S. agricultural input and output quantities (no prices are needed). Results are summarized in terms of the characteristics (e.g., means) of estimated probability density functions for measures of TFP change, technical change and efficiency change.
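One generic way to write such a TFP index is as a ratio of an aggregate output quantity to an aggregate input quantity (an illustrative, schematic form; the paper's specific aggregator functions and its decomposition into technical change and efficiency change are not reproduced here):

```latex
TFP_{it} \;=\; \frac{Q(y_{it})}{X(x_{it})},
\qquad
TFP_{it,hs} \;=\; \frac{TFP_{it}}{TFP_{hs}},
% where Q(\cdot) and X(\cdot) are non-negative, non-decreasing, linearly homogeneous output
% and input aggregator functions, and TFP_{it,hs} compares firm i in period t with firm h in
% period s.
```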

20.
Fundamental analysis of stocks links financial data to firm value in two consecutive steps: a predictive information link tying current financial data to future earnings, and a valuation link tying future earnings to firm value. At each step, a large number of causal factors have to be factored into the evaluation. To carry out these calculations, we propose a new two-stage multi-criteria procedure, drawing on the techniques of data envelopment analysis. At each stage, a piecewise linear efficiency frontier is fitted to the observed data. The procedure is illustrated by a numerical example, analyzing some 30 stocks in the Spanish manufacturing industry in the years 1991–1996. Copyright © 2004 John Wiley & Sons, Ltd.
