Similar documents
Found 20 similar documents (search time: 389 ms)
1.
This paper investigates the technical efficiency of labor market matching using a stochastic frontier approach. The true fixed-effects model (Greene J Prod Anal 23:7–32, 2005a; J Econom 126:269–303, 2005b) is utilised in order to separate cross-sectional heterogeneity from inefficiency, and inefficiency terms are modelled following Battese and Coelli (Empir Econ 20:325–332, 1995). The data set consists of almost 17,000 observations from Local Labor Offices (LLOs) in Finland. According to the results, there are notable differences in matching efficiency between regions, and these differences contribute significantly to the number of filled vacancies. If all regions were as efficient as the most efficient one, the total number of matches per month would increase by over 23%. The heterogeneity of the job-seeker stock is an important determinant of matching efficiency: the weight of the composition of the job-seeker stock in the inefficiency terms is on average 85%.
Sanna-Mari Hynninen

2.
A stochastic frontier model with correction for sample selection (cited by 3: 2 self-citations, 1 other)
Heckman’s (Ann Econ Soc Meas 4(5), 475–492, 1976; Econometrica 47, 153–161, 1979) sample selection model has been employed in three decades of applications of linear regression studies. This paper builds on this framework to obtain a sample selection correction for the stochastic frontier model. We first show a surprisingly simple way to estimate the familiar normal-half normal stochastic frontier model using maximum simulated likelihood. We then extend the technique to a stochastic frontier model with sample selection. In an application that seems superficially obvious, the method is used to revisit the World Health Organization data (WHO in The World Health Report, WHO, Geneva 2000; Tandon et al. in Measuring the overall health system performance for 191 countries, World Health Organization, 2000) where the sample partitioning is based on OECD membership. The original study pooled all 191 countries. The OECD members appear to be discretely different from the rest of the sample. We examine the difference in a sample selection framework.
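The "surprisingly simple" maximum simulated likelihood idea above can be checked numerically: for the normal-half normal model, the density of the composed error has a closed form, so a simulated version (averaging the normal noise density over half-normal draws of the inefficiency term) should agree with it. The sketch below assumes the standard convention ε = v − u and is illustrative only, not the authors' implementation; all function names are ours.

```python
import numpy as np
from scipy.stats import norm

def sf_density_closed_form(eps, sigma_u, sigma_v):
    """Closed-form density of eps = v - u (v normal, u half-normal)."""
    sigma = np.hypot(sigma_u, sigma_v)
    lam = sigma_u / sigma_v
    return (2.0 / sigma) * norm.pdf(eps / sigma) * norm.cdf(-eps * lam / sigma)

def sf_density_simulated(eps, sigma_u, sigma_v, n_draws=200_000, seed=0):
    """Simulated density: average the normal density of v = eps + u
    over half-normal draws of u, as in maximum simulated likelihood."""
    rng = np.random.default_rng(seed)
    u = sigma_u * np.abs(rng.standard_normal(n_draws))
    return norm.pdf(eps + u, scale=sigma_v).mean()

closed = sf_density_closed_form(-0.5, 1.0, 1.0)
simulated = sf_density_simulated(-0.5, 1.0, 1.0)
print(closed, simulated)   # the two values agree to Monte Carlo error
```

In a full MSL estimator the simulated density would be evaluated at each residual and the simulated log-likelihood maximized over the model parameters.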

3.
Since Solow (Q J Econ 70:65–94, 1956) the economic literature has widely accepted innovation and technological progress as the central drivers of long-term economic growth. From the microeconomic perspective, this has led to the idea that the growth effects on the macroeconomic level should be reflected in greater competitiveness of the firms. Although innovation effort does not always translate into greater competitiveness, it is recognized that innovation is, in an appropriate sense, unique and differs from other inputs like labor or capital. Nonetheless, this uniqueness is often left unspecified. We analyze two arguments rendering innovation special, the first related to partly non-discretionary innovation input levels and the second to the induced increase in the firm’s competitiveness on the global market. Methodologically, the analysis is based on restriction tests in non-parametric frontier models, where we use and extend tests proposed by Simar and Wilson (Commun Stat Simul Comput 30(1):159–184, 2001; J Prod Anal, forthcoming, 2010). The empirical data is taken from the German Community Innovation Survey 2007 (CIS 2007), where we focus on mechanical engineering firms. Our results are consistent with the explanation of the firms’ inability to freely choose the level of innovation inputs. However, we do not find significant evidence that increased innovation activities correspond to an increase in the ability to serve the global market.

4.
This paper compares the macroeconomic performance of three European socialist economies (Hungary, Poland, Yugoslavia) with that of developing and developed countries during the 1970s and the 1980s. Using panel data for 89 countries, we measure macroeconomic performance with two panel data production frontier models: the WITHIN model proposed by Cornwell et al. (J Econom 46:185–200, 1990) and the firm-effects model developed by Battese and Coelli (J Prod Anal 3:153–169, 1992). We find that the socialist countries underperformed relative to developed countries, and in most cases also relative to developing countries, which may be explained by the features of the socialist economic system.
Laurent Weill

5.
This paper examines the widespread practice in which data envelopment analysis (DEA) efficiency estimates are regressed on some environmental variables in a second-stage analysis. In the literature, only two statistical models have been proposed in which second-stage regressions are well-defined and meaningful. In the model considered by Simar and Wilson (J Prod Anal 13:49–78, 2007), truncated regression provides consistent estimation in the second stage, whereas in the model proposed by Banker and Natarajan (Oper Res 56:48–58, 2008a), ordinary least squares (OLS) provides consistent estimation. This paper examines, compares, and contrasts the very different assumptions underlying these two models, and makes clear that second-stage OLS estimation is consistent only under very peculiar and unusual assumptions on the data-generating process that limit its applicability. In addition, we show that in either case, bootstrap methods provide the only feasible means for inference in the second stage. We also comment on ad hoc specifications of second-stage regression equations that ignore the part of the data-generating process that yields the data used to obtain the initial DEA estimates.
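For readers unfamiliar with the first-stage estimates being debated, a minimal output-oriented, constant-returns DEA program can be solved with an off-the-shelf LP solver. This sketch covers only the first stage; the second-stage truncated regression and bootstrap are not shown. The function name and toy data are ours, for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def dea_output_efficiency(X, Y, j0):
    """Output-oriented CCR (constant returns) efficiency of DMU j0.

    X: (m inputs, n DMUs), Y: (s outputs, n DMUs).
    Returns theta >= 1 (max radial output expansion); the Farrell
    output efficiency score is 1/theta.
    Decision vector: z = [theta, lambda_1, ..., lambda_n].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = -1.0                                # linprog minimizes, so max theta
    A_in = np.hstack([np.zeros((m, 1)), X])    # X @ lam <= x0
    b_in = X[:, j0]
    A_out = np.hstack([Y[:, [j0]], -Y])        # theta*y0 - Y @ lam <= 0
    b_out = np.zeros(s)
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

X = np.array([[1.0, 2.0, 3.0]])        # one input, three DMUs
Y = np.array([[1.0, 2.0, 1.5]])        # one output
theta = dea_output_efficiency(X, Y, 2) # DMU 3: could double its output
print(theta)                           # 2.0
```

In a Simar–Wilson style second stage, the scores from this first stage would then be related to environmental variables by truncated regression with bootstrap inference.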

6.
We use a stochastic frontier model with firm-specific technical inefficiency effects in a panel framework (Battese and Coelli in Empir Econ 20:325–332, 1995) to assess two popular probability of bankruptcy (PB) measures, based on the Merton model (Merton in J Financ 29:449–470, 1974) and the discrete-time hazard model (DHM; Shumway in J Bus 74:101–124, 2001). Three important results are obtained from our empirical studies. First, a firm with a higher PB generally has lower technical efficiency. Second, for an ex-post bankrupt firm, its PB tends to increase and its technical efficiency of production tends to decrease as the time to its bankruptcy draws near. Finally, PB based on the DHM carries significantly more information about a firm’s technical inefficiency than PB based on the Merton model. Given this last result and the fact that economic-based efficiency measures are reasonable indicators of the long-term health and prospects of firms (Baek and Pagán in Q J Bus Econ 41:27–41, 2002), we conclude that PB based on the DHM is a better credit risk proxy for firms.
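The Merton-model PB measure referred to above has a simple closed form: default occurs when terminal asset value falls below the face value of debt, giving PD = Φ(−DD) where DD is the distance to default. A minimal sketch (parameter names are ours; in real applications the unobserved asset value and volatility must first be backed out from equity data):

```python
from math import log, sqrt
from scipy.stats import norm

def merton_pd(V, D, mu, sigma, T=1.0):
    """Merton-style probability of default P(V_T < D), with asset value V
    following geometric Brownian motion with drift mu and volatility sigma."""
    dd = (log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return norm.cdf(-dd)

pd_risky = merton_pd(V=100.0, D=80.0, mu=0.05, sigma=0.40)
pd_safe = merton_pd(V=100.0, D=80.0, mu=0.05, sigma=0.10)
print(pd_risky, pd_safe)   # higher asset volatility -> higher PB
```

The DHM alternative assessed in the paper instead fits a discrete-time hazard (logit-type) model on panels of firm covariates, which is what the abstract finds more informative about inefficiency.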

7.
This study estimates the cost inefficiency and economies of scale of Slovenian water distribution utilities over the 1997–2003 period by employing several different stochastic frontier methods. The results indicate that significant cost inefficiencies are present in the utilities. The introduction of an incentive-based price regulation scheme might help resolve this problem. However, the inefficiency scores obtained from different cost frontier models are not found to be robust. The levels of the inefficiency estimates as well as the rankings depend on the econometric specification of the model. The established lack of robustness can be at least partly explained by the differing ability of the models to separate unobserved heterogeneity from inefficiency. The newly proposed true fixed-effects model (Greene, J Econom 126:269–303, 2005; J Prod Anal 23(1):7–32, 2005) appears to perform better than the conventional panel data models with respect to distinguishing between unobserved heterogeneity and inefficiency. On the other hand, different models produce fairly robust results with respect to estimates of economies of output density, customer density and economies of scale. The optimal size of a company is found to correspond closely to the sample median. Economies of scale are found in small-sized utilities, while large companies exhibit diseconomies of scale.
Jelena Zorić (Corresponding author)

8.
Stochastic FDH/DEA estimators for frontier analysis (cited by 2: 2 self-citations)
In this paper we extend the work of Simar (J Prod Anal 28:183–201, 2007) introducing noise in nonparametric frontier models. We develop an approach that synthesizes the best features of the two main methods in the estimation of production efficiency. Specifically, our approach first allows for statistical noise, similar to stochastic frontier analysis (and even in a more flexible way), and second, it allows modelling multiple-input-multiple-output technologies without imposing parametric assumptions on the production relationship, similar to what is done in nonparametric methods like data envelopment analysis (DEA) and free disposal hull (FDH). The methodology is based on the theory of local maximum likelihood estimation and extends the recent work of Kumbhakar et al. (J Econom 137(1):1–27, 2007) and Park et al. (J Econom 146:185–198, 2008). Our method is suitable for modelling and estimating the marginal effects on the inefficiency level jointly with the marginal effects of inputs. The approach is robust to heteroskedastic cases and to various (unknown) distributions of statistical noise and inefficiency, despite assuming simple anchorage models. The method also improves the DEA/FDH estimators by making them quite robust to statistical noise and especially to outliers, which were the main problems of the original DEA/FDH estimators. The procedure shows great performance in various simulated cases and is also illustrated on some real data sets. Even in the single-output case, our simulated examples show that our stochastic DEA/FDH improves on the Kumbhakar et al. (J Econom 137(1):1–27, 2007) method by making the resulting frontier smoother, monotonic and, if we wish, concave.

9.
The explanation of productivity differentials is very important to identify the economic conditions that create inefficiency and to improve managerial performance. In the literature two main approaches have been developed: one-stage approaches and two-stage approaches. Daraio and Simar (2005, J Prod Anal 24(1):93–121) propose a fully nonparametric methodology based on conditional FDH and conditional order-m frontiers without any convexity assumption on the technology. However, convexity has always been assumed in mainstream production theory and general equilibrium. In addition, in many empirical applications, the convexity assumption can be reasonable and sometimes natural. Led by these considerations, in this paper we propose a unifying approach to introduce external-environmental variables in nonparametric frontier models for convex and nonconvex technologies. Extending earlier contributions by Daraio and Simar (2005, J Prod Anal 24(1):93–121) as well as Cazals et al. (2002, J Econometrics 106:1–25), we introduce a conditional DEA estimator, i.e., an estimator of a production frontier of DEA type conditioned on some external-environmental variables which are neither inputs nor outputs under the control of the producer. A robust version of this conditional estimator is proposed too. These various measures of efficiency also provide indicators of convexity, which we illustrate using simulated and real data. Cinzia Daraio gratefully acknowledges research support from the Italian Ministry of Education Research on Innovation Systems Project (iRis) “The reorganization of the public system of research for the technological transfer: governance, tools and interventions” and from the Italian Ministry of Educational Research Project (MIUR 40% 2004) “System spillovers on the competitiveness of Italian economy: quantitative analysis for sectoral policies”. Léopold Simar gratefully acknowledges research support from the “Interuniversity Attraction Pole”, Phase V (No. P5/24) of the Belgian Government (Belgian Science Policy).

10.
This paper proposes a new two-step stochastic frontier approach to estimate technical efficiency (TE) scores for firms in different groups adopting distinct technologies. Analogous to Battese et al. (J Prod Anal 21:91–103, 2004), the metafrontier production function allows for calculating comparable TE measures, which can be decomposed into group-specific TE measures and technology gap ratios. The proposed approach differs from Battese et al. (J Prod Anal 21:91–103, 2004) and O’Donnell et al. (Empir Econ 34:231–255, 2008) mainly in the second step, where a stochastic frontier analysis model is formulated and applied to obtain the estimates of the metafrontier, instead of relying on programming techniques. The resulting estimators have desirable statistical properties and allow statistical inference. While the within-group variation in firms’ technical efficiencies is frequently assumed to be associated with firm-specific exogenous variables, the between-group variation in technology gaps can be specified as a function of some exogenous variables to take account of group-specific environmental differences. Two empirical applications are illustrated, and the results appear to support the use of our model.

11.
Based on the seminal paper of Farrell (J R Stat Soc Ser A (General) 120(3):253–290, 1957), researchers have developed several methods for measuring efficiency. Nowadays, the most prominent representatives are nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA), both introduced in the late 1970s. Researchers have been attempting to develop a method which combines the virtues, both nonparametric and stochastic, of these “oldies”. The recently introduced stochastic non-smooth envelopment of data (StoNED) by Kuosmanen and Kortelainen (J Prod Anal 38(1):11–28, 2012) is such a promising method. This paper compares the StoNED method with the two “oldies” DEA and SFA and extends the initial Monte Carlo simulation of Kuosmanen and Kortelainen (J Prod Anal 38(1):11–28, 2012) in several directions. We show, among other things, that in scenarios without noise the rivalry is still between the “oldies”, while in noisy scenarios the nonparametric StoNED PL now constitutes a promising alternative to the SFA ML.

12.
This paper considers the estimation of the Kumbhakar et al. (J Prod Anal. doi:10.1007/s11123-012-0303-1, 2012) (KLH) four-random-components stochastic frontier (SF) model using MLE techniques. We derive the log-likelihood function of the model using results from the closed-skew normal distribution. Our Monte Carlo analysis shows that MLE is more efficient and less biased than the multi-step KLH estimator. Moreover, we obtain closed-form expressions for the posterior expected values of the random effects, used to estimate short-run and long-run (in)efficiency as well as random firm effects. The model is general enough to nest most of the currently used panel SF models; hence, its appropriateness can be tested. This is exemplified by analyzing empirical results from three different applications.

13.
In frontier analysis, most nonparametric approaches (DEA, FDH) are based on envelopment ideas which assume that, with probability one, all observed units belong to the attainable set. In these “deterministic” frontier models, statistical inference is now possible by using bootstrap procedures. In the presence of noise, however, envelopment estimators can deteriorate dramatically, since they are very sensitive to extreme observations that might result only from noise: DEA/FDH techniques would provide estimators with an error of the order of the standard deviation of the noise. This paper adapts some recent results on detecting change points (Hall P, Simar L (2002) J Am Stat Assoc 97:523–534) to improve the performance of the classical DEA/FDH estimators in the presence of noise. We show by simulated examples that the procedure works well, and better than the standard DEA/FDH estimators, when the noise is of moderate size in terms of the signal-to-noise ratio. It turns out that the procedure is also robust to outliers. The paper can be seen as a first attempt to formalize stochastic DEA/FDH estimators.
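The noise sensitivity that motivates the paper is easy to see in a single-output FDH estimator: the frontier value at x0 is simply the largest observed output among units using no more of any input than x0, so one noisy high-output observation shifts the whole estimate. A minimal sketch (function and variable names are ours):

```python
import numpy as np

def fdh_output_efficiency(X, y, x0, y0):
    """Single-output FDH output efficiency: theta = (max observed output
    among units using no more of any input than x0) / y0.
    theta > 1 means the point (x0, y0) is dominated."""
    X = np.atleast_2d(X)                    # (n units, m inputs)
    feasible = np.all(X <= x0, axis=1)      # units usable at input level x0
    if not feasible.any():
        return 1.0                          # simplification: no comparator, treat as efficient
    return np.max(y[feasible]) / y0

X = np.array([[1.0], [2.0], [3.0]])
y = np.array([1.0, 2.0, 1.8])
theta = fdh_output_efficiency(X, y, x0=np.array([3.0]), y0=1.8)
print(theta)   # 2/1.8: unit 3 is dominated by unit 2
```

Appending a single outlier such as (x, y) = (2.5, 5.0) to the sample would jump the score for the same evaluation point from about 1.11 to about 2.78, which is exactly the fragility the change-point adaptation in the paper is designed to tame.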

14.
Journal of Econometrics 126(2):241–267, 2005
This paper is an extension of Ahn et al. (J. Econom. 101 (2001) 219) to allow a parametric function for time-varying coefficients of the individual effects. It provides a fixed-effect treatment of models like those proposed by Kumbhakar (J. Econom. 46 (1990) 201) and Battese and Coelli (J. Prod. Anal. 3 (1992) 153). We present a number of GMM estimators based on different sets of assumptions. Least squares has unusual properties: its consistency requires white noise errors, and given white noise errors it is less efficient than a GMM estimator. We apply this model to the measurement of the cost efficiency of Spanish savings banks.

15.
Foreign presence and efficiency in transition economies (cited by 1: 1 self-citation)
This paper presents empirical evidence on the role of foreign presence in the performance of domestic manufacturing firms in five Central and Eastern European countries. Data envelopment analysis (DEA) was used to estimate a frontier for each sector, assuming a common technology across the five transition countries in the sample: Bulgaria, Estonia, Hungary, Poland and Romania. Following Simar and Wilson (J Econom 136(1):31–64, 2007), this study applies a truncated regression and bootstrap technique in a second-stage post-DEA analysis. Some evidence is found to support the hypothesis that foreign presence has overall positive spillover effects on the performance of domestic firms.

16.
The aim of this study is to confirm the factorial structure of the Identification-Commitment Inventory (ICI) developed within the framework of the Human System Audit (HSA) (Quijano et al. in Revist Psicol Soc Apl 10(2):27–61, 2000; Pap Psicól Revist Col Of Psicó 29:92–106, 2008). Commitment and identification are understood by the HSA at an individual level as part of the quality of human processes and resources in an organization, and therefore as antecedents of important organizational outcomes such as personnel turnover intentions and organizational citizenship behavior (Meyer et al. in J Org Behav 27:665–683, 2006). The theoretical integrative model which underlies the ICI (Quijano et al. 2000) was tested in a sample (N = 625) of workers in a Spanish public hospital. Confirmatory factor analysis through structural equation modeling was performed. An elliptical least squares solution was chosen as the estimation procedure on account of the non-normal distribution of the variables. The results confirm the goodness of fit of an integrative model, which underlies the relation between commitment and identification, although each one is operatively different.

17.
Using the stochastic metafrontier framework of Battese et al. (J Prod Anal 21:91–103, 2004), this study compares and measures the cost efficiency and cost frontier gap between the banking industries in Taiwan and China. It further identifies environmental variables that determine the cost frontier gap between the two countries. Data on 69 sample banks for the years 2005–2009 are used in the empirical analysis and inference. The empirical results show that Taiwanese banks in general have a superior cost frontier of production, but inferior cost efficiency of operation, compared with Chinese banks. Private banks in Taiwan have the best cost frontier, whereas foreign banks in China are the most cost efficient among all banks in Taiwan and China. The empirical results also reveal that the majority of Taiwanese banks are undersized and that most banks in China are oversized. Finally, the regression results show that the financial market structure, institutional environment, and political development are significant determinants of the gap between the cost frontiers of banks in Taiwan and China. Therefore, programs directed toward a better bank production environment can be initiated to improve the cost technologies of banks in both Taiwan and China.

18.
Liang and Ng (Metrika 68:83–98, 2008) proposed a componentwise conditional distribution method for Lp-uniform sampling on Lp-norm n-spheres. On the basis of properties of a special family of Lp-norm spherical distributions, we suggest a wide class of algorithms for sampling uniformly distributed points on n-spheres and n-balls in Lp spaces, generalizing the approach of Harman and Lacko (J Multivar Anal 101:2297–2304, 2010) and including the method of Liang and Ng as a special case. We also present results of a numerical study showing that the choice of the best algorithm from the class depends significantly on the value of p.
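The normalization idea behind such algorithms can be sketched as follows: draws with density proportional to exp(−|t|^p) are obtained from Gamma(1/p, 1) variates with random signs, and dividing by the Lp norm yields points uniformly distributed (with respect to the cone measure) on the unit Lp sphere; scaling by U^(1/n) then fills the ball. This is a generic sketch in the spirit of Harman and Lacko, not the paper's specific class of algorithms; function names are ours.

```python
import numpy as np

def sample_lp_sphere(n, p, size=1, rng=None):
    """Points uniform (w.r.t. the cone measure) on the unit Lp sphere in R^n.

    If |g_i|^p ~ Gamma(1/p, 1) with independent random signs,
    then g / ||g||_p is uniformly distributed on the sphere."""
    rng = np.random.default_rng(rng)
    g = rng.gamma(1.0 / p, 1.0, size=(size, n)) ** (1.0 / p)
    g *= rng.choice([-1.0, 1.0], size=(size, n))
    return g / np.linalg.norm(g, ord=p, axis=1, keepdims=True)

def sample_lp_ball(n, p, size=1, rng=None):
    """Points uniform in the unit Lp ball: scale sphere points by U^(1/n)."""
    rng = np.random.default_rng(rng)
    x = sample_lp_sphere(n, p, size, rng)
    r = rng.uniform(size=(size, 1)) ** (1.0 / n)
    return r * x

pts = sample_lp_sphere(n=3, p=1.5, size=1000, rng=0)
print(np.linalg.norm(pts, ord=1.5, axis=1)[:3])   # all equal to 1.0
```

For p = 2 the Gamma construction reduces to the familiar Gaussian-normalization method for the Euclidean sphere.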

19.
The productive efficiency of a firm can be seen as composed of two parts, one persistent and one transient. The received empirical literature on the measurement of productive efficiency has paid relatively little attention to the difference between these two components. Ahn and Sickles (Econ Rev 19(4):461–492, 2000) suggested some approaches that pointed in this direction. The possibility was also raised in Greene (Health Econ 13(10):959–980, 2004. doi:10.1002/hec.938), who expressed some pessimism over the possibility of distinguishing the two empirically. Recently, Colombi (A skew normal stochastic frontier model for panel data, 2010) and Kumbhakar and Tsionas (J Appl Econ 29(1):110–132, 2012), in a milestone extension of the stochastic frontier methodology, have proposed a tractable model based on panel data that promises to provide separate estimates of the two components of efficiency. The approach developed in the original presentation proved very cumbersome to implement in practice; Colombi (2010) notes that FIML estimation of the model is ‘complex and time consuming.’ In a sequence of papers, Colombi (2010), Colombi et al. (A stochastic frontier model with short-run and long-run inefficiency random effects, 2011; J Prod Anal, 2014), Kumbhakar et al. (J Prod Anal 41(2):321–337, 2012) and Kumbhakar and Tsionas (2012) have suggested other strategies, including a four-step least squares method. The main point of this paper is that full maximum likelihood estimation of the model is neither complex nor time consuming. The extreme complexity of the log likelihood noted in Colombi (2010) and Colombi et al. (2011, 2014) is reduced by using simulation and exploiting the Butler and Moffitt (Econometrica 50:761–764, 1982) formulation. In this paper, we develop a practical full information maximum simulated likelihood estimator for the model. The approach is very effective and strikingly simple to apply, and uses all of the sample distributional information to obtain the estimates. We also implement the panel data counterpart of the Jondrow et al. (J Econom 19(2–3):233–238, 1982) estimator for technical or cost inefficiency. The technique is applied in a study of the cost efficiency of Swiss railways.

20.
Faraz and Parsian (Stat Pap 47:569–593, 2006) have shown that the double warning lines (DWL) scheme detects process shifts more quickly than other variable ratio sampling schemes such as variable sample sizes (VSS), variable sampling intervals (VSI), and variable sample sizes and sampling intervals (VSSVSI). In this paper, the DWL T2 control chart for monitoring the process mean vector is economically designed. The cost model proposed by Costa and Rahim (J Appl Stat 28:875–885, 2001) is used here and is minimized through a genetic algorithm (GA) approach. The effects of the model parameters on the chart parameters and the resulting operating loss are then studied, and finally a comparison among all possible variable ratio sampling (VRS) schemes is made to choose the best option economically.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号