2.
Until recently, economists have had no good tools to measure the returns to scale of individual corporations in an industry. Data envelopment analysis (DEA) is a linear programming technique for determining the efficiency frontier (the envelope) to the inputs and outputs of a collection of individual corporations or other productive units. While DEA offers an avenue for calculating the returns to scale of individual corporations, the approach has been riddled with mathematical complications arising from the possibility of alternate optima. The present paper develops theory for calculating the entire range of these alternate optima. Furthermore, in a quite ambitious empirical application, DEA is employed to determine the time path of returns to scale of all publicly held U.S. computer companies over the time period 1980–1991. For the great majority of companies, a unique time path is obtained; only in less than 4 percent of the linear programming calculations is an entire range of alternate optima obtained. The results indicate that the computer industry was polarized into two camps: large aging corporations with decreasing returns to scale, and swarms of small upstart companies with advanced technology exhibiting increasing returns to scale.
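The envelopment program underlying this kind of DEA study can be sketched as a small linear program per unit. The following is a minimal illustration of the input-oriented BCC (variable returns to scale) model using `scipy.optimize.linprog`; the paper's actual model and its analysis of alternate optima are more involved, and the tiny one-input/one-output dataset below is hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def dea_bcc_input(X, Y):
    """Input-oriented BCC (variable returns to scale) DEA scores.

    X: (n, m) array of inputs, Y: (n, s) array of outputs, one row per unit.
    Returns an array of efficiency scores theta, with theta = 1 on the frontier.
    """
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij <= theta * x_io   (for each input i)
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Outputs: sum_j lambda_j * y_rj >= y_ro           (for each output r)
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        # Convexity constraint sum_j lambda_j = 1 gives variable returns to scale.
        A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores[o] = res.x[0]
    return scores
```

With three hypothetical units, `X = [[2],[4],[3]]`, `Y = [[2],[3],[1]]`, the first two units span the frontier (score 1) and the third is inefficient (score 2/3).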
4.
Data envelopment analysis (DEA) is extended to the case of stochastic inputs and outputs through the use of chance-constrained programming. The chance-constrained frontier envelops a given set of observations ‘most of the time’. As an empirical illustration, we re-examine the pioneering 1981 study of Program Follow Through by Charnes, Cooper and Rhodes.
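The standard device behind chance-constrained programming is to replace a probabilistic constraint with its deterministic equivalent. As a minimal sketch (not the paper's specific DEA formulation), a constraint P(a'x ≤ b) ≥ α with normally distributed b ~ N(μ, σ²) reduces to a'x ≤ μ + σ·Φ⁻¹(1 − α):

```python
from scipy.stats import norm

def deterministic_rhs(mu, sigma, alpha=0.95):
    """Deterministic-equivalent right-hand side of the chance constraint
    P(a'x <= b) >= alpha, where b ~ N(mu, sigma^2).

    Tightening alpha toward 1 shrinks the feasible region, so the fitted
    frontier envelops the observations 'most of the time'.
    """
    return mu + sigma * norm.ppf(1.0 - alpha)
```

For example, with μ = 10, σ = 2 and α = 0.95, the deterministic bound falls below 10, reflecting the safety margin the chance constraint imposes.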
5.
The performance of information criteria and tests for residual heteroscedasticity for choosing between different models for time‐varying volatility in the context of structural vector autoregressive analysis is investigated. Although it can be difficult to find the true volatility model with the selection criteria, using them is recommended because they can reduce the mean squared error of impulse response estimates substantially relative to a model that is chosen arbitrarily based on the personal preferences of a researcher. Heteroscedasticity tests are found to be useful tools for deciding whether time‐varying volatility is present but do not discriminate well between different types of volatility changes. The selection methods are illustrated by specifying a model for the global market for crude oil.
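The model-selection logic referred to here follows the familiar information-criterion pattern: compute a penalized likelihood for each candidate volatility specification and keep the model with the smallest value. A generic sketch (the paper's criteria for SVAR volatility models are more specialized; the candidate names and numbers below are hypothetical):

```python
import numpy as np

def aic(loglik, k, n):
    """Akaike information criterion: 2k - 2 log L (n is unused, kept
    for a uniform signature across criteria)."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Schwarz/Bayesian information criterion: k log(n) - 2 log L."""
    return k * np.log(n) - 2 * loglik

def select_model(candidates, criterion=bic):
    """Each candidate is (name, loglik, k, n); return the name with the
    lowest criterion value."""
    return min(candidates, key=lambda c: criterion(*c[1:]))[0]
```

A richer parameterization (e.g. Markov-switching volatility) is chosen only when its likelihood gain outweighs the penalty on the extra parameters.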
8.
Fundamental analysis of stocks links financial data to firm value in two consecutive steps: a predictive information link tying current financial data to future earnings, and a valuation link tying future earnings to firm value. At each step, a large number of causal factors have to be factored into the evaluation. To effect these calculations, we propose a new two‐stage multi‐criteria procedure, drawing on the techniques of data envelopment analysis. At each stage, a piecewise linear efficiency frontier is fitted to the observed data. The procedure is illustrated by a numerical example, analyzing some 30 stocks in the Spanish manufacturing industry in the years 1991–1996. Copyright © 2004 John Wiley & Sons, Ltd.
9.
The International Labor Office, an arm of the UN based in Geneva, has as its goal the promotion of opportunities for women and men to obtain decent and productive work, in conditions of freedom, equity, security and human dignity. Since 1999, the ILO has conducted a series of studies of decent work. In 2001, the organization posed the global challenge of reducing the decent work deficit as measured by an employment gap, a rights gap, a social protection gap, and a social dialogue gap. Using standard economic terms, “decent work” may be seen as an efficiency point along a generalized input-output function, dependent upon variables of both economic performance and economic and social policy. The decent work deficit of a given country (if any) is then obtained as the difference between an observed point and its projection on the efficiency frontier. Using data envelopment analysis (DEA), we fit a piecewise linear frontier to observations for 61 countries from all continents. Importantly, 27 of these countries lie on the decent work frontier; the remaining ones reveal conditions of decent work deficit. The possibilities of reducing such deficits by appropriate control of policy variables are discussed.
10.
DEA (data envelopment analysis) is a technique for determining the efficiency frontier (the envelope) to the inputs and outputs of a collection of individual corporations or other productive units. DEA is here employed to estimate the intertemporal productive efficiency of U.S. computer manufacturers, using financial data drawn from earnings statements and balance sheets. The results indicate that a few corporations, including Apple Computer Inc., Compaq Computer Corp., and Seagate Technology, were able to stay at the productive efficiency frontier throughout the time period investigated. But not all successful corporations did; sometimes subefficiency (=disequilibrium) actually goes together with very rapid growth. A new Malmquist-type productivity index is calculated for each corporation, measuring shifts of the estimated intertemporal efficiency frontier.
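Once period-by-period distance-function (efficiency) scores are in hand, a standard Malmquist index combines them as a geometric mean of two frontier-relative ratios. A minimal sketch of that final step, assuming the four distance values have already been estimated by DEA (the paper's index is a new variant, so this shows only the textbook form):

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Textbook Malmquist productivity index between periods t and t+1.

    d_a_b is the distance score of the period-b observation measured
    against the period-a frontier.  Values above 1 indicate productivity
    growth; values below 1 indicate decline.
    """
    return math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
```

For instance, a unit scoring 0.8 against both frontiers in period t and 1.0 in period t+1 receives an index of 1.25, i.e. a 25 percent productivity gain.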