Similar Articles
20 similar articles found (search time: 734 ms)
1.
2.
An F test [Nelson (1976)] of Parzen's prediction variance horizon [Parzen (1982)] for an ARMA model yields the number of steps ahead for which forecasts contain information (short memory). A special 10-year pattern in Finnish GDP is introduced as a 'seasonal' in an ARMA model. Forecasts three years ahead are statistically informative, but exploiting the complete 10-year pattern raises doubts about both model memory and model validity.
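As a rough sketch of the idea in the abstract above (not the authors' exact procedure): forecasts are informative at horizon h when the h-step forecast error variance is significantly smaller than the series' unconditional variance, which an F ratio of the two variances can test. The function name and the AR(1) toy data are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def horizon_f_test(errors_h, series, alpha=0.05):
    """F test: are h-step forecast errors less variable than the series?

    Forecasts are 'informative' at horizon h when the forecast error
    variance is significantly below the unconditional variance.
    """
    var_series = np.var(series, ddof=1)
    var_err = np.var(errors_h, ddof=1)
    f_stat = var_series / var_err                # > 1 when forecasts help
    df1, df2 = len(series) - 1, len(errors_h) - 1
    p_value = stats.f.sf(f_stat, df1, df2)       # one-sided upper tail
    return f_stat, p_value, p_value < alpha

# Toy example: one-step AR(1) forecasts on simulated data with known phi
rng = np.random.default_rng(0)
phi = 0.8
y = np.zeros(300)
for t in range(1, 300):
    y[t] = phi * y[t - 1] + rng.normal()
errors = y[1:] - phi * y[:-1]                    # one-step forecast errors
f_stat, p, informative = horizon_f_test(errors, y)
```

At longer horizons the forecast error variance approaches the unconditional variance, the F ratio approaches 1, and informativeness is lost, which is the "prediction variance horizon" notion the test operationalizes.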

3.
Decomposing productivity patterns in a conditional convergence framework
In this study we examine regional data on per worker GDP, disaggregated at the sectoral level, focusing on the role of differences in the sectoral composition of activities, and in productivity gaps that are uniform across sectors, in explaining the catching-up process, which is realized through physical and human capital as well as technological knowledge accumulation. Our objective is to investigate how much of the interregional inequality in aggregate productivity per worker is imputable to each component. A methodology for identifying and analyzing sources of inequality from a decomposed perspective is developed in the growth framework by combining a shift-share based technique and a SUR model specification for the conditional-convergence analysis. The proposed approach is employed to analyze aggregate interregional inequality of per worker productivity levels in Italy over the period 1970–2000. With respect to the existing empirical results, our approach provides a more comprehensive and detailed examination of the contribution of each identified component in explaining the regional productivity gaps in Italy. It is argued that region-specific productivity differentials, uniform across sectors, explain quite a large share of differences in productivity per worker. However, sectoral composition plays a non-negligible role, although decreasing since the end of the 1980s, and very different productivity patterns emerge within geographical areas.
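The shift-share idea in the abstract above can be sketched in a few lines: a region's aggregate productivity gap relative to the nation splits exactly into a sectoral-mix effect and a within-sector productivity effect. The function name and the two-sector numbers are hypothetical, and the paper's actual decomposition is richer (SUR, convergence regressions).

```python
import numpy as np

def shift_share_gap(shares_r, prod_r, shares_n, prod_n):
    """Split a region's aggregate productivity gap (vs the nation) into
    a sectoral-mix effect and a within-sector productivity effect.

    shares_*: sectoral employment shares (sum to 1); prod_*: sectoral
    productivity per worker. Illustrative decomposition, not the paper's.
    """
    mix_effect = (shares_r - shares_n) @ prod_n       # different industry mix
    within_effect = shares_r @ (prod_r - prod_n)      # sector-level gaps
    gap = shares_r @ prod_r - shares_n @ prod_n
    assert np.isclose(gap, mix_effect + within_effect)  # exact identity
    return gap, mix_effect, within_effect

# Two-sector toy: the region overweights the low-productivity sector
shares_r, prod_r = np.array([0.6, 0.4]), np.array([40.0, 90.0])
shares_n, prod_n = np.array([0.4, 0.6]), np.array([50.0, 100.0])
gap, mix, within = shift_share_gap(shares_r, prod_r, shares_n, prod_n)
```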
Silvia Bertarelli

4.
5.
6.
This paper develops a parametric decomposition framework of labor productivity growth relaxing the assumption of labor-specific efficiency. The decomposition analysis is applied to a sample of 121 developed and developing countries during the 1970–2007 period drawn from the recently updated Penn World Tables and Barro and Lee (A new data set of educational attainment in the world 1950–2010. NBER Working Paper No. 15902, 2010) educational databases. A generalized Cobb–Douglas functional specification is used taking into account differences in technological structures across groups of countries to approximate aggregate production technology using Jorgenson and Nishimizu (Econ J 88:707–726, 1978) bilateral model of production. The measurement of labor efficiency is based on Kopp's (Quart J Econ 96:477–503, 1981) orthogonal non-radial index of factor-specific efficiency modified in a parametric frontier framework. The empirical results indicate that the weighted average annual rate of labor productivity growth was 1.239 % over the period analyzed. Technical change was found to be the driving force of labor productivity, while improvements in human capital and factor intensities account for 19.5 % and 12.4 % of that productivity growth, respectively. Finally, labor efficiency improvements contributed 9.8 % to measured labor productivity growth.
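The accounting logic behind the figures in the abstract above can be sketched as a simple Cobb–Douglas growth decomposition: labor productivity growth splits into technical change, capital deepening, human capital improvement, and labor-efficiency change, and each component's share of the total can then be reported. The elasticities and growth rates below are illustrative stand-ins, not the paper's estimates.

```python
def labor_productivity_decomposition(g_tfp, g_kl, g_h, g_eff,
                                     alpha=0.5, beta=0.5):
    """Growth-accounting sketch: under Cobb-Douglas technology, labor
    productivity growth splits into technical change, capital deepening
    (alpha * growth of K/L), human capital (beta * growth of H), and
    labor-efficiency change. Elasticities are illustrative only.
    """
    components = {
        "technical change": g_tfp,
        "capital deepening": alpha * g_kl,
        "human capital": beta * g_h,
        "efficiency change": g_eff,
    }
    total = sum(components.values())
    shares = {k: v / total for k, v in components.items()}
    return total, components, shares

# Hypothetical annual growth rates (decimal fractions)
total, parts, shares = labor_productivity_decomposition(
    g_tfp=0.01, g_kl=0.004, g_h=0.004, g_eff=0.002)
```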

7.
In this paper, the robust game model proposed by Aghassi and Bertsimas (Math Program Ser B 107:231–273, 2006) for matrix games is extended to games with a broader class of payoff functions. This is a distribution-free model of incomplete information for finite games where players adopt a robust-optimization approach to contend with payoff uncertainty. They are called robust players and seek the maximum guaranteed payoff given the strategies of the others. Consistently with this decision criterion, a set of strategies is an equilibrium, a robust-optimization equilibrium, if each player's strategy is a best response to the other players' strategies under the worst-case scenarios. The aim of the paper is twofold. In the first part, we provide an existence result for robust-optimization equilibria in a quite general class of games, and we prove that there exists a suitable value \(\epsilon \) such that robust-optimization equilibria are a subset of \(\epsilon \)-Nash equilibria of the nominal version of the robust game, i.e., the version without uncertainty. This provides a theoretical motivation for the robust approach, offering new insight and a rational-agent motivation for \(\epsilon \)-Nash equilibrium. In the second part, we propose an application of the theory to a classical Cournot duopoly model, which shows significant differences between the robust game and its nominal version.
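A minimal sketch of the Cournot application described in the abstract above, under assumed linear inverse demand p = a − b(q1 + q2) with the intercept a known only to lie in an interval [a_lo, a_hi]: each firm's profit is increasing in a, so a robust player's worst case is a = a_lo and the robust best response is the classical one evaluated there. The parameter values and function name are hypothetical, not the paper's specification.

```python
def robust_cournot(a_lo, a_hi, b, c, tol=1e-10):
    """Robust-optimization equilibrium of a Cournot duopoly with an
    uncertain demand intercept a in [a_lo, a_hi] (illustrative setup).

    Profit pi_i = (a - b*(q_i + q_j) - c) * q_i is increasing in a, so
    each robust player's worst case is a = a_lo; the robust best
    response is the classical one evaluated at a_lo.
    """
    def best_response(q_other, a):
        return max((a - c - b * q_other) / (2 * b), 0.0)

    q1 = q2 = 0.0
    while True:                       # best-response iteration to a fixed point
        q1_new = best_response(q2, a_lo)
        q2_new = best_response(q1_new, a_lo)
        if abs(q1_new - q1) < tol and abs(q2_new - q2) < tol:
            return q1_new, q2_new
        q1, q2 = q1_new, q2_new

# a in [90, 110], b = 1, c = 30: the robust equilibrium coincides with the
# nominal Cournot equilibrium of the worst-case demand a = 90
q1, q2 = robust_cournot(90.0, 110.0, 1.0, 30.0)
```

Comparing this outcome with the nominal equilibrium at, say, a = 100 makes the gap between robust and nominal play concrete: robust firms produce as if demand were at its pessimistic bound.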

8.
This study evaluates the Federal Reserve forecasts of manufacturing capacity utilization employing, as benchmarks, the forecasts from a univariate model which utilizes past information in capacity utilization, and from a bivariate model which utilizes past information in both capacity utilization and the federal funds rate. In addition to accurately predicting the directional change in capacity utilization, the Federal Reserve forecasts are "weakly" rational and generally superior to the bivariate forecasts. In light of another finding that monetary policy is non-neutral, we argue that the Federal Reserve forecasts of capacity utilization have positively contributed to the Fed's success in maintaining a low inflationary environment.
Hamid Baghestani

9.
Debreu’s coefficient of resource utilization is freed from individual data requirements. The procedure is shown to be equivalent to the imposition of Leontief preferences. The rate of growth of the modified Debreu coefficient and the Solow residual are shown to add up to TFP growth. This decomposition is the neoclassical counterpart to the frontier analytic decomposition of productivity growth into technical change and efficiency change. The terms can now be broken down by sector as well as by factor input.
Thijs ten Raa

10.
We analyze the determinants of ICT investment and the impact of information technology on productivity and efficiency in a representative sample of small and medium sized Italian firms. In order to test the most relevant theoretical predictions from the ICT literature, we evaluate the impact of these firms' investment in software, hardware and telecommunications on a series of intermediate variables and on productivity. Among intermediate variables we consider the demand for skilled workers, the introduction of new products and processes, and the rate of capacity utilization. Among productivity measures we include total factor productivity, the productivity of labor, and the distance from best practice estimated using a stochastic frontier approach. Our results show that the effect of ICT investment on firm efficiency can be detected more clearly in firm-level data by decomposing it into software and telecommunications investment. We find that telecommunications investment positively affects the creation of new products and processes, while software investment increases the demand for skilled workers, average labor productivity, and proximity to the optimal production frontier. We interpret these results by arguing that ICT investment modifies the trade-off between scale and scope economies. While software investment increases the scale of firm operations, telecommunications investment creates a flexibility option easing the switch from a Fordist to a flexible network productive model in which products and processes are more frequently adapted to satisfy consumers' taste for variety.

11.
The purpose of this paper is to apply some novel features in the theory of productivity indexes to the measurement of productivity gaps. It advances the proposition that one reason for the persistence of productivity gaps might be that the methodology of measuring gaps does not separate shifts of the production function due to intercountry efficiency from shifts due to intercountry differences in capacity utilization. In this paper we calculate productivity gaps for four OECD countries relative to the U.S., adjusted for cyclical variations in capacity utilization, for the period 1963–1982. The theoretical foundation of our measurement is based on a variable cost function approach with short-run fixity of capital. Without adjusting for differences in capacity utilization within the countries, productivity gaps are a mixture of differences in productivity and in capacity utilization. The refereeing process of this paper was handled through C. Morrison.

12.
13.
Hayashi and Prescott (Rev Econ Dyn 5(1):206–235, 2002) argue that Japan's 'lost decade' of the 1990s is explained by a slowdown in exogenous TFP growth rates. At the same time, other research suggests that Japanese banks' support for inefficient firms prolonged recessions by reducing productivity through misallocation of resources. Using data on large manufacturing firms between 1969 and 1996, the paper attempts to disentangle the factors behind the slowdown in productivity growth during the 1990s. The main results show a significant drop during the 1990s in within-firm productivity, the component that is not affected by reallocation of input and output shares across firms over time. Although we find that misallocation among large continuing firms represents a substantial drag on overall TFP growth for these firms throughout the sample period, the negative impact of misallocation was least visible during the 1990s. The significant reduction in within-firm productivity growth suggests that, as the Japanese economy has matured, a policy that fosters technological innovations via greater competition, R&D, and fast technological adoption may have become increasingly important in promoting economic growth.
Kazuhiko Odaki

14.
Ramez Ghazoul, Socio, 1979, 13(3): 149–157
Costs are the valuations placed on the use of resources; they include operating and opportunity costs. As such they vary according to one's orientation. In higher education, costs can be evaluated from the point of view of three entities: the university as an economic firm (institutional cost); the students as private individuals (private cost); and society at large (social cost). This paper considers the institutional costs of higher education. Based on a hypothetical college model, two methodologies are suggested for evaluating the institutional costs in the "production" of university graduates. The net-value-added method assumes that the cost of dropouts is inherent in the cost of graduates. The cost-per-student-year method assumes that dropouts and graduates are joint products of the educational system, each with its own separate costs. The application of the two cost models is demonstrated with empirical data from the University of Mosul in Iraq. The implications of the suggested methodologies for institutions with diverse specializations, high dropout rates, or large proportions of transfer students are also discussed.

15.
The methodologies that have been used in existing research to assess the efficiency with which organic farms are operating are generally based either on the stochastic frontier methodology or on a deterministic non-parametric approach. Recently, Kumbhakar et al. (J Econom 137:1–27, 2007) proposed a new nonparametric, stochastic method based on the local maximum likelihood principle. We use this methodology to compare the efficiency ratings of organic and conventional arable crop farms in the Spanish region of Andalucía. Nonparametrically encompassing the stochastic frontier model is especially useful when comparing the performance of two groups that are likely to be characterized by different production technologies.
Teresa Serra

16.
We develop a model of labor productivity as a combination of the capital-labour ratio, the vintage of the capital stock, regional externalities, and total factor productivity (TFP). The skewness of the TFP distribution is related to different growth theories. While negative skewness is consistent with the neo-Schumpeterian idea of catching up with leaders, zero skewness supports the neoclassical view that deviations from the frontier reflect only idiosyncratic productivity shocks. We argue that positive skewness is consistent with an economy where exogenous technology is combined with non-transferable knowledge accumulated in specific sectors and regions. This argument provides the framework for an empirical model based on stochastic frontier analysis. The model is used to analyse regional and sectoral inequalities in Denmark.
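The skewness-based classification in the abstract above can be illustrated with a deliberately simplified sketch: regress log productivity on its covariates by OLS, compute the skewness of the residuals, and read off the sign. The function name, the threshold, and the OLS stand-in for the paper's stochastic frontier model are all assumptions for illustration.

```python
import numpy as np
from scipy import stats

def classify_tfp_skewness(log_y, x, threshold=0.2):
    """Classify the skewness of TFP residuals from a simple OLS
    productivity regression (sketch only; the paper uses a full
    stochastic frontier model, not OLS).

    Negative skew ~ catching up with leaders (neo-Schumpeterian);
    near-zero skew ~ idiosyncratic shocks (neoclassical);
    positive skew ~ localized, non-transferable knowledge.
    """
    X = np.column_stack([np.ones(len(log_y)), x])
    beta, *_ = np.linalg.lstsq(X, log_y, rcond=None)
    skew = stats.skew(log_y - X @ beta)
    if skew < -threshold:
        label = "negative (catching-up)"
    elif skew > threshold:
        label = "positive (localized knowledge)"
    else:
        label = "approximately zero (idiosyncratic shocks)"
    return skew, label

# Simulated data with right-skewed (exponential) productivity residuals
rng = np.random.default_rng(1)
x = rng.normal(size=2000)
log_y = 1.0 + 0.5 * x + rng.exponential(1.0, size=2000)
skew, label = classify_tfp_skewness(log_y, x)
```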
Arnab Bhattacharjee

17.
We reexamine the Information Technology (IT) productivity paradox from the standpoints of theoretical basis, measurement issues and potential inefficiency in IT management. Two key objectives are: (i) to develop an integrated microeconomic framework for IT productivity and efficiency assessment using developments in production economics, and (ii) to apply the framework to a dataset used in prior research with mixed results to obtain new evidence regarding IT contribution. Using a stochastic frontier with a production economics framework involving the behavioral assumptions of profit maximization and cost minimization, we obtain a unified basis to assess both productivity and efficiency impacts of IT investments. The integrated framework is applied to a manufacturing database spanning 1978–1984. While previous productivity research with this dataset found mixed results regarding the contribution from IT capital, we show the negative marginal contribution of IT found in an important prior study is attributable primarily to the choices of the IT deflator and modeling technique. Further, by ignoring the potential inefficiency in IT investment and management, studies that have reported positive results may have significantly underestimated the true contribution of IT. This positive impact of IT is consistent across multiple model specifications, estimation techniques and capitalization methods. The stochastic production frontier analysis shows that while there were significant technical, allocative and scale inefficiencies, the inefficiencies reduced with an increase in the IT intensity. Given that the organizational units in our sample increased their IT intensity during the time period covered by the study, management was taking a step in the right direction by increasing the IT share of capital inputs. Our results add to a small body of MIS literature which reports significant positive returns from IT investments.

18.
Sources of profit change for Telstra, Australia's largest telecommunications firm, are examined. A new method allows changes in a firm's profits to be broken down into separate effects due to productivity change, price changes, and growth in the firm's size. This in turn allows us to calculate the distribution of the benefits of productivity improvements between consumers, labor, and shareholders. The results show that around half the benefits from Telstra's productivity improvements from 1984 to 1994 were passed on to consumers in the form of real price reductions.
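One standard way to split a profit change into quantity and price effects, in the spirit of the abstract above, is a Bennet (midpoint-weighted) decomposition; the quantity effect captures output growth and productivity, while the price effect shows what was passed on to consumers and input suppliers. This is a generic sketch with made-up numbers, not the paper's specific method or Telstra's data.

```python
import numpy as np

def bennet_profit_decomposition(p0, q0, w0, x0, p1, q1, w1, x1):
    """Bennet (midpoint-weighted) split of a profit change into a
    quantity effect and a price effect; the split is an exact identity.

    p, q: output prices and quantities; w, x: input prices and
    quantities, in periods 0 and 1.
    """
    p0, q0, w0, x0, p1, q1, w1, x1 = map(
        np.asarray, (p0, q0, w0, x0, p1, q1, w1, x1))
    quantity_effect = ((p0 + p1) / 2) @ (q1 - q0) - ((w0 + w1) / 2) @ (x1 - x0)
    price_effect = ((q0 + q1) / 2) @ (p1 - p0) - ((x0 + x1) / 2) @ (w1 - w0)
    d_profit = (p1 @ q1 - w1 @ x1) - (p0 @ q0 - w0 @ x0)
    assert np.isclose(d_profit, quantity_effect + price_effect)
    return d_profit, quantity_effect, price_effect

# One output, one input: output price falls (gain passed to consumers)
# while output grows strongly, so profit still rises
d, qe, pe = bennet_profit_decomposition(
    [10.0], [100.0], [5.0], [50.0],     # period 0: p, q, w, x
    [9.0], [130.0], [6.0], [55.0])      # period 1
```

A negative price effect alongside a positive quantity effect is exactly the pattern described in the abstract: productivity gains partly handed back through real price reductions.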
Kevin J. Fox

19.
We compare the productivity performance of 15 matched manufacturing sectors in Korea and Taiwan using Malmquist productivity indexes based on category-wise meta-frontiers for 1978–1996. Comparisons at the sector level are made using sequential multiplicative products of the indexes. The overall productivity and technology growth rates of Taiwan were higher than those of Korea. However, at disaggregated levels, the productivity and technology growth rates of Korea's high-tech industries were much larger than Taiwan's, while Taiwan's high overall growth rate rested mainly on its traditional and basic industries. The leading innovators of both countries are also discussed. JEL Classification: C43, D24, L16, O11, O47, O53
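The Malmquist index used in the abstract above can be illustrated in the simplest possible setting, a single input and a single output, where the distance to a period's frontier is just the unit's productivity ratio y/x divided by the best ratio observed that period. The paper's multi-input DEA meta-frontier computation is far richer; the function name and data here are illustrative.

```python
import numpy as np

def malmquist(x0, y0, x1, y1, i):
    """Geometric-mean Malmquist productivity index for unit i in a
    single-input, single-output setting (simplified sketch).

    D^s(x, y) = (y / x) / max_j (y_j^s / x_j^s): distance of an
    observation to period s's best-practice frontier.
    """
    x0, y0, x1, y1 = map(np.asarray, (x0, y0, x1, y1))
    frontier0 = (y0 / x0).max()
    frontier1 = (y1 / x1).max()
    d0_t0 = (y0[i] / x0[i]) / frontier0   # period-0 data, period-0 frontier
    d0_t1 = (y1[i] / x1[i]) / frontier0   # period-1 data, period-0 frontier
    d1_t0 = (y0[i] / x0[i]) / frontier1
    d1_t1 = (y1[i] / x1[i]) / frontier1
    return np.sqrt((d0_t1 / d0_t0) * (d1_t1 / d1_t0))

# Two units; unit 0 doubles its output per unit of input
x0, y0 = [1.0, 1.0], [1.0, 2.0]
x1, y1 = [1.0, 1.0], [2.0, 2.5]
m = malmquist(x0, y0, x1, y1, i=0)       # > 1 indicates productivity growth
```

The index also factors into efficiency change times technical change, which is how the paper separates catching-up from frontier shifts.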

20.
Excess capacity can be viewed as wasteful (an unnecessary cost) or as prudential (a ready source of supply). The role of excess capacity is an important issue at the individual firm level as well as at the community level. In this paper we explore hospital capacity for a sample of hospitals operating in the 15 largest standard metropolitan statistical areas (SMSAs) in the U.S. during 2002. Using Johanson's (1968, Production Functions and the Concept of Capacity, Namur, Belgium, Recherches Récentes sur le Fonction de Production (Collection, Economie Mathematique et Econometrie no. 2). [Reprinted in Finn R. Førsund (ed) (1987) The Collected Works of Leif Johanson, vol 1. Amsterdam, North-Holland, pp 350–282]) notion of capacity as the maximum rate of output possible from fixed inputs (i.e., without restrictions on variable inputs), we measure capacity in a frontier setting using directional distance functions. Rather than attempt to determine the "optimal" level of hospital capacity, we instead quantify capacity and capacity utilization rates at both the individual hospital and, by aggregating, the SMSA levels. After determining capacity and capacity utilization rates, we then introduce a model that calculates the changes in variable inputs that would be needed to utilize excess capacity. Finally, we introduce a simulation model that is used to examine whether each SMSA has enough "excess" hospital capacity to accommodate the loss of one of its five largest hospitals. The approach developed in this study should be of value to decision makers and planners in a variety of fields.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号