Similar Literature

20 similar documents found.
1.
This paper first explores a Main Battle Tank (MBT) data set with different statistical methods in order to identify the most appropriate variables to serve as reliable yardsticks when applying technology forecasting using data envelopment analysis (TFDEA). It then applies TFDEA to forecast MBT technologies, predicting the technology development years of MBTs commercialised from 1941 to 1994. The paper presents the TFDEA process in detail and identifies issues in the search for appropriate input and output variables for forecasting MBT technologies. The purpose of the study is to address these issues and identify appropriate data for predicting future trends in MBT technologies using TFDEA and multiple linear regression tools. Finally, the study provides an understanding of the technological advances being sought in MBT technologies and information for use in making decisions regarding development strategy.
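The efficiency scores at the heart of TFDEA come from solving one linear program per decision-making unit. The sketch below (Python, assuming NumPy and SciPy are available) computes an output-oriented CCR efficiency score for a single unit; the single-input, single-output data are illustrative, not the MBT data set, and TFDEA's frontier-year extrapolation step is omitted.

```python
import numpy as np
from scipy.optimize import linprog

# Output-oriented CCR efficiency score for one unit -- the DEA
# building block behind TFDEA. Data below are illustrative.
X = np.array([[2.0], [3.0], [4.0]])   # inputs for 3 units (e.g. weight)
Y = np.array([[4.0], [5.0], [4.0]])   # outputs (e.g. a capability score)
k = 2                                  # unit under evaluation

n = len(X)
c = np.r_[-1.0, np.zeros(n)]                     # maximise phi (linprog minimises)
A_in = np.c_[np.zeros((X.shape[1], 1)), X.T]     # sum(lam * x_j) <= x_k
A_out = np.c_[Y[k].reshape(-1, 1), -Y.T]         # phi * y_k <= sum(lam * y_j)
res = linprog(c,
              A_ub=np.vstack([A_in, A_out]),
              b_ub=np.r_[X[k], np.zeros(Y.shape[1])],
              bounds=[(0, None)] * (n + 1))
phi = res.x[0]   # phi >= 1; phi == 1 means the unit is on the frontier
```

Here unit 2 is inefficient: scaling the frontier mix would double its output for the same input, so phi comes out as 2.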

2.
In an effort to encourage consumers to purchase electric vehicles (EVs), the US government has been funding battery research to address the performance limitations of current EVs. This paper presents a study using technology forecasting with data envelopment analysis (TFDEA) to forecast future battery performance characteristics. The results were compared against the performance goals established by the US Department of Energy (DOE). We find that the foreseen progress of EV battery performance will be insufficient to meet the DOE's projected goals for the range that EVs can travel before running out of power. A new battery technology must therefore be developed, because the incremental improvements in current battery technologies leave EVs considerably short of the DOE performance specification for longer trip ranges.

3.
We introduce a new data set of over 230,000 monthly prices for 10 goods in 50 Canadian cities over the 40-year period from 1910 to 1950. This information, coupled with previously published price information from the late twentieth century, allows us to present one of the first comprehensive views of nominal rigidities and retail price dispersion over the past 100 years. We find that nominal rigidities have been conditioned upon prevailing rates of inflation, with a greater frequency of price changes occurring in the 1920s and the 1970s. Additionally, the process of retail market integration has followed a U-shaped trajectory, with many domestic markets better integrated – as measured by the average dispersion of retail prices – at mid-century than in the 1990s. We also consider the linkages between nominal rigidities and price dispersion, finding results consistent with present-day data.

4.
Since PHILLIPS (1958) wrote his landmark article on the relationship between money wage rates and unemployment, much work has been conducted on 'Phillips curves'. While some writers, such as LIPSEY (1960), SAMUELSON and SOLOW (1960) and HANSEN (1970), remained relatively close to Phillips' original model, others strayed from it. BRECHLING (1968), DICKS-MIREAUX and DOW (1959), ECKSTEIN (1968), ECKSTEIN and WILSON (1962), HAMERMESH (1970), KUH (1967), PERRY (1966), PHELPS (1968), and TAYLOR (1970) are a few researchers who felt Phillips' formulation did not adequately explain changes in money wage rates. Their approaches ranged from formulating entirely new models to merely adding other explanatory variables to Phillips' model.

The present study estimates a Phillips curve for the United Kingdom within the framework of Phillips' basic hypothesis. Phillips grouped all observations according to unemployment levels, averaged the wage changes for each group, and fitted a curve by trial and error to the compact body of data. The current approach instead fits a curve to all data points using a direct estimation procedure. Only two basic variables are used: money wage rates and unemployment.

REES and HAMILTON (1967) pointed out that the estimated relation between wage changes and unemployment is highly sensitive to the form of the variables and the regression employed. One can obtain widely scattered results either by changing the type of numerical analysis applied to existing data or by changing the defined variables. It would not be surprising, then, if the results of the present analysis differ from previous work: not only do some of the data sources differ from those previously employed, but the numerical analysis is entirely new to Phillips-curve theory. Spline theory is used to fit the Phillips curve.
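The spline-fitting step can be illustrated with a regression spline estimated by ordinary least squares. This is a sketch in Python/NumPy on simulated (unemployment, wage-change) points, not Phillips' or the study's data, and the knot locations are arbitrary assumptions.

```python
import numpy as np

# Least-squares cubic regression spline for the wage-change /
# unemployment relation, built from a truncated-power basis.
# Simulated Phillips-like data; knots are arbitrary.
rng = np.random.default_rng(0)
u = np.sort(rng.uniform(1.0, 8.0, 60))       # unemployment rate (%)
w = 8.0 / u - 1.0 + rng.normal(0, 0.3, 60)   # money-wage change (%)

knots = [2.5, 4.5, 6.5]
X = np.column_stack([u**p for p in range(4)] +
                    [np.maximum(u - k, 0.0)**3 for k in knots])
beta, *_ = np.linalg.lstsq(X, w, rcond=None)
fitted = X @ beta                             # smooth fitted curve
```

Unlike Phillips' trial-and-error grouping, the spline coefficients come directly from one least-squares solve over all observations.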

5.
This note follows the article by L. V. Defris, 'Leading Australian Cyclical Indicators, 1960–1975' in Review 475 and reports on the current movements of the Dynamic Deviation Reference Cycle and the Housing, Finance and 10 Leading Indicator Indices. The indices have been re-estimated by the methods described in that article on the latest available data, and the re-estimation results in minor changes in previously published values of the indices.

6.
This paper computes new indexes of output for refrigerators, using hedonic methods to adjust for quality change. The hedonic technique is applied in a new way (it is used to make quality adjustments to prices before they are used in the index), and the results are compared with those from methods used in previous hedonic investigations. There are three major findings. (1) Overall (1960–1972), our hedonic deflated output series rise more rapidly than conventional measures, because the price indexes used for deflation rise more slowly. (2) The output measures fluctuate more than do output measures produced by conventional methods, because adding hedonic quality adjustments to WPI indexes moves them up in some years and down in others, and the resulting adjustments to the output series were positively correlated with changes in output. (3) Applying methods used in previous studies produces larger adjustments to the published indexes, suggesting that some of the differences noted in previous studies between hedonic indexes and official published indexes are related to computational methods, not to quality adjustment.
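The core hedonic step — regressing log price on quality characteristics plus time dummies and reading a quality-adjusted price index off the dummy coefficients — can be sketched as follows. The data are synthetic and the single "size" characteristic is a placeholder, not a variable from the refrigerator study.

```python
import numpy as np

# Hedonic price index sketch: regress log price on a quality
# characteristic plus year dummies; exponentiated dummy coefficients
# give a quality-adjusted index. All data are synthetic.
rng = np.random.default_rng(1)
years = np.repeat([0, 1, 2], 30)               # three model years
size = rng.uniform(10.0, 20.0, 90)             # quality characteristic
year_fx = np.array([0.0, 0.05, 0.08])          # true pure-price drift
logp = 1.0 + 0.04 * size + year_fx[years] + rng.normal(0, 0.02, 90)

D = np.column_stack([np.ones(90), size,
                     (years == 1).astype(float),
                     (years == 2).astype(float)])
coef, *_ = np.linalg.lstsq(D, logp, rcond=None)
index = np.exp(np.r_[0.0, coef[2:]])           # base year normalised to 1
```

Because the size coefficient absorbs quality differences, the year dummies capture pure price change, which is what the study deflates output by.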

7.
Growth and human capital: good data, good results
We present a new data set for years of schooling across countries for the 1960–2000 period. The series are constructed from the OECD database on educational attainment and from surveys published by UNESCO. Two features that improve the quality of our data with respect to other series, particularly for series in first-differences, are the use of surveys based on uniform classification systems of education over time, and an intensified use of information by age groups. As a result of the improvement in quality, these new series can be used as a direct substitute for Barro and Lee’s (2001; Oxford Economic Papers, 3, 541–563) data in empirical research. In standard cross-country growth regressions we find that our series yield significant coefficients for schooling. In panel data estimates our series are also significant even when the regressions account for the accumulation of physical capital. Moreover, the estimated macro return is consistent with those reported in labour studies. These results differ from the typical findings of the earlier literature and are a consequence of the reduction in measurement error in the series.

8.
This paper investigates the possibility of Granger causality between the logarithms of real exports and real GDP in twenty-four OECD countries from 1960 to 1997. A new panel data approach is applied which is based on SUR systems and Wald tests with country specific bootstrap critical values. Two different models are used. A bivariate (GDP–exports) model and a trivariate (GDP–exports–openness) model, both without and with a linear time trend. In each case the analysis focusses on direct, one-period-ahead causality between exports and GDP. The results indicate one-way causality from exports to GDP in Belgium, Denmark, Iceland, Ireland, Italy, New Zealand, Spain and Sweden, one-way causality from GDP to exports in Austria, France, Greece, Japan, Mexico, Norway and Portugal, two-way causality between exports and growth in Canada, Finland and the Netherlands, while in the case of Australia, Korea, Luxembourg, Switzerland, the UK and the USA there is no evidence of causality in either direction.
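For a single country, the one-period-ahead causality check reduces to comparing a restricted lag regression (own lag only) with an unrestricted one that adds the lagged regressor. The sketch below (Python/NumPy) computes the corresponding Wald/F statistic on simulated data in which x does cause y; the paper's SUR system and country-specific bootstrap critical values are omitted.

```python
import numpy as np

# One-lag Granger-causality check for a single country.
# Data are simulated so that x (exports) does cause y (GDP).
rng = np.random.default_rng(2)
T = 200
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t-1] + rng.normal()
    y[t] = 0.3 * y[t-1] + 0.4 * x[t-1] + rng.normal()

def ssr(Z, v):
    # sum of squared residuals of an OLS fit of v on Z
    b, *_ = np.linalg.lstsq(Z, v, rcond=None)
    e = v - Z @ b
    return e @ e

Y = y[1:]
Z_u = np.column_stack([np.ones(T-1), y[:-1], x[:-1]])  # unrestricted
Z_r = Z_u[:, :2]                                       # lagged x dropped
F = (ssr(Z_r, Y) - ssr(Z_u, Y)) / (ssr(Z_u, Y) / (T - 1 - 3))
```

A large F rejects the null that lagged x adds nothing; the paper replaces the asymptotic critical value with a country-specific bootstrap one.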

9.
In an attempt to account for the huge observed disparity in international incomes, several recent papers study models in the spirit of R. Solow (1960, in “Mathematical Methods in the Social Sciences 1959,” Stanford Univ. Press, Stanford) where the adoptions of better technologies require investments in new equipment. This paper continues this line of research. It describes an economy in which firms install more productive machines and subsequent to these adoptions, learn how to use those machines. In contrast to these other papers, this one does not predict that firms always adopt the frontier technology whenever they switch technologies. In this model both the upper and lower supports of the distribution of operated technologies may differ between economies that differ in policy. Consequently, this model can generate larger differences in international incomes than these other models. These differences are still small relative to the data, however. Journal of Economic Literature Classification Numbers: O11, O33, O41.

10.
Objectives:

To conduct an economic evaluation of the currently prescribed treatments for stroke prevention in patients with non-valvular atrial fibrillation (NVAF), including warfarin, aspirin, and novel oral anticoagulants (NOACs), from a French payer perspective.

Methods:

A previously published Markov model was adapted in accordance with the new French guidelines of the Commission for Economic Evaluation and Public Health (CEESP), adopting the recommended efficiency-frontier approach. A cohort of patients with NVAF eligible for stroke-preventive treatment was simulated over a lifetime horizon. Clinical events modeled included stroke, systemic embolism, intracranial hemorrhage, other major bleeds, clinically relevant non-major bleeds, and myocardial infarction. Efficacy and bleeding data for warfarin, apixaban, and aspirin were obtained from the ARISTOTLE and AVERROES trials, whilst efficacy data for the other NOACs came from published indirect comparisons. Acute medical costs were obtained from a dedicated analysis of the French national hospitalization database (PMSI). Long-term medical costs and utility data were derived from the literature. Univariate and probabilistic sensitivity analyses were performed to assess the robustness of the model projections.
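The mechanics of such a Markov cohort model can be sketched in a few lines. All numbers below — the three states, transition probabilities, costs, utilities and the 2.5% discount rate — are illustrative assumptions, not the published model's inputs.

```python
import numpy as np

# Minimal Markov cohort sketch of the modelling framework described
# above. All parameters are illustrative assumptions.
P = np.array([[0.90, 0.06, 0.04],    # well -> well/post-stroke/dead
              [0.00, 0.85, 0.15],    # post-stroke ->
              [0.00, 0.00, 1.00]])   # dead is absorbing
cost = np.array([1000.0, 8000.0, 0.0])   # annual cost per state (EUR)
utility = np.array([0.85, 0.60, 0.0])    # QALY weight per state

state = np.array([1.0, 0.0, 0.0])        # cohort starts in "well"
total_cost = 0.0
total_qaly = 0.0
for year in range(40):                   # lifetime horizon
    disc = 1.025 ** -year                # 2.5% annual discounting
    total_cost += disc * (state @ cost)
    total_qaly += disc * (state @ utility)
    state = state @ P                    # one annual cycle
```

Running this once per treatment arm (with arm-specific transition probabilities) yields the cost and QALY pairs from which efficiency frontiers and ICERs are built.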

Results:

Warfarin and apixaban were the two optimal treatment choices, as the other five treatment strategies including aspirin, dabigatran 110 mg, dabigatran in sequential dosages, dabigatran 150 mg, and rivaroxaban were strictly dominated on the efficiency frontier. Further, apixaban was a cost-effective alternative vs warfarin, with an incremental cost of €2314 and an incremental quality-adjusted life year (QALY) gain of 0.189, corresponding to an incremental cost-effectiveness ratio (ICER) of €12,227/QALY.

Conclusions:

Apixaban may be the most economically efficient alternative to warfarin in NVAF patients eligible for stroke prevention in France. All other strategies were dominated, with apixaban less costly yet more effective than each of them. As formally requested by the CEESP, these results need to be verified in a French clinical setting using the stroke reduction and bleeding safety observed in real-life cohorts of patients using these anticoagulants.

11.
Objective:

This study constructed the Economic and Health Outcomes Model for type 2 diabetes mellitus (ECHO-T2DM), a long-term stochastic microsimulation model, to predict the costs and health outcomes in patients with T2DM. Naturally, the usefulness of the model depends upon its predictive accuracy. The objective of this work is to present results of a formal validation exercise of ECHO-T2DM.

Methods:

The validity of ECHO-T2DM was assessed using criteria recommended by the International Society for Pharmacoeconomics and Outcomes Research/Society for Medical Decision Making (ISPOR/SMDM). Specifically, the results of a number of clinical trials were predicted and compared with observed study end-points using a scatterplot-and-regression approach. An F-test of the best-fitting regression was added to assess whether it differs statistically from the identity (45°) line that defines perfect prediction. In addition to testing the full model on all of the validation-study data, separate tests were performed for microvascular, macrovascular, and survival outcomes. The validation tests were also performed separately by type of data (used vs not used to construct the model, economic simulations, and treatment effects).

Results:

The intercept and slope coefficients of the best-fitting regression line between the predicted outcomes and the corresponding trial end-points in the main analysis were −0.0011 and 1.067, respectively, and the R2 was 0.95. A formal F-test of no difference between the fitted line and the identity line could not reject equality (p = 0.16). The high R2 confirms that the data points are closely (and linearly) associated with the fitted regression line. Additional analyses identified that disagreement was highest for macrovascular end-points, for which the intercept and slope coefficients were 0.0095 and 1.225, respectively. For mortality, the R2 was 0.95 and the estimated intercept and slope coefficients were 0.017 and 1.048, respectively, and the F-test narrowly rejected equality (p = 0.04). The sub-set of microvascular end-points showed some tendency to over-predict (the slope coefficient was 1.095), although concordance between predictions and observed values could not be rejected (p = 0.16).
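The joint test of intercept = 0 and slope = 1 can be reproduced with a standard restricted-vs-unrestricted F statistic. The sketch below uses simulated predicted-vs-observed pairs, not the ECHO-T2DM validation data.

```python
import numpy as np

# F-test of intercept = 0 and slope = 1 for predicted-vs-observed
# validation points. The 40 data pairs are simulated.
rng = np.random.default_rng(4)
obs = rng.uniform(0.0, 1.0, 40)              # trial end-points
pred = obs + rng.normal(0.0, 0.03, 40)       # model predictions

X = np.column_stack([np.ones(40), obs])
b, *_ = np.linalg.lstsq(X, pred, rcond=None)
ssr_u = np.sum((pred - X @ b) ** 2)          # free intercept and slope
ssr_r = np.sum((pred - obs) ** 2)            # identity line imposed
F = ((ssr_r - ssr_u) / 2) / (ssr_u / (40 - 2))
# Compare F with the F(2, 38) critical value: a small F means the
# fitted line is statistically indistinguishable from the 45° line.
```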

Limitations:

Important study limitations include: (1) data availability limited testing to end-of-study outcomes rather than time-varying outcomes during the studies analyzed; (2) complex inclusion and exclusion criteria in two studies were difficult to replicate; (3) some of the studies were older and reflect outdated treatment patterns; and (4) the authors were unable to identify published data on resource use and costs of T2DM suitable for testing the validity of the economic calculations.

Conclusions:

Using conventional methods, ECHO-T2DM simulated the treatment, progression, and patient outcomes observed in important clinical trials with an accuracy consistent with other well-accepted models. Macrovascular outcomes were over-predicted, which is common in health-economic models of diabetes (and may be related to a general over-prediction of event rates in the United Kingdom Prospective Diabetes Study [UKPDS] Outcomes Model). Work is underway in ECHO-T2DM to incorporate new risk equations to improve model prediction.

12.
Data created in a controlled laboratory setting are a relatively new phenomenon for economists. Traditional data-analysis methods using either parametric or nonparametric tests are not necessarily the best option available to economists analyzing laboratory data. In 1935, Fisher proposed the randomization technique as an alternative data-analysis method for examining treatment effects: the observed data are used to compute a test statistic, treatment labels are then shuffled across the data and the statistic recalculated, and the original statistic is ranked against all possible test statistics these data can generate to obtain a p-value. A Monte Carlo analysis of the t-test, the Mann-Whitney U-test, and the exact randomization t-test is conducted. The exact randomization t-test compares favorably to the other two tests in terms of both size and power. Given the limited distributional assumptions necessary to implement the exact randomization test, these results suggest that experimental economists should consider using it more often.
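With small samples the randomization distribution can be enumerated exactly. This Python sketch uses the difference in means as the statistic — with fixed group sizes it ranks relabelings in the same order as the two-sample t statistic — on made-up data, not the paper's Monte Carlo design.

```python
import numpy as np
from itertools import combinations

# Exact randomization test: recompute the statistic under every
# reassignment of treatment labels and rank the observed value.
# The eight observations below are made up.
a = np.array([4.1, 5.0, 5.6, 6.2])   # "treatment" observations
b = np.array([3.0, 3.4, 3.9, 4.4])   # "control" observations
pooled = np.concatenate([a, b])
n = len(a)

def diff_means(idx):
    g1 = pooled[list(idx)]
    g2 = np.delete(pooled, list(idx))
    return g1.mean() - g2.mean()

observed = diff_means(range(n))
perms = [diff_means(c) for c in combinations(range(len(pooled)), n)]
# two-sided p-value: share of relabelings at least as extreme
p_value = np.mean([abs(s) >= abs(observed) - 1e-12 for s in perms])
```

Only exchangeability of observations under the null is assumed, which is the limited-distributional-assumptions point the abstract makes.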

13.
The popular sentiment-based investor index SBW introduced by Baker and Wurgler (2006, 2007) is shown to have no predictive ability for stock returns. Huang et al. (2015) developed a new investor sentiment index, SPLS, which can predict monthly stock returns within a linear framework; the linear model, however, may suffer from misspecification and lack of robustness. We provide statistical evidence that the relationship between stock returns, SBW and SPLS is characterized by structural instability and inherent nonlinearity. Given this, using a nonparametric causality approach, we show that neither SBW nor SPLS predicts stock market returns or even their volatility, contrary to previous empirical evidence.

14.
Club Convergence in Carbon Dioxide Emissions
We examine convergence in carbon dioxide emissions among 128 countries for the period 1960–2003 by means of a new methodology introduced by Phillips and Sul (Econometrica 75(6):1771–1855, 2007a). Contrary to previous studies, our approach allows us to test for club convergence, i.e. to identify groups of countries that converge to different equilibria. Our results suggest convergence in per capita CO2 emissions among all the countries under scrutiny in the early years of our sample. However, there appear to be two separate convergence clubs in the recent era that converge to different steady states. Interestingly, we also find evidence of transitioning between the two convergence clubs, suggesting either slow convergence between the two clubs or a tendency for some countries to move from one club to the other.

15.
We examine the effect of corporate social responsibility (CSR) quality ratings on the financial-distress levels of Chinese enterprises, using the previously unexplored China-specific Altman 'ZChina Score' in the context of CSR and data from 749 firms over the 2009–2014 period. First, we find that CSR quality ratings significantly reduce Chinese firms' distress levels. Second, we find that the ability of CSR to reduce distress levels is greater in non-state-owned Chinese firms than in state-owned ones. Finally, we find similar results when we divide the data into high and low CSR ratings and levels of distress. Our results are robust to potential endogeneities.

16.
In this paper we forecast annual budget deficits using monthly information. Using French monthly data on central government revenues and expenditures, the method we propose consists of: (1) estimating monthly ARIMA models for all items of central government revenues and expenditures; (2) inferring the annual ARIMA models from the monthly models; (3) using the inferred annual ARIMA models to perform one-step-ahead forecasts for each item; (4) compounding the annual forecasts of all revenues and expenditures to obtain an annual budget deficit forecast. The major empirical benefit of this technique is that as soon as new monthly data become available, annual deficit forecasts are updated. This allows us to detect in advance possible slippages in central government finances. For years 2002–2004, forecasts obtained following the proposed approach are compared with a benchmark method and with official predictions published by the French government. An evaluation of their relative performance is provided.
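The updating idea can be sketched with a toy stand-in for the ARIMA machinery: fit an AR(1) to first differences of a monthly series, forecast the months remaining in the year, and add them to the months already observed. Everything here — the series, the nine observed months, and the AR(1) in place of item-by-item ARIMA models — is an illustrative assumption.

```python
import numpy as np

# Toy monthly-to-annual updating sketch. All inputs are simulated.
rng = np.random.default_rng(3)
series = 100.0 + np.cumsum(rng.normal(0.0, 1.0, 69))  # history + Jan-Sep
observed = series[-9:]                                # current-year months

d = np.diff(series)
phi = (d[:-1] @ d[1:]) / (d[:-1] @ d[:-1])            # AR(1) on differences

level, step = observed[-1], d[-1]
forecasts = []
for _ in range(3):                                    # Oct-Dec
    step = phi * step
    level = level + step
    forecasts.append(level)

annual_forecast = observed.sum() + sum(forecasts)
```

Each time a new month arrives, `observed` grows, one fewer month is forecast, and `annual_forecast` is recomputed — which is how slippages can be detected within the year.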

17.
We study the relation between foreign entry in U.S. service-sector industries and the revealed comparative advantage of the investing country, using U.S. Bureau of Economic Analysis firm-level data on all foreign takeovers and new foreign-owned firms from 1998 to 2008. We find that foreign acquisitions in the service sector are in industries of U.S. comparative advantage, while new foreign firms are in industries of the investing country's comparative advantage. This suggests that foreign acquisitions in the service sector are not directly related to foreign investors' competitiveness in the industry of investment. In contrast, foreign investors in new service-sector firms come from countries with a competitive edge in the industries of investment. We also find that foreign investors of new service-sector firms are from Organization for Economic Co-operation and Development (OECD) countries with a comparative disadvantage in royalties and trademarks. (JEL F21, F23, G34)

18.
Errors introduced by using aggregate data in estimating a consumer demand model have long been a concern. We study the effects of such errors on elasticity estimates derived from AIDS and QUAIDS models. Based on a survey of published articles, a generic parameterization of the income distribution, and the range of Gini coefficients reported for 28 OECD countries, we generate and analyze a large number of “observations” on the differences between elasticities calculated at the aggregate level and those calculated at the micro level. We suggest a procedure for evaluating the likely range of aggregation error when a model is estimated with aggregate data.

19.
The long-standing archives at economics journals have not facilitated the reproduction of published results. The data-only archives at the Journal of Business and Economic Statistics and the Economic Journal fail in part because most authors do not contribute data. Results published in the FRB St. Louis Review can rarely be reproduced using the data+code in the journal archive. Recently created archives at top journals should avoid the mistakes of their predecessors. We categorize reasons for archives' failures and identify successful policies.

20.
Business incubators emerged in the 1960s, and after more than 50 years of development they have attracted great attention from academia, industry, and government in many countries as vehicles for advancing technological innovation. Taking articles in foreign journals as the data source, this study applies bibliometric analysis to the literature covering the various fields of business-incubator research published from 2007 to 2011, examining publication year, journal, degree of author collaboration, and research methods. It also reviews incubator research in developed countries, newly industrialised countries, and developing countries separately, in order to give a comprehensive picture of trends in business-incubator research and provide lessons for the development of incubators in China.
