Similar Literature (20 records found)
1.
Early adopters play an important role in the innovation diffusion process. Over the past decades, many factors have been identified as predictors for early adoption of innovations. Less attention has been paid to the relationship between the early adoption of one generation of a specific product and the early adoption of successive product generations. This paper analyzes how early adoption of a new product generation depends on ownership, purchase experience and adoption times for previous generations of the same product. The paper develops predictive models of early adoption for four generations of video player products, based on a survey among 815 Australian consumers. The model allows the testing of various hypotheses. It is shown that previous generation variables outperform conventional socio-demographic and psychographic variables in predicting early adoption but also that the two variable types complement each other. The best-predicting models include both previous generation and socio-demographic/psychographic variables. It is concluded that previous generation models have substantial merits for new product forecasting as they are more parsimonious than conventional models and the data required to estimate them are relatively easy to obtain.

2.
Based on the behavioral assumptions of diffusion theory, this article proposes an extension of the Bass diffusion model that simultaneously captures the substitution pattern for each successive generation of a durable technological innovation, and the diffusion pattern of the base technology. Normative guidelines based on the model suggest that a firm should either introduce a new generation as soon as it is available or delay its introduction to a much later date at the maturity stage of the preceding generation. The decision depends on a number of factors including the relative size of the market potentials, gross profit margins, the diffusion and substitution parameters, and the discount factor of the firm. This "now or at maturity" rule is thus an extension and generalization of the "now or never" rule of Wilson and Norton [25]. Empirical and normative implications of the proposed model are explored for four successive generations of IBM mainframe computers: first generation (vacuum tubes); second generation (transistors); 360 family (integrated circuits); and 370 family (silicon chips). The model describes the growth of these generations well. The application of the normative guidelines suggests that IBM introduced the two successive generations of the 360 and 370 families too late, i.e., their time to market should have been shorter. Limitations and further extensions of the model and the application are discussed.
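As a concrete illustration of the kind of multigeneration diffusion model described above, here is a minimal Python sketch of a two-generation Bass-type substitution scheme in the Norton–Bass spirit; all parameter values (p, q, m1, m2, tau2) are illustrative assumptions, not the paper's IBM estimates.

```python
import numpy as np

def bass_F(t, p, q, tau):
    """Cumulative Bass adoption fraction, zero before launch time tau."""
    s = np.clip(t - tau, 0.0, None)
    F = (1.0 - np.exp(-(p + q) * s)) / (1.0 + (q / p) * np.exp(-(p + q) * s))
    return np.where(t >= tau, F, 0.0)

# Illustrative parameters (not estimated from the IBM data in the paper).
p, q = 0.01, 0.4        # innovation and imitation coefficients
m1, m2 = 100.0, 150.0   # market potentials of generations 1 and 2
tau2 = 5.0              # launch time of generation 2

t = np.linspace(0, 20, 201)
F1, F2 = bass_F(t, p, q, 0.0), bass_F(t, p, q, tau2)

# Generation 2 cannibalizes generation 1's buyers and adds its own potential.
sales_gen1 = m1 * F1 * (1.0 - F2)
sales_gen2 = F2 * (m2 + m1 * F1)
```

Under this setup, delaying tau2 shifts demand back to generation 1, which is exactly the trade-off the "now or at maturity" analysis weighs against profit margins and discounting.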

3.
Economic policy formulation suffers from many ills, not the least of which is a basic inadequacy in the methods of long-range economic forecasting. This article discusses the need for a longer time perspective in economic policy and the shortcomings of current methods in regard to philosophical assumptions, theoretical limitations, economic modeling problems, and institutional issues. Interdisciplinary policy modeling is suggested as a partial solution to these shortcomings, and two examples are offered—one in regional policy simulation and the other in world food-supply modeling.

4.
Conventional measures of forecasting accuracy reflect the view that forecast evaluation should concentrate on all large disturbances and ignore turning-point errors. Many forecasters, however, believe missed turns are the most grievous of all forecasting errors. Despite this consensus, no generally acceptable measure of this type of forecasting error exists. In this paper, such a measure—the probability of correctly forecasting directional change—is introduced. Values of this measure are computed for eleven well-known macroeconometric forecasting models. An inequality-type index of relative directional accuracy based on this measure is also presented and used to evaluate the models in terms of their relative accuracy.
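The generic idea behind such a directional criterion can be sketched as follows; this is a simple sign-agreement score under stated assumptions, not the authors' exact probability measure or their inequality-type index.

```python
import numpy as np

def directional_accuracy(actual, forecast):
    """Fraction of periods in which the forecast change has the same
    sign as the actual change (a sketch of a turning-point criterion)."""
    d_actual = np.sign(np.diff(actual))
    d_forecast = np.sign(np.diff(forecast))
    return np.mean(d_actual == d_forecast)

# Toy series: the forecast tracks the level well but misses the turns.
actual   = np.array([100, 102, 101, 103, 105, 104])
forecast = np.array([100, 101, 102, 103, 104, 105])
print(directional_accuracy(actual, forecast))  # 0.6: 3 of 5 directions right
```

A forecast can score well on squared-error measures while failing this score entirely, which is why the two kinds of criteria complement each other.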

5.
A method is presented for the continuous assessment of major technological advances—the George Washington University (GWU) forecast of emerging technologies. Environmental scanning and trend analysis are used to identify emerging technologies (ETs), and a Delphi-type survey then asks a panel of authorities to estimate the year each advance will occur, its associated probability, the potential size of its market, and the nation that will lead each ET. Eighty-five prominent ETs have been identified and grouped into 12 fields: energy, environment, farming and food, computer hardware, computer software, communications, information services, manufacturing and robotics, materials, medicine, space, and transportation. Results are presented from four survey rounds covering the past 8 years, and they are compared longitudinally to estimate the range of variance. The data are also divided into three successive decades to provide scenarios portraying the unfolding waves of innovation that comprise the coming technology revolution.

6.
In a great number of cases, the growth behavior of a field of technology is the sum total of many technological achievements, each following its own logistic-type life story. The trend in the positions of these sequential logistic curves is described by a logistic curve tangent to the individual curves—a logistic envelope. A method of constructing such an envelope from experimental data is proposed and applied to forecasting the development of primary electrochemical cells.
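A rough sketch of the envelope idea follows, assuming the field-level frontier can be proxied by the pointwise best of the individual logistic curves and the envelope by a logistic fitted to that frontier; the parameters and the fitting shortcut are illustrative, not the paper's tangency construction.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """Standard three-parameter logistic curve."""
    return L / (1.0 + np.exp(-k * (t - t0)))

# Three successive technologies, each with its own logistic life story
# (illustrative parameters, not the paper's electrochemical-cell data).
t = np.linspace(0, 30, 301)
generations = [logistic(t, 10, 0.8, 5),
               logistic(t, 25, 0.8, 12),
               logistic(t, 60, 0.8, 20)]

# Field-level capability at each date is the best technology available;
# fit a single logistic to that frontier as the envelope curve.
frontier = np.max(generations, axis=0)
(L, k, t0), _ = curve_fit(logistic, t, frontier, p0=[60, 0.3, 15])
```

Extrapolating the fitted envelope, rather than any single generation's curve, is what lets the method forecast the field beyond the saturation of the current technology.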

7.
The Rotterdam model is a discrete approximation to a continuous-time model, and the approximation errors are known to be small. Unlike other approximations, which fix parameters at the point of approximation, the Rotterdam model's coefficients differ at each observation but are treated as constants during estimation. This procedure introduces relatively large approximation errors which are a potential source of bias. In this paper the performance of the Rotterdam model in reproducing the characteristics of four integrable demand systems is assessed. As anticipated, the model's performance is directly related to the variability of the underlying coefficients, and the bias in the parameter estimates is minimal if the coefficients are relatively constant. The model can also discriminate well against an invalid alternative hypothesis. However, in the absence of a computationally feasible varying-parameter methodology, the conclusion has to be that tests of parameter stability are essential when using the Rotterdam model.

8.
This paper examines, via real data, some well-known models for technology substitution analysis. We propose a family of data-based transformed models that includes the models under examination as special cases. The basic thrust of the paper is the recognition that for technology substitution analysis the observations are time series data and hence are not independent. Also, the functional form of the model should be determined both by theoretical considerations and by the data on hand. This suggests that the traditional ordinary least squares procedure used in estimating the parameters, and the resulting forecasting procedures, are not adequate. The existing models examined here are Fisher–Pry, Gompertz, Weibull, and Normal. We stress the statistical aspects of the models and their relative merits in terms of predictive power. The criteria used for comparison are the mean squared deviation and the mean absolute deviation of the predicted values from the actual observations.
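To make the comparison criteria concrete, here is a sketch that fits Fisher–Pry and Gompertz substitution curves to a toy share series and scores them by mean squared and mean absolute deviation. It deliberately uses the conventional OLS-on-transformed-data approach that the paper argues is inadequate for dependent time-series observations; the data are invented.

```python
import numpy as np

t = np.arange(10, dtype=float)
f = np.array([.03, .05, .09, .15, .24, .37, .52, .66, .78, .86])  # market share

# Fisher-Pry: logit(f) is linear in time; Gompertz: -log(-log f) is linear.
def fit_predict(y_trans, inv):
    b, a = np.polyfit(t, y_trans, 1)   # slope, intercept
    return inv(a + b * t)

fp = fit_predict(np.log(f / (1 - f)), lambda z: 1 / (1 + np.exp(-z)))
gz = fit_predict(-np.log(-np.log(f)), lambda z: np.exp(-np.exp(-z)))

for name, pred in [("Fisher-Pry", fp), ("Gompertz", gz)]:
    msd = np.mean((pred - f) ** 2)   # mean squared deviation
    mad = np.mean(np.abs(pred - f))  # mean absolute deviation
    print(f"{name}: MSD={msd:.5f}, MAD={mad:.5f}")
```

The paper's point is that because the residuals of such series are autocorrelated, these OLS fits and the resulting forecast intervals can be misleading, which motivates the data-based transformed family it proposes.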

9.
Forecasting Port Container Throughput Based on an Optimal Combination Forecasting Model
童明荣, 薛恒新, 林琳. 《技术经济》, 2006, 25(12): 82-84, 92
In view of the nonlinear growth of port container throughput, three single forecasting models are built: a triple exponential smoothing model, a grey-system model, and a BP neural network model. Given the limitations of any single model, an optimal combination forecasting model is then proposed, with the optimality criterion of minimizing the weighted sum of the absolute values of the prediction errors on the test data; linear programming is used to determine the optimal combination weights. Finally, a worked example is presented for application and analysis.
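The optimality criterion described in the abstract (minimizing the weighted sum of absolute prediction errors on the test data, with weights summing to one) has a standard linear-programming reformulation. The sketch below implements that generic reformulation with invented forecasts and unit error weights for simplicity; it is not the authors' code.

```python
import numpy as np
from scipy.optimize import linprog

def optimal_combination_weights(preds, actual):
    """Weights minimizing the sum of absolute combined-forecast errors,
    via the standard LP trick |e_t| <= u_t (a sketch of the paper's
    optimality criterion, not its exact implementation)."""
    T, n = preds.shape
    # Decision vector: [w_1..w_n, u_1..u_T]; objective: sum of u_t.
    c = np.concatenate([np.zeros(n), np.ones(T)])
    # P w - u <= y  and  -P w - u <= -y  encode |P w - y| <= u.
    A_ub = np.block([[preds, -np.eye(T)], [-preds, -np.eye(T)]])
    b_ub = np.concatenate([actual, -actual])
    A_eq = np.concatenate([np.ones(n), np.zeros(T)]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + T))
    return res.x[:n]

# Toy test-set forecasts from three single models vs. the actual throughput.
preds = np.array([[98., 103., 101.], [205., 195., 199.], [310., 290., 305.]])
actual = np.array([100., 200., 300.])
print(optimal_combination_weights(preds, actual))
```

The absolute-error criterion, unlike the more common squared-error one, stays linear and therefore solvable by off-the-shelf LP, which is presumably why the authors chose linear programming to determine the weights.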

10.
This paper suggests a technology forecasting approach based on a semi-Markov model, which appropriately describes the probabilistic nature of a sequential technology development process. This approach focuses primarily on the utilization of the information that has been skipped in conventional Delphi survey data. That is, through a simple statistic, the interrelationships among sequential technology developments can be extracted in a formal structure of a semi-Markov model from the original Delphi panel's estimates. A simulation technique is developed to forecast the development process by utilizing the information on such interrelationships. This technique provides a flexible and useful tool for R&D planners or project managers, especially in post-analysis of Delphi forecasting. To make good use of the approach, a computer-based interactive Delphi data analysis system (IDEAS) is implemented on the IBM PC.
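A generic semi-Markov simulation of a sequential development process might look like the sketch below; the states, success probabilities, and exponential holding times are hypothetical stand-ins for quantities that, in the paper's approach, would be extracted from the Delphi panel's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical technology sequence A -> B -> C: probability of moving on
# to the next development, and mean years needed to complete each one.
transitions = {"A": ("B", 0.9), "B": ("C", 0.7), "C": (None, None)}
holding_mean = {"A": 3.0, "B": 5.0, "C": 0.0}

def simulate_path():
    """One realization: development time elapses for the current state,
    it is achieved, then the process moves on (or stops) probabilistically."""
    year, state, achieved = 0.0, "A", {}
    while state is not None:
        year += rng.exponential(holding_mean[state]) if holding_mean[state] else 0.0
        achieved[state] = year
        nxt, p = transitions[state]
        state = nxt if (nxt is not None and rng.random() < p) else None
    return achieved

# Monte Carlo estimate of whether and when technology C arrives.
paths = [simulate_path() for _ in range(10000)]
years_C = [path["C"] for path in paths if "C" in path]
print(len(years_C) / len(paths), np.mean(years_C))
```

Running many such paths yields a distribution over achievement years rather than the single point estimate a raw Delphi median provides, which is the extra information the approach aims to recover.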

11.
Many forms of analyzing future technology and its consequences coexist, for example, technology intelligence, forecasting, roadmapping, assessment, and foresight. All of these techniques fit into a field we call technology futures analysis (TFA). These methods have matured rather separately, with little interchange and sharing of information on methods and processes. There is a range of experience in the use of all of these, but changes in the technologies in which these methods are used—from industrial to information and molecular—make it necessary to reconsider the TFA methods. New methods need to be explored to take advantage of information resources and new approaches to complex systems. Examination of the processes sheds light on ways to improve the usefulness of TFA to a variety of potential users, from corporate managers to national policy makers. Sharing perspectives among the several TFA forms and introducing new approaches from other fields should advance TFA methods and processes to better inform technology management as well as science and research policy.

12.
Two methodologies—Kane's KSIM and Forrester's system dynamics—for modeling socioeconomic systems are compared and contrasted with regard to the manner in which each characterizes and classifies causality in socioeconomic systems. The equivalence of the so-called causal diagram used in system dynamics to the cross-impact matrix used in KSIM is indicated. Each method identifies exactly two classes of causal links—the similarity between the classes distinguished in system dynamics and those employed in KSIM is suggested. Then the assumptions regarding the nature of causality that are implicit within the two methodologies are compared. In this context, techniques for translating linear system-dynamics models to KSIM-like models (and vice versa) are provided. Examples are provided to illustrate the notions discussed in the article.
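Kane's KSIM update rule, as commonly stated, can be sketched as follows; the cross-impact matrix here is illustrative, and the sign convention (positive entries enhance, negative entries inhibit) follows the usual presentation of KSIM rather than anything specific to this article.

```python
import numpy as np

def ksim_step(x, A, dt=0.1):
    """One KSIM update: x_i(t+dt) = x_i(t)**p_i, with Kane's exponent built
    from the cross-impact matrix A (A[i, j] = impact of variable j on i)."""
    neg = (np.abs(A) - A) @ x   # inhibiting impacts on each variable
    pos = (np.abs(A) + A) @ x   # enhancing impacts on each variable
    p = (1 + 0.5 * dt * neg) / (1 + 0.5 * dt * pos)
    return x ** p

# Illustrative 2-variable system: variable 0 promotes 1, while 1 inhibits 0.
A = np.array([[0.0, -0.6],
              [0.8,  0.0]])
x = np.array([0.5, 0.2])
for _ in range(50):
    x = ksim_step(x, A)
print(x)  # trajectories stay bounded in (0, 1) by construction
```

Because the state enters as the base of an exponent, KSIM variables are confined to (0, 1), whereas system-dynamics levels are unbounded; that structural difference underlies the translation techniques the article discusses.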

13.
This paper describes the London Business School econometric model—the first fully computerized model of the UK—which has been used for regular public forecasting since 1966. The model, estimated on quarterly data, is organized around the income expenditure accounts with a fully integrated flow of funds sector which ensures consistency between portfolio decisions and income, savings and investment decisions. Aggregate demand is built up from its individual components so that demand influences are important for the short- and medium-term behaviour of the model. But there are important supply-side effects which work through the real exchange rate and real wages. Monetary conditions have a powerful effect on the model through the exchange rate, personal sector wealth and interest rates. Wages and employment are determined in a labour market in which employment decisions depend on the level of demand and real wages while real wages depend on the level of unemployment, real benefits and direct and indirect taxes as well as underlying trends in productivity. Asset prices move in any period to clear both the spot and the future market in assets so that current asset prices in the equity, gilt-edged and foreign exchange markets reflect all current information about the expected state of the economy. In contrast, goods prices adjust sluggishly. The combination of continuously clearing asset markets and sluggish wages and prices gives the model many of the theoretical characteristics associated with the open-economy models of Dornbusch and of Buiter and Miller.

14.
Summary. In overlapping-generations models of fiat money, the existence of a Pareto-optimal equilibrium—which defines an optimal quantity of money—is more general than well-known counter-examples suggest. Those examples, having no optimal equilibrium just because there are small variations in households' tastes and endowments across generations, are not typical. On the contrary: for an open-dense, full-measure subset of smooth stationary economies and an open-dense subset of continuous stationary economies, introducing small variations in tastes and endowments across generations preserves the existence of an optimal equilibrium. Put simply, optimal equilibria generically exist for nearly-stationary economies. I thank Scott Freeman, Katsuhiko Kawai, and two referees for proofreading this text; all led to clarifications.

15.
Intergenerational Redistribution with Short-lived Governments
We study the politics of intergenerational redistribution in an overlapping generations model with short-lived governments. The successive governments—who care about the welfare of the currently living generations and possibly about campaign contributions—are unable to pre-commit the future course of redistributive taxation. In a stationary politico-economic equilibrium, the intergenerational transfer in each period depends on the current value of the state variable and all expectations about future political outcomes are fulfilled. We find that there exist multiple stationary equilibria in many political settings. Steady-state welfare is often lower than it would be in the absence of redistributive politics.

16.
This paper reports on a detailed comparison of the practical application of two well-known forecasting methods—a surprisingly rare exercise. Delphi and cross-impact analyses are among the best-known methods that apply quantitative approaches to derive forecasts from expert opinion. Despite their prominence, there is a marked shortage of clear guidance as to when and where—and how—particular methods can be useful, or as to what their costs and benefits are. This study applied the two methods to the same area, future European transport systems, using the same expert knowledge base. The results of the implementation of the two techniques were assessed and evaluated, in part through two evaluation questionnaires completed by the experts who participated in the study. This paper describes these encounters with methodology and evaluation, presents illustrative results of the forecasting study, and draws lessons as to good practice in use of these specific methods, as well as concerning methodological good practice in general—for example, stressing the need for systematic documentation, and the scope for debate about established practices.

17.
Forecasting Inflation Using Dynamic Model Averaging
We forecast quarterly US inflation based on the generalized Phillips curve using econometric methods that incorporate dynamic model averaging. These methods not only allow for coefficients to change over time, but also allow for the entire forecasting model to change over time. We find that dynamic model averaging leads to substantial forecasting improvements over simple benchmark regressions and more sophisticated approaches such as those using time-varying coefficient models. We also provide evidence on which sets of predictors are relevant for forecasting in each period.
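The core recursion of dynamic model averaging (model probabilities flattened by a forgetting factor, then updated by each model's predictive likelihood) can be sketched as below. The per-model predictive means and standard deviations are placeholders that, in a full implementation, would come from time-varying-parameter regressions; the forgetting factor 0.99 and the toy data are assumptions.

```python
import numpy as np
from scipy.stats import norm

def dma_weights(pred_means, pred_sds, y, alpha=0.99):
    """Recursive model-probability updates with a forgetting factor, the
    core of dynamic model averaging (a sketch, not the authors' code)."""
    T, K = pred_means.shape
    pi = np.full(K, 1.0 / K)
    weights = np.empty((T, K))
    for t in range(T):
        # Forgetting flattens the weights, letting the favored model drift.
        pi = pi ** alpha
        pi /= pi.sum()
        weights[t] = pi
        # Bayesian update with each model's predictive likelihood of y_t.
        lik = norm.pdf(y[t], loc=pred_means[t], scale=pred_sds[t])
        pi = pi * lik
        pi /= pi.sum()
    return weights

# Toy example: model 0 tracks inflation early on, model 1 later.
y = np.array([2.0, 2.1, 2.3, 3.0, 3.4, 3.8])
means = np.column_stack([np.full(6, 2.1), np.linspace(2.0, 3.8, 6)])
sds = np.full((6, 2), 0.3)
print(dma_weights(means, sds, y)[-1])  # weight shifts toward model 1
```

It is this ability of the model weights themselves to move over time, on top of time-varying coefficients within each model, that the abstract credits for the forecasting improvements.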

18.
Previous studies indicate that the poor forecasting performance of constant parameter UK consumption expenditure models is caused by structural instability in the underlying data generating process. Typically, this instability is removed by reparameterization within the constant parameter framework. An alternative modelling strategy is to allow some, or all, of the parameters to vary over time. A UK non-durable consumption expenditure model with time-varying parameters is developed, based on the permanent income hypothesis of Friedman (1957). This model takes into account temporal changes in the average and marginal propensities to consume. The variation in the parameter estimates is given an economic interpretation in terms of the influence of omitted variables, namely UK financial liberalization and expectational changes. The forecasting performance of this model is superior to that of two widely used constant parameter models. Further tests show that, even if these constant parameter models are respecified as time varying parameter models, the authors' model still retains a superior forecasting performance.

19.
One of the methods of studying complex objects is the construction of a mathematical model containing such information about the object as is necessary to solve a definite problem connected with it. Mathematical modeling, based on the construction of models of various kinds, can be used in forecasting. Let a forecasting object A(X) be described by a vector X = (X₁, X₂, …, Xₙ) whose coordinates are parameters characterizing this object. The work presents a probabilistic model of forecasting and gives an example of a forecast for an object described by a set of two parameters.

20.
This paper examines a world which is composed of countries each inhabited by a population of farsighted overlapping generations and in which the only assets are the national currencies that grow at constant proportional rates. It is shown that the world economy is unstable in the sense that, away from the steady state, either the real value of each country's stock of money goes to zero or the world monetary system eventually collapses. It is also shown that the dynamic paths of the price level and the real stock of money in each country may move nonmonotonically over time.
