20 similar documents retrieved; search time: 15 ms
1.
Po-Hsuan Hsu, Chi-Hsiu Wang, Joseph Z. Shyu, Hsiao-Cheng Yu 《Technological Forecasting and Social Change》2003,70(1):67-82
Forecasting the production of technology industries is important to entrepreneurs and governments, but usually suffers from market fluctuations and explosive growth. This paper proposes a Litterman Bayesian vector autoregression (LBVAR) model for production prediction based on the interaction of industrial clusters. Related industries within industrial clusters are included in the LBVAR model to provide more accurate predictions. The LBVAR model combines the advantages of Bayesian statistics in small-sample forecasting with the dynamic properties of the vector autoregression (VAR) model. Two technology industries in Taiwan, the photonics industry and the semiconductor industry, are used to examine the LBVAR model with a rolling forecasting procedure. The LBVAR model was found to provide outstanding predictions for these two technology industries in comparison with the autoregression (AR) model and the VAR model.
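The Litterman (Minnesota) prior shrinks VAR coefficients toward a random walk: own first lags toward 1, cross lags toward 0. A minimal sketch of the idea follows, with simulated two-industry data standing in for the paper's photonics and semiconductor series; the tightness value and lag order are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated production indices for two related industries (stand-ins for
# the paper's photonics and semiconductor series; illustrative only)
T, k = 80, 2
A_true = np.array([[0.7, 0.2], [0.1, 0.8]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = y[t - 1] @ A_true.T + rng.normal(0, 0.1, k)

# VAR(1) design: regress y_t on y_{t-1}
X, Y = y[:-1], y[1:]

# Litterman-style prior: coefficients shrink toward a random walk
# (own lag = 1, cross lags = 0) with overall tightness lam.
lam = 0.2
prior_mean = np.eye(k)

# Posterior mean under a conjugate Gaussian prior = ridge toward prior_mean
XtX = X.T @ X
B_post = np.linalg.solve(XtX + np.eye(k) / lam**2,
                         X.T @ Y + (np.eye(k) / lam**2) @ prior_mean)
A_post = B_post.T

# One-step-ahead forecast for both industries jointly
forecast = y[-1] @ A_post.T
```

Shrinking toward the random walk is what lets the model behave sensibly in the small samples typical of young technology industries: with little data the forecast defaults to "no change", and the data pull it toward the estimated cross-industry dynamics as evidence accumulates.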
2.
Over the last 20 years, smartphone technologies at the device level have undergone tremendous change. This paper puts forward a framework to characterise, assess and forecast smartphone technologies at the device level. The study assesses and forecasts the technological advancement observed in smartphones using technology forecasting with data envelopment analysis (TFDEA), with the objective of evaluating the technological rate of change in the device. A quarterly data set comprising 31 quarters from 2007 to 2014 was analysed for smartphone releases in a particular price range. For validation purposes, the point of forecasting was set in 2012, using the data between 2007 and 2012 to forecast the technologies from then until 2014. The results indicate that the rate of technological change in smartphones is accelerating.
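TFDEA builds on DEA efficiency scores computed for products released over time. The DEA building block can be sketched as a linear program with scipy; the phone data below, the choice of price as the single input, and the two output specs are invented for illustration and are not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical smartphones: input = price; outputs = (CPU GHz, camera MP)
price = np.array([300.0, 350.0, 320.0, 400.0])
outputs = np.array([[1.0, 5.0], [1.2, 8.0], [1.4, 8.0], [1.5, 12.0]])
n = len(price)

def ccr_output_efficiency(j0):
    """Output-oriented CCR score phi for unit j0 (phi >= 1; 1 = on frontier)."""
    # Decision variables: [phi, lam_1..lam_n]; linprog minimizes, so use -phi
    c = np.concatenate(([-1.0], np.zeros(n)))
    A_ub, b_ub = [], []
    # Input constraint: sum_j lam_j * price_j <= price_j0
    A_ub.append(np.concatenate(([0.0], price)))
    b_ub.append(price[j0])
    # Output constraints: phi * y_r,j0 - sum_j lam_j * y_r,j <= 0
    for r in range(outputs.shape[1]):
        A_ub.append(np.concatenate(([outputs[j0, r]], -outputs[:, r])))
        b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

scores = [ccr_output_efficiency(j) for j in range(n)]
```

In TFDEA proper, scores like these are computed with the frontier fixed at each release date, and the drift of the frontier over time gives the technological rate of change.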
3.
An evolved version of the Soviet-originated Theory of Inventive Problem Solving (TRIZ) contains a series of generically predictable technology and business evolution trends uncovered from the systematic analysis of over 2 million patents, academic journals and business texts. The current state of the art, recorded together for the first time in this paper, brings the total number of generic technical trends to over 30 and the number of business trends to over 20. The paper describes some of the newly discovered trends and their incorporation into a design method that allows individuals and businesses first to establish the relative maturity of their current systems and then, more importantly, to identify areas where 'evolutionary potential' exists. The paper introduces this concept of evolutionary potential, defined as the difference between the relative maturity of the current system and the point where it has reached the limits of each of the evolution trends, through a number of case study examples focused on the design and evolution of complex systems.
4.
Seongyong Choi 《Technology Analysis & Strategic Management》2014,26(3):241-251
Vacant technology forecasting (VTF) is a technology forecasting approach for identifying future technological needs in a given industrial field. Knowing the future trend of developing technology is important for the R&D planning of a company and a country. In this paper, we propose a new Bayesian model for patent clustering, a VTF methodology based on patent data analysis. Our method combines Bayesian learning with an ensemble method to construct the VTF model. To illustrate the practical use of the proposed methodology, we perform a case study of a given technology domain using patent documents retrieved from patent databases worldwide.
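The abstract does not specify the Bayesian learning step, so the following is only a loose sketch of the patent-clustering idea: TF-IDF vectors for a handful of invented patent snippets, reduced and clustered with a Bayesian mixture as a stand-in for the paper's Bayesian-learning-plus-ensemble construction. All documents and parameters are hypothetical.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.mixture import BayesianGaussianMixture

# Tiny hypothetical patent snippets (stand-ins for retrieved documents)
patents = [
    "battery cathode lithium electrode charge",
    "lithium battery anode electrolyte cell",
    "neural network image recognition model",
    "deep learning network training inference",
    "battery pack thermal management cell",
    "image classification convolutional network",
]

# Represent patents as TF-IDF vectors, reduce dimension, then cluster with
# a Bayesian mixture; sparsely occupied regions of the technology space
# would be candidates for "vacant" technologies.
X = TfidfVectorizer().fit_transform(patents)
Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
bgm = BayesianGaussianMixture(n_components=3, random_state=0).fit(Z)
labels = bgm.predict(Z)
```

On real patent corpora the vectorization, the number of components, and the vacancy criterion would all need domain-specific tuning; this only shows the shape of the pipeline.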
5.
6.
D. N. P. Murthy 《Technological Forecasting and Social Change》1979,14(1):27-37
A stochastic model for technology forecasting is proposed. A complete analysis of the model is given, and an application to a real problem is presented.
7.
Focusing on the general issues of reliability and validity, several specific problems evident in reported applications of the Delphi method to forecasting are discussed. Reliability threats, which are noted in a number of procedural aspects of the method, arise from ill-considered procedural variations and lack of standardization. While validity threats are also found in several procedural aspects of the tool, they arise principally from pressures for convergence of predictions. This feature of the method, along with certain structural characteristics, is found to critically undermine its forecasting ability. Having discussed the nature of these difficulties in some detail, the paper closes with consideration of the reasons for the continued use of Delphi in spite of its shortcomings, and with comments on alternative approaches.
8.
Lavada A. Adams 《Technological Forecasting and Social Change》1980,18(2):151-160
For the period 1974–1977, no list of new and future arbitration issues, ranked or otherwise, existed. This research was conducted to contribute to an update of the literature on the major new and future issues in grievance arbitration. A Delphi survey was selected as the data collection instrument. This article focuses on the predicted future issues in grievance arbitration.
9.
In an increasingly data-rich environment, the use of factor models for forecasting purposes has gained prominence in the literature and among practitioners. Herein, we assess the forecasting behaviour of factor models used to predict several GDP components, and investigate the performance of a bottom-up approach to forecasting GDP growth in Portugal, one of the hardest hit economies during the latest economic and financial crisis. We find supporting evidence of the usefulness of factor models, and noteworthy forecasting gains when conducting a bottom-up approach drawing on the main aggregates of GDP.
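The bottom-up idea is to extract common factors from a large indicator panel, forecast each GDP component with a factor regression, and sum the component forecasts. A minimal sketch on simulated data, with the panel, the three components, and the factor count all invented for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Simulated data-rich panel: 60 quarters x 40 indicators driven by 2 factors
T, N = 60, 40
F = rng.normal(size=(T, 2))                       # latent factors
X = F @ rng.normal(size=(2, N)) + 0.3 * rng.normal(size=(T, N))

# Hypothetical GDP components (e.g. consumption, investment, net exports)
loadings = rng.normal(size=(2, 3))
components = F @ loadings + 0.1 * rng.normal(size=(T, 3))

# Extract factors from the panel by principal components, then forecast
# each component one step ahead and sum: the bottom-up GDP forecast.
factors = PCA(n_components=2).fit_transform(X)
gdp_forecast = 0.0
for i in range(3):
    reg = LinearRegression().fit(factors[:-1], components[1:, i])
    gdp_forecast += reg.predict(factors[-1:])[0]
```

The alternative the paper benchmarks against would be a single direct factor regression on aggregate GDP growth; the bottom-up route lets each component load on the factors differently.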
10.
This paper considers methods for forecasting macroeconomic time series in a framework where the number of predictors, N, is too large to apply traditional regression models but not sufficiently large to resort to statistical inference based on double asymptotics. Our interest is motivated by a body of empirical research suggesting that popular data-rich prediction methods perform best when N ranges from 20 to 40. In order to accomplish our goal, we resort to partial least squares and principal component regression to consistently estimate a stable dynamic regression model with many predictors as only the number of observations, T, diverges. We show both by simulations and empirical applications that the considered methods, especially partial least squares, compare well to models that are widely used in macroeconomic forecasting.
11.
One of the major attributes determining system reliability, and the one that has received the most thorough and systematic study for many years, is the system survival function. A "survival function" is a mathematical formula relating the probability of satisfactory performance of a system to time. Here, probability of satisfactory performance is synonymous with probability of nonfailure, or probability of survival, of a performing system. In breakthrough analysis of complex technological systems, the situation is similar but opposite to the system reliability case: for breakthrough forecasting, the problem is to determine the probability of occurrence of success of a nonperforming system. This paper therefore presents a quantitative methodology for forecasting technological breakthroughs using a new concept, the "attainability function," derived in a fashion similar to the reliability function.
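The mirror-image relationship can be made concrete with the constant-rate special case. For a failure rate lam, reliability is R(t) = exp(-lam*t); by analogy, with a constant success intensity mu, attainability is A(t) = 1 - exp(-mu*t), the probability that the breakthrough has occurred by time t. The rates below are illustrative values, not from the paper.

```python
import numpy as np

lam, mu = 0.1, 0.05          # illustrative failure / success rates
t = np.linspace(0, 50, 501)

# Reliability: probability a performing system has NOT failed by time t
R = np.exp(-lam * t)

# Attainability: probability a nonperforming system HAS succeeded by time t
A = 1 - np.exp(-mu * t)

# Median breakthrough time solves A(t*) = 0.5, i.e. t* = ln(2) / mu
t_star = np.log(2) / mu
```

R decreases from 1 toward 0 while A increases from 0 toward 1, which is exactly the "similar but opposite" structure the abstract describes.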
12.
Juneseuk Shin, Yongtae Park 《Technological Forecasting and Social Change》2009,76(8):1078-1091
Today's innovation process is best characterized by nonlinearity and interaction. Agent-based models build on these concepts, but have not been useful in practice because they are either too complex or too simple to match reality well. As a remedy, we employ a Brownian agent model of intermediate complexity to produce value-added technology forecasting. As an illustration, a computer simulation is carried out with data from Korea's software industry. Attracted by higher technology value, agents concentrate in specific technology regions and form co-existing major technology regions of high density. A rough comparison with actual software production data shows a fair reflection of reality and supports the underlying idea that the economic motivation of agents should be considered.
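A Brownian agent combines deterministic drift (here, up the gradient of technology value, standing in for economic motivation) with random exploration. A toy one-dimensional sketch of the concentration effect the abstract describes; the value landscape and all parameters are invented, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical technology-value landscape with two attractive regions
def value(x):
    return np.exp(-(x - 2.0) ** 2) + 1.5 * np.exp(-(x + 2.0) ** 2)

def grad(x, h=1e-4):
    # Central-difference gradient of the value landscape
    return (value(x + h) - value(x - h)) / (2 * h)

# Brownian agents: drift up the value gradient + Brownian noise
n_agents, steps, dt, noise = 200, 500, 0.05, 0.3
x = rng.uniform(-5, 5, n_agents)
for _ in range(steps):
    x += grad(x) * dt + noise * np.sqrt(dt) * rng.normal(size=n_agents)

# Agents end up concentrated near the high-value regions x = -2 and x = 2,
# forming co-existing dense technology regions.
near_peaks = np.mean((np.abs(x - 2) < 1.5) | (np.abs(x + 2) < 1.5))
```

The noise term is what keeps the model "intermediate": with no noise agents all collapse onto local optima, and with noise only they never cluster at all.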
13.
Tobias Gnatzy, Johannes Warth, Heiko von der Gracht, Inga-Lena Darkow 《Technological Forecasting and Social Change》2011,78(9):1681-1694
A novel real-time Delphi technique is introduced to address previously identified weaknesses of the conventional Delphi method, such as complicated facilitator tasks, lack of real-time presentation of results, and difficulties in tracking progress over time. We demonstrate how the real-time (computer-based) method increases the efficiency of the process, accommodates expert availability, and reduces drop-out rates. Modifications to the Delphi procedure (e.g. a change of iteration principle) not only increase efficiency but also change the nature and process of the survey technique itself. By identifying and analysing three individual effects (initial condition effect, feedback effect, and iteration effect) we examine whether the modifications to the survey process cause deviations in the survey results. Empirical data obtained from both conventional and real-time Delphi studies are analysed with multiple statistical methods. The research findings indicate that significant differences between the two Delphi survey formats do not exist and that final survey results are not affected by the changes in the survey procedure.
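The core statistical question, whether final estimates differ between the two survey formats, can be sketched with a standard nonparametric test. The data below are simulated from identical distributions to mirror the paper's "no significant difference" finding; the paper's actual tests and panel sizes are not specified here, so everything in this snippet is illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Hypothetical final-round estimates on the same question from a
# conventional panel and a real-time panel (same underlying distribution)
conventional = rng.normal(70, 8, 40)
real_time = rng.normal(70, 8, 40)

# Mann-Whitney U: do the two formats produce different estimate distributions?
u_stat, p_value = stats.mannwhitneyu(conventional, real_time)
no_difference = p_value > 0.05
```

A rank-based test is a reasonable choice here because expert estimates are rarely normally distributed and panels are small.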
14.
Delphi and other methods of using expert opinion to generate forecasts can be a useful tool for planning, impact assessment, and policy analysis. Unfortunately, little is known about the accuracy of forecasts produced using these methods, so their utility is limited at present. Based on the logic of the Delphi method, I suggest that: 1) forecast accuracy should increase across rounds of a Delphi iteration; 2) there is a positive correlation between a panelist's uncertainty about a forecast and his or her shift in forecast from round to round; 3) forecasts weighted by self-reported confidence will be more accurate than unweighted forecasts; and 4) the use of robust estimates of location as summaries of expert opinion yields better forecasts than nonrobust measures. A Delphi experiment provides little support for any of these hypotheses. This finding suggests that traditional assumptions about the proper methods for analyzing a Delphi study may be inappropriate.
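Hypotheses 3 and 4 are choices about how to aggregate a panel's numbers, and they are easy to make concrete. The toy panel below (one extreme, low-confidence outlier) is constructed so that both choices help, illustrating the mechanics being tested; note the paper's experiment found they mattered little in practice.

```python
import numpy as np

# Hypothetical 15-expert panel forecasting a quantity whose realized
# value turned out to be 100; the last expert is an extreme outlier.
true_value = 100.0
forecasts = np.array([92, 95, 97, 98, 99, 100, 101, 102, 103,
                      105, 106, 108, 110, 112, 300], dtype=float)
# Self-reported confidence; the outlier also reports low confidence
confidence = np.array([1.0] * 14 + [0.1])

# Hypothesis 3: confidence-weighted vs unweighted aggregation
weighted = np.average(forecasts, weights=confidence)
unweighted = forecasts.mean()

# Hypothesis 4: robust (median) vs nonrobust (mean) location summaries
err_mean = abs(unweighted - true_value)
err_median = abs(np.median(forecasts) - true_value)
```

Both comparisons hinge on how outliers interact with confidence and with the location estimator, which is exactly what the experiment's null result calls into question.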
15.
Carluccio Bianchi, Alessandro Carta, Dean Fantazzini, Maria Elena De Giuli, Mario A. Maggi 《Applied economics》2013,45(25):3267-3277
World economies, and especially European ones, have become strongly interconnected in the last decade, so a joint modelling is required. We propose the use of copulae to build flexible multivariate distributions, since they allow for a rich dependence structure and more flexible marginal distributions that better fit the features of empirical data, such as leptokurtosis. We use our approach to forecast industrial production series in the core European Monetary Union (EMU) countries, and we provide evidence that the copula-Vector Autoregression (VAR) model outperforms, or at worst performs comparably to, normal VAR models, while retaining the computational tractability of the latter approach.
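The key property the abstract relies on is that a copula separates dependence from the marginals, so one can pair a simple dependence structure with heavy-tailed (leptokurtic) marginals. A minimal Gaussian-copula sketch with Student-t marginals; the correlation matrix and degrees of freedom are illustrative, not estimates from EMU data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Illustrative dependence structure for three countries' IP growth
corr = np.array([[1.0, 0.6, 0.5],
                 [0.6, 1.0, 0.4],
                 [0.5, 0.4, 1.0]])
L = np.linalg.cholesky(corr)
n = 5000

z = rng.normal(size=(n, 3)) @ L.T      # correlated standard normals
u = stats.norm.cdf(z)                  # -> uniforms (the Gaussian copula)
x = stats.t.ppf(u, df=5)               # -> heavy-tailed t(5) marginals

# Dependence survives the marginal transform; marginals are leptokurtic
sample_corr = np.corrcoef(x, rowvar=False)
excess_kurt = stats.kurtosis(x[:, 0])  # > 0, unlike a normal marginal
```

A normal VAR forces jointly normal innovations; the copula construction keeps the tractable correlation structure while letting each margin be as fat-tailed as the data require.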
16.
Previous studies indicate that the poor forecasting performance of constant parameter UK consumption expenditure models is caused by structural instability in the underlying data generating process. Typically, this instability is removed by reparameterization within the constant parameter framework. An alternative modelling strategy is to allow some, or all, of the parameters to vary over time. A UK non-durable consumption expenditure model with time-varying parameters is developed, based on the permanent income hypothesis of Friedman (1957). This model takes into account temporal changes in the average and marginal propensities to consume. The variation in the parameter estimates is given an economic interpretation in terms of the influence of omitted variables, namely UK financial liberalization and expectational changes. The forecasting performance of this model is superior to that of two widely used constant parameter models. Further tests show that, even if these constant parameter models are respecified as time varying parameter models, the authors' model still retains a superior forecasting performance.
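Time-varying-parameter models of this kind are typically estimated with a Kalman filter, treating each coefficient as a random-walk state. A minimal scalar sketch: a drifting marginal propensity to consume tracked through noisy consumption data. The series and all variances are simulated for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated consumption model with a drifting marginal propensity to
# consume: c_t = beta_t * income_t + eps, beta_t a slow random walk.
T = 200
income = rng.normal(1.0, 0.2, T)
beta_true = 0.6 + 0.2 * np.linspace(0, 1, T)     # slow structural drift
c = beta_true * income + rng.normal(0, 0.05, T)

# Kalman filter for the time-varying coefficient (scalar state):
#   state:       beta_t = beta_{t-1} + eta,     eta ~ N(0, q)
#   observation: c_t    = income_t * beta_t + eps, eps ~ N(0, r)
q, r = 1e-4, 0.05 ** 2
beta, P = 0.0, 1.0                               # diffuse-ish prior
beta_path = np.empty(T)
for t in range(T):
    P += q                                        # predict
    K = P * income[t] / (income[t] ** 2 * P + r)  # Kalman gain
    beta += K * (c[t] - income[t] * beta)         # update
    P *= 1 - K * income[t]
    beta_path[t] = beta
```

A constant-parameter regression on the same data would average the early and late propensities and misforecast both ends of the sample, which is the instability problem the paper's model is built to avoid.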
17.
Boriss Siliverstovs 《Applied economics》2017,49(13):1326-1343
In this article, we extend the targeted-regressor approach suggested in Bai and Ng (2008) for variables sampled at the same frequency to mixed-frequency data. Our MIDASSO approach is a combination of the unrestricted MIxed-frequency DAta-Sampling approach (U-MIDAS) (see Foroni et al. 2015; Castle et al. 2009; Bec and Mogliani 2013) and the LASSO-type penalized regression used in Bai and Ng (2008), called the elastic net (Zou and Hastie 2005). We illustrate our approach by forecasting the quarterly real GDP growth rate in Switzerland.
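The two ingredients combine simply: U-MIDAS stacks each quarter's monthly observations as separate unrestricted regressors, and the elastic net then selects and shrinks among the resulting many coefficients. A sketch on simulated data; the panel size, penalty settings, and sparse "true" coefficients are all illustrative.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(7)

# 80 quarters of 10 monthly indicators (3 monthly obs per quarter)
n_q, n_vars = 80, 10
monthly = rng.normal(size=(n_q * 3, n_vars))

# U-MIDAS design: each quarterly row stacks all 3 months x 10 variables
X = monthly.reshape(n_q, 3 * n_vars)

# Hypothetical quarterly GDP growth driven by a few monthly regressors
beta = np.zeros(3 * n_vars)
beta[[0, 7, 20]] = [0.8, -0.5, 0.6]
y = X @ beta + 0.1 * rng.normal(size=n_q)

# Elastic net selects/shrinks among the 30 unrestricted MIDAS coefficients
model = ElasticNet(alpha=0.05, l1_ratio=0.5).fit(X[:-1], y[:-1])
nowcast = model.predict(X[-1:])[0]
n_selected = int(np.sum(model.coef_ != 0))
```

Leaving the lag weights unrestricted is what makes the penalty necessary: with 30 free coefficients and 80 quarters, unpenalized least squares would overfit badly.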
18.
Martin Steinert 《Technological Forecasting and Social Change》2009,76(3):291-300
This paper presents an adapted Delphi methodology that, contrary to the classical Delphi design, does not aim to minimize expert estimation variance but rather to maximize the range of expert opinions entered sequentially into an online system. After discussing the traditional Delphi approach and its dissensus-based derivatives, the author makes the case for a dissensus-based explorative Delphi research tool, with special consideration of the Delphi aim, the expert sample and the Delphi design. The proposed online Delphi process is then presented conceptually. Next, the proposed tool is demonstrated with a prototype exploring the barrier factors to the adoption of mobile data services. A discussion of the theoretical design and practical R&D experience of the dissensus-based online Delphi approach concludes the paper.
19.
20.
Due to the high complexity and strongly nonlinear nature of foreign exchange rates, forecasting them accurately is regarded as a challenging research topic, and developing highly accurate forecasting methods is of great significance to investors and policy makers. A new multiscale decomposition ensemble approach to forecasting foreign exchange rates is proposed in this paper. In the approach, the variational mode decomposition (VMD) method divides foreign exchange rates into a finite number of subcomponents; a support vector neural network (SVNN) models and forecasts each subcomponent; and another SVNN integrates the subcomponent forecasts into the final result. To verify the superiority of the proposed approach, four major exchange rates were chosen for model comparison and evaluation. The experimental results indicate that the proposed VMD-SVNN-SVNN multiscale decomposition ensemble approach outperforms the benchmark models in terms of forecasting accuracy and statistical tests, demonstrating that it is promising for forecasting foreign exchange rates.
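Neither VMD nor SVNNs ship with standard scientific Python, so the sketch below substitutes stand-ins to show the decompose-forecast-integrate pipeline shape: an exponential-moving-average two-scale split in place of VMD, scikit-learn's SVR in place of each SVNN, and a simple sum in place of the integrating SVNN (natural here because the decomposition is additive). The synthetic series and all parameters are illustrative.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(8)

# Synthetic "exchange rate": trend + cycle + noise (illustrative only)
T = 300
t = np.arange(T)
rate = (1.2 + 0.001 * t + 0.05 * np.sin(2 * np.pi * t / 25)
        + 0.01 * rng.normal(size=T))

# Stand-in two-scale decomposition: causal EMA = slow mode, rest = fast mode
alpha = 0.1
smooth = np.empty(T)
smooth[0] = rate[0]
for i in range(1, T):
    smooth[i] = alpha * rate[i] + (1 - alpha) * smooth[i - 1]
residual = rate - smooth          # additive: smooth + residual == rate

def lagged(x, p=5):
    """Rows of p consecutive values paired with the value that follows."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

# Stage 1: one regressor per subcomponent, each forecasting one step ahead
component_preds = []
for comp in (smooth, residual):
    X, y = lagged(comp)
    svr = SVR(C=10.0, epsilon=1e-3).fit(X, y)
    component_preds.append(svr.predict(comp[-5:].reshape(1, -1))[0])

# Stage 2: integrate the subcomponent forecasts (sum as the simplest stand-in)
forecast = sum(component_preds)
```

The payoff of the decompose-first design is that each learner faces a simpler, more homogeneous signal than the raw rate, which is the mechanism behind the reported accuracy gains.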