Similar articles
20 similar articles found (search time: 0 ms)
1.
A significant part of the literature on input-output (IO) analysis is dedicated to developing and applying methodologies for forecasting and updating technology coefficients and multipliers. Prominent among these techniques is the RAS method, although more data-demanding econometric methods, as well as other less promising ones, have also been proposed. However, little interest has been expressed in applying more modern and often more innovative methods, such as neural networks, to IO analysis. This study constructs, proposes and applies a Backpropagation Neural Network (BPN) to forecast IO technology coefficients and, subsequently, multipliers. The RAS method is also applied to the same set of UK IO tables, and the results of both methods are discussed and compared. The results show that the BPN offers a valid alternative for IO technology forecasting, and many forecasts were more accurate under this method. Overall, however, the RAS method outperformed the BPN, but the difference is too small to be systematic, and there are further ways to improve the BPN's performance.
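The RAS method mentioned above can be sketched in a few lines: it is biproportional updating, alternately rescaling the rows and columns of a base-period matrix until both sets of target margins are met. The matrix and targets below are toy values, not the paper's UK tables.

```python
# RAS biproportional updating: iteratively rescale rows and columns of a
# base-period flow/coefficient matrix to match new target margins.
def ras(matrix, row_targets, col_targets, tol=1e-9, max_iter=1000):
    A = [row[:] for row in matrix]
    n = len(A)
    for _ in range(max_iter):
        # R step: scale each row to its target row sum.
        for i, target in enumerate(row_targets):
            s = sum(A[i])
            if s:
                A[i] = [a * target / s for a in A[i]]
        # S step: scale each column to its target column sum.
        for j, target in enumerate(col_targets):
            s = sum(A[i][j] for i in range(n))
            if s:
                for i in range(n):
                    A[i][j] *= target / s
        row_err = max(abs(sum(A[i]) - t) for i, t in enumerate(row_targets))
        col_err = max(abs(sum(A[i][j] for i in range(n)) - t)
                      for j, t in enumerate(col_targets))
        if max(row_err, col_err) < tol:
            break
    return A

base = [[10.0, 5.0], [4.0, 6.0]]          # hypothetical base-period matrix
updated = ras(base, row_targets=[18.0, 12.0], col_targets=[16.0, 14.0])
```

Note that the row and column targets must share the same grand total (here 30), otherwise the alternating scaling cannot converge.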

2.
3.
The evolution of Nobel prize awards is studied as a learning/growing process. A simple logistic function describes the data well and accounts for the competition, whether between individuals or between nations. The American niche appears to have been 64% exhausted by 1987, implying a diminishing expected rate of laureates in the future. Europeans have been losing ground continuously from the beginning, while the rest of the world has recently received more awards. Projections to the year 2000 and beyond are given. Correlations with age support the Darwinian nature of the competition for Nobel prizes. Awards to women show peaks coinciding with outbursts of feminism.
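The "niche exhaustion" reading above can be reproduced with a short sketch of the logistic curve. The capacity K, growth rate a, and midpoint year t0 below are hypothetical values chosen only to illustrate how a roughly 64%-exhausted niche is read off the curve; they are not estimates from the Nobel data.

```python
import math

# Logistic (Verhulst) growth curve for cumulative awards to one "niche".
# K: niche capacity, a: growth rate, t0: midpoint year. Illustrative values.
def logistic(t, K, a, t0):
    return K / (1 + math.exp(-a * (t - t0)))

K, a, t0 = 250.0, 0.08, 1980.0              # hypothetical parameters
filled_1987 = logistic(1987, K, a, t0) / K  # fraction of the niche exhausted
```

Once the cumulative count passes the midpoint, the expected rate of new laureates per year starts to decline, which is the abstract's implication for the American niche.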

4.
We compare the out-of-sample performance of monthly returns forecasts for two indices, namely the Dow Jones (DJ) and the Financial Times (FT) indices. A linear and a nonlinear artificial neural network (ANN) model are used to generate the out-of-sample competing forecasts for monthly returns. Stationary transformations of dividends and trading volume are considered as fundamental explanatory variables in the linear model and the input variables in the ANN model. The comparison of out-of-sample forecasts is done on the basis of forecast accuracy, using the Diebold and Mariano test [J. Bus. Econ. Stat. 13 (1995) 253], and forecast encompassing, using the Clements and Hendry approach [J. Forecast. 5 (1998) 559]. The results suggest that the out-of-sample ANN forecasts are significantly more accurate than linear forecasts of both indices. Furthermore, the ANN forecasts can explain the forecast errors of the linear model for both indices, while the linear model cannot explain the forecast errors of the ANN in either of the two indices. Overall, the results indicate that the inclusion of nonlinear terms in the relation between stock returns and fundamentals is important in out-of-sample forecasting. This conclusion is consistent with the view that the relation between stock returns and fundamentals is nonlinear.

5.
6.
Professor Takatoshi Ito, in a recent American Economic Review paper, documents that exporters and importers have biased exchange rate forecasts and that the biases run in opposite directions. At first glance, it is difficult to provide a rational foundation for such behavior. Professor Ito hypothesizes that these forecasts are the result of wishful expectations. However, if forecasts are stochastic, then minimization of a quadratic loss function results in a rational hedge applied to either $/¥ or ¥/$. The different perspectives of importers and exporters determine which form is hedged and the direction of the bias.
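The opposite-direction biases have a compact numerical illustration: under quadratic loss the optimal point forecast is a mean, and by Jensen's inequality the mean of a rate and the mean of its reciprocal are not reciprocals of each other. The uniform scenario distribution below is purely illustrative.

```python
import random
import statistics

# Under quadratic loss the optimal forecast of a random rate S is E[S].
# An agent who cares about $/yen forecasts E[S]; an agent who cares about
# yen/$ forecasts E[1/S]. By Jensen's inequality E[1/S] > 1/E[S], so the two
# rational forecasts are "biased" relative to each other, in opposite directions.
random.seed(0)
rates = [random.uniform(0.8, 1.2) for _ in range(100_000)]   # hypothetical $/yen scenarios

mean_direct = statistics.fmean(rates)                    # optimal $/yen forecast
mean_reciprocal = statistics.fmean(1 / s for s in rates) # optimal yen/$ forecast
```

Neither agent is irrational here; each minimizes expected quadratic loss in the quote that matters to them, which is exactly the rational-hedge point of the abstract.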

7.
8.
Conventional procedures for calculating confidence limits of forecasts generated by statistical models provide little guidance for forecasts based on a combination or consensus process rather than formal models, as is the case with US Department of Agriculture (USDA) forecasts. This study applied and compared several procedures for calculating empirical confidence intervals for USDA forecasts of corn, soybean and wheat prices over the 1980/81 through 2006/07 marketing years. Alternative procedures were compared based on out-of-sample performance over 1995/96 through 2006/07. The results demonstrate that the kernel density, quantile distribution and best-fitting parametric distribution (logistic) methods provided confidence intervals calibrated at the 80% level prior to harvest and the 90% level after harvest. The kernel density-based method appears most accurate both before and after harvest, with the final value falling inside the forecast interval 77% of the time before harvest and 92% after harvest, followed by quantile regression (73% and 91% before and after harvest, respectively), logistic distribution (73% and 90% before and after harvest, respectively) and histogram (66% and 84% before and after harvest, respectively). Overall, this study demonstrates that empirical approaches may be used to construct more accurate confidence intervals for USDA corn, soybean and wheat price forecasts.
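The empirical-interval idea can be sketched directly: take quantiles of the historical forecast errors and attach them to the current point forecast. The error history and forecast below are made-up numbers, and this simple quantile version is only one of the several procedures the study compares.

```python
import statistics

# Empirical confidence interval for a consensus forecast: quantiles of
# past forecast errors (actual minus forecast), centred on the new forecast.
def empirical_interval(errors, forecast, n=10):
    # statistics.quantiles with n=10 returns the 10th..90th percentile cuts;
    # the outermost pair gives a nominal 80% interval.
    qs = statistics.quantiles(errors, n=n, method="inclusive")
    return forecast + qs[0], forecast + qs[-1]

# Hypothetical history of price forecast errors ($/bushel) and a new forecast.
past_errors = [-0.31, -0.22, -0.12, -0.05, 0.02, 0.04, 0.11, 0.18, 0.27, 0.40]
interval = empirical_interval(past_errors, forecast=4.50)
```

Calibration is then checked out of sample, as in the study: the realized price should fall inside such intervals about 80% of the time if the method is well calibrated.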

9.
10.
Science itself can be considered as a “fuzzy system.” In attempting to deal with possible laws of scientific development we formulate a simple, partial model and illustrate its use as a means to control the strategy of investments in science.

11.
In a Bayesian assessment, beliefs are computed from the strategy profile applying Bayes rule at positive probability information sets. A consistent assessment is the limit point of a sequence of completely mixed Bayesian assessments. We characterize the set of extensive forms for which the sets of Bayesian and consistent assessments coincide. As an illustration of the results, we characterize consistency in some multi-period games with simultaneous actions.
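The Bayesian part of an assessment is mechanical at positive-probability information sets: beliefs are the strategy-induced reach probabilities of the nodes, renormalized. A toy two-node information set, with hypothetical reach probabilities, makes the computation concrete.

```python
# Bayes rule at a positive-probability information set: the belief attached
# to each node is its reach probability under the strategy profile, divided
# by the total probability of reaching the information set.
# Reach probabilities below are hypothetical (e.g. nature's move times a
# player's mixed action).
reach = {"node_L": 0.3 * 0.5, "node_R": 0.7 * 0.5}

total = sum(reach.values())                      # probability the set is reached
beliefs = {node: p / total for node, p in reach.items()}
```

At zero-probability information sets this formula is undefined, which is exactly where consistency (limits of completely mixed assessments) adds restrictions beyond Bayes rule.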

12.
Real-time macroeconomic policy decisions are based on assessments of current and future economic conditions. Crucially, these assessments are complicated by incomplete and noisy data. The problem is more acute for emerging market economies, where most economic data are released infrequently and with a (sometimes substantial) lag. This paper evaluates nowcasts and forecasts of real GDP growth using five models for ten Latin American countries. The results indicate that the flow of monthly data helps to improve forecast accuracy, and that the dynamic factor model consistently produces more accurate nowcasts and forecasts than the other model specifications across most of the countries we consider.

13.
Systems like LINK should not generally be used for long-range forecasts and simulations because they eschew many significant long-term issues. World-wide fluctuations in the division of labour should be endogenous, as should the various momentous, but partly induced, international ‘shocks’ such as the energy crisis and the breakdown of the world monetary order. The same applies to several aspects of the national macroeconometric models making up systems like LINK. Consequently, if such systems are used mechanically to churn out long-range forecasts and simulations, the results will generally be misleading and the inflationary impact of the policy simulations is likely to be underestimated.

14.
We analyse forecasts made by professional forecasters for Germany over the period 1970 to 2004. This novel panel data set makes it possible to assess the accuracy and efficiency of growth and inflation forecasts more thoroughly than in previous studies. We argue that the forecasts are, on average, unbiased and weakly, but not strongly, efficient. Using the model confidence sets suggested by Hansen et al. (2004), we find that, apart from the effect of diverging forecasting dates, no other substantial differences in forecasting quality among forecasters exist. Nevertheless, on the basis of a direction-of-change analysis, we argue that it is not always advisable to listen to the majority of forecasters.

15.
Motivated by the recent literature on cryptocurrency volatility dynamics, this paper adopts the ARJI, GARCH, EGARCH, and CGARCH models to explore their ability to make out-of-sample volatility forecasts for Bitcoin returns over a daily horizon from 2013 to 2018. The empirical results indicate that the ARJI jump model can cope with the extreme price movements of Bitcoin, showing comparatively superior in-sample goodness-of-fit as well as out-of-sample predictive performance. However, due to the excessive volatility swings in the cryptocurrency market, the realized volatility of Bitcoin prices is only marginally explained by the employed GARCH-type models.
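The GARCH(1,1) recursion underlying several of the employed models takes one line per period: today's conditional variance is a constant plus weighted contributions of yesterday's squared return and yesterday's variance. The parameter values below are illustrative, not estimates for Bitcoin.

```python
# GARCH(1,1) conditional variance recursion:
#   sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
# Parameters are illustrative; alpha + beta < 1 ensures stationarity.
def garch_variance_path(returns, omega=0.1, alpha=0.1, beta=0.8):
    sigma2 = omega / (1 - alpha - beta)   # start at the unconditional variance
    path = [sigma2]
    for r in returns:
        sigma2 = omega + alpha * r**2 + beta * sigma2
        path.append(sigma2)
    return path

# A large return shock pushes the conditional variance up, after which it
# decays back geometrically at rate alpha + beta.
path = garch_variance_path([0.0, 5.0, 0.0, 0.0, 0.0])
```

The ARJI model favoured above adds a time-varying jump component on top of such a recursion, which is what lets it absorb Bitcoin's extreme price movements.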

16.
This paper assesses the impact of the increasing use of nuclear energy on the international safeguards system, and it identifies and assesses options for coping with the anticipated impact. A review of nuclear energy forecasts indicates a need for substantial increases in the financial and personnel resources of the safeguards system over the next decade. The requisite financial increases are probably within the limits of political feasibility, but the personnel needs may become problematic. There is also likely to be a continuing decline in confidence in the effectiveness of the system because of perceptions of inadequate resources and methods. There are several options that could reduce the projected technical and political pressures on the system: a postponement of plutonium recycle; improved materials measurement accuracies; immediate increases in the IAEA inspections staff. There are also options that would supplement the safeguards system and alleviate the pressures on it: multinational fuel cycle centers; a suppliers' cartel-like arrangement; and an International Nuclear Materials Custodial Authority.

17.
The Likelihood of Events Assessment Process (LEAP), a new method of forecasting, was described by Preble in 1982 [5]. This article carries forward that work by demonstrating with top level life insurance executives that LEAP forecasts are consistent. Using matched intracompany and intercompany LEAP panels, under identical test conditions, it was found that the two groups produced very similar socio-political forecast estimates and degrees of consensus. Both groups reported themselves to be satisfied with using the technique and about 85% of the participants said they would be willing to use LEAP again. Since strong similarities were demonstrated, the intracompany panel type is recommended to strategic planners based on its unique advantages.

18.
Nash equilibrium is often interpreted as a steady state in which each player holds the correct expectations about the other players' behavior and acts rationally. This paper investigates the robustness of this interpretation when there are small costs associated with complicated forecasts. The model consists of a two-person strategic game in which each player chooses a finite machine to implement a strategy in an infinitely repeated 2×2 game with discounting. I analyze the model using a solution concept called Nash Equilibrium with Stable Forecasts (ESF). My main results concern the structure of equilibrium machine pairs. They provide necessary and sufficient conditions on the form of equilibrium strategies and plays. In contrast to the “folk theorem,” these structural properties place severe restrictions on the set of equilibrium paths and payoffs. For example, only sequences of the one-shot Nash equilibrium can be generated by any ESF of the repeated game of chicken.
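A "finite machine" in this setting is a Moore automaton: each state prescribes an action, and the opponent's observed action drives the transition. The grim-trigger machine below is only an illustration of the formalism, not one of the ESF equilibria the paper characterizes.

```python
# A finite machine for a repeated 2x2 game, encoded as
#   state -> (action_to_play, {opponent_action: next_state}).
# "Grim trigger": cooperate until the opponent defects once, then defect forever.
GRIM = {
    "start":  ("C", {"C": "start",  "D": "punish"}),
    "punish": ("D", {"C": "punish", "D": "punish"}),
}

def play(machine, opponent_actions, state="start"):
    """Run the machine against a fixed sequence of opponent actions."""
    actions = []
    for opp in opponent_actions:
        action, transitions = machine[state]
        actions.append(action)
        state = transitions[opp]       # transition on the observed action
    return actions

history = play(GRIM, ["C", "C", "D", "C"])   # opponent defects in period 3
```

Counting states in such machines is how complexity costs enter the model: a player weakly prefers a machine with fewer states, which is what disciplines the set of equilibrium paths.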

19.
Two investment decisions in economic institutions are feasible: investments in monetary institutions, in the form of delegating monetary policy to a more conservative or independent central bank, and investments in fiscal capacity, in the form of combating bureaucratic corruption and its consequent fiscal revenue leakages. Within this framework, we investigate the interactions between these two institutional decisions and the resulting institutional structure. The findings provide support for strategic complementarities: investments in monetary and fiscal institutions reinforce each other. In addition, we identify a set of determinants that shape the government's decisions to improve economic institutions, in particular the structure and intensity of the initial corruption level, the amount of distortion caused by taxation, and the policymaker's goals and preferences across its objectives.

20.
In this paper we propose an innovation diffusion framework, based on the well-known Bass models, to analyze and forecast national adoption patterns of photovoltaic (PV) installed capacity. This allows for interesting comparisons among several countries and, in many cases, highlights the positive effect of incentive policies in stimulating the diffusion of this technology. In this sense, the Generalized Bass Model proves essential for modelling and forecasting. On this basis, we observe important differences in the investments made by countries in the PV sector and are able to identify whether and when these investments obtained the expected results. In particular, our analysis shows that in some cases incentive measures have certainly been effective in facilitating adoption, while in others they have not produced real feedback. Moreover, our cross-country approach can identify different stages in PV evolution: whereas some countries have already entered the mature stage of diffusion, others have just begun. This result suggests various considerations about the competitive advantage of those countries that invested in alternative energy provision. In spite of a very diversified scenario in terms of historical diffusion patterns, we report, as a general result, the fragile role of innovators in this special market and the dominance of imitative behaviour in adoptions.
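The closed-form cumulative adoption curve of the basic Bass model makes the innovator/imitator asymmetry concrete. The coefficients below are illustrative, not the country-level Generalized Bass estimates from the paper: a tiny innovation coefficient p next to a large imitation coefficient q produces exactly the slow start and steep S-shaped take-off described above.

```python
import math

# Cumulative adoption F(t) of the basic Bass diffusion model:
#   F(t) = m * (1 - exp(-(p+q)t)) / (1 + (q/p) * exp(-(p+q)t))
# p: innovation coefficient, q: imitation coefficient, m: market potential.
def bass_cumulative(t, p, q, m=1.0):
    e = math.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

# Fragile innovators (small p), dominant imitators (large q): illustrative values.
curve = [bass_cumulative(t, p=0.001, q=0.4, m=100.0) for t in range(0, 31, 5)]
```

The Generalized Bass Model used in the paper additionally multiplies the hazard by an intervention function, which is how incentive policies enter the adoption dynamics.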


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)