This article addresses the particle swarm optimization (PSO) method, an algorithm proposed by Kennedy and Eberhart [1995. Particle swarm optimization. In: Proceedings of the IEEE International Conference on Neural Networks (Perth, Australia), vol. IV, IEEE Service Center, Piscataway, NJ, pp. 1942–1948]. This optimization method is motivated by the social behaviour of organisms such as bird flocking and fish schooling. The PSO algorithm is not only a tool for optimization, but also a tool for representing the socio-cognition of human and artificial agents, based on principles of social behaviour. Some scientists suggest that knowledge is optimized by social interaction and that thinking is not only private but also interpersonal. As an optimization tool, PSO provides a population-based search procedure in which individuals, called particles, change their position (state) over time. In a PSO system, particles fly through a multidimensional search space. During flight, each particle adjusts its position according to its own experience and the experience of its neighbours, making use of the best positions encountered by itself and by its neighbours. In this paper, we first propose an extension of the PSO system that integrates a new displacement of the particles (balancing the intensification process and the diversification process), and we highlight a relation between the velocity-update coefficients of each dimension in the classical PSO algorithm and those in the extension. Second, we propose an adaptation of this extension of the PSO algorithm to solve combinatorial optimization problems with precedence constraints in general, and the resource-constrained project scheduling problem in particular. Numerical experiments are carried out on the main continuous benchmark functions and on resource-constrained project scheduling problem (RCPSP) instances provided by PSPLIB.
The results obtained are encouraging and lead us to conclude that both the PSO algorithm and the proposed extensions based on the new particle displacement are a promising direction for research.
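The particle displacement described above (each particle pulled toward its own best position and its neighbourhood's best) can be sketched as follows. This is a minimal global-best PSO on the sphere function, not the paper's extension; all parameter values (inertia weight, acceleration coefficients, swarm size) are illustrative assumptions.

```python
import numpy as np

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO sketch minimizing f (assumed parameter values)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # particle positions
    v = np.zeros((n_particles, dim))             # particle velocities
    pbest = x.copy()                             # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()     # swarm best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # velocity update: inertia + cognitive pull (own best) + social pull (swarm best)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, f(gbest)

best_x, best_f = pso(lambda z: np.sum(z ** 2))
```

The three velocity terms correspond to the behaviours the abstract names: momentum from the previous flight, attraction to the particle's own experience, and attraction to the neighbourhood's experience.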
We propose a systematic algorithmic reverse-stress testing methodology to create "worst case" scenarios for regulatory stress tests by accounting for losses that arise from distressed portfolio liquidations. First, we derive the optimal bank response for any given shock. Then, we introduce an algorithm which systematically generates scenarios that exploit the key vulnerabilities in banks' portfolio holdings and thus maximize contagion despite banks' optimal response to the shock. We apply our methodology to data from the 2016 European Banking Authority (EBA) stress test, and design worst case scenarios for the portfolio holdings of European banks at the time. Using spectral clustering techniques, we group 10,000 worst-case scenarios into twelve geographically concentrated families. Our results show that even though there is a wide range of different scenarios within these twelve families, each cluster tends to affect the same banks. An "Anna Karenina" principle of stress testing emerges: not all stressful scenarios are alike, but every stressful scenario stresses the same banks. These findings suggest that the precise specification of a scenario is not of primary importance as long as the most vulnerable banks are targeted and sufficiently stressed. Finally, our methodology can be used to uncover the weakest links in the financial system and thereby focus supervisory attention on them, thus building a bridge between macroprudential and microprudential stress tests.
Money laundering has affected the global economy for many years, and several methods for combating it have been presented in the literature. However, few methods address money laundering and financial fraud together. This study therefore aims to identify methods for anti-money laundering (AML) and financial fraud detection (FFD). A systematic literature review was performed to analyze the methods used, drawing on the SCOPUS and Web of Science databases. Of the 48 articles that aligned with the research theme, 20 used quantitative methods for AML and FFD, 13 were literature reviews, 7 used qualitative methods, and 8 used mixed methods. This study contributes a systematic literature review that fills two research gaps: the lack of studies addressing AML and FFD jointly, and the methods used to solve them. This will assist researchers in identifying gaps and related research.
Creativity is often highly concentrated in time and space, and across different domains. What explains the formation and decay of clusters of creativity? We match data on notable individuals born in Europe between the eleventh and the nineteenth centuries with historical city data. The production and attraction of creative talent is associated with city institutions that protected economic and political freedoms and promoted local autonomy. By contrast, indicators of local economic conditions, such as city size and real wages, do not predict creative clusters. We also show that famous creatives are spatially concentrated and clustered across disciplines, that their spatial mobility has remained stable over the centuries, and that creative clusters are persistent, though less so than population.
We show that pro-cyclicality is inherent in risk measure estimates based on historical data. Taking the example of VaR, we show that the empirical VaR measure is mean-reverting over a 1-year horizon when the portfolio is held fixed. This means that a capital requirement rule based on historical measurements of VaR tends to understate future required capital in calm times and to overstate it in volatile times. To quantify this pro-cyclicality, we develop a simple and efficient methodology, which we apply to major equity market indices. We make the interesting point that the pro-cyclicality property holds true even in a world with constant volatility, though the empirical magnitude of the mean-reversion is greater than what would be observed in that special case.
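The historical VaR estimate discussed above can be illustrated with a short sketch: a rolling empirical quantile of past returns. This is an assumption-laden illustration, not the paper's methodology; the confidence level (99%), window length (250 trading days, roughly one year), and simulated i.i.d. returns are all choices made here for demonstration. Even under constant volatility, the rolling estimate fluctuates around the true VaR, which is the special case the abstract mentions.

```python
import numpy as np

def historical_var(returns, alpha=0.99):
    """Empirical VaR: the negated (1 - alpha)-quantile of historical returns."""
    return -np.quantile(returns, 1 - alpha)

rng = np.random.default_rng(1)
# i.i.d. Gaussian returns: a constant-volatility world (sigma = 1% per day)
returns = rng.normal(0.0, 0.01, 1000)
window = 250  # roughly one trading year of history

# rolling 1-year historical VaR, recomputed each day from the trailing window
rolling_var = np.array([
    historical_var(returns[t - window:t]) for t in range(window, len(returns))
])
```

For these simulated returns the true 99% VaR is constant (about 0.0233), yet `rolling_var` wanders around it, which is the sampling fluctuation that a capital rule based on the latest estimate would mechanically follow.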