Similar Documents
20 similar documents retrieved.
1.
This paper illustrates the development of an object-oriented Bayesian network (OOBN) to integrate the safety risks contributing to an in-flight loss-of-control aviation accident. With the creation of a probabilistic model, inferences about changes to the states of the accident-shaping or causal factors can be drawn quantitatively. These predictive safety inferences derive from qualitative reasoning to conclusions based on data, assumptions, and/or premises, and enable an analyst to identify the most prominent causal factors, leading to a risk-factor prioritization. Such an approach facilitates a mitigation portfolio study and assessment. The model also facilitates the computation of sensitivity values based on perturbations to the estimates in the conditional probability tables. Such computations identify the causal factors to which the accident probability is most sensitive. This approach may lead to the discovery of vulnerabilities to emerging causal factors for which mitigations do not yet exist, which in turn informs possible future R&D efforts. To illustrate the benefits of an OOBN in a large and complex aviation accident model, the in-flight loss-of-control accident framework model is presented.
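To make the inference step concrete, here is a minimal sketch of discrete Bayesian-network inference of the kind the abstract describes, assuming pgmpy's long-standing BayesianNetwork/TabularCPD API. The node names, states, and probabilities are illustrative placeholders, not values from the paper, whose OOBN is far larger.

```python
# Minimal discrete Bayesian network: two hypothetical causal factors feeding
# a loss-of-control node. All probabilities are invented for illustration.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("PilotFatigue", "LossOfControl"),
                         ("AdverseWeather", "LossOfControl")])

cpd_fatigue = TabularCPD("PilotFatigue", 2, [[0.9], [0.1]])
cpd_weather = TabularCPD("AdverseWeather", 2, [[0.8], [0.2]])
cpd_loc = TabularCPD(
    "LossOfControl", 2,
    [[0.999, 0.95, 0.97, 0.80],   # P(no LOC | fatigue, weather combos)
     [0.001, 0.05, 0.03, 0.20]],  # P(LOC    | fatigue, weather combos)
    evidence=["PilotFatigue", "AdverseWeather"], evidence_card=[2, 2])

model.add_cpds(cpd_fatigue, cpd_weather, cpd_loc)
assert model.check_model()

infer = VariableElimination(model)
# Predictive query: accident probability given a perturbed causal factor.
print(infer.query(["LossOfControl"], evidence={"PilotFatigue": 1}))
# Diagnostic query: which factor states are most likely given an accident.
print(infer.query(["PilotFatigue"], evidence={"LossOfControl": 1}))
```

The sensitivity computation the abstract mentions would repeat such queries while perturbing individual conditional-probability-table entries and recording the change in the accident probability.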

2.
This study examines the relation between measurement system satisfaction, economic performance, and two general approaches to strategic performance measurement: greater measurement diversity and improved alignment with firm strategy and value drivers. We find consistent evidence that firms making more extensive use of a broad set of financial and (particularly) non-financial measures than firms with similar strategies or value drivers have higher measurement system satisfaction and stock market returns. However, we find little support for the alignment hypothesis that more or less extensive measurement than predicted by the firm's strategy or value drivers adversely affects performance. Instead, our results indicate that greater measurement emphasis and diversity than predicted by our benchmark model are associated with higher satisfaction and stock market performance. Our results also suggest that greater measurement diversity relative to firms with similar value drivers has a stronger relationship with stock market performance than greater measurement on an absolute scale. Finally, the balanced scorecard process, economic value measurement, and causal business modeling are associated with higher measurement system satisfaction but exhibit almost no association with economic performance.

3.
This paper proposes a technology foresight methodology based on a complementary approach to the Delphi method that enables the identification of strategic technological competences, and presents its application in a sheet metal processing equipment manufacturer. The proposed methodology takes into consideration synergies between future events through a modified QFD matrix, and the application involved a panel of experts from industry and academia. The methodology can benefit organizations by promoting a homogeneous perspective on the relationships between external drivers and technology diffusion. This study contributes to the understanding of the links between foresight and technology strategy formulation. Further implementations in industrial environments should be performed to refine the methodology and increase confidence in the results it can deliver.
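As a rough illustration of the prioritization such a matrix supports, the sketch below propagates Delphi-style driver weights through a QFD relationship matrix. All driver names, competences, weights, and relationship strengths are hypothetical, and the paper's synergy adjustments between future events are not reproduced here.

```python
# QFD-style prioritization: competence scores as driver importance
# propagated through a driver-by-competence relationship matrix.
import numpy as np

drivers = ["automation demand", "material costs", "regulation"]
competences = ["laser cutting", "press-brake control", "in-line sensing"]
weights = np.array([0.5, 0.3, 0.2])      # hypothetical Delphi-derived weights
# rel[i, j]: strength of the link between driver i and competence j,
# on the customary QFD 0/1/3/9 scale (values invented).
rel = np.array([[9, 3, 9],
                [1, 9, 3],
                [3, 1, 9]])

scores = weights @ rel                    # weighted column sums
for name, score in sorted(zip(competences, scores), key=lambda t: -t[1]):
    print(f"{name:20s} {score:.1f}")
```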

4.
The use of artificial intelligence (AI) in business processes, and academic research on AI, have increased significantly. However, the adoption of AI in organizational strategy is yet to be explored in the extant literature. This study proposes two conceptual frameworks showing hierarchical relationships among the various drivers and barriers to AI adoption in organizational strategy. In a two-step approach, a literature study first identifies eight drivers of and nine barriers to AI adoption, which are validated by academic and industry experts. In the second step, MICMAC (matrice d'impacts croisés multiplication appliquée à un classement, or cross-impact matrix multiplication applied to classification) analysis categorizes the drivers and barriers to AI adoption in organizational strategy. Total interpretive structural modeling (TISM) is developed to understand the complex and hierarchical associations among the drivers and barriers. This is the first attempt to model the drivers and barriers using a methodology such as TISM, which provides a comprehensive conceptual framework with hierarchical relationships and the relative importance of the drivers and barriers to AI adoption. AI solutions' decision-making ability and accuracy are the most influential drivers, affecting the other driving factors. Lack of an AI adoption strategy, lack of AI talent, and lack of leadership commitment are the most significant barriers, affecting the other barriers. Recommendations for senior leadership to focus on the leading drivers and barriers are discussed, and the limitations and future research scope are addressed.
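A minimal sketch of the MICMAC computation, assuming the usual reachability-matrix formulation: take the transitive closure of a binary direct-influence matrix, then read off driving power (row sums) and dependence power (column sums). The factor names and influence entries below are hypothetical.

```python
# MICMAC-style analysis: transitive closure of a direct-influence matrix,
# then driving/dependence power per factor. All inputs are illustrative.
import numpy as np

factors = ["DecisionAccuracy", "DataAvailability",
           "LeadershipCommitment", "AdoptionStrategy"]
direct = np.array([[1, 1, 0, 1],      # direct[i, j] = 1 if factor i
                   [0, 1, 0, 1],      # directly influences factor j
                   [0, 1, 1, 1],
                   [0, 0, 0, 1]], dtype=bool)

reach = direct.copy()                 # Warshall's transitive closure
n = len(factors)
for k in range(n):
    for i in range(n):
        for j in range(n):
            reach[i, j] |= reach[i, k] and reach[k, j]

driving = reach.sum(axis=1)           # factors each factor can reach
dependence = reach.sum(axis=0)        # factors that reach each factor
for name, d, p in zip(factors, driving, dependence):
    print(f"{name:22s} driving={d}  dependence={p}")
```

MICMAC proper then plots each factor on a driving-versus-dependence grid and classifies it as autonomous, dependent, linkage, or independent (driver); TISM adds the interpretive layer explaining each retained link.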

5.
A Critique of the Stochastic Discount Factor Methodology
In this paper, we point out that the widely used stochastic discount factor (SDF) methodology ignores a fully specified model for asset returns. As a result, it suffers from two potential problems when asset returns follow a linear factor model. The first problem is that the risk premium estimate from the SDF methodology is unreliable. The second problem is that the specification test under the SDF methodology has very low power in detecting misspecified models. Traditional methodologies typically incorporate a fully specified model for asset returns, and they can perform substantially better than the SDF methodology.
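For orientation, the two approaches can be written side by side for a linear factor model. This is textbook notation, not an excerpt from the paper:

```latex
% Traditional ("beta") method: fully specified return-generating model.
% SDF method: a linear discount factor identified only via pricing moments.
\begin{align*}
R_{it} &= a_i + \beta_i^{\top} f_t + \varepsilon_{it},
  \qquad \mathbb{E}[R_{it}] = \beta_i^{\top}\lambda
  \quad \text{(beta method)};\\
m_t &= 1 - b^{\top}\bigl(f_t - \mathbb{E}[f_t]\bigr),
  \qquad \mathbb{E}[m_t R_{it}] = 0
  \quad \text{(SDF method, excess returns)}.
\end{align*}
% For excess returns the parameterizations are linked by
% \lambda = \Sigma_f\, b with \Sigma_f = \operatorname{Var}(f_t), so b and
% \lambda carry the same information but different scaling, which is one
% standard way to see why estimator precision and test power can differ.
```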

6.
Paul Dragos Aligica, Futures, 2003, 35(10): 1027-1040
This article is a contribution to the development of the epistemological foundations of Futures Studies. The article starts by presenting the conventional “covering-law” model asserting the symmetry between prediction and explanation, a model that continues to undermine the authority of Futures Studies as a discipline despite the fact that Logical Positivism, the epistemological paradigm that inspired it, is no longer dominant. The article then outlines the fatal weaknesses of that model, showing how out of its criticism emerges the prospect of a coherent and robust epistemology of prediction. Two major points are made. First, predictive argumentation is not demonstrative but merely evidential; formal-logic argumentative structures of the “covering-law” type are therefore inadequate for giving a complete and accurate account of predictive argumentation and practice. If the nature of predictive arguments is evidential, then the epistemology of prediction should be based not on mere formal logic but on a larger theory of argumentation. Second, the criticism illuminates the complex problem of the types of knowledge and information used in predictive arguments to build up evidence. Explicit, formalized knowledge and statistical evidence are not enough for a successful predictive procedure. Background information and personal, local and tacit knowledge play a surprisingly large role in predictive arguments and procedures, and that has very important epistemological consequences.

One of the most challenging difficulties Futures Studies has had to face since its inception as a discipline is that, in an era dominated by the legacy of Logical Positivism, the Futures Studies project seemed epistemologically odd and did not quite match the rigid standards of scientific investigation imposed by the mainstream positivist canon. In spite of its impressive advances in theory, methodology and applications, the shadow cast on the field by its being epistemologically suspect to the philosophical mainstream undermined a good deal of its credibility and authority as a discipline. Even in the wake of the retreat of Positivism as a dominant paradigm, the situation remained frustratingly dysfunctional. It is therefore no surprise that many preeminent scholars in the field argued that an epistemology of Futures Studies was long overdue and that, given the current intellectual circumstances, the effort of developing one had become one of the major priorities of the field [1, 9, 14 and 15]. Futures Studies had to establish its epistemological credentials in a clear and robust way and thus reclaim, before the scholarly community, the clout and legitimacy that Logical Positivism had undermined.

Undoubtedly, the main source of the damage done by Logical Positivism to the epistemological foundations of Futures Studies was neither its rigid methodology nor its ultra-empiricism but its widely accepted and influential theory of explanation. The crux of that theory is that explaining and predicting events are logically and methodologically identical. It is true that positivists were interested in developing a theory of explanation, not of prediction, but owing to the alleged logical symmetry between the two, a complete and analogous theory of prediction emerged naturally by implication from the theory of explanation.
This model, and the relationship between prediction and explanation it implies, rose to dominance and became the backbone of epistemology and the theory of science for a couple of decades. The problem is that the account it gives of both explanation and prediction is incomplete and in many respects harmful to explanatory and predictive practice. By tying the two too closely together in a rigid conceptual framework, it arbitrarily constrained their domains and undermined the epistemological legitimacy of many of the methods, practices and approaches associated with them.

In the case of explanation, the model, while adequate for many important types of scientific explanation, is by no means applicable to all scientific domains. It is certainly not a complete account of explanation, and the consequences of the straitjacket it imposed on scientific inquiry are appreciable. By imposing prediction as a fundamental concept and criterion for explanation, the positivist epistemological model set standards that many disciplines could never achieve by their very nature. Such disciplines were arbitrarily relegated outside the proper domain of science. The result was an unnecessarily long and painful debate in all the disciplines affected by that demarcation criterion, a sterile debate that rages to this day in, for instance, political science and sociology.

But the impact of the model on prediction was even worse. The spread of the belief in the identity of predictive and explanatory scientific procedures undermined, at a fundamental level, efforts to reflect on the nature and potential of predictive procedures different from those used for explanation. The legacy of this state of affairs continues to be felt very strongly in Futures Studies. Interestingly, this is not because the discipline embraced the positivist model: familiar with the complexities of future-oriented thinking, futures scholars never took the model seriously. But outside the sphere of its own theorists and practitioners, the Futures Studies field is still perceived through the epistemological lenses shaped by the positivist model. The truth is that the legitimacy and status of Futures Studies rest on the position the field manages to validate for itself in the mainstream forum of epistemology and scientific methodology. The reality is that the epistemological asymmetry between explanation and prediction has not been adequately recognized outside the field, in epistemology or in social theory, and that Futures Studies scholars have not drawn that distinction convincingly enough.

The discussion of the specific methodology of prediction—a theme that, with very few exceptions, has been neglected by philosophers of science themselves—has failed to enter the mainstream debates in epistemology and the philosophy of knowledge. The crucial obstacle to that development continues to be the myth, reigning in mainstream social science, that explanation and prediction are or should be symmetrical processes. It is worth noting that disentangling models of prediction from those of explanation, and making a solid epistemological case, remains as much a priority for the futures research community today as it was 30 years ago.
In a path-breaking article written in 1964, Helmer and Rescher wrote: “As long as one believes that explanation and prediction are strict methodological counterparts, it is reasonable to press further with solely the explanatory problems of a discipline, in the expectation that only the tools thus forged will then be usable for predictive purposes. But once this belief is rejected, the problem of a specifically predictive method arises, and it becomes pertinent to investigate the possibilities of predictive procedures autonomous of those used for explanation” [5].

In recent decades, Futures Studies has made important progress in theory, methodology and applications. But it has yet to make a convincing case for epistemological legitimacy outside its own field. The task is clear: translating the insights gained by the discipline into the mainstream's epistemological terms and placing them within the ongoing debates in the philosophy of science and the theory of knowledge. That effort, and the epistemological battle for the future and status of the field, are even more urgent today, when the place of Logical Positivism has been filled by a number of scattered approaches that may lead to a broader and more realistic view of explanation but that continue to neglect the issue of prediction. Thus, in spite of the changed climate of philosophical opinion, the prediction issue is in danger of remaining entangled with explanation and of unwittingly carrying on the legacy of the positivist model.

It is therefore all the more important today to disentangle the theory of prediction from the theory of explanation and thereby contribute to a strong case for an autonomous and specific epistemology for Futures Studies. This paper is a contribution to that effort of carving out firm epistemological ground for Futures Studies. It continues by presenting the classical model of the symmetry between prediction and explanation and then outlines its fatal weaknesses, showing how out of its criticism emerges the possibility of a coherent, robust, original and very interesting epistemology of prediction. All this is done in the awareness that the epistemology of Futures Studies cannot be reduced to a mere extension of a theory of prediction, and that themes such as conditionals, counterfactuals and scenario-related analytic narratives, which carry their own epistemological load, are as important as prediction. However, given the external perception of Futures Studies, a perception defined and shaped by the notion of prediction, the issue of prediction should be addressed first.

7.
Desertification in the Northern Mediterranean region can be effectively managed only through an understanding of the principal ecological, socio-cultural and economic drivers. Scenarios can play an important role in the understanding of such a complex system. Following the fundamentals of Integrated Assessment, narrative storylines were developed that are qualitative, participatory, and highly integrated. Multi-scale, long-term (to 2030) storylines were developed for Europe, the Northern Mediterranean, and four local cases. This paper discusses the methodology and results of the process of developing the European and Mediterranean scenarios. In Part II, the local scenario development by means of scenario workshops is elaborated upon. The European and Mediterranean scenarios were based on a set of three existing European scenarios that were adapted to fit the specific issues of the Mediterranean region, using the so-called Factor-Actor-Sector (FAS) framework. The resulting scenarios were Convulsive Change (disruptive climate change); Big is Beautiful (an oversized EU and powerful multinationals); and Knowledge is King (technological development and mass migration). It proved possible to use and enrich a set of existing European scenarios and to translate them to fit the Mediterranean region. A possible use of this type of narrative storyline is further illustrated in Part II.

8.
Creating breakthroughs at 3M
Most senior managers want their product development teams to create breakthroughs--new products that will allow their companies to grow rapidly and maintain high margins. But more often they get incremental improvements to existing products. That's partly because companies must compete in the short term. Searching for breakthroughs is expensive and time consuming; line extensions can help the bottom line immediately. In addition, developers simply don't know how to achieve breakthroughs, and there is usually no system in place to guide them. By the mid-1990s, the lack of such a system was a problem even for an innovative company like 3M. Then a project team in 3M's Medical-Surgical Markets Division became acquainted with a method for developing breakthrough products: the lead user process. The process is based on the fact that many commercially important products are initially thought of and even prototyped by "lead users"--companies, organizations, or individuals that are well ahead of market trends. Their needs are so far beyond those of the average user that lead users create innovations on their own that may later contribute to commercially attractive breakthroughs. The lead user process transforms the job of inventing breakthroughs into a systematic task of identifying lead users and learning from them. The authors explain the process and how the 3M project team successfully navigated through it. In the end, the team proposed three major new product lines and a change in the division's strategy that has led to the development of breakthrough products. And now several more divisions are using the process to break away from incrementalism.

9.
In multi-organisational contexts, scenario building has been used to engage stakeholders in a critical discussion of issues of mutual importance and to gain their support with regard to possible future responses. A review of the existing literature suggests that much has been written about the process of scenario development and its benefits, but the detailed analysis of scenario building outcomes, which encompass a large number of issues and their complex interconnections, has not been made explicit as a means of studying and enhancing understanding of a complex societal problem. This paper presents a systematic method for analysing such complex outcomes in order to facilitate reflective thinking on important issues within the wider context for policy development. The method was employed in a series of participative scenario development workshops, which yielded several causal maps around the theme of construction industry skills. A collective map merging the individual subject-specific causal maps was created to provide a more holistic overview of the pertinent issues surrounding the construction skills debate. Analysis of this collective map promotes a better understanding of the issue in its wider context, of the consequences of possible future events and actions, and of the prerequisites for certain events or desired outcomes to take place. The main benefit of the method is the opportunity to facilitate and encourage debate and discussion among key stakeholders regarding the scenario theme, in this case skills improvement within construction. Owing to its flexibility and adaptability, the method could potentially be applied to other areas requiring longer-range planning and containing multiple stakeholder perspectives.
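A small sketch of the map-merging step, assuming the individual causal maps are encoded as directed graphs; networkx's compose_all computes exactly this kind of node-and-edge union. The issue labels are invented for illustration.

```python
# Merge participant-specific causal maps into one collective map and take
# simple structural readings from it. Issue labels are hypothetical.
import networkx as nx

map_a = nx.DiGraph([("low training investment", "skills shortage"),
                    ("skills shortage", "project delays")])
map_b = nx.DiGraph([("poor industry image", "low recruitment"),
                    ("low recruitment", "skills shortage")])

collective = nx.compose_all([map_a, map_b])   # union of nodes and edges

# Out-degree suggests driving issues; in-degree suggests consequences.
for node in collective:
    print(f"{node}: in={collective.in_degree(node)}, "
          f"out={collective.out_degree(node)}")

# Everything downstream of one issue, i.e. its possible consequences.
print(sorted(nx.descendants(collective, "poor industry image")))
```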

10.
This article proposes an alternative framework for modeling the stochastic dynamics of mortality rates. A simple age basis combined with two stochastic period factors is used to explain the key mortality drivers, while the remaining structure is modeled via a multivariate autoregressive residuals model. The latter captures the stationary mortality dynamics and introduces dependencies between adjacent age-period cells of the mortality matrix that, among other things, can be structured to capture cohort effects in a transparent manner and incorporate across-age correlations in a natural way. Our approach is compared with models with and without a univariate cohort process. The age- and period-related latent states of the mortality basis are more robust when the residuals surface is modeled via the multivariate time-series model, implying that the process indeed acts independently of the assumed mortality basis. Under the Bayesian paradigm, the posterior distribution of the models is considered to explore coherently the extent of parameter uncertainty. Samples from the posterior predictive distribution are used to project mortality, and an in-depth sensitivity analysis is conducted. The methodology is easily extendable in multiple ways that give a different form and degree of significance to the different components of mortality dynamics.
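One plausible rendering of this structure in standard stochastic-mortality notation, reconstructed from the abstract rather than taken from the paper:

```latex
% Reconstruction from the abstract, not the paper's exact specification.
\begin{align*}
\log m_{x,t} &= \alpha_x + \beta_x^{(1)}\kappa_t^{(1)}
  + \beta_x^{(2)}\kappa_t^{(2)} + \varepsilon_{x,t},\\
\varepsilon_t &= A\,\varepsilon_{t-1} + \eta_t,
  \qquad \eta_t \sim \mathcal{N}(0, \Sigma),
\end{align*}
% where m_{x,t} is the death rate at age x in year t, \kappa^{(1)} and
% \kappa^{(2)} are the two stochastic period factors, and the VAR(1)
% residual vector \varepsilon_t (stacked over ages) supplies the across-age
% correlation and the cohort-like dependence between adjacent cells.
```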

11.
As infrastructure ages, the maintenance cost of highways is becoming a major international concern that has hitherto been overlooked by public sector researchers. This paper begins to fill this gap by focusing on the nature and extent of the impact of environmental cost drivers on the costs of highway maintenance. By linking a cost driver framework with engineering theory, and using geographic information systems methodology, it has been possible to demonstrate that the physical geological environment has a significant effect on the cost of highway maintenance activity. In addition to advancing understanding of highway maintenance cost behaviour, the research illustrates that to gain new insights, researchers must be prepared to base causality enquiries on theoretical foundations advanced by other disciplines and to work with data and methods of analysis appropriate to each situation. From a strategic cost management perspective, this study elevates environmental factors in importance as major drivers of cost and, in particular, highlights their interaction with management strategy and policy. The paper discusses aspects of the cost driver framework and its application to planning and control accountability, describes dynamic inter-relationships between activity-based costing and activity-based management, and suggests directions for further research.

12.
This paper presents a single thesis relating to the potential uses of structural modelling. In the first section it is argued that the development of conventional dynamic modelling techniques has shifted attention from the important task of producing a model structure which adequately represents a complex system. The paper then discusses various methods of system representation which are suitable for the analysis and improvement of model structure. Finally, a brief survey is presented of some automatic procedures which the authors have developed to provide what they believe to be an essential balance in the methodology of modelling.

13.
This article develops two models for predicting the default of Russian Small and Medium-sized Enterprises (SMEs). The most general questions that the article attempts to answer are ‘Can the default risk of Russian SMEs be assessed with a statistical model?’ and ‘Would such a model demonstrate sufficiently high predictive accuracy?’ The article uses a relatively large data set of financial statements and employs discriminant analysis as a statistical methodology. Default is defined as legal bankruptcy. The basic model contains only financial ratios; it is extended by adding size and age variables. Liquidity and profitability turned out to be the key factors in predicting default. The resulting models have high predictive accuracy and have the potential to be of practical use in Russian SME lending.
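A minimal sketch of the estimation step, using scikit-learn's linear discriminant analysis as a stand-in for the paper's procedure. The data below are synthetic, and the feature set (liquidity, profitability, size, age) merely mirrors the variables the abstract names.

```python
# Discriminant-analysis default model on synthetic data. The defaults are
# generated so that low liquidity/profitability raises default risk.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 4))        # liquidity, profitability, size, age
y = (X[:, 0] + X[:, 1] + rng.normal(scale=1.5, size=n) < -1.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)

prob = lda.predict_proba(X_te)[:, 1]           # predicted default risk
print("AUC:", round(roc_auc_score(y_te, prob), 3))
print("discriminant coefficients:", lda.coef_.ravel())
```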

14.
The current ascendancy of transdisciplinarity (TD) is marked by an exponential growth of publications, a widening array of contexts, and increased interest across academic, public and private sectors. This investigation traces historical trends, rhetorical claims, and social formations that have shaped three major discourses of TD: transcendence, problem solving, and transgression. In doing so, it also takes account of developments that have emerged or gained traction since the early 21st century, when a 2004 issue of Futures on the same topic was being written.

The epistemological problem at the heart of the discourse of transcendence is the idea of unity, traced in the West to ancient Greece. The emergence of transdisciplinarity was not a complete departure from this historical quest, but it signalled the need for new syntheses at a time of growing fragmentation of knowledge and culture. New synthetic frameworks emerged, including general systems, post/structuralism, feminist theory, and sustainability. New organizations also formed to advance conceptual frameworks aimed at transcending the narrowness of disciplinary worldviews and interdisciplinary combinations of approaches that did not supplant the status quo of academic structure and classification.

The discourse of problem solving is not new. It was fundamental to conceptions of interdisciplinarity in the first half of the 20th century. Heightened pressure to solve problems of society, though, fostered growing alignment of TD with solving complex problems as well as trans-sector participation of stakeholders in society and team-based science. The discourse of transgression was forged in critique of the existing system of knowledge and education. TD became aligned with imperatives of cultural critique, socio-political movements, and conceptions of post-normal science and wicked problems that break free of reductionist and mechanistic approaches. It also became a recognized premise in interdisciplinary fields, including cultural studies, women's and gender studies, urban studies, and environmental studies. And calls for TD arrived at a moment of wider crisis in the privileging of dominant forms of knowledge, human rights accountability, and democratic participation.

Even with distinct patterns of definition, though, discourses are not air-tight categories. Transcendence was initially an epistemological project, but the claim of transcendence overlaps increasingly with problem solving. The imperatives of transgression also cut across the discourses of transcendence and problem solving. Broadly speaking, though, emphasis is shifting from traditional epistemology to problem solving, from the pre-given to the emergent, and from universality to hybridity and contextuality.

15.
Jordi Serra, Futures, 2008, 40(7): 664-673
The traditional role of intelligence services has entered into a period of crisis. For decades the main objective of the intelligence system has been (and still is) to detect the moment at which a potential risk becomes a genuine threat and to act when this transition occurs. The logic of this system is based on the same conception that dominated futures studies in its early times: the predictive approach. Essentially, this approach postulates that it is possible to predict the future if you have enough high-quality information; the equivalent to this in intelligence terms would be that you can predict a threat if you have enough privileged information.

This approach has been successful for some time, but the changes that the world has undergone in recent years have rendered it obsolete. Globalization, the collapse of the Eastern Bloc and the emergence of new forms of terrorism are forcing intelligence services to develop new methods to keep pace. The doctrine of pre-emptive attacks could be considered an initial move in this direction, but this paper, which is the preliminary version of a forthcoming dissertation, will argue that a far more profound change is required. In short, the current demand is for a new kind of intelligence that has a far more transversal perspective, a systemic mode of operation and an anticipatory approach to risks and threats: proactive intelligence.

16.
Navigating towards sustainable development: A system dynamics approach
Traditional fragmented and mechanistic science is unable to cope with issues about sustainability, as these are often related to complex, self-organizing systems. In the paper, sustainable development is seen as an unending process defined neither by fixed goals nor by specific means of achieving them. It is argued that, in order to understand the sources of and the solutions to modern problems, linear and mechanistic thinking must give way to non-linear and organic thinking, more commonly referred to as systems thinking. System Dynamics, which operates in a whole-system fashion, is put forward as a powerful methodology to deal with issues of sustainability. Examples of successful applications are given.

Any system in which humans are involved is characterized by the following essential system properties: bounded rationality, limited certainty, limited predictability, indeterminate causality, and evolutionary change. We need to resort to an adaptive approach, where we go through a learning process and modify our decision rules and our mental models of the real world as we go along. This will enable us to improve system performance by setting dynamic improvement goals (moving targets) for it.

Finally, it is demonstrated how causal loop diagrams can be used to find the leverage points of a system.
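The sketch below is a toy stock-and-flow model in the System Dynamics spirit the paper advocates: a balancing loop (logistic resource regeneration) against a reinforcing loop (harvest revenue reinvested in capital, which raises harvest). Every equation and parameter is illustrative, not drawn from the paper.

```python
# Euler integration of a two-stock system: a renewable resource and the
# capital that exploits it. Illustrative parameters throughout.
dt = 0.25
K = 1000.0                                   # carrying capacity
resource, capital = 1000.0, 5.0
for step in range(401):                      # simulate 100 time units
    if step % 80 == 0:
        print(f"t={step * dt:5.1f}  resource={resource:7.1f}  "
              f"capital={capital:6.1f}")
    regen = 0.05 * resource * (1.0 - resource / K)   # balancing loop
    harvest = 0.10 * capital * (resource / K)        # scarcity-limited yield
    invest = 0.5 * harvest                           # reinforcing loop
    depreciation = 0.03 * capital
    resource += dt * (regen - harvest)
    capital += dt * (invest - depreciation)
```

Changing a single parameter, say the reinvestment fraction, and rerunning is exactly the kind of leverage-point probing that causal loop diagrams are meant to guide.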

17.
Although organizations commonly report nonfinancial performance measures (NFPMs) associated with profitability, prior research does not address the extent to which the provision of causal knowledge affects individuals' perceptions of the predictive content of such measures. This study examines how providing NFPMs together with different types of causal knowledge (i.e., strong, weak, or none) affects the perceived usefulness of the measures in a profit prediction context. Weak causal knowledge is defined as the direction of the relationship between an NFPM and earnings, while strong causal knowledge is a complete explanation underlying the relationship. The results provide evidence that providing weak causal knowledge increases individuals' perceptions of the predictive content of an NFPM compared to providing no causal knowledge; however, providing strong causal knowledge does not incrementally affect perceptions beyond providing only weak causal knowledge. These findings have implications for the type of information organizations report concurrently with NFPMs.

18.
It is frequently suggested that accountants need to develop more comprehensive predictive measures of performance. In response, several predictive deterministic models have been proposed in the literature. This paper explores the feasibility of developing predictive statistical models of performance for revenue and profit centres. A large national retail organisation was selected to field test the applicability of such a model. The results indicate that a statistical model can detect significant differences in outlet performance in a multi-outlet firm, providing performance information useful for managerial evaluation. The methodology developed can be applied to performance evaluations in many retail firms.

19.
Accurate prediction of future claims is a fundamentally important problem in insurance. The Bayesian approach is natural in this context, as it provides a complete predictive distribution for future claims. The classical credibility theory provides a simple approximation to the mean of that predictive distribution as a point predictor, but this approach ignores other features of the predictive distribution, such as spread, that would be useful for decision making. In this article, we propose a Dirichlet process mixture of log-normals model and discuss the theoretical properties and computation of the corresponding predictive distribution. Numerical examples demonstrate the benefit of our model compared to some existing insurance loss models, and an R code implementation of the proposed method is also provided.
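In standard notation, the model class described is roughly the following; the base measure G_0 and hyperpriors are not given in the abstract and are omitted here:

```latex
% Dirichlet process mixture of log-normals (reconstruction from abstract).
\begin{align*}
y_i \mid (\mu_i, \sigma_i^2) &\sim \mathrm{LogNormal}(\mu_i, \sigma_i^2),\\
(\mu_i, \sigma_i^2) \mid G &\overset{\text{iid}}{\sim} G,
  \qquad G \sim \mathrm{DP}(\alpha, G_0),
\end{align*}
% so the predictive density for a new claim is the posterior expectation of
% a log-normal mixture rather than a single point forecast:
\[
p(y_{n+1} \mid y_{1:n})
  = \mathbb{E}\!\left[ \int \mathrm{LogNormal}\bigl(y_{n+1} \mid \mu,
    \sigma^2\bigr)\, \mathrm{d}G(\mu, \sigma^2) \;\middle|\; y_{1:n} \right].
\]
```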

20.
This article describes and generalizes a validation study of four commercially available personal financial planning expert systems and the rationale for the research methodology used. Our evaluation of these systems adds to the understanding of verification and validation issues related to case selection, validation standards and evaluator bias. The article describes the systems, their domain and the empirical method—field tests using hypothetical cases—and relates that method to the literature. Comparing same-task systems combines multiple system perspectives and multiple models. Our methodology did efficiently and effectively identify conflicting terminology, omissions and system weaknesses but was inadequate for comparing the complex plan recommendations. The results re-emphasize the importance of continual knowledge base updating, formal system testing and the need for external evaluation. The results also show the value of comparing multiple, same-task systems.
