Similar documents (20 results found)
1.
It is important to respond to customers' requirements more rapidly than ever before, given the recent trend toward e-business and its technologies. To achieve an agile response, we have to manage business models, reflect changes in those models, and develop or modify IT systems to accommodate further changes. This paper proposes a management framework of layered enterprise models. The proposed framework consists of a business model repository and a software repository, and defines three modeling layers of different granularity, namely business modeling, business process modeling and business application modeling, in order to support business modeling and application development. This framework helps us to develop business applications through incremental deployment of analysis, design, and implementation to execute business processes. We have implemented a prototype environment using Java. Each repository's contents are described in XML so that the repositories are interoperable. Copyright © 2003 John Wiley & Sons, Ltd.

2.
This paper presents a knowledge‐based methodology for business process reengineering that uses a case‐based reasoning paradigm to provide decision support to its users in modeling a current problem and redesigning critical business processes. As a process modeling tool for representing the business process, the event process chain (EPC) modeling method is used in this paper. We developed CAPMOSS (CAse‐based Process MOdeling Supporting System) to support the proposed methodology. To reengineer a new business process problem, CAPMOSS retrieves from its case base the case that is most similar to the current problem. CAPMOSS uses the retrieved case to guide the structuring of AS‐IS and TO‐BE models of a target business process. Using the transformational knowledge of the retrieved case, CAPMOSS helps the user transform an AS‐IS model into a TO‐BE model for the target process with ease. The purchasing process in a government R&D institute is presented as an application of this approach. Copyright © 2000 John Wiley & Sons, Ltd.
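The retrieval step described above is the classic nearest-neighbour lookup of case-based reasoning. Below is a minimal Python sketch of such a retrieval; the case structure, feature names and similarity weighting are hypothetical illustrations, not CAPMOSS's actual implementation.

```python
# Hypothetical sketch of similarity-based case retrieval (not the actual CAPMOSS code).
from dataclasses import dataclass

@dataclass
class ProcessCase:
    name: str
    features: dict          # e.g. {"industry": "public", "process_type": "purchasing", "num_activities": 12}
    as_is_model: str = ""   # reference to the stored AS-IS EPC model
    to_be_model: str = ""   # reference to the stored TO-BE EPC model

def similarity(query: dict, case: dict, weights: dict) -> float:
    """Weighted average of per-feature similarities (exact match for symbols, scaled distance for numbers)."""
    total, weight_sum = 0.0, 0.0
    for feat, w in weights.items():
        q, c = query.get(feat), case.get(feat)
        if q is None or c is None:
            continue
        if isinstance(q, (int, float)) and isinstance(c, (int, float)):
            sim = 1.0 / (1.0 + abs(q - c))   # simple numeric similarity
        else:
            sim = 1.0 if q == c else 0.0     # symbolic exact match
        total += w * sim
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0

def retrieve_most_similar(query: dict, case_base: list, weights: dict) -> ProcessCase:
    """Return the stored case whose indexed features best match the new problem description."""
    return max(case_base, key=lambda case: similarity(query, case.features, weights))
```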

3.
In this paper we examine the behavior of the systematic risk of corporate bonds. A model that assumes β is constant is compared with a model that allows systematic risk to vary in a manner consistent with the Black-Scholes-Merton Options Pricing Model. This procedure captures some fundamental properties of the movement of bond β and provides a starting point for improved models of the process generating bond returns.
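For readers wondering why an option-pricing model implies a non-constant bond β, the Merton structural model provides the standard illustration: the bond is a claim on the firm value V, so its beta is the asset beta scaled by the bond's elasticity with respect to V. This is a textbook special case stated for orientation, not necessarily the exact specification estimated in the paper.

\[
B_t = F e^{-r(T-t)}\,\Phi(d_2) + V_t\,\Phi(-d_1), \qquad
d_1 = \frac{\ln(V_t/F) + (r + \tfrac{1}{2}\sigma_V^2)(T-t)}{\sigma_V\sqrt{T-t}}, \quad
d_2 = d_1 - \sigma_V\sqrt{T-t},
\]
\[
\beta_{B,t} = \frac{\partial B_t}{\partial V_t}\,\frac{V_t}{B_t}\,\beta_V = \Phi(-d_1)\,\frac{V_t}{B_t}\,\beta_V ,
\]

so the bond's systematic risk varies with leverage, asset volatility and remaining maturity rather than staying constant.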

4.
The calculation of statistical expectation values is a most useful tool in risk analysis, but it is not a panacea. Valuable additional decision support can be obtained from argument-based tools, i.e. rigorous tools based on conceptual distinctions and logical reasoning. If the decision problem is not well defined, then argument-based tools can be used to analyse the uncertainties involved and to provide a better-reasoned presentation of the problem. If the available information is insufficient for a reliable quantitative analysis, then argument-based tools can partly replace it as guidance for the decision-making process. Argument-based tools can be grouped into three main categories: (1) tools used to determine the impact of alternative structurings of decisions and to make an informed choice among them, (2) tools used for the evaluation of decision options, and (3) tools used for the choice among such options in terms of the relative strengths of the arguments that speak for and against each of the options.

5.
Business models are economic models that describe the rationale of why organizations create and deliver value. These models focus on what organizations offer and why. Business process models capture business activities and the ways in which they are accomplished (i.e. their coordination). They explain who is involved in the activities, and how and when these activities should be performed. This paper discusses the alignment between business models and business process models. It proposes a novel systematic method for extracting a value chain (i.e. a business model) expressed in the Resources, Events, Agents (REA) ontology from a business process model expressed in Business Process Model and Notation (BPMN). Our contribution is twofold: (1) from a theoretical standpoint, we identify a set of structural and behavioural patterns that enable us to infer the corresponding REA value chain; (2) from a pragmatic perspective, our approach can be used to derive useful knowledge about the business process and serve as a starting point for business analysis.

6.
Changes in organizational processes often interact with changes in the IT infrastructure. Accounting for the structural and economic consequences of changes to the modern IT infrastructure remains a challenge, as their complexity can affect more than one business process, and the need for IT and business management to share a common understanding challenges current IT governance practices. An integrative perspective of business processes and IT resources would help meet these challenges, but despite some progress such a perspective remains to be developed. This paper proposes a domain ontology, the Ontology for Linking Processes and IT infrastructure (OLPIT), to model the relationship between IT resources and business processes for the purpose of measuring the business value of IT. The ontology was developed and evaluated in the context of a design research project conducted at the Hilti Corporation, an international manufacturing company, with the aim of defining how IT impacts the business and calculating the cost of IT services used.

7.
8.
Abstract

This paper contains a systematic presentation of time-continuous stable population theory in modern probabilistic dress. The life-time births of an individual are represented by an inhomogeneous Poisson process stopped at death, and an aggregate of such processes on the individual level constitutes the population process. Forward and backward renewal relations are established for the first moments of the main functionals of the process and for their densities. Their asymptotic convergence to a stable form is studied, and the stable age distribution is given some attention. It is a distinguishing feature of the present paper that rigorous proofs are given for results usually set up by intuitive reasoning only.
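For orientation, the deterministic skeleton of these renewal relations is the classical Lotka integral equation for the expected birth density B(t), with survival function p(a) and birth intensity m(a); the stable age distribution follows from it. These are standard results of stable population theory, restated here rather than quoted from the paper:

\[
B(t) = G(t) + \int_0^t B(t-a)\, p(a)\, m(a)\, da ,
\]

where G(t) collects the births contributed by individuals already alive at time 0. Asymptotically B(t) grows like b e^{rt}, with the intrinsic growth rate r solving the characteristic equation

\[
\int_0^\infty e^{-ra}\, p(a)\, m(a)\, da = 1 ,
\]

and the stable age distribution is

\[
c(a) = \frac{e^{-ra}\, p(a)}{\int_0^\infty e^{-ru}\, p(u)\, du}.
\]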

9.
Business process modeling is an important part of information systems design, as well as of any business engineering or reengineering activity. Business process modeling languages provide standard ways of presentation and communication between different stakeholders. A business process model is the externalization of the conceptualization of those parts of the object world that deal with the aspects pertaining to the way business transactions are carried out and supported by an information system. This paper deals with an essential issue in this context, namely the assessment of the quality of business processes through their models. This objective raises two major issues: (a) the identification of the quality factors relevant to business processes, and (b) the definition of metrics that provide a means for objectively measuring the quality of business processes. These two issues are addressed through a quality evaluation framework, known as QEF, that enables business process modelers to explicitly incorporate a wide variety of requirements corresponding to quality factors. Quality factors of business processes are defined and categorized into different quality dimensions. The application of the quality framework, as well as the proposed quality dimensions, factors and metrics, is discussed through an illustrative example.

10.
In this paper we provide an extensive classification of one- and two-dimensional diffusion processes which admit an exact solution to the Kolmogorov (and hence Black–Scholes) equation (in terms of hypergeometric functions). By identifying the one-dimensional solvable processes with the class of integrable superpotentials introduced recently in supersymmetric quantum mechanics, we obtain new analytical solutions. In particular, by applying supersymmetric transformations on a known solvable diffusion process (such as the Natanzon process for which the solution is given by a hypergeometric function), we obtain a hierarchy of new solutions. These solutions are given by a sum of hypergeometric functions, generalizing the results obtained in a paper by Albanese et al. (Albanese, C., Campolieti, G., Carr, P. and Lipton, A., Black–Scholes goes hypergeometric. Risk Mag., 2001, 14, 99–103). For two-dimensional processes, more precisely stochastic volatility models, the classification is achieved for a specific class called gauge-free models including the Heston model, the 3/2-model and the geometric Brownian model. We then present a new exact stochastic volatility model belonging to this class.
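To make the supersymmetric connection concrete, recall the textbook construction in which a superpotential W(x) generates a pair of partner Hamiltonians with essentially the same spectrum; applying it to a potential with hypergeometric eigenfunctions yields a new solvable problem. The mapping between diffusions and Schrödinger operators involves an additional change of variables and measure that is not reproduced here:

\[
H_\pm = -\frac{d^2}{dx^2} + V_\pm(x), \qquad V_\pm(x) = W(x)^2 \pm W'(x),
\]

so that, with A = d/dx + W(x), one has H_- = A^\dagger A and H_+ = A A^\dagger. If ψ solves H_-\psi = E\psi with E > 0, then Aψ solves H_+(A\psi) = E\,(A\psi), so every eigenfunction of the known solvable potential produces an eigenfunction of its partner. After mapping back from the Schrödinger picture, this yields a new solvable diffusion whose transition density is expressed through (sums of) hypergeometric functions.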

11.
Abstract

A Monte Carlo (MC) experiment is conducted to study the forecasting performance of a variety of volatility models under alternative data-generating processes (DGPs). The models included in the MC study are the (Fractionally Integrated) Generalized Autoregressive Conditional Heteroskedasticity models ((FI)GARCH), the Stochastic Volatility model (SV), the Long Memory Stochastic Volatility model (LMSV) and the Markov-switching Multifractal model (MSM). The MC study enables us to compare the relative forecasting performance of the models accounting for different characterizations of the latent volatility process: specifications that incorporate short/long memory, autoregressive components, stochastic shocks, Markov-switching and multifractality. Forecasts are evaluated by means of mean squared errors (MSE), mean absolute errors (MAE) and value-at-risk (VaR) diagnostics. Furthermore, complementarities between models are explored via forecast combinations. The results show that (i) the MSM model forecasts volatility best under every alternative characterization of the latent volatility process, and (ii) forecast combinations provide systematic improvements upon most single misspecified models, but are typically inferior to the MSM model even when the latter is applied to data governed by other processes.
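For reference, the simplest member of the model set, a GARCH(1,1) process, and the two point-forecast loss functions used in the evaluation can be written as follows (standard notation, not quoted from the paper):

\[
r_t = \sigma_t z_t, \quad z_t \sim \text{i.i.d.}(0,1), \qquad
\sigma_t^2 = \omega + \alpha\, r_{t-1}^2 + \beta\, \sigma_{t-1}^2 ,
\]
\[
\text{MSE} = \frac{1}{n}\sum_{t=1}^{n}\bigl(\hat\sigma_t^2 - \tilde\sigma_t^2\bigr)^2, \qquad
\text{MAE} = \frac{1}{n}\sum_{t=1}^{n}\bigl|\hat\sigma_t^2 - \tilde\sigma_t^2\bigr| ,
\]

where \(\hat\sigma_t^2\) denotes a model's variance forecast and \(\tilde\sigma_t^2\) the simulated (true or realized) variance it is compared against.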

12.
This paper investigates the time-varying behavior of systematic risk for 18 pan-European sectors. Using weekly data over the period 1987–2005, six different modeling techniques in addition to the standard constant coefficient model are employed: a bivariate t-GARCH(1,1) model, two Kalman filter (KF)-based approaches, a bivariate stochastic volatility model estimated via the efficient Monte Carlo likelihood technique, as well as two Markov switching models. A comparison of the ex-ante forecast performances of the different models indicates that the random walk process in connection with the KF is the preferred model to describe and forecast the time-varying behavior of sector betas in a European context.
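The preferred random-walk specification is, in its usual state-space form, the following; the notation is generic and details such as the intercept may differ from the paper's exact measurement equation:

\[
r_{i,t} = \alpha_i + \beta_{i,t}\, r_{m,t} + \varepsilon_{i,t}, \qquad \varepsilon_{i,t} \sim N(0, \sigma_\varepsilon^2),
\]
\[
\beta_{i,t} = \beta_{i,t-1} + \eta_{i,t}, \qquad \eta_{i,t} \sim N(0, \sigma_\eta^2),
\]

with the Kalman filter delivering the filtered estimates \(\beta_{i,t|t}\) and the one-step-ahead forecasts \(\beta_{i,t+1|t}\) recursively.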

13.
Numerical integration methods for stochastic volatility models in financial markets are discussed. We concentrate on two classes of stochastic volatility models where the volatility is either given directly by a mean-reverting CEV process or obtained as a transformed Ornstein–Uhlenbeck process. For the latter, we introduce a new model based on a simple hyperbolic transformation. Various numerical methods for integrating mean-reverting CEV processes are analysed and compared with respect to positivity preservation and efficiency. Moreover, we develop a simple and robust integration scheme for the two-dimensional system using the strong convergence behaviour as an indicator for the approximation quality. This method, which we refer to as the IJK (137) scheme, is applicable to all types of stochastic volatility models and can be employed as a drop-in replacement for the standard log-Euler procedure.
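As a minimal illustration of the positivity issue mentioned above, the sketch below integrates a mean-reverting CEV variance process dv_t = κ(θ − v_t)dt + α v_t^γ dW_t with a full-truncation Euler step, one of the simple schemes commonly compared in this literature. It is not the IJK scheme of the paper, and the parameter values are arbitrary.

```python
# Minimal sketch: full-truncation Euler for a mean-reverting CEV variance process.
# dv_t = kappa*(theta - v_t) dt + alpha * v_t**gamma dW_t
# Negative excursions of the discretized state are truncated at zero inside the
# drift and diffusion coefficients, which keeps the scheme well defined.
import numpy as np

def simulate_cev_variance(v0, kappa, theta, alpha, gamma, T, n_steps, n_paths, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    v = np.full(n_paths, v0, dtype=float)
    for _ in range(n_steps):
        v_plus = np.maximum(v, 0.0)                      # full truncation
        dw = rng.standard_normal(n_paths) * np.sqrt(dt)
        v = v + kappa * (theta - v_plus) * dt + alpha * v_plus**gamma * dw
    return np.maximum(v, 0.0)                            # report the truncated state

# Example with arbitrary, Heston-like parameters (gamma = 0.5):
v_T = simulate_cev_variance(v0=0.04, kappa=2.0, theta=0.04, alpha=0.3,
                            gamma=0.5, T=1.0, n_steps=252, n_paths=10_000)
print(v_T.mean())
```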

14.
The purpose of this paper is to describe the appropriate mathematical framework for the study of the duality principle in option pricing. We consider models where prices evolve as general exponential semimartingales and provide a complete characterization of the dual process under the dual measure. Particular cases of these models are the ones driven by Brownian motions and by Lévy processes, which have been considered in several papers. Generally speaking, the duality principle states that the calculation of the price of a call option for a model with price process S = e^H (with respect to the measure P) is equivalent to the calculation of the price of a put option for a suitable dual model S′ = e^{H′} (with respect to the dual measure P′). More sophisticated duality results are derived for a broad spectrum of exotic options. The second named author acknowledges the financial support from the Deutsche Forschungsgemeinschaft (DFG, Eb 66/9-2). This research was carried out while the third named author was supported by the Alexander von Humboldt foundation.
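The best-known special case of this duality is the Black–Scholes call–put symmetry, reproduced here as a concrete anchor; it is a classical result rather than a statement of the paper's general semimartingale theorem. With spot S_0, strike K, interest rate r, dividend yield q, volatility σ and maturity T,

\[
C(S_0, K;\, r, q, \sigma, T) \;=\; P(K, S_0;\, q, r, \sigma, T),
\]

i.e. a call is priced as a put in a dual model in which spot and strike, as well as the roles of the interest rate and the dividend yield, are interchanged.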

15.
For d-dimensional exponential Lévy models, variational formulations of the Kolmogorov equations arising in asset pricing are derived. Well-posedness of these equations is verified. Particular attention is paid to pure jump, d-variate Lévy processes built from parametric, copula dependence models in their jump structure. The domains of the associated Dirichlet forms are shown to be certain anisotropic Sobolev spaces. Singularity-free representations of the Dirichlet forms are given which remain bounded for piecewise polynomial, continuous functions of finite element type. We prove that the variational problem can be localized to a bounded domain with explicit localization error bounds. Furthermore, we collect several analytical tools for further numerical analysis.
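For context, the one-dimensional prototype of the Kolmogorov (pricing) equation referred to above is the backward partial integro-differential equation driven by the Lévy generator with triplet (γ, σ², ν); the d-dimensional, copula-dependent case studied in the paper has the analogous structure:

\[
\partial_t u(t,x) + \mathcal{A} u(t,x) = 0, \qquad u(T,x) = g(x) \ \text{(payoff)},
\]
\[
\mathcal{A}\varphi(x) = \frac{\sigma^2}{2}\varphi''(x) + \gamma\,\varphi'(x)
+ \int_{\mathbb{R}} \Bigl(\varphi(x+y) - \varphi(x) - y\,\varphi'(x)\,\mathbf{1}_{\{|y|\le 1\}}\Bigr)\,\nu(dy).
\]

The variational formulation tests this equation against functions ψ via the associated bilinear (Dirichlet) form \(\mathcal{E}(\varphi,\psi) = -\int \mathcal{A}\varphi\,\psi\,dx\) on a suitable Sobolev-type domain, which is the object the paper characterizes and localizes.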

16.
This article explores the relationships between several volatility forecasts built from multi-scale linear ARCH processes and linear market models for the forward variance. This shows that the structures of the forecast equations are identical, but with different dependencies on the forecast horizon. The process equations for the forward variance are induced by the process equations for an ARCH model, but postulated in a market model. In the ARCH case, they differ from the usual diffusive type. The conceptual differences between both approaches and their implications for volatility forecasts are analysed. The volatility forecast is compared with the realized volatility (the volatility that will occur between date t and t + ΔT) and the implied volatility (corresponding to an at-the-money option with expiry at t + ΔT). For the ARCH forecasts, the parameters are set a priori. An empirical analysis across multiple time horizons ΔT shows that a forecast provided by an I-GARCH(1) process (one time scale) does not correctly capture the dynamics of the realized volatility. An I-GARCH(2) process (two time scales, similar to GARCH(1,1)) is better, while a long-memory LM-ARCH process (multiple time scales) correctly replicates the dynamics of the implied and realized volatilities and delivers consistently good forecasts for the realized volatility.
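Schematically, a one-time-scale I-GARCH(1) forecast is an exponentially weighted moving average of past squared returns, and the multi-scale variants aggregate several such averages with different decay factors. The notation below is an illustrative simplification of this structure, not the paper's exact specification:

\[
\sigma^2_{t+1} = \mu\,\sigma^2_t + (1-\mu)\, r_t^2
\quad\Longleftrightarrow\quad
\sigma^2_{t+1} = (1-\mu)\sum_{k\ge 0} \mu^{k}\, r_{t-k}^2 ,
\]
\[
\sigma^2_{t+1} \;=\; \sum_{j=1}^{J} w_j\, \sigma^2_{j,t+1}, \qquad
\sigma^2_{j,t+1} = \mu_j\,\sigma^2_{j,t} + (1-\mu_j)\, r_t^2 ,
\]

with one component (J = 1) corresponding to I-GARCH(1), two components to I-GARCH(2), and a sum over many decay factors μ_j spanning a wide range of time scales to the long-memory LM-ARCH case.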

17.
Quantitative Finance, 2013, 13(1): 12–14
Developments in real options theory are resulting in valuation tools more suited to our increasingly dynamic economy. These tools can help where conventional business valuation techniques fail to capture the full opportunity value of new business strategies. Steve Leppard and Peter Morawitz report.

18.
Abstract

By claims experience monitoring is meant the systematic comparison of the forecasts from a claims model with claims experience as it emerges subsequently. In the event that the stochastic properties of the forecasts are known, the comparison can be represented as a collection of probabilistic statements. This is stochastic monitoring. This paper defines this process rigorously in terms of statistical hypothesis testing. If the model is a regression model (which is the case for most stochastic claims models), then the natural form of hypothesis test is a number of likelihood ratio tests, one for each parameter in the valuation model. Such testing is shown to be very easily implemented by means of generalized linear modeling software. This tests the formal structure of the claims model and is referred to as microtesting. There may be other quantities (e.g., amount of claim payments in a defined interval) that require testing for practical reasons. This sort of testing is referred to as macrotesting, and its formulation is also discussed.
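As an illustration of how such a likelihood-ratio microtest can be carried out with off-the-shelf generalized linear modeling software, the sketch below compares a full and a restricted GLM fitted to claims data. The data frame, column names and the Poisson/log-link choice are assumptions made for the example, not prescriptions from the paper.

```python
# Sketch: likelihood-ratio microtest of one parameter in a GLM-based claims model.
# Assumes a pandas DataFrame `claims` with hypothetical columns 'counts', 'dev_year',
# 'acc_year' and a Poisson error structure, purely for illustration.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats

def lr_microtest(full_formula: str, restricted_formula: str, data: pd.DataFrame, df_diff: int = 1):
    """Likelihood-ratio test that the parameter(s) dropped in the restricted model are not needed."""
    full = smf.glm(full_formula, data=data, family=sm.families.Poisson()).fit()
    restricted = smf.glm(restricted_formula, data=data, family=sm.families.Poisson()).fit()
    lr_stat = 2.0 * (full.llf - restricted.llf)   # twice the log-likelihood gain
    p_value = stats.chi2.sf(lr_stat, df_diff)     # asymptotic chi-squared reference
    return lr_stat, p_value

# Example usage with the hypothetical column names:
# lr, p = lr_microtest("counts ~ C(acc_year) + dev_year",
#                      "counts ~ C(acc_year)", claims)
```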

19.
Using data collected through a survey questionnaire across 15 universities, we examine the effect of emotional intelligence on academic work performance (in research, teaching and service) in Australian business faculties. We find academics' ability to use emotion enhances performance across research, teaching and service, while the ability to regulate emotion enhances performance for teaching and service only. We also find support for a process‐based model of emotional intelligence in which appraisal of emotion is a necessary antecedent to emotion's use and regulation. The results have implications for management in appointment decisions and professional development programmes in business/accounting faculties.

20.
We use Markov Chain Monte Carlo (MCMC) methods for the parameter estimation and the testing of conditional asset pricing models. In contrast to traditional approaches, ours is truly conditional because the assumption that time variation in betas is driven by a set of conditioning variables is not necessary. Moreover, the approach has exact finite sample properties and accounts for errors‐in‐variables. Using S&P 500 panel data, we analyse the empirical performance of the CAPM and the Fama and French (1993) three‐factor model. We find that time variation of betas in the CAPM and time variation of the coefficients for the size factor (SMB) and the distress factor (HML) in the three‐factor model improve the empirical performance. Therefore, our findings are consistent with time variation of firm‐specific exposure to market risk, systematic credit risk and systematic size effects. However, a Bayesian model comparison trading off goodness of fit and model complexity indicates that the conditional CAPM performs best, followed by the conditional three‐factor model, the unconditional CAPM, and the unconditional three‐factor model.
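For reference, the conditional three-factor specification underlying the comparison can be written with time-varying loadings as follows; the notation is generic, and the prior and state dynamics used in the MCMC estimation are not reproduced here:

\[
r_{i,t} - r_{f,t} = \alpha_i + \beta_{i,t}\,(r_{m,t} - r_{f,t}) + s_{i,t}\,\mathrm{SMB}_t + h_{i,t}\,\mathrm{HML}_t + \varepsilon_{i,t},
\]

with the conditional CAPM obtained by setting \(s_{i,t} = h_{i,t} = 0\), and the unconditional versions obtained by holding the loadings constant over time.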
