Similar Literature (20 results)
1.
In this paper we extend the exact discrete model of Bergstrom (1966), first used in empirical finance by Brennan and Schwartz (1979) to estimate their two-factor term structure model, to the estimation of other two-factor term structure models under the assumption recently introduced by Nowman (1997) for single-factor models. Following Nowman (1997), we use the exact Gaussian estimation methods of Bergstrom (1983–1986, 1990) to estimate two-factor CKLS, Vasicek and CIR models. We estimate the models on monthly UK and Japanese interest rate data, and our results indicate that the estimation method works well in practice.
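In the single-factor Vasicek case the idea behind exact Gaussian estimation is easy to see: the exact discretisation of the continuous time model is a Gaussian AR(1), so maximum likelihood reduces to least squares on the discrete regression. A minimal sketch with illustrative parameter values (not those of the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# True parameters of dr = kappa*(theta - r)dt + sigma dW, monthly step.
kappa, theta, sigma, dt, n = 0.5, 0.05, 0.01, 1.0 / 12.0, 20_000

# Exact discretisation: a Gaussian AR(1), r_{t+1} = a + b*r_t + eps.
b = np.exp(-kappa * dt)
a = theta * (1.0 - b)
s = sigma * np.sqrt((1.0 - b**2) / (2.0 * kappa))

r = np.empty(n)
r[0] = theta
for t in range(1, n):
    r[t] = a + b * r[t - 1] + s * rng.standard_normal()

# Exact Gaussian (maximum likelihood) estimation is OLS on the AR(1).
X = np.column_stack([np.ones(n - 1), r[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, r[1:], rcond=None)[0]
kappa_hat = -np.log(b_hat) / dt
theta_hat = a_hat / (1.0 - b_hat)
```

Because no discretisation bias is introduced, the recovered parameters converge to the continuous time ones as the sample grows; the two-factor case replaces the scalar AR(1) with a vector autoregression.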

2.
We propose dynamic programming coupled with finite elements for valuing American-style options under Gaussian and double exponential jumps à la Merton [J. Financ. Econ., 1976, 3, 125–144] and Kou [Manage. Sci., 2002, 48, 1086–1101], and we provide a proof of uniform convergence. Our numerical experiments confirm this convergence result and show the efficiency of the proposed methodology. We also address the estimation problem and report an empirical investigation based on Home Depot. Jump-diffusion models outperform their pure-diffusion counterparts.
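For intuition, Merton-style jump-diffusion call prices can be written as a Poisson-weighted series of Black–Scholes prices, conditioning on the number of jumps. A sketch with hypothetical inputs (`mu_j` and `delta_j` denote the assumed log jump-size mean and volatility):

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    # Plain Black-Scholes European call.
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def merton_call(S, K, r, sigma, T, lam, mu_j, delta_j, n_terms=50):
    # Merton (1976) series: condition on the Poisson number of jumps k
    # and average Black-Scholes prices with adjusted rate and volatility.
    m = math.exp(mu_j + 0.5 * delta_j**2) - 1.0  # expected relative jump size
    lam_p = lam * (1.0 + m)
    price = 0.0
    for k in range(n_terms):
        sigma_k = math.sqrt(sigma**2 + k * delta_j**2 / T)
        r_k = r - lam * m + k * math.log(1.0 + m) / T
        w = math.exp(-lam_p * T) * (lam_p * T) ** k / math.factorial(k)
        price += w * bs_call(S, K, r_k, sigma_k, T)
    return price
```

With `lam = 0` the series collapses to the plain Black–Scholes price, which is a convenient sanity check.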

3.
In this paper we compare the forecasting performance of different models of interest rates using parametric and nonparametric estimation methods. In particular, we use three popular nonparametric methods, namely, artificial neural networks (ANN), k-nearest neighbour (k-NN), and local linear regression (LL). These are compared with forecasts obtained from two-factor continuous time interest rate models, namely, Chan, Karolyi, Longstaff, and Sanders [CKLS, J. Finance 47 (1992) 1209]; Cox, Ingersoll, and Ross [CIR, Econometrica 53 (1985) 385]; Brennan and Schwartz [BR-SC, J. Financ. Quant. Anal. 15 (1980) 907]; and Vasicek [J. Financ. Econ. 5 (1977) 177]. We find that while the parametric continuous time method, specifically Vasicek, produced the most successful forecasts, the nonparametric k-NN also performed well.
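The k-NN forecasts referred to here work by pattern matching: take the most recent window of observations, find the k most similar historical windows, and average the values that followed them. A minimal sketch (the window sizes are illustrative assumptions, not those used in the paper):

```python
import numpy as np

def knn_forecast(series, k=5, lags=3):
    # Build all historical lag-patterns and their successors, then average
    # the successors of the k patterns closest to the most recent one.
    s = np.asarray(series, dtype=float)
    X = np.array([s[i:i + lags] for i in range(len(s) - lags)])
    y = s[lags:]
    query = s[-lags:]
    dist = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(dist)[:k]
    return y[nearest].mean()
```

On a smooth periodic series the method recovers the next value almost exactly, since near-identical patterns recur every period.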

4.
Hai Lin, Quantitative Finance, 2018, 18(9), 1453–1470
This paper investigates the impact of the tightened trading rules of 2015 on the market efficiency and price discovery function of Chinese stock index futures. Market efficiency and price discovery did not deteriorate after these rule changes. Using variance ratio and spectral shape tests, we find that the Chinese index futures market became even more efficient after the tightened rules came into effect. Furthermore, employing the Schwarz and Szakmary [J. Futures Markets, 1994, 14(2), 147–167] and Hasbrouck [J. Finance, 1995, 50(4), 1175–1199] price discovery measures, we find that the price discovery function, to some extent, improved. This finding is consistent with Stein [J. Finance, 2009, 64(4), 1517–1548], who documents that regulations on leverage can be helpful in a bad market state, and Zhu [Rev. Financ. Stud., 2014, 27(3), 747–789], who finds that price discovery can improve with reduced liquidity. It also suggests that the new rules may effectively curb manipulation in the Chinese stock index futures market during a bad market state, and thereby positively affect its market efficiency and price discovery function.
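The variance ratio tests mentioned compare the variance of q-period returns with q times the one-period variance; under a random walk the ratio is close to one, while values persistently above one indicate positive serial correlation. A simplified Lo–MacKinlay-style sketch (without the small-sample and heteroskedasticity corrections of the formal test):

```python
import numpy as np

def variance_ratio(returns, q):
    # Variance of overlapping q-period sums over q times the one-period
    # variance; near 1 for a random walk, above 1 for trending series.
    r = np.asarray(returns, dtype=float)
    r = r - r.mean()
    var1 = np.mean(r**2)
    rq = np.convolve(r, np.ones(q), mode="valid")  # overlapping q-sums
    return np.mean(rq**2) / (q * var1)
```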

5.
This article presents the first application in finance of recently developed methods for the Gaussian estimation of continuous time dynamic models. A range of one factor continuous time models of the short-term interest rate are estimated using a discrete time model and compared to a recent discrete approximation used by Chan, Karolyi, Longstaff, and Sanders (1992a, hereafter CKLS). Whereas the volatility of short-term rates is highly sensitive to the level of rates in the United States, it is not in the United Kingdom.

6.
This paper investigates Barroso and Santa-Clara’s [J. Financ. Econ., 2015, 116, 111–120] risk-managed momentum strategy in an industry momentum setting. We investigate several traditional momentum strategies, including the one recently proposed by Novy-Marx [J. Financ. Econ., 2012, 103, 429–453]. We further examine the impact of different variance forecast horizons on average pay-offs, as well as Daniel and Moskowitz’s [J. Financ. Econ., 2016, 122, 221–247] optionality effects. Our results show that, in general, neither plain industry momentum strategies nor risk-managed industry momentum strategies are subject to optionality effects, implying that these strategies have no time-varying beta. Moreover, the benefits of risk management are robust across volatility estimators, momentum strategies and subsamples. Finally, the ‘echo effect’ in industries is not robust across subsamples, as the strategy works only during the most recent subsample.
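The risk-managed construction scales the raw momentum return each period by the ratio of a volatility target to trailing realized volatility, so the strategy de-levers when momentum becomes volatile. A minimal sketch (the six-month window, 12% annual target and leverage cap are illustrative assumptions):

```python
import numpy as np

def risk_managed(returns, window=126, target_vol=0.12 / np.sqrt(252)):
    # Scale daily returns by target vol over trailing realized vol,
    # with a leverage cap; the first `window` entries are undefined.
    r = np.asarray(returns, dtype=float)
    scaled = np.full(len(r), np.nan)
    for t in range(window, len(r)):
        realized = r[t - window:t].std()
        weight = min(target_vol / realized, 2.0)
        scaled[t] = weight * r[t]
    return scaled
```

On returns with shifting volatility regimes, the scaled series has a much more stable volatility than the raw series, which is the source of the strategy's improved risk-adjusted pay-off.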

7.
We explore the robust replication of forward-start straddles given quoted (call and put options) market data. One approach to this problem classically follows semi-infinite linear programming arguments, and we propose a discretisation scheme to reduce its dimensionality and hence its complexity. Alternatively, one can consider the dual problem, which consists in finding optimal martingale measures under which the upper and the lower bounds are attained. Semi-analytical solutions to this dual problem were proposed by Hobson and Klimmek [Financ. Stoch., 2015, 19, 189–214] and by Hobson and Neuberger [Math. Financ., 2012, 22, 31–56]. We recast this dual approach as a finite-dimensional linear program and numerically reconcile the two approaches in the Black–Scholes and Heston models.

8.
Motivated by the practical challenge of monitoring the performance of a large number of algorithmic trading orders, this paper provides a methodology that leads to automatic discovery of the causes that lie behind poor trading performance. It also gives theoretical foundations to a generic framework for real-time trading analysis. The common term for investigating the causes of bad and good trading performance is transaction cost analysis (Rosenthal [Performance Metrics for Algorithmic Traders, 2009]). Automated algorithms handle most of the traded flow on electronic markets (more than 70% in the US, 45% in Europe and 35% in Japan in 2012). The academic literature provides different ways to formalize these algorithms and to show how optimal they can be from a mean-variance (as in Almgren and Chriss [J. Risk, 2000, 3(2), 5–39]), stochastic control (e.g. Guéant et al. [Math. Financ. Econ., 2013, 7(4), 477–507]), impulse control (see Bouchard et al. [SIAM J. Financ. Math., 2011, 2(1), 404–438]) or statistical learning (as used in Laruelle et al. [Math. Financ. Econ., 2013, 7(3), 359–403]) viewpoint. This paper is agnostic about the way the algorithm has been built and provides a theoretical formalism to identify in real time the market conditions that influenced its efficiency or inefficiency. For a given set of characteristics describing the market context, selected by a practitioner, we first show how a set of additional derived explanatory factors, called anomaly detectors, can be created for each market order (following, for instance, Cristianini and Shawe-Taylor [An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, 2000]). We then present an online methodology to quantify how this extended set of factors, at any given time, predicts (i.e. has influence on, in the sense of the predictive power or information defined in Basseville and Nikiforov [Detection of Abrupt Changes: Theory and Application, 1993], Shannon [Bell Syst. Tech. J., 1948, 27, 379–423] and Alkoot and Kittler [Pattern Recogn. Lett., 1999, 20(11), 1361–1369]) which of the orders are underperforming, while calculating the predictive power of this explanatory factor set. Armed with this information, which we call influence analysis, we intend to empower the order-monitoring user to take appropriate action on any affected orders: re-calibrating the trading algorithms working the order with new parameters, pausing their execution, or taking over with more direct trading control. The method can also be used in post-trade analysis so that algorithms adjust their trading actions automatically.

9.
This paper focuses on the situations where individuals with mean-variance preferences add independent risks to an already risky situation. Pratt and Zeckhauser (Econometrica, 55, 143–154, 1987) define a concept called proper risk aversion in the expected utility framework to describe the situation where an undesirable risk can never be made desirable by the presence of an independent undesirable risk. The assumption of mean-variance preferences allows us to study proper risk aversion in an intuitive manner. The paper presents an economic interpretation for the quasi-concavity of a utility function derived over mean and variance. The main result of the paper says that quasi-concavity plus decreasing risk aversion is equivalent to proper risk aversion.

10.
Yue Qiu, Tian Xie, Quantitative Finance, 2013, 13(10), 1673–1687
Empirical evidence has demonstrated that certain factors in asset pricing models are more important than others for explaining specific portfolio returns. We propose a technique that evaluates the factors included in popular linear asset pricing models. Our method has the advantage of simultaneously ranking the relative importance of those pricing factors by comparing their model weights. As an empirical verification, we apply our method to portfolios formed following Fama and French [A five-factor asset pricing model. J. Financ. Econ., 2015, 116, 1–22] and demonstrate that models adapted to our factor rankings improve their explanatory power in both in-sample and out-of-sample analyses.
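As a rough stand-in for ranking pricing factors, one can compare the absolute standardized slopes from a multiple regression of portfolio returns on the factors (a simple OLS proxy chosen for illustration, not the authors' model-weight method):

```python
import numpy as np

def rank_factors(returns, factors, names):
    # Standardize factors so slopes are comparable, regress returns on
    # them jointly, and rank by absolute slope magnitude.
    F = np.asarray(factors, dtype=float)
    Z = (F - F.mean(axis=0)) / F.std(axis=0)
    X = np.column_stack([np.ones(len(Z)), Z])
    slopes = np.linalg.lstsq(X, returns, rcond=None)[0][1:]
    order = np.argsort(-np.abs(slopes))
    return [names[i] for i in order]
```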

11.
Nian Yang, Quantitative Finance, 2018, 18(10), 1767–1779
The stochastic-alpha-beta-rho (SABR) model is widely used by practitioners in interest rate and foreign exchange markets. The probability of hitting zero sheds light on the arbitrage-free small-strike implied volatility of the SABR model (see, e.g. De Marco et al. [SIAM J. Financ. Math., 2017, 8(1), 709–737], Gulisashvili [Int. J. Theor. Appl. Financ., 2015, 18, 1550013], Gulisashvili et al. [Mass at zero in the uncorrelated SABR model and implied volatility asymptotics, 2016b]), and the survival probability is also closely related to binary knock-out options. Moreover, the study of the survival probability is mathematically challenging in its own right. This paper provides novel asymptotic formulas for the survival probability of the SABR model, together with error estimates. The formulas give the probability that the forward price does not hit a nonnegative lower boundary before a fixed time horizon.
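For intuition about the quantity being approximated, the survival probability can also be estimated by crude Monte Carlo with absorption at zero (a plain Euler sketch under assumed parameters; the paper's contribution is asymptotic formulas, not simulation):

```python
import numpy as np

def sabr_survival_prob(F0, alpha0, beta, nu, rho, T,
                       n_paths=20_000, n_steps=200, seed=0):
    # P(forward stays above 0 up to T) in SABR:
    #   dF = alpha * F^beta dW1,  dalpha = nu * alpha dW2,  corr = rho.
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    F = np.full(n_paths, float(F0))
    a = np.full(n_paths, float(alpha0))
    alive = np.ones(n_paths, dtype=bool)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        step = a * np.abs(F) ** beta * np.sqrt(dt) * z1
        F = np.where(alive, F + step, F)  # freeze absorbed paths
        a = a * np.exp(nu * np.sqrt(dt) * z2 - 0.5 * nu**2 * dt)
        alive &= F > 0.0
    return alive.mean()
```

A forward starting close to zero with beta well below one is absorbed with noticeable probability, whereas the lognormal-like beta = 1 case essentially never hits zero.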

12.
This article presents a numerically efficient approach for constructing an interest rate lattice for multi-state-variable, multi-factor term structure models in the Markovian HJM [Econometrica 60 (1992) 77] framework, based on Monte Carlo simulation and an advanced extension of the Markov chain approximation technique. The proposed method is a mix of Monte Carlo and lattice-based methods and combines the best of both. It provides significant computational advantages and flexibility with respect to many existing multi-factor model implementations for interest rate derivatives valuation and hedging in the HJM framework.

13.
In this paper, we propose a dynamic bond pricing model and report its usefulness based on an analysis of Japanese Government bond price data. We extend the time dependent Markov (TDM) model proposed by Kariya and Tsuda (Financial Engineering and the Japanese Markets, Kluwer Academic Publishers, Dordrecht, The Netherlands, Vol. 1, pp. 1–20) to a dynamic model that yields information about future bond prices. A main feature of the extended model is that the whole stochastic process of the random cash-flow discount functions of each individual bond has a time series structure. We express the dynamic structure of the models using a Bayesian state space representation. The state space approach integrates the cross-sectional and time series aspects of individual bond prices. From the empirical results, we find useful evidence that our model performs well in predicting the patterns of the term structure of individual bond returns.

14.
This study adds to research which examines the construct validity of coefficients of cue importance in studies concerned with how decision-makers use accounting information in formulating judgments (see Larcker & Lessig, The Accounting Review, January, 1983, pp. 58–77; Selling & Shank, Accounting, Organizations and Society, 1989, pp. 65–77). Historically, accounting studies have modelled cue importance with almost exclusive reliance upon linear models. But as the Selling & Shank study indicates, inferring the importance of accounting cues through reliance upon only one kind of model can leave “method variance” undetected and raises threats to the construct validity of coefficients (see Cook & Campbell, Quasi-experimentation: Design and Analysis Issues for Field Settings, 1979, pp. 59–70). To the extent that cue importance appears similar across models, then the model coefficients are presumed more valid. While Selling & Shank compare linear models to process tracing models, we compare a linear model to an eigenvector-scaling routine known as the Analytic Hierarchy Process (see Saaty, The Analytic Hierarchy Process, 1980). As with Selling & Shank, we find that the importance of cues is sensitive to model choice, suggesting that more research is needed into method variance before judgments can be made with respect to the construct validity of linear coefficients in accounting studies.

15.
Behavioral decision theory (BDT) is concerned with “accounting for decisions”. The development of this interdisciplinary field is traced from the appearance of several key publications in the 1950s to the present. Whereas the 1960s saw increasing theoretical and empirical work, the field really started to flourish in the 1970s with the appearance of the review by Slovic & Lichtenstein (Organizational Behavior and Human Performance, pp. 549–744, 1971), and key papers on probabilistic judgment (Tversky & Kahneman, Science, pp. 1124–1131, 1974), and choice (Kahneman & Tversky, Econometrica, pp. 263–291, 1979). From the early 1980s to the present, BDT has seen considerable consolidation and expansion and its influence now permeates many fields of enquiry. After this brief history, eight major ideas or findings are discussed. These are: (1) that judgment can be modeled; (2) bounded rationality; (3) to understand decision making, understanding the task is more important than understanding the people; (4) levels of aspiration/reference points; (5) use of heuristic rules; (6) the importance of adding; (7) search for confirmation; and (8) thought as construction. Next, comments are addressed to differences between BDT and problem solving/cognitive science. It is argued that whereas many substantive differences are artificial, two distinct communities of researchers do exist. This is followed by a discussion of some major shortcomings currently facing BDT that include questions about the robustness of findings as well as overconcern with a few specific, “paradoxical” results. On the other hand, there are many interesting issues that BDT could address and several specific suggestions are made. Moreover, these issues represent opportunities for accounting research and several are enumerated. Finally, BDT presents “decisions for accounting” in the sense that scarce resources need to be allocated to different types of research that could illuminate accounting issues. The argument is made that BDT is one research metaphor or paradigm that has proved useful in accounting and that should be supported. Such support, however, may mean that some researchers may work on issues that, at first blush, might seem distant from accounting per se.

16.
This paper empirically examines the relationship between the credit risk of Toyota, Nissan and Honda keiretsu-affiliated firms and the credit risk of the respective parent company. As credit spread data for keiretsu-affiliated firms were not available, we create a keiretsu default index as a proxy, using expected default probabilities obtained from the KMV and Leland and Toft (J. Finance 51, 987–1019, 1996) option pricing models. We find that parent credit spreads do not Granger-cause our keiretsu default index, and vice versa, in a bivariate vector autoregressive (VAR) framework.
JEL classification: G3, L62

17.
18.
This article presents a pure exchange economy that extends Rubinstein [Bell J. Econ. Manage. Sci., 1976, 7, 407–425] to show how the jump-diffusion option pricing model of Black and Scholes [J. Political Econ., 1973, 81, 637–654] and Merton [J. Financ. Econ., 1976, 3, 125–144] evolves in gamma jumping economies. Both empirical analysis and theoretical study indicate that neither the aggregate consumption nor the stock price alone determines the jump times. Using the pricing kernel, we determine both the aggregate consumption jump time and the stock price jump time from the equilibrium interest rate and the CCAPM (Consumption Capital Asset Pricing Model). Our general jump-diffusion option pricing model gives an explicit formula for how the jump process and the jump times alter the pricing. This innovation, with predictable jump times, enhances our analysis of the expected stock return in equilibrium and of hedging jump risks in jump-diffusion economies.

19.
Under the general affine jump-diffusion framework of Duffie et al. [Econometrica, 2000, 68, 1343–1376], this paper proposes an alternative pricing methodology for European-style forward start options that does not require any parallel optimization routine to ensure square integrability. Therefore, the proposed methodology is shown to possess a better accuracy–efficiency trade-off than the usual and more general approach initiated by Hong [Forward Smile and Derivative Pricing. Working paper, UBS, 2004] that is based on the knowledge of the forward characteristic function. Explicit pricing solutions are also offered under the nested jump-diffusion setting proposed by Bakshi et al. [J. Finance, 1997, 52, 2003–2049], which accommodates stochastic volatility and stochastic interest rates, and different integration schemes are numerically tested.

20.
This paper examines the out-of-sample performance of asset allocation strategies that use conditional multi-factor models to forecast expected returns and estimate the future variance and covariance. We find that strategies based on conditional multi-factor models outperform strategies based on unconditional multi-factor models, and do better than a passive buy-and-hold strategy. However, a strategy that uses the sample mean as a return forecast is superior. We also find that the estimation of the covariance matrices based on the conditional and unconditional multi-factor models does not improve the performance of the active asset allocation strategy relative to the incorporation of the historical covariance matrices. These results are fairly robust to different estimation approaches, as well as to the impact of transaction costs and the consideration of upper and lower bounds for the portfolio weights.
