Similar Literature
20 similar documents found (search time: 31 ms)
1.
Local regime-switching models are a natural consequence of combining the concept of a local volatility model with that of a regime-switching model. However, even though Elliott et al. (2015) have derived a Dupire formula for a local regime-switching model, its calibration remains a challenge, primarily because the derived volatility function for each state involves all the state price variables whereas only one market price is available for model calibration, and a direct implementation of Elliott et al.’s formula may not yield stable results. In this paper, a closed system for option pricing and data extraction under the classical regime-switching model is proposed with a special approach, splitting one market price into two “market-implied state prices”. The success of our approach hinges on transforming the recovery of the two local volatility functions into an optimal control problem, which is solved through Tikhonov regularization. In addition, an efficient algorithm is proposed to obtain the optimal solution by iteration. Our numerical experiments show that different shapes of local volatility functions can be accurately and stably recovered with the newly proposed algorithm, and the algorithm also works quite well with real market data.
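As a rough illustration of the regularization step (the notation here is schematic and not taken from the paper), the two state-wise local volatility functions \(\sigma_1\) and \(\sigma_2\) would be recovered by minimizing a Tikhonov-type functional of the form

\[
J(\sigma_1,\sigma_2) = \sum_i \big(V_i^{\mathrm{model}}(\sigma_1,\sigma_2) - V_i^{\mathrm{state}}\big)^2 + \lambda\big(\lVert\nabla\sigma_1\rVert^2 + \lVert\nabla\sigma_2\rVert^2\big),
\]

where the \(V_i^{\mathrm{state}}\) are the market-implied state prices extracted in the splitting step and \(\lambda > 0\) is the regularization parameter that stabilizes the otherwise ill-posed recovery.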

2.
In data envelopment analysis (DEA), there are two principal methods for identifying and measuring congestion: those of Färe et al. [Färe R, Grosskopf S. When can slacks be used to identify congestion. An answer to W. W. Cooper, L. Seiford, J. Zhu. Socio-Economic Planning Sciences 2001;35:1–10] and Cooper et al. [Cooper WW, Deng H, Huang ZM, Li SX. A one-model approach to congestion in data envelopment analysis. Socio-Economic Planning Sciences 2002;36:231–8]. In the present paper, we focus on the latter work in proposing a new method that requires considerably less computation. Then, by proving a selected theorem, we show that our proposed methodology is indeed equivalent to that of Cooper et al.

3.
The stochastic-alpha-beta-rho (SABR) model introduced by Hagan et al. (2002) provides a popular vehicle to model the implied volatilities in the interest rate and foreign exchange markets. To exclude arbitrage opportunities, we need to specify an absorbing boundary at zero for this model, which the existing analytical approaches to pricing derivatives under the SABR model typically ignore. This paper develops closed-form approximations to the prices of vanilla options to incorporate the effect of such a boundary condition. Different from the traditional normal distribution-based approximations, our method stems from an expansion around a one-dimensional Bessel process. Extensive numerical experiments demonstrate its accuracy and efficiency. Furthermore, the explicit expression yielded from our method is appealing from the practical perspective because it can lead to fast calibration, pricing, and hedging.
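For reference, the SABR dynamics of Hagan et al. (2002) are commonly written as

\[
dF_t = \alpha_t F_t^{\beta}\, dW_t, \qquad d\alpha_t = \nu\, \alpha_t\, dZ_t, \qquad d\langle W, Z\rangle_t = \rho\, dt, \qquad 0 \le \beta \le 1,
\]

and the boundary condition discussed above amounts to making \(F = 0\) an absorbing state, which the standard normal-distribution-based expansions do not enforce.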

4.
The spatial dependence of assets, which relates to similarities in economic, political, or cultural systems and other aspects, has been confirmed through empirical research; however, spatial dependence has rarely been applied to financial risk measurement. To fill this gap in the literature, a dynamic spatial GARCH-copula (sGC) model is proposed in this paper to evaluate the portfolio risk of international stock indices. In this model, a spatial GARCH is used as the marginal distribution and a vine copula is adopted as the joint distribution of the indices. Then, the proposed model is applied empirically to assess portfolio risk. Results show that, first, the proposed risk prediction model with spatial dependence outperforms a model neglecting spatial effects per the Kupiec test, Z test and Christoffersen test. Risk prediction during periods of economic stability is also more accurate than during times of crisis. Second, risk measures for models with spatial dependence are higher than those without such dependence but lower than for vine copula models. Third, models including either spatial dependence or vine copulas alone exhibit relatively poor performance. Fourth, the model involving extreme value theory (EVT) generates the largest value-at-risk estimates that still pass the Kupiec test, Z test and Christoffersen test; however, the negative estimates of the shape parameters indicate that EVT is not well suited to characterizing these international indices. Findings offer important implications for personal investors, institutional investors, and national regulatory authorities.

5.
This paper shows consistency of a two-step estimation of the factors in a dynamic approximate factor model when the panel of time series is large (n large). In the first step, the parameters of the model are estimated from an OLS on principal components. In the second step, the factors are estimated via the Kalman smoother. The analysis develops the theory for the estimator considered in Giannone et al. (2004) and Giannone et al. (2008) and for the many empirical papers using this framework for nowcasting.
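A minimal numpy sketch of the generic two-step procedure (PCA/OLS in step one, Kalman smoothing in step two) is given below. The function name, the diffuse-ish initialization, and the VAR(1) factor dynamics are illustrative assumptions, not the exact specification of the cited papers.

```python
import numpy as np

def two_step_factors(X, r):
    """Step 1: estimate loadings and a factor VAR(1) by PCA/OLS.
    Step 2: re-estimate the factors with a Kalman filter + RTS smoother.
    X is a (T, n) standardized panel, r the number of factors. Illustrative only."""
    T, n = X.shape
    # --- Step 1: principal components and OLS parameter estimates ---
    U, S, _ = np.linalg.svd(X, full_matrices=False)
    F = U[:, :r] * S[:r]                                   # PCA factor estimates (T, r)
    Lam = np.linalg.lstsq(F, X, rcond=None)[0].T           # loadings (n, r)
    A = np.linalg.lstsq(F[:-1], F[1:], rcond=None)[0].T    # VAR(1) transition (r, r)
    Q = np.atleast_2d(np.cov((F[1:] - F[:-1] @ A.T).T))    # factor innovation covariance
    R = np.diag(np.var(X - F @ Lam.T, axis=0))             # idiosyncratic variances
    # --- Step 2: Kalman filter and Rauch-Tung-Striebel smoother ---
    f, P = np.zeros(r), 10.0 * np.eye(r)                   # simple prior for the state
    f_pred, P_pred = np.zeros((T, r)), np.zeros((T, r, r))
    f_filt, P_filt = np.zeros((T, r)), np.zeros((T, r, r))
    for t in range(T):
        f, P = A @ f, A @ P @ A.T + Q                      # prediction step
        f_pred[t], P_pred[t] = f, P
        K = P @ Lam.T @ np.linalg.inv(Lam @ P @ Lam.T + R) # Kalman gain
        f = f + K @ (X[t] - Lam @ f)                       # measurement update
        P = P - K @ Lam @ P
        f_filt[t], P_filt[t] = f, P
    f_smooth = f_filt.copy()
    for t in range(T - 2, -1, -1):                         # backward smoothing pass
        J = P_filt[t] @ A.T @ np.linalg.inv(P_pred[t + 1])
        f_smooth[t] = f_filt[t] + J @ (f_smooth[t + 1] - f_pred[t + 1])
    return f_smooth                                        # (T, r) smoothed factors
```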

6.
In this paper, we implement the conditional difference asymmetry model (CDAS) for square tables with nominal categories proposed by Tomizawa et al. (J. Appl. Stat. 31(3): 271–277, 2004), using the non-standard log-linear model formulation approach. The implementation is carried out by refitting the model to the 3 × 3 table in Tomizawa et al. (2004). We extend this approach to a larger 4 × 4 table of religious affiliation. We further calculate the measure of asymmetry along with its asymptotic standard error and confidence bounds. The procedure is implemented with SAS PROC GENMOD but can also be implemented in SPSS by following the discussion in Lawal (J. Appl. Stat. 31(3): 279–303, 2004) and Lawal (Qual. Quant. 38(3): 259–289, 2004).

7.
This paper re-examines a problem of congested inputs in the Chinese automobile and textile industries, which was identified by Cooper et al. [Cooper WW, Deng H, Gu B, Li S, Thrall RM. Using DEA to improve the management of congestion in Chinese industries (1981-1997). Socio-Economic Planning Sciences 2001;35:227-242]. Since these authors employed a single approach in measuring congestion, it is worth exploring whether alternative procedures would yield very different outcomes. Indeed, the measurement of congestion is an area where there has been much theoretical debate but relatively little empirical work. After examining the theoretical properties of the two main approaches currently available, those of Färe et al. [Färe R, Grosskopf S, Lovell CAK. The measurement of efficiency of production. Boston: Kluwer-Nijhoff; 1985] and Cooper et al., we use the data set assembled by Cooper et al. for the period 1981-1997 to compare and contrast the measurements of congestion generated by these alternative approaches. We find that the results are strikingly different, especially in terms of the amount of congestion identified. Finally, we discuss the new approach to measuring congestion proposed by Tone and Sahoo [Tone K, Sahoo BK. Degree of scale economies and congestion: a unified DEA approach. European Journal of Operational Research 2004;158:755-772].

8.
In a very influential model with internal habits, Carroll et al. (2017, 2000) establish that an increase in economic growth may cause a positive change in savings. The optimality of this result, and of many other contributions using a similar framework, has been questioned by some authors who have observed that the parametrization used in these models always implies a utility function that is not jointly concave in consumption and habits. In this paper, we revisit the optimality issue and, using advanced techniques in dynamic programming, we answer the following long-standing open questions: (i) Is the solution found in Carroll et al. (2017, 2000) optimal? (ii) Is it also unique, or do other optimal solutions exist?

9.
We characterize preference relations on continuous-time consumption paths which admit an exponential discounting representation. We provide two such theorems, one in the cardinal framework and another in the ordinal framework. Our characterizations parallel the known characterizations in the discrete-time framework. In the cardinal framework, we adopt the axioms of Epstein (1983), which characterize a stationary preference relation in discrete time, and obtain the exponential discounting model as a special case of the discounting model proposed by Uzawa (1968). In the ordinal framework, we adopt the axioms of Bleichrodt et al. (2008), which were proposed to generalize Koopmans’ classical characterization of stationary preferences.
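In this setting, an exponential discounting representation means (in standard notation, not necessarily the authors') that the preference relation over consumption paths \(c\) is represented by

\[
U(c) = \int_0^{\infty} e^{-\rho t}\, u\big(c(t)\big)\, dt, \qquad \rho > 0,
\]

for a fixed instantaneous utility \(u\); constancy of the discount rate \(\rho\) is what delivers stationarity, i.e. delaying two paths by the same amount never reverses their ranking.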

10.
We report a surprising link between optimal portfolios generated by a special type of variational preferences called divergence preferences (see Maccheroni et al., 2006) and optimal portfolios generated by classical expected utility. As a special case, we connect optimization of truncated quadratic utility (see Černý, 2003) to the optimal monotone mean–variance portfolios (see Maccheroni et al., 2009), thus simplifying the computation of the latter.

11.
The aim of this study is to confirm the factorial structure of the Identification-Commitment Inventory (ICI) developed within the frame of the Human System Audit (HSA) (Quijano et al. in Revist Psicol Soc Apl 10(2):27–61, 2000; Pap Psicól Revist Col Of Psicó 29:92–106, 2008). Commitment and identification are understood by the HSA at an individual level as part of the quality of human processes and resources in an organization, and therefore as antecedents of important organizational outcomes, such as personnel turnover intentions, organizational citizenship behavior, etc. (Meyer et al. in J Org Behav 27:665–683, 2006). The theoretical integrative model which underlies the ICI (Quijano et al. 2000) was tested in a sample (N = 625) of workers in a Spanish public hospital. Confirmatory factor analysis through structural equation modeling was performed. The elliptical least squares solution was chosen as the estimation procedure on account of the non-normal distribution of the variables. The results confirm the goodness of fit of an integrative model, which underlies the relation between Commitment and Identification, although each one is operatively different.

12.
This paper proposes a method for estimating national standardizations of partially speeded tests composed of items from a previously standardized item bank. The model combines two submodels, one for whether the examinee reaches the item, and the second for whether she is successful if she does reach it. The former is comparable to a survival-analysis model using item position in the test as a quasi-time parameter (Hutchison 1988). The latter is a straightforward Rasch model. Combining the two submodels allows for the possibility that ability and drop-out are correlated. The model proposed here is superior to that of Bolt et al. (2002), which divides the population into a speeded and a non-speeded group, in that it allows for a range of speededness effects. The model is tested on one outcome using three UK national standardizations, comparing the actual and predicted distributions. It is suggested that the observed discrepancies may be due to differences in the samples drawn, and that in some circumstances the model may actually produce a better estimate than an actual standardization exercise.
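One schematic way to write the two submodels (the notation is illustrative, not the paper's): with \(\theta_j\) the ability of examinee \(j\) and \(b_i\) the difficulty of item \(i\),

\[
\Pr(X_{ij} = 1 \mid \text{item } i \text{ reached}) = \frac{\exp(\theta_j - b_i)}{1 + \exp(\theta_j - b_i)}, \qquad \Pr(\text{drop out at position } k \mid \text{reached } k) = h_j(k),
\]

where the first equation is the Rasch submodel and the discrete-time hazard \(h_j(k)\) is the survival-type submodel with item position as quasi-time; allowing \(\theta_j\) and \(h_j\) to be dependent captures the correlation between ability and drop-out.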

13.
The quasi-Monte Carlo (QMC) method is an efficient technique for numerical integration. QMC attains a convergence rate of O((ln n)^d / n), compared with the O(n^{-1/2}) rate of standard Monte Carlo (MC), where n is the number of simulations and d the nominal problem dimension. However, some studies in the literature have claimed that QMC outperforms MC only for dimensions up to about 20–30, because of the dependence of its rate on d. Caflisch et al. (J Comput Finance 1(1):27–46, 1997) proposed extending the superiority of QMC to higher dimensions through ANOVA (effective-dimension) considerations. To this aim, we consider the Asian basket option pricing problem, where d is much higher than 30, by QMC simulation. We investigate the applicability of several path-generation constructions that have been proposed to overcome this dimensional drawback. We employ principal component analysis, the linear transformation and the Kronecker product approximation, and test their performance both in terms of computational cost and accuracy. Finally, we compare the results with those obtained by standard MC.
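A small illustrative comparison of plain MC and scrambled Sobol' QMC on an arithmetic-average Asian call (single asset, standard sequential path construction rather than the PCA, linear transformation or Kronecker constructions studied in the paper; all parameter values are arbitrary):

```python
import numpy as np
from scipy.stats import norm, qmc

def asian_call_payoff(u, s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0):
    """Discounted arithmetic-average Asian call payoff from uniforms u of shape (n, d)."""
    n, d = u.shape
    dt = t / d
    z = norm.ppf(np.clip(u, 1e-12, 1.0 - 1e-12))        # uniforms -> standard normals
    log_inc = (r - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z
    paths = s0 * np.exp(np.cumsum(log_inc, axis=1))     # GBM at the d monitoring dates
    return np.exp(-r * t) * np.maximum(paths.mean(axis=1) - k, 0.0)

d, m = 64, 14                                           # nominal dimension d >> 30, 2**m points
mc = asian_call_payoff(np.random.default_rng(0).random((2 ** m, d))).mean()
qp = asian_call_payoff(qmc.Sobol(d=d, scramble=True, seed=0).random_base2(m)).mean()
print(f"plain MC: {mc:.4f}   scrambled Sobol QMC: {qp:.4f}")
```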

14.
We propose a Bayesian nonparametric model to estimate rating migration matrices and default probabilities using the reinforced urn processes (RUP) introduced in Muliere et al. (2000). The estimated default probability becomes our prior information in a parametric model for the prediction of the number of bankruptcies, with the only assumption of exchangeability within rating classes. The Pólya urn construction of the transition matrix justifies a Beta-distributed de Finetti measure. Dependence among the processes is introduced through the dependence among the default probabilities, with the bivariate beta distribution proposed in Olkin and Liu (2003) and its multivariate generalization.
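A toy simulation of the basic Pólya urn mechanism underlying the construction (it only illustrates why the limiting default fraction is Beta-distributed; it is not the full reinforced urn process of Muliere et al., 2000):

```python
import numpy as np

def polya_urn_fractions(a, b, n_draws, n_urns, rng):
    """Simulate n_urns independent Polya urns started with a 'default' and b 'no-default'
    balls; each drawn ball is replaced together with one extra ball of the same colour.
    Returns the final fraction of 'default' balls, whose limit is Beta(a, b)."""
    defaults, total = np.full(n_urns, float(a)), float(a + b)
    for _ in range(n_draws):
        drew_default = rng.random(n_urns) < defaults / total
        defaults += drew_default            # reinforcement of the drawn colour
        total += 1.0
    return defaults / total

frac = polya_urn_fractions(a=1, b=9, n_draws=4000, n_urns=20000, rng=np.random.default_rng(1))
print(frac.mean(), frac.var())              # close to Beta(1, 9): mean 0.1, variance 9/1100
```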

15.
In this paper we consider the issue of unit root testing in cross-sectionally dependent panels. We consider panels that may be characterized by various forms of cross-sectional dependence including (but not limited to) the popular common factor framework. We consider block bootstrap versions of the group-mean (Im et al., 2003) and the pooled (Levin et al., 2002) unit root coefficient DF tests for panel data, originally proposed for a setting of no cross-sectional dependence beyond a common time effect. The tests, suited for testing for unit roots in the observed data, can be easily implemented as no specification or estimation of the dependence structure is required. Asymptotic properties of the tests are derived for T going to infinity and N finite. Asymptotic validity of the bootstrap tests is established in very general settings, including the presence of common factors and cointegration across units. Properties under the alternative hypothesis are also considered. In a Monte Carlo simulation, the bootstrap tests are found to have rejection frequencies that are much closer to nominal size than the rejection frequencies for the corresponding asymptotic tests. The power properties of the bootstrap tests appear to be similar to those of the asymptotic tests.
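The resampling idea can be sketched as follows: time blocks of entire cross-sections are drawn with replacement, so whatever cross-sectional dependence is present is carried into each bootstrap sample without ever being specified. The snippet below is a simplified illustration with a hypothetical helper name; the actual tests rebuild unit-root series from resampled differences and recompute the group-mean and pooled statistics on each replicate.

```python
import numpy as np

def moving_block_bootstrap(panel, block_len, rng):
    """One moving-block bootstrap replicate of a (T, N) panel: time blocks of whole
    cross-sections are resampled jointly, preserving cross-sectional dependence."""
    T, _ = panel.shape
    n_blocks = int(np.ceil(T / block_len))
    starts = rng.integers(0, T - block_len + 1, size=n_blocks)
    idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:T]
    return panel[idx]

rng = np.random.default_rng(0)
d_y = rng.standard_normal((200, 10))        # stand-in for first differences of the observed panel
replicate = np.cumsum(moving_block_bootstrap(d_y, block_len=10, rng=rng), axis=0)
```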

16.
This paper proposes a new method to empirically validate simulation models that generate artificial time series data comparable with real-world data. The approach is based on comparing structures of vector autoregression models which are estimated from both artificial and real-world data by means of causal search algorithms. This relatively simple procedure is able to tackle both the problem of confronting theoretical simulation models with the data and the problem of comparing different models in terms of their empirical reliability. Moreover, the paper provides an application of the validation procedure to the agent-based macroeconomic model proposed by Dosi et al. (2015).

17.
We present a Bayesian approach for analyzing aggregate level sales data in a market with differentiated products. We consider the aggregate share model proposed by Berry et al. [Berry, Steven, Levinsohn, James, Pakes, Ariel, 1995. Automobile prices in market equilibrium. Econometrica 63(4), 841–890], which introduces a common demand shock into an aggregated random coefficient logit model. A full likelihood approach is possible with a specification of the distribution of the common demand shock. We introduce a reparameterization of the covariance matrix to improve the performance of the random walk Metropolis for covariance parameters. We illustrate the usefulness of our approach with both actual and simulated data. Sampling experiments show that our approach performs well relative to the GMM estimator even in the presence of a mis-specified shock distribution. We view our approach as useful for those who are willing to trade off one additional distributional assumption for increased efficiency in estimation.
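In the notation commonly used for this aggregate share model, the market share of product \(j\) in market \(t\) is

\[
s_{jt} = \int \frac{\exp\big(\delta_{jt} + \mu_{jt}(\nu)\big)}{1 + \sum_{k}\exp\big(\delta_{kt} + \mu_{kt}(\nu)\big)}\, dF(\nu), \qquad \delta_{jt} = x_{jt}'\beta - \alpha p_{jt} + \xi_{jt},
\]

where \(\nu\) denotes the random coefficients and \(\xi_{jt}\) is the common demand shock; specifying a distribution for \(\xi_{jt}\) is what makes the full-likelihood Bayesian treatment described above possible.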

18.
DSGE models are useful tools for evaluating the impact of policy changes, but their use for (short-term) forecasting is still in its infancy. Besides theory-based restrictions, the timeliness of data is an important issue. Since DSGE models are based on quarterly data, they suffer from the publication lag of quarterly national accounts. In this paper we present a framework for the short-term forecasting of GDP based on a medium-scale DSGE model for a small open economy within a currency area. We utilize the information available in monthly indicators based on the approach proposed by Giannone et al. (2009). Using Austrian data, we find that the forecasting performance of the DSGE model can be improved considerably by incorporating monthly indicators, while still maintaining the story-telling capability of the model.

19.
We extract elliptically symmetric principal components from a panel of 17 OECD exchange rates and use the deviations from the components to forecast future exchange rate movements, following the method in Engel et al. (2015). Instead of using standard factor models, we apply elliptically symmetric principal component analysis (ESPCA), introduced by Solat and Spanos (2018), which captures both contemporaneous and temporal co-variation among the exchange rates. We find that ESPCA produces more accurate forecasts than existing standard methods and the random walk model, with or without macroeconomic fundamentals included.

20.
Macroeconometric data often come in the form of large panels of time series, themselves decomposing into smaller but still quite large subpanels or blocks. We show how the dynamic factor analysis method proposed in Forni et al. (2000), combined with the identification method of Hallin and Liška (2007), allows for identifying and estimating joint and block-specific common factors. This leads to a more sophisticated analysis of the structures of dynamic interrelations within and between the blocks in such datasets, along with an informative decomposition of explained variances. The method is illustrated with an analysis of a dataset of Industrial Production Indices for France, Germany, and Italy.
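Schematically (in generic factor-model notation rather than that of the cited papers), an observation \(i\) in block \(b\) is decomposed as

\[
x^{(b)}_{it} = \lambda_i' F_t + \gamma_i' G^{(b)}_t + \xi^{(b)}_{it},
\]

where \(F_t\) collects the factors common to all blocks, \(G^{(b)}_t\) the block-specific common factors, and \(\xi^{(b)}_{it}\) the idiosyncratic components; the variance decomposition mentioned above apportions explained variance across these three layers.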

