Similar Documents
20 similar documents found
1.
This paper examines the liquidity of electronic stock markets by applying a sequential estimation approach to volume-duration models with increasing threshold values. A modified ACD model with a Box–Tukey transformation and a flexible generalized beta distribution is proposed to capture the changing cluster structure of duration processes. Estimation results with German XETRA data reveal the market's absorption limit for high volumes of shares, which raises the time costs of illiquidity when trading such quantities.
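
For readers unfamiliar with ACD-type duration models, a minimal sketch of a plain ACD(1,1) simulation follows. The paper's modified model (Box–Tukey transformation, generalized beta innovations) is more elaborate; the parameter values below are illustrative assumptions.

```python
import numpy as np

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """Simulate a basic ACD(1,1) duration process:
    x_t = psi_t * eps_t, psi_t = omega + alpha*x_{t-1} + beta*psi_{t-1},
    with unit-exponential innovations eps_t."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    psi = omega / (1.0 - alpha - beta)   # start at the unconditional mean
    x_prev = psi
    for t in range(n):
        psi = omega + alpha * x_prev + beta * psi
        x[t] = psi * rng.exponential(1.0)
        x_prev = x[t]
    return x

durations = simulate_acd(5000)
print(durations.mean())  # close to omega / (1 - alpha - beta) = 1.0
```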

2.
This paper studies the parameter estimation problem for Ornstein–Uhlenbeck stochastic volatility models driven by Lévy processes. Estimation is regarded as the principal challenge in applying these models since they were proposed by Barndorff-Nielsen and Shephard [J. R. Stat. Soc. Ser. B, 2001, 63(2), 167–241]. Most previous work has used a Bayesian paradigm, whereas we treat the problem in the framework of maximum likelihood estimation, applying gradient-based simulation optimization. A hidden Markov model is introduced to formulate the likelihood of observations; sequential Monte Carlo is applied to sample the hidden states from the posterior distribution; smooth perturbation analysis is used to deal with the discontinuities introduced by jumps in estimating the gradient. Numerical experiments indicate that the proposed gradient-based simulated maximum likelihood estimation approach provides an efficient alternative to current estimation methods.
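
To illustrate the sequential Monte Carlo step, here is a minimal bootstrap particle filter that evaluates the likelihood of a discretized Gaussian OU log-volatility model. The paper's Lévy-driven specification, gradient estimation, and smooth perturbation analysis are omitted; all parameter values here are illustrative assumptions.

```python
import numpy as np

def particle_filter_loglik(y, kappa=0.5, theta=-1.0, sigma=0.3,
                           n_particles=1000, seed=0):
    """Log-likelihood of returns y under a discretized OU log-volatility model:
    h_t = h_{t-1} + kappa*(theta - h_{t-1}) + sigma*eta_t,  y_t ~ N(0, exp(h_t)).
    Bootstrap particle filter with multinomial resampling."""
    rng = np.random.default_rng(seed)
    h = np.full(n_particles, theta)      # start particles at the OU mean
    loglik = 0.0
    for yt in y:
        h = h + kappa * (theta - h) + sigma * rng.standard_normal(n_particles)
        # Gaussian observation log-density N(0, exp(h))
        logw = -0.5 * (np.log(2 * np.pi) + h + yt**2 * np.exp(-h))
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())   # log of the average weight
        h = rng.choice(h, size=n_particles, p=w / w.sum())  # resample
    return loglik

y = np.random.default_rng(1).standard_normal(200) * np.exp(-0.5)
print(particle_filter_loglik(y))
```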

3.
Parameter estimation and statistical inference are challenging problems for stochastic volatility (SV) models, especially those driven by pure jump Lévy processes. Maximum likelihood estimation (MLE) is usually preferred when a parametric statistical model is correctly specified, but traditional MLE implementation for SV models is computationally infeasible due to the high dimensionality of the integral involved. To overcome this difficulty, we propose a gradient-based simulated MLE method under the hidden Markov structure for SV models, which covers those driven by pure jump Lévy processes. Gradient estimation based on characteristic functions and sequential Monte Carlo simulation of the hidden states are both implemented. Numerical experiments illustrate the efficiency of the proposed method.

4.
The Pareto distribution plays a central role in many areas of econometrics. We first consider sequential point estimation problems for the scale parameter of a Pareto distribution. Under a very general loss structure, we derive several asymptotic results regarding the associated "risk" and "regret" functions. We then consider the problem of constructing a fixed-ratio confidence interval for the scale parameter and propose various sampling techniques to achieve this goal. Since most of our theoretical findings are asymptotic in nature for either problem, we present extensive simulation studies to examine the moderate-sample performance of all the procedures. The findings in the point estimation problem also fill many important gaps left in Wang (1973).
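
As an illustration of sequential estimation for the Pareto scale, here is a purely sequential stopping rule for a fixed-ratio interval [min(X)/r, min(X)], assuming the classical Pareto parametrization. This is a textbook-style sketch, not the paper's exact procedure; the ratio, coverage level, and pilot sample size are assumptions.

```python
import numpy as np

def sequential_pareto_scale(draw, ratio=1.1, delta=0.05, n0=10):
    """Purely sequential rule for a fixed-ratio interval [min(X)/ratio, min(X)]
    for the Pareto scale sigma. With shape a, exact coverage is
    P(min(X) <= sigma*ratio) = 1 - ratio**(-n*a), so we stop once
    n >= log(1/delta) / (a_hat * log(ratio)), where a_hat is the
    shape MLE from the data observed so far."""
    xs = [draw() for _ in range(n0)]
    while True:
        x = np.array(xs)
        x_min = x.min()
        a_hat = len(x) / np.log(x / x_min).sum()   # shape MLE
        if len(x) >= np.log(1 / delta) / (a_hat * np.log(ratio)):
            return x_min / ratio, x_min, len(x)
        xs.append(draw())

rng = np.random.default_rng(0)
draw = lambda: 2.0 * (1.0 + rng.pareto(1.5))   # classical Pareto: shape 1.5, scale 2
print(sequential_pareto_scale(draw))            # interval bracketing sigma = 2, and n
```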

5.

Stochastic approximation is a powerful tool for sequential estimation of the zero points of a function. This methodology is defined and shown to be related to a broad class of credibility formulae derived for the Exponential Dispersion Family (EDF). We further consider a rich Location Dispersion Family (LDF) for which no simple credibility formula exists. For this case, a Generalized Sequential Credibility Formula is suggested and an optimal stepwise gain sequence is derived.
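
A minimal Robbins–Monro sketch of the stochastic approximation recursion the abstract refers to; the credibility application in the paper replaces the simple a/(k+1) gain used here with an optimal stepwise gain sequence.

```python
import numpy as np

def robbins_monro(noisy_g, theta0=0.0, n_iter=5000, a=1.0, seed=0):
    """Robbins-Monro recursion theta_{k+1} = theta_k - (a/(k+1)) * G(theta_k),
    where G gives unbiased noisy evaluations of the target function g;
    under standard conditions theta_k converges to the root of g."""
    rng = np.random.default_rng(seed)
    theta = theta0
    for k in range(n_iter):
        theta -= a / (k + 1) * noisy_g(theta, rng)
    return theta

# Example: find the root of g(theta) = theta - 3 from noisy evaluations.
root = robbins_monro(lambda th, rng: (th - 3.0) + rng.standard_normal())
print(root)  # close to 3
```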

6.
Adoption of Internet banking often follows on from usage of Internet shopping, but policies to increase Internet banking use typically ignore this ordering. This article presents a case study that underscores this sequence of Internet service adoption and identifies factors that shape the propensity to use the Internet for shopping and banking. Application of bivariate probit regression techniques to data sourced from a survey of 259 respondents in Athens, Greece, and estimation of marginal effects of the determinants of Internet banking use conditioned on the determinants of Internet shopping use illustrate that ignoring the sequence of Internet service use can lead to incorrect policy recommendations. This article contributes to the literature by theorising the underlying causal mechanisms of Internet banking adoption and presenting supporting evidence via a sequential modelling approach. We find that personal capacity is an important determinant of Internet banking use in a standard, non-sequential approach but it has no significant effect when the model is sequential. Our results suggest that policymakers should emphasise usefulness attributes of Internet banking when attempting to increase Internet banking usage by people who already use the Internet for shopping.

7.
This paper introduces nonlinear threshold time series modeling techniques that actuaries can use in pricing insurance products, analyzing the results of experience studies, and forecasting actuarial assumptions. Basic "self-exciting" threshold autoregressive (SETAR) models, as well as heteroscedastic and multivariate SETAR processes, are discussed. Modeling techniques for each class of models are illustrated through actuarial examples. The methods that are described in this paper have the advantage of being direct and transparent. The sequential and iterative steps of tentative specification, estimation, and diagnostic checking parallel those of the orthodox Box-Jenkins approach for univariate time series analysis.
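
A minimal sketch of the specification/estimation step for a basic two-regime SETAR(1): least-squares fits with a grid search over the threshold. The simulated series, coefficients, and trimming fraction are illustrative assumptions.

```python
import numpy as np

def fit_setar1(y, trim=0.15):
    """Least-squares fit of a two-regime SETAR(1) with delay 1:
    y_t = a1 + b1*y_{t-1} + e_t if y_{t-1} <= r, else a2 + b2*y_{t-1} + e_t.
    The threshold r is picked by grid search minimizing the pooled SSR,
    trimmed so each regime keeps enough observations."""
    x, ynow = y[:-1], y[1:]
    grid = np.quantile(x, np.linspace(trim, 1 - trim, 50))
    best_ssr, best_r = np.inf, None
    for r in grid:
        ssr = 0.0
        for mask in (x <= r, x > r):
            X = np.column_stack([np.ones(mask.sum()), x[mask]])
            beta, *_ = np.linalg.lstsq(X, ynow[mask], rcond=None)
            ssr += ((ynow[mask] - X @ beta) ** 2).sum()
        if ssr < best_ssr:
            best_ssr, best_r = ssr, r
    return best_r

# Simulate a SETAR(1) with threshold 0 and recover it.
rng = np.random.default_rng(0)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = (0.5 * y[t-1] if y[t-1] <= 0 else -0.4 * y[t-1]) + rng.standard_normal()
print(fit_setar1(y))  # near 0
```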

8.
This paper introduces the class of Bayesian infinite mixture time series models first proposed in Lau & So (2004) for modelling long-term investment returns. This flexible class of time series models incorporates the full information contained in autoregressive components of various orders through Bayesian averaging or mixing. We adopt a Bayesian sampling scheme based on a weighted Chinese restaurant process for generating partitions of investment returns to estimate the models. Instead of using point estimates, as in the classical or non-Bayesian approach, estimation is performed by a full Bayesian approach that averages over the posterior distributions of the random parameters, providing a natural way to incorporate model risk or uncertainty. The proposed models can also be used to cluster investment returns and detect outlying returns. We employ monthly data from the Toronto Stock Exchange 300 (TSE 300) indices to illustrate the implementation of our models and compare the simulated results from the estimated models with the empirical characteristics of the TSE 300 data. We apply the Bayesian predictive distribution of the logarithmic returns, obtained by Bayesian averaging or mixing, to evaluate quantile-based and conditional tail expectation risk measures for segregated fund contracts via stochastic simulation. We compare the risk measures evaluated from our models with those from some well-known and important models in the literature, and highlight some distinctive features of our models.

9.
For the estimation problem of mean-variance optimal portfolio weights, several previous studies have proposed applying Stein type estimators. However, few studies have addressed this problem analytically. Since the loss function used in this problem is not of the quadratic type common in statistical studies, showing general dominance results analytically has been difficult. Dominance results are given here for a class of Stein type estimators of the mean-variance optimal portfolio weights when the covariance matrix is unknown and must be estimated. The class of estimators is broader than the one given in a previous study. Our results clarify the conditions under which some previously proposed estimators in finance have smaller risk than the plug-in estimator based on sample estimates.
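
To illustrate the Stein-type idea, here is a plug-in mean-variance weight estimator with a positive-part James–Stein shrinkage of the sample mean toward the grand mean. This is one common construction under assumed parameter values, not the paper's exact estimator class.

```python
import numpy as np

def shrunk_mv_weights(returns, gamma=5.0):
    """Plug-in mean-variance weights w = (1/gamma) * Sigma^{-1} mu, with a
    positive-part James-Stein shrinkage of the sample mean toward the grand
    mean to reduce estimation risk (illustrative construction)."""
    T, p = returns.shape
    mu = returns.mean(axis=0)
    S = np.cov(returns, rowvar=False)
    Sinv = np.linalg.inv(S)
    grand = mu.mean() * np.ones(p)
    d2 = (mu - grand) @ Sinv @ (mu - grand)        # Mahalanobis distance
    shrink = max(0.0, 1.0 - (p - 2) / (T * d2))    # positive-part JS factor
    mu_js = grand + shrink * (mu - grand)
    return Sinv @ mu_js / gamma

rng = np.random.default_rng(0)
R = rng.multivariate_normal([0.05, 0.03, 0.04], 0.04 * np.eye(3), size=120)
print(shrunk_mv_weights(R))
```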

10.
11.
We show how the equilibrium restrictions implied by standard search models can be used to estimate search‐cost distributions using price data alone. We consider both sequential and non‐sequential search strategies, and develop estimation methodologies that exploit equilibrium restrictions to recover estimates of search‐cost heterogeneity that are theoretically consistent with the search models. We illustrate the method using online prices for several economics and statistics textbooks.

12.
This paper discusses how conditional heteroskedasticity models can be estimated efficiently without imposing strong distributional assumptions such as normality. Using the generalized method of moments (GMM) principle, we show that for a class of models with a symmetric conditional distribution, the GMM estimates obtained from the joint estimating equations corresponding to the conditional mean and variance of the model are efficient when the instruments are chosen optimally. A simple ARCH(1) model is used to illustrate the feasibility of the proposed estimation procedure.
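
A minimal sketch of the GMM idea for an ARCH(1) model, using the simple instruments z_t = (1, y_{t-1}^2) rather than the optimally chosen instruments the paper discusses; the parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def gmm_arch1(y):
    """Just-identified GMM for ARCH(1): y_t = e_t, Var(y_t|F_{t-1}) = w + a*y_{t-1}^2.
    Moment conditions E[(y_t^2 - w - a*y_{t-1}^2) z_t] = 0 with instruments
    z_t = (1, y_{t-1}^2); minimize the quadratic form g'g."""
    y2, y2lag = y[1:] ** 2, y[:-1] ** 2

    def obj(theta):
        w, a = theta
        u = y2 - w - a * y2lag                    # variance residual
        g = np.array([u.mean(), (u * y2lag).mean()])
        return g @ g

    res = minimize(obj, x0=[0.1, 0.1], bounds=[(1e-8, None), (0.0, 0.999)])
    return res.x

# Simulate ARCH(1) with w = 0.2, a = 0.5 and re-estimate.
rng = np.random.default_rng(0)
n, w, a = 5000, 0.2, 0.5
y = np.zeros(n)
for t in range(1, n):
    y[t] = np.sqrt(w + a * y[t-1] ** 2) * rng.standard_normal()
print(gmm_arch1(y))  # near (0.2, 0.5)
```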

13.
For estimating integrated volatility and covariance using high-frequency data, Kunitomo and Sato (Math Comput Simul 81:1272–1289, 2011; N Am J Econ Finance 26:289–309, 2013) proposed the separating information maximum likelihood (SIML) method in the presence of micro-market noise. The SIML estimator has reasonable finite-sample properties, and desirable asymptotic properties as the sample size grows, when the hidden efficient price process follows a Brownian semi-martingale. We show that SIML estimation is useful for estimating the integrated covariance and hedging coefficient when there are round-off errors, micro-market price adjustments and noise, and when the high-frequency data are randomly sampled. Under a set of reasonable assumptions, the SIML estimator is consistent and asymptotically normal in the stable-convergence sense, and it retains reasonable finite-sample properties in the presence of these effects.

14.
Lévy subordinated hierarchical Archimedean copulas (LSHAC) are flexible models for high-dimensional dependence modeling. However, the literature on their applications is limited, largely due to the challenges of estimating their structures and parameters. In this paper, we propose a three-stage estimation procedure to determine the hierarchical structure and the parameters of a LSHAC. This is the first paper to empirically examine the modeling performance of LSHAC models using exchange traded funds. A simulation study demonstrates the reliability and robustness of the proposed estimation method in determining the optimal structure. Empirical analysis further shows that, compared to elliptical copulas, LSHACs fit better and deliver more accurate out-of-sample Value-at-Risk estimates with fewer parameters. In addition, from a financial risk management point of view, LSHACs have the advantage of being very flexible in modeling asymmetric tail dependence, providing more conservative estimates of the probabilities of extreme downward co-movements in the financial market.

15.
This paper finds statistically and economically significant out-of-sample portfolio benefits for an investor who uses models of return predictability when forming optimal portfolios. Investors must account for estimation risk and incorporate an ensemble of important features, including time-varying volatility and time-varying expected returns driven by payout yield measures that include share repurchase and issuance. Prior research documents a lack of benefits to return predictability, and our results suggest that this is largely due to omitting time-varying volatility and estimation risk. We also document the sequential process by which investors learn about parameters, state variables, and models as new data arrive.

16.
Suppose a seller wants to sell k similar or identical objects and there are n > k potential buyers, each of whom wants only one object. In this case, we suggest the use of a simultaneous auction that works as follows. Players are asked to submit sealed bids for one object. The individual with the highest bid chooses an object first; the individual with the second-highest bid chooses the next object; and this process continues until the individual with the kth-highest bid receives the last object. Each individual pays his or her own bid. When objects are identical, we show that the proposed auction generates the same revenue as a first-price sealed-bid sequential auction. When objects are perfectly correlated, there is no known solution for sequential auctions, whereas we can characterize bidding strategies in the proposed auction. Moreover, the proposed auction is optimal (given an appropriately chosen reserve price), and it may be easier and cheaper to run than a sequential auction.
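
A minimal sketch of the proposed auction's mechanics (allocation in bid order, each winner paying their own bid); equilibrium bidding strategies and reserve prices are not modeled here, and the example bids are assumptions.

```python
import numpy as np

def simultaneous_auction(bids, k):
    """Mechanics of the proposed simultaneous auction: the k highest bidders
    each receive one object (choosing in descending bid order) and each pays
    his or her own bid. Returns (winner indices in choosing order, revenue)."""
    order = np.argsort(bids)[::-1][:k]          # indices of the k highest bids
    revenue = float(np.asarray(bids)[order].sum())
    return order.tolist(), revenue

bids = [3.2, 7.5, 5.1, 6.0, 2.8]
print(simultaneous_auction(bids, k=2))          # bidders 1 and 3 win; revenue 13.5
```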

17.
Modeling the joint distribution of spot and futures returns is crucial for establishing optimal hedging strategies. This paper proposes a new class of dynamic copula-GARCH models that exploits information from high-frequency data for hedge ratio estimation. The copula theory facilitates constructing a flexible distribution; the inclusion of realized volatility measures constructed from high-frequency data enables copula forecasts to swiftly adapt to changing markets. By using data concerning equity index returns, the estimation results show that the inclusion of realized measures of volatility and correlation greatly enhances the explanatory power in the modeling. Moreover, the out-of-sample forecasting results show that the hedged portfolios constructed from the proposed model are superior to those constructed from the prevailing models in reducing the (estimated) conditional hedged portfolio variance. Finally, the economic gains from exploiting high-frequency data for estimating the hedge ratios are examined. It is found that hedgers obtain additional benefits by including high-frequency data in their hedging decisions; more risk-averse hedgers generate greater benefits.
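
The target quantity throughout is the minimum-variance hedge ratio h_t = Cov(Δs_t, Δf_t) / Var(Δf_t). A simple rolling-window benchmark version is sketched below; the paper's copula-GARCH model with realized measures replaces this moving-window estimate, and the window length and simulated data are assumptions.

```python
import numpy as np

def rolling_hedge_ratio(spot, fut, window=60):
    """Rolling minimum-variance hedge ratio h_t = Cov(ds, df) / Var(df),
    estimated with a simple moving window as a benchmark."""
    h = np.full(len(spot), np.nan)
    for t in range(window, len(spot)):
        s, f = spot[t - window:t], fut[t - window:t]
        h[t] = np.cov(s, f)[0, 1] / np.var(f, ddof=1)
    return h

rng = np.random.default_rng(0)
f = rng.standard_normal(500) * 0.01
s = 0.9 * f + rng.standard_normal(500) * 0.004   # true hedge ratio 0.9
print(np.nanmean(rolling_hedge_ratio(s, f)))      # near 0.9
```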

18.
Nonparametric Inference of Value-at-Risk for Dependent Financial Returns
The article considers nonparametric estimation of value-at-risk (VaR) and associated standard error estimation for dependent financial returns. Theoretical properties of the kernel VaR estimator are investigated in the context of dependence. The presence of dependence affects the variance of the VaR estimates and has to be taken into consideration in order to obtain adequate assessment of their variation. An estimation procedure for the standard errors is proposed based on kernel estimation of the spectral density of a derived series. The performance of the VaR estimators and the proposed standard error estimation procedure are evaluated by theoretical investigation, simulation of commonly used models for financial returns, and empirical studies on real financial return series.
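
A minimal sketch of the kernel VaR estimator: the p-quantile of a Gaussian-kernel-smoothed empirical CDF. The article's standard-error estimation under dependence via spectral density estimation is omitted, and the rule-of-thumb bandwidth is an assumption.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def kernel_var(losses, p=0.95, bandwidth=None):
    """Kernel estimator of VaR_p: the p-quantile of the smoothed CDF
    F_h(x) = mean(Phi((x - L_i)/h)), with a Gaussian kernel and a
    Silverman-style rule-of-thumb bandwidth."""
    L = np.asarray(losses)
    h = bandwidth or 1.06 * L.std(ddof=1) * len(L) ** (-0.2)
    F = lambda x: norm.cdf((x - L) / h).mean() - p   # changes sign at VaR_p
    return brentq(F, L.min() - 5 * h, L.max() + 5 * h)

rng = np.random.default_rng(0)
losses = rng.standard_t(df=5, size=2000)
print(kernel_var(losses))                 # compare: np.quantile(losses, 0.95)
```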

19.
Estimation risk occurs when individuals form beliefs about parameters that are unknown. We examine how auditors respond to the estimation risk that arises when they form beliefs about the likelihood of client bankruptcy. We argue that auditors are likely to become more conservative when facing higher estimation risk because they are risk-averse. We find that estimation risk is of first-order importance in explaining auditor behavior. In particular, auditors are more likely to issue going-concern opinions, are more likely to resign, and charge higher audit fees when the standard errors surrounding the point estimates of bankruptcy are larger. To our knowledge, this is the first study to quantify estimation risk using the variance-covariance matrix of coefficient estimates taken from a statistical prediction model.

20.
Several issues concerning foreign project evaluation are critically examined. The paper argues that the point of view taken for the analysis, the cost of capital, and the cash flow estimates should be internally consistent. Three points of view for the analysis (local, global, and parent-specific) are distinguished, and various inconsistencies in the estimation of the initial outlay and project cash flows are identified, along with ways to eliminate them. It is argued that the treatment of blocked funds is relevant only from the parent-specific point of view. A number of alternatives for estimating the terminal value are also suggested. The differences between the parent-specific point of view and the global and local views are sharply drawn. An integrated scheme is presented to underscore the need for consistency in evaluating foreign investments.
