Similar Articles
20 similar articles found (search time: 15 ms)
1.
2.
This study investigates volatility jump contagion among the Asian, European (Germany, UK, and France), and US markets. In particular, it examines the stochastic linkages among the international stock markets and analyzes the self- and cross-excitation of jumps. The discontinuities in the stochastic volatility of each market are identified and their structural inter-dependencies are analyzed. Our empirical results imply that negative jumps are transmitted from the USA and Europe to the domestic Asian markets, while positive jumps mainly originate from the regional markets. Results also imply that the cross-market linkages vary across markets and regimes. Our results have implications for risk management, investment, and hedging decisions.
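
The self- and cross-excitation of jumps described above is commonly formalized with mutually exciting (Hawkes-type) point processes. As a rough illustration only, and not the estimation procedure of the study, the sketch below simulates a two-market mutually exciting jump process by Ogata thinning; all parameter values are assumptions chosen for readability.

```python
import numpy as np

# Minimal two-market mutually exciting (Hawkes-type) jump simulation via Ogata thinning.
# All parameters are illustrative assumptions, not estimates from the study.
rng = np.random.default_rng(0)

mu = np.array([0.05, 0.05])           # baseline jump intensities for markets A and B
alpha = np.array([[0.30, 0.15],       # alpha[i, j]: excitation of market i by a jump in market j
                  [0.20, 0.25]])
beta = 1.0                            # exponential decay rate of the excitation
T = 1000.0                            # simulation horizon (days)

def intensities(t, events):
    lam = mu.copy()
    for j in (0, 1):
        for s in events[j]:
            lam = lam + alpha[:, j] * np.exp(-beta * (t - s))
    return lam

events = {0: [], 1: []}
t = 0.0
while True:
    lam_bar = intensities(t, events).sum()    # dominates total intensity until the next jump
    t += rng.exponential(1.0 / lam_bar)       # candidate time of the next jump
    if t >= T:
        break
    lam = intensities(t, events)
    if rng.uniform() * lam_bar <= lam.sum():  # thinning: accept the candidate
        k = 0 if rng.uniform() * lam.sum() < lam[0] else 1
        events[k].append(t)

print("jumps in market A:", len(events[0]), "| jumps in market B:", len(events[1]))
```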

3.
4.
This paper proposes a tail-truncated stochastic frontier model that allows for the truncation of technical efficiency from below. The truncation bound implies the inefficiency threshold for survival. Specifically, this paper assumes a uniform distribution of technical inefficiency and derives the likelihood function. Even though this distributional assumption imposes the strong restriction that technical inefficiency has a uniform probability density over [0, θ], where θ is the threshold parameter, the model has two advantages: (1) the reduction in the number of parameters compared with more complicated tail-truncated models allows better performance in numerical optimization; and (2) it is useful for empirical studies of the distribution of efficiency or productivity, particularly the truncation of the distribution. The Monte Carlo simulation results support the argument that this model approximates the distribution of inefficiency precisely when the data-generating process follows not only the uniform distribution but also the truncated half-normal distribution, provided the inefficiency threshold is small.
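
The uniform-inefficiency assumption yields a closed-form density for the composed error eps = v - u, namely f(eps) = [Phi((eps + θ)/σ_v) - Phi(eps/σ_v)]/θ. The following is a minimal simulate-and-estimate sketch of that specification; the linear functional form and all parameter values are illustrative assumptions, not the paper's code.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Sketch: stochastic frontier with uniform inefficiency u ~ U[0, theta] (illustrative only).
# Composed error: eps = v - u, v ~ N(0, sigma_v^2), giving the closed-form density
# f(eps) = [Phi((eps + theta)/sigma_v) - Phi(eps/sigma_v)] / theta.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
beta0, beta1, sigma_v, theta = 1.0, 0.5, 0.2, 0.6       # assumed true values
u = rng.uniform(0.0, theta, size=n)                     # technical inefficiency
v = rng.normal(0.0, sigma_v, size=n)                    # symmetric noise
y = beta0 + beta1 * x + v - u

def negloglik(params):
    b0, b1, log_sv, log_th = params
    sv, th = np.exp(log_sv), np.exp(log_th)
    eps = y - b0 - b1 * x
    dens = (norm.cdf((eps + th) / sv) - norm.cdf(eps / sv)) / th
    return -np.sum(np.log(np.clip(dens, 1e-300, None)))

res = minimize(negloglik, x0=[0.0, 0.0, np.log(0.1), np.log(0.5)], method="Nelder-Mead")
b0, b1, log_sv, log_th = res.x
print("beta0, beta1, sigma_v, theta =", b0, b1, np.exp(log_sv), np.exp(log_th))
```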

5.
The paper proposes a stochastic frontier model with random coefficients to separate technical inefficiency from technological differences across firms, and free the frontier model from the restrictive assumption that all firms must share exactly the same technological possibilities. Inference procedures for the new model are developed based on Bayesian techniques, and computations are performed using Gibbs sampling with data augmentation to allow finite‐sample inference for underlying parameters and latent efficiencies. An empirical example illustrates the procedure. Copyright © 2002 John Wiley & Sons, Ltd.

6.
Worker peer-effects and managerial selection have received limited attention in the stochastic frontier analysis literature. We develop a parametric production function model that allows for worker peer-effects in output and worker-level inefficiency that is correlated with a manager’s selection of worker teams. The model is the usual “composed error” specification of the stochastic frontier model, but we allow for managerial selectivity (endogeneity) that works through the worker-level inefficiency term. The new specification captures both worker-level inefficiency and the manager’s ability to efficiently select teams to produce output. As the correlation between the manager’s selection equation and worker inefficiency goes to zero, our parametric model reduces to the normal-exponential stochastic frontier model of Aigner et al. (1977) with peer-effects. A comprehensive application to the NBA is provided.
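
For reference, the limiting normal-exponential composed-error model of Aigner et al. (1977), to which the specification reduces when the selection correlation vanishes, can be sketched as below. Peer-effects and the selection equation are omitted, and the data and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Sketch of the normal-exponential composed-error frontier (no peer-effects, no selection):
# y = b0 + b1*x + v - u,  v ~ N(0, s_v^2),  u ~ Exponential(mean s_u).
# Log-density of eps = v - u:  -ln(s_u) + ln Phi(-eps/s_v - s_v/s_u) + eps/s_u + s_v^2/(2 s_u^2).
rng = np.random.default_rng(2)
n = 800
x = rng.normal(size=n)
b0, b1, s_v, s_u = 2.0, 0.7, 0.3, 0.4                   # assumed true values
y = b0 + b1 * x + rng.normal(0, s_v, n) - rng.exponential(s_u, n)

def negloglik(p):
    c0, c1, log_sv, log_su = p
    sv, su = np.exp(log_sv), np.exp(log_su)
    eps = y - c0 - c1 * x
    logf = (-np.log(su) + norm.logcdf(-eps / sv - sv / su)
            + eps / su + sv ** 2 / (2 * su ** 2))
    return -logf.sum()

res = minimize(negloglik, x0=[0, 0, np.log(0.2), np.log(0.2)], method="Nelder-Mead")
print("beta estimates:", res.x[:2], "| sigma_v, sigma_u:", np.exp(res.x[2:]))
```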

7.
Journal of Productivity Analysis - In this paper we provide several new specifications within the true random effects model as well as stochastic frontier models estimated with GLS and MLE that...

8.
The present paper tests a new model comparison methodology by comparing multiple calibrations of three agent-based models of financial markets on the daily returns of 24 stock market indices and exchange rate series. The models chosen for this empirical application are the herding model of Gilli and Winker (2003), its asymmetric version by Alfarano et al. (2005), and the more recent model by Franke and Westerhoff (2011), all of which share a common lineage to the herding model introduced by Kirman (1993). In addition, standard ARCH processes are included for each financial series to provide a benchmark for the explanatory power of the models. The methodology provides a consistent and statistically significant ranking of the three models. More importantly, it also reveals that the best-performing model, that of Franke and Westerhoff, is generally not distinguishable from an ARCH-type process, suggesting that their explanatory power on the data is similar.
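
All three agent-based models trace back to Kirman's (1993) herding mechanism. A minimal sketch of that mechanism alone, with made-up parameters and none of the market-model or calibration detail used in the comparison, is:

```python
import numpy as np

# Minimal Kirman (1993)-style herding simulation; all parameters are illustrative assumptions.
rng = np.random.default_rng(3)
N = 100            # number of agents
a = 0.002          # autonomous switching probability
b = 0.05           # herding (conversion) strength
steps = 100_000

state = rng.integers(0, 2, size=N)    # 0 = pessimist, 1 = optimist
frac_optimists = np.empty(steps)
for t in range(steps):
    i = rng.integers(N)
    n_other = N - state.sum() if state[i] == 1 else state.sum()
    if rng.uniform() < a + b * n_other / (N - 1):
        state[i] = 1 - state[i]       # agent i switches opinion
    frac_optimists[t] = state.mean()

# The fraction of optimists swings between extremes, which is the source of volatility
# clustering in the herding-based market models compared above.
print("mean fraction:", frac_optimists.mean(), "| std:", frac_optimists.std())
```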

9.
Journal of Productivity Analysis - This paper develops a theoretical framework for modeling farm households’ joint production and consumption decisions in the presence of technical...

10.
Stochastic frontier models with multiple time-varying individual effects
This paper proposes a flexible time-varying stochastic frontier model. Similarly to Lee and Schmidt [1993, In: Fried H, Lovell CAK, Schmidt S (eds) The measurement of productive efficiency: techniques and applications. Oxford University Press, Oxford], we assume that individual firms’ technical inefficiencies vary over time. However, the model, which we call the “multiple time-varying individual effects” model, is more general in that it allows multiple factors determining firm-specific time-varying technical inefficiencies. This allows the temporal pattern of inefficiency to vary over firms. The number of such factors can be consistently estimated. The model is applied to data on Indonesian rice farms, and the changes in the efficiency rankings of farms over time demonstrate the model’s flexibility.

11.
We replace the conventional market clearing process by maximizing the information entropy. At fixed return, agents optimize their demands using a utility with a statistical tolerance for deviations from the deterministic maximum of the utility, which leads to information entropies for the agents. Interactions are described by coupling the agents to a large system, called the ‘market’. The main problem in economic markets is the absence of an analogue of the first law of thermodynamics (energy conservation in physics). We solve this problem by restricting the utilities to be at most quadratic in the demand (ideal gas approximation). Maximizing the sum of agent and market entropy serves to eliminate the unknown properties of the latter. Assuming a stochastic volatility decomposition for the return, we derive an expression for the pdf of the return which is in excellent agreement with the daily DAX data. The pdf exhibits power-law behaviour with an integer tail index equal to the number of agent groups. With the assumption of an Ornstein–Uhlenbeck model for the risk-aversion parameters of the agents, the autocorrelation of the absolute return is well described up to time lags of 2.5 years.

12.
This paper investigates the issue of market risk quantification for emerging and developed market equity portfolios. A very wide spectrum of popular Value at Risk (VaR) models that are widely used in practice is evaluated and compared with Extreme Value Theory (EVT) and adaptive filtered models during normal, crisis, and post-crisis periods. The results are interesting and indicate that, despite the documented differences between emerging and developed markets, the most successful VaR models are common to both asset classes. Furthermore, in the case of the (fatter-tailed) emerging market equity portfolios, most VaR models turn out to yield conservative risk forecasts, in contrast to developed market equity portfolios, where most models underestimate the realized VaR. VaR estimation during periods of financial turmoil seems to be a difficult task, particularly in the case of emerging markets and especially for the higher loss quantiles. VaR models seem to be affected less by crisis periods in the case of developed markets. The performance of the parametric (non-parametric) VaR models improves (deteriorates) during post-crisis periods due to the inclusion of extreme events in the estimation sample.
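
As a rough illustration of the kind of models in that spectrum, the sketch below computes rolling one-day 99% VaR from two simple approaches (parametric normal and historical simulation) and counts violations on simulated heavy-tailed returns. It is not the paper's evaluation framework, and all settings are assumptions.

```python
import numpy as np
from scipy.stats import norm

# Rolling one-day 99% VaR from two simple models with a violation-count backtest;
# simulated Student-t returns stand in for real index data (illustrative only).
rng = np.random.default_rng(4)
returns = 0.01 * rng.standard_t(df=4, size=2000)

window, alpha = 250, 0.99
violations = {"normal": 0, "historical": 0}
n_tests = 0
for i in range(window, len(returns)):
    sample = returns[i - window:i]
    var_normal = -(sample.mean() + norm.ppf(1 - alpha) * sample.std(ddof=1))
    var_hist = -np.quantile(sample, 1 - alpha)
    loss = -returns[i]
    violations["normal"] += loss > var_normal
    violations["historical"] += loss > var_hist
    n_tests += 1

for name, v in violations.items():
    print(f"{name}: violation rate {v / n_tests:.3%} (nominal {1 - alpha:.1%})")
```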

13.
Data envelopment analysis (DEA) is a non-parametric approach for measuring the relative efficiencies of peer decision making units (DMUs). In recent years, it has been widely used to evaluate two-stage systems under different organization mechanisms. This study modifies the conventional leader–follower DEA models for two-stage systems by considering the uncertainty of data. The dual deterministic linear models are first constructed from the stochastic CCR models under the assumption that all components of inputs, outputs, and intermediate products are related only to some basic stochastic factors, which follow continuous and symmetric distributions with nonnegative compact supports. The stochastic leader–follower DEA models are then developed for measuring the efficiencies of the two stages. The stochastic efficiency of the whole system can be uniquely decomposed into the product of the efficiencies of the two stages. Relationships between stochastic efficiencies from stochastic CCR and stochastic leader–follower DEA models are also discussed. An example of the commercial banks in China is considered using the proposed models under different risk levels.
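
The deterministic building block behind these models is the input-oriented CCR envelopment program: minimize θ subject to Σ_j λ_j x_j ≤ θ x_0, Σ_j λ_j y_j ≥ y_0, λ ≥ 0. A minimal sketch of that LP, single-stage and with made-up deterministic data rather than the stochastic two-stage extension, is:

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR envelopment LP for each DMU; data are made-up illustrations.
X = np.array([[2.0, 3.0, 4.0, 5.0],      # inputs:  rows = input dimensions, cols = DMUs
              [3.0, 2.0, 6.0, 4.0]])
Y = np.array([[1.0, 2.0, 3.0, 2.5]])     # outputs: rows = output dimensions, cols = DMUs

m, n = X.shape
s, _ = Y.shape

def ccr_efficiency(k):
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.hstack([-X[:, [k]], X])                # sum_j lam_j x_ij <= theta * x_ik
    A_out = np.hstack([np.zeros((s, 1)), -Y])        # sum_j lam_j y_rj >= y_rk
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

for k in range(n):
    print(f"DMU {k}: CCR efficiency = {ccr_efficiency(k):.3f}")
```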

14.
In affine term structure models (ATSM) the stochastic Jacobian under the forward measure plays a crucial role for pricing, as discussed in Elliott and van der Hoek (Finance Stoch 5:511–525, 2001). Their approach leads to a deterministic integro-differential equation which, apparently, has the advantage of by-passing the solution to the Riccati ODE obtained by the standard Feynman-Kac argument. In the generic multi-dimensional case, we find a procedure to reduce such an integro-differential equation to a nonlinear matrix ODE. We prove that its solution necessarily requires the solution of the vector Riccati ODE. This result is obtained by proving an extension of the celebrated Radon Lemma, which allows us to highlight a deep relation between the geometry of the Riccati flow and the stochastic calculus of variations for an ATSM. We are grateful to two anonymous referees for their careful reading of the paper.
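
For intuition only, a scalar Riccati ODE of the type that arises in one-factor affine models can be integrated numerically as below; the equation dB/dτ = 1 − κB − (σ²/2)B² with B(0) = 0 and its parameter values are illustrative assumptions, not the paper's multi-dimensional system.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Numerically integrate a scalar Riccati ODE of the kind appearing in affine term
# structure models: dB/dtau = 1 - kappa*B - 0.5*sigma^2*B^2, B(0) = 0 (illustrative).
kappa, sigma = 0.5, 0.2

def riccati(tau, B):
    return 1.0 - kappa * B - 0.5 * sigma ** 2 * B ** 2

sol = solve_ivp(riccati, t_span=(0.0, 10.0), y0=[0.0],
                t_eval=np.linspace(0.0, 10.0, 11), rtol=1e-8)
for tau, B in zip(sol.t, sol.y[0]):
    print(f"tau = {tau:4.1f}   B(tau) = {B:.6f}")
```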

15.
For a long time researchers have recognized the need for applying stochastic models for analyzing data generated from agents’ choice under risk. This paper compares and discusses recent axiomatizations of stochastic models for risky choice given by Blavatskyy (2008) and Dagsvik (2008). We show that some of Blavatskyy’s axioms are equivalent to some of Dagsvik’s axioms. We also propose new axioms and derive their implications. Specifically, we show that some of the results of Dagsvik (2008) can be derived under weaker axioms than those he proposed originally.

16.
The range of daily asset prices is often used as a measure of volatility. Using a CARRX (conditional autoregressive range with exogenous variables) model, and the parsimony principle, the paper investigates the factors affecting the volatilities of Asian equity markets. Since the beginning of the new century, emerging Asian markets such as Taiwan and Shanghai have been undergoing various stages of financial globalization. The volatility of the equity market may not be explained solely by its own dynamics. In this paper, we examine volatility using the following factors: (i) lagged returns; (ii) lagged absolute returns; (iii) own trading volume; (iv) U.S. factors; (v) European factors; and (vi) regional (Asian) factors. Factors (i) and (iii) are by and large significant, while (ii) is not. Controlling for (i), (ii) and (iii), we find evidence that the volatility of European markets spills over onto both the Taiwan and Tokyo markets, mild evidence that the volatility of the U.S. market spills over onto the Hong Kong market, but no spillovers from the European or U.S. markets onto the Shanghai market.
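
A plain CARR(1,1) recursion for the daily range, without the exogenous U.S., European, and regional terms of the CARRX specification, can be sketched as follows on simulated data with assumed parameter values:

```python
import numpy as np
from scipy.optimize import minimize

# Sketch of a plain CARR(1,1) for the daily range (exogenous terms omitted), illustrative only:
# R_t = lambda_t * eps_t,  lambda_t = omega + alpha * R_{t-1} + beta * lambda_{t-1},
# with eps_t exponential with unit mean.
rng = np.random.default_rng(5)
T = 2000
omega, alpha, beta = 0.05, 0.2, 0.7          # assumed true values
R = np.empty(T)
lam = omega / (1 - alpha - beta)             # start from the unconditional mean range
for t in range(T):
    R[t] = lam * rng.exponential(1.0)
    lam = omega + alpha * R[t] + beta * lam

def neg_qmle(p):
    w, a, b = np.exp(p)                      # keep parameters positive
    lam = R.mean()
    nll = 0.0
    for t in range(T):
        nll += np.log(lam) + R[t] / lam      # exponential quasi-likelihood contribution
        lam = w + a * R[t] + b * lam
    return nll

res = minimize(neg_qmle, x0=np.log([0.1, 0.1, 0.5]), method="Nelder-Mead")
print("omega, alpha, beta =", np.exp(res.x))
```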

17.
This paper describes the first model considered in the computational suite project that compares different numerical algorithms. It is an incomplete markets economy with a continuum of agents and an inequality (borrowing) constraint.

18.
This paper analyzes the empirical performance of two alternative ways in which multi-factor models with time-varying risk exposures and premia may be estimated. The first method echoes the seminal two-pass approach introduced by Fama and MacBeth (1973). The second approach is based on a Bayesian latent mixture model with breaks in risk exposures and idiosyncratic volatility. Our application to monthly U.S. data for 1980–2010 on stock, bond, and publicly traded real estate returns shows that the classical, two-stage approach that relies on a nonparametric, rolling-window estimation of time-varying betas yields results that are unreasonable. There is evidence that most portfolios of stocks, bonds, and REITs have been grossly over-priced. In contrast, the Bayesian approach yields sensible results, and a few factor risk premia are precisely estimated with a plausible sign. Predictive log-likelihood scores indicate that discrete breaks in both risk exposures and variances are required to fit the data.
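
The classical two-pass procedure with rolling-window betas that the paper takes as its first method can be sketched roughly as below, on simulated factors and portfolios rather than the 1980–2010 stock, bond, and REIT data:

```python
import numpy as np

# Two-pass (Fama-MacBeth style) estimation with rolling first-pass betas; all data are
# simulated illustrations and the sample sizes are assumptions.
rng = np.random.default_rng(6)
T, N, K, window = 360, 25, 3, 60                 # months, portfolios, factors, rolling window
factors = rng.normal(0.005, 0.02, size=(T, K))
true_betas = rng.uniform(0.5, 1.5, size=(N, K))
returns = factors @ true_betas.T + rng.normal(0, 0.03, size=(T, N))

lambdas = []                                     # cross-sectional factor premia, one per month
for t in range(window, T):
    # first pass: time-series betas estimated on the trailing window
    Xf = np.column_stack([np.ones(window), factors[t - window:t]])
    betas = np.linalg.lstsq(Xf, returns[t - window:t], rcond=None)[0][1:].T   # N x K
    # second pass: cross-sectional regression of month-t returns on those betas
    Xc = np.column_stack([np.ones(N), betas])
    lambdas.append(np.linalg.lstsq(Xc, returns[t], rcond=None)[0][1:])

lambdas = np.array(lambdas)
se = lambdas.std(axis=0, ddof=1) / np.sqrt(len(lambdas))
print("estimated premia:", lambdas.mean(axis=0))
print("t-stats:", lambdas.mean(axis=0) / se)
```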

19.
We propose an agent-based framework, based on simple piecewise linear time-invariant continuous-time dynamical systems models, as a means for describing efficient financial markets. We show by examples that many of the common agent-specific trading strategies occurring in the academic literature, including chartists and fundamentalists of various kinds, can be described in the proposed framework. We present definitions for weak and strong market efficiency and provide necessary and sufficient conditions for them to hold. We present minimal examples of strongly and weakly efficient markets to show that these concepts are natural and easy to satisfy in agent-based models, and that the models can reproduce both statistical and behavioral stylized facts of real markets. We provide examples to demonstrate that the framework can be extended for agents with delays in information processing, as well as for agents with time-varying strategies and for nonlinear market impact functions. We also provide a counterexample to show that the proposed market efficiency concepts may require modification in generalizations for nonlinear trading strategies.

20.
Most of the empirical applications of the stochastic volatility (SV) model are based on the assumption that the conditional distribution of returns, given the latent volatility process, is normal. In this paper, the SV model based on a conditional normal distribution is compared with SV specifications using conditional heavy‐tailed distributions, especially Student's t‐distribution and the generalized error distribution. To estimate the SV specifications, a simulated maximum likelihood approach is applied. The results based on daily data on exchange rates and stock returns reveal that the SV model with a conditional normal distribution does not adequately account for the two following empirical facts simultaneously: the leptokurtic distribution of the returns and the low but slowly decaying autocorrelation functions of the squared returns. It is shown that these empirical facts are more adequately captured by an SV model with a conditional heavy‐tailed distribution. It also turns out that the choice of the conditional distribution has systematic effects on the parameter estimates of the volatility process. Copyright © 2000 John Wiley & Sons, Ltd.
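
For illustration, the sketch below simulates an SV model with conditional Student-t returns and checks the two stylized facts mentioned above (excess kurtosis and slowly decaying autocorrelation of squared returns); the AR(1) log-volatility form and all parameter values are assumptions, not estimates from the paper.

```python
import numpy as np

# Simulate an SV model with conditional Student-t returns (illustrative parameters):
# r_t = exp(h_t / 2) * eps_t,  eps_t ~ t_nu scaled to unit variance,
# h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t,  eta_t ~ N(0, 1).
rng = np.random.default_rng(7)
T = 5000
mu, phi, sigma_eta, nu = -9.0, 0.97, 0.15, 5.0

h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()

eps = rng.standard_t(nu, size=T) * np.sqrt((nu - 2) / nu)   # unit-variance t innovations
r = np.exp(h / 2) * eps

# The two stylized facts discussed in the abstract: excess kurtosis of returns and slowly
# decaying autocorrelation of squared returns.
kurtosis = ((r - r.mean()) ** 4).mean() / r.var() ** 2
acf_sq = [np.corrcoef(r[:-k] ** 2, r[k:] ** 2)[0, 1] for k in (1, 10, 50)]
print("kurtosis:", round(kurtosis, 2), "| ACF of r^2 at lags 1, 10, 50:", np.round(acf_sq, 3))
```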
