Similar documents
20 similar documents found (search time: 734 ms)
1.
Abstract

The well-known renewal problem is this: given a population (a portfolio of insured persons, a herd of livestock, a stock of goods, etc.) whose elements may leave it for various causes (death, becoming unusable or being sold, etc.), determine the inflow of new elements needed to keep the population at a constant size. In the discretely renewing populations considered in this note, changes in the stock always take place at the end of a fixed time unit (a year, a day, etc.). The random departure of elements from the population is governed by a decrement law assumed to be known. By this one means a system of K + 1 numbers (1) P0, P1, P2, ..., PK, where Pv is the probability that an element newly added to the population remains in it for at least v time units. These probabilities can be obtained statistically by observing a closed population of elements. The truncation of the decrement law means that no element belongs to the population for more than K time units.
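A minimal numerical sketch (numbers and notation assumed, not taken from the note) of the discrete renewal recursion: from the survival probabilities Pv one obtains the lifetime distribution, and the replacements required each period to keep the population at size N settle at N divided by the mean lifetime.

```python
# Illustrative sketch of the discrete renewal recursion (parameters made up).
# P[v] = probability that a newly added element stays at least v time units.
P = [1.0, 0.9, 0.6, 0.3, 0.0]                 # here K = 4: nobody stays past 4 units
K = len(P) - 1
# f[v] = probability of leaving at the end of the v-th period after entry
f = [0.0] + [P[v - 1] - P[v] for v in range(1, K + 1)]
mean_life = sum(v * f[v] for v in range(1, K + 1))   # expected time in the population

N = 1000.0                                     # constant population size to maintain
z = {}                                         # z[t] = replacements needed at time t
for t in range(1, 200):
    out = N * f[t] if t <= K else 0.0          # departures from the initial cohort
    out += sum(z[s] * f[t - s] for s in range(max(1, t - K), t))
    z[t] = out                                 # every departure is replaced at once
# z[t] settles at the stationary renewal rate N / mean_life
```

The recursion makes the classical renewal theorem visible: whatever the initial age structure, the required inflow converges to N / mean lifetime.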

2.
Summary

Large-sample estimation of the origin α1 and the scale parameter α2 of the gamma distribution when the shape parameter m is known is considered. Assuming both parameters are unknown, the optimum spacings (0 < λ1 < λ2 < ... < λk < 1) yielding the maximum efficiencies among other choices of the same number of observations are obtained. The coefficients to be used in computing the estimates, their variances, and their asymptotic relative efficiencies (A.R.E.) relative to the Cramér-Rao lower bounds are given.

3.
Abstract

When one makes a decision about the probability distribution of a random variable by observing that random variable, statisticians recommend considering certain functions of the operating characteristic (O.C.) of the decision function as measures of the reliability of the actual decision made. For instance, the confidence coefficient of an interval estimator will as a rule be regarded as a measure of our confidence in the interval.

4.
The GARCH model has been very successful in capturing the serial correlation of asset return volatilities. As a result, applying the model to options pricing attracts a lot of attention. However, previous tree-based GARCH option pricing algorithms suffer from exponential running time, a cut-off maturity, inaccuracy, or some combination thereof. Specifically, this paper proves that the popular trinomial-tree option pricing algorithms of Ritchken and Trevor (Ritchken, P. and Trevor, R., Pricing options under generalized GARCH and stochastic volatility processes. J. Finance, 54(1), 377–402.) and Cakici and Topyan (Cakici, N. and Topyan, K., The GARCH option pricing model: a lattice approach. J. Comput. Finance, 3(4), 71–85.) explode exponentially when the number of partitions per day, n, exceeds a threshold determined by the GARCH parameters. Furthermore, when explosion happens, the tree cannot grow beyond a certain maturity date, making it unable to price derivatives with a longer maturity. As a result, the algorithms must be limited to using small n, which may have accuracy problems. The paper presents an alternative trinomial-tree GARCH option pricing algorithm. This algorithm provably does not have the short-maturity problem. Furthermore, the tree-size growth is guaranteed to be quadratic if n is less than a threshold easily determined by the model parameters. This level of efficiency makes the proposed algorithm practical. This surprising finding places, for the first time, a tree-based GARCH option pricing algorithm in the same complexity class as binomial trees under the Black–Scholes model. Extensive numerical evaluation is conducted to confirm the analytical results and the numerical accuracy of the proposed algorithm. Of independent interest is a simple and efficient technique to calculate the transition probabilities of a multinomial tree using generating functions.
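The generating-function technique mentioned in the last sentence can be illustrated in a few lines (a generic sketch, not the paper's implementation): for a trinomial step with down/middle/up probabilities, the node probabilities after n steps are the coefficients of the n-th power of the step's probability generating function, built by repeated polynomial convolution.

```python
def multinomial_step_probs(step_probs, n):
    """Distribution over the node reached after n i.i.d. multinomial steps,
    via powers of the probability generating function (polynomial convolution)."""
    dist = [1.0]                     # generating function of "zero steps taken"
    for _ in range(n):
        new = [0.0] * (len(dist) + len(step_probs) - 1)
        for i, a in enumerate(dist):
            for j, b in enumerate(step_probs):
                new[i + j] += a * b  # multiply by (p_d + p_m x + p_u x^2)
        dist = new
    return dist                      # dist[k] = P(net displacement index = k)

# symmetric trinomial step: down, middle, up, after 4 steps
probs = multinomial_step_probs([0.25, 0.5, 0.25], 4)
```

For this symmetric step the generating function factors as 0.25·(1 + x)², so the 4-step probabilities are binomial coefficients C(8, k)/256, a handy sanity check.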

5.
Abstract

In [5] S. Holm proposed test statistics for testing simple hypotheses by means of the probability paper for distribution functions (d.f.) of the form F 0(x) = Φ[(x - μ0)/σ0], where μ0 is a location parameter, σ0 a scale parameter, and Φ is an absolutely continuous distribution function with Φ(0) = 1/2. If μ0 and σ0 are known, the hypothesis H 0 is:
  • H 0: H(x) = F 0(x) = Φ[(x - μ0)/σ0],

while the three possible alternatives are
  • H 1: H(x) > F 0(x)

  • H 2: H(x) < F 0(x)

  • H 3: H(x) ≠ F 0(x).


6.
Quantitative Finance, 2013, 13(4): 288-295
Abstract

This paper is concerned with geometric Asian options, whose pay-offs depend on the geometric average of the underlying asset prices. Following the arbitrage arguments of Cox et al (1979 J. Financial Economics 7 229-63), we develop one-state-variable binomial models for the options on the basis of the idea of Cheuk and Vorst (1997 J. Int. Money Finance 16 173-87). The models are more efficient and faster than the lattice methods for these options proposed by Hull and White (1993 J. Derivatives 1 21-31), Ritchken et al (1993 Manage. Sci. 39 1202-13), Barraquand and Pudet (1996 Math. Finance 6 17-51) and Cho and Lee (1997 J. Financial Eng. 6 179-91). We also establish the equivalence of the models and certain difference schemes.
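For orientation only (not part of the paper, which works with binomial lattices): under Black-Scholes dynamics the continuously monitored geometric-average Asian call has a closed form of Black-Scholes type, because the geometric average of a lognormal path is itself lognormal with volatility σ/√3. A sketch of that standard benchmark:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def geometric_asian_call(S, K, r, sigma, T):
    """Closed-form continuously monitored geometric-average Asian call under
    Black-Scholes dynamics; an illustrative benchmark, not the paper's lattice."""
    sig_g = sigma / sqrt(3.0)               # volatility of the log-average
    b_g = 0.5 * (r - sigma * sigma / 6.0)   # effective cost of carry of the average
    d1 = (log(S / K) + (b_g + 0.5 * sig_g ** 2) * T) / (sig_g * sqrt(T))
    d2 = d1 - sig_g * sqrt(T)
    return S * exp((b_g - r) * T) * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

price = geometric_asian_call(100.0, 100.0, 0.05, 0.20, 1.0)
```

Lattice prices for the discretely monitored option should converge toward values of this magnitude as the monitoring frequency grows, which makes the formula useful for sanity-checking implementations.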

7.
Abstract

Consider a sequence of independent random variables (r.v.) X1, X2, …, Xn, …, with the same distribution function (d.f.) F(x). Let E(Xn) = 0, E(Xn2) = σ2, E|Xn|3 = β3 < ∞, E(g(X)) denoting the mean value of the r.v. g(X). Further, let the r.v. Zn = (X1 + X2 + … + Xn)/(σ√n) have the d.f. Fn(x). It was proved by Berry [1] and the present author (Esseen [2], [4]) that |Fn(x) − Φ(x)| ≤ C β3/(σ3√n) for all x, Φ(x) being the normal d.f.
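A small deterministic check of the inequality's content (an illustration with assumed notation, not from the paper): for sums of uniform variables the exact d.f. is available in closed form (Irwin-Hall), so the uniform distance to Φ can be evaluated on a grid and seen to shrink as n grows.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def irwin_hall_cdf(x, n):
    """Exact d.f. of the sum of n independent Uniform(0,1) variables."""
    if x <= 0:
        return 0.0
    if x >= n:
        return 1.0
    s = 0.0
    for k in range(int(math.floor(x)) + 1):
        s += (-1) ** k * math.comb(n, k) * (x - k) ** n
    return s / math.factorial(n)

def sup_deviation(n, grid=2000):
    """max_z |F_n(z) - Phi(z)| for the standardized uniform sum, on a fine grid."""
    sigma = math.sqrt(n / 12.0)              # sd of the unstandardized sum
    best = 0.0
    for i in range(grid + 1):
        z = -6.0 + 12.0 * i / grid           # grid on the standardized axis
        x = n / 2.0 + sigma * z              # back to the original scale
        best = max(best, abs(irwin_hall_cdf(x, n) - norm_cdf(z)))
    return best

d4, d16 = sup_deviation(4), sup_deviation(16)
```

Since the uniform is symmetric, the distance here actually decays like 1/n, faster than the 1/√n the inequality guarantees for general distributions.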

8.
We present, in a Monte Carlo simulation framework, a novel approach for the evaluation of hybrid local volatility models [Risk, 1994, 7, 18–20], [Int. J. Theor. Appl. Finance, 1998, 1, 61–110]. In particular, we consider the stochastic local volatility model—see e.g. Lipton et al. [Quant. Finance, 2014, 14, 1899–1922], Piterbarg [Risk, 2007, April, 84–89], Tataru and Fisher [Quantitative Development Group, Bloomberg Version 1, 2010], Lipton [Risk, 2002, 15, 61–66]—and the local volatility model incorporating stochastic interest rates—see e.g. Atlan [ArXiV preprint math/0604316, 2006], Piterbarg [Risk, 2006, 19, 66–71], Deelstra and Rayée [Appl. Math. Finance, 2012, 1–23], Ren et al. [Risk, 2007, 20, 138–143]. For both model classes a particular (conditional) expectation needs to be evaluated which cannot be extracted from the market and is expensive to compute. We establish accurate and ‘cheap to evaluate’ approximations for the expectations by means of the stochastic collocation method [SIAM J. Numer. Anal., 2007, 45, 1005–1034], [SIAM J. Sci. Comput., 2005, 27, 1118–1139], [Math. Models Methods Appl. Sci., 2012, 22, 1–33], [SIAM J. Numer. Anal., 2008, 46, 2309–2345], [J. Biomech. Eng., 2011, 133, 031001], which was recently applied in the financial context [Available at SSRN 2529691, 2014], [J. Comput. Finance, 2016, 20, 1–19], combined with standard regression techniques. Monte Carlo pricing experiments confirm that our method is highly accurate and fast.
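The collocation idea can be sketched generically (a toy under stated assumptions, not the paper's scheme): an expectation over a normal variable is replaced by a weighted sum over a few Gauss-Hermite collocation nodes, which is 'cheap to evaluate' yet spectrally accurate.

```python
import math
import numpy as np

# Probabilists' Gauss-Hermite nodes/weights (weight function exp(-x^2/2));
# dividing by sqrt(2*pi) turns them into an N(0,1) quadrature rule.
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
weights = weights / math.sqrt(2.0 * math.pi)

def expect(f):
    """Approximate E[f(X)] for X ~ N(0,1) with a 7-node collocation rule."""
    return float(np.sum(weights * f(nodes)))

approx = expect(np.exp)        # exact value is exp(1/2)
```

Seven nodes integrate polynomials up to degree 13 exactly, which is why so few function evaluations suffice for the smooth conditional expectations targeted in the paper.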

9.
Abstract

1. In a s. or n.s. cPp (stationary or non-stationary compound Poisson process), the probability of the occurrence of m events, while the parameter (one- or more-dimensional) passes from zero to τ0 as measured on an absolute scale (the τ-scale), is defined as a mean of Poisson probabilities with intensities distributed according to distribution functions defining another random process, called the primary process with respect to the s. or n.s. cPp. Stationarity (in the weak sense) and non-stationarity of the primary process imply the same properties of the s. or n.s. cPp.

10.
Book Reviews     
Organized Uncertainty: Designing a World of Risk Management. Michael Power. Oxford University Press, 2007. xviii and 248pp. ISBN 978–0–9–925394–4. £24.99.

Intellectual Capital Reporting: Lessons from Hong Kong and Australia. J. Guthrie, R. Petty and F. Ricceri. The Institute of Chartered Accountants of Scotland, 2007, vii and 118pp. ISBN 978–1–904574–27–9. £15

The Routledge Companion to Fair Value and Financial Reporting. P. Walton (ed.). Routledge, 2007. xviii and 404 pp. ISBN 978–0–415–42356–4. £95.

UK Reporting of Intellectual Capital. Jeffrey Unerman, James Guthrie and Ludmila Striukova. ICAEW Centre for Business Performance, 2007. 68 pp. ISBN 978 1 84152 507 5. £20.

11.
Nils Ekholm     
Abstract

The problem of χ2 tests of a linear hypothesis H0 for ‘matched samples’ in attribute data has been discussed earlier by the author (Bennett, 1967, 1968). This note presents corresponding results for the hypothesis that the multinomial probabilities p satisfy (c − 1) functional restrictions: F1(p) = 0, ..., Fc−1(p) = 0. An explicit relationship between the usual ‘goodness-of-fit’ χ2 and the modified minimum χ2 (= χ*2) of Jeffreys (1938) and Neyman (1949) is demonstrated for this situation. An example of the test for the 2 × 2 × 2 contingency table is given and compared with the solution of Bartlett (1935).
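To make the two statistics concrete (a generic numerical illustration with made-up counts, not the matched-sample setting of the note): with observed counts O and expected counts E, the usual goodness-of-fit statistic divides the squared deviations by E, while the modified minimum χ2 of Jeffreys and Neyman divides by O; the two agree asymptotically.

```python
def chi_square(observed, expected):
    """Pearson goodness-of-fit statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def modified_chi_square(observed, expected):
    """Modified minimum chi-square (Jeffreys/Neyman): sum of (O - E)^2 / O."""
    return sum((o - e) ** 2 / o for o, e in zip(observed, expected))

O = [48, 35, 17]          # hypothetical multinomial counts, total n = 100
E = [50, 30, 20]          # counts expected under the hypothesis
x2, x2_star = chi_square(O, E), modified_chi_square(O, E)
```

On such data the two values are close, which is the numerical face of the explicit relationship the note derives.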

12.
CALL FOR PAPERS     

13.

The sequential approach to credibility developed by Landsman and Makov ((1999a) On stochastic approximation and credibility, Scand. Actuarial J. 1, 15-31; (1999b) Sequential credibility evaluation for symmetric location claim distributions, Insurance: Math. Econ. 24, 291-300) is extended to the scale dispersion family, which contains distributions often used in actuarial science: the log-normal, Weibull, half-normal, stable and Pareto, to mention only a few. For members of this family a sequential quasi-credibility formula is devised, which can also be used for heavy-tailed claims. The results are illustrated by a study of log-normal claims.

14.
15.
In this note we extend the Gaussian estimation of two-factor CKLS and CIR models recently considered in Nowman, K. B. (2001, Gaussian estimation and forecasting of multi-factor term structure models with an application to Japan and the United Kingdom, Asia Pacif. Financ. Markets 8, 23–34) to include feedback effects in the conditional mean, as originally formulated for general continuous-time models by Bergstrom, A. R. (1966, Non-recursive models as discrete approximations to systems of stochastic differential equations, Econometrica 34, 173–182) with constant volatility. We use the exact discrete model of Bergstrom (1966) to estimate the parameters; this model was first used by Brennan, M. J. and Schwartz, E. S. (1979, A continuous time approach to the pricing of bonds, J. Bank. Financ. 3, 133–155) to estimate their two-factor interest rate model, and we incorporate the assumption of Nowman, K. B. (1997, Gaussian estimation of single-factor continuous time models of the term structure of interest rates, J. Financ. 52, 1695–1706; 2001). An application to monthly Japanese Euro currency rates indicates some evidence of feedback from the 1-year rate to the 1-month rate in both the CKLS and CIR models. We also find a low level-volatility effect, supporting Nowman (2001).
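The role of the exact discrete model can be sketched in the single-factor Gaussian (Vasicek/OU) case with hypothetical parameters (the paper itself treats two-factor CKLS/CIR systems): the SDE dr = κ(θ − r)dt + σ dW discretizes exactly to an AR(1), so Gaussian estimation reduces to a lagged least-squares regression.

```python
import math
import numpy as np

# Hypothetical single-factor parameters (illustration only)
kappa, theta, sigma, dt = 2.0, 0.05, 0.02, 1.0 / 12.0
n = 20000
rng = np.random.default_rng(0)

# Exact discrete model of the OU process: r_{t+dt} = theta + phi*(r_t - theta) + eps
phi = math.exp(-kappa * dt)
eps_sd = sigma * math.sqrt((1.0 - phi ** 2) / (2.0 * kappa))
r = np.empty(n)
r[0] = theta
for t in range(1, n):
    r[t] = theta + phi * (r[t - 1] - theta) + eps_sd * rng.standard_normal()

# Gaussian estimation: least squares on the lagged regression of r_{t+1} on r_t
slope, intercept = np.polyfit(r[:-1], r[1:], 1)
kappa_hat = -math.log(slope) / dt        # back out the continuous-time parameters
theta_hat = intercept / (1.0 - slope)
```

Because the discretization is exact rather than an Euler approximation, the estimates carry no time-aggregation bias, which is the point of the Bergstrom approach.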

16.
Abstract

Extract

d1. Show that, for fixed x, it is a non-increasing function of y and, for fixed y, a non-decreasing function of x, when the mortality intensities μg and μI of the two lives are non-decreasing. It is assumed that the ages at death of (x) and (y) are stochastically independent.

17.
We suggest an improved FFT pricing algorithm for discretely sampled Asian options with general independently distributed returns in the underlying. Our work complements the studies of Carverhill and Clewlow [Risk, 1990, 3(4), 25–29], Benhamou [J. Comput. Finance, 2002, 6(1), 49–68], and Fusai and Meucci [J. Bank. Finance, 2008, 32(10), 2076–2088], and, if we restrict our attention only to log-normally distributed returns, also Večeř [Risk, 2002, 15(6), 113–116]. While the existing convolution algorithms compute the density of the underlying state variable by moving forward on a suitably defined state space grid, our new algorithm uses backward price convolution, which resembles classical lattice pricing algorithms. For the first time in the literature we provide an analytical upper bound for the pricing error caused by the truncation of the state space grid and by the curtailment of the integration range. We highlight the benefits of the new scheme and benchmark its performance against existing finite difference, Monte Carlo, and forward density convolution algorithms.
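The basic forward density-convolution step that the paper benchmarks against can be sketched as follows (a generic illustration, not the authors' backward algorithm): on a uniform grid, the density of the sum of two independent returns is the FFT-based linear convolution of their densities.

```python
import numpy as np

n = 4096
x = np.linspace(-10.0, 10.0, n)          # state-space grid
dx = x[1] - x[0]

def normal_pdf(t, mu, sd):
    return np.exp(-0.5 * ((t - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

f = normal_pdf(x, 0.0, 1.0)              # density of the first return
g = normal_pdf(x, 0.5, 0.8)              # density of the second return

# linear convolution via FFT (zero-padded to avoid wrap-around), scaled by dx
m = 2 * n
conv = np.fft.irfft(np.fft.rfft(f, m) * np.fft.rfft(g, m), m)[: 2 * n - 1] * dx
z = 2 * x[0] + dx * np.arange(2 * n - 1) # grid carrying the convolved density
```

With normal inputs the result can be checked against the exact N(0.5, √1.64) density; the truncation of the grid at ±10 plays the role of the curtailed integration range whose error the paper bounds.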

18.
Abstract

Let us assume that two integrable functions f(t) and φ(t) are defined on an interval a ≤ t ≤ b, that f(t) never increases, and that 0 ≤ φ(t) ≤ 1.

19.
Abstract

Extract

d1. Determine the characteristics of the partial differential equation (*). Show that the initial condition z = 2√x for 0 < x < +∞, y = 0 determines exactly one solution of (*) in a suitable region ω of the xy-plane, and determine this solution (including a usable region ω).

20.
Motivated by the practical challenge of monitoring the performance of a large number of algorithmic trading orders, this paper provides a methodology that leads to automatic discovery of the causes that lie behind poor trading performance. It also gives theoretical foundations to a generic framework for real-time trading analysis. The common name for investigating the causes of bad and good trading performance is transaction cost analysis (TCA; see Rosenthal [Performance Metrics for Algorithmic Traders, 2009]). Automated algorithms handle most of the flow traded on electronic markets (more than 70% in the US, 45% in Europe and 35% in Japan in 2012). The academic literature provides different ways to formalize these algorithms and to show how optimal they can be from a mean-variance (as in Almgren and Chriss [J. Risk, 2000, 3(2), 5–39]), a stochastic control (e.g. Guéant et al. [Math. Financ. Econ., 2013, 7(4), 477–507]), an impulse control (see Bouchard et al. [SIAM J. Financ. Math., 2011, 2(1), 404–438]) or a statistical learning (as used in Laruelle et al. [Math. Financ. Econ., 2013, 7(3), 359–403]) viewpoint. This paper is agnostic about how the algorithm has been built and provides a theoretical formalism for identifying, in real time, the market conditions that influenced its efficiency or inefficiency. For a given set of characteristics describing the market context, selected by a practitioner, we first show how a set of additional derived explanatory factors, called anomaly detectors, can be created for each market order (following, for instance, Cristianini and Shawe-Taylor [An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, 2000]). We then present an online methodology to quantify how this extended set of factors, at any given time, predicts (i.e. has influence on, in the sense of the predictive power or information defined in Basseville and Nikiforov [Detection of Abrupt Changes: Theory and Application, 1993], Shannon [Bell Syst. Tech. J., 1948, 27, 379–423] and Alkoot and Kittler [Pattern Recogn. Lett., 1999, 20(11), 1361–1369]) which of the orders are underperforming, while calculating the predictive power of this explanatory factor set. Armed with this information, which we call influence analysis, we intend to empower the order-monitoring user to take appropriate action on any affected orders: re-calibrating the trading algorithms working the order with new parameters, pausing their execution, or taking over more direct trading control. This method can also be used to automatically adjust trading actions in the post-trade analysis of algorithms.
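A toy version of such an influence analysis (factor names and setup entirely hypothetical; a sketch, not the paper's framework): an online logistic model whose weights score how strongly each market-context factor predicts that an order underperforms.

```python
import math
import random

def train_influence_model(rows, labels, lr=0.1, epochs=200):
    """Online logistic regression: each weight measures how much the
    corresponding factor 'influences' the odds of underperformance."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            logit = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-logit))
            g = p - y                                   # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

# synthetic orders with two hypothetical factors: [spread_widening, vol_spike];
# underperformance is driven by the spread factor only
random.seed(0)
rows, labels = [], []
for _ in range(400):
    spread, vol = random.random(), random.random()
    rows.append([spread, vol])
    labels.append(1 if spread > 0.6 else 0)

w, b = train_influence_model(rows, labels)
```

After training, the weight on the spread factor dominates, flagging it as the likely cause of the underperforming orders, which is the kind of automated attribution the paper formalizes.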
