Similar literature
20 similar articles found (search time: 15 ms)
1.
Monte Carlo simulation is an important method for mining-investment risk analysis. This paper introduces the idea and concrete steps of the Monte Carlo method and common ways of generating random numbers, describes successful applications of the method, and briefly discusses its advantages as well as the problems encountered in current practice.
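The basic procedure the abstract describes can be sketched in a few lines: draw each uncertain input from an assumed distribution, evaluate the outcome, and summarise the resulting distribution. The profit model and all parameter values below are illustrative assumptions, not taken from the paper:

```python
import random
import statistics

def simulate_profit(n_trials=10_000, seed=42):
    """Monte Carlo risk analysis: sample uncertain inputs, collect outcomes."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_trials):
        price = rng.gauss(50.0, 8.0)    # selling price per tonne (hypothetical)
        cost = rng.gauss(30.0, 5.0)     # operating cost per tonne (hypothetical)
        volume = rng.uniform(0.8, 1.2)  # output in million tonnes (hypothetical)
        outcomes.append((price - cost) * volume)
    return statistics.mean(outcomes), statistics.stdev(outcomes)

mean_profit, sd_profit = simulate_profit()
```

The spread of the simulated outcomes, not just the mean, is what makes the method useful for risk analysis.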

2.
We describe in this paper a variance reduction method based on control variates. The technique uses the fact that, if all stochastic assets but one are replaced in the payoff function by their mean, the resulting integral can most often be evaluated in closed form. We exploit this idea by applying the univariate payoff as control variate and develop a general Monte Carlo procedure, called Mean Monte Carlo (MMC). The method is then tested on a variety of multifactor options and compared to other Monte Carlo approaches or numerical techniques. The method is easy to implement, broadly applicable, and gives good results, especially in low to medium dimensions and in high-volatility environments.
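The control-variate idea can be illustrated with a toy two-factor payoff max(X + Y − K, 0): freezing Y at its mean gives a univariate payoff whose expectation is known in closed form. The payoff, the standard-normal factors and the strike below are illustrative assumptions, not the paper's option models:

```python
import math
import random
import statistics

def mmc_estimate(n=20_000, seed=1):
    """Mean Monte Carlo sketch: the payoff with the second factor frozen
    at its mean serves as the control variate."""
    rng = random.Random(seed)
    strike = 0.0
    f_vals, g_vals = [], []
    for _ in range(n):
        x, y = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        f_vals.append(max(x + y - strike, 0.0))  # two-factor payoff (illustrative)
        g_vals.append(max(x - strike, 0.0))      # univariate payoff: Y at its mean 0
    g_exact = 1.0 / math.sqrt(2.0 * math.pi)     # closed-form E[max(X, 0)], X ~ N(0, 1)
    f_bar, g_bar = statistics.mean(f_vals), statistics.mean(g_vals)
    cov = sum((f - f_bar) * (g - g_bar) for f, g in zip(f_vals, g_vals)) / (n - 1)
    beta = cov / statistics.variance(g_vals)     # variance-minimising coefficient
    return f_bar - beta * (g_bar - g_exact)

mmc_price = mmc_estimate()
```

Because X appears in both payoffs, the two are highly correlated and the control removes most of the sampling noise.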

3.
A random sample drawn from a population would appear to offer an ideal opportunity to use the bootstrap in order to perform accurate inference, since the observations of the sample are IID. In this paper, Monte Carlo results suggest that bootstrapping a commonly used index of inequality leads to inference that is not accurate even in very large samples, although inference with poverty indices is satisfactory. We find that the major cause is the extreme sensitivity of many inequality indices to the exact nature of the upper tail of the income distribution. This leads us to study two non-standard bootstraps, the m out of n bootstrap, which is valid in some situations where the standard bootstrap fails, and a bootstrap in which the upper tail is modelled parametrically. Monte Carlo results suggest that accurate inference can be achieved with this last method in moderately large samples.
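A minimal sketch of the m out of n bootstrap applied to an inequality index: the Gini formula is standard, while the lognormal income sample and the choices of m = 400 and 500 replications are illustrative assumptions, not the paper's designs:

```python
import random

def gini(values):
    """Gini coefficient from the sorted-sample formula."""
    xs = sorted(values)
    n = len(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2.0 * weighted / (n * sum(xs)) - (n + 1) / n

def m_out_of_n_interval(sample, m, reps=500, seed=7):
    """m out of n bootstrap: resample only m < n observations with replacement
    and report the central 95% range of the replicated statistic."""
    rng = random.Random(seed)
    stats = sorted(gini(rng.choices(sample, k=m)) for _ in range(reps))
    return stats[int(0.025 * reps)], stats[int(0.975 * reps)]

rng = random.Random(0)
incomes = [rng.lognormvariate(0.0, 1.0) for _ in range(2000)]  # heavy-tailed incomes
lo, hi = m_out_of_n_interval(incomes, m=400)
```

Resampling fewer than n observations reduces the influence of the extreme upper-tail draws that make the standard bootstrap unreliable for inequality indices.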

4.
The calculation of likelihood functions of many econometric models requires the evaluation of integrals without analytical solutions. Approaches for extending Gaussian quadrature to multiple dimensions discussed in the literature are either very specific or suffer from exponentially rising computational costs in the number of dimensions. We propose an extension that is very general and easily implemented, and does not suffer from the curse of dimensionality. Monte Carlo experiments for the mixed logit model indicate the superior performance of the proposed method over simulation techniques.

5.
Hypothesis tests on cointegrating vectors based on the asymptotic distributions of the test statistics are known to suffer from severe small-sample size distortion. In this paper an alternative bootstrap procedure is proposed and evaluated through a Monte Carlo experiment, finding that the Type I errors are close to the nominal significance levels but power may not be entirely adequate. It is then shown that a combined test based on the outcomes of both the asymptotic and the bootstrap tests will have both correct size and low Type II error, therefore improving on the currently available procedures.

6.
An article by Chan et al. (2013) published in the Journal of Business and Economic Statistics introduces a new model for trend inflation. They allow the trend inflation to evolve according to a bounded random walk. In order to draw the latent states from their respective conditional posteriors, they use accept–reject Metropolis–Hastings procedures. We reproduce their results using particle Markov chain Monte Carlo (PMCMC), which approaches drawing the latent states from a different technical point of view by relying on combining Markov chain Monte Carlo and sequential Monte Carlo methods. To conclude: we are able to reproduce the results of Chan et al. (2013). Copyright © 2015 John Wiley & Sons, Ltd.

7.
The Wooldridge method is based on a simple and novel strategy for dealing with the initial values problem in nonlinear dynamic random-effects panel data models. This characteristic makes the method very attractive in empirical applications. However, its finite-sample performance and robustness are not yet fully known. In this paper we investigate the performance and robustness of this method against three benchmarks: an ideal case in which the initial values are known constants, a worst case based on an exogenous initial values assumption, and Heckman's reduced-form approximation method, which is widely used in the literature. The dynamic random-effects probit and Tobit (type I) models are used as working examples. Various designs of the Monte Carlo experiments and two further empirical illustrations are provided. The results suggest that the Wooldridge method works very well only for panels of moderately long duration (longer than 5–8 periods). Heckman's reduced-form approximation is suggested for short panels (shorter than 5 periods). It is also found that all the methods tend to perform equally well for panels of long duration (longer than 15–20 periods). Copyright © 2011 John Wiley & Sons, Ltd.

8.
We characterize the robustness of subsampling procedures by deriving a formula for the breakdown point of subsampling quantiles. This breakdown point can be very low for moderate subsampling block sizes, which implies the fragility of subsampling procedures, even when they are applied to robust statistics. This instability also arises for data-driven block size selection procedures minimizing the minimum confidence interval volatility index, but it can be mitigated if a more robust calibration method is applied instead. To overcome these robustness problems, we introduce a consistent robust subsampling procedure for M-estimators and derive explicit subsampling quantile breakdown point characterizations for MM-estimators in the linear regression model. Monte Carlo simulations in two settings where the bootstrap fails show the accuracy and robustness of the robust subsampling relative to the standard subsampling.

9.
This article explores the use of risk analysis as a major feature of the corporate planning process operated by the British Railways Board (BRB). The risk-analysis procedures involve the use of fractile analysis and Monte Carlo simulation. They were first used in connection with the forecasts produced in the BRB's 1983 Corporate Plan and, more particularly, its Railways business element, which is known as the Rail Plan. This was the first time that the BRB had used risk analysis of this kind but, because of its success, the procedures discussed in this article are now incorporated as a permanent feature of its corporate planning process.

10.
In this article, we propose new Monte Carlo methods for computing a single marginal likelihood or several marginal likelihoods for the purpose of Bayesian model comparisons. The methods are motivated by Bayesian variable selection, in which the marginal likelihoods of all candidate subset models must be computed. The proposed estimators use only a single Markov chain Monte Carlo (MCMC) output from the joint posterior distribution and do not require knowledge of the specific structure or form of the sampling algorithm used to generate the MCMC sample. The theoretical properties of the proposed method are examined in detail. The applicability and usefulness of the proposed method are demonstrated via ordinal data probit regression models. A real dataset involving ordinal outcomes is used to further illustrate the proposed methodology.

11.
Abstract. A large number of different pseudo-R² measures for some common limited dependent variable models are surveyed. The measures include those based solely on the maximized likelihoods with and without the restriction that slope coefficients are zero, those that require further calculations based on parameter estimates of the coefficients and variances, and those based solely on whether the qualitative predictions of the model are correct or not. The theme of the survey is that, while there is no obvious criterion for choosing which pseudo-R² to use, if the estimation is in the context of an underlying latent dependent variable model, a case can be made for basing the choice on the strength of the numerical relationship to the OLS R² of the latent dependent variable. As such an OLS R² can be known in a Monte Carlo simulation, we summarize Monte Carlo results for some important latent dependent variable models (binary probit, ordinal probit and Tobit) and find that a pseudo-R² measure due to McKelvey and Zavoina scores consistently well under our criterion. We also very briefly discuss pseudo-R² measures for count data, for duration models and for prediction-realization tables.

12.
Many cases of strategic interaction between agents involve a continuous set of choices, and it is natural to model these problems as continuous-space games. Consequently, the population of agents playing the game is represented by a density function defined over the continuous set of strategy choices. Simulating evolutionary dynamics on continuous strategy spaces is a challenging problem: the classic approach of discretizing the strategy space is ineffective for multidimensional strategy spaces. We present a principled approach to the simulation of adaptive dynamics in continuous-space games using sequential Monte Carlo methods. Sequential Monte Carlo methods use a set of weighted random samples, called particles, to represent density functions over multidimensional spaces, and they provide computationally efficient ways of computing the evolution of probability density functions. We employ resampling and smoothing steps to prevent the particle degeneracy problem associated with particle estimates. The resulting algorithm can be interpreted as an agent-based simulation with elements of natural selection, regression to the mean and mutation. We illustrate the performance of the proposed simulation technique using two examples: a continuous version of the repeated prisoner's dilemma game and the evolution of bidding functions in first-price sealed-bid auctions.
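The selection-resampling-mutation cycle described above can be sketched for a one-dimensional strategy space. The Gaussian fitness function, the peak at 0.5 and all tuning constants are illustrative assumptions, not the games studied in the paper:

```python
import math
import random

def smc_step(particles, fitness, noise=0.05, seed=None):
    """One adaptive-dynamics step on a particle approximation: selection
    (fitness-proportional resampling) followed by mutation (Gaussian jitter)."""
    rng = random.Random(seed)
    weights = [fitness(p) for p in particles]
    # multinomial resampling: fitter strategies leave more descendants
    resampled = rng.choices(particles, weights=weights, k=len(particles))
    # smoothing/mutation step guards against particle degeneracy
    return [p + rng.gauss(0.0, noise) for p in resampled]

rng = random.Random(3)
pop = [rng.uniform(0.0, 1.0) for _ in range(500)]  # initial strategy density on [0, 1]
for t in range(30):
    pop = smc_step(pop, lambda s: math.exp(-20.0 * (s - 0.5) ** 2), seed=t)
mean_strategy = sum(pop) / len(pop)  # population concentrates near the fitness peak
```

The jitter plays the role of the smoothing step: without it, repeated resampling would collapse the particle set onto a handful of identical strategies.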

13.
Today quasi-Monte Carlo methods are used successfully in computational finance and economics as an alternative to the Monte Carlo method. One drawback of these methods, however, is the lack of a practical way of error estimation. To address this issue several researchers introduced the so-called randomized quasi-Monte Carlo methods in the last decade. In this paper we will present a survey of randomized quasi-Monte Carlo methods, and compare their efficiencies with the efficiency of the Monte Carlo method in pricing certain securities. We will also investigate the effects of Box–Muller and inverse transformation techniques when they are applied to low-discrepancy sequences.
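A sketch of one randomization scheme of the kind the abstract surveys: a van der Corput low-discrepancy sequence with a Cranley-Patterson random shift, combined with the inverse transformation to map uniforms to normals. The toy payoff E[max(Z − K, 0)] and all sizes are illustrative assumptions, not the securities priced in the paper:

```python
import math
import random
import statistics

def van_der_corput(n, base=2):
    """Radical-inverse (van der Corput) sequence: a basic 1-D low-discrepancy set."""
    points = []
    for i in range(1, n + 1):
        q, denom, k = 0.0, 1.0, i
        while k > 0:
            denom *= base
            q += (k % base) / denom
            k //= base
        points.append(q)
    return points

def rqmc_estimate(n=4096, shifts=8, strike=1.0):
    """Randomized QMC: each random shift gives an unbiased estimate, so the
    spread across shifts provides the practical error estimate plain QMC lacks."""
    nd = statistics.NormalDist()
    base_points = van_der_corput(n)
    estimates = []
    for s in range(shifts):
        shift = random.Random(s).random()
        total = 0.0
        for u in base_points:
            z = nd.inv_cdf((u + shift) % 1.0)  # inverse-transform a shifted point
            total += max(z - strike, 0.0)      # toy payoff, Z ~ N(0, 1)
        estimates.append(total / n)
    return sum(estimates) / shifts, statistics.stdev(estimates)

estimate, spread = rqmc_estimate()
```

`statistics.NormalDist.inv_cdf` requires Python 3.8+; the Box-Muller alternative mentioned in the abstract would consume the uniforms in pairs instead.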

14.
Vector autoregressions (VARs) are important tools in time series analysis. However, relatively little is known about the finite-sample behaviour of parameter estimators. We address this issue, by investigating ordinary least squares (OLS) estimators given a data generating process that is a purely nonstationary first-order VAR. Specifically, we use Monte Carlo simulation and numerical optimisation to derive response surfaces for OLS bias and variance, in terms of VAR dimensions, given correct specification and several types of over-parameterisation of the model: we include a constant, and a constant and trend, and introduce excess lags. We then examine the correction factors that are required for the least squares estimator to attain the minimum mean squared error (MSE). Our results improve and extend one of the main finite-sample multivariate analytical bias results of Abadir, Hadri and Tzavalis [Abadir, K.M., Hadri, K., Tzavalis, E., 1999. The influence of VAR dimensions on estimator biases. Econometrica 67, 163–181], generalise the univariate variance and MSE findings of Abadir [Abadir, K.M., 1995. Unbiased estimation as a solution to testing for random walks. Economics Letters 47, 263–268] to the multivariate setting, and complement various asymptotic studies.
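The kind of Monte Carlo bias experiment described above can be sketched in the simplest case: a univariate random walk estimated by OLS with no deterministic terms. The sample size and replication count are illustrative assumptions; the paper's multivariate response surfaces are not reproduced here:

```python
import random

def ols_rho(y):
    """OLS slope in the no-intercept autoregression y_t = rho * y_{t-1} + e_t."""
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den

def unit_root_bias(T=100, reps=2000, seed=11):
    """Monte Carlo estimate of the finite-sample bias of rho-hat when the
    true process is a pure random walk (rho = 1)."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(reps):
        y = [0.0]
        for _ in range(T):
            y.append(y[-1] + rng.gauss(0.0, 1.0))
        estimates.append(ols_rho(y))
    return sum(estimates) / reps - 1.0

bias = unit_root_bias()  # negative: OLS is biased downward at the unit root
```

Repeating this over a grid of sample sizes and model dimensions is what produces a response surface for the bias.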

15.
Although convincing arguments have been put forward for continuous-time modeling, its use in panel research is rare. In one approach, classical N = 1 state-space modeling procedures are adapted for panel analysis to estimate the exact discrete model (EDM) by means of filter techniques. Based on earlier less satisfactory indirect methods, a more recent approach uses structural equation modeling (SEM) to get the maximum likelihood estimate of the EDM by the direct method. After an introduction into continuous-time state-space modeling for panel data and the EDM, a thorough comparison is made between the two distinct approaches with quite different histories by means of Monte Carlo simulation studies. The model used in the simulation studies is the damped linear oscillator with and without random subject effects.

16.
Tests of ARCH are a routine diagnostic in empirical econometric and financial analysis. However, it is well known that misspecification of the conditional mean may lead to spurious rejection of the null hypothesis of no ARCH. Nonlinearity is a prime example of this phenomenon. There is little work on the extent of the effect of neglected nonlinearity on the properties of ARCH tests. We investigate this using new ARCH testing procedures that are robust to the presence of neglected nonlinearity. Monte Carlo evidence shows that the problem is serious and that the new methods alleviate this problem to a very large extent. We apply the new tests to exchange rate data and find substantial evidence of spurious rejection of the null hypothesis of no ARCH.
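For reference, a minimal sketch of the classic (non-robust) LM test for ARCH(1) that such studies start from, applied to a simulated ARCH series. The data-generating parameters are illustrative assumptions, and the paper's robustified versions are not reproduced here:

```python
import random

def arch_lm(residuals):
    """Engle-style LM statistic for ARCH(1): T * R^2 from regressing e_t^2 on
    e_{t-1}^2. Under the null of no ARCH it is approximately chi-squared(1)."""
    y = [e * e for e in residuals[1:]]
    x = [e * e for e in residuals[:-1]]
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return n * (sxy * sxy) / (sxx * syy)

rng = random.Random(9)
errors, prev = [], 0.0
for _ in range(1000):
    sigma2 = 0.2 + 0.5 * prev * prev   # ARCH(1) conditional variance (hypothetical)
    prev = rng.gauss(0.0, 1.0) * sigma2 ** 0.5
    errors.append(prev)
stat = arch_lm(errors)   # large values reject the null of no ARCH
```

Feeding this statistic residuals from a misspecified (e.g. nonlinear) mean equation is exactly how the spurious rejections the abstract studies arise.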

17.
In this paper, we present an estimation procedure which uses both option prices and high-frequency spot price feeds to estimate jointly the objective and risk-neutral parameters of stochastic volatility models. The procedure is based on a method of moments that uses analytical expressions for the moments of the integrated volatility and series expansions of option prices and implied volatilities. This results in an easily implementable and rapid estimation technique. An extensive Monte Carlo study compares various procedures and shows the efficiency of our approach. Empirical applications to the Deutsche mark–US dollar exchange rate futures and the S&P 500 index provide evidence that the method delivers results that are in line with the ones obtained in previous studies where much more involved estimation procedures were used.

18.
Journal of Econometrics, 2005, 126(2): 355–384
In this paper, we propose simulation-based Bayesian inference procedures in a cost system that includes the cost function and the cost share equations augmented to accommodate technical and allocative inefficiency. Markov chain Monte Carlo techniques are proposed and implemented for Bayesian inferences on costs of technical and allocative inefficiency, input price distortions and over- (under-) use of inputs. We show how to estimate a well-specified translog system (in which the error terms in the cost and cost share equations are internally consistent) in a random effects framework. The new methods are illustrated using panel data on U.S. commercial banks.

19.
ABSTRACT This paper investigates through Monte Carlo experiments both the size and power properties of a bootstrapped trace statistic in two prototypical DGPs. The Monte Carlo results indicate that the ordinary bootstrap has size and power properties similar to those of inference procedures based on asymptotic critical values. Considering empirical size, the stationary bootstrap is found to provide a uniform improvement over the ordinary bootstrap if the dynamics are underspecified. The use of the stationary bootstrap as a diagnostic tool is suggested. In two illustrative examples this seems to work, and again it appears that the bootstrap incorporates the finite-sample correction required for the asymptotic critical values to apply.

20.
An earlier paper [Kloek and Van Dijk (1978)] is extended in three ways. First, Monte Carlo integration is performed in a nine-dimensional parameter space of Klein's model I [Klein (1950)]. Second, Monte Carlo is used as a tool for the elicitation of a uniform prior on a finite region by making use of several types of prior information. Third, special attention is given to procedures for the construction of importance functions which make use of nonlinear optimization methods.
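Importance-sampling Monte Carlo integration of the kind this line of work builds on can be sketched in one dimension: draw from an explicit importance function, weight by the ratio of the posterior kernel to that density, and self-normalise. The quartic posterior kernel and the Gaussian importance density below are illustrative assumptions, not Klein's model I:

```python
import math
import random

def importance_sampling_mean(n=50_000, seed=5):
    """Self-normalised importance sampling: posterior kernel proportional to
    exp(-theta^4 / 4), importance function N(0, 1.5^2); returns E[theta^2]."""
    rng = random.Random(seed)
    scale = 1.5
    num = den = 0.0
    for _ in range(n):
        theta = rng.gauss(0.0, scale)
        q = math.exp(-theta * theta / (2.0 * scale * scale)) / (scale * math.sqrt(2.0 * math.pi))
        w = math.exp(-theta ** 4 / 4.0) / q   # unnormalised posterior / importance density
        num += w * theta * theta
        den += w
    return num / den

posterior_mean_sq = importance_sampling_mean()
```

Choosing the importance function so its tails dominate the posterior's, as here, is the construction problem the paper's third extension addresses with nonlinear optimization.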
