Similar Literature
1.
Volatility models have been playing important roles in economics and finance. Using a generalized spectral second-order derivative approach, we propose a new class of generally applicable omnibus tests for the adequacy of linear and nonlinear volatility models. Our tests have a convenient asymptotic null N(0,1) distribution, and can detect a wide range of misspecifications for volatility dynamics, including both neglected linear and nonlinear volatility dynamics. Distinct from the existing diagnostic tests for volatility models, our tests are robust to time-varying higher order moments of unknown form (e.g., time-varying skewness and kurtosis). They check a large number of lags and are therefore expected to be powerful against neglected volatility dynamics that occur at higher order lags or display long memory properties. Despite using a large number of lags, our tests do not suffer much from the loss of a large number of degrees of freedom, because our approach naturally discounts higher order lags, which is consistent with the stylized fact that economic and financial markets are affected more by recent events than by the remote past. No specific estimation method is required, and parameter estimation uncertainty has no impact on the convenient limit N(0,1) distribution of the test statistics. Moreover, there is no need to formulate an alternative volatility model, and only estimated standardized residuals are needed to implement our tests. We do not have to calculate tedious, model-specific score functions or derivatives of volatility models with respect to estimated parameters, which are required in some existing popular diagnostic tests for volatility models. We examine the finite sample performance of the proposed tests. It is documented that the new tests are rather powerful in detecting neglected nonlinear volatility dynamics that the existing tests can easily miss.
They are useful diagnostic tools for practitioners when modelling volatility dynamics.

2.
This paper studies the extension of Harsanyi’s theorem (Harsanyi, 1955) in a framework involving uncertainty. It seeks to extend the aggregation result to a wide class of Monotonic Bernoullian and Archimedean preferences (Cerreia-Vioglio et al., 2011) that subsumes many models of choice under uncertainty proposed in the literature. An impossibility result is obtained, unless we are in the specific framework where all individuals and the social observer are subjective expected utility maximizers sharing the same beliefs. This implies that non-expected utility preferences cannot be aggregated consistently.

3.
This paper uses a k-th order nonparametric Granger causality test to analyze whether firm-level, economic policy and macroeconomic uncertainty indicators predict movements in real stock returns and their volatility. Linear Granger causality tests show that whilst economic policy and macroeconomic uncertainty indices can predict stock returns, firm-level uncertainty measures possess no predictability. However, given the existence of structural breaks and inherent nonlinearities in the series, we employ a nonparametric causality methodology, as linear modeling leads to misspecification and its results cannot be considered reliable. The nonparametric test reveals that in fact no predictability can be observed for the various measures of uncertainty, i.e., firm-level, macroeconomic and economic policy uncertainty, vis-à-vis real stock returns. In turn, a profound causal predictability is demonstrated for the volatility series, with the exception of firm-level uncertainty. Overall, our results not only emphasize the role of economic and firm-level uncertainty measures in predicting the volatility of stock returns, but also caution against using linear models, which are likely to suffer from misspecification in the presence of parameter instability and nonlinear spillover effects.

4.
A desirable property of a forecast is that it encompasses competing predictions, in the sense that the accuracy of the preferred forecast cannot be improved through linear combination with a rival prediction. In this paper, we investigate the impact of the uncertainty associated with estimating model parameters in-sample on the encompassing properties of out-of-sample forecasts. Specifically, using examples of non-nested econometric models, we show that forecasts from the true (but estimated) data generating process (DGP) do not encompass forecasts from competing mis-specified models in general, particularly when the number of in-sample observations is small. Following this result, we also examine the scope for achieving gains in accuracy by combining the forecasts from the DGP and mis-specified models.

5.
In this paper it is argued, in the context of a Walrasian market game, that the state-contingent representation of uncertainty is not compatible with the existence of what we call systematic uncertainty, that is, uncertainty that cannot be removed whatever the decision maker's conduct and past experience. The implications of this conclusion for the foundations of decision theory are also drawn.

6.

We investigate the specification and power of intraday event study test statistics. Mean, market, and matched firm models generate well-specified return results for a range of intervals up to 60 min around the event. These models detect return shocks equivalent to one spread in one-minute interval data and three spreads in longer intervals. Researchers using intraday return event studies can, therefore, be confident in their robustness. Some volume event study approaches have reasonable power but they are not generally well specified, while a matched-firm approach gives the best combination of specification and power for spread event studies.
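The mean-adjusted abnormal-return calculation underlying such an event study can be sketched as follows. This is a toy simulation with hypothetical one-minute returns; the window lengths and the injected shock size are illustrative assumptions, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical one-minute returns: a 200-interval estimation window,
# then an 11-interval event window with a shock injected at event time.
estimation = rng.normal(0.0, 0.001, 200)
event_window = rng.normal(0.0, 0.001, 11)
event_window[5] += 0.006  # injected return shock (illustrative size)

# Mean model: abnormal return = return minus the estimation-window mean,
# standardized by the estimation-window standard deviation.
mu = estimation.mean()
sigma = estimation.std(ddof=1)
ar = event_window - mu
t_stat = ar[5] / sigma
print(f"standardized abnormal return at the event: {t_stat:.2f}")
```

A standardized abnormal return well above conventional critical values at the event interval, and noise-level values elsewhere, is the pattern a well-specified, powerful test should produce.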

7.
We propose an easily implementable test of the validity of a set of theoretical restrictions on the relationship between economic variables, which do not necessarily identify the data generating process. The restrictions can be derived from any model of interactions, allowing censoring and multiple equilibria. When the restrictions are parameterized, the test can be inverted to yield confidence regions for partially identified parameters, thereby complementing other proposals, primarily Chernozhukov et al. [Chernozhukov, V., Hong, H., Tamer, E., 2007. Estimation and confidence regions for parameter sets in econometric models. Econometrica 75, 1243–1285].

8.
Planning and scheduling significantly influence organizational performance, but literature that pays attention to how organizations could or should organize and assess their planning processes is limited. We extend planning and scheduling theory with a categorization of scheduling performance criteria, based on a three-stage survey research design. Particularly, the results show that, next to schedule quality, the planning process factors timeliness, flexibility, communication, and negotiation are important performance criteria, and especially so in organizations that are faced with high levels of uncertainty. The results suggest that organizational and behavioral aspects of planning and scheduling cannot be mitigated with advanced models and software that solely focus on good schedules. Rather, high quality schedules and high quality scheduling processes need to be facilitated simultaneously to attain high planning and scheduling performance.

9.
We propose a new method to explore the information content of fixed-event forecasts and estimate structural parameters that are key to sticky and noisy information models. Estimation follows a regression-based framework in which estimated coefficients map one-to-one with parameters that measure the degree of information rigidity. The statistical characterization of the regression errors reflects the laws that govern expectation formation under sticky and noisy information; that is, it is coherent with the theory. This strategy is still unexplored in the literature and potentially enhances the reliability of inference results. The method also allows linking estimation results to the signal-to-noise ratio, an important parameter of noisy information models. This task cannot be accomplished if one adopts an “agnostic” characterization of regression errors. As for the empirical results, they show a substantial degree of information rigidity in the countries studied. They also suggest that the theoretical characterization of regression errors yields a more conservative picture of the uncertainty surrounding parameter estimates.

10.

Parameter uncertainty has fuelled criticisms of the robustness of results from computable general equilibrium models. This has led to the development of alternative sensitivity analysis approaches. Researchers have used Monte Carlo analysis for systematic sensitivity analysis because of its flexibility, but Monte Carlo analysis may yield biased simulation results. Gaussian quadratures have also been widely applied, although they can be difficult to apply in practice. This paper applies an alternative approach to systematic sensitivity analysis, Monte Carlo filtering, and examines how its results compare to both the Monte Carlo and Gaussian quadrature approaches. It does so via an application to rural development policies in Aberdeenshire, Scotland. We find that Monte Carlo filtering outperforms the conventional Monte Carlo approach and is a viable alternative when a Gaussian quadrature approach cannot be applied or is too complex to implement.
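Monte Carlo filtering in this sense can be sketched as: sample the uncertain parameters, run the model for each draw, split the runs into "behavioural" and "non-behavioural" by an output criterion, and compare the two parameter samples with a two-sample Kolmogorov-Smirnov test. The toy model and parameter names below are hypothetical stand-ins for a CGE model, not the paper's specification.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Toy stand-in for a model response: output depends strongly on
# elasticity e1 and hardly at all on e2 (both hypothetical parameters).
def model(e1, e2):
    return 2.0 * e1 + 0.01 * e2

n = 2000
e1 = rng.uniform(0.5, 1.5, n)
e2 = rng.uniform(0.5, 1.5, n)
y = model(e1, e2)

# Filter: "behavioural" runs are those whose output exceeds a target.
behavioural = y > np.median(y)

# A large KS distance between the filtered and non-filtered parameter
# samples flags that parameter as driving the result.
d1 = ks_2samp(e1[behavioural], e1[~behavioural]).statistic
d2 = ks_2samp(e2[behavioural], e2[~behavioural]).statistic
print(f"KS distance e1: {d1:.2f}, e2: {d2:.2f}")
```

The influential parameter `e1` shows a large KS distance between the two subsamples, while the inert `e2` does not; this ranking of parameters by influence is the output of the filtering step.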

11.
Assessing regional population compositions is an important task in many research fields. Small area estimation with generalized linear mixed models marks a powerful tool for this purpose. However, the method has limitations in practice. When the data are subject to measurement errors, small area models produce inefficient or biased results since they cannot account for data uncertainty. This is particularly problematic for composition prediction, since generalized linear mixed models often rely on approximate likelihood inference, so the obtained predictions are not reliable. We propose a robust multivariate Fay–Herriot model to solve these issues. It combines compositional data analysis with robust optimization theory. The nonlinear estimation of compositions is restated as a linear problem through isometric logratio transformations. Robust model parameter estimation is performed via penalized maximum likelihood. A robust best predictor is derived. Simulations are conducted to demonstrate the effectiveness of the approach. An application to alcohol consumption in Germany is provided.
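The isometric logratio step can be sketched with pivot coordinates, a generic textbook construction assumed here for illustration; the paper's exact orthonormal basis may differ. The transform maps a D-part composition (positive parts summing to one) to D-1 unconstrained real coordinates, on which linear models can operate.

```python
import numpy as np

def ilr(x):
    """Isometric logratio transform of a composition via pivot coordinates.

    Maps a D-part composition to D-1 unconstrained real coordinates:
    z_i = sqrt(i/(i+1)) * ln(gmean(x_1..x_i) / x_{i+1}).
    """
    x = np.asarray(x, dtype=float)
    D = x.shape[-1]
    z = np.empty(x.shape[:-1] + (D - 1,))
    for i in range(1, D):
        # Balance between the geometric mean of the first i parts
        # and part i+1.
        gm = np.exp(np.mean(np.log(x[..., :i]), axis=-1))
        z[..., i - 1] = np.sqrt(i / (i + 1)) * np.log(gm / x[..., i])
    return z

comp = np.array([0.2, 0.3, 0.5])
coords = ilr(comp)
print(coords)  # two unconstrained coordinates
```

A uniform composition maps to the zero vector, and the unit-sum constraint disappears in coordinate space, which is what lets the composition model be restated as a linear one.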

12.
This paper provides empirical evidence on the dynamic effects of uncertainty on firm‐level capital accumulation. A novelty in this paper is that the firm‐level uncertainty indicator is motivated and derived from a theoretical model, the neoclassical investment model with time to build. This model also serves as the base for the empirical work, where an error‐correction approach is employed. I find a negative effect of uncertainty on capital accumulation, both in the short run and the long run. This outcome cannot be explained by the model alone. Instead, the results suggest that the predominant mechanism at work stems from irreversibility constraints.

13.
Skepticism toward traditional identifying assumptions based on exclusion restrictions has led to a surge in the use of structural VAR models in which structural shocks are identified by restricting the sign of the responses of selected macroeconomic aggregates to these shocks. Researchers commonly report the vector of pointwise posterior medians of the impulse responses as a measure of central tendency of the estimated response functions, along with pointwise 68% posterior error bands. It can be shown that this approach cannot be used to characterize the central tendency of the structural impulse response functions. We propose an alternative method of summarizing the evidence from sign-identified VAR models designed to enhance their practical usefulness. Our objective is to characterize the most likely admissible model(s) within the set of structural VAR models that satisfy the sign restrictions. We show how the set of most likely structural response functions can be computed from the posterior mode of the joint distribution of admissible models both in the fully identified and in the partially identified case, and we propose a highest-posterior density credible set that characterizes the joint uncertainty about this set. Our approach can also be used to resolve the long-standing problem of how to conduct joint inference on sets of structural impulse response functions in exactly identified VAR models. We illustrate the differences between our approach and the traditional approach for the analysis of the effects of monetary policy shocks and of the effects of oil demand and oil supply shocks.

14.
Decision making and planning under low levels of predictability (cited 3 times: 0 self-citations, 3 by others)
This special section aims to demonstrate the limited predictability and high level of uncertainty in practically all important areas of our lives, and the implications of this. It summarizes the huge body of solid empirical evidence accumulated over the past several decades that proves the disastrous consequences of inaccurate forecasts in areas ranging from the economy and business to floods and medicine. The big problem is, however, that the great majority of people, decision and policy makers alike, still believe not only that accurate forecasting is possible, but also that uncertainty can be reliably assessed. Reality, however, shows otherwise, as this special section proves. This paper discusses forecasting accuracy and uncertainty, and distinguishes three distinct types of predictions: those relying on patterns for forecasting, those utilizing relationships as their basis, and those for which human judgment is the major determinant of the forecast. In addition, the major problems and challenges facing forecasters and the reasons why uncertainty cannot be assessed reliably are discussed using four large data sets. There is also a summary of the eleven papers included in this special section, as well as some concluding remarks emphasizing the need to be rational and realistic about our expectations and avoid the common delusions related to forecasting.

15.
The identification of structural parameters in the linear instrumental variables (IV) model is typically achieved by imposing the prior identifying assumption that the error term in the structural equation of interest is orthogonal to the instruments. Since this exclusion restriction is fundamentally untestable, there are often legitimate doubts about the extent to which the exclusion restriction holds. In this paper I illustrate the effects of such prior uncertainty about the validity of the exclusion restriction on inferences based on linear IV models. Using a Bayesian approach, I provide a mapping from prior uncertainty about the exclusion restriction into increased uncertainty about parameters of interest. Moderate prior uncertainty about exclusion restrictions can lead to a substantial loss of precision in estimates of structural parameters. This loss of precision is relatively more important in situations where IV estimates appear to be more precise, for example in larger samples or with stronger instruments. I illustrate these points using several prominent recent empirical papers that use linear IV models. An accompanying electronic table allows users to readily explore the robustness of inferences to uncertainty about the exclusion restriction in their particular applications. Copyright © 2010 John Wiley & Sons, Ltd.

16.
韦东 (Wei Dong), 《企业科技与发展》 (Enterprise Science and Technology & Development), 2010, (12): 154-155, 158
After being injured, a crime victim urgently needs funds for emergency medical treatment; but because the criminal suspect has not been identified, has absconded, has no money, or refuses to pay, the victim is often unable to raise the funds needed for treatment, and secondary harm such as death or disability results from delayed care. In addition, a defendant who should pay compensation to the victim or the victim's close relatives may refuse to pay, or be unable to pay in part or in full, so that the victim or the relatives cannot obtain the compensation and the court's judgment becomes an "empty verdict", a "legal IOU". This has given rise to new social problems, such as retaliation, petitioning, and public disturbances by crime victims or their close relatives, which seriously affect social harmony and stability. Only by establishing a system under which the state advances criminal compensation payments can these problems be fundamentally resolved.

17.
Economic uncertainty has only recently begun to appear in research on the determinants of fertility. Therefore, the purpose of this study is to investigate how economic uncertainty affects the fertility rate in Taiwan. Official county-level panel data from 1998 to 2016 for 20 counties are utilized in DIFF-GMM and SYS-GMM models in dynamic panel regression estimation. The major finding of this study is that higher volatility of household disposable income will reduce the fertility rate. The empirical results support the proposition that economic uncertainty might be an important determinant of fertility decisions, explaining the decline in fertility in Taiwan.

18.
This paper evaluates the effects of high‐frequency uncertainty shocks on a set of low‐frequency macroeconomic variables representative of the US economy. Rather than estimating models at the same common low frequency, we use recently developed econometric models, which allow us to deal with data of different sampling frequencies. We find that credit and labor market variables react the most to uncertainty shocks in that they exhibit a prolonged negative response to such shocks. When looking at detailed investment subcategories, our estimates suggest that the most irreversible investment projects are the most affected by uncertainty shocks. We also find that the responses of macroeconomic variables to uncertainty shocks are relatively similar across single‐frequency and mixed‐frequency data models, suggesting that the temporal aggregation bias is not acute in this context.

19.
The selection of inputs and outputs in DEA models represents a vibrant methodological topic. At the same time, however, the problem of the impact of different measurement units of selected inputs is understudied in the empirical literature. Using the example of Czech farms, we show that the DEA method provides neither consistent score estimates nor a stable ranking for different popular measurements of the labour and capital factors of production. For this reason, studies based on DEA efficiency results for differently measured inputs should be compared only with great caution.

20.
This paper presents a model of investment in projects that are characterized by uncertainty over both the construction costs and revenues. Both processes are modeled as spectrally negative Lévy jump-diffusions. The optimal stopping problem that determines the value of the project is solved under fairly general assumptions. It is found that the current value of the benefit-to-cost ratio (BCR) decreases in the frequency of negative shocks to the construction process. This implies that the cost overruns that can be expected if one ignores such shocks are increasing in their frequency. Based on calibrated data, the model is applied to the proposed construction of high-speed rail in the UK, and it is found that its economic case cannot currently be made and is unlikely to be made at any time in the next decade. In addition, it is found that ignoring construction uncertainty leads to a substantial probability of an erroneous decision being taken.
