Similar Articles (20 results)
1.
Evidence of monthly stock-return predictability based on the popular investor sentiment indices SBW and SPLS, introduced by Baker and Wurgler (2006, 2007) and Huang et al. (2015) respectively, is mixed. While linear predictive models show that only SPLS can predict excess stock returns, nonparametric models (which account for misspecification of the linear framework due to nonlinearity and regime changes) find no evidence of predictability based on either index, for either stock returns or their volatility. In this paper, we show that with the more general nonparametric causality-in-quantiles model of Balcilar et al. (forthcoming), both SBW and SPLS in fact predict stock returns and their volatility, with SPLS the relatively stronger predictor of excess returns during bear and bull regimes and SBW the relatively stronger predictor of the volatility of excess returns, except at the median of the conditional distribution.

2.
In modern decision-making processes, ratios or indices of stochastic variables are commonly used criteria. The decision criterion is, however, frequently presented as deterministic even though sampling has been the dominant method of collecting the data underlying the measure. This paper derives the exact distribution of |X/Y| when X and Y are independent Pearson type VII random variables. We provide an application of this result to exchange rate data for the six major currencies. Some computer programs for use in the applications are also provided.
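Before turning to the exact distribution, a Monte Carlo simulation gives a quick picture of the ratio's shape. The sketch below is illustrative only: it uses Student-t margins (the unit-scale special case of the Pearson type VII family), with degrees of freedom chosen arbitrarily rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_t(df=5, size=n)   # X: unit-scale Pearson VII (t with 5 df)
y = rng.standard_t(df=7, size=n)   # Y: unit-scale Pearson VII (t with 7 df)
r = np.abs(x / y)

# The ratio is heavy-tailed: quantiles are stable while the mean is not
print(round(float(np.median(r)), 2))
print(round(float(np.mean(r < 1.0)), 2))   # P(|X/Y| < 1)
```

With symmetric margins of similar tail weight, the median of |X/Y| sits near 1 and roughly half the mass lies below 1; the exact distribution derived in the paper makes these quantities available without simulation.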

3.
The objective of this paper is to develop methodology for modelling time series data on monetary aggregates such as the monetary base and broad money. We briefly review likelihood-based cointegration analysis of I(2) (integrated of order 2) data and I(2)-to-I(1) transformations. The paper then investigates procedures for econometric modelling of monetary aggregates, which, like price indices, are generally deemed to be I(2) variables. It is shown that I(2)-to-I(1) transformations centering on a money multiplier play an important role in the modelling procedures. Finally, the study presents an empirical illustration of the proposed methodology using monetary aggregate data from Japan.

4.
This study examines the long-run relationship between monetary policy and dividend growth in Germany. For this purpose, cointegration between the two variables is tested for over the period 1974 to 2003. However, problems related to spurious regression arise from the mixed order of integration of the series used, from mutual causation between the variables and from the lack of a long-run relationship among the variables of the model. These problems are addressed by applying the bounds testing approach to cointegration in addition to a more standard long-run structural modelling approach. In principle, both procedures are capable of dealing with the controversial issue of the exogeneity of monetary policy vis-à-vis dividend growth. However, the structural modelling approach still leaves a certain degree of uncertainty about the integration properties of the interest rate and dividend growth. We therefore rely on the bounds testing procedure and conclude that, in the longer term, short-term rates drive stock returns but not vice versa.

5.
Previous econometric analyses of patent data rely on regression methods using purely parametric forms of the predictor for modeling the dependence of the response. These approaches lack the capability of identifying potential non-linear relationships between dependent and independent variables. In this paper, we present a Bayesian semiparametric approach making use of Markov chain Monte Carlo (MCMC) simulation techniques which is able to capture these non-linearities. Using this methodology we reanalyze the determinants of patent oppositions in Europe for biotechnology/pharmaceutical and semiconductor/computer software patents. Our semiparametric specification finds considerable non-linearities in the effect of various metrical covariates which have not been discussed previously. Further, a formal model validation based on ROC methodology, which splits the data into training and validation sets, shows a significant improvement in the explanatory and predictive power of our approach compared to purely parametric specifications.

6.
We test for fractional dynamics in US monetary series, their various formulations and components, and velocity series. Using the spectral regression method, we find evidence of a fractional exponent in the differencing process of the monetary series (both simple-sum and Divisia indices), in their components (with the exception of demand deposits, savings deposits, overnight repurchase agreements, and term repurchase agreements), and the monetary base and money multipliers. No evidence of fractional behaviour is found in the velocity series. Granger's (Journal of Econometrics, 25, 1980) aggregation hypothesis is evaluated and implications of the presence of fractional monetary dynamics are drawn.
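The spectral regression used here is the Geweke–Porter–Hudak (GPH) log-periodogram estimator of the fractional differencing parameter d. A minimal sketch on a synthetic ARFIMA(0, d, 0) series follows; the true d = 0.3 and the bandwidth m = √T are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
T, d = 2048, 0.3

# Simulate ARFIMA(0, d, 0): x = (1 - L)^(-d) eps via the MA(inf) weights
psi = np.ones(T)
for k in range(1, T):
    psi[k] = psi[k - 1] * (k - 1 + d) / k
x = np.convolve(rng.standard_normal(T), psi)[:T]

# Periodogram at the first m Fourier frequencies
m = int(T ** 0.5)
lam = 2 * np.pi * np.arange(1, m + 1) / T
dft = np.fft.fft(x - x.mean())[1:m + 1]
I_lam = np.abs(dft) ** 2 / (2 * np.pi * T)

# GPH regression: log I(lam_j) = c - d * log(4 sin^2(lam_j / 2)) + error
regressor = np.log(4 * np.sin(lam / 2) ** 2)
d_hat = -np.polyfit(regressor, np.log(I_lam), 1)[0]
print(round(float(d_hat), 2))   # slope estimate, should lie near the true d
```

A non-zero, non-integer d̂ in the differenced series is the kind of evidence of fractional dynamics the abstract refers to.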

7.
In this paper we test the so-called 'quiet life' hypothesis (QLH), according to which firms with market power are less efficient. Using data on the Italian banking industry for the period 1992–2007, we apply a two-step procedure. First we estimate bank-level cost efficiency scores and Lerner indices. Then we use the estimated market power measures, as well as a vector of control variables, to explain cost efficiency. Our empirical evidence supports the QLH, although the impact of market power on efficiency is modest in magnitude.

8.
Abstract

Objective: To quantify the impact of activities of daily living (ADL) scores on the risk of nursing home placement (NHP) in Alzheimer's disease (AD) patients.

Setting: Models predicting NHP for AD patients have depended on cognitive deterioration as the primary measure. However, there is increased recognition that both patient functioning and cognition are predictive of disease progression.

Methods: Using the database from a prospective, randomised, double-blind trial of rivastigmine and donepezil, two treatments indicated for AD, Cox regression models were constructed to predict the risk of NHP using age, gender, ADL and MMSE (Mini-Mental State Examination) scores as independent variables.

Participants: Patients aged 50–85 years, with MMSE scores of 10–20, and a diagnosis of dementia of the Alzheimer type.

Results: Cox regression analyses indicated that being female, older age, lower ADL score at baseline, and deterioration in ADL all significantly increased the risk of NHP. Over 2 years, risk of NHP increased by 3% for each 1-point deterioration in ADL score independent of cognition.
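Under a proportional-hazards reading of this result, the 3% per-point figure compounds over larger ADL declines. The arithmetic is simple; the 5- and 10-point declines below are illustrative, not study end-points.

```python
# Implied risk multiplier from the reported hazard ratio of 1.03 per
# 1-point ADL deterioration, assuming proportional hazards
hr_per_point = 1.03
for decline in (1, 5, 10):
    print(decline, round(hr_per_point ** decline, 3))
```

A 10-point deterioration thus implies roughly a 34% higher risk of placement, independent of cognition.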

Conclusion: Data analyses from this long-term clinical trial established that daily functioning is an important predictor of time to NHP. Further research may be required to confirm whether this finding translates to the real world.

9.
Scanner data are used to calculate chained, exact (and superlative) hedonic price indexes for television sets. The data source is available for a wide range of goods, the application providing an example of how this method can be more widely applied. The indexes correspond to constant utility, hedonic cost-of-living indexes. The approach improves on the existing direct method, which takes its estimates directly from the coefficients on time dummies in a hedonic regression. It also improves on the matched model method used by statistical agencies. The differences between actual price changes and exact hedonic quality-adjusted price changes are found to be substantial. Base-period and current-period weighted exact hedonic indexes are similar, thus providing good approximations to a superlative index. Estimates from the direct, dummy variable approach were compared to the superlative indexes. The disparities between the results argue for caution in the use of the direct, dummy variable approach to estimating quality-adjusted price changes.
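For contrast, the direct time-dummy method the paper cautions against can be sketched in a few lines: regress log price on characteristics plus a period dummy and read the quality-adjusted price relative off the dummy coefficient. The two-period data and the single "screen size" characteristic below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
period = np.repeat([0.0, 1.0], n // 2)          # 0 = base, 1 = current period
size = rng.uniform(20.0, 40.0, n)               # hypothetical characteristic
log_p = 3.0 + 0.05 * size - 0.10 * period + rng.normal(0.0, 0.05, n)

# Hedonic regression of log price on the characteristic and a time dummy
X = np.column_stack([np.ones(n), size, period])
beta = np.linalg.lstsq(X, log_p, rcond=None)[0]
price_index = np.exp(beta[2])   # quality-adjusted price relative
print(round(float(price_index), 3))
```

The exact hedonic indexes in the paper instead weight characteristics at base- or current-period values, which is what allows them to approximate a superlative index.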

10.
Abstract

Objective:

Cost-effectiveness analysis (CEA) of trial-based data has played an important role in pharmacoeconomics. A regression model can be used to account for patient-level heterogeneity through covariate adjustment in CEA. However, the estimates from CEA can be biased if censoring of effectiveness and costs is ignored. This study proposes a regression model that accounts for both time-to-event effectiveness and cost.

Methods:

A bivariate regression model was proposed to analyze effectiveness and cost simultaneously while taking censored observations into account. The regression coefficients were estimated with a Bayesian approach, drawing a random sample from their posterior distribution via the Markov chain Monte Carlo (MCMC) method. The proposed method was illustrated with empirical data on anti-platelet therapies for the management of cardiovascular disease in patients at high risk of gastrointestinal (GI) bleeding. Cost-effectiveness between therapies was analyzed under both censored and non-censored circumstances, with effectiveness defined as the time to re-hospitalization due to GI complications and cost measured by total drug expenditure.

Results:

Under censored circumstances, aspirin plus proton-pump inhibitors (PPIs) was more cost-effective than clopidogrel with or without PPIs, as shown by the cost-effectiveness acceptability curve, and clopidogrel was preferred to aspirin at a willingness-to-pay of 89 NTD for delaying hospitalization due to GI complications by one day.

Conclusions:

Ignoring censoring can bias the results of CEA. This study provides an appropriate method for conducting regression-based CEA that improves the resulting estimates.

Limitations:

The normality assumption for cost and effectiveness in the bivariate normal regression needs to be examined, and the conclusions may be biased if this assumption is violated. However, when the sample size is sufficiently large, a slight deviation from normality is not a serious problem.

11.
This paper revisits the question of whether the finance–growth nexus varies with the stage of economic development. Applying the instrumental-variable threshold regression approach of Caner and Hansen (2004) to the dataset used in Levine et al. (2000), we detect overwhelming evidence of a positive linkage between financial development and economic growth, and this positive effect is larger in low-income countries than in high-income ones. The data also reveal that financial development tends to have a stronger impact on capital accumulation and productivity growth in low-income countries than in high-income ones. The findings are robust to alternative financial development measures and conditioning information sets.

12.
In this paper we propose ridge regression estimators for probit models, since the commonly applied maximum likelihood (ML) method is sensitive to multicollinearity. An extensive Monte Carlo study is conducted in which the performance of the ML method and the probit ridge regression (PRR) is investigated when the data are collinear. In the simulation study we evaluate a number of methods of estimating the ridge parameter k that have recently been developed for use in linear regression analysis. The results show that there is at least one group of estimators of k that regularly has a lower mean squared error than the ML method across all situations evaluated. Finally, we show the benefit of the new method using the classical Dehejia and Wahba dataset, which is based on a labour market experiment.
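A minimal sketch of the idea: a probit ridge estimator minimises the negative log-likelihood plus a penalty k·‖β‖². The fixed k and the near-collinear design below are illustrative assumptions; the paper's contribution is precisely about data-driven choices of k.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 500
z = rng.standard_normal(n)
X = np.column_stack([z, z + 0.05 * rng.standard_normal(n)])  # near-collinear
y = (X @ np.array([1.0, 1.0]) + rng.standard_normal(n) > 0).astype(float)

def penalised_negloglik(beta, k):
    # Probit negative log-likelihood plus the ridge penalty k * ||beta||^2
    p = norm.cdf(np.clip(X @ beta, -8.0, 8.0))
    ll = y * np.log(p + 1e-10) + (1 - y) * np.log(1 - p + 1e-10)
    return -ll.sum() + k * beta @ beta

b_ml = minimize(penalised_negloglik, np.zeros(2), args=(0.0,)).x   # plain ML
b_rr = minimize(penalised_negloglik, np.zeros(2), args=(5.0,)).x   # ridge
print(b_ml, b_rr)   # the ridge fit shrinks the unstable collinear pair
```

Because the penalised objective is the ML objective plus a norm penalty, the ridge solution never has a larger coefficient norm than the ML solution, which is the source of the MSE gains under collinearity.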

13.
Since PHILLIPS (1958) wrote his landmark article on the relationship between money wage rates and unemployment, much work has been conducted on 'Phillips curves'. While some writers, such as LIPSEY (1960), SAMUELSON and SOLOW (1960) and HANSEN (1970), remained relatively close to Phillips's original model, others strayed from it. BRECHLING (1968), DICKS-MIREAUX and DOW (1959), ECKSTEIN (1968), ECKSTEIN and WILSON (1962), HAMERMESH (1970), KUH (1967), PERRY (1966), PHELPS (1968), and TAYLOR (1970) are a few of the researchers who felt that Phillips's formulation did not adequately explain changes in money wage rates. Their approaches ranged from formulating entirely new models to merely adding explanatory variables to Phillips's model.

The present study estimates a Phillips curve for the United Kingdom within the framework of Phillips's basic hypothesis. He grouped all observations according to unemployment levels, averaged the wage changes for each group, and fitted a curve by trial and error to the compact body of data. The current approach fits a curve to all data points using a direct estimation procedure. Only two basic variables are used: money wage rates and unemployment.

REES and HAMILTON (1967) pointed out that the estimated relation between wage changes and unemployment is highly sensitive to the form of the variables and the regression employed. One can obtain widely scattered results by changing either the type of numerical analysis applied to existing data or the defined variables. It would not be surprising if the results of the present analysis differed from previous work. Not only are some of the data sources different from those previously employed, but the numerical analysis is entirely new to Phillips curve theory: spline theory is used to fit the Phillips curve.
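The spline-fitting step can be sketched with a smoothing spline. The convex unemployment/wage-change data below are synthetic, not the UK series used in the study.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(4)
u = np.sort(rng.uniform(1.0, 10.0, 120))           # unemployment rate (%)
dw = 8.0 / u - 0.9 + rng.normal(0.0, 0.3, 120)     # money wage change (%)

# Cubic smoothing spline; s is roughly n * noise variance
curve = UnivariateSpline(u, dw, k=3, s=120 * 0.09)
print(round(float(curve(2.0)), 2), round(float(curve(8.0)), 2))
```

Unlike Phillips's grouped trial-and-error fit, the spline uses every observation and lets the data choose the curvature, subject to the smoothing parameter.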

14.
This article investigates the impact of microstructure factors on asset pricing in several African stock markets. We use data on stocks listed on the Johannesburg Stock Exchange, the “Bourse Régionale des Valeurs Mobilières”, and the Nigeria Stock Exchange, and we consider international portfolio management from 2000 to 2014. Generalized least squares and fixed-effects estimation are used to highlight the effect of microstructure variables on expected returns, while panel smooth transition regression (PSTR) modelling is used to identify thresholds in this effect. The results show that liquidity and, to a lesser extent, the number of trading days are the most common significant microstructure variables across the studied markets. Other variables' effects on returns are specific to particular markets. Furthermore, the PSTR estimates reveal that the impact of these factors on asset pricing is nonlinear, with a double threshold between returns and the microstructure variables.

15.
We demonstrate that t ratios (the F statistic) for I(1) regressors in a model with an I(0) dependent variable will generally be oversized. This indicates that spurious significance occurs in a situation where it was not previously identified. We also compare the asymptotic rejection rates of t ratios for various combinations of I(1) and I(0) variables in the two-variable linear regression model. These rejection rates systematically increase with the degree of autocorrelation, yielding spurious significance, when both variables are either positively or negatively autocorrelated. In contrast, when one variable is negatively autocorrelated and the other is positively autocorrelated the rejection rates systematically fall and are undersized.
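The oversizing is easy to reproduce by simulation: regress a positively autocorrelated I(0) variable on an independent random walk and count nominal 5% rejections. The sample size, AR coefficient, and replication count below are illustrative choices, not the paper's asymptotic setup.

```python
import numpy as np

rng = np.random.default_rng(5)
T, reps, crit = 200, 2000, 1.96
rejections = 0
for _ in range(reps):
    # y: positively autocorrelated I(0) series (AR(1) with rho = 0.8)
    e = rng.standard_normal(T)
    y = np.empty(T)
    y[0] = e[0]
    for t in range(1, T):
        y[t] = 0.8 * y[t - 1] + e[t]
    x = np.cumsum(rng.standard_normal(T))      # independent I(1) regressor
    X = np.column_stack([np.ones(T), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b
    s2 = resid @ resid / (T - 2)
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    rejections += abs(b[1] / se) > crit
print(rejections / reps)    # far above the nominal 0.05 level
```

The rejection rate rises well above 5% even though y and x are generated independently, which is the spurious significance the abstract describes.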

16.
The discernment of relevant factors driving health care utilization constitutes one important research topic in health economics. This issue is frequently addressed through specification of regression models for health care use (y—often measured by number of doctor visits) including, among other covariates, a measure of self-assessed health (sah). However, the exogeneity of sah within those models has been questioned, due to the possible presence of unobservables influencing both y and sah, and because individuals' health assessments may depend on the quantity of medical care received. This article addresses the possible simultaneity of (sah, y) by adopting a full information approach, through specification of the bivariate probability function (p.f.) of these discrete variables, conditional on a set of exogenous covariates (x). The approach is implemented with copula functions, which afford separate consideration of each variable margin and their dependence structure. The specification of the joint p.f. of (sah, y) enables estimation of several quantities of potential economic interest, namely features of the conditional p.f. of y given sah and x. The adopted models are estimated through maximum likelihood, with cross-sectional data from the Portuguese National Health Survey of 1998–1999. Estimates of the margins' parameters do not vary much among different copula models, while, in accordance with theoretical expectations, the dependence parameter is estimated to be negative across the various joint models.
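The copula construction can be sketched by joining an ordinal sah-like margin and a Poisson count margin through a Gaussian copula with a negative dependence parameter. Both margins and the dependence value below are illustrative assumptions, not the paper's estimates.

```python
import numpy as np
from scipy.stats import norm, poisson

rng = np.random.default_rng(6)
rho, n = -0.5, 50_000

# Gaussian copula: correlated normals pushed through their own CDF
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = norm.cdf(z)

# Margins chosen separately: 5 ordered sah categories, Poisson visit counts
sah = np.digitize(u[:, 0], [0.1, 0.35, 0.75, 0.95])
visits = poisson.ppf(u[:, 1], mu=3.0).astype(int)

print(round(float(np.corrcoef(sah, visits)[0, 1]), 2))  # negative dependence
```

This separation of margins from the dependence structure is what lets the paper swap copula families while leaving the marginal specifications untouched.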

17.
Anderson and van Wincoop (American Economic Review (2003), 69: 106) make a convincing argument that traditional gravity equation estimates are biased by the omission of multilateral resistance terms. They show that these multilateral resistance terms are implicitly defined by a system of non-linear equations involving all regions' GDP shares and a global interdependence structure involving trade costs. We show how linearizing the system of non-linear relationships around a free trade world leads to an interdependence structure that can be used as a Bayesian prior to produce statistical estimates of the inward and outward multilateral resistance indices. This reflects a statistical approach that has advantages over the non-stochastic numerical approach used by Anderson and van Wincoop (2003) to solve for these indices.

18.
ByungWoo Kim, Applied Economics (2013), 45(11), 1347–1362
Barro and Sala-i-Martin (2004) analysed the empirical determinants of growth. They used a cross-sectional empirical framework that related growth to two kinds of factors: initial levels of steady-state variables and control variables (e.g. the investment ratio, infrastructure). Recent literature suggests that Generalized Method of Moments (GMM) estimation of dynamic panel data models produces more efficient and consistent estimates than Ordinary Least Squares (OLS) or pooled regression models. Following Cellini (1997), we also consider cointegration and error-correction methods for the growth regression. We extend the previous research on Asian countries in Kim (2009) to developed countries. Following the implications of semi-endogenous growth theory, we regressed output growth on a constant, one-year-lagged output (initial income) and the determinants of steady-state income (the investment rate, population growth, and a quadratic (or linear) function of Research and Development (R&D) intensity). The regression suggests significantly faster convergence. This contradicts Mankiw et al. (1992), who assert that the speed is lower when a broad concept of capital including human capital is considered. The coefficients for the determinants of steady-state income, especially for the quadratic function of R&D intensity, are significant and in the expected direction. Our results suggest that, by adopting appropriate growth policies, an economy can grow more rapidly through transition dynamics or changing fundamentals.

19.
We address the question of the measurement of health achievement and inequality in the context of variables exhibiting an inverted-U relation with health and well-being. The chosen approach is to measure separately achievement and inequality in the health increasing range of the variable, from a lower survival bound a to an optimum value m, and in the health decreasing range from m to an upper survival bound b. Because in the health decreasing range, the equally distributed equivalent value associated with a distribution is decreasing in progressive transfers, the paper introduces appropriate relative and absolute achievement and inequality indices to be used for variables exhibiting a negative association with well-being. We then discuss questions pertaining to consistent measurement across health attainments and shortfalls, as well as the ordering of distributions exhibiting an inverted-U relation with well-being. An illustration of the methodology is provided using a group of five Arab countries.

20.
Abstract

Objective:

This study constructed the Economic and Health Outcomes Model for type 2 diabetes mellitus (ECHO-T2DM), a long-term stochastic microsimulation model, to predict the costs and health outcomes in patients with T2DM. Naturally, the usefulness of the model depends upon its predictive accuracy. The objective of this work is to present results of a formal validation exercise of ECHO-T2DM.

Methods:

The validity of ECHO-T2DM was assessed using criteria recommended by the International Society for Pharmacoeconomics and Outcomes Research/Society for Medical Decision Making (ISPOR/SMDM). Specifically, the results of a number of clinical trials were predicted and compared with observed study end-points using a scatterplot and regression approach. An F-test of the best-fitting regression was added to assess whether it differs statistically from the identity (45°) line defining perfect predictions. In addition to testing the full model using all of the validation study data, tests were also performed of microvascular, macrovascular, and survival outcomes separately. The validation tests were also performed separately by type of data (used vs not used to construct the model, economic simulations, and treatment effects).

Results:

The intercept and slope coefficients of the best-fitting regression line between the predicted outcomes and corresponding trial end-points in the main analysis were −0.0011 and 1.067, respectively, and the R2 was 0.95. A formal F-test of no difference between the fitted line and the identity line could not be rejected (p = 0.16). The high R2 confirms that the data points are closely (and linearly) associated with the fitted regression line. Additional analyses identified that disagreement was highest for macrovascular end-points, for which the intercept and slope coefficients were 0.0095 and 1.225, respectively. The R2 was 0.95 and the estimated intercept and slope coefficients were 0.017 and 1.048, respectively, for mortality, and the F-test was narrowly rejected (p = 0.04). The sub-set of microvascular end-points showed some tendency to over-predict (the slope coefficient was 1.095), although concordance between predictions and observed values could not be rejected (p = 0.16).
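The validation regression and the F-test against the identity line can be sketched as follows; the predicted/observed pairs are synthetic, not the trial end-points used in the study.

```python
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(7)
n = 40
pred = rng.uniform(0.0, 0.5, n)                  # model-predicted event rates
obs = pred + rng.normal(0.0, 0.02, n)            # observed trial end-points

# Unrestricted fit: obs = a + b * pred
X = np.column_stack([np.ones(n), pred])
beta, rss_u = np.linalg.lstsq(X, obs, rcond=None)[:2]

# Restricted fit: the identity (45-degree) line, a = 0 and b = 1
rss_r = np.sum((obs - pred) ** 2)

# Joint F-test of the two restrictions
F = ((rss_r - rss_u[0]) / 2) / (rss_u[0] / (n - 2))
p_value = float(1 - f.cdf(F, 2, n - 2))
print(round(float(beta[1]), 3), round(p_value, 2))
```

A large p-value means the fitted line is statistically indistinguishable from perfect predictions, which is the criterion the validation applies.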

Limitations:

Important study limitations include: (1) data availability limited the validation to tests based on end-of-study outcomes rather than time-varying outcomes during the studies analyzed; (2) complex inclusion and exclusion criteria in two studies were difficult to replicate; (3) some of the studies were older and reflect outdated treatment patterns; and (4) the authors were unable to identify published data on resource use and costs of T2DM suitable for testing the validity of the economic calculations.

Conclusions:

Using conventional methods, ECHO-T2DM simulated the treatment, progression, and patient outcomes observed in important clinical trials with an accuracy consistent with other well-accepted models. Macrovascular outcomes were over-predicted, which is common in health-economic models of diabetes (and may be related to a general over-prediction of event rates in the United Kingdom Prospective Diabetes Study [UKPDS] Outcomes Model). Work is underway in ECHO-T2DM to incorporate new risk equations to improve model prediction.
