Similar Documents
20 similar documents found (search time: 15 ms)
1.
Applied researchers often test for the difference of the Sharpe ratios of two investment strategies. A very popular tool to this end is the test of Jobson and Korkie [Jobson, J.D. and Korkie, B.M. (1981). Performance hypothesis testing with the Sharpe and Treynor measures. Journal of Finance, 36:889–908], which has been corrected by Memmel [Memmel, C. (2003). Performance hypothesis testing with the Sharpe ratio. Finance Letters, 1:21–23]. Unfortunately, this test is not valid when returns have tails heavier than the normal distribution or exhibit time series dependence. Instead, we propose the use of robust inference methods. In particular, we suggest constructing a studentized time series bootstrap confidence interval for the difference of the Sharpe ratios and declaring the two ratios different if zero is not contained in the obtained interval. This approach has the advantage that one can simply resample from the observed data, as opposed to some null-restricted data. A simulation study demonstrates the improved finite sample performance compared to existing methods. In addition, two applications to real data are provided.
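The interval-based decision rule described in this abstract can be sketched as follows. This is an illustrative circular block bootstrap with a percentile interval, not the authors' studentized procedure; the function name, block length, and all simulated numbers are hypothetical.

```python
import numpy as np

def sharpe_diff_ci(ret_a, ret_b, block=5, n_boot=2000, alpha=0.05, seed=0):
    """Circular block bootstrap percentile CI for the difference of two
    Sharpe ratios, resampling the paired series jointly so that
    cross-correlation and short-range dependence are preserved."""
    rng = np.random.default_rng(seed)
    n = len(ret_a)
    sharpe = lambda r: r.mean() / r.std(ddof=1)
    n_blocks = int(np.ceil(n / block))
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        # draw random block start points and wrap around the sample end
        starts = rng.integers(0, n, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block) % n for s in starts])[:n]
        diffs[i] = sharpe(ret_a[idx]) - sharpe(ret_b[idx])
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

# Two correlated strategies with, by construction, equal true Sharpe ratios
rng = np.random.default_rng(42)
common = rng.normal(0.02, 0.05, 1000)
ret_a = common + rng.normal(0.0, 0.02, 1000)
ret_b = common + rng.normal(0.0, 0.02, 1000)
lo, hi = sharpe_diff_ci(ret_a, ret_b)
# Declare the ratios different only if zero lies outside [lo, hi]
print(f"95% bootstrap CI for the Sharpe difference: [{lo:.3f}, {hi:.3f}]")
```

Resampling blocks rather than single observations is what keeps the interval valid under serial dependence, which is exactly the failure mode of the Jobson-Korkie test noted above.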

2.
There is considerable debate concerning the dynamic relationship between stock returns and real activity. Two explanations that have received particular attention are the proxy hypothesis and the reverse causality explanation. The principal distinguishing feature between these two explanations is whether or not inflation merely proxies for some underlying relation between stock returns and real activity. The purpose of this article is to re-examine the evidence within the context of multiple time-series models. Our re-examination is motivated by some new developments in the area of causality testing employing Vector Autoregressive Moving Average (VARMA) models. These new developments have focused on the issue of causality testing when the hypotheses are not completely nested (as in VARMA models, by construction), and the potential bias that may arise from the inference procedure employed. Thus, this article carefully selects an inference procedure which permits one to infer the direction of potential bias and finds that the results of previous work are sensitive to the inference procedure employed. Controlling for this bias, we report that the evidence favors the proxy hypothesis.

3.
We study the portfolio optimization problem of maximizing the outperformance probability over a random benchmark through dynamic trading with a fixed initial capital. Under a general incomplete market framework, this stochastic control problem can be formulated as a composite pure hypothesis testing problem. We analyze the connection between this pure testing problem and its randomized counterpart, and from the latter we derive a dual representation for the maximal outperformance probability. Moreover, in a complete market setting, we provide a closed-form solution to the problem of beating a leveraged exchange traded fund. For a general benchmark under an incomplete stochastic factor model, we provide the Hamilton–Jacobi–Bellman PDE characterization for the maximal outperformance probability.

4.
As recent research highlights that the Sharpe ratio has a decision theoretic foundation even in the case of asymmetric or fat-tailed excess returns and thus is adequate even for the evaluation of hedge funds, this note provides the first Sharpe ratio based performance analysis of the hedge fund market. Furthermore, it addresses the important practical question of whether the choice of hypothesis test used to statistically compare Sharpe ratios can influence an investor’s hedge fund selection process. Our key findings are as follows: (i) Only a small fraction of hedge funds in our large dataset can significantly outperform passive investments in corresponding hedge fund indices. (ii) Especially in the presence of autocorrelated or skewed excess returns, the traditional test of Jobson and Korkie (1981), as corrected by Memmel (2003), tends to overstate the number of significant outperformers and thus provides potentially misleading information for investors. Decision makers are advised to use the bootstrap test of Ledoit and Wolf (2008), which allows robust and more reliable inference.

5.
The paper addresses two issues that arise in estimation and testing of the real effects of anticipated and unanticipated money. First, it is shown that identification of the effects of unanticipated (or unperceived) monetary growth on real output is possible only if the a priori restriction is imposed that monetary growth does not depend on unanticipated (or unperceived) output. Second, the existing empirical work of Barro and others does not allow for three known channels through which money can affect real variables. These are (1) past and present anticipations of future monetary growth (the inflation tax channel), (2) expectations of monetary growth in a given period conditioned at various preceding dates (the Fischer-Phelps-Taylor effect) and (3) past and present revisions in forecasts of future monetary growth. The presence of the first of these would mean that alternative open-loop monetary growth rules have real effects. The presence of the other two implies that monetary feedback rules can have real effects. Omission of the first channel can lead to biased estimates of the effects of past anticipated monetary growth. Potentially serious observational equivalence problems are associated with the other two.

6.
This study considers the issues of noise-to-signal estimation, finite sample performance and hypothesis testing for a new nonparametric and stochastic efficiency estimation technique. We apply the technique for analyzing the efficiency of European banks from various regions and with various specializations. The technique seems well suited for this application area because banking inputs and outputs generally are measured with error, the banking production technology is not well-defined and large banking data sets such as BankScope allow for a nonparametric approach.

7.
The quality of scenario planning activities can be difficult to assess, as one cannot know how likely any projected future scenario is. Here, we introduce one approach for gaining greater confidence. Historical analogy provides the means for achieving this, whereby the model upon which scenarios are constructed is analysed in terms of how well it predicts and establishes links with recent historical environments. We apply this approach to a previously developed scenario tree, constructed using the field anomaly relaxation method, as a case study to indicate how historical analogy can be used to assess and enhance the model from which the scenarios are constructed.

8.
While there has been considerable research on the consequences of financial crises, there has been little empirical research on the possible role of domestic political institutions in influencing a government's ability to implement crisis management policies. This paper investigates the impact of domestic institutions, characterized by a U-shaped veto player framework, on the output costs of banking crises. The analysis extends MacIntyre's (2001) qualitative study of the relationship between veto players and policy risks in the Asian financial crises. For a large sample of emerging market economies, we find support for MacIntyre's hypotheses that both too few and too many veto players are associated with greater costs of banking crises.

9.
Leaders tend to be so immersed in the specifics of strategy that they rarely stop to think how much of their reasoning is done by analogy. As a result, they miss useful insights that psychologists and other scientists have generated about analogies' pitfalls. Managers who pay attention to their own analogical thinking will make better strategic decisions and fewer mistakes. Charles Lazarus was inspired by the supermarket when he founded Toys R Us; Intel promoted its low-end chips to avoid becoming like U.S. Steel; and Circuit City created CarMax because it saw the used-car market as analogous to the consumer-electronics market. Each example displays the core elements of analogical reasoning: a novel problem or a new opportunity, a specific prior context that managers deem to be similar in its essentials, and a solution that managers can transfer from its original setting to the new one. Analogical reasoning is a powerful tool for sparking breakthrough ideas. But dangers arise when analogies are built on surface similarities (headlong diversification based on loose analogies played a role in Enron's collapse, for instance). Psychologists have discovered that it's all too easy to overlook the superficiality of analogies. The situation is further complicated by people's tendency to hang on to beliefs even after contrary evidence comes along (a phenomenon known as anchoring) and their tendency to seek only the data that confirm their beliefs (an effect known as the confirmation bias). Four straightforward steps can improve a management team's odds of using an analogy well: Recognize the analogy and identify its purpose; thoroughly understand its source; determine whether the resemblance is more than superficial; and decide whether the original strategy, properly translated, will work in the target industry.

10.
Jan H. Kwakkel, Futures, 2011, 43(9): 934–946
This paper discusses the evaluation of new infrastructure planning approaches. These new planning approaches have been put forward in response to the challenges of deep uncertainty about the future. They emphasize the need for flexibility of the system in order to enable the plan to adapt to changing conditions. However, these adaptive approaches have so far seen few real-world applications. One important reason for this lack of application is that the efficacy of these approaches has not yet been established. In turn, this is largely due to the problem that there is no agreed-upon method for proving the efficacy of a new planning approach. In this paper, we draw an analogy to medical research and development in order to outline a methodology for establishing the efficacy of new planning approaches. We discuss how the well-established methodology for evaluating new medical treatments can be adapted to evaluating new planning approaches. We illustrate the resulting evaluation methodology by outlining an evaluation strategy for a specific new planning approach. It is concluded that the well-established methodology from medicine can successfully be used to inform the evaluation of infrastructure planning approaches.

11.
Ciaran Driver, Futures, 1984, 16(5): 508–512
This article tests the Gershuny hypothesis using UK input-output data. Gershuny has suggested that in many countries the share of consumer expenditure devoted to private or marketed services has not risen over time. Rather, consumers have tended to substitute durable goods and their own labour for purchased services. The UK data do provide support for this argument.

12.
We revisit and critically reevaluate the widely accepted modernization hypothesis which claims that per capita income causes the creation and the consolidation of democracy. Existing studies find support for this hypothesis because they fail to control for the presence of omitted variables. Controlling for these factors either by including country fixed effects in a linear model or by including parameterized random effects in a nonlinear double hazard model removes the correlation between income and the likelihood of transitions to and from democratic regimes. In addition, the estimated fixed effects from the linear model are related to historical factors that affect both the level of income per capita and the likelihood of democracy in a country. This evidence is consistent with the idea that events during critical historical junctures can lead to divergent political-economic development paths, some leading to prosperity and democracy, others to relative poverty and non-democracy.
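The role of country fixed effects described in this abstract can be illustrated with a small simulation: an omitted country-level factor drives both income and democracy, producing a pooled correlation that vanishes once the within (fixed effects) transformation is applied. The data-generating process and all numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
C, T = 200, 20                               # countries, years
hist = rng.normal(size=C)                    # omitted historical factor, one per country
income = hist[:, None] + rng.normal(size=(C, T))
democracy = hist[:, None] + rng.normal(size=(C, T))  # income has no causal effect here

def pooled_beta(x, y):
    """OLS slope ignoring the panel structure."""
    xd, yd = x.ravel() - x.mean(), y.ravel() - y.mean()
    return np.dot(xd, yd) / np.dot(xd, xd)

def within_beta(x, y):
    """OLS slope after demeaning by country (the fixed effects estimator)."""
    xd = (x - x.mean(axis=1, keepdims=True)).ravel()
    yd = (y - y.mean(axis=1, keepdims=True)).ravel()
    return np.dot(xd, yd) / np.dot(xd, xd)

b_pooled = pooled_beta(income, democracy)    # spurious: picks up the omitted factor
b_within = within_beta(income, democracy)    # close to the true effect of zero
print(f"pooled: {b_pooled:.2f}, within: {b_within:.2f}")
```

The demeaning step absorbs anything constant within a country, which is exactly how the paper's fixed effects specification removes the income-democracy correlation.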

13.
The paper estimates the impact of crises on output growth, augmenting Cerra and Saxena's (2008) analysis by extending the data until 2010, and by taking into account globalization and contagion effects. The paper finds that the declines in output growth rates following currency, banking and stock market crises are much larger in the sample ending in 2010 than in the one ending in 2001. The results are robust across different specifications and crisis databases. The paper finds that globalization, estimated using a factor augmented panel, has benefitted economic growth in the long run, but those gains have been diminishing in the new millennium. Moreover, globalization also amplifies the negative effects of crises, especially for upper middle and high income OECD countries, starting with the new millennium. As such, lower output growth is to be expected as the new norm, especially in these more advanced economies, for a lot longer than what would have been expected in a usual cyclical recovery, confirming El-Erian and PIMCO's (2009) statement of a “new normal.” Last, but not least, the estimation using a factor augmented panel leads to results consistent with threshold effects of finance and growth, and of globalization on growth.

14.
In many audit tasks, auditors evaluate multiple hypotheses to diagnose the situation. Research suggests this is a complex task that individuals have difficulty performing. Further, there is little guidance in professional standards or literature dealing with the many complexities present in the audit environment. Using probability theory, this study derives the appropriate revision of likelihoods for multiple hypotheses given different realistic audit conditions. The analysis shows that the relationships among the hypotheses dramatically impact the use of audit evidence and the resulting pattern of probability revisions. We also identify testable hypotheses to guide future research and discuss practice implications regarding ways to improve the effectiveness of analytical procedures.
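A minimal sketch of the kind of likelihood revision this abstract refers to, in the simplest setting where the hypotheses are mutually exclusive and exhaustive (the paper analyzes more complex relationships among hypotheses). The hypothesis labels and all probabilities are hypothetical.

```python
def revise(priors, likelihoods):
    """Bayesian revision over mutually exclusive, exhaustive hypotheses:
    posterior(h) is proportional to prior(h) * P(evidence | h)."""
    joint = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(joint.values())
    return {h: joint[h] / total for h in joint}

# Hypothetical explanations an auditor might weigh for an unexpected fluctuation
priors = {"misstatement": 0.2, "business_change": 0.5, "random_variation": 0.3}
# Hypothetical diagnosticity of one piece of audit evidence under each hypothesis
likelihoods = {"misstatement": 0.8, "business_change": 0.3, "random_variation": 0.1}
posterior = revise(priors, likelihoods)
print({h: round(p, 3) for h, p in posterior.items()})
# {'misstatement': 0.471, 'business_change': 0.441, 'random_variation': 0.088}
```

Note how evidence that is individually diagnostic for one hypothesis necessarily redistributes probability away from the others, which is why the relationships among hypotheses matter so much for revision patterns.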

17.
Hall shows that consumption obeys an AR(1) process if the life cycle-permanent income hypothesis is true. This paper expands Hall's framework to show that expenditure on durable goods should be ARMA(1, 1) but not AR(1). Post-war U.S. data rejects the expanded model.
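The AR(1) versus ARMA(1,1) distinction can be checked through autocorrelations: a pure AR(1) satisfies rho(2) = rho(1)^2, while an ARMA(1,1) satisfies rho(2) = phi*rho(1) with rho(1) generally different from phi. A simulation sketch (the parameter values are hypothetical, not estimates from the paper):

```python
import numpy as np

def simulate_arma11(phi, theta, n, seed=1):
    """Simulate x_t = phi*x_{t-1} + e_t + theta*e_{t-1} with N(0,1) shocks."""
    rng = np.random.default_rng(seed)
    e = rng.normal(size=n + 1)
    x = np.zeros(n)
    x[0] = e[1] + theta * e[0]
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t + 1] + theta * e[t]
    return x

def acf(x, k):
    """Sample autocorrelation at lag k."""
    xc = x - x.mean()
    return np.dot(xc[:-k], xc[k:]) / np.dot(xc, xc)

x = simulate_arma11(phi=0.9, theta=-0.5, n=200_000)
r1, r2 = acf(x, 1), acf(x, 2)
# A pure AR(1) would force rho(2) = rho(1)^2; ARMA(1,1) gives rho(2) = phi*rho(1)
print(f"rho(2)/rho(1) = {r2 / r1:.2f}  vs  rho(1)^2 = {r1 ** 2:.2f}")
```

Here the lag-2/lag-1 ratio recovers phi while rho(1)^2 does not, which is the moment restriction separating the two specifications.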

18.
The present paper sheds further light on a well-known (alleged) violation of the expectations hypothesis of the term structure (EHT): the frequent finding of unit roots in interest rate spreads. We show that the EHT implies (i) that the nonstationarity stems from the holding premium, which is hence (ii) cointegrated with the spread. In a stochastic discount factor framework, we model the premium as being driven by the integrated variance of excess returns. Introducing the concept of mean-variance cointegration, we actually find cointegration relations between the conditional first and second moment of US bond data.
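The cointegration logic of point (ii) can be illustrated with a generic Engle-Granger two-step test on simulated data. This is a textbook sketch, not the paper's mean-variance cointegration procedure; the series, coefficient, and test threshold are all illustrative.

```python
import numpy as np

def df_tstat(u):
    """Dickey-Fuller t-statistic: regress delta u_t on u_{t-1} (no constant)."""
    du, lag = np.diff(u), u[:-1]
    beta = np.dot(lag, du) / np.dot(lag, lag)
    resid = du - beta * lag
    se = np.sqrt(resid.var(ddof=1) / np.dot(lag, lag))
    return beta / se

rng = np.random.default_rng(7)
n = 2000
premium = np.cumsum(rng.normal(size=n))      # nonstationary "holding premium" (random walk)
spread = 1.5 * premium + rng.normal(size=n)  # spread inherits the same stochastic trend
# Engle-Granger step 1: cointegrating regression of the spread on the premium
b = np.dot(premium, spread) / np.dot(premium, premium)
resid = spread - b * premium
# Step 2: unit root test on the residuals; a large negative statistic
# (beyond the Engle-Granger critical value) signals cointegration
print(f"DF statistic on residuals: {df_tstat(resid):.1f}")
```

Both series are individually nonstationary, yet the residual from the static regression is stationary: that combination is what cointegration means, mirroring the spread-premium relation the abstract derives.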

19.
This paper examines the statistical properties of the bilateral real exchange rates of the U.S. vs. France, Germany, and the U.K. during the Post-Bretton-Woods period, and draws implications for the Purchasing Power Parity (PPP) hypothesis. Contrary to traditional studies that consider only unit root and stationary processes to describe the real exchange rate behavior, this paper considers an in-between process, the locally persistent process. The empirical results demonstrate the following two findings: (1) Locally persistent processes describe the real exchange rate movements better than unit root and stationary processes, which implies that PPP reversion occurs and PPP holds in the long-run. (2) The confidence intervals for half-life deviations from PPP under local persistence tend to be narrower than those obtained by assuming the ADF and the local-to-unity models.
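The half-life of a PPP deviation under a simple AR(1) approximation can be computed as ln(0.5)/ln(rho). The sketch below fits rho by OLS on a simulated persistent series; the paper's locally persistent model is more general than this, and all numbers here are illustrative.

```python
import numpy as np

def ar1_coef(x):
    """OLS estimate of rho in x_t = c + rho * x_{t-1} + e_t."""
    y, lag = x[1:], x[:-1]
    lag_c = lag - lag.mean()
    return np.dot(lag_c, y - y.mean()) / np.dot(lag_c, lag_c)

def half_life(rho):
    """Periods until a deviation decays to half its size under AR(1) dynamics."""
    return np.log(0.5) / np.log(rho)

# Simulated real-exchange-rate deviation: stationary but highly persistent
rng = np.random.default_rng(3)
q = np.zeros(5000)
for t in range(1, 5000):
    q[t] = 0.97 * q[t - 1] + rng.normal()
rho_hat = ar1_coef(q)
print(f"estimated rho = {rho_hat:.3f}, half-life = {half_life(rho_hat):.1f} periods")
```

Because the half-life is a highly nonlinear function of rho near one, small sampling error in rho translates into wide half-life intervals, which is why the narrower intervals under local persistence reported in finding (2) matter.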

20.
Paul Rosenfield, Abacus, 2003, 39(2): 233–249
This article is about the commonly asserted view that present value, the discounted amount of future cash receipts and payments, is an attribute of assets and that it ideally should replace acquisition cost as the attribute of assets to present in financial statements. The view is based on faulty extrapolation from economics and the study of finance. Use of present value in connection with the preparation of financial statements does not contribute to performing their basic function, which is to report relevant real world conditions as they exist and existed and relevant financial effects of relevant real world events as they occurred. The view turns discounting into a magical process, it reverses the chronological order of cause and effect, and it unjustifiably substitutes the financial effects of supposed future events for the financial effects of historical events as the raw material of financial statements. The future does not yet exist and such financial events have not yet occurred; the present condition therefore can in no way depend on them. The discounted amount of future cash receipts and payments is not an attribute of assets and it should therefore not replace acquisition cost as the attribute of assets to present in financial statements.
