Similar Documents
20 similar documents found (search time: 15 ms)
1.
An important concern in portfolio management is the number of securities needed to create a well-diversified portfolio. The number of securities that constitutes a well-diversified portfolio, however, varies widely among studies. It is demonstrated that past conclusions are highly sensitive to the methodology used to quantify diversification. This finding motivates the development of alternative methods that reduce the effect of repeated replications on test results. The first approach exploits the power curves of statistical tests, whereas the second uses more robust statistics. Both approaches provide researchers with guidance in the design of future diversification studies.

2.
The net benefits resulting from credit policy decisions can be evaluated using either the opportunity cost approach or the net present value (NPV) approach. The two approaches are known to be equivalent in that they yield the same accept/reject decision; consequently, most textbooks cover the opportunity cost approach, which is much simpler to formulate. This paper reexamines the equivalence using NPV models formulated within a capital budgeting framework and shows that the equivalence holds only under very restrictive conditions. The discount rate for the NPV models is also examined, along with other models.
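The NPV framing of a credit policy decision can be sketched as follows. This is an illustrative sketch only, not the paper's model: the function name, the single-collection-date timing, and the 365-day year convention are assumptions.

```python
def credit_sale_npv(sales, variable_cost_ratio, bad_debt_ratio,
                    collection_days, annual_discount_rate):
    """NPV of extending trade credit: variable costs are paid today,
    while collectible revenue arrives after the collection period."""
    t = collection_days / 365.0                      # collection period in years
    collected = sales * (1.0 - bad_debt_ratio)       # expected cash inflow at t
    outlay = sales * variable_cost_ratio             # cash outflow at t = 0
    return collected / (1.0 + annual_discount_rate) ** t - outlay
```

Under this framing, the policy is accepted when the NPV is positive; the opportunity cost approach is a simpler formulation that leads to the same accept/reject decision only under the restrictive conditions the paper identifies.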

3.
This paper illustrates how a misclassification cost matrix can be incorporated into an evolutionary classification system for bankruptcy prediction. Most classification systems for predicting bankruptcy have attempted to minimize misclassifications. The minimizing misclassification approach assumes that Type I and Type II error costs for misclassifications are equal. There is evidence that these costs are not equal and incorporating costs into the classification systems can lead to better and more desirable results. In this paper, we use the principles of evolution to develop and test a genetic algorithm (GA) based approach that incorporates the asymmetric Type I and Type II error costs. Using simulated and real-life bankruptcy data, we compare the results of our proposed approach with three linear approaches: statistical linear discriminant analysis (LDA), a goal programming approach, and a GA-based classification approach that does not incorporate the asymmetric misclassification costs. Our results indicate that the proposed approach, incorporating Type I and Type II error costs, results in lower misclassification costs when compared to LDA and GA approaches that do not incorporate misclassification costs. Copyright © 2001 John Wiley & Sons, Ltd.
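The asymmetric-cost idea can be sketched as the objective a cost-sensitive classifier would minimize. This is a hypothetical illustration, not the paper's GA: the class coding and cost values are assumptions.

```python
def misclassification_cost(y_true, y_pred, type1_cost, type2_cost):
    """Total cost of classification errors for a bankruptcy model.
    Coding assumed here: 1 = bankrupt, 0 = healthy.
    Type I error: a bankrupt firm classified as healthy (usually costlier).
    Type II error: a healthy firm classified as bankrupt."""
    cost = 0.0
    for actual, predicted in zip(y_true, y_pred):
        if actual == 1 and predicted == 0:
            cost += type1_cost
        elif actual == 0 and predicted == 1:
            cost += type2_cost
    return cost
```

A GA in this spirit would use this total cost (rather than the raw error count) as the quantity its fitness function minimizes.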

4.
Modern semiconductor manufacturing is very complex and expensive. Maintaining high quality and enhancing yield are recognized as important factors in building core competences for semiconductor manufacturing companies. Data mining can find potentially useful information in huge databases. This paper proposes a data-mining framework based on decision-tree induction for improving the yield of the solder bumping process, in which the various (physical and chemical) input variables that affect the bumping process exhibit highly complex interactions. We conducted an empirical study in a semiconductor fabrication facility in Taiwan to validate this approach. The results show that the proposed approach can effectively derive the causal relationships between controllable input process factors and the target class to enhance yield. Copyright © 2007 John Wiley & Sons, Ltd.

5.
Building on the liquidity-gap management method currently used in China, and drawing on the Basel Committee's International Framework for Liquidity Risk Measurement, Standards and Monitoring (consultative document, 2009), this paper proposes an improved method for measuring liquidity risk. The improved method uses three indicators: high-quality liquid assets, the net liquidity gap, and the ratio of the net liquidity gap to total assets. It overcomes shortcomings of traditional liquidity-gap management, which does not adequately account for asset quality and asset liquidity characteristics, and thus better reflects a commercial bank's liquidity risk. The empirical analysis first compares annual-report data of three domestic banks for 2006-2009 and then compares 2009 annual-report data of ten domestic banks, further verifying the advantages of the improved method.
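The three indicators can be sketched as follows. This is an illustrative reading of the method only; the input names and the sign convention for the gap are assumptions, not the paper's exact definitions.

```python
def liquidity_indicators(hqla, expected_inflows, expected_outflows, total_assets):
    """Return the three indicators of the improved method:
    high-quality liquid assets (HQLA), the net liquidity gap,
    and the gap as a share of total assets."""
    net_liquidity_gap = hqla + expected_inflows - expected_outflows
    gap_to_assets = net_liquidity_gap / total_assets
    return hqla, net_liquidity_gap, gap_to_assets
```

Scaling the gap by total assets is what makes the third indicator comparable across banks of different sizes, which is how the paper compares the three and then ten domestic banks.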

6.
It is unnecessary to encourage operational research (OR) workers to become involved in problems of major world impact; that is why OR was born in wartime. Then it could be assumed that an entity existed (the decision maker) with sufficient problem awareness, clear objectives and adequate intelligence to decide between presented options. Not until peacetime was it recognized that today's problems affect our future rather than our present. It became necessary to invent futures studies, and from that point there can be no all-wise decision maker. Consequently there are gaps in futures methodology. Problems can be identified and their impact analysed, but there is no direct road towards implementation. There are many decision makers but no common approach or goal. Opinion polls have not been an adequate substitute for informed decision making. New methodology is needed. This article considers the difficulties and makes suggestions for changes in approach.

7.
Ratings issued by credit rating agencies (CRAs) play an important role in the global financial environment. Among other issues, past studies have explored the potential for predicting these ratings using a variety of explanatory factors and modeling approaches. This paper describes a multi-criteria classification approach that combines accounting data with a structural default prediction model in order to obtain improved predictions and test the incremental information that a structural model provides in this context. Empirical results are presented for a panel data set of European listed firms during the period 2002–2012. The analysis indicates that a distance-to-default measure obtained from a structural model adds significant information compared to popular financial ratios. Nevertheless, its power is considerably weakened when market capitalization is also considered. The robustness of the results is examined over time and under different rating category specifications.

8.
The importance of questioning the values, background assumptions, and normative orientations shaping sustainability research has been increasingly acknowledged, particularly in the context of transdisciplinary research, which aims to integrate knowledge from various scientific and societal bodies of knowledge. Nonetheless, the concept of reflexivity underlying transdisciplinary research is not sufficiently clarified and, as a result, is hardly able to support the development of the social learning and social experimentation processes needed to support sustainability transitions. In particular, the concept of reflexivity is often restricted to building social legitimacy for the results of a new kind of 'complex systems science', with little consideration of the role of non-scientific expertise and social innovators in the design of the research practice itself. The key hypothesis of the paper is that transdisciplinary research would benefit from adopting a pragmatist approach to reflexivity. Such an approach relates reflexivity to collective processes of problem framing and problem solving through joint experimentation and social learning that directly involve scientific and extra-scientific expertise. To test this hypothesis, the paper proposes a framework for analysing the different types of reflexive processes that play a role in transdisciplinary research. The main conclusion of the analysis is the need to combine conventional consensus-oriented deliberative approaches to reflexivity with more open-ended, action-oriented transformative approaches.

9.
This paper applies discriminant analysis and a decision-tree model to the default risk of unlisted small and medium-sized enterprises and compares the results of the two methods. The analysis shows that both methods discriminate the likelihood of corporate default reasonably well; the chief advantage of the decision-tree model is that, in addition to judging the default rate, it also identifies the key factors driving default. In our sample, the ratio of cash flow to total debt and the ratio of current assets to current liabilities are two very important factors for commercial banks assessing a firm's creditworthiness. If banks devote effort to investigating and verifying these ratios to ensure their accuracy, the accuracy of their default-rate judgments will improve considerably.

10.
The aim of this study was to test the assertion of New Zealand company directors that CCA information was not useful for investor decision making. Subjects made investment decisions based on their predictions concerning two similar, real (identity disguised) companies. These decisions and other evaluations were made in a post-test only, control group design experiment. CCA's relevance and reliability according to particular definitions of these characteristics was thus assessed. The results show that current cost accounts are more useful for investor decision making because they are both more relevant and perceived to be more reliable than conventional historical cost accounts.

11.
The main intention of this paper is to investigate, with new daily data, whether prices in the two Chinese stock exchanges (Shanghai and Shenzhen) follow a random-walk process as required by market efficiency. We use two different approaches, the standard variance-ratio test of Lo and MacKinlay (1988) and a model-comparison test that compares the ex post forecasts from a NAÏVE model with those obtained from several alternative models: ARIMA, GARCH and the Artificial Neural Network (ANN). To evaluate ex post forecasts, we utilize several procedures including RMSE, MAE, Theil's U, and encompassing tests. In contrast to the variance-ratio test, results from the model-comparison approach are quite decisive in rejecting the random-walk hypothesis in both Chinese stock markets. Moreover, our results provide strong support for the ANN as a potentially useful device for predicting stock prices in emerging markets.
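The statistic at the heart of the first approach can be sketched as below. This is a simplified, unadjusted version for illustration; the published Lo–MacKinlay test adds bias corrections and heteroscedasticity-robust standard errors.

```python
def variance_ratio(log_prices, q):
    """VR(q) = Var(q-period log return) / (q * Var(1-period log return)).
    Under a random walk, VR(q) should be close to 1."""
    n = len(log_prices) - 1
    r = [log_prices[i + 1] - log_prices[i] for i in range(n)]  # 1-period returns
    mu = sum(r) / n
    var1 = sum((x - mu) ** 2 for x in r) / n
    rq = [log_prices[i + q] - log_prices[i] for i in range(n - q + 1)]
    varq = sum((x - q * mu) ** 2 for x in rq) / (n - q + 1)
    return varq / (q * var1)
```

A VR(q) well below 1 suggests mean reversion, well above 1 suggests trending; either pattern is evidence against the random-walk hypothesis.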

12.
Simulation methods are extensively used in asset pricing and risk management. The most popular of these simulation approaches, Monte Carlo, requires model selection and parameter estimation; in addition, these approaches can be extremely computer intensive. Historical simulation has been proposed as a non-parametric alternative to Monte Carlo, but it is limited to the historical data available. In this paper, we propose an alternative historical simulation approach. Given a historical set of data, we define a set of standardized disturbances and generate alternative price paths by perturbing the first two moments of the original path or by reshuffling the disturbances. This approach is either totally non-parametric, when constant volatility is assumed, or semi-parametric, in the presence of GARCH(1,1) volatility. Without a loss in accuracy, it is shown to be much more efficient computationally than the Monte Carlo approach. It is also extremely simple to implement and can be an effective tool for the valuation of financial assets. We apply this approach to simulate payoff values of options on the S&P 500 stock index for the period 1982–2003. To verify that the technique works, the common back-testing approach was used. The estimated values are insignificantly different from the actual S&P 500 option payoff values for the observed period.
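The reshuffling variant can be sketched as follows. This is an illustrative reading under stated assumptions: the function names and the log-return updating rule are choices of this sketch, and the moment-perturbation variant is omitted.

```python
import math
import random

def standardize(returns):
    """Convert historical returns into zero-mean, unit-variance disturbances."""
    n = len(returns)
    mu = sum(returns) / n
    sd = math.sqrt(sum((r - mu) ** 2 for r in returns) / n)
    return mu, sd, [(r - mu) / sd for r in returns]

def reshuffled_path(s0, disturbances, mu, sigma, seed=None):
    """Generate one alternative price path by reshuffling the standardized
    disturbances and re-applying the chosen first two moments."""
    z = list(disturbances)
    random.Random(seed).shuffle(z)
    path = [s0]
    for e in z:
        path.append(path[-1] * math.exp(mu + sigma * e))
    return path
```

Note that under this multiplicative updating the terminal price depends only on the sum of the disturbances, not their order, so reshuffling varies the shape of the path (and hence path-dependent payoffs) while preserving the terminal distribution of each draw.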

13.
Insurance claims have deductibles, which must be considered when pricing insurance premiums. The deductible may cause censoring and truncation of the insurance claims, and modeling the unobserved response variable by maximum likelihood in this setting can be a challenge in practice. For this reason, a practitioner may instead run a regression on the observed response and calculate deductible rates from the regression coefficients. Natural questions are how well this approach performs and how it compares to the theoretically correct approach to rating deductibles; a practitioner would also be interested in a systematic review of approaches to modeling deductible rates. This article provides an overview of deductible ratemaking and compares the pros and cons of two approaches: the regression approach and the maximum likelihood approach. The regression approach turns out to have an advantage in predicting aggregate claims, whereas the maximum likelihood approach has an advantage when calculating theoretically correct relativities for deductible levels beyond those observed in empirical data. For demonstration, loss models are fit to the Wisconsin Local Government Property Insurance Fund data, and examples are provided for the ratemaking of per-loss deductibles offered by the fund. The article shows that the regression approach is in fact a single-parameter approximation to the true relativity curve. A comparison of selected models from the generalized beta family finds that long-tail severity distributions may improve the deductible rating, while advanced frequency models such as 01-inflated models may have limited advantages due to estimation issues under censoring and truncation. In addition, models for specific peril types are combined to improve the ratemaking.

14.
We apply a bivariate approach to the asset allocation problem for investors seeking to minimize the probability of large losses. It involves modelling the tails of joint distributions using techniques motivated by extreme value theory, and we compare the results with a corresponding univariate approach. Through an examination of simulated and real financial data sets we show that the estimated risks from the bivariate and univariate approaches are in close agreement for a wide range of losses and allocations. This is important since the bivariate approach is significantly more computationally expensive. We therefore suggest that the univariate approach be used for the typical level of loss that an investor may want to guard against; it is effective even if there are more than two assets. The software written in support of this work is available on demand, and we describe its use in the appendix.

15.
Following a brief review of the main experimental work on the economics of risk and uncertainty, both static and dynamic, this paper reports the results of an experiment testing one of the key assumptions of the theory of dynamic economic behaviour—that people have a plan and implement it. Using a unique design which enables the plan (if one exists) to be revealed by the first move, the experiment was implemented via the Internet on a subset of the University of Tilburg's ongoing family expenditure survey panel. The advantages of using such a set of subjects are twofold: the demographic characteristics of the set are known, so demographic inferences can be made; and the representativeness of the set is known, so inferences about populations can be made. The results suggest that at least 36% of the subjects behaved inconsistently with the hypothesis under test: that people formulate plans and then implement them. Interestingly, demographic variables are unable to explain the consistency or inconsistency of individuals. One conclusion is that subjects simply make errors. An alternative conclusion, consistent with previous experimental research, is that people are unable to predict their own future decisions. The implications for dynamic theory (particularly relating to savings and pensions decisions) are important.

16.
We develop a joint maximum likelihood estimator for the interest rate risk premium and the parameters of the Cox, Ingersoll, and Ross bond pricing model. This new approach resolves difficulties inherent in previous approaches that use only time-series data or cross-sectional data. We apply the new approach to a large sample of joint time-series/cross-sectional data. The resulting estimated parameters explain simultaneously the changes in short-term interest rates and the prices of zero-coupon bonds with various maturities. We also identify and provide solutions to potential computational difficulties that other researchers are likely to face.

17.
The Information Technology (IT) for realizing Organizational Decision Support Systems (ODSS) is in a nascent stage of development. This is particularly true in the area of coordination, which is a critical element of ODSS and which distinguishes ODSS research from earlier research in Group DSS and individually oriented DSS. As a first step in ODSS coordination research, alternative representation schemes need to be examined in terms of their match both with the prevailing needs of organizations and with existing IT approaches that can be brought to bear. Matching ODSS needs with coordination representation requirements is examined using several supporting reference disciplines, including foundational DSS and recent ODSS research frameworks/architectures. Existing IT approaches are adapted from the reference disciplines of Active DSS, Distributed Artificial Intelligence (DAI), and Mathematical/Computational Organization Theory (MCOT) to operationalize a computational model of coordination that: (1) embodies the philosophies of Active DSS—including the idea that automated intelligent agents can play a significant role in supporting decision makers by independently carrying out rudimentary tasks to support the various phases of a decision making process; (2) adapts DAI and IT approaches to reflect practical human organizational realities, including what we refer to as the 'Open-Ended Knowledge World' and the evolutionary nature of organizations—whereby ODSS coordination representations will be subjected to almost constant revision due to both external environment disruptions and internal events that require adjustments to a preliminary plan; and (3) reflects the fact that organizational goals are often vague, which implies that a coordination representation should be sufficiently robust to reflect ad hoc analysis accommodating strategy changes.

18.
Most of the parameters used to describe the credit rating are in linguistic terms, which are vague and difficult to put into precise numerical values. Fuzzy set theory, which was developed to handle this kind of vagueness, is used to represent and to aggregate the various linguistic data usually used in commercial banks. To illustrate the approach, numerical examples are solved and compared with existing approaches.
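The idea can be sketched with triangular fuzzy numbers. The term scale, the weighted-average aggregation, and centroid defuzzification used below are common textbook choices assumed for illustration, not necessarily the paper's exact operators.

```python
# Linguistic terms mapped to triangular fuzzy numbers (a, b, c) on [0, 1].
TERMS = {
    "poor":      (0.00, 0.00, 0.25),
    "fair":      (0.25, 0.50, 0.75),
    "good":      (0.50, 0.75, 1.00),
    "excellent": (0.75, 1.00, 1.00),
}

def aggregate_rating(assessments):
    """Weighted average of linguistic assessments, then centroid defuzzification.
    `assessments` is a list of (term, weight) pairs; weights should sum to 1."""
    a = sum(TERMS[t][0] * w for t, w in assessments)
    b = sum(TERMS[t][1] * w for t, w in assessments)
    c = sum(TERMS[t][2] * w for t, w in assessments)
    return (a + b + c) / 3.0  # crisp credit score in [0, 1]
```

The point of the fuzzy representation is that loan officers can keep answering in words ("fair", "good") while the bank still obtains a single comparable score per applicant.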

19.
The study by Anderson and Zimmer [1992] of goodwill accounting policies uses a pooled time series experimental design. This approach can add substantially to our understanding of accounting policy choices, but not in the manner used by Anderson and Zimmer. Where accounting policy choices are believed to be independent from one period to the next, then a time series approach can greatly enhance our ability to capture the influence on such policy choices of changing circumstances, more so than a simple cross-sectional test. Conversely, if accounting policy choices are not independent between periods, pooling over time can overstate significance levels of statistical tests. The nature of Anderson and Zimmer's data makes the impact indeterminate. However, even under an extreme assumption, pre-regulation evidence remains significant at conventional levels.

20.
Existing approaches to the problem of designing management control systems may be described as primarily structural or behavioral. The first approach is characteristic of the accounting literature. It takes a rational and mechanistic view of control and treats the control system design problem as one of designing an effective information structure. The behavioral approach, exemplified by the socio-psychological literature on performance, views control as a problem of designing social relationships which lead to high performance.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号