1.
This article studies the consequences of fixed commissions and low entry barriers in Greater Boston's real estate brokerage industry from 1998 to 2007. We find that agent entry reduces average service quality, and we use a dynamic empirical model to study the inefficiency of the current market structure. To accommodate a large state space, we approximate the value function using sieves and impose the Bellman equation as an equilibrium constraint. Our results suggest that a 50% cut in commissions would result in 40% fewer agents, social savings amounting to 23% of industry revenue, and 73% more transactions for the average agent.
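The idea of approximating the value function with sieves while imposing the Bellman equation as a constraint can be illustrated with a toy collocation exercise. The sketch below is a simplified, hypothetical single-agent version: the payoff, transition, basis size, and discount factor are all assumptions for illustration, not the paper's dynamic model of brokerage entry.

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative sketch only: a stylized single-agent dynamic problem, not the
# paper's model of brokerage competition.
beta = 0.95                         # discount factor (assumed)
states = np.linspace(0.0, 1.0, 25)  # grid over a scalar state (assumed)
actions = np.linspace(0.0, 1.0, 11)

def payoff(s, a):
    # Hypothetical per-period profit: rising in the state, with an effort cost.
    return s + 0.5 * a - 0.5 * a ** 2

def next_state(s, a):
    # Hypothetical deterministic transition.
    return np.clip(0.8 * s + 0.2 * a, 0.0, 1.0)

def basis(s, K=6):
    # Sieve: polynomial basis functions, so V(s) ~ basis(s) @ theta.
    return np.vander(np.atleast_1d(s), K, increasing=True)

def bellman_residual(theta):
    # Residual of the Bellman equation at each grid point; driving it to zero
    # is the "Bellman equation as a constraint" idea in collocation form.
    V = basis(states) @ theta
    continuation = np.array([[payoff(s, a) + beta * (basis(next_state(s, a)) @ theta)[0]
                              for a in actions] for s in states])
    return continuation.max(axis=1) - V

theta0 = np.zeros(6)
sol = least_squares(bellman_residual, theta0)
print("sieve coefficients:", np.round(sol.x, 3))
```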
2.
Costly punishment can facilitate cooperation in public-goods games: human subjects will incur costs to punish non-cooperators even in settings where they are unlikely to face the same opponents again. Understanding when and why costly punishment occurs is important both for the design of economic institutions and for modeling the evolution of cooperation. Our experiment shows that subjects engage in costly punishment even when it will not be observed until the end of the session, which supports the view that agents enjoy punishing. Moreover, players continue to cooperate when punishment is unobserved, perhaps because they (correctly) anticipate that shirkers will be punished: fear of punishment can be as effective at promoting contributions as punishment itself.
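For readers unfamiliar with the setup, the sketch below shows the payoff structure of a standard public-goods game with a costly punishment stage. The parameter values (endowment, marginal per-capita return, 1:3 punishment technology) are common in the experimental literature but are assumptions here, not the paper's actual design.

```python
# Payoff structure of a public-goods game with costly punishment (illustrative).
import numpy as np

ENDOWMENT = 20          # tokens per player per round (assumed)
MPCR = 0.4              # marginal per-capita return from the public good (assumed)
COST_PER_POINT = 1      # cost to the punisher per punishment point (assumed)
FINE_PER_POINT = 3      # payoff reduction per point for the punished player (assumed)

def payoffs(contributions, punishment):
    """contributions: length-n array; punishment[i, j]: points i assigns to j."""
    contributions = np.asarray(contributions, dtype=float)
    punishment = np.asarray(punishment, dtype=float)
    stage1 = ENDOWMENT - contributions + MPCR * contributions.sum()
    cost_of_punishing = COST_PER_POINT * punishment.sum(axis=1)
    fines_received = FINE_PER_POINT * punishment.sum(axis=0)
    return stage1 - cost_of_punishing - fines_received

# Example: player 3 free-rides and is punished by player 0.
contrib = [20, 20, 20, 0]
pun = np.zeros((4, 4))
pun[0, 3] = 5
print(payoffs(contrib, pun))   # the punisher and the punished both lose payoff
```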
3.
This paper illustrates how a misclassification cost matrix can be incorporated into an evolutionary classification system for bankruptcy prediction. Most classification systems for predicting bankruptcy have attempted simply to minimize misclassifications, which implicitly assumes that the costs of Type I and Type II errors are equal. There is evidence that these costs are not equal and that incorporating them into the classification system can lead to better, more desirable results. We use the principles of evolution to develop and test a genetic algorithm (GA) based approach that incorporates asymmetric Type I and Type II error costs. Using simulated and real-life bankruptcy data, we compare the results of the proposed approach with three linear approaches: statistical linear discriminant analysis (LDA), a goal programming approach, and a GA-based classification approach that does not incorporate the asymmetric misclassification costs. Our results indicate that the proposed cost-sensitive approach yields lower misclassification costs than the LDA and GA approaches that ignore these costs.
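A minimal sketch of the general idea follows: a GA evolves the weights of a linear classifier, with fitness defined by an asymmetric misclassification cost matrix rather than the raw error count. The GA operators, cost values, and synthetic data are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: label 1 = bankrupt, 0 = healthy (assumed for illustration).
X = np.vstack([rng.normal(-1, 1, (200, 3)), rng.normal(1, 1, (50, 3))])
y = np.r_[np.zeros(200), np.ones(50)]

COST_TYPE_I = 10.0   # cost of classifying a bankrupt firm as healthy (assumed)
COST_TYPE_II = 1.0   # cost of classifying a healthy firm as bankrupt (assumed)

def total_cost(w):
    # Fitness to minimize: asymmetric misclassification cost of a linear rule.
    pred = (X @ w[:-1] + w[-1] > 0).astype(int)
    type_i = np.sum((y == 1) & (pred == 0))
    type_ii = np.sum((y == 0) & (pred == 1))
    return COST_TYPE_I * type_i + COST_TYPE_II * type_ii

pop = rng.normal(size=(60, 4))                    # population of [weights, intercept]
for generation in range(100):
    costs = np.array([total_cost(w) for w in pop])
    parents = pop[np.argsort(costs)[:20]]         # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(20, size=2)]
        child = np.where(rng.random(4) < 0.5, a, b)   # uniform crossover
        child += rng.normal(0, 0.1, 4)                # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmin([total_cost(w) for w in pop])]
print("best total misclassification cost:", total_cost(best))
```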
4.
Most Web search engines determine the relevancy of Web pages based on query terms, and a few content-filtering applications allow consumers to block objectionable material. However, few Web search engines or content-filtering applications learn user preferences over time. In this study, we propose two machine-learning approaches that can be used to learn consumer preferences and identify the documents most relevant to the consumer. We test the proposed approaches on several simulated data sets. The results illustrate that data-mining approaches can be used to design intelligent adaptive agents that select relevant Web pages for the user, given the query terms.
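The abstract does not specify the two learning approaches, so the sketch below only illustrates the general pattern of an adaptive agent: an online perceptron over bag-of-words features that updates from user relevance feedback. The vocabulary, feature scheme, and feedback examples are assumptions.

```python
import numpy as np

vocab = ["python", "tutorial", "casino", "finance", "news"]   # assumed vocabulary

def featurize(doc_terms):
    # Binary bag-of-words vector over the assumed vocabulary.
    return np.array([1.0 if term in doc_terms else 0.0 for term in vocab])

weights = np.zeros(len(vocab))

def update(doc_terms, relevant, lr=1.0):
    """One online perceptron step from a single piece of user feedback."""
    global weights
    x = featurize(doc_terms)
    predicted = 1 if weights @ x > 0 else 0
    weights += lr * ((1 if relevant else 0) - predicted) * x

def score(doc_terms):
    """Higher score = predicted more relevant for this user."""
    return float(weights @ featurize(doc_terms))

# Simulated feedback: the user likes programming pages, dislikes casino spam.
update({"python", "tutorial"}, relevant=True)
update({"casino", "news"}, relevant=False)
update({"python", "finance"}, relevant=True)

print(sorted(["python tutorial", "casino news"], key=lambda d: -score(set(d.split()))))
```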
5.
In recent years there has been growing awareness of trans-boundary pollution that places environmental assets at risk both globally and regionally. Globally, man-made pollutants have degraded the stratospheric ozone shield, the oceans, the atmosphere, and the biodiversity of the planet. Regionally, these pollutants have harmed aquifers, rivers, lakes, soils, and forests. The harmful effects of acid rain, greenhouse gases, and a thinning ozone shield are not confined within a country's political boundaries and thus jeopardize the well-being of people in other countries. These trans-boundary pollution problems, termed Transnational Public Goods (TPGs), often share two common features: strategic interactions among nations and public-good properties. This paper applies the theory of voluntary provision of TPGs to the behavior of nations in curbing chlorofluorocarbon (CFC) emissions, behavior that in large part preceded the ratification and institution of the Montreal Protocol on Substances that Deplete the Ozone Layer.
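The voluntary-provision logic can be sketched as each country choosing its own abatement (its contribution to the transnational public good) as a best response to the abatement of others. The payoff form and parameter values below are illustrative assumptions, not the paper's empirical specification; the sketch simply reproduces the classic free-riding outcome of such models.

```python
import numpy as np

benefit = np.array([4.0, 2.0, 1.0])   # countries' marginal valuation parameters (assumed)
cost = np.array([1.0, 1.0, 1.0])      # constant marginal abatement costs (assumed)

# Payoff_i = benefit_i * log(total abatement) - cost_i * own abatement.
# First-order condition: benefit_i / total = cost_i for a contributing country.
def best_response(i, others_total):
    return max(benefit[i] / cost[i] - others_total, 0.0)

a = np.ones(3)                         # initial abatement guess
for _ in range(200):                   # iterate best responses to a fixed point
    for i in range(3):
        a[i] = best_response(i, a.sum() - a[i])

# In equilibrium only the highest-valuation country abates; the rest free-ride.
print("Nash abatement:", np.round(a, 3), "total:", round(a.sum(), 3))
```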
6.
This paper describes the market for borrowing corporate bonds using a comprehensive data set from a major lender. The cost of borrowing corporate bonds is comparable to the cost of borrowing stock, between 10 and 20 basis points, and both have fallen over time. Factors that influence borrowing costs include loan size, percentage of inventory lent, rating, and borrower identity. There is no evidence that bond short sellers have private information. Bonds with credit default swap (CDS) contracts are more actively lent than those without. Finally, the 2007 credit crunch does not affect average borrowing costs or loan volume, but it does increase the variance of borrowing costs.
7.
Three probabilistic neural network approaches are compared for credit screening and bankruptcy prediction: a logistic regression neural network (LRNN), a probabilistic neural network (PNN), and a semi-supervised expectation-maximization-based neural network. Using real-world bankruptcy prediction and credit screening datasets, we compare the three approaches on several performance criteria: sensitivity, specificity, accuracy, decile lift, and area under the receiver operating characteristic (ROC) curve. Our experiments indicate that the PNN outperforms the other two techniques on the decile lift and specificity metrics. Using the area under the ROC curve, we find that for the bankruptcy prediction data the PNN outperforms the other two approaches when false positive rates (FPRs) are below 40%, whereas the LRNN outperforms the other two techniques for FPRs above 40%. We also observe that the LRNN results are very sensitive to the ratio of examples from the two classes in the training data and that the LRNN tends to overfit the training data.
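A PNN is essentially a Parzen-window classifier: one Gaussian kernel per training example, summed within each class, with prediction going to the class of highest estimated density. The sketch below shows that core idea; the smoothing parameter and toy data are assumptions, not the paper's implementation or datasets.

```python
import numpy as np

class PNN:
    def __init__(self, sigma=0.5):
        self.sigma = sigma      # kernel bandwidth (assumed)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.X_by_class_ = {c: X[y == c] for c in self.classes_}
        return self

    def _class_scores(self, x):
        # Average Gaussian kernel response of x against each class's examples.
        scores = []
        for c in self.classes_:
            d2 = np.sum((self.X_by_class_[c] - x) ** 2, axis=1)
            scores.append(np.mean(np.exp(-d2 / (2 * self.sigma ** 2))))
        return np.array(scores)

    def predict(self, X):
        return np.array([self.classes_[np.argmax(self._class_scores(x))] for x in X])

# Toy example: two Gaussian clusters standing in for healthy vs. bankrupt firms.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.5, (30, 2)), rng.normal(1, 0.5, (30, 2))])
y = np.r_[np.zeros(30), np.ones(30)]
model = PNN(sigma=0.5).fit(X, y)
print(model.predict(np.array([[-1.0, -1.0], [1.0, 1.0]])))
```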