Similar Documents
20 similar documents found
1.
The focus of this paper is a subset of income trusts called business trusts, a Canadian financial innovation that has experienced remarkable success in the Canadian market, but not in the U.S. At the end of 2005, there were more than 170 business trusts (most of them in Canada, but a handful in the U.S.) with an aggregate market value of over $90 billion. Like income trusts generally, which include REITs and oil & gas trusts, business trusts are designed in large part to avoid taxation at the corporate level by distributing a substantial proportion of a business's operating cash flow. The business trust structure provides investors (called "unit holders") with what amounts to a combination of subordinated, high-yield debt and high-yielding equity. But unlike the subordinated debt in most highly leveraged transactions (HLTs), the "internal" debt in a business trust unit is effectively "stapled" to the equity part of the security. And this kind of "strip financing" (which was a common practice in U.S. LBOs during the '80s) means that, besides providing stable cash-generating companies with a tax-minimizing way of paying out excess cash, the business trust structure also limits the "financial distress costs" associated with HLTs. In the event of financial trouble, the unit holders are likely to be much more cooperative than ordinary subordinated debt holders in restructuring interest payments, since the benefits of doing so accrue to the equity portion of their units. The original income trust structure has also been used by a number of U.S.-based companies that listed their shares on the TSX. But, in the attempt to make the securities suitable for listing on the AMEX, and in response to auditor demands intended to address potential IRS concerns, the instruments were modified in ways that sacrificed one of the important benefits of the original structure.
The changes were designed to make the subordinated debt issued as part of a package with equity look more like external, third-party debt. And in so doing, the low-cost restructuring feature built into the Canadian version was lost, and the U.S. trusts failed to gain acceptance.

2.
Italy is about to enforce the first comprehensive reform of its corporate insolvency framework since the Second World War. The new Codice della Crisi d'Impresa e dell'Insolvenza builds on international recommendations, European laws, and foreign best practice. One area that has been subject to substantial influence from foreign models is preventive insolvency mechanisms, where the Italian legislature drew on the French and English experiences, as these countries have a widely recognized reputation for excellence in this field. Nevertheless, the similarities between the Italian and the English systems – particularly with reference to the Italian panel of experts in the alert procedures and the English "pre-pack pool" in pre-packaged administrations – have so far been overlooked in the academic literature. This article sheds some light on the degree of cross-fertilization between the Italian panel of experts in the procedura d'allerta and the English pre-pack pool in pre-packaged administration. The primary purpose of this study is to investigate whether regulatory reforms are needed to support the activity of the Italian panel in promoting restructuring deals for debtors in distress.

3.
For over a decade, the SEC has required corporations to disclose in their 10-K filings the nature and extent of their risk exposures using one or more of the following three methods: (1) sensitivity analysis; (2) the so-called "tabular" format; and (3) value-at-risk (VaR). After discussing the significant differences in the type and level of information revealed by each method, this article presents the findings of a study that examines how corporate choices of disclosure method vary with firm-specific and industry characteristics. While sensitivity analysis has been the "middle of the road" chosen by the majority of companies, there have also been significant minorities adopting the tabular and VaR methods. Those companies that have chosen the tabular method—in the authors' view, the most revealing of the three methods—have also had the largest interest rate and commodity exposures as well as the greatest demand for external financing. The propensity of companies to use VaR, the least revealing method, has been associated with larger size, perhaps reflecting scale economies in risk management, greater use of derivatives, and competitiveness concerns about revealing proprietary information.
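The difference among the three formats can be made concrete. As a rough sketch of the VaR approach (all figures hypothetical, not drawn from the study), a parametric value-at-risk calculation reports a single aggregate loss number rather than the instrument-level detail of the tabular format:

```python
import math

def parametric_var(value, annual_vol, confidence_z=1.645, horizon_days=10, trading_days=252):
    """One-sided parametric VaR: the loss not expected to be exceeded
    at the given confidence over the horizon (normal returns assumed)."""
    horizon_vol = annual_vol * math.sqrt(horizon_days / trading_days)
    return value * confidence_z * horizon_vol

# Hypothetical $500M commodity exposure with 30% annualized volatility
var_10d = parametric_var(500e6, 0.30, confidence_z=1.645, horizon_days=10)
print(round(var_10d / 1e6, 1))  # 10-day 95% VaR in $ millions → 49.2
```

A reader of the disclosure sees only the final number; the exposure sizes and volatilities that drive it remain proprietary, which is one reason the authors regard VaR as the least revealing of the three formats.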

4.
Shadow banking is the process by which banks raise funds from and transfer risks to entities outside the traditional commercial banking system. Many observers blamed the sudden expansion in 2007 of U.S. sub-prime mortgage market disruptions into a global financial crisis on a "liquidity run" that originated in the shadow banking system and spread to commercial banks. In response, national and international regulators have called for tighter and new regulations on shadow banking products and participants. Preferring the term "market-based finance" to the term "shadow banking," the authors explore the primary financial instruments and participants that comprise the shadow banking system. The authors review the 2007–2009 period and explain how runs on shadow banks resulted in a liquidity crisis that spilled over to commercial banks, but also emphasize that the economic purpose of shadow banking is to enable commercial banks to raise funds from and transfer risks to non-bank institutions. In that sense, the shadow banking system is a shock absorber for risks that arise within the commercial banking system and are transferred to a more diverse pool of non-bank capital instead of remaining concentrated among commercial banks. The article also reviews post-crisis regulatory initiatives aimed at shadow banking and concludes that most such regulations could result in a less stable financial system to the extent that higher regulatory costs on shadow banks like insurance companies and asset managers could discourage them from participating in shadow banking. And the net effect of this regulation, by limiting the amount of market-based capital available for non-bank risk transfer, may well be to increase the concentrations of risk in the banking and overall financial system.

5.
Not surprisingly, the recent accounting scandals look different when viewed from the perspectives of the political/regulatory process and of the market for corporate governance and financial reporting. We do not have the opportunity to observe a world in which either market or political/regulatory processes operate independently, and the events are recent and not well researched, so untangling their separate effects is somewhat conjectural. This paper offers conjectures on issues such as: What caused the scandalous behavior? Why was there such a rash of accounting scandals at one time? Who killed Arthur Andersen—the Securities and Exchange Commission, or the market? Did fraudulent accounting kill Enron, or just keep it alive for too long? What is the social cost of financial reporting fraud? Does the United States in fact operate a "principles-based" or a "rules-based" accounting system? Was there market failure? Or was there regulatory failure? Or both? Was the Sarbanes-Oxley Act a political and regulatory overreaction? Does the United States follow an ineffective regulatory model?

6.
In this account of the evolution of finance theory, the "father of modern finance" uses the series of Nobel Prizes awarded to finance scholars in the 1990s as the organizing principle for a discussion of the major developments of the past 50 years. Starting with Harry Markowitz's 1952 Journal of Finance paper on "Portfolio Selection," which provided the mean-variance framework that underlies modern portfolio theory (and for which Markowitz received the Nobel Prize in 1990), the paper moves on to consider the Capital Asset Pricing Model, efficient market theory, and the M&M irrelevance propositions. In describing these advances, Miller's major emphasis falls on the "tension" between the two main streams in finance scholarship: (1) the Business School (or "micro normative") approach, which focuses on investors' attempts to maximize returns and corporate managers' efforts to maximize shareholder value, while taking the prices of securities in the market as given; and (2) the Economics Department (or "macro normative") approach, which assumes a "world of micro optimizers" and deduces from that assumption how the market prices actually evolve. The tension between the two approaches is resolved, and the two streams converge, in the final episode of Miller's history: the breakthrough in option pricing accomplished by Fischer Black, Myron Scholes, and Robert Merton in the early 1970s (for which Merton and Scholes were awarded the Nobel Prize in 1997, "with the late Fischer Black everywhere acknowledged as the third pivotal figure").
As Miller says, the Black-Scholes option pricing model and its many successors "mean that, for the first time in its close to 50-year history, the field of finance can be built, or…rebuilt, on the basis of 'observable' magnitudes." That option values can be calculated (almost entirely) with observable variables has made possible the spectacular growth in financial engineering, a highly lucrative activity where the practice of finance has come closest to attaining the precision of a hard science. Option pricing has also helped give rise to a relatively new field called "real options" that promises to revolutionize corporate strategy and capital budgeting. But if the practical applications of option pricing are impressive, the opportunities for further extensions of the theory by the "macro normative" wing of the profession are "vast," including the prospect of viewing all securities as options. Thus, it comes as no surprise that when Miller asks in closing, "What would I specialize in if I were starting over and entering the field today?," the answer is: "At the risk of sounding like the character in 'The Graduate,' I reduce my advice to a single word: options."
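The "observable magnitudes" Miller refers to are the inputs of the Black-Scholes formula: the spot price, strike, time to maturity, riskless rate, and a volatility estimate. A minimal sketch of the model for a European call, with purely illustrative numbers:

```python
import math
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call. Every input except
    sigma is directly observable; sigma must be estimated from data."""
    N = NormalDist().cdf  # standard normal CDF
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

# Illustrative: spot 100, at-the-money strike, 1 year, 5% rate, 20% vol
print(round(black_scholes_call(100, 100, 1.0, 0.05, 0.20), 2))  # → 10.45
```

Notably absent from the inputs is any forecast of the stock's expected return, which is exactly what Miller means by the field being rebuilt on observables rather than on unobservable expectations.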

7.
In this roundtable, an adviser to several central banks and founding member of the Group of 30 discusses regulatory reform and corporate risk management strategies with senior executives from three of the world's largest insurance companies. Much of the discussion attempts to explain why insurance and reinsurance companies have proven less vulnerable to the crisis than commercial and investment banks. Part of the explanation has to do with their financial conservatism, which is attributed to a habitual tendency to decision-making that gives heavy weight to long-term probabilities and risks. But along with this "actuarial" cast of mind is a growing willingness to accept and make use of risk-based capital requirements—a decision-making framework that is, in some respects, in conflict with the accounting and regulatory capital conventions that still prevail in the industry. In particular, "Solvency II"—the risk-based capital guidelines that are set for adoption in 2012 by insurers in the European Union—is held up as a possible model for global use.

8.
The explosion of corporate risk management programs in the early 1990s was a hasty and ill‐conceived reaction by U.S. corporations to the great “derivatives disasters” of that period. Anxious to avoid the fate of Barings and Procter & Gamble, most top executives were more concerned about crisis management than risk management. Many companies quickly installed (often outrageously priced) value‐at‐risk (VaR) systems without paying much attention to how such systems fit their specific business requirements. Focused myopically on loss avoidance and technical risk measurement issues, the corporate risk management revolution of the '90s thus got underway in a disorganized, ad hoc fashion, producing a curious amalgam of policies and procedures with no clear link to the corporate mission of maximizing value. But as the risk management revolution unfolded over the last decade, the result has been the “convergence” of different risk management perspectives, processes, and products. The most visible sign of such convergence is a fairly recent development called “alternative risk transfer,” or ART. ART forms consist of the large and growing collection of new risk transfer and financing products now being offered by insurance and reinsurance companies. As just one example, a new class of security known as “contingent capital” gives a company the option over a specified period—say, the next five years—to issue new equity or debt at a pre‐negotiated price. And to hold down their cost, such “pre‐loss” financing options are typically designed to be “triggered” only when the firm is most likely to need an infusion of new capital to avoid underinvestment or financial distress. But underlying—and to a large extent driving—this convergence of insurance and capital markets is a more fundamental kind of convergence: the integration of risk management with corporate financing decisions. 
As first corporate finance theorists and now practitioners have come to realize, decisions about a company's optimal capital structure and the design of its securities cannot be made without first taking account of the firm's risks and its opportunities for managing them. Indeed, this article argues that a comprehensive, value-maximizing approach to corporate finance must begin with a risk management strategy that incorporates the full range of available risk management products, including the new risk finance products as well as established risk transfer instruments like interest rate and currency derivatives. The challenge confronting today's CFO is to maximize firm value by choosing the mixture of securities and risk management products and solutions that gives the company access to capital at the lowest possible cost.

9.
Drawing on the work of Michael Jensen and William Meckling, the co-formulators of "agency cost theory," the authors argue that there are two main challenges in designing the structure of organizations: (1) the "rights assignment" problem—that is, ensuring that decision-making authority is vested in managers and employees with the "specific knowledge" necessary to make the best decisions; and (2) the "control" or "agency" problem—designing performance-evaluation and reward systems that give decision-makers strong incentives to exercise their decision rights in ways that increase the long-run value of the organization. The authors provide a number of instructive applications and extensions of the Jensen-Meckling organizational framework. Using a series of short case studies that range from the Barings Brothers' debacle in the early 1990s and the decade-long restructuring of ITT to the cases of McDonald's and Century 21, the authors demonstrate the importance of designing performance-measurement and reward systems that are consistent with the assignment of decision rights. In so doing, the authors also work to dispel the widespread notion, popular among advocates of Total Quality Management, that the widespread use of performance measures and incentives undermines efforts to promote teamwork within large organizations. A number of brief case histories of companies like Xerox and Mary Kay Cosmetics are used to show the critical role of performance measurement and individual rewards in reinforcing a quality-centered corporate culture. As the authors conclude, "It is a mistake to think of the 'soft' and 'hard' aspects of organizations as mutually exclusive or even as competing."

10.
As bank regulatory reform tries to come to grips with the lessons of the financial crisis, several experts have proposed that some form of contingent convertible debt (CoCo) requirement be added to the prudential regulatory toolkit. In this article, the authors show how properly designed CoCos can be used not just to absorb losses, but more importantly to encourage banks to recognize losses and replace lost equity in a timely way, as well as to manage risk more effectively. Their proposed CoCos requirement strengthens management's incentives to promptly replace lost capital and enhance risk management by imposing major costs on the managers and existing shareholders of banks that fail to do so. Key elements of the proposal are that conversion of the CoCos into equity would be (1) triggered at a high trigger ratio of equity to assets (long before the bank is near an insolvency point), (2) determined by a market trigger (using a 90‐day moving average market equity ratio) rather than by supervisory discretion, and (3) significantly dilutive to shareholders. The only clear way for bank managements to avoid such dilution would be to issue equity into the market. Under most circumstances—barring an extremely rapid plunge of a bank's financial condition—management should be able and eager to replace lost capital in a timely way; as a result, dilutive conversions should almost never occur. Banks would face strong incentives to maintain high ratios of true economic capital relative to risky assets, and to manage their risks effectively. This implies that “too‐big‐to‐fail” financial institutions would not be permitted to approach the point of insolvency; they would face strong incentives to recapitalize long before that point. And if they should fail to issue new equity in a timely manner, the CoCos conversion would provide an alternative means of recapitalizing banks well before they reach the brink of insolvency. 
Thus, a CoCos requirement would go a long way to resolving the "too-big-to-fail" problem. Such a CoCos requirement would not only increase the effectiveness of regulation, but also reduce its cost. It would be less costly for banks to raise CoCos than equity, reflecting both the lower adverse-selection costs of CoCos issuance and the potential tax advantages of debt. And precisely because of the low probability of conversion, the CoCos would be issued at relatively modest (if any) discounts to otherwise comparable straight subordinated debt. Thus requiring a mix of equity and appropriately designed CoCos would be less costly to banks, and would entail less of a reduction in the supply of loans, than would a much higher book equity requirement alone.
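The proposed market trigger, a 90-day moving average of the bank's market equity ratio, can be sketched as a simple rule. The 8% trigger level below is an illustrative assumption; the authors specify only that the ratio be set high, well above the insolvency point:

```python
def coco_conversion_triggered(daily_equity_ratios, trigger=0.08, window=90):
    """Market-based CoCo trigger: conversion fires when the moving
    average of the daily market equity ratio over the trailing window
    falls below the trigger level. The 8% default is illustrative."""
    if len(daily_equity_ratios) < window:
        return False  # not enough history to compute the average
    moving_avg = sum(daily_equity_ratios[-window:]) / window
    return moving_avg < trigger

# A bank whose market equity ratio erodes from 10% to 6% over 180 days
ratios = [0.10 - 0.04 * d / 179 for d in range(180)]
print(coco_conversion_triggered(ratios))  # → True
```

Using a trailing average rather than a single day's ratio is what makes the trigger hard to game with short-lived price movements, while keeping it rule-based rather than a matter of supervisory discretion.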

11.
Trends in U.S. Financial Regulatory Reform: The "Umbrella + Twin Peaks" Model
朱摇摇 (Zhu Yaoyao) and 李辰 (Li Chen), 《海南金融》 (Hainan Finance), 2009(12): 61-64
Since 2008, the United States has successively released the Blueprint for a Modernized Financial Regulatory Structure and the financial reform "white paper," both of which approach the reform of the U.S. financial regulatory system from the perspective of the industry's long-term development and stability, and together represent the direction of U.S. regulatory reform. This paper examines a long-term reform model for the U.S. financial regulatory system. Through an analysis of the reform proposals and a comparison with traditional regulatory models, it proposes a new "umbrella + twin peaks" regulatory model. This model combines the advantages of the umbrella and twin-peaks approaches, and the authors argue that this model, born out of the crisis, is bound to become the trend in contemporary financial regulatory reform.

12.
This article shows that firms "voluntarily" increase their disclosures in response to the threat of more stringent disclosure regulations. These disclosures are mostly just sufficient to deter regulation. However, when investment risk is low, both managers and investors might strictly prefer the regulation-deterring equilibrium. We further find that in many cases, regulation can only be deterred by asymmetric disclosure behavior of the firms. This suggests that coordination problems and free-riding may be important reasons why self-regulation fails. The results also indicate the importance of considering political pressure and regulatory threats to explain observed symmetric and asymmetric voluntary disclosure behavior.

13.
This paper shows that bank runs can be modeled as an equilibrium phenomenon. We demonstrate that some aspects of the intuitive "story" that bank runs start with fears of insolvency of banks can be rigorously modeled. If individuals observe long "lines" at the bank, they correctly infer that there is a possibility that the bank is about to fail and precipitate a bank run. However, bank runs occur even when no one has any adverse information. Extra market constraints such as suspension of convertibility can prevent bank runs and result in superior allocations.

14.
Two distinguished Morgan Stanley "alumni" discuss how their management of risk and uncertainty has not only preserved but increased the profitability of their businesses. In both cases—one involving a commodities trading operation and the other a long-short hedge fund—the key has been to find cost-effective ways to "cut off the left tails" of the distribution by avoiding naked long or short positions and creating option-like payoffs with limited downside. In the case of the hedge fund, the combination of longs and shorts with the use of other risk-reducing strategies has enabled the fund's managers to produce twice the market's returns with only half the volatility (and only one losing year) during the 18-year life of the fund. In the case of the commodities trading operation, the strategy is described as combining ownership of physical assets with the use of option pricing models to create what amount to "long gamma positions in the asset" that "produce payoffs regardless of whether the asset goes up or down in value."

15.
In a conversation held in June 2016 between Nobel laureate Eugene Fama of the University of Chicago and Joel Stern, chairman and CEO of Stern Value Management, Professor Fama revisited some of the landmarks of “modern finance,” a movement that was launched in the early 1960s at Chicago and other leading business schools, and that gave rise to Efficient Markets Theory, the Modigliani‐Miller “irrelevance” propositions, and the Capital Asset Pricing Model. These concepts and models are still taught at prestigious business schools, whose graduates continue to make use of them in corporations and investment firms throughout the world. But while acknowledging the staying power of “modern finance,” Fama also notes that, even after a half‐century of research and refinements, most asset‐pricing models have failed empirically. Estimating something as apparently simple as the cost of capital remains fraught with difficulty. He dismisses betas for individual stocks as “garbage,” and even industry betas are said to be unstable, “too dynamic through time.” What's more, the wide range of estimates for the market risk premium—anywhere from 2% to 10%—casts doubt on their reliability and practical usefulness. And as if to reaffirm the fundamental insight of the M&M “irrelevance” propositions—namely, that what companies do with the right‐hand sides of their balance sheets “doesn't matter”—Fama observes that “we still have no real resolution on the key questions of debt and taxes, or dividends and taxes.” But if he has reservations about much of modern finance, Professor Fama is even more skeptical about subfields now in vogue such as behavioral finance, which he describes as “mostly just dredging for anomalies,” with no underlying theory and no testable predictions. 
Although he does not dispute that a number of well-documented traits from cognitive psychology show up in individual behavior, Fama says that behavioral economists have thus far failed to come up with a testable theory that links cognitive psychology to market prices. And he continues to defend the concept of "efficient markets" with which his name has long been closely associated, while noting that empirically based asset pricing models such as his (with Ken French) "three-factor" CAPM have produced much better results than the standard CAPM.

16.
The Financial CHOICE Act recently passed by the House proposes to create an "off-ramp" that would allow banks to escape burdensome prudential regulation if the ratio of their equity capital to their total assets is 10% or more. The Financial Economists Roundtable supports this idea as a means of reducing regulatory costs, but believes some additional safeguards are needed. A capital ratio of 10% may not be high enough to discourage banks from excessive risk taking. A solution is to have two capital requirements for banks choosing the off-ramp: one absolute (as proposed in the act) and one risk-based. The FER believes that many banks will prefer this regime to the current burdensome prudential regulation, especially if regulators simplify the setting of risk weights and make them more rule-based. Regulators setting minimum capital requirements should consider not only a bank's stand-alone risk, but also the systemic risk posed by banks, as well as the tendency of accounting measures of income and assets to overstate the economic value of banks' equity capital. The Financial CHOICE Act would also eliminate useful elements of ongoing supervision and regulation, not all of which can be addressed by higher capital alone. Furthermore, to facilitate regulatory learning about risks, off-ramped banks should continue to report the data that regulators use for stress tests, even if they are no longer subjected to the discipline of stress tests. Finally, the act is viewed as too permissive in its treatment of off-ramped banks that get into trouble. To prevent gaming of regulation, the FER recommends that off-ramped banks that subsequently fall below the minimum requirements should be required to raise new capital immediately.
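The FER's dual test, the act's absolute leverage ratio plus a risk-based ratio, amounts to a simple eligibility check. In this sketch the 12% risk-based minimum is a placeholder assumption, since the Roundtable's statement does not fix a number:

```python
def qualifies_for_off_ramp(equity, total_assets, risk_weighted_assets,
                           leverage_min=0.10, risk_based_min=0.12):
    """Dual capital test for off-ramp eligibility: the act's absolute
    10% equity-to-assets ratio plus a risk-based ratio (the 12% floor
    here is an illustrative placeholder, not a figure from the FER)."""
    leverage_ratio = equity / total_assets
    risk_based_ratio = equity / risk_weighted_assets
    return leverage_ratio >= leverage_min and risk_based_ratio >= risk_based_min

# A bank with $12B equity, $100B total assets, $80B risk-weighted assets
print(qualifies_for_off_ramp(12e9, 100e9, 80e9))  # → True (12% and 15%)
```

The point of the second ratio is that two banks with identical leverage ratios can hold very different asset risk; the risk-based test is what keeps the off-ramp from rewarding a shift into riskier assets.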

17.
One of the challenges companies claim to face in making sustainability a core part of their strategy and operations is that the market does not care about sustainability, either in general or because the time frames in which it matters are too long. The response of investors who say they care about sustainability—and their numbers are large and growing—is that companies do a poor job in providing them with the information they need to take sustainability into account in their investment decisions. Whatever the merits of each view, the fact remains that an effective conversation about sustainability requires the participation of both sides of the market. There are two main mechanisms for companies to communicate to the market as a way of starting this conversation: mandated reporting and quarterly conference calls. In this paper, the authors argue that neither companies nor investors can be seen as taking sustainability seriously unless it is integrated into the quarterly earnings call. Until that happens, the core business and sustainability are two separate worlds, each of which has its own narrator telling a different story to a different audience. The authors illustrate their argument using the case of SAP, the German software company. SAP was the first company to host an “ESG Investor Briefing,” a conference call for analysts and investors held on July 30, 2013 in which the company discussed both its sustainability performance and its contribution to the firm's financial performance. The narrative of this call was very similar to the narrative of the company's first “integrated report,” which was issued in 2012 and presented the company's sustainability initiatives in the context of its operating and financial performance. Nevertheless, the content and main focus of the “ESG Briefing” were very different from that of most quarterly earnings conferences, and so were the audiences. 
Whereas the quarterly call was attended mainly by sell side analysts—and the words "sustainability" or "sustainable" failed to receive a single mention—the ESG briefing was delivered to an investor audience made up almost entirely of the "buy side."

18.
Valuation models are useful tools, but they need to be handled with care. When taking the form of mathematical formulas, they can easily be made to convey a false sense of precision. In particular, selective choice of long-term growth rates and discount rates can be used to justify almost any desired valuation. The author shows how relatively simple valuation models can be applied by active investors in a way that honors the fundamentalist dictum of building valuations on the foundation of "what we know" and avoiding speculation about long-term growth rates. The article also emphasizes the role of accounting in discovering what we know, and shows how to use accounting results in a way that not only minimizes speculation about growth rates and discount rates, but actually challenges the speculation about those rates that is implicit in current stock prices. Accounting-based valuation models are "reverse-engineered" to discover the forecasts of future operating performance that are effectively built into current prices, so the plausibility of such forecasts can be evaluated with fundamental analysis. In this sense, valuation models are used not so much to discover the "right" price as to identify, and then subject to critical examination, the market's current expectations about future performance.
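Reverse-engineering a price into implied expectations can be illustrated with a one-stage residual income model: given the price, book value, a forecast ROE, and a required return, solve P = B + B(ROE − r)/(r − g) for the growth rate g the market is implicitly assuming (all figures hypothetical):

```python
def implied_growth(price, book_value, roe, required_return):
    """Reverse-engineer the perpetual residual-income growth rate
    implied by the current price under a one-stage residual income
    model: P = B + B*(ROE - r)/(r - g), solved for g."""
    residual_income = book_value * (roe - required_return)
    return required_return - residual_income / (price - book_value)

# Hypothetical per-share figures: price $50, book $20, ROE 15%, r = 9%
g = implied_growth(50.0, 20.0, 0.15, 0.09)
print(round(g, 4))  # → 0.05, i.e. the price embeds 5% perpetual growth
```

An investor can then ask whether 5% perpetual growth in residual income is plausible for this business; the model's job is to expose that embedded speculation, not to produce the "right" price.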

19.
The purpose of this paper is to analyze the impact of preferential regulatory treatment on banks' demand for government bonds. Using unique transaction-level data, our analysis suggests that preferential treatment in microprudential liquidity and capital regulation significantly increases banks' demand for government bonds. Liquidity and capital regulation also seem to incentivize banks to substitute other bonds with government bonds. We also find evidence that this "regulatory effect" leads banks to reduce lending to the real economy.

20.
Well‐functioning financial markets are key to efficient resource allocation in a capitalist economy. While many managers express reservations about the accuracy of stock prices, most academics and practitioners agree that markets are efficient by some reasonable operational criterion. But if standard capital markets theory provides reasonably good predictions under “normal” circumstances, researchers have also discovered a number of “anomalies”—cases where the empirical data appear sharply at odds with the theory. Most notable are the occasional bursts of extreme stock price volatility (including the recent boom‐and‐bust cycle in the NASDAQ) and the limited success of the Capital Asset Pricing Model in accounting for the actual risk‐return behavior of stocks. This article addresses the question of how the market's efficiency arises. The central message is that managers can better understand markets as a complex adaptive system. Such systems start with a “heterogeneous” group of investors, whose interaction leads to “self‐organization” into groups with different investment styles. In contrast to market efficiency, where “marginal” investors are all assumed to be rational and well‐informed, the interaction of investors with different “decision rules” in a complex adaptive system creates a market that has properties and characteristics distinct from the individuals it comprises. For example, simulations of the behavior of complex adaptive systems suggest that, in most cases, the collective market will prove to be smarter than the average investor. But, on occasion, herding behavior by investors leads to “imbalances”—and, hence, to events like the crash of '87 and the recent plunge in the NASDAQ. In addition to its grounding in more realistic assumptions about the behavior of individual investors, the new model of complex adaptive systems offers predictions that are in some respects more consistent with empirical findings. 
Most important, the new model accommodates larger-than-normal stock price volatility (in statisticians' terms, "fat-tailed" distributions of prices) far more readily than standard efficient market theory. And to the extent that it does a better job of explaining volatility, this new model of investor behavior is likely to have implications for two key areas of corporate financial practice: risk management and investor relations. But even so, the new model leaves one of the main premises of modern finance theory largely intact: that the most reliable basis for valuing a company's stock is its discounted cash flow.
