Using a small empirical model of inflation, output, and money estimated on U.S. data, we compare the relative performance of monetary targeting and inflation targeting. The results show monetary targeting to be quite inefficient, yielding higher variability of both inflation and output. This is true even with a nonstochastic money demand formulation. Our results are also robust to using a P∗ model of inflation. Therefore, in these popular frameworks, there is no support for the prominent role given to money growth in the Eurosystem's monetary policy strategy.
Is new market creation a search and selection process within the theoretical space of all possible markets? Or is it the outcome of a process of transformation of extant realities into new possibilities? In this article we consider new market creation as a process involving a new network of stakeholders. The network is initiated through an effectual commitment that sets in motion two concurrent cycles of expanding resources and converging constraints that result in the new market. The dynamic model was induced from two empirical investigations: a cognitive science-based investigation of entrepreneurial expertise, and a real-time history of the RFID industry.
JEL Classification:
M13, M31, D4, D52, D71, D72, L1, L2, P42
We would like to thank the Batten Institute at the Darden Graduate School of Business Administration, University of Virginia,
for supporting this research. We would also like to thank the following for their specific contributions to our thesis: Anil Menon
for his relentless insistence on more precise formulations of effectual reasoning; Jim March for his conversation and for
inspiring us to dig into Type I and Type II errors; Rob Wiltbank for firming up the section on opportunity costs; and Stuart
Read for helping us clarify our writing.
Correspondence to: S.D. Sarasvathy
Background: Validation of overall survival (OS) extrapolations of immune-checkpoint inhibitors (ICIs) during the National Institute for Health and Care Excellence (NICE) Single Technology Assessment (STA) process is limited due to data still maturing at the time of submission. Inaccurate extrapolation may lead to inappropriate decision-making. The availability of more mature trial data facilitates a retrospective analysis of the plausibility and validity of initial extrapolations. This study compares these extrapolations to subsequently available longer-term data.
Methods: A systematic search of completed NICE appraisals of ICIs from March 2000 to December 2017 was performed. A targeted search was also undertaken to procure published OS data from the pivotal clinical trials for each identified STA made available post-submission to NICE. Initial Kaplan-Meier curves and associated extrapolations from NICE documentation were extracted to compare the accuracy of OS projections versus the most mature data.
Results: The review identified 11 STAs, of which 10 provided OS data upon submission to NICE. The extrapolations employed parametric or piecewise survival models. Additional data cut-offs provided a mean of 18 months of OS data beyond the end of the original follow-up. Initial extrapolations typically underestimated OS from the most mature data cut-off by 0.4–2.7%, depending on the choice of assessment method and the use of the manufacturer- or ERG-preferred extrapolation.
Conclusion: Long-term extrapolation of OS is required for NICE STAs based on initial immature OS data. The results of this study demonstrate that the initial OS extrapolations employed by manufacturers and ERGs generally predicted OS reasonably well when compared to more mature data (when available), although on average they appeared to underestimate OS. This review and validation shows that, while the choice of OS extrapolation is uncertain, the methods adopted are generally aligned with later-published follow-up data and appear appropriate for informing HTA decisions.
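As a minimal sketch of the kind of validation exercise this study describes, the following fits a simple exponential survival model to a hypothetical early, right-censored data cut and compares its extrapolated survival with a later "mature" estimate. The simulated data, the 18-month cut-off, and the exponential form are all illustrative assumptions, not the models used in the actual STAs (which considered a range of parametric and piecewise specifications).

```python
import numpy as np

def exp_hazard(times, events):
    """Maximum-likelihood hazard for an exponential model under
    right-censoring: total events divided by total follow-up time."""
    return np.sum(events) / np.sum(times)

rng = np.random.default_rng(0)
true_lam = 0.03                          # assumed "true" monthly hazard
t_full = rng.exponential(1 / true_lam, size=200)

cutoff = 18.0                            # early data cut (months)
times = np.minimum(t_full, cutoff)       # administrative censoring
events = (t_full <= cutoff).astype(int)  # 1 = death observed before cut-off

lam_hat = exp_hazard(times, events)
t_late = 36.0                            # later data cut, 18 months further on
extrapolated = float(np.exp(-lam_hat * t_late))  # model-based S(36)
mature = float(np.mean(t_full > t_late))         # "mature" empirical S(36)

print(f"extrapolated S(36) = {extrapolated:.3f}, mature = {mature:.3f}")
```

With mature follow-up in hand, the comparison is simply model-predicted survival versus the observed proportion still alive at the later time point, mirroring the review's accuracy assessment.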
ABSTRACT The paper makes three contributions to the understanding of post-crisis European banking governance. First, it offers a more comprehensive approach to banking governance, beyond the Banking Union, through its concept of 'New European Banking Governance' (NEBG), which incorporates EU state aid rules and fiscal regulations. Second, it considers the impact of NEBG on democratic institutions and processes in EU member states, an under-researched topic in the literature on European banking governance. Finally, through its in-depth case study of Slovenia it considers the NEBG in relation to peripheral Eurozone states. It argues that the post-crisis banking governance framework of the EU not only severely constrained the Slovenian state in its policy choices but rearranged its policy-making institutions in a way that restricted and continues to restrict democratic banking policy formation.
This article investigates the role of taxation when public goods are privately provided. Externalities between consumers via the public good are shown to cause kinks in social indifference curves. As a result, a government restricted to income taxation should engineer enough inequality to ensure there are some non-contributors to the public good. Whether commodity taxation changes this conclusion depends on the extent to which consumers "see through" the government budget constraint. If they can, inequality should still be sought. When they cannot, in contrast to the case of an economy with only private goods, commodity taxation can be used in conjunction with income transfers to achieve the first-best.
Historically, product management has focused primarily on research and development or the introductory stage of the product life cycle. The authors present an empirical study delineating the variables to be considered in the product elimination process. More specifically, the elimination process is evaluated under a situation of poor product performance despite a generally viable market. The basic objectives of the study were: to determine the significant variables in the product elimination process of the Small Appliance Industry; to determine the relative importance of those variables; and to examine the interaction among them. The data, obtained through structured personal questionnaire interviews, were analyzed to provide a ranking of twenty-six variables relevant to the elimination process. Moreover, Johnson's hierarchical clustering scheme was applied to determine the interaction among variables. The results indicate that profitability and financial variables are most significant in the elimination decision process. Second, the primary clusters of importance concern market share, market growth rate, consumer awareness, and competitive action.
Due to the complexity of present-day supply chains, it is important to select the simplest supply chain scheduling decision support system (DSS) that will determine and place orders satisfactorily. We propose a generic design framework, termed the explicit filter methodology, to achieve this objective. In doing so we compare the explicit filter approach with the implicit filter approach utilised in previous OR research, the latter focusing on minimising a cost function. Although the eventual results may well be similar with both approaches, the explicit filter approach makes it much clearer to the designer both why and how an ordering system will reduce the bullwhip effect. The explicit filter approach produces a range of DSS designs corresponding to best practice. These may be "mixed and matched" to generate a number of competitive delivery pipelines to suit the specific business scenario.
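As a rough illustration of the filtering idea in this abstract (not the authors' actual methodology), the sketch below simulates a single-echelon replenishment loop and compares a classic order-up-to rule, which closes the whole inventory gap each period, with a "filtered" rule that closes only a fraction of the gap. The demand process, smoothing constant, and gap fraction are all illustrative assumptions.

```python
import numpy as np

def simulate(Ti, demand, alpha=0.3, target=400):
    """Single-echelon replenishment loop. `ip` is the inventory position;
    each period the order covers the demand forecast plus a fraction 1/Ti
    of the gap between the target and the current position."""
    ip = float(target)
    forecast = float(np.mean(demand[:10]))   # crude initial forecast
    orders = []
    for d in demand:
        ip -= d                              # demand depletes the position
        forecast = alpha * d + (1 - alpha) * forecast
        o = max(0.0, forecast + (target - ip) / Ti)
        ip += o
        orders.append(o)
    return np.array(orders)

rng = np.random.default_rng(42)
demand = rng.normal(100, 10, size=20000)

def bullwhip(orders):
    """Variance amplification of orders relative to demand."""
    return np.var(orders) / np.var(demand)

order_up_to = simulate(Ti=1, demand=demand)   # full gap correction each period
filtered = simulate(Ti=4, demand=demand)      # proportional "filtered" correction
print(f"bullwhip, order-up-to: {bullwhip(order_up_to):.2f}")
print(f"bullwhip, filtered:    {bullwhip(filtered):.2f}")
```

The proportional rule acts as a low-pass filter on the demand signal: the order-up-to policy amplifies demand variance (bullwhip ratio above one), while the filtered rule damps it, which is the behaviour an explicit filter design makes visible to the designer up front.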
Abstract. Historians interested in 20th century American reform often seek to analyze the ideologies of political leaders separately from the institutions that these same leaders created. Such emphasis on ideas, as opposed to actions, has, for example, led "revisionist" American historians to argue that the presidencies of Herbert Hoover and Franklin D. Roosevelt were "conceptually continuous." Our examination of the major social welfare programs undertaken by the federal government in the 1920s disputes this claim. Examination of the operations of the federal bureaucracy, rather than the rhetoric of politicians, demonstrates the existence of decided policy differences between the Hoover and Roosevelt eras. "Efficiency" analogues dominant during the Hoover era were replaced with "direct service-provider" approaches that created a clear distinction between private and public welfare programs. Elements of "continuity" between the two eras have been overdrawn. Background is provided for increased understanding of some of the policy implications of America's contemporary welfare debate, particularly about "rehabilitation" strategies and/or rationales for action in the social welfare field.
Summary. In the study of information-theoretic measures, additivity has been the basic requirement; however, it is also interesting to investigate sub-additive measures. Starting from sub-additivity for measures associated with a pair of distributions of a discrete random variable, the inequality is converted into an equality relation using another function of a pair of distributions. Under the sum property of the function and the measures, the relation is expressed as a functional equation whose most general complex solutions have been obtained. In terms of the real continuous solutions of the functional equation, sub-additive measures of relative information and inaccuracy are defined and characterized. Particular cases and simple properties, particularly the convexity of some of these new measures, are also studied.
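For reference, the notions of additivity and sub-additivity invoked here can be written as follows (the notation is assumed for illustration, not taken from the paper). For a measure $M$ of a pair of distributions and independent product distributions $P_1 \times P_2$ and $Q_1 \times Q_2$:

```latex
% Additivity (the usual requirement):
M(P_1 \times P_2,\; Q_1 \times Q_2) = M(P_1, Q_1) + M(P_2, Q_2)

% Sub-additivity (the weaker property studied here):
M(P_1 \times P_2,\; Q_1 \times Q_2) \le M(P_1, Q_1) + M(P_2, Q_2)
```

The standard relative information (Kullback–Leibler divergence), $D(P \| Q) = \sum_i p_i \log \frac{p_i}{q_i}$, satisfies the additive equality over independent products; the sub-additive measures characterized in the paper relax this to the inequality.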