Similar Documents
20 similar documents found (search time: 325 ms)
1.
In their recent paper, Tang and Tang (2003, pp. 69–78) revive a longstanding controversy, net present value (NPV) versus internal rate of return (IRR), by characterizing the NPV as an economic indicator and the IRR as a financial one. The paper implies that this distinction justifies ranking financial alternatives by ranking their IRRs. In the current article, it is argued that direct IRR ranking does not necessarily provide the same evaluation environment, and therefore a fair comparison, for each alternative involved, and that an incremental ranking approach is needed to remedy this shortcoming. The article also points out that Tang and Tang's numerical examples of simple projects, with one sign change in their cash flow patterns, do not address the problem of multiple IRRs, which consequently renders Tang and Tang's ranking approach dysfunctional. It is demonstrated that the concept of a true rate of return, substituting for the non-performing IRR and applied in conjunction with the incremental approach, provides an adequate tool for ranking mutually exclusive projects or a project's technical or financial alternatives.
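The multiple-IRR problem the article raises is easy to reproduce. The sketch below (illustrative numbers, not taken from the paper) shows a cash flow stream with two sign changes whose NPV equation has two real roots, so a "direct" IRR ranking has no single rate to rank by:

```python
import numpy as np

def npv(rate, cash_flows):
    """Net present value of a cash flow stream at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Nonconventional stream with two sign changes: -1000, +2500, -1540.
cfs = [-1000, 2500, -1540]

# IRRs solve NPV = 0; in x = 1 + r this is -1000 x^2 + 2500 x - 1540 = 0.
roots = np.roots([-1000, 2500, -1540])
irrs = sorted(float(np.real(x)) - 1 for x in roots if abs(np.imag(x)) < 1e-9)
# Two real IRRs (10% and 40%) make the direct IRR ranking ill-defined.
```

Both rates zero out the NPV, which is exactly why the incremental approach (or a true rate of return) is needed instead.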

2.
Commodity price simulation is useful in many engineering economics applications, yet discrete approximations of the continuous stochastic processes used in modeling commodity prices are not always straightforward. This article describes the exact solution for discretely simulating the Schwartz and Smith (2000) two-factor model of commodity prices.
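An exact discretization is possible because, conditional on the current state, the two factors are jointly Gaussian over any step length. A minimal sketch of such a simulator follows (the transition formulas are the standard ones for a mean-reverting plus arithmetic-Brownian pair; parameter names and values are illustrative assumptions, not the article's):

```python
import numpy as np

def simulate_ss(chi0, xi0, kappa, sigma_chi, mu_xi, sigma_xi, rho, dt, n_steps, rng):
    """Exact discrete simulation of a Schwartz-Smith-style two-factor log price:
    ln S_t = chi_t + xi_t, with chi mean-reverting (OU) and xi arithmetic
    Brownian motion with drift, correlated with coefficient rho."""
    e = np.exp(-kappa * dt)
    var_chi = sigma_chi**2 * (1 - e**2) / (2 * kappa)   # exact OU step variance
    var_xi = sigma_xi**2 * dt                           # ABM step variance
    cov = rho * sigma_chi * sigma_xi * (1 - e) / kappa  # exact cross-covariance
    L = np.linalg.cholesky(np.array([[var_chi, cov], [cov, var_xi]]))
    chi, xi = chi0, xi0
    log_prices = []
    for _ in range(n_steps):
        z = L @ rng.standard_normal(2)  # correlated Gaussian shocks
        chi = chi * e + z[0]            # exact OU transition
        xi = xi + mu_xi * dt + z[1]     # ABM transition
        log_prices.append(chi + xi)
    return np.array(log_prices)
```

Because the conditional moments are exact, the step size dt can be arbitrarily large without discretization bias.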

3.
A recent publication in this journal (Bernhard 1971) presented a comprehensive comparison and critique of discounting indices proposed for (i) examining whether a proposed independent project should be accepted and (ii) selecting a project from a set of mutually exclusive projects. The present paper examines procedures available in the literature of engineering and managerial economics, and in practice, for selecting projects from a given number of proposed capital investment projects when a given capital constraint does not permit undertaking all proposed projects. The present state of the art appears subject to much controversy and confusion, and the ensuing presentation intends to rectify this situation. In addition, a procedure for the above selection problem is developed. Use of this procedure is recommended for the many policy makers who are continually faced with the trade-off between elaborate methods on the one hand and cursory approximations on the other.

4.
A risk measure, expected opportunity loss (EOL), is introduced to quantify the potential loss of making an incorrect choice in risk-based decision making. Unlike Savage's (1951) minimax regret principle, EOL can account for the unbounded continuous random outcomes of alternatives and for decision makers' acceptable risk. This article studies the effects of the form of the loss function, the correlation among outcomes, and the acceptable risk on ranking results by considering a loss function in power form. The results show that the loss function and the outcome correlations can significantly influence the rankings of alternatives in risk-based decision making.
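Read as a sampling-based quantity, an EOL with a power-form loss can be sketched as follows (an assumed Monte Carlo formulation for illustration; the article's exact definition may differ):

```python
import numpy as np

def expected_opportunity_loss(chosen, others, p=1.0):
    """Monte Carlo EOL sketch: expected power-form loss from not holding the
    best-performing alternative in each sampled scenario.
    chosen: (n,) sampled outcomes of the chosen alternative
    others: (k, n) sampled outcomes of the k competing alternatives
    p: exponent of the power-form loss function"""
    best = np.maximum(chosen, others.max(axis=0))  # scenario-wise best outcome
    return np.mean((best - chosen) ** p)
```

Sampling the outcomes jointly (rather than independently) is what lets the measure reflect correlation among alternatives, one of the effects the article studies.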

5.
Management of swine waste generated in the United States is a challenging problem facing engineers, farmers, scientists, regulators, and policy-makers. Technologies for processing and storing swine waste have not been fully developed and refined in a manner acceptable to the public and environmental regulators. The primary concerns with improperly disposed swine waste are the effects on human and livestock health, surface and groundwater quality, air quality, and conservation of nitrogen fertilizers (Hagenstein 2003).

The purpose of this study is to demonstrate the concept of target costing by applying it to a very specific example: the production of biomethanol from swine manure. This study summarizes the analyses that outline a design and calculate a preliminary cost estimate for a proposed system for producing biomethanol from swine manure (initial process). In this study the target costing process is demonstrated with calculation of a target cost. This article also demonstrates an application of value engineering as a systematic, interdisciplinary examination of factors affecting the cost of a product so as to find means to fulfill the product's specified purpose at the required standards of quality and reliability and at an acceptable cost.

The article is organized as follows. First, the purpose of applying target costing methodology to the development of marketable by-products from swine manure is outlined. Next, a target cost is calculated for biomethanol made from swine manure, based on current methanol prices and currently available subsidies. A system for producing biomethanol from swine manure is described, and the current cost of producing biomethanol is calculated. Concepts of value engineering are employed to reduce a significant cost component of the initial process, resulting in Process II. Finally, value engineering is employed a second time to further reduce the cost of Process II, yielding Process III.
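The target-cost arithmetic behind the study's first step can be sketched as follows (the price, subsidy, and margin figures are hypothetical placeholders, not the study's data):

```python
def target_cost(market_price, subsidy, required_margin):
    """Target costing sketch: the allowable unit cost is the achievable unit
    revenue (market price plus any per-unit subsidy) less the required profit,
    with the margin expressed as a fraction of revenue."""
    revenue = market_price + subsidy
    return revenue * (1 - required_margin)

# Hypothetical: methanol at $1.10/gal, a $0.30/gal biofuel subsidy, 15% margin.
allowable = target_cost(1.10, 0.30, 0.15)
```

Value engineering then works backward from this allowable cost, which is the logic the study applies twice to move from the initial process to Process II and Process III.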

6.
This paper introduces a new method of capital project analysis called the perpetuity rate of return (PRR). As implied by its name, the PRR is found by transforming a project's cash flow stream into a perpetuity and then relating this value to the required investment outlay. The PRR method is essentially a compromise between the NPV and IRR techniques. Like the NPV, the PRR correctly values a project's cash flows by using the market-determined cost of capital as the discount rate; like the IRR, the PRR is a rate of return that is appropriately compared to the cost of capital to determine a project's acceptability. The new yield-based method fares well in comparison with the IRR on a conceptual level and appears to have practical potential.
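One minimal reading of the transformation, assuming the perpetuity payment is the discounted value of the future cash flows times the cost of capital k, can be sketched as follows; under this reading PRR > k exactly when NPV > 0, which is the compromise property the abstract describes:

```python
def perpetuity_rate_of_return(outlay, future_cfs, k):
    """PRR sketch (assumed formulation): discount the future cash flows at the
    market cost of capital k, restate that present value as an equivalent
    perpetuity payment (PV * k), and relate the payment to the initial outlay."""
    pv = sum(cf / (1 + k) ** t for t, cf in enumerate(future_cfs, start=1))
    return (pv * k) / outlay
```

For a break-even project (NPV = 0) the PRR equals k, so the accept/reject decision agrees with the NPV rule while still being expressed as a rate of return.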

7.
This article shows that the internal rate of return (IRR) of a project's expected cash flow stream is a weighted average of the IRRs offered by the project's (many) possible future outcomes, where the weights are calculated using the outcome probabilities and invested capital balances. Because the invested capital associated with a particular realization is a function of the Macaulay duration of the cash flows in that outcome, the weights depend on the outcome probabilities and the effective length of each cash flow stream.

8.
This paper examines the use of a simple heuristic for evaluating projects. We posit that ranking projects by IRR and rejecting marginal projects can be superior to an NPV rule if (1) project managers have incentives to overstate cash flow forecasts that occur late in a project's life, (2) project rankings determine project acceptance because not all positive-NPV projects are accepted, and (3) a project's IRR is greater than the WACC. In these instances, the IRR heuristic undervalues distant cash flows and thus reduces project managers' incentives to positively bias forecasts.
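The mechanism is plain discounting arithmetic: when cash flows are effectively weighted at the IRR rather than the WACC, late-year flows are worth far less, so padding them gains a manager little. A hypothetical illustration (the rates and amounts are assumptions, not from the paper):

```python
def pv(cf, rate, t):
    """Present value of a single cash flow cf received at time t."""
    return cf / (1 + rate) ** t

# A $100 forecast bump placed in year 10, WACC 10% vs. a project IRR of 25%:
at_wacc = pv(100, 0.10, 10)   # sizeable impact on an NPV comparison
at_irr = pv(100, 0.25, 10)    # much smaller effect on an IRR-based ranking
```

The year-10 dollar is worth roughly 3.6 times more under the WACC than under the higher IRR, which is why the heuristic blunts the incentive to inflate distant forecasts.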

9.
Open-ended design problems have become an important component in our educational landscape (Grubbs and Strimel in J STEM Teach Educ 50(1):77–90, 2015; Jonassen et al. in J Eng Educ 95:139–151, 2006; National Research Council in Education for life and work: developing transferable knowledge and skills in the 21st Century, National Academies Press, Washington, 2012; Strimel in Technol Eng Teach 73(7):8–18, 2014a). The ability of students to confront open-ended problem scenarios, think creatively, and produce novel designs has been lauded as a necessary skill for today's twenty-first-century learners (Partnership for 21st Century Skills in P21 framework definitions, Author, Washington, 2016). This emphasis on open-ended design problems in problem-based learning scenarios has been tied to workforce and higher-education preparation for students (National Academy of Engineering and National Research Council in STEM integration in K–12 education: status, prospects, and an agenda for research, National Academies Press, Washington, 2014; National Research Council in Engineering in K–12 education: understanding the status and improving the prospects, National Academies Press, Washington, 2009; Strimel in Technol Eng Teach 73(5):16–24, 2014b). However, little research has been conducted to identify the impact of potentially influential factors on student success in such open-ended design scenarios. Therefore, the researchers examined data from 706 middle school students, working in small groups, as they completed an open-ended design challenge, to determine the relationships between a variety of potentially influential factors and student performance as measured through adaptive comparative judgment. The analysis revealed several relationships, significant and not significant, between the identified variables and student success on open-ended design challenges.

10.
Cultivating students’ design abilities can be highly beneficial for the learning of science, technology, engineering, and mathematics (STEM) concepts, and development of higher-order thinking capabilities (National Academy of Engineering and National Research Council in STEM integration in k-12 education: status, prospects, and an agenda for research, The National Academies Press, Washington, 2014). Therefore, examining students’ strategies, how they distribute their cognitive effort, and confront STEM concepts during design experiences, can help educators identify effective and developmentally appropriate methods for teaching and scaffolding design activities for students (National Research Council in standards for k-12 engineering education? The National Academies Press, Washington, 2010). Yet, educational researchers have only recently begun examining students’ engineering design cognition at the P-12 level, despite reports such as Standards for K-12 Engineering Education? (National Research Council 2010) designating this area of research as lackluster. Of the recent studies that have investigated engineering design cognition at the P-12 level, the primary method of investigation has been verbal protocol analysis using a think-aloud method (Grubbs in further characterization of high school pre- and non-engineering students’ cognitive activity during engineering design, 2016). This methodology captures participants’ verbalization of their thought process as they solve a design challenge. Analysis is typically conducted by applying a pre-determined coding scheme, or one that emerges, to determine the distribution of a group’s or an individual’s cognition. Consequently, researchers have employed a variety of coding schemes to examine and describe students’ design cognition. 
Given the steady increase of explorations into connections between P-12 engineering design cognition and development of student cognitive competencies, it becomes increasingly important to understand and choose the most appropriate coding schemes available, as each has its own intent and characteristics. Therefore, this article presents an examination of recent P-12 design cognition coding schemes with the purpose of providing a background for selecting and applying a scheme for a specific outcome, which can better enable the synthesis and comparison of findings across studies. Ultimately, the aim is to aid others in choosing an appropriate coding scheme, with cognizance of research analysis intent and characteristics of research design, while improving the intentional scaffolding and support of design challenges.

11.
In June 1982 the Justice Department issued its Merger Guidelines, which specify, in terms of the Herfindahl index (H), what combinations of merger size and post-merger H are likely to lead to a merger challenge. This paper assesses these Guidelines using Williamson's (1968) well-known model, in which an optimal merger policy is viewed as one that considers both the price and cost consequences of merger. The Williamson model is recast in terms of H and changes in H and linked to the Guidelines. This allows an assessment of the welfare consequences of an industry merger for any given level of concentration and any merger-produced change in concentration. Among the conclusions are that, consistent with the Guidelines, higher values of H make socially successful mergers less likely, and that a more appropriate, if perhaps not more feasible, focus for the Guidelines would be coordination-adjusted measures of concentration and merger size.

12.
13.
The internal rate of return (IRR) is often used by managers and practitioners for investment decisions. Unfortunately, it has serious flaws: (1) multiple real-valued IRRs may arise; (2) complex-valued IRRs may arise; (3) the IRR is, in general, incompatible with the net present value (NPV) in accept/reject decisions; (4) the IRR ranking is, in general, different from the NPV ranking; (5) the IRR criterion is not applicable with variable costs of capital. The efforts of economists and management scientists in providing a reliable project rate of return have generated over the decades an immense bulk of contributions aiming to solve these shortcomings. This article offers a complete solution to this long-standing unresolved issue by changing the usual perspective: the IRR equation is dismissed and the evaluator is allowed to describe the project as an investment or a borrowing at his discretion. This permits showing that any arithmetic mean of the one-period return rates implicit in a project reliably informs about a project's profitability and correctly ranks competing projects. With such a measure, which we call average internal rate of return, complex-valued numbers disappear and all the above-mentioned problems are wiped out. The economic meaning is compelling: it is the project return rate implicitly determined by the market. The traditional IRR notion may be found as a particular case.
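The construction can be sketched numerically. In this sketch (one reading of the proposal; the interim capital values are the evaluator's free choice), the one-period return rates are backed out of an assumed capital stream and averaged with discounted-capital weights; the resulting average rate then sits above the cost of capital exactly when the NPV is positive:

```python
def airr(cash_flows, capitals, k):
    """Average internal rate of return (sketch, after the article's proposal).
    capitals[t] is the capital invested at the start of period t+1, chosen by
    the evaluator with capitals[0] = -cash_flows[0]; the period rate r_t
    satisfies c_t = c_{t-1}(1 + r_t) - f_t, and the AIRR is the mean of the
    r_t weighted by discounted beginning-of-period capital."""
    n = len(cash_flows) - 1
    c = list(capitals) + [0.0]   # capital is fully recovered at the end
    rates, weights = [], []
    for t in range(1, n + 1):
        rates.append((c[t] + cash_flows[t] - c[t - 1]) / c[t - 1])
        weights.append(c[t - 1] / (1 + k) ** (t - 1))
    return sum(r * w for r, w in zip(rates, weights)) / sum(weights)
```

No root-finding is involved, so no multiple or complex-valued rates can arise; the accept/reject signal (AIRR versus k) agrees with the NPV sign regardless of which interim capitals the evaluator picks.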

14.
Recent science education reforms in the United States have prompted increased efforts to teach engineering design as an approach to improve STEM (science, technology, engineering, and mathematics) learning in K-12 classrooms. Teaching design in the early grades is a new endeavor for teachers in the United States, and much can be learned from design teaching and research on K-12 design education outside the US. The purpose of this study was to explore how students learn and use design sketching to support their learning of science and design practices. Researchers provided a treatment of design sketching instruction based on best practices from prior research findings (Hope in Des Technol Educ Int J 10:43–53, 2005; Gustafson et al. in J Technol Educ 19(1):19–34, 2007). A delayed-treatment model was used to provide a two-group counterbalanced quasi-experimental design comparing an experimental group and a comparison (delayed treatment) group across six grade 3 classrooms. Researchers employed Hope's (2005) frame to organize sketching data for analysis. Findings from this study indicated that the design instruction treatment did improve students' design and communication practices, moving from the use of sketching as a container of ideas to the use of sketching as a form of design communication and a way to refine design ideas. Both the treatment and comparison groups improved their sketching skills after the treatment was provided to both groups. Sketching is a design practice that can also help students learn science concepts through the generation of mental models of conceptual understanding.

15.
This paper examines the place of manual technical drawing in the 21st century by discussing the perceived value and relevance of teaching school students how to draw using traditional instruments in a world of computer-aided drafting (CAD). Views were obtained through an e-survey, questionnaires, and structured interviews. The sample groups represent professional CAD users (e.g., engineers, architects); university lecturers; Technology Education teachers and student teachers; and school students taking Scottish Qualification Authority (SQA) Graphic Communication courses. An analysis of these personal views and attitudes indicates some common values among the various groups canvassed regarding what instruction in traditional manual technical drafting contributes to learning. Themes emerge such as problem solving, visualisation, accuracy, co-ordination, use of standard conventions, personal discipline, and artistry. In contrast to the assumptions of Prensky's digital-natives thesis [(2001a) Digital natives, digital immigrants. On the Horizon 9, 5. NCB University Press. Retrieved Oct 2006 from http://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf; (2001b) Digital natives, digital immigrants, Part 2: Do they really think differently? On the Horizon 9, 6. NCB University Press. Retrieved Oct 2006 from http://www.marcprensky.com/writing/Prensky%20%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part2.pdf], the study reported in this paper indicates that school students apparently appreciate the experience of traditional drafting. In conclusion, the paper illustrates the perceived value of such learning in terms of transferable skills, personal achievement, and enjoyment.

16.
The Federal Radio Commission regulated radio broadcasting, 1927–1934. With the passage of the Communications Act of 1934, the 1927 Radio Act (enabling the Commission) was re-enacted in whole. This congressional endorsement yields key evidence as to what policy outcomes were intended, differentiating competing theories for the origins of spectrum allocation law: Coase (J Law Econ 2(1):1–40, 1959), emphasizing policy error; Hazlett (J Law Econ 33:133–175, 1990), focusing on "franchise rents" in a public choice framework; and the "public interest" hypothesis, reconstructed by Moss and Fein (J Policy Hist 15(4):389–416, 2003). Congress' revealed preferences prove consistent with the franchise rents theory, while contradicting the other two.

17.
The double sampling (DS) chart can reduce the required sample size when monitoring the process mean. In this study, Duncan's cost model was modified by adding statistical constraints to develop a design model of the DS chart for optimizing the design parameters: sample size, control limit coefficient, warning limit coefficient, and sampling interval. A numerical example is provided to illustrate the use of this model, and a sensitivity analysis of the effects of model parameters and statistical constraints on the optimal design of the DS chart is also performed.
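For readers unfamiliar with the chart being optimized, the two-stage decision rule of a double sampling chart can be sketched as follows (a textbook-style sketch with assumed limit names L, L1, L2, not the article's cost model):

```python
def ds_chart_decision(z1, second_sample, L, L1, L2):
    """Two-stage decision rule of a double sampling (DS) X-bar chart (sketch).
    z1: standardized mean of the first sample
    second_sample: zero-argument callable returning the standardized combined
        mean, invoked only if the second sample is actually needed
    L < L1: stage-1 warning and control limits; L2: stage-2 control limit."""
    if abs(z1) <= L:
        return "in control"       # conclusive on the first sample alone
    if abs(z1) > L1:
        return "out of control"   # conclusive on the first sample alone
    z = second_sample()           # ambiguous zone: draw the second sample
    return "in control" if abs(z) <= L2 else "out of control"
```

Because the second sample is drawn only when the first is inconclusive, the average sample size falls, which is the saving Duncan-style cost models trade off against statistical performance.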

18.
To meet the intentions of the New Zealand Curriculum 2007, teachers must critically reflect on their role and their idea of what defines 'best practice' for teaching and learning in the twenty-first century. The teacher's role has changed considerably over time, and there is now, more than ever, a need for much greater transparency, accountability, and collaborative practice within education. Famous philosophers and theorists, including Plato, Rousseau, and Dewey, have expounded ideals of authenticity and authentic engagement, but it is only with the spread of constructivism that authenticity has gained more favour. The authors investigate perspectives of authenticity, authentic learning, and authentic activities (Kreber et al. in Adult Educ Q Am Assoc Adult Contin Educ 58(1):22–43, 2007; Newmann in Authentic achievement: restructuring schools for intellectual quality, Jossey-Bass Publishers, San Francisco, 1996; Newmann and Wehlage in Educ Leadersh 50(7):8–12, 1993; Reeves et al. in Quality conversations. Paper presented at the 25th HERDSA annual conference, 2002; Splitter in Stud Philos Educ 28(2):135–151, 2008). Through qualitative investigation they identify and summarise key viewpoints and demonstrate how these can be successfully implemented through programmes of technology education. A model of authentic technology for producing quality technological outcomes is presented. The authors show how an activity from an initial teacher education course in technology education uses identified aspects of authentic technological practice, through the various dimensions of authenticity, to develop enduring learning for students. They consider the role of context in developing learning and introduce some new ideas on successful student engagement in the field of conation (Riggs and Gholar in Strategies that promote student engagement, Corwin Press, California, 2009).
Conation is defined as the will, drive and effort behind students' engagement in learning and is increasingly seen as an integral part of authentic education.

19.
This article discusses the empirical challenges that researchers face when demonstrating the existence and effects of resale price maintenance (RPM). We outline three approaches for finding price effects of RPM and the corresponding hurdles in data and methodology. We show that the quantity test that was suggested by Posner (Univ Chic Law Rev 45(1):1–20, 1977; Univ Chic Law Rev 48:6–26, 1981) does not identify the change to welfare when demand-enhancing effects are considered generally. Finally, we present some solutions to the challenge of identifying welfare effects, and we suggest guidelines for future research.

20.
The invention of the price/cost margin (P-MC)/P as an index of market power is usually credited to Lerner (Rev Econ Stud 1(3):157–175, 1934). Landes and Posner (Harv Law Rev 94(5):937–996, 1981) is similarly often considered the main reference for the generalized version of the index in the case of a dominant firm that shares the market with price-taking rivals. From the viewpoint of the history of industrial economics, both claims are incorrect: it was not Lerner who invented the price/cost margin index, and the generalized version was fully derived before WWII. In both cases, priority should be given to Luigi Amoroso, the leading Italian mathematical economist of the interwar decades. In the latter case the names of Heinrich von Stackelberg and George Stigler also deserve credit.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号