Similar Documents
20 similar documents found (search time: 31 ms)
1.
This paper extends the forestry maximum principle of Heaps (1984) to allow the benefits of harvesting to be the utility of the volume of wood harvested, as in Mitra and Wan (1985, 1986). Unlike those authors, however, time is treated as a continuous rather than a discrete variable. Existence of an optimal harvesting policy is established, and necessary conditions, which are also sufficient, are then derived for the extended model. The conditions are used to show that under certain boundedness conditions, sequences of optimal harvesting policies contain subsequences which converge pointwise a.e. and in net present value to an optimal harvesting policy. This result is then used to show that any optimal logging policy must converge in harvesting age to a constant rotation period given by a modified Faustmann formula. The associated age class distribution converges to a normal forest.
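The modified Faustmann rotation the abstract refers to can be illustrated numerically. The sketch below grid-searches the rotation age that maximizes the classical Faustmann land expectation value; the logistic growth curve, timber price, planting cost, and discount rate are illustrative assumptions, not values from the paper.

```python
import math

def lev(T, price=50.0, plant_cost=1000.0, r=0.03,
        volume=lambda t: 500.0 / (1.0 + math.exp(-0.15 * (t - 30.0)))):
    """Land expectation value of an infinite series of rotations of length T
    (classical Faustmann formula; growth curve and prices are illustrative)."""
    d = math.exp(-r * T)
    return (price * volume(T) * d - plant_cost) / (1.0 - d)

def faustmann_rotation(t_min=5.0, t_max=100.0, step=0.1):
    """Grid search for the rotation age that maximizes land expectation value."""
    best_t, best_v = t_min, lev(t_min)
    t = t_min
    while t <= t_max:
        v = lev(t)
        if v > best_v:
            best_t, best_v = t, v
        t += step
    return best_t, best_v

T_star, value = faustmann_rotation()
```

With these assumed parameters the interior maximum falls near age 36; a constant rotation of this length is the continuous-time analogue of the fixed rotation period in the abstract.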

2.
Slowly moving fundamental time series can be mistaken for time trends, and their use can increase the credibility of medium-term and long-term forecasts. This paper introduces a new slowly moving fundamental time series, the age distribution of the US population, to explain trends in real US interest rates over the past 35 years. We argue that lifecycle consumption patterns at the individual level can influence aggregate saving and real interest rates. Empirical evidence is presented that supports the relationship between the age distribution and expected real interest rates, and simulations of future interest rates are developed.

3.
郝建生 《价值工程》2014,(21):44-46
The motion parameters affecting the cutting load of a longitudinal-axis roadheader were analyzed. A finite element model of the cutting head cutting coal-rock was built in ANSYS, and the cutting process was simulated with LS-DYNA, yielding dynamic curves of the cutting load over time. The effects of the motion parameters with the greatest influence on the cutting load during cutting, including the drilling speed, swing speed, and rotation speed of the cutting head, were compared. The simulation results show that the cutting load increases with the drilling and swing speeds of the cutting head and decreases somewhat as the rotation speed increases. The results provide a theoretical basis for the rational selection of cutting-head motion parameters and for improving cutting performance.

4.
In a one-sector optimal growth model with uncertainty about production, optimal capital stocks converge in distribution to a stochastic modified golden rule [see, for example, Brock and Mirman (1972, 1973)]. We show that such a result cannot be obtained, in general, if in addition to the random one-period shocks to production there is also a lasting shock to the production function at some random date in the future; however, the conditional optimal capital stocks ‘bunch together’ over time, i.e., a turnpike result for optimal programs is proved.
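The 'bunching together' of optimal capital stocks can be seen in the special case of the Brock-Mirman model with log utility, Cobb-Douglas production z·k^alpha and full depreciation, where the optimal policy has the known closed form k' = alpha·beta·z·k^alpha. A minimal simulation (parameter values illustrative) shows two paths started far apart converging under a common shock sequence:

```python
import random

def simulate(k0, shocks, alpha=0.36, beta=0.95):
    """Optimal capital path for the Brock-Mirman model with log utility,
    where the optimal policy is the closed form k' = alpha*beta*z*k**alpha."""
    path = [k0]
    for z in shocks:
        path.append(alpha * beta * z * path[-1] ** alpha)
    return path

random.seed(0)
shocks = [random.uniform(0.8, 1.2) for _ in range(200)]
path_low = simulate(0.05, shocks)   # start well below the stochastic steady state
path_high = simulate(5.0, shocks)   # start well above it
gap = abs(path_low[-1] - path_high[-1])
```

Because the log gap between the two paths contracts by the factor alpha each period, the paths become numerically indistinguishable, the deterministic analogue of the turnpike result proved in the paper.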

5.
This paper tests the effect on stock value of an expected change in future trading costs. The capitalized value of a reduction in trading costs is hypothesized to increase the stock value, a trading cost effect. Improved liquidity reduces trading costs. Inclusion as an S&P 500 Index replacement stock is an event hypothesized to increase liquidity. We use 114 observations between January 1, 1983 and October 12, 1989 of stocks added to the Index as replacements for stocks removed. The abnormal return of each stock is regressed against the ratio of the bid-ask spread to the price of the stock, the change in trading volume of the stock, and the open interest in the Index futures contracts at the close of the month prior to the replacement announcement. We find that the positive abnormal returns for replacement stocks are related to increased daily trading volume after inclusion in the Index. Further, the trading cost effect is proportional to percentage bid-ask spreads prior to inclusion. The trading cost effect increases as trading in derivatives of the Index increases. The volume and stock price changes after replacement are not transitory, indicating an improvement in liquidity. Three alternate hypotheses suggested in prior research to explain the abnormal returns for replacement stocks are tested. Testing each of the three models previously considered, price pressure, inelastic demand curves, and information, we find that none can be accepted with statistical confidence. The abnormal returns of Index replacement stocks are consistent with rational pricing of an anticipated reduction in future transaction costs. This anticipated reduction is capitalized in the value of the stock at the time of the replacement announcement. These results are consistent with a trading cost effect.

6.
7.
《Statistica Neerlandica》1962,16(3):291-302
In formulas for determining the optimal number of machines assigned to an operator, the ratio of service time to machine operating time is assumed to be constant for a given type of equipment. Changing the cutting speed on automatic lathes, however, changes the tool life and the unit production time, and therefore the ratio of service time to machine operating time and the efficiency. This article gives a method for determining the optimal cutting speed and the corresponding optimal number of machines assigned to an operator. In an example, a saving of f 2200 per lathe per year was attained by assigning 7 lathes to an operator instead of 3. A further saving of f 5500 per lathe per year was calculated by decreasing the cutting speed to three-fourths of its original value and correspondingly assigning 15 to 17 lathes to one operator.
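A deterministic sketch of the trade-off the article studies: with service time s and machine operating time t per cycle, one operator can tend up to 1 + t/s machines before interference stretches the cycle. The model and all cost figures below are my own illustrative assumptions, not the article's data, though they echo its 7-versus-3 lathe comparison:

```python
def max_machines(service_time, run_time):
    """Largest number of machines one operator can serve with no machine
    idle time in a deterministic cycle: n <= 1 + run_time / service_time."""
    return int(run_time // service_time) + 1

def cost_per_unit(n, service_time, run_time, operator_rate, machine_rate):
    """Unit cost when n machines share one operator (deterministic cycles).
    If n exceeds the interference-free maximum, the cycle stretches to
    n * service_time and machines idle part of the time."""
    cycle = max(service_time + run_time, n * service_time)
    units_per_hour = n / cycle           # one unit per machine per cycle
    hourly_cost = operator_rate + n * machine_rate
    return hourly_cost / units_per_hour
```

With service time 1, run time 6, and illustrative hourly rates of 30 (operator) and 10 (machine), seven machines per operator yield a lower unit cost than three, since the operator's cost is spread over more output with no added interference.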

8.
Goods such as vegetables, fruit, and fresh food deteriorate substantially during storage, and the cost of this deterioration cannot be ignored. This paper builds a production-inventory model with constant demand rate, production rate, and deterioration rate, in which shortages are allowed and the backlogged portion is supplied with a time-varying delay. A method for finding the optimal solution is given, the uniqueness of the solution is proved, and the effects of changes in the main parameters on average profit and service level are analyzed.
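For cycle-based models of this kind, a first-order approximation (valid for a small deterioration rate theta times cycle length T) gives an EOQ-type closed form. The sketch below is a generic illustration with invented cost figures, not the paper's time-varying backlogging model:

```python
import math

def cycle_cost_rate(T, K=200.0, D=50.0, h=0.4, c=5.0, theta=0.02):
    """Approximate cost per unit time for a cycle of length T:
    setup K/T, holding h*D*T/2, deterioration loss c*theta*D*T/2
    (first-order approximation for small theta*T)."""
    return K / T + 0.5 * D * T * (h + c * theta)

def optimal_cycle(K=200.0, D=50.0, h=0.4, c=5.0, theta=0.02):
    """Closed-form minimizer of the approximate cost rate (EOQ-type):
    deterioration acts like an extra holding charge c*theta."""
    return math.sqrt(2.0 * K / (D * (h + c * theta)))

T_opt = optimal_cycle()
```

Note how deterioration simply inflates the effective holding cost, shortening the optimal cycle relative to the classical EOQ.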

9.
Misinvoicing is a major tool in fraud, including money laundering. We develop a method of detecting the patterns of outliers that indicate systematic mis-pricing. As the data only become available year by year, we develop a combination of very robust regression and the use of ‘cleaned’ prior information from earlier years, which leads to early and sharp indication of potentially fraudulent activity that can be passed to legal agencies to institute prosecution. As an example, we use yearly imports of a specific seafood into the European Union. This is only one of over one million annual data sets, each of which can currently contain up to 336 observations. We provide a solution to the resulting big data problem, which requires analysis with a minimum of human intervention.
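The authors' very robust regression with cleaned prior information is not reproduced here; the following simplified sketch illustrates the general idea of robust fitting followed by outlier flagging, using a brute-force least-median-of-squares line and MAD-scaled residuals. All data are invented (unit value roughly linear in quantity, with two grossly under-invoiced lots):

```python
from itertools import combinations
import statistics

def lms_line(xs, ys):
    """Least-median-of-squares line fit by exhaustive search over lines
    through pairs of points: robust to nearly 50% outliers."""
    best = None
    for i, j in combinations(range(len(xs)), 2):
        if xs[i] == xs[j]:
            continue
        b = (ys[j] - ys[i]) / (xs[j] - xs[i])
        a = ys[i] - b * xs[i]
        med = statistics.median((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
        if best is None or med < best[0]:
            best = (med, a, b)
    return best[1], best[2]

def flag_outliers(xs, ys, cutoff=2.5):
    """Flag points whose residual from the robust fit exceeds `cutoff`
    robust standard deviations (1.4826 * MAD)."""
    a, b = lms_line(xs, ys)
    res = [y - (a + b * x) for x, y in zip(xs, ys)]
    mad = statistics.median(abs(r - statistics.median(res)) for r in res)
    scale = 1.4826 * mad or 1e-12
    return [i for i, r in enumerate(res) if abs(r) / scale > cutoff]

xs = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
ys = [2.1, 4.0, 6.2, 7.9, 10.1, 0.5, 14.1, 16.0, 1.0, 20.2]
suspect = flag_outliers(xs, ys)
```

The two anomalously cheap lots stand out sharply against the robust fit, which a least-squares fit dragged toward them would partly mask.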

10.
The tree produced by the Sonquist and Morgan method is based on the principle of dichotomizing the population at each node according to one of the independent variables so as to explain as much variance of the dependent variable as possible. Before applying this tree-analysis method to a distribution of the dependent variable, we may imagine that this distribution is the result of some process Pj. The present paper describes such processes, in which each element of the population passes through a sequence of nodes, at each of which the value of the dependent variable is modified. These modifications can be expressed by certain types of regression equations, and the nodes can be considered to form a tree structure. If we analyse the resulting distribution, the tree underlying the process described above may emerge; however, the regression functions involved are not determined, and non-tree-like processes may also produce a given distribution. Thus tree analysis can reveal only part of the processes that led to a given distribution. The explanation of such processes by social science theories is studied.

11.
The aim of this paper is to derive methodology for designing ‘time to event’ experiments. In comparison to estimation, the design aspects of ‘time to event’ experiments have received relatively little attention. We show that gains in the efficiency of parameter estimators and in the use of experimental material can be made using optimal design theory. The types of models considered include classical failure data, accelerated testing situations, and frailty models, each involving covariates which influence the outcome. The objective is to construct an optimal design based on the values of the covariates and the associated model, or indeed a candidate set of models. We consider D-optimality and create compound optimality criteria to derive optimal designs for multi-objective situations which, for example, focus on the number of failures as well as on the estimation of parameters. The approach is motivated and demonstrated using common failure/survival models, for example the Weibull distribution, product assessment and frailty models.
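In the simplest setting, a D-optimal design maximizes det(X'X). The sketch below does this by exhaustive search for a straight-line model over five candidate support points; the paper's survival models and compound criteria are not reproduced here:

```python
from itertools import combinations_with_replacement

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def info_matrix(design):
    """Information matrix X'X for the simple linear model E[y] = b0 + b1*x."""
    n = len(design)
    sx = sum(design)
    sxx = sum(x * x for x in design)
    return [[n, sx], [sx, sxx]]

def d_optimal(candidates, n_runs):
    """Exhaustive D-optimal design: maximize det(X'X) over all
    multisets of n_runs support points drawn from `candidates`."""
    best = max(combinations_with_replacement(candidates, n_runs),
               key=lambda d: det2(info_matrix(d)))
    return best, det2(info_matrix(best))

design, crit = d_optimal([-1.0, -0.5, 0.0, 0.5, 1.0], 4)
```

As classical theory predicts for a straight line on [-1, 1], the search places half the runs at each endpoint.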

12.
In this paper, we study the consumption, labor supply, and portfolio decisions of an infinitely lived individual who receives a certain wage rate and income from investment in a risky asset and a risk-free bond. Uncertainty about labor income arises endogenously, because labor supply evolves randomly over time in response to changes in financial wealth. We derive closed-form solutions for the optimal consumption, labor supply and investment strategy. We find that deferring the retirement age stimulates optimal consumption over time and discourages optimal labor supply during the working life. We also find explicitly that the optimal portfolio allocation becomes more ‘conservative’ as the individual approaches his prescribed retirement age. The effects of risk-aversion coefficients on optimal decisions are examined.
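The conservatism finding can be illustrated with Merton's classical constant portfolio share for CRRA utility, (mu - r) / (gamma * sigma^2), the benchmark that closed-form solutions of this type build on; the parameter values below are illustrative:

```python
def merton_share(mu, r, sigma, gamma):
    """Classical Merton portfolio rule for CRRA utility: the fraction of
    wealth held in the risky asset is (mu - r) / (gamma * sigma**2)."""
    return (mu - r) / (gamma * sigma ** 2)

# Higher effective risk aversion produces a more 'conservative' allocation,
# mirroring the paper's finding for individuals nearing retirement.
shares = [merton_share(0.07, 0.02, 0.2, g) for g in (2, 5, 10)]
```

With an equity premium of 5% and volatility 20%, the risky share falls from 62.5% to 12.5% as gamma rises from 2 to 10.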

13.
While the likelihood ratio measures statistical support for an alternative hypothesis about a single parameter value, it is undefined for an alternative hypothesis that is composite in the sense that it corresponds to multiple parameter values. Regarding the parameter of interest as a random variable enables measuring support for a composite alternative hypothesis without requiring the elicitation or estimation of a prior distribution, as described below. In this setting, in which parameter randomness represents variability rather than uncertainty, the ideal measure of the support for one hypothesis over another is the difference in the posterior and prior log‐odds. That ideal support may be replaced by any measure of support that, on a per‐observation basis, is asymptotically unbiased as a predictor of the ideal support. Such measures of support are easily interpreted and, if desired, can be combined with any specified or estimated prior probability of the null hypothesis. Two qualifying measures of support are minimax‐optimal. An application to proteomics data indicates that a modification of optimal support computed from data for a single protein can closely approximate the estimated difference in posterior and prior odds that would be available with the data for 20 proteins.
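By Bayes' rule, the ideal support (the difference between posterior and prior log-odds) equals the log Bayes factor, so it can be computed without fixing a prior probability for the null. A minimal illustration with a point null against a composite normal alternative, not the paper's proteomics setting:

```python
import math

def normal_pdf(x, mean, var):
    """Density of N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def ideal_support(x):
    """Posterior minus prior log-odds for H1 over H0, which equals the
    log Bayes factor.  Here H0: theta = 0 and the composite H1 treats
    theta ~ N(0, 1) as variability, with x | theta ~ N(theta, 1), so the
    marginal of x under H1 is N(0, 2)."""
    return math.log(normal_pdf(x, 0.0, 2.0) / normal_pdf(x, 0.0, 1.0))

support_far = ideal_support(3.0)   # data far from 0 favour the composite H1
support_near = ideal_support(0.0)  # data at 0 favour the point null
```

At x = 0 the support is exactly -log(2)/2, the cost the composite hypothesis pays for spreading its mass.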

14.
The scheduling of food production is often accomplished informally, based upon approximate time requirements stated in recipes and the judgment and experience of a food production manager, administrative dietician, or cook. Such schedules are seldom optimized to least overall duration and consequently contain periods of non-productive time on the part of personnel and resources. In addition, these schedules often attempt to avoid resource conflicts through the early scheduled completion of work activity; this leads to many of the menu items being completed sufficiently in advance of serving time that undesirable changes in the nutritional, organoleptic and microbiological properties of the food can take place.

In this paper, a branch and bound algorithm is presented as a solution procedure for the foodservice scheduling problem. The advantage of branch and bound over a heuristic-based scheduling procedure is that it can produce schedules which are optimized to least overall duration from start to finish. The added computational cost of the branch and bound procedure is justified because most foodservice systems cycle their menus; consequently, each of a finite number of schedules is reused numerous times over an extended period, resulting in long-term productivity gains.

Another advantage of the algorithm is that right-shifted, or late start, schedules may be produced. This is in keeping with our objective of minimizing the delay time between the completion of the food and its being served to the consumer.

The paper describes a method by which the process time for each of the various steps in a recipe may be computed as a function of the number of servings being prepared. Although these are normally considered to be linear relationships, the algorithm can easily be modified to accept other types of relationships as well.

Perhaps the most important aspect of this research is that the branch and bound algorithm has been implemented to perform branching operations over two classes of decisions. The first class involves the selection of which recipe steps or activities are to be scheduled at a certain time, and the second involves the choice of which resource class to assign to the activities in those cases where alternative resources are allowed. This dual branching philosophy provides a great deal of flexibility to the decision maker for handling the type of scheduling problems commonly found in practice. The expense of this added flexibility, however, is a substantial increase in the size of the decision tree which must be developed and explored.

To demonstrate the performance of the algorithm for practical purposes, the lunch menus of a short-term, acute-care, 300-bed hospital in Syracuse, New York were used to develop production schedules. These menus included a total of 89 different hot food items whose recipes were placed into a menu file in the computer, along with the coefficients needed to develop process time estimates as a function of the number of servings to be prepared. In total, fourteen lunch menus are cycled at the hospital; the number of items appearing on the menus ranges from 8 to 14 hot food items.

In the first series of tests, resources were assumed not to be interchangeable. The branch and bound procedure was successful in producing optimal solutions for eleven of the fourteen schedules. The three menus which were not optimally solved were aborted because the size of the solution tree grew beyond what could be stored in 500K bytes of common memory. In spite of this, however, the upper bound solutions given by the aborted problems were found to be very close to the lower bound values and may therefore be considered very good solutions.

In the second series of tests, certain resources were allowed to be interchangeable for some of the activities. Specifically, we assumed two labor classes. The first are called special cooks, who are more experienced personnel. Normally special cooks prepare entree items, but they may be called upon in some cases to perform any activity normally performed by the second labor class, the less experienced cooks, who normally prepare soups, sauces and vegetables. The cooks, on the other hand, are never allowed to prepare menu items which call for special cooks. These ground rules are identical to those currently in practice at the hospital from which the test problems were obtained. Ten of the fourteen menus were selected for this series of tests. In all ten cases, the algorithm successfully developed optimal solutions without exceeding the 500K byte common memory limitation. And, in spite of the vastly increased tree size resulting from the dual decision branches, the computation times for these tests were only modestly greater than those of the first set of tests, where alternative resources were not considered. The success of the algorithm in solving the dual decision problems resulted in large part from its ability to develop strong upper bounds very early in the solution process. In addition, the characteristics of food production scheduling problems are such that lower bound pruning is very effective, especially in the early stages of tree development.

It is important to note that the use of an algorithm such as this in a practical setting affords a very friendly user interface. Once the foodservice system's menu items have been placed into a file, the user may easily select any set of these items to include in a given schedule. Only three lines of data input are required: the first line specifies the number of items as well as the zero-hour, or serving time; the second identifies the item numbers of the hot-food items to be scheduled; and the third specifies the number of servings of each of those items which must be prepared. Kitchen personnel with limited experience can be trained to input the data in less than 15 minutes of instruction time.
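The paper's dual-branching algorithm with recipe precedence constraints is not reproduced here, but the core branch-and-bound mechanics it relies on (branching on assignment decisions, pruning with a lower bound against the incumbent upper bound) can be sketched for the simpler problem of assigning independent recipe steps to identical cooks to minimize makespan. Task durations are invented:

```python
def min_makespan(durations, n_resources=2):
    """Branch and bound for the minimum makespan when assigning independent
    tasks to identical resources.  Branching: which resource receives the
    next task.  Bounding: the larger of the current maximum load and
    total work / n_resources."""
    durations = sorted(durations, reverse=True)   # big tasks first prunes harder
    total = sum(durations)
    best = [total]                                # trivial incumbent: all on one resource

    def branch(i, loads, remaining):
        if i == len(durations):
            best[0] = min(best[0], max(loads))
            return
        lower = max(max(loads), (sum(loads) + remaining) / n_resources)
        if lower >= best[0]:                      # prune: cannot beat incumbent
            return
        tried = set()
        for r in range(n_resources):
            if loads[r] in tried:                 # skip symmetric branches
                continue
            tried.add(loads[r])
            loads[r] += durations[i]
            branch(i + 1, loads, remaining - durations[i])
            loads[r] -= durations[i]

    branch(0, [0] * n_resources, total)
    return best[0]

# Recipe-step durations in minutes, shared between two cooks.
schedule_length = min_makespan([40, 30, 30, 20, 15, 15], n_resources=2)
```

As in the paper, a strong incumbent found early lets the lower bound prune most of the tree; here the work splits evenly into two 75-minute loads.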

15.
Returns policies can prevent a manufacturer's product from being discounted. Such discounting discourages inventory holdings, and can deny adequate retail representation to products with uncertain demand. We demonstrate that returns are not simply a substitute for resale price maintenance, but can instead be employed to support a desirable degree of price dispersion at retail. Surprisingly, optimal return policies depend only on market demand functions and marginal production costs. The manufacturer need not know the distribution of demand uncertainty for its product, but can instead rely on retailers to order appropriately. Our results generalize to oligopoly settings.
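The mechanism by which a return credit induces larger retail orders can be sketched with a textbook newsvendor under full-credit returns (uniform demand; all numbers invented, and this is not the paper's price-dispersion model). The first-order condition gives the critical ratio F(q*) = (p - w) / (p - b) for retail price p, wholesale price w, and return credit b:

```python
def retailer_order(p, w, b, d_lo, d_hi):
    """Newsvendor order of a retailer facing demand uniform on [d_lo, d_hi],
    buying at wholesale price w, selling at p, and returning unsold units
    for credit b (with b < w < p).  Optimality: F(q*) = (p - w) / (p - b)."""
    ratio = (p - w) / (p - b)
    return d_lo + ratio * (d_hi - d_lo)

# A more generous return credit b raises the retailer's stocking quantity,
# the channel lever the abstract describes.
q_low_credit = retailer_order(p=10.0, w=6.0, b=0.0, d_lo=0.0, d_hi=100.0)
q_high_credit = retailer_order(p=10.0, w=6.0, b=4.0, d_lo=0.0, d_hi=100.0)
```

Note that the manufacturer's side of the critical ratio involves only prices and costs, echoing the abstract's point that the optimal policy does not require knowledge of the demand-uncertainty distribution.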

16.
王艳艳 《价值工程》2010,29(34):19-20
Based on an analysis of the single-vehicle distribution routing problem with simultaneous delivery and pickup, a mathematical model of the OV-VRPSDP is established, a new ratio-based optimization algorithm for solving the OV-VRPSDP is proposed, and its effectiveness and feasibility are verified with an example. The new algorithm first processes the vertices and edges of the distribution tree appropriately and computes the weight of each branch, so that the delivery vehicle proceeds preferentially along branches with larger weights to select an optimal route. The example shows that the ratio method solves the OV-VRPSDP quickly and yields good optimization results.

17.
Abstract. If forest industry taxation is to be put on a sound economic basis, the Federal Government, the largest landowner, should pay the same taxes as any other landowner, so that the social and economic effects of taxation are realized. Specialists report that the form of property tax preferred for taxing forest industry property, under most circumstances, is land value taxation, not the property tax based on income to be realized at some point in the future, which presumed the continued existence of virgin forests. This paper recognizes that the forest industry is now based on harvests of tree crops and proposes a further development of the land value taxation principle: a forest tax composed of a land value tax combined with a tax on tree growth which increases as growth, as a percentage of volume, decreases with the tree's increasing age.

18.
Tackling poverty has been one of the greatest global challenges and a prerequisite to the sustainable development of countries. Countries implement nationally appropriate social protection systems and measures to address poverty. This paper addresses an aid system adopted by the government in Turkey, under which significant amounts of coal are distributed to poor families each year. The objective of the coal aid system is to complete the delivery of coal to poor families by the start of winter. However, an analysis of data from previous years indicates that distribution to many families cannot be completed on time, because planning is done manually and by trial and error: there is no system that can be used for distribution planning. This paper describes the planning problem encountered and develops a mathematical model to solve it. The proposed model is a multimodal, multicommodity, and multiperiod linear programming (LP) model. The model can be used to develop and update a distribution plan, as well as to answer several what-if questions with regard to capacities, time constraints, and so forth. The model is solved using CPLEX for several problem instances obtained under different scenarios using data for the year 2012. The results show that at least 9% cost savings and about a 40% decrease in distribution completion time can be achieved when the model is used. We analyze the scenario results qualitatively and quantitatively and provide several insights for decision makers. As part of the quantitative analysis, we develop regression models to predict optimal costs based on several factors. Our main contribution is to provide an efficient and effective tool for handling a large-scale real-world problem. The model has also helped to demonstrate that the organization responsible for distribution planning can move from its current planning practice to an all-encompassing top-down approach.

19.
In this article, we study production planning over a rolling horizon in which demands are known with certainty for a given number of periods into the future (the forecast window M). The rolling horizon approach implements only the earliest production decision before the model is rerun; the next production plan will again be based on M periods of future demand information, and its first lot-sizing decision will be implemented.

Six lot-sizing methods were evaluated for use in a rolling schedule: the Wagner-Whitin algorithm, the Silver-Meal heuristic, the Maximum Part-Period Gain of Karni, Heuristics 1 and 2 of Bookbinder and Tan, and Modification 1 to the Silver-Meal heuristic by Silver and Miltenburg.

The performance of each lot-sizing rule was studied for demands simulated from the following distributions: normal and uniform (each with four different standard deviations), bimodal uniform (two types), and trend seasonal (both increasing and decreasing trends). We considered four values of the setup cost (leading to natural ordering cycles in an EOQ model of three, four, five, and six periods) and forecast windows in the range 2 ≤ M ≤ 20.

Eight 300-period replications were performed for each combination of demand pattern, setup cost, and lot-sizing method. The analysis thus required consideration of 2304 300-period replications (6 lot-sizing methods × 12 demand patterns × 4 values of setup cost × 8 realizations), each of which was solved for nineteen different values of the forecast window M. The performance of the lot-sizing methods was evaluated on the basis of their average cost increase over the optimal solution to each 300-period problem, as though all demands were known initially.

For smaller forecast windows, say 4 ≤ M ≤ 8, the most effective lot-sizing rules were Heuristic 2 of Bookbinder and Tan and the modified Silver-Meal heuristic; rolling schedules from each were generally 1% to 5% above the total cost of the optimal 300-period solution. For larger forecast windows, M ≥ 10 or so, the most effective lot-sizing method was the Wagner-Whitin algorithm. In agreement with other research on this problem, we found that the value of M at which the Wagner-Whitin algorithm first becomes the most effective lot-sizing rule is a decreasing function of the standard deviation of the demand distribution.
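Of the rules compared, the Wagner-Whitin algorithm is the one with a simple exact dynamic program. A standard O(n^2) sketch follows; the demand and cost figures are invented, not taken from the study:

```python
def wagner_whitin(demand, setup_cost, holding_cost):
    """Exact O(n^2) Wagner-Whitin dynamic program for uncapacitated lot
    sizing: f[t] is the minimum cost of meeting demand in periods 0..t-1,
    where the last lot is produced in some period j and covers j..t-1."""
    n = len(demand)
    f = [0.0] + [float("inf")] * n
    last = [0] * (n + 1)
    for t in range(1, n + 1):
        for j in range(t):        # candidate period of the last setup
            hold = sum(holding_cost * (k - j) * demand[k] for k in range(j, t))
            cost = f[j] + setup_cost + hold
            if cost < f[t]:
                f[t], last[t] = cost, j
    lots = [0] * n                # recover lot sizes by backtracking
    t = n
    while t > 0:
        j = last[t]
        lots[j] = sum(demand[j:t])
        t = j
    return f[n], lots

cost, lots = wagner_whitin([20, 50, 10, 50, 50], setup_cost=100, holding_cost=1)
```

In a rolling schedule, only the first entry of `lots` would be implemented before the horizon rolls forward and the program is re-solved.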

20.
Under asymmetric information, corporate headquarters does not know whether the intrinsic product cost at each workstation in the organization is high (low efficiency) or low (high efficiency). Managers of high-efficiency workstations have both the opportunity and the incentive to conceal their true capability, because once that capability becomes known, the production unit may face a harsher standard-cost scheme in the next period. To keep headquarters from raising cost-control performance targets, production responds with low effort and high realized costs, and the ratchet effect appears. Job rotation across workstations curbs the ratchet effect but undermines the experience-curve effect. This study builds a two-period game model, designs an optimal payment scheme, and derives conditions under which job rotation is effective.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)