Similar Documents
20 similar documents found (search time: 375 ms)
1.
For many production systems, delivery performance relative to promised job due dates is a critical evaluation criterion. Delivery performance is affected by the way in which work is dispatched on the shop floor, and also by the way the job due dates are assigned to begin with. This paper shows how information regarding congestion levels on the shop floor can be used to assign due dates to arriving jobs in such a way that the mean tardiness of jobs is decreased without increasing the average length of the promised delivery lead times.

Baker and Bertrand suggested a modification of the Total Work (TWK) rule for assigning job due dates which adjusts the job flow allowance according to the level of congestion in the shop. Their method gives longer flow allowances to jobs which arrive when the system is congested. Although their modified TWK rule results in lower mean tardiness in many settings, it also generally results in a higher proportion of jobs tardy.

This paper presents an alternative modification of the TWK rule which, in most cases, provides mean tardiness as low as or lower than Baker and Bertrand's rule and also results in a lower proportion of jobs tardy. The alternative rule suggested here still results in a higher proportion of tardy jobs than the non-workload-adjusted rule in most settings, but suggestions are made for how this problem might be addressed.
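The flavor of a workload-adjusted TWK rule can be sketched as follows; the function names, the multiplier k, the congestion measure, and the adjustment factor alpha are illustrative assumptions, since the abstract does not give either paper's exact formula:

```python
def twk_due_date(arrival, total_work, k=4.0):
    """Classic TWK rule: due date = arrival time plus a flow allowance
    proportional to the job's total work content (multiplier k)."""
    return arrival + k * total_work

def adjusted_twk_due_date(arrival, total_work, jobs_in_shop, avg_jobs_in_shop,
                          k=4.0, alpha=0.5):
    """Workload-adjusted TWK (sketch): lengthen the flow allowance when the
    shop holds more jobs than average, shorten it when it holds fewer.
    The congestion ratio and alpha are assumptions for illustration."""
    congestion = jobs_in_shop / avg_jobs_in_shop
    return arrival + k * total_work * (1.0 + alpha * (congestion - 1.0))
```

Because the allowance shrinks for jobs arriving in a quiet shop, the average quoted lead time need not increase even though congested-period jobs get more slack.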

2.
Both practitioners and researchers in the field of Operations Management have suggested that shop scheduling should be an integral component in both the strategic and tactical plans for an organization's assets. This paper examines the use of an accepted measure of return on assets, net present value (NPV), in a simulated shop scheduling environment where early shipment of jobs before their due dates is forbidden. In addition, early shipment of raw materials to the shop is also forbidden. This shop environment is consistent with the prevalent practice in industry of accepting orders only on a just-in-time basis to reduce purchased parts inventories. The NPV measure provides a means of balancing a variety of performance criteria that have previously been treated as separate objectives, including work-in-process inventory, finished goods inventory, mean flow time and mean tardiness, while also providing a means of measuring monetarily the value of various shop scheduling approaches.

The NPV performance of priority scheduling rules and order release policies is measured in this research through the simulation of a random job shop under a variety of environmental conditions. In a comparison of priority rules that use time-based information with those that use job-value information, the Critical Ratio rule provides higher average performance than the three other rules used in the study. However, in some situations that are consistent with JIT practice, value-based priority rules also perform well. The use of a mechanism for delaying the release of jobs to each work center in the shop provided higher average NPV when shop utilization was set at a low level of 80%, while immediate release of work upon its arrival to the shop provided superior performance at a higher shop utilization level of 94%. While JIT materials delivery and costing yields higher NPV, it did not alter the relative ranking of priority rule/release policy combinations. In addition, it was found that environmental factors, including average job length, average number of tasks per job and level of tardiness penalty, resulted in greater variations in NPV performance than the institution of a JIT raw materials delivery policy.
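A minimal sketch of how an NPV job measure folds inventory, flow time and tardiness into a single number: cash flows out when material is released and in when the job is delivered, both discounted. The discount rate, penalty form and all names here are assumptions for illustration, not the paper's model:

```python
import math

def job_npv(revenue, delivery_time, material_cost, release_time,
            daily_rate=0.01, tardiness_penalty=0.0, due_date=None):
    """NPV of one job (sketch): material cost is a cash outflow at release,
    revenue a cash inflow at delivery (no early shipment allowed), both
    continuously discounted; an optional linear penalty charges lateness."""
    inflow = revenue * math.exp(-daily_rate * delivery_time)
    outflow = material_cost * math.exp(-daily_rate * release_time)
    penalty = 0.0
    if due_date is not None and delivery_time > due_date:
        penalty = tardiness_penalty * (delivery_time - due_date)
    return inflow - outflow - penalty
```

Note that, holding the delivery time fixed, delaying material release raises NPV, which is the intuition behind JIT materials delivery and delayed order release improving this measure.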

3.
One of the management decisions required to operate a dual-constrained job shop is the labor assignment rule. This study examines the effects of various labor assignment rules on the shop's performance. Eleven different labor assignment rules are simulated. A longest-queue rule and the traditional counterparts of the first-in-system, first-served, shortest operation time, job due date, critical ratio and shortest processing time dispatching rules are used to determine to which work center available workers should be transferred. Also tested are five new labor assignment rules that use an average of the priority values of all jobs in queue at a particular work center to determine whether that work center should receive the available worker.

A SIMSCRIPT simulation program that models nine work centers provided the mechanism by which these rules were tested. Five dispatching rules, the counterparts of the five “traditional counterpart” labor assignment rules mentioned earlier, provided different shop environments. Also, the level of staffing of the work centers was altered to provide additional shop environments. Staffing levels of 50% and 67% were employed.

The results show that none of the eleven labor assignment rules had a significant impact on shop performance. This is an important result because it implies that a manager can make the labor assignment decision based on other criteria such as ease or cost of application of the rules. These results were relatively insensitive to the shop environment, as represented by the dispatching rule and the staffing level.

4.
Material Requirements Planning (MRP) systems have been widely applied in industry to better manage multiproduct, multistage production environments. Although many applications have been quite successful, much is still left to the planner's intuition as to how to assure that master schedules, component lot sizes, and priorities realistically conform to the capacity limits at individual work centers. Capacity issues may indeed be the soft spot in MRP logic.

This paper explores some possible causes of irregular workload patterns when using an MRP system. Better insight on which factors cause temporary bottlenecks could help managers better assess the vulnerability of their plants to this problem. It might also suggest ways of dampening peaks and valleys. The problem setting is a multistage environment; several products are made from various subassemblies and parts. Each shop order is routed through one or more capacitated work centers. An order is delayed either by temporary capacity shortages or the unavailability of components. Of course, the second delay can be caused by capacity problems previously encountered by the shop orders of its components.

Seven experimental factors are tested with a large-scale simulator, and five performance measures are analyzed. The factors are the number of levels in the bill of material, the average load on the shop, the average lot size, the choice of priority rule, demand variability, the use of a gateway department, and the degree of equipment specialization. We have one measure of customer service, two for inventory, and two for workload. The workload measures are unconventional, since our interest is when workload variability occurs and how it affects inventory and customer service.

The simulator has been developed over the course of eight years, and since this study it has been further enhanced to handle many more factors. The simulator was validated recently with real data at two manufacturing plants. It is quite general, in that the bills of material, shop configuration, routings, worker efficiencies, and operating rules can be changed as desired.

An initial screening experiment was performed, in which the average load and the priority rule were not statistically significant even at the .05 level. A full factorial analysis with two replications was then conducted on the five remaining factors. Multivariate analysis of variance (MANOVA) and analysis of variance (ANOVA) statistical tests were performed.

The results confirm that workload variability can have a detrimental impact on customer service and inventory. The following structural changes to the manufacturing system can be beneficial, but tend to be more difficult to achieve. More BOM levels improve customer service, but increase inventory and capacity bottlenecks. Resource flexibility is a powerful tool to reduce workload variability. Capacity slack averaging much over 10% is wasteful, having no benefits for inventory and customer service. In general, revising the routing patterns only, such as creating more dominant paths, will not give big payoffs. The following procedural changes are easier to implement. Master schedules which smooth aggregate resources are an excellent device to reduce workload variability. Even with a smooth MPS, debilitating workload variability can still occur due to the design of the BOM, lot size, and lead-time offset parameters. Selecting a priority rule does not seem to be of overriding importance compared to master scheduling and component lot sizing. These findings must be considered within the context of the range of plant environments encompassed by this study.

5.
This article presents the first attempt to develop and examine overtime policies for a repair shop environment. Overtime is used to augment repair capacity as needed to offset short-term demand fluctuations. If overtime provides sufficient additional repair capacity, it may be possible to reduce investment in spare parts inventory.

The use of overtime in a repair shop requires managerial attention to several issues. In this article, the following five issues are examined: (1) the relationship between overtime policies and spares stocking levels; (2) the timing of overtime (reactive or proactive); (3) the amount of overtime to use; (4) the level in the product structure at which overtime is most beneficial; and (5) the priority scheduling and labor assignment policies used.

Six overtime policies are developed that explore the above issues. These are examined using a simulation model of a hypothetical repair shop. Since the focus of the article is on overtime policies, a single labor assignment policy is used in conjunction with two priority scheduling rules.

The results indicate that reactive overtime policies work well in this environment and overtime is most effective at the lowest level of the product structure, where repair times are relatively shorter. In addition, lowest level parts provide more usage flexibility to handle anticipated future failures.

6.
Input control is a generic procedure for smoothing production workload by delaying work during intervals of heavy load. While input control techniques have several practical benefits, they also have an inherent disadvantage. By restricting the set of jobs available for scheduling, an input control procedure removes some of the scheduling options that would otherwise be available. This paper examines the impact of such a procedure in a simple simulation model.

The simulation model represents a production shop in a simplified way, as a single machine, but the production control system has three distinct parts. The first part assigns due-dates to customer orders, taking into consideration the size of each job and the workload in the shop. The second part is a job releasing rule that implements input control. The third part is a priority dispatching procedure that is aimed at meeting due-dates. By representing this three-part control system the model provides an opportunity to explore the interdependence between input control and other control procedures.

Reinforcing previous research, the simulation experiments confirm that modified due-date priorities perform more effectively than other basic priority rules when performance is measured by average tardiness. Moreover, the experiments indicate that performance under the modified due-date regime is never improved by the use of input control. On the other hand, with dispatching rules that rely on shortest-first or critical ratio priorities, the experiments indicate that input control is sometimes advantageous. The effects of input control on scheduling performance thus appear to be somewhat complicated, and further experiments were designed to explore some of the relationships involved. The principal finding, however, provides a warning that input control can be counterproductive.
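A workload-trigger release rule of the kind studied here can be sketched as follows; the earliest-due-date release order and the single aggregate load limit are illustrative assumptions, not the paper's exact rule:

```python
def input_control_release(pool, shop_load, load_limit):
    """Release jobs from the pre-shop pool (earliest due date first) until
    the shop workload would exceed the limit; the rest wait in the pool.
    pool: list of (due_date, work_content) tuples."""
    released = []
    for due, work in sorted(pool):
        if shop_load + work > load_limit:
            break  # releasing this job would overload the shop
        shop_load += work
        released.append((due, work))
    return released, shop_load
```

The disadvantage the paper identifies is visible in the sketch: jobs held in the pool are invisible to the dispatching rule, so some sequencing options are lost.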

7.
We describe an experimental study of a single-machine scheduling model for a system that assigns due-dates to arriving jobs. The average tightness of the due-dates is assumed to be governed by a policy constraint, which we link analytically to the parameters of the decision rules for due-date assignment. We examine the use of different kinds of information in setting due-dates, and we investigate the relationship between the due-date assignment rule and the priority dispatching rule. On the basis of our results we identify situations in which the dispatching rule is critical to effective scheduling, others in which the due-date assignment rule is critical, and still others in which the combination of the two rules is a critical design issue.

8.
This paper introduces a new dispatching rule to job shop scheduling, extending earlier results to a multi-machine environment. This new rule, which uses modified due dates, is compared to other popular dispatching methods over a range of due date tightnesses at two utilization levels. The results for mean tardiness indicate that the modified operation due date (MOD) rule compares very favorably with other prominent dispatching methods.

The modified due date is a job's original due date or its early finish time, whichever is larger. For an individual operation, it is the operation's original due date or the operation's early finish time, whichever is larger. A comparison of the job-based version with the operation-based version indicated that the operation-based version tended to be more effective at meeting due dates.

The main performance measure was mean job tardiness, although the proportion of tardy jobs was also reported, and the two measures together imply the conditional mean tardiness. The MOD rule was compared to several well-known tardiness-oriented priority rules, such as minimum slack-per-operation (S/OPN), smallest critical ratio (SCR), and COVERT. The MOD rule tended to achieve lower levels of mean tardiness than the other rules, except under conditions where due dates were quite loose. In this situation, very little tardiness occurs for any of the rules. The MOD rule appeared to be more robust than the other rules to changes in the tightness of due dates, and similar results occurred at both high and low utilizations.

9.
The purpose of this study is to investigate due date setting procedures and dispatching decisions in a flow line cell with family setups. In this environment, setups are not required when switching from a job in a given family to a job in the same family. However, switching from a job in one family to a job in another family requires a setup. Family setups in this shop are sequence independent. The dispatching decisions in this shop are threefold: (1) when should the decision to switch from one part family to another be made; (2) once the decision to switch families is made, how should the next part family be chosen (next family decision); and (3) how should the jobs within a family be prioritized (next job decision)? If the decision to switch families can only be made after the current family is exhausted, the rule is called a family exhaustion rule. Otherwise the rule is a truncated rule. The results indicate that the due date setting procedure has a major impact on how dispatching should be performed in the shop. The family exhaustion procedure using the APT next family rule and the SPT next job rule is the best performer for mean flow time. When setup times are long, the SEQ due date rule using the family exhaustion procedure with the FCFS next family and the EDD next job rules performed well for due date criteria. When setup times are short, the EDD/T, Sawicki truncation rule and the family exhaustion rules performed well for due date criteria.
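A sketch of the family exhaustion procedure with SPT as the next-job rule, assuming APT means choosing the family with the smallest average processing time among its queued jobs (the abstract does not define APT, so that reading is an assumption):

```python
from statistics import mean

def family_exhaustion_dispatch(queue, current_family):
    """Family exhaustion sketch: keep working the current family until it
    is empty (SPT within the family, since no setup is needed); once it is
    exhausted, incur one setup and switch to the family with the shortest
    average processing time, again taking its shortest job.
    queue: list of (family, processing_time) tuples."""
    same = [j for j in queue if j[0] == current_family]
    if same:
        return min(same, key=lambda j: j[1])  # SPT, no setup
    families = {}
    for fam, p in queue:
        families.setdefault(fam, []).append(p)
    next_fam = min(families, key=lambda f: mean(families[f]))  # APT
    return (next_fam, min(families[next_fam]))
```

A truncated rule would differ only in allowing the switch test to fire before the current family is empty, for example when a queued job in another family becomes critically late.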

10.
We propose and develop a scheduling system for a very special type of flow shop. This flow shop processes a variety of jobs that are identical from a processing point of view. All jobs have the same routing over the facilities of the shop and require the same amount of processing time at each facility. Individual jobs, though, may differ since they may have different tasks performed upon them at a particular facility. Examples of such shops are flexible machining systems and integrated circuit fabrication processes. In a flexible machining system, all jobs may have the same routing over the facilities, but the actual tasks performed may differ; for instance, a drilling operation may vary in the placement or size of the holes. Similarly, for integrated circuit manufacturing, although all jobs may follow the same routing, the jobs will be differentiated at the photolithographic operations. The photolithographic process establishes patterns upon the silicon wafers where the patterns differ according to the mask that is used.

The flow shop that we consider has another important feature, namely the job routing is such that a job may return one or more times to any facility. We say that when a job returns to a facility it reenters the flow at that facility, and consequently we call the shop a re-entrant flow shop. In integrated circuit manufacturing, a particular integrated circuit will return several times to the photolithographic process in order to place several layers of patterns on the wafer. Similarly, in a flexible machining system, a job may have to return to a particular station several times for additional metal-cutting operations.

These re-entrant flow shops are usually operated and scheduled as general job shops, ignoring the inherent structure of the shop flow. Viewing such shops as job shops means using myopic scheduling rules to sequence jobs at each facility and usually requires large queues of work-in-process inventory in order to maintain high facility utilization, but at the expense of long throughput times.

In this paper we develop a cyclic scheduling method that takes advantage of the flow character of the process. The cycle period is the inverse of the desired production rate (jobs per day). The cyclic schedule is predicated upon the requirement that during each cycle the shop should perform all of the tasks required to complete a job, although possibly on different jobs. In other words, during a cycle period we require each facility to do each task assigned to it exactly once. With this requirement, a cyclic schedule is just the sequencing and timing on each facility of all of the tasks that that facility must perform during each cycle period. This cyclic schedule is to be repeated by each facility each cycle period. The determination of the best cyclic schedule is a very difficult combinatorial optimization problem that we cannot solve optimally for actual operations. Rather, we present a computerized heuristic procedure that seems very effective at producing good schedules. We have found that the throughput time of these schedules is much less than that achievable with myopic sequencing rules as used in a job shop. We are attempting to implement the scheduling system at an integrated circuit fabrication facility.

11.
Because the state of the equity market is latent, several methods have been proposed to identify past and current states of the market and forecast future ones. These methods encompass semi-parametric rule-based methods and parametric Markov switching models. We compare the mean-variance utilities that result when a risk-averse agent uses the predictions of the different methods in an investment decision. Our application of this framework to the S&P 500 shows that rule-based methods are preferable for (in-sample) identification of the state of the market, but Markov switching models for (out-of-sample) forecasting. In-sample, only the mean return of the market index matters, which rule-based methods exactly capture. Because Markov switching models use both the mean and the variance to infer the state, they produce superior forecasts and lead to significantly better out-of-sample performance than rule-based methods. We conclude that the variance is a crucial ingredient for forecasting the market state. Copyright © 2016 John Wiley & Sons, Ltd.
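The evaluation framework rests on standard mean-variance utility; a minimal sketch, in which the risk-aversion coefficient gamma and the exact evaluation procedure are assumptions, not the paper's calibration:

```python
def mv_weight(pred_mean, pred_var, gamma=5.0):
    """Optimal equity weight for a mean-variance investor who believes the
    (excess) market return has the predicted mean and variance."""
    return pred_mean / (gamma * pred_var)

def realized_utility(weight, realized_mean, realized_var, gamma=5.0):
    """Mean-variance utility of holding the weight, evaluated at realized
    moments; comparing this across methods ranks their predictions."""
    return weight * realized_mean - 0.5 * gamma * weight ** 2 * realized_var
```

A method that predicts the variance badly chooses a poor weight and is penalized in realized utility, which is why the variance-aware Markov switching forecasts come out ahead out-of-sample.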

12.
Scheduling identical jobs on uniform parallel machines
We address the problem of scheduling n identical jobs on m uniform parallel machines to optimize scheduling criteria that are nondecreasing in the job completion times. It is well known that this can be formulated as a linear assignment problem, and subsequently solved in O(n^3) time. We give a more concise formulation for minsum criteria, and show that general minmax criteria can be minimized in O(n^2) time. We present faster algorithms, requiring only O(n + m log m) time for minimizing makespan and total completion time, O(n log n) time for minimizing total weighted completion time, maximum lateness, total tardiness and the weighted number of tardy jobs, and O(n log^2 n) time for maximum weighted tardiness. In the case of release dates, we propose an O(n log n) algorithm for minimizing makespan, and an O(mn^(2m+1)) time dynamic programming algorithm for minimizing total completion time.
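Because the jobs are identical and every criterion is nondecreasing in completion times, it suffices to generate the n earliest completion slots across the machines: a machine of speed s finishes its k-th job at k·p/s, and a heap over the machines yields the slots incrementally. This is the standard construction, not necessarily the paper's faster algorithms:

```python
import heapq

def earliest_completions(n, speeds, p=1.0):
    """Identical jobs with processing requirement p on uniform machines:
    return the n smallest achievable completion times.  Assigning the jobs
    to these slots is optimal for any criterion that is nondecreasing in
    the completion times (e.g. makespan = max of the returned list)."""
    heap = [(p / s, s) for s in speeds]  # first slot on each machine
    heapq.heapify(heap)
    completions = []
    for _ in range(n):
        t, s = heapq.heappop(heap)
        completions.append(t)
        heapq.heappush(heap, (t + p / s, s))  # next slot on that machine
    return completions
```

This runs in O(n log m) time; the O(n + m log m) bounds quoted in the abstract come from more refined arguments.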

13.
Work center control rules, defined as a combination of job dispatch rules and short-term work center capacity adjustments, are analyzed using queueing theory. Promising rules are evaluated with a job shop simulation model. Simulations comparing work center control rules to the critical ratio rule for job dispatching indicate that work center control can increase performance to customer due date while simultaneously reducing average work in process inventory. The work center control rules are easily implemented by shops currently using input/output control and daily dispatch lists.

14.
W G Sullivan  E L Blair 《Socio》1979,13(1):35-39
A model is developed for predicting workload requirements for scheduled health care services. The model is then applied to an actual planning problem for a radiology department. The probability distribution of future workload is represented by the convolution of two families of random variables such that a compound Poisson process adequately describes workload requirements. The model developed herein can be applied to a wide assortment of capacity-expansion problems that are characterized by discrete demands (e.g. number of jobs) occurring in a given period of time, where the amount of time needed to complete each job is a continuous random variable.
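The compound Poisson structure is easy to simulate: a Poisson number of jobs arrives in a period, and each job consumes a continuously distributed amount of time. A Monte Carlo sketch, where the exponential job-time distribution is an illustrative assumption (the paper works with the analytic convolution):

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's multiplication method for a Poisson variate (fine for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def daily_workload(jobs_per_day, mean_hours_per_job, rng):
    """One period of compound Poisson workload: N ~ Poisson jobs, each with
    an exponentially distributed service time."""
    n = poisson_draw(jobs_per_day, rng)
    return sum(rng.expovariate(1.0 / mean_hours_per_job) for _ in range(n))
```

The mean workload per period is lambda times the mean job time, which a Monte Carlo average recovers.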

15.
In many plants, the performance of shop floor workers is measured by accounting-based productivity criteria. Such systems encourage workers to maximize their individual performance, often at the expense of total shop performance. One such company, Union Switch & Signal, a manufacturer of railroad equipment, has decided to increase finished goods inventory in an effort to counteract poor due date performance. Management at Union Switch & Signal feels that workers not following priorities contribute significantly to this poor performance. It has been suggested that the controlled release of jobs into the shop, i.e., Order Review/Release (ORR), may provide the operations manager a vehicle for enforcing job priorities when formal dispatching rules are not strictly followed by workers. In this study, two ORR methodologies are studied with regard to their ability to offset the dysfunctional behavior of workers who seek to maximize their own individual productivity. This type of behavior was captured by simulating the phenomenon of 'cherry picking'. Cherry picking occurs when a job is selected for processing based not on its formal priority but on the difference between its standard allowable processing time and its actual processing time. Results suggest that at least one ORR methodology is able to reduce the difference in resulting labor productivity while improving overall shop performance.
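The cherry-picking behavior described above can be sketched as a selection rule; the tuple layout is an assumption for illustration:

```python
def cherry_pick(queue):
    """'Cherry picking' sketch: a productivity-measured worker takes the
    job whose standard allowed time most exceeds its actual processing
    time, maximizing earned standard hours per clock hour, regardless of
    the formal priority.  Jobs are
    (job_id, formal_priority, standard_time, actual_time) tuples."""
    return max(queue, key=lambda j: j[2] - j[3])

def formal_dispatch(queue):
    """The formal rule: take the job with the best (smallest) priority value."""
    return min(queue, key=lambda j: j[1])
```

ORR counteracts the behavior by limiting the queue the worker can pick from: if only the highest-priority jobs are released to the shop, the cherry picker's choice set and the formal dispatch list largely coincide.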

16.
《Statistica Neerlandica》1962,16(2):195-204
In an office a number of employees do the same kind of work. The jobs arrive at random and the holding time is exponential; the queue discipline is first in, first served. The mean queue length is shortened by work done in overtime, according to the rule that on every day on which the total number of jobs exceeds a certain number N, one or more extra hours of work will be done. The distribution of the number of jobs in the office, the mean value of this number and the probability of overtime are given. The model is illustrated with a numerical example.
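A day-based simulation sketch of the overtime rule, simplified from the paper's analytic model (fixed daily completions stand in for the exponential holding times, and all parameter values are illustrative):

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's multiplication method for a Poisson variate."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_office(days, lam, regular_capacity, N, extra_capacity, seed=1):
    """Each day: Poisson(lam) jobs arrive; if more than N jobs are then
    present, overtime adds extra completions that day.  Returns the mean
    end-of-day queue length and the fraction of overtime days."""
    rng = random.Random(seed)
    q, total_q, overtime_days = 0, 0, 0
    for _ in range(days):
        q += poisson_draw(lam, rng)
        capacity = regular_capacity
        if q > N:
            capacity += extra_capacity
            overtime_days += 1
        q = max(0, q - capacity)
        total_q += q
    return total_q / days, overtime_days / days
```

With arrivals matching the regular capacity, the queue drifts upward without the rule and is held near N with it, which is the mean-queue-shortening effect the paper quantifies.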

17.
In this paper, we examine the factors that influence the value of supply disruption information, investigate how information accuracy influences this value, and provide managerial suggestions to practitioners. The study is motivated by the fact that although accurate disruption information creates benefits, fully accurate disruption information may be difficult and costly to obtain. Furthermore, inaccurate disruption information can decrease the financial benefit of prior knowledge and even lead to negative performance. To perform the analysis, we adopt a newsvendor model within a single product setting where the focal firm can source from a supply network and has a given resilience capacity. The results show that information accuracy, specifically information bias and information variance, plays an important role in determining the value of disruption information. This influence varies at different levels of disruption severity and resilience capacity, and our results imply that higher amounts of resilience capacity actually may be detrimental to a firm without accurate information about a disruption's influence. Thus, for companies with a high resilience capacity, obtaining quality information is critical for effectively coping with disruptions.
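Why biased disruption information destroys value can be seen in a stripped-down sketch (deterministic demand, a proportional supply loss, and the prices here are all illustrative assumptions, much simpler than the paper's newsvendor model):

```python
def delivered_profit(order, true_loss, demand, price=10.0, unit_cost=4.0):
    """Only a fraction (1 - true_loss) of the order survives the disruption;
    the firm pays for delivered units and sells up to demand."""
    delivered = (1.0 - true_loss) * order
    return price * min(delivered, demand) - unit_cost * delivered

def order_under_belief(believed_loss, demand):
    """Inflate the order so that, under the believed loss fraction,
    deliveries exactly cover demand."""
    return demand / (1.0 - believed_loss)
```

Underestimating the loss leaves demand unmet; overestimating it pays for units that cannot be sold. Either sign of bias lowers profit relative to accurate information, which is the core of the paper's accuracy result.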

18.
Job shop scheduling usually includes the process of selecting dispatch rules for loading shops with work. Traditionally, dispatch rules have been formed on the basis of processing time, operating time, or queueing order. A job shop scheduling model was developed to include external factors (such as due dates), internal factors (e.g., capacity), as well as influence factors (e.g., job status). Based on the model developed in this report a survey of industrial engineers, shop foremen, and production control supervisors was undertaken to determine what dispatch rules experienced job shop schedulers would select and if the selection process could be influenced by schedule conditions (status) or other organizational factors. Results suggest that schedulers may be influenced by other factors. This article suggests a model for further research with respect to job shop scheduling.

19.
In social science research, hypotheses about group means are commonly tested using analysis of variance. While deemed to be formulated as specifically as possible to test social science theory, they are often defined in general terms. In this article we use two studies to explore the current practice concerning group mean hypotheses. The first study consists of a content analysis of published articles where the reconstructed reality of hypotheses use is explored. The second study is a qualitative interview study with researchers, adding information about daily practice. We argue that, at present, hypotheses are not used to their utmost potential and that progress can be made by using informative hypotheses instead of the current non-informative hypotheses. Informative hypotheses capitalize on knowledge that researchers already possess and enable them to focus in their proceeding projects. The substantive focus of our work is the case of applied psychology.

20.
The growth of the shop stewards movement in South Africa prior to majority rule represented a challenge to the institutionalised managerial prerogative and cemented the position of black trade unions in the workplace, posing a threat to both Apartheid’s cheap labour system and also to the political control of the Apartheid regime. With the advent of majority rule in 1994, shop stewards are now expected to comply with, and co-operate in, the implementation of workplace changes which they would have traditionally opposed. This, plus the move towards Japanese-style ‘lean’ work practices, is creating far-reaching challenges for the ‘New’ South African shop steward movement.
