Similar Literature
Found 20 similar documents, search time 0 ms
1.
A procedure is presented for calculating stochastic costs, which include operator (labor) and inventory costs, associated with dynamic line balancing. Dynamic line balancing, unlike the traditional methods of assembly and production line balancing, assigns operators to one or more operations, where each operation has a predetermined processing time and is defined as a group of identical parallel stations. Operator costs and inventory costs are stochastic because they are functions of the assignment process employed in balancing the line, which may vary throughout the balancing period, and the required flow rate. Earlier studies focused on the calculation of the required number of stations and demonstrated why the initial and final inventories at the different operations are balanced.

The cost minimization method developed in the article can be used to evaluate and compare the assignment of operators to stations for various assignment heuristics. Operator costs and inventory costs are the components of the cost function. The operator costs are based on the operations to which operators are assigned and are calculated for the entire work week regardless of whether an operator is given only a partial assignment which results in idle time. It is assumed that there is no variation in station speeds, no learning curve effect for operators' performance times, and no limit on the number of operators available for assignment. The costs associated with work-in-process inventories are computed on a “value added” basis. There is no charge for finished goods inventory after the last operation or raw material before the first operation.

The conditions which must be examined before using the cost evaluation method are yield, input requirements, operator requirements, scheduling requirements and output requirements. Yield reflects the output of good units at any operation. The input requirement accounts for units discarded or in need of reworking. The operator requirements define the calculation of operator-hours per hour, set the minimum number of operators at an operation, and require that the work is completed. The scheduling requirements ensure that operators are either working or idle at all times, and that no operator is assigned to more than one operation at any time. The calculation of the output reflects the yield, station speed, and work assignments at the last operation on the line.

An application of the cost evaluation method is discussed in the final section of the article. Using a simple heuristic to assign operators, the conditions for yield, inputs, operators, scheduling, and output are satisfied. The costs are then calculated for operators and inventories.

In conclusion, the cost evaluation method for dynamic balancing enables a manager to compare the costs of assigning operators to work stations. Using this method to calculate the operator and inventory costs, a number of different heuristics for assigning operators in dynamic balancing can be evaluated and compared for various configurations of the production line. The least cost solution procedure then can be applied to a real manufacturing situation with similar characteristics.
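A hedged sketch of how the operator-requirement and cost calculations described above might look in code. The operations, processing times, wage, work week, and the one-hour work-in-process buffer are all hypothetical assumptions introduced only to make the arithmetic concrete; this is not the article's procedure.

```python
import math

# Hypothetical operations: processing time per unit (minutes) and value added per unit ($).
operations = [
    {"name": "op1", "proc_min": 2.0, "value_added": 3.0},
    {"name": "op2", "proc_min": 4.5, "value_added": 5.0},
    {"name": "op3", "proc_min": 1.5, "value_added": 2.0},
]
flow_rate = 30          # required good units per hour off the line (assumed)
wage = 18.0             # $ per operator-hour (assumed)
week_hours = 40         # operators are paid for the full week, idle or not
holding_rate = 0.25     # annual carrying charge on work-in-process value (assumed)

total_operator_cost = 0.0
for op in operations:
    # Operator-hours needed per hour of line operation at this operation.
    op_hours_per_hour = flow_rate * op["proc_min"] / 60.0
    operators = math.ceil(op_hours_per_hour)      # minimum whole operators required
    total_operator_cost += operators * wage * week_hours

# Simplified value-added WIP charge: assume a buffer of one hour of flow after each
# operation except the last (no charge for finished goods or raw material).
cum_value, wip_value = 0.0, 0.0
for op in operations[:-1]:
    cum_value += op["value_added"]
    wip_value += flow_rate * cum_value
weekly_wip_cost = wip_value * holding_rate / 52.0

print(f"weekly operator cost ${total_operator_cost:,.0f}, weekly WIP cost ${weekly_wip_cost:,.2f}")
```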

2.
This paper develops a new model for the analysis of stochastic volatility (SV) models. Since volatility is a latent variable in SV models, it is difficult to evaluate the exact likelihood. In this paper, a non-linear filter which yields the exact likelihood of SV models is employed. Solving a series of integrals in this filter by piecewise linear approximations with randomly chosen nodes produces the likelihood, which is maximized to obtain estimates of the SV parameters. A smoothing algorithm for volatility estimation is also constructed. Monte Carlo experiments show that the method performs well with respect to both parameter estimates and volatility estimates. We illustrate our model by analysing daily stock returns on the Tokyo Stock Exchange. Since the method can be applied to more general models, the SV model is extended so that several characteristics of daily stock returns are allowed, and this more general model is also estimated. Copyright © 1999 John Wiley & Sons, Ltd.
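The paper's filter evaluates the likelihood by piecewise linear approximation of a series of integrals; the sketch below substitutes a bootstrap particle filter, a more common stand-in, to approximate the likelihood of the canonical SV model. The parameter values and simulated return series are illustrative assumptions, not the paper's method or data.

```python
import numpy as np
from scipy.stats import norm

def sv_loglik_particle(y, mu, phi, sigma_eta, n_particles=2000, seed=0):
    """Approximate log-likelihood of a basic SV model
       h_t = mu + phi*(h_{t-1} - mu) + eta_t,  y_t = exp(h_t/2)*eps_t
    using a bootstrap particle filter (a stand-in for the paper's non-linear filter)."""
    rng = np.random.default_rng(seed)
    # Initialize particles from the stationary distribution of h_t.
    h = rng.normal(mu, sigma_eta / np.sqrt(1 - phi**2), n_particles)
    loglik = 0.0
    for yt in y:
        # Propagate particles through the state equation.
        h = mu + phi * (h - mu) + rng.normal(0.0, sigma_eta, n_particles)
        # Weight by the measurement density N(y_t; 0, exp(h_t)).
        w = norm.pdf(yt, loc=0.0, scale=np.exp(h / 2))
        loglik += np.log(w.mean())
        # Multinomial resampling.
        h = rng.choice(h, size=n_particles, p=w / w.sum())
    return loglik

# Simulate a short return series and evaluate the likelihood at the true parameters.
rng = np.random.default_rng(1)
mu, phi, sigma_eta, T = -1.0, 0.95, 0.2, 200
h = np.empty(T); h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + rng.normal(0, sigma_eta)
y = np.exp(h / 2) * rng.normal(size=T)
print(sv_loglik_particle(y, mu, phi, sigma_eta))
```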

3.
Many financial assets, such as currencies, commodities, and equity stocks, exhibit both jumps and stochastic volatility, which are especially prominent in the market after the financial crisis. Some strategic decision making problems also involve American-style options. In this paper, we develop a novel, fast and accurate method for pricing American and barrier options in regime switching jump diffusion models. By blending regime switching models and Markov chain approximation techniques in the Fourier domain, we provide a unified approach to price Bermudan, American, and barrier options under general stochastic volatility models with jumps. The models considered include Heston, Hull–White, Stein–Stein, Scott, the 3/2 model, and the recently proposed 4/2 model and the α-Hypergeometric model with general jump amplitude distributions in the return process. Applications include the valuation of discretely monitored contracts as well as continuously monitored contracts common in the foreign exchange markets. Numerical results are provided to demonstrate the accuracy and efficiency of the proposed method.

4.
Considerable controversy surrounds the role of money in the production of goods and services. Previous empirical research has appeared to find that the real money stock affects aggregate output, holding other, more conventional inputs constant. However, the theoretical literature offers no convincing explanation for this empirical finding. One interpretation is that real money balances reduce the extent to which labor and capital are diverted into exchange-related activities instead of being used in production defined in a more narrow sense. To investigate this hypothesis, we estimate a production function augmented with real money balances as an input, using time-series data for the aggregate U.S. economy. A stochastic production frontier is then estimated without real money balances. We use these estimates to establish the presence of technical inefficiency. Finally, we show that the extent of technical inefficiency is negatively correlated with the real money stock. Our results provide a reconciliation between the empirical literature, which finds that real money balances affect output in a production function framework, and the theoretical literature, which suggests that real money balances enhance the technical efficiency of the economy.

5.
We wish to study inter-rater agreement comparing groups of observers who express their ratings on a discrete or ordinal scale. The starting point is that of defining what we mean by “agreement”. Given d observers, let the scores they assign to a given statistical unit be expressed as a d-vector in the real space. We define a deterministic ordering among these vectors, which expresses the degree of the raters’ agreement. The overall scoring of the raters on the sample space will be a d-dimensional random vector. We then define an associated partial ordering among the random vectors of the ratings, illustrate a number of its properties, and look at order-preserving functions (agreement measures). In this paper we also show how to test the hypothesis of greater agreement against the unrestricted hypothesis, and the hypothesis of equal agreement against the hypothesis that an agreement ordering holds. The test is applied to real data on two medical observers rating clinical guidelines.

6.
This paper proposes a new approach to handle nonparametric stochastic frontier (SF) models. It is based on local maximum likelihood techniques. The model is presented as encompassing some anchorage parametric model in a nonparametric way. First, we derive asymptotic properties of the estimator for the general case (local linear approximations). Then the results are tailored to a SF model where the convoluted error term (efficiency plus noise) is the sum of a half normal and a normal random variable. The parametric anchorage model is a linear production function with a homoscedastic error term. The local approximation is linear for both the production function and the parameters of the error terms. The performance of our estimator is then established in finite samples using simulated data sets as well as cross-sectional data on US commercial banks. The methods appear to be robust, numerically stable and particularly useful for investigating a production process and the derived efficiency scores.
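For reference, the half normal plus normal convoluted error named above corresponds to the standard normal/half-normal frontier likelihood. The sketch below writes down and maximizes that likelihood globally on simulated data; it is only the parametric anchorage model, not the paper's local maximum likelihood estimator, and the simulated data and starting values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(params, y, X):
    """Negative log-likelihood of the normal/half-normal stochastic frontier model
       y = X @ beta + v - u,  v ~ N(0, sigma_v^2),  u ~ |N(0, sigma_u^2)|."""
    k = X.shape[1]
    beta = params[:k]
    sigma_u, sigma_v = np.exp(params[k]), np.exp(params[k + 1])
    sigma = np.sqrt(sigma_u**2 + sigma_v**2)
    lam = sigma_u / sigma_v
    eps = y - X @ beta
    ll = (np.log(2) - np.log(sigma)
          + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

# Simulated example data (assumed for illustration).
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
v = rng.normal(0, 0.3, n)
u = np.abs(rng.normal(0, 0.5, n))
y = X @ np.array([1.0, 0.7]) + v - u

res = minimize(neg_loglik, x0=np.zeros(4), args=(y, X), method="BFGS")
print("beta:", res.x[:2], "sigma_u:", np.exp(res.x[2]), "sigma_v:", np.exp(res.x[3]))
```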

7.
The translog profit functional form is widely used to study technical efficiency for banks. Although this functional form is known to be flexible, it is not applicable to those banks incurring economic losses. The recently developed censored stochastic frontier model (CSFM) of Tsay et al. (2013) appears to be superior to existing approaches, since CSFM does not need to transform negative profit into positive profit before taking the natural logarithm. The transformation with respect to the profit variable tends to bias the parameter estimates of the profit frontier and the subsequent profit efficiency measure. We show that the parameter estimates of CSFM have desirable statistical properties. Moreover, empirical results reveal that the mean profit efficiency of CSFM is more robust than that of models using transformed profits across the sub-periods 1991–1998 and 1999–2009.

8.
We aim to calibrate stochastic volatility models from option prices. We develop a Tikhonov regularization approach with an efficient numerical algorithm to recover the risk neutral drift term of the volatility (or variance) process. In contrast to most existing literature, we do not assume that the drift term has any special structure. As such, our algorithm applies to calibration of general stochastic volatility models. An extensive numerical analysis is presented to demonstrate the efficiency of our approach. Interestingly, our empirical study reveals that the risk neutral variance processes recovered from market prices of options on S&P 500 index and EUR/USD exchange rate are indeed linearly mean-reverting.
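The calibration itself is specific to option pricing, but the core idea, Tikhonov regularization of an ill-posed recovery problem, can be sketched generically. The toy forward operator, smoothness penalty, and regularization weight below are assumptions made for illustration, not the paper's algorithm.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min_f ||A f - b||^2 + lam * ||D f||^2, where D penalizes curvature
    (second differences): the generic form of Tikhonov regularization."""
    n = A.shape[1]
    D = np.diff(np.eye(n), n=2, axis=0)   # second-difference smoothness operator
    lhs = A.T @ A + lam * D.T @ D
    rhs = A.T @ b
    return np.linalg.solve(lhs, rhs)

# Toy inverse problem: recover a smooth curve from noisy observations.
rng = np.random.default_rng(0)
n = 100
x = np.linspace(0, 1, n)
truth = np.sin(2 * np.pi * x)
A = np.eye(n)                              # identity "forward operator" for simplicity
b = A @ truth + rng.normal(0, 0.3, n)
f_hat = tikhonov_solve(A, b, lam=50.0)
print("max abs error:", np.max(np.abs(f_hat - truth)))
```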

9.
This paper proposes a new two-step stochastic frontier approach to estimate technical efficiency (TE) scores for firms in different groups adopting distinct technologies. Analogous to Battese et al. (J Prod Anal 21:91–103, 2004), the metafrontier production function allows for calculating comparable TE measures, which can be decomposed into group specific TE measures and technology gap ratios. The proposed approach differs from Battese et al. (J Prod Anal 21:91–103, 2004) and O’Donnell et al. (Empir Econ 34:231–255, 2008) mainly in the second step, where a stochastic frontier analysis model is formulated and applied to obtain the estimates of the metafrontier, instead of relying on programming techniques. The resulting estimators have desirable statistical properties and enable statistical inferences to be drawn. While the within-group variation in firms’ technical efficiencies is frequently assumed to be associated with firm-specific exogenous variables, the between-group variation in technology gaps can be specified as a function of some exogenous variables to take account of group-specific environmental differences. Two empirical applications are illustrated and the results appear to support the use of our model.

10.
A rapidly aging U.S. population is straining the resources available for long term care and increasing the urgency of efficient operations in nursing homes. The scope for productivity improvements can be examined by estimating a stochastic frontier production function. We apply the methods of maximum likelihood and quantile regression to a panel of Texas nursing facilities and infer that the average productivity shortfall due to avoidable technical inefficiency is at least 8 percent and perhaps as large as 20 percent. Non-profit facilities are notably less productive than comparable facilities operated for profit, and the industry has constant returns to scale.

11.
Human dynamics and sociophysics build on statistical models that can shed light on and add to our understanding of social phenomena. We propose a generative model based on a stochastic differential equation that enables us to model the opinion polls leading up to the 2017 and 2019 UK general elections and to make predictions relating to the actual results of the elections. After a brief analysis of the time series of the poll results, we provide empirical evidence that the gamma distribution, which is often used in financial modelling, fits the marginal distribution of this time series. We demonstrate that the proposed poll-based forecasting model may improve upon predictions based solely on polls. The method uses the Euler–Maruyama method to simulate the time series, measuring the prediction error with the mean absolute error and the root mean square error, and as such could be used as part of a toolkit for forecasting elections.
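A minimal illustration of the two ingredients named above: an Euler–Maruyama simulation of a stochastic differential equation and the MAE/RMSE error measures. This is not the authors' generative model; the mean-reverting drift, the parameters (kappa, theta, sigma), and the "observed" series are assumptions made purely for the sketch.

```python
import numpy as np

def euler_maruyama(x0, kappa, theta, sigma, dt, n_steps, rng):
    """Simulate dX = kappa*(theta - X) dt + sigma dW with the Euler-Maruyama scheme."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        x[t + 1] = x[t] + kappa * (theta - x[t]) * dt + sigma * dw
    return x

rng = np.random.default_rng(0)
# Hypothetical "observed" weekly poll shares and one simulated forecast path.
observed = euler_maruyama(x0=0.40, kappa=2.0, theta=0.42, sigma=0.03, dt=1/52, n_steps=52, rng=rng)
simulated = euler_maruyama(x0=0.40, kappa=2.0, theta=0.42, sigma=0.03, dt=1/52, n_steps=52, rng=rng)

mae = np.mean(np.abs(simulated - observed))           # mean absolute error
rmse = np.sqrt(np.mean((simulated - observed) ** 2))  # root mean square error
print(f"MAE = {mae:.4f}, RMSE = {rmse:.4f}")
```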

12.
A dynamic pre-positioning problem is proposed to efficiently respond to victims’ need for relief supplies under uncertain and dynamic demand in humanitarian relief. The problem is formulated as a multi-stage stochastic programming model that considers pre-positioning with the dynamic procurement and return decisions about relief supplies over a time horizon. To validate the advantages of dynamic pre-positioning, three additional pre-positioning strategies are presented: pre-positioning with one-time procurement and without returns, pre-positioning with one-time procurement and returns, and pre-positioning with dynamic procurement and without returns. Using data from real-world disasters in the United States in the Emergency Events Database, we present a numerical analysis to study the applicability of the proposed models. We develop a sample average approximation approach to solving the proposed model in large-scale cases. Our main contribution is that we integrate dynamic procurement and return strategies into pre-positioning to decrease both costs and shortage risks in uncertain and dynamic contexts. The results illustrate that dynamic pre-positioning outperforms the other three strategies in cost savings. They also indicate that a higher return price is particularly helpful for decreasing unmet demand. The proposed models can help relief agencies evaluate and choose the solutions that will have the greatest overall effectiveness in the context of different relief practices.
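The model above is a multi-stage stochastic program; as a much-simplified, single-item, single-stage illustration of the sample average approximation idea it mentions, the sketch below chooses a pre-positioned stock level that minimizes the average of procurement, shortage, and holding costs over sampled demand scenarios. The cost parameters and the demand distribution are assumptions for illustration only.

```python
import numpy as np

def saa_preposition(demand_samples, unit_cost, shortage_pen, holding_cost, grid):
    """Sample average approximation: pick the stock level minimizing the average of
    procurement + shortage + holding costs over the sampled demand scenarios."""
    best_q, best_cost = None, np.inf
    for q in grid:
        short = np.maximum(demand_samples - q, 0.0)
        excess = np.maximum(q - demand_samples, 0.0)
        cost = unit_cost * q + np.mean(shortage_pen * short + holding_cost * excess)
        if cost < best_cost:
            best_q, best_cost = q, cost
    return best_q, best_cost

# Hypothetical demand scenarios for one relief item at one depot.
rng = np.random.default_rng(0)
demand = rng.lognormal(mean=6.0, sigma=0.8, size=5000)
q_star, c_star = saa_preposition(demand, unit_cost=1.0, shortage_pen=10.0,
                                 holding_cost=0.2, grid=np.linspace(0, 3000, 301))
print(f"pre-position {q_star:.0f} units, expected cost {c_star:.1f}")
```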

13.
We study a resource allocation problem in which law enforcement aims to balance intelligence and interdiction decisions to fight against illegal city-level drug trafficking. We propose a Markov Decision Process framework, apply a column generation technique, and develop a heuristic to solve this problem. Our approaches provide insights into how law enforcement should prioritize its actions when there are multiple criminals of different types known to them. We prove that when only one action can be implemented, law enforcement will take action (either target or arrest) on the highest criminal type known to them. Our results demonstrate that: (i) it may be valuable to diversify the action taken on the same criminal type when more than one action can be implemented; (ii) the marginal improvement in terms of the value of the criminals interdicted per unit time by increasing available resources decreases as resource level increases; and (iii) there are losses that arise from not holistically planning the actions of all available resources across distinct operations against drug trafficking networks.

14.
In this paper we present a simple model which gives a solution to a (one period) stochastic cash problem with a fixed cash outlay at the end of the period. We focus on the role of options as insurance contracts, in order to value a constraint on the minimum cash level. It is argued that a cash level adjustment is optimal where the sum of the marginal cost of liquidity and the marginal insurance premium (options value) is zero.

We would like to thank Edwin O. Fischer, Jaap Spronk and three anonymous referees for helpful comments. Excellent computational assistance by Henk Hofmans is hereby acknowledged. Of course, the usual disclaimers apply.

15.
Journal of Econometrics, 1987, 34(3): 373-389
This paper presents a simple version of the theory of M-estimation. It is argued that the theory is immediately applicable to almost all estimation schemes employed by econometricians. It is further argued that the great overlooked benefit of the theory is that it provides almost automatic asymptotic results, e.g., probability limits and asymptotic covariances. Thus one need not be a theoretical econometrician to invent and use specially tailored estimators. To illustrate its use the theory is applied to a variety of theoretical and applied problems. Particular attention is paid to two-stage estimators.
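A minimal sketch of the "almost automatic" asymptotics the abstract refers to, using the simplest M-estimator (ordinary least squares) and the familiar sandwich covariance A^{-1} B A^{-1} / n. The data-generating process below is an assumption for illustration; the paper's two-stage examples are not reproduced here.

```python
import numpy as np

def m_estimator_ols(X, y):
    """OLS viewed as an M-estimator, with sandwich asymptotic covariance
       V = A^{-1} B A^{-1} / n, where A = E[x x'] and B = E[e^2 x x']
    are estimated by sample averages."""
    n = len(y)
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ beta
    A = X.T @ X / n
    B = (X * e[:, None]).T @ (X * e[:, None]) / n
    A_inv = np.linalg.inv(A)
    V = A_inv @ B @ A_inv / n
    return beta, np.sqrt(np.diag(V))      # coefficients and robust standard errors

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5]) + rng.normal(0, 1 + 0.5 * np.abs(X[:, 1]), n)  # heteroskedastic
beta, se = m_estimator_ols(X, y)
print("beta:", beta, "robust SE:", se)
```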

16.
In this paper, we introduce a threshold stochastic volatility model with explanatory variables. The Bayesian method is considered in estimating the parameters of the proposed model via the Markov chain Monte Carlo (MCMC) algorithm. Gibbs sampling and Metropolis–Hastings sampling methods are used for drawing the posterior samples of the parameters and the latent variables. In the simulation study, the accuracy of the MCMC algorithm, the sensitivity of the algorithm to model assumptions, and the robustness of the posterior distribution under different priors are considered. Simulation results indicate that our MCMC algorithm converges fast and that the posterior distribution is robust under different priors and model assumptions. A real data example is analyzed to explain the asymmetric behavior of stock markets.
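As a hedged illustration of the Metropolis–Hastings step mentioned above (not the paper's threshold SV sampler), the sketch below implements a generic random-walk Metropolis–Hastings update for a scalar parameter with an assumed toy posterior.

```python
import numpy as np

def random_walk_mh(log_post, x0, step, n_iter, seed=0):
    """Generic random-walk Metropolis-Hastings sampler for a scalar parameter."""
    rng = np.random.default_rng(seed)
    x = x0
    draws = np.empty(n_iter)
    lp = log_post(x)
    for i in range(n_iter):
        prop = x + rng.normal(0.0, step)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        draws[i] = x
    return draws

# Toy target: posterior of a mean with a N(0, 10^2) prior and N(mu, 1) likelihood.
rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=50)

def log_post(mu):
    return -0.5 * mu**2 / 100.0 - 0.5 * np.sum((data - mu) ** 2)

draws = random_walk_mh(log_post, x0=0.0, step=0.3, n_iter=5000)
print("posterior mean ≈", draws[1000:].mean())
```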

17.
We investigate how closely NBA teams play up to their potential. We find that shooting, rebounding, stealing the ball and blocking shots raise the number of potential wins while turnovers lower it. We also learn that better coaching and defensive prowess raise a team's win efficiency. Copyright © 2006 John Wiley & Sons, Ltd.

18.
This paper proposes a semiparametric smooth-varying coefficient input distance frontier model with multiple outputs and multiple inputs, panel data, and determinants of technical inefficiency for the Indonesian banking industry during the period 2000 to 2015. The technology parameters are unknown functions of a set of environmental factors that shift the input distance frontier non-neutrally. The computationally simple constraint weighted bootstrapping method is employed to impose the regularity constraints on the distance function. As a by-product, total factor productivity (TFP) growth is estimated and decomposed into technical change, scale component, and efficiency change. The distance elasticities, marginal effects of the environmental factors on the distance elasticities, temporal behavior of technical efficiency, and also TFP growth and its components are investigated.

19.
Economic Systems, 2014, 38(1): 115-135
This paper investigates the process of GDP generation in former Soviet Union (FSU) economies to provide an understanding of the impact of technology channels on countries’ efficiency. We apply a stochastic frontier approach to 15 FSU economies over the period 1995–2008 and find that FDI and human capital improve countries’ technical efficiency. Furthermore, we show that these factors also have a positive impact on total factor productivity (TFP), which, in turn, improves real GDP growth. Hence, our results suggest that FSU countries should promote public policies that provide incentives to attract foreign investment and enhance domestic education in order to improve their economic growth. Additionally, our empirical evidence argues against the resource curse hypothesis. We also show, by computing efficiency change and technological change indices at the country level, that FSU economies benefit more from exploiting technological progress than from catching up to the best practice frontier.

20.
Record linkage is the act of bringing together records from two files that are believed to belong to the same unit (e.g., a person or business). It is a low-cost way of increasing the set of variables available for analysis. Errors may arise in the linking process if an error-free unit identifier is not available. Two types of linking errors include an incorrect link (records belonging to two different units are linked) and a missed record (an unlinked record for which a correct link exists). Naively ignoring linkage errors may mean that analysis of the linked file is biased. This paper outlines a “weighting approach” to making correct inference about regression coefficients and population totals in the presence of such linkage errors. This approach is designed for analysts who do not have the expertise or time to use specialist software required by other approaches but who are comfortable using weights in inference. The performance of the estimator is demonstrated in a simulation study.
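The sketch below is a much-simplified stand-in for the weighting idea: each linked record pair is weighted by an assumed probability that the link is correct, and regression coefficients are computed by weighted least squares. The linkage-error rate and the weights are assumptions for illustration, not the paper's estimator.

```python
import numpy as np

def weighted_ls(X, y, w):
    """Weighted least squares: beta = (X' W X)^{-1} X' W y."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Toy linked file: most links correct, a few mismatched (outcome from a different unit).
rng = np.random.default_rng(0)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0, 0.5, n)
wrong = rng.random(n) < 0.1                 # 10% incorrect links (assumed)
y[wrong] = rng.permutation(y[wrong])        # scramble outcomes for bad links
link_prob = np.where(wrong, 0.2, 0.95)      # assumed probability each link is correct

naive = weighted_ls(X, y, np.ones(n))       # ignores linkage error
weighted = weighted_ls(X, y, link_prob)     # downweights doubtful links
print("naive:", naive, "weighted:", weighted)
```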
