ICORES 2017 Abstracts



Area 1 - Methodologies and Technologies

Full Papers
Paper Nr: 12
Title:

Process Optimization for Cutting Steel-Plates

Authors:

Markus Rothe

Abstract: In this paper, we consider the two-stage three-dimensional guillotine cutting stock problem with usable leftover. Three properties distinguish our problem formulation from others. First, we allow the items to be rotated. Second, we consider the case in which leftover material is to be reused in subsequent production cycles. Third, we solve the problem in three dimensions. The optimization problem is formulated as a mixed integer linear program. Several examples verify that our formulation performs well.

Paper Nr: 17
Title:

Optimizing Spare Battery Allocation in an Electric Vehicle Battery Swapping System

Authors:

Michael Dreyfuss and Yahel Giat

Abstract: Electric vehicle battery swapping stations are suggested as an alternative to vehicle owners recharging their batteries themselves. To maximize the network's performance, spare batteries must be optimally allocated among these stations. In this paper, we consider the battery allocation problem where the criterion for optimality is the window fill rate, i.e., the probability that a customer who enters the swapping station will exit it within a certain time window. This time is set as the customer's tolerable wait in the swapping station. In our derivation of the window fill rate formulae, we differ from previous research in that we assume that the swapping time itself is not negligible. We numerically analyse the battery allocation problem for a hypothetical countrywide application in Israel and demonstrate the importance of correctly estimating customers' tolerable wait, the value of reducing battery swapping time, and the unique features of the optimal battery allocation.

Paper Nr: 21
Title:

Optimal Price Reaction Strategies in the Presence of Active and Passive Competitors

Authors:

Rainer Schlosser and Martin Boissier

Abstract: Many markets are characterized by price competition. Typically, the competitors involved adjust their prices in response to one another at different frequencies. We analyze stochastic dynamic pricing models under competition for the sale of durable goods. Given a competitor's pricing strategy, we show how to derive optimal response strategies that take the competitor's anticipated price adjustments into account. We study the resulting price cycles and the associated expected long-term profits. We show that reaction frequencies have a major impact on a strategy's performance. In order not to act predictably, our model also allows for randomized reaction times. Additionally, we study to what extent optimal response strategies of active competitors are affected by additional passive competitors that use constant prices. It turns out that optimized feedback strategies effectively avoid a decline in price. They help to gain profits, especially when aggressive competitors are involved.

Paper Nr: 23
Title:

Using the FMEA Method as a Support for Improving the Social Responsibility of a Company

Authors:

Patrycja Hąbek and Michał Molenda

Abstract: The concept of Corporate Social Responsibility (CSR) is based on companies voluntarily respecting environmental and social needs while making business decisions and at the same time taking into account the expectations of stakeholders. The notion of CSR is well known nowadays and practised by businesses around the world. However, this concept is sometimes interpreted and implemented differently. It is important to realize that the concept of CSR should be considered from the perspective of manufactured products as well as all processes realized in the company. The focus of this paper is on company processes. Socially responsible processes are those that do not adversely affect the company's stakeholders. Therefore, the need arises to assess the risk of potential failures that may occur in company processes, taking into account the subjects of social responsibility. The authors present the possibility of using Failure Mode and Effects Analysis (FMEA) for this purpose. This paper presents an example of using a modified FMEA method which, it is hoped, can on the one hand inspire further development of tools dedicated to CSR implementation at the operational level, and on the other hand help those companies that want to integrate CSR into their processes.

Paper Nr: 43
Title:

Discontinued Products - An Empirical Study of Service Parts Management

Authors:

Luís Miguel D. F. Ferreira

Abstract: The procurement and inventory management of service parts for discontinued products has often been overlooked by companies, resulting in problems such as stockouts, rush orders or obsolete stock. Accordingly, the main aim of this work is to develop a methodology to deal with these issues following product discontinuation. To this end, an empirical study – based on action research principles – was carried out at a producer of household appliances, which is bound by law to provide service parts for its products for a period of 15 years after they have been discontinued. The work was developed in three stages: characterization of the company's situation; definition of a procedure to eliminate obsolete stock; and definition of a procedure to manage active service parts. The resulting methodology and respective procedures are presented, and the results obtained with the implementation are discussed.

Paper Nr: 45
Title:

The Possibilistic Reward Method and a Dynamic Extension for the Multi-armed Bandit Problem: A Numerical Study

Authors:

Miguel Martin

Abstract: Different allocation strategies can be found in the literature to deal with the multi-armed bandit problem, under either a frequentist view or a Bayesian perspective. In this paper, we propose a novel allocation strategy, the possibilistic reward method. First, possibilistic reward distributions are used to model the uncertainty about the arm expected rewards, which are then converted into probability distributions using a pignistic probability transformation. Finally, a simulation experiment is carried out to identify the arm with the highest expected reward, which is then pulled. A parametric probability transformation of the proposed method is then introduced, together with a dynamic optimization, which implies that neither previous knowledge nor a simulation of the arm distributions is required. A numerical study shows that the proposed method outperforms other policies in the literature in five scenarios: Bernoulli distributions with very low success probabilities, Bernoulli distributions with success probabilities close to 0.5, Gaussian rewards, and Poisson and exponential distributions truncated to [0,10].

Paper Nr: 46
Title:

A Dynamic and Collaborative Truck Appointment Management System in Container Terminals

Authors:

Ahmed Azab

Abstract: Given the rising growth in containerized trade, Container Terminals (CTs) are facing truck congestion at the gate and yard. Truck congestion problems not only result in long queues of trucks at the terminal gates and yards but also lead to long truck turn times and environmentally harmful emissions. As a result, many terminals are seeking to set strategies and develop new approaches to reduce congestion in various terminal areas. In this paper, we tackle the truck congestion problem with a new dynamic and collaborative truck appointment system. The collaboration provides shared decision making between the trucking companies and the CT management, while the dynamic features of the proposed system enable both stakeholders to cope with the dynamic nature of the truck scheduling problem. The new Dynamic Collaboration Truck Appointment System (DCTAS) is developed using an integrated simulation-optimization approach that combines an MIP model with a discrete event simulation model. Results show that the proposed DCTAS can reduce terminal congestion and flatten the workload peaks in the terminal.

Paper Nr: 57
Title:

Optimal Policies for Payment of Dividends through a Fixed Barrier at Discrete Time

Authors:

Raúl Montes-de-Oca and Patricia Saavedra

Abstract: In this paper a discrete-time reserve process with a fixed barrier is presented and modelled as a discounted Markov Decision Process. The non-payment of dividends is penalized. The minimization of this penalty results in an optimal control problem. This work focuses on determining the sequence of premiums that minimize penalty costs, and obtaining a rate for the probability of ruin to ensure a sustainable reserve operation.

Paper Nr: 64
Title:

Exact Approach to the Scheduling of F-shaped Tasks with Two and Three Criticality Levels

Authors:

Antonin Novak

Abstract: Communication is an essential part of a fault-tolerant and dependable system. Safety-critical systems are often implemented as time-triggered environments, where the network nodes are synchronized by clocks and follow a static schedule to ensure determinism and easy certification. The reliability of a communication bus can be further improved when message retransmission is permitted to deal with lost messages. However, constructing static schedules for non-preemptive messages that account for retransmissions while preserving the efficient use of resources poses a challenging problem. In this paper, we show that the problem can be modeled using so-called F-shaped tasks. We propose efficient exact algorithms for the non-preemptive message scheduling problem with retransmissions. Furthermore, we show a new complexity result, and we present computational experiments on instances with up to 200 messages.

Paper Nr: 69
Title:

Supporting Efficient Global Moves on Sequences in Constraint-based Local Search Engines

Authors:

Renaud De Landtsheer, Gustavo Ospina and Yoann Guyot

Abstract: Constraint-Based Local Search (CBLS) is an approach for quickly building local search solvers based on a declarative modelling framework for specifying input variables, constraints and the objective function. An underlying engine can efficiently update the optimization model to reflect any change to the input variables, enabling the fast exploration of neighbourhoods performed by local search procedures. This approach suffers from a weakness when moves modify the values of a large set of input variables in a structured fashion. In routing optimization, if one implements the optimization model by means of integer variables, a two-opt move that flips a portion of a route requires modifying the values of many variables. The constraints of the problem are then notified of many individual updates, yet must infer that these updates constitute a flip, wasting a lot of time. This paper presents this multi-variable limitation, discusses approaches to mitigate it, and proposes an efficient implementation of a variable type that represents sequences of integers to avoid it. The proposed implementation offers good complexities for updating and querying the value of sequences of integers, and mechanisms to enable the use of state-of-the-art incremental global constraints.
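
To illustrate the kind of structured move the abstract refers to, here is a minimal Python sketch (an illustration, not the authors' implementation) of a two-opt flip on a route encoded as a list. In a model built from one integer variable per position, every position inside the flipped segment would register as a separate update, which is exactly the overhead a sequence variable type avoids.

```python
def two_opt_flip(route, i, j):
    """Return a copy of route with the segment route[i..j] reversed.

    This is the 'flip' move discussed above: with an integer-variable
    model, each of the j - i + 1 positions in the reversed segment
    changes value, so the constraints see many unrelated updates.
    """
    return route[:i] + route[i:j + 1][::-1] + route[j + 1:]
```

For example, flipping positions 1 through 4 of `[0, 1, 2, 3, 4, 5]` yields `[0, 4, 3, 2, 1, 5]`: four variables change even though, conceptually, a single structured move occurred.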

Paper Nr: 70
Title:

Strategic Capacity Expansion of a Multi-item Process with Technology Mixture under Demand Uncertainty: An Aggregate Robust MILP Approach

Authors:

Jorge Weston and Pablo Escalona

Abstract: This paper analyzes the optimal capacity expansion strategy in terms of machine requirements, labor force, and work shifts when demand over the planning horizon is either deterministic or uncertain. The use of machines of different technologies is considered in the capacity expansion strategy to satisfy the demand in each period. Previous work that considered the work shift as a decision variable presented an intractable nonlinear mixed-integer problem. In this paper we reformulate the problem as a MILP and propose a robust approach for uncertain demand, arriving at a tractable formulation. Computational results show that our deterministic model finds the optimal solution in reasonable computational time, and that the uncertain model obtains good-quality solutions within a maximum optimality gap of $10^{-4}$. For the tested instances, when the robust model is applied with a confidence level of 99\%, the upper limit of the total cost is, on average, 1.5 times the total cost of the deterministic model.

Paper Nr: 86
Title:

Data Clustering Method based on Mixed Similarity Measures

Authors:

Doaa S. Ali

Abstract: Data clustering aims to organize data and concisely summarize it according to cluster prototypes. There are different types of data (e.g., ordinal, nominal, binary, continuous), and each has an appropriate similarity measure. However, when dealing with a mixed dataset (i.e., a dataset that contains at least two types of data), clustering methods typically use a single unified similarity measure. In this study, we propose a novel clustering method for mixed datasets. The proposed mixed similarity measure (MSM) method uses a specific similarity measure for each type of data attribute. When computing distances and updating cluster centers, the MSM method merges the advantages of the k-modes and k-means algorithms. The proposed MSM method is tested using benchmark real-life datasets obtained from the UCI Machine Learning Repository. The MSM method's performance is compared against other similarity methods in both a non-evolutionary clustering setting and an evolutionary clustering setting (using differential evolution). Based on the experimental results, the MSM method proves efficient in dealing with mixed datasets, achieving significant improvement in clustering performance on 80% of the tested datasets in the non-evolutionary setting and on 90% of the tested datasets in the evolutionary setting. The time and space complexity of the proposed method is analyzed, and the comparison with the other methods demonstrates its effectiveness.
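
As a rough illustration of the per-type idea (a sketch under assumed attribute types, not the paper's exact MSM formulation), a distance over mixed records can combine a squared-difference term for numeric attributes, as in k-means, with a 0/1 mismatch term for nominal attributes, as in k-modes:

```python
import math

def mixed_distance(x, y, types):
    """Distance between two mixed-type records.

    types[i] is 'num' for continuous attributes (squared Euclidean
    term, as in k-means) or 'cat' for nominal attributes (0/1
    mismatch, as in k-modes). Equal weighting of the two kinds of
    term is an illustrative assumption.
    """
    d = 0.0
    for xi, yi, t in zip(x, y, types):
        if t == 'num':
            d += (xi - yi) ** 2
        else:
            d += 0.0 if xi == yi else 1.0
    return math.sqrt(d)
```

For instance, comparing `(1.0, 'red')` with `(4.0, 'blue')` under types `('num', 'cat')` accumulates 9 from the numeric attribute and 1 from the nominal mismatch, giving a distance of sqrt(10).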

Short Papers
Paper Nr: 22
Title:

Exact Solution of the Multi-trip Inventory Routing Problem using a Pseudo-polynomial Model

Authors:

Nuno Braga

Abstract: In this paper, we address an inventory routing problem in which a vehicle can perform more than one trip in a working day; this problem is denominated the multi-trip inventory routing problem. A set of customers with demand over the planning horizon must be satisfied by a supplier. The supplier, with a set of vehicles, delivers the demand using pre-calculated valid routes that define the schedule of the delivery of goods over the planning horizon. The problem is solved exactly with a pseudo-polynomial network flow model on a set of instances adapted from the literature. An extensive set of computational experiments on these instances was conducted, varying several parameters of the model. The results show that it is possible to solve instances with up to 50 customers and 15 periods in reasonable computational time.

Paper Nr: 24
Title:

Variance of Departure Process in Two-Node Tandem Queue with Unreliable Servers and Blocking

Authors:

Yang Woo Shin and Dug Hee Moon

Abstract: This paper provides an effective method for evaluating second moments, such as the variance and covariance of the number of departures, in a two-node tandem queue with unreliable servers. The behavior of the system is described by a level-dependent quasi-birth-and-death process, and the output process is modeled by a Markovian arrival process. Algorithms for the transient behavior, the variance and covariance structure of the output process, and the time to the nth departure are developed. We show that the results can be applied to derive approximate formulae for due-date performance and the distribution of the number of outputs in a time interval.

Paper Nr: 31
Title:

A Near Optimal Approach for Symmetric Traveling Salesman Problem in Euclidean Space

Authors:

Wenhong Tian

Abstract: The traveling salesman problem (TSP) is one of the most challenging NP-hard problems. It has wide applications in various disciplines such as physics, biology and computer science. The best known approximation algorithm for the Symmetric TSP (STSP) whose cost matrix satisfies the triangle inequality (called ΔSTSP) is the Christofides algorithm, a 3/2-approximation proposed in 1976. Since then, no proven improvement has been made, and improving upon this bound is a fundamental open question in combinatorial optimization. In this paper, for the first time, we propose the Truncated Generalized Beta distribution (TGB) for the probability distribution of optimal tour lengths in a TSP. We then introduce an iterative TGB approach to obtain provably near-optimal approximations, i.e., a (1+1/2((a+1)/(a+2))^(K-1))-approximation, where K is the number of iterations in TGB and a (>> 1) is the shape parameter of TGB. The result approaches the true optimum as K increases.
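
The approximation guarantee quoted above is straightforward to evaluate. A small Python sketch of the stated ratio, using the abstract's symbols a and K, shows that it starts at the Christofides bound 3/2 for K = 1 and decays toward 1 as K grows:

```python
def tgb_ratio(a, k):
    """Approximation ratio 1 + 1/2 * ((a+1)/(a+2))**(k-1) from the
    abstract, where a is the TGB shape parameter (assumed >> 1) and
    k is the number of TGB iterations.
    """
    return 1.0 + 0.5 * ((a + 1.0) / (a + 2.0)) ** (k - 1)
```

With a = 10, for example, the ratio is exactly 1.5 at K = 1 and strictly decreases with each additional iteration, consistent with the claim that the result approaches the true optimum as K increases.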

Paper Nr: 47
Title:

Structuring Multicriteria Resource Allocation Models - A Framework to Assist Auditing Organizations

Authors:

Vivian Vivas and Mónica Duarte Oliveira

Abstract: Multicriteria resource allocation models have been reported in the literature to support decision makers in selecting options, projects or programmes. These models are particularly important in public contexts, in which resources are limited and there is an increasing demand for transparency and accountability in spending. Despite the potential of these models to promote an effective use of scarce resources, there is little organized and integrated research on how to structure them. In this paper we propose a framework with techniques and tools to support the structuring of multicriteria resource allocation models, so that these models can assist organizations in evaluating and selecting audit and control actions; and we provide illustrative examples of how to apply these techniques and tools in the context of the Comptroller General of the Union, the ministry of the Brazilian federal government responsible for assisting the Brazilian president regarding the treasury, the application of federal public assets, and the government's transparency policies.

Paper Nr: 58
Title:

Designing Charging Infrastructure for a Fleet of Electric Vehicles Operating in Large Urban Areas

Authors:

Michal Koháni, Peter Czimmermann and Michal Váňa

Abstract: Here, we propose a method to design a charging infrastructure for a fleet of electric vehicles operating in large urban areas, such as a fleet of taxicabs, vans used in city logistics, or shared vehicles. The design of a charging infrastructure includes decisions about charging station locations and the number of charging points at each station. It is assumed that the fleet is originally composed of vehicles equipped with an internal combustion engine, but the operator wishes to replace them with fully electric vehicles. To avoid interaction with other electric vehicles, it is required to design a private network of charging stations specifically adapted to the operation of the fleet. It is often possible to use GPS traces of vehicles characterizing the actual travel patterns of individual vehicles. First, to derive a suitable set of candidate locations from GPS data, we propose a practical procedure whose outcomes can be simply controlled by setting a few parameter values. Second, we formulate a mathematical model that combines location and scheduling decisions to ensure that the requirements of the vehicles can be satisfied. We validate the applicability of our approach by applying it to data characterizing a large taxicab fleet operating in the city of Stockholm. Our results indicate that this approach can be used to estimate the minimal requirements to set up the charging infrastructure.

Paper Nr: 75
Title:

Multiobjective Optimization using Genetic Programming: Reducing Selection Pressure by Approximate Dominance

Authors:

Ayman Elkasaby

Abstract: Multi-objective optimization is currently an active area of research, due to the difficulty of obtaining diverse and high-quality solutions quickly. Focusing on either diversity or quality means deterioration of the other, while optimizing both results in impractically long computational times. This gives rise to approximate measures, which relax the constraints and manage to obtain good-enough results in suitable running times. One such measure, epsilon-dominance, relaxes the criteria by which one solution dominates another. Combining this measure with genetic programming, an evolutionary algorithm that is flexible and can solve sophisticated problems, makes it potentially useful for difficult optimization problems. Preliminary results on small problems demonstrate the efficacy of the method and suggest its potential on problems with more objectives.
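
As an illustration of the relaxed dominance criterion (one common additive variant, assuming minimization; the paper's exact definition may differ), a solution a epsilon-dominates b when a is worse than b by at most epsilon in every objective:

```python
def eps_dominates(a, b, eps):
    """Additive epsilon-dominance for minimization.

    a epsilon-dominates b when a_i <= b_i + eps in every objective.
    This is one common variant from the literature, used here for
    illustration only.
    """
    return all(ai <= bi + eps for ai, bi in zip(a, b))
```

With eps = 0 this reduces to weak Pareto dominance; increasing eps makes more pairs of solutions comparable, which thins out the set of mutually non-dominated solutions and thereby reduces selection pressure, as the title suggests.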

Paper Nr: 81
Title:

Optimization and Scheduling of Queueing Systems for Communication Systems: OR Needs and Challenges

Authors:

Attahiru Sule Alfa and B. T. Maharaj

Abstract: The modern communication system is growing at a remarkable rate, with the rapid development of new technologies to meet current and future demands. While the development of devices and technologies to improve and meet expected communication demands keeps growing, the tools for their effective and efficient implementation seem to be lagging behind. On the one hand, there is tremendous development and continued advancement of techniques in Operations Research (OR). On the other hand, it is surprising how the key tools for efficiently optimizing the use of modern technologies lag behind, partly because there is insufficient cooperation between core OR researchers and communication researchers. In this position paper, using one specific example, we identify the need to develop more efficient and effective OR tools that combine queueing and optimization for modern communication systems. OR scientists tend to focus on either the analysis of communication issues using queueing theory or the optimization of resource allocations, but the combination of these tools has not received as much research attention. Our position is that this is one of the major areas in the OR field that would benefit communication systems. We briefly touch on other examples as well.

Paper Nr: 85
Title:

K-modes and Entropy Cluster Centers Initialization Methods

Authors:

Doaa S. Ali

Abstract: Data clustering is an important unsupervised technique in data mining which aims to extract the natural partitions in a dataset without a priori class information. Unfortunately, every clustering model is very sensitive to the set of randomly initialized centers, since these initial clusters directly influence the formation of the final clusters. Thus, determining the initial cluster centers is an important issue in clustering models. Previous work has shown that using multiple clustering validity indices in a multiobjective clustering model (e.g., the MODEK-Modes model) yields more accurate results than using a single validity index. In this study, we enhance the performance of the MODEK-Modes model by introducing two new initialization methods: a K-modes initialization method and an entropy initialization method. Both methods are tested using ten benchmark real-life datasets obtained from the UCI Machine Learning Repository. Experimental results show that the two initialization methods achieve significant improvement in clustering performance compared to other existing initialization methods.

Paper Nr: 94
Title:

Integrated Production and Imperfect Preventive Maintenance Planning - An Effective MILP-based Relax-and-Fix/Fix-and-Optimize Method

Authors:

Phuoc Le Tam and El-Houssaine Aghezzaf

Abstract: This paper investigates the integrated production and imperfect preventive maintenance planning problem. The main objective is to determine an optimal combined production and maintenance strategy that concurrently minimizes production as well as maintenance costs over a given finite planning horizon. To enhance the quality of the solution and improve the computational time, we reconsider the reformulation of the problem proposed in (Aghezzaf et al., 2016) and then solve it with an effective MILP-based Relax-and-Fix/Fix-and-Optimize (RFFO) method. The results of this Relax-and-Fix/Fix-and-Optimize technique were also compared to those obtained by a Dantzig-Wolfe Decomposition (DWD) technique applied to the same reformulation of the problem. This analysis shows that the RFFO technique provides quite good solutions to the test problems with a noticeable improvement in computational time. DWD, on the other hand, exhibits a good improvement in computational time; however, the quality of its solutions still requires further improvement.

Paper Nr: 95
Title:

Optimal Combination Rebate Warranty Policy with Second-hand Products

Authors:

Sriram Bhakthavatchalam and Claver Diallo

Abstract: With the increased awareness of sustainability, many engineered products are being recovered and reconditioned for secondary useful lives. These second-hand products can serve as replacement products to honour warranty pledges. This paper presents two mathematical models to determine the optimal combination rebate warranty policy when refurbished products are used for replacements, from both the manufacturer's and the consumer's points of view. Several numerical experiments are conducted to derive useful managerial insights.

Posters
Paper Nr: 10
Title:

A Heuristic for Optimization of Metaheuristics by Means of Statistical Methods

Authors:

Eduardo B. M. Barbosa and Edson L. F. Senne

Abstract: The fine-tuning of algorithm parameters, especially in metaheuristics, is not always trivial and is often performed by ad hoc methods according to the problem under analysis. Incorrect settings influence both the algorithm's performance and the quality of its solutions. The tuning of metaheuristics requires innovative methodologies that are usually of interest to different research communities. In this context, this paper aims to contribute to the literature by presenting a methodology that combines statistical and artificial intelligence methods for the fine-tuning of metaheuristics. The key idea is a heuristic method, called the Heuristic Oriented Racing Algorithm (HORA), which explores a search space of parameters, looking for candidate configurations near a promising alternative, and consistently finds good settings for different metaheuristics. To confirm the validity of this approach, we present a case study of fine-tuning two distinct metaheuristics, Simulated Annealing (SA) and a Genetic Algorithm (GA), in order to solve a classical task scheduling problem. The results of the proposed approach are compared with results yielded by the same metaheuristics tuned through different strategies, such as brute force and racing. Broadly, the proposed method proved effective in terms of the overall time of the tuning process. Our experimental studies reveal that metaheuristics tuned by means of HORA reach results as good as those obtained with the other, more time-consuming fine-tuning approaches. We therefore conclude that HORA is a promising and powerful tool for the fine-tuning of different metaheuristics, especially when the overall time of the tuning process is considered.

Paper Nr: 13
Title:

Design a Study for Determining Labour Productivity Standard in Canadian Armed Forces Food Services

Authors:

Manchun Fang

Abstract: Canadian Armed Forces (CAF) Food Services recently implemented a standardized menu at all static service locations. Within this new regime, CAF Food Services requires a standard against which labour performance can be measured and which can inform the future rationalization of staffing. To start, a pilot study was conducted in February and March 2015 to collect labour performance data. In this paper, we review the results of the pilot study. Due to issues identified with the pilot study, this paper also proposes a revised design and analytical approach for a follow-on study.

Paper Nr: 15
Title:

A New Procedure to Calculate the Owen Value

Authors:

José Miguel Giménez and María Albina Puente

Abstract: In this paper we focus on games with a coalition structure. In particular, we deal with the Owen value, the coalitional value of the Shapley value, and we provide a computational procedure to calculate this coalitional value in terms of the multilinear extension of the original game.

Paper Nr: 19
Title:

Ability to Separate Situations with a Priori Coalition Structures by Means of Symmetric Solutions

Authors:

José Miguel Giménez

Abstract: We say that two situations described by cooperative games are inseparable by a family of solutions when they obtain the same allocation under every solution concept of this family. Separability by a family of linear solutions reduces to separability from the null game. This is the case for the family of solutions based on marginal contributions weighted by coefficients that depend only on the coalition size: the semivalues. It is known that for games with four or more players, the spaces of games inseparable from the null game contain games different from the zero-game. We prove that for five or more players, when a priori coalition blocks are introduced into the situation described by the game, the dimension of the vector spaces of games inseparable from the null game decreases considerably.

Paper Nr: 27
Title:

Development of an Innovative Methodology Supporting Project Risk Management in the Manufacturing Company of the Automotive Industry

Authors:

Anna Gembalska-Kwiecień

Abstract: The presented article attempts to develop an innovative methodology for supporting the risk management of project implementation. The methodology applies to manufacturing companies in the automotive industry, because it is one of the industries in which projects are comparable to one another. On this basis, it is possible to identify the risks that occurred in the past during the various stages of projects, which can contribute to more effective risk management during current and future projects. The paper presents selected methods of data analysis: a statistical method and a method of graphical data visualization. Recommendations are also given for data collection and processing that enable the development of the proposed methodology. The methodology describes how to collect data on ongoing projects, as well as how to analyse the data to allow their subsequent use. The presented methodology is aimed at optimizing decision making for project implementation in the management sciences.

Paper Nr: 30
Title:

A Proposed Model for the Students’ Perceived Satisfaction in the Computer Architecture Course - An Exploratory Study based on Oliver’s Perceived Satisfaction Model

Authors:

Jorge Fernando Maxnuck Soares

Abstract: This paper presents the preliminary results of an exploratory study on adapting the Perceived Satisfaction model proposed by Oliver (1997) into a model of Student Perceived Satisfaction in the Computer Architecture course, where practical classes were conducted using the Marie CPU simulator as an active learning practice. This model represents student satisfaction in relation to three latent variables: Equity, Performance and Expectations, with data processing performed by applying correlation analysis with the Spearman coefficient. The main goal was to find the correlations regarding student satisfaction with respect to the use of the Marie CPU Simulator as an active learning practice in the Computer Architecture course. The components of these variables, in principle, precede student satisfaction and allow for the identification of those that are most relevant in determining this response.

Paper Nr: 37
Title:

Network of M/M/1 Cyclic Polling Systems

Authors:

Carlos Martínez-Rodríguez

Abstract: This paper presents a Network of Cyclic Polling Systems that consists of two cyclic polling systems with two queues each, when transfer of users from one system to the other is imposed. The system is modelled in discrete time. It is assumed that each system has exponential inter-arrival times and that the servers apply an exhaustive policy. Closed-form expressions are obtained for the first and second moments of the queue lengths at any time.
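For intuition about first moments in exponential queueing systems (a much simpler setting than the coupled polling network treated in the paper, and purely illustrative), the mean waiting time of a single M/M/1 queue can be checked against theory by simulating the Lindley recursion; all names and parameters here are arbitrary:

```python
import random

def mm1_mean_wait(lam, mu, n_customers, seed=42):
    # Lindley recursion: W_{k+1} = max(0, W_k + S_k - A_{k+1}),
    # with exponential inter-arrival (rate lam) and service (rate mu) times.
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        total += w
        s = rng.expovariate(mu)   # service time of current customer
        a = rng.expovariate(lam)  # gap until the next arrival
        w = max(0.0, w + s - a)
    return total / n_customers

# Theory for M/M/1: E[Wq] = lam / (mu * (mu - lam)), i.e. 1.0 for lam=0.5, mu=1.0.
```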

Paper Nr: 38
Title:

Extended Shortest Path Problem - Generalized Dijkstra-Moore and Bellman-Ford Algorithms

Authors:

Maher Helaoui

Abstract: The shortest path problem is one of the classic problems in graph theory. The problem is to provide an algorithm returning an optimum route, with respect to a valuation function, between two nodes of a graph G. The classic shortest path solution is valid if the set of valuations is IR or a subset of IR and the combining operator is the classic sum (+). However, many combinatorial problems can be solved by a shortest path approach yet use a set of valuations that is not a subset of IR and/or a combining operator other than the classic sum (+). For this reason, relations between particular valuation structures, such as semirings and dioids, and graphs and their combinatorial properties have been presented. On the other hand, if the set of valuations is IR or a subset of IR and the combining operator is the classic sum (+), a longest path between two given nodes s and t in a weighted graph G is the same as a shortest path in a graph -G derived from G by changing every weight to its negation. In this paper, in order to give a general model that can be used for any valuation structure, we propose to model both the valuations of a graph G and the combining operator by a valuation structure S. We discuss the equivalence between the longest path and shortest path problems given a valuation structure S, and we present a generalization of the shortest path algorithms according to the properties of the graph G and the valuation structure S.
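The kind of generalization the abstract describes can be sketched by parameterizing Bellman-Ford over an algebraic structure: a "combining" operator for path extension and an "aggregating" operator for comparing paths, each with its neutral element. This is one illustrative reading, not the authors' algorithm; it converges for the usual idempotent structures on graphs without improving cycles:

```python
def bellman_ford_semiring(n, edges, source, agg, combine, zero, one):
    # d[v] aggregates (via 'agg') the 'combine'-composition of edge
    # valuations along paths from source to v.
    d = [zero] * n
    d[source] = one
    for _ in range(n - 1):          # enough relaxation passes for simple paths
        for u, v, w in edges:
            d[v] = agg(d[v], combine(d[u], w))
    return d

INF = float("inf")
edges = [(0, 1, 2), (1, 2, 3), (0, 2, 10)]
# (min, +) structure: classic shortest paths.
shortest = bellman_ford_semiring(3, edges, 0, min, lambda a, b: a + b, INF, 0)
# (max, +) structure on this acyclic graph: longest-path values from the same code.
longest = bellman_ford_semiring(3, edges, 0, max, lambda a, b: a + b, -INF, 0)
```

Swapping the pair of operators, rather than negating weights, is exactly the flexibility a valuation-structure formulation buys.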

Paper Nr: 52
Title:

A Single-source Weber Problem with Continuous Piecewise Fixed Cost

Authors:

Gabriela Iriarte and Pablo Escalona

Abstract: This paper analyzes the location of a distribution center in an urban area using a single-source Weber problem with continuous piecewise fixed cost to find a global optimal location. The fixed cost is characterized by a Kriging interpolation method. To make the fixed cost tractable, we approximate this interpolation with a continuous piecewise function that is convex in each piece, using Delaunay triangulation. We present a decomposition formulation, a decomposition conic formulation and a conic logarithmic disaggregated convex combination model to optimally solve the single-source Weber problem with continuous piecewise fixed cost. Although our continuous approach does not guarantee a globally optimal feasible location, it allows us to delimit a zone in which to intensify the search for feasible points. For the instances we tested, computational results show that our continuous approach found better locations than the discrete approach in 23.25% of the instances and that the decomposition formulation is the best one in terms of CPU time.
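The classical fixed-cost-free Weber objective underlying this model, i.e. minimizing the weighted sum of Euclidean distances to demand points, is solvable by Weiszfeld's iteration. The sketch below is a baseline only; it omits the paper's piecewise fixed-cost term and conic formulations, and all data are made up:

```python
import math

def weiszfeld(points, weights, iters=200):
    # Start at the weighted centroid and iterate the Weiszfeld map.
    W = sum(weights)
    x = sum(w * p[0] for w, p in zip(weights, points)) / W
    y = sum(w * p[1] for w, p in zip(weights, points)) / W
    for _ in range(iters):
        nx = ny = den = 0.0
        for (px, py), w in zip(points, weights):
            d = math.hypot(x - px, y - py)
            if d < 1e-12:           # iterate landed on a demand point
                return (px, py)
            nx += w * px / d
            ny += w * py / d
            den += w / d
        x, y = nx / den, ny / den
    return (x, y)
```

For four unit-weight demand points at the corners of a square, for example, the optimum is the square's center.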

Paper Nr: 65
Title:

Survey of Reverse Logistics Practices - The Case of Portugal

Authors:

Ricardo Simões and Carlos Carvalho

Abstract: Reverse Logistics (RL) has gained substantial relevance in the field of supply chain management, mainly because RL combines environmental, economic and social factors. Although there are studies on RL practices, none of these studies are related to the Portuguese case. Therefore, a survey was conducted in Portugal to fill this gap. The study was applied to a group of Portuguese companies from four industrial sectors, which are highly diversified in the way RL is managed. The results demonstrate that companies consider the management of RL important. The most common practice is the proper disposal of returned products. Companies mainly adopt RL due to the benefits associated with improved customer satisfaction and reduced logistics costs. The biggest barrier to the implementation of RL is the lack of strategic planning by companies for handling returned products. The main factor affecting the performance of RL activities is the lack of quality of the returned products. The study also allowed us to estimate the volume of returned products and the costs of RL.

Paper Nr: 66
Title:

Mathematical Modeling Approaches to Solve the Line Balancing Problem

Authors:

Shady Salama

Abstract: The assembly line balancing problem belongs to the class of NP-hard combinatorial optimisation problems. For several decades, line balancing has attracted the attention of researchers trying to find solutions for real-world applications. Although tremendous work has been done, a gap still exists between research and real problems. This paper provides an analysis of about 50 papers that used mathematical modeling to solve line balancing problems. Thereafter, a framework is proposed for future work.

Paper Nr: 71
Title:

Selecting Genetic Operators to Maximise Preference Satisfaction in a Workforce Scheduling and Routing Problem

Authors:

Haneen Algethami

Abstract: The Workforce Scheduling and Routing Problem (WSRP) is a combinatorial optimisation problem that involves the scheduling and routing of a workforce. Tackling this type of problem often requires handling a considerable number of requirements, including customers' and workers' preferences, while minimising both operational costs and travelling distance. This study seeks to determine effective combinations of genetic operators and heuristics that help to find good solutions for this constrained combinatorial optimisation problem. In particular, it aims to identify the set of operators that best maximises the satisfaction of customers' and workers' preferences. This paper advances the understanding of how to effectively employ different operators within two variants of genetic algorithms to tackle WSRPs. To tackle infeasibility, an initialisation heuristic is used to generate a conflict-free initial plan and a repair heuristic is used to ensure the satisfaction of constraints. Experiments are conducted using three sets of real-world Home Health Care (HHC) planning problem instances.

Paper Nr: 83
Title:

Progressive Hedging and Sample Average Approximation for the Two-stage Stochastic Traveling Salesman Problem

Authors:

Pablo Adasme

Abstract: In this paper, we propose an adapted version of the progressive hedging algorithm (PHA) (Rockafellar and Wets, 1991; Lokketangen and Woodruff, 1996; Watson and Woodruff, 2011) for the two-stage stochastic traveling salesman problem (STSP) introduced in (Adasme et al., 2016). Thus, we compute feasible solutions for small, medium and large size instances of the problem. Additionally, we compare the PHA method with the sample average approximation (SAA) method on all the randomly generated instances and compute statistical lower and upper bounds. For this purpose, we use the compact polynomial formulation extended from (Miller et al., 1960) in (Adasme et al., 2016), as it is the one that allows us to solve large size instances of the problem in short CPU time with CPLEX. Our preliminary numerical results show that the solutions obtained with the PHA are tight when compared to the optimal solutions of small and medium size instances. Moreover, we obtain significantly better feasible solutions than CPLEX for large size instances with up to 100 nodes and 10 scenarios, in significantly lower CPU time. Finally, the bounds obtained with the SAA method provide an average reference interval for the stochastic problem.

Paper Nr: 90
Title:

Chat Based Contact Center Modeling - System Modeling, Parameter Estimation and Missing Data Sampling

Authors:

Per Enqvist and Göran Svensson

Abstract: A Markovian system model for a contact center chat function is considered and partially validated. A hypothesis test on real chat data shows that it is reasonable to model the arrival process as a Poisson process, and the arrival rate can be estimated using maximum likelihood. The service process is more involved, and the estimation of the service rate depends on the number of simultaneous chats handled by an agent. The estimation is made more difficult by the low level of detail in the given data sets. A missing-data approach with Gibbs sampling is used to obtain estimates for the service rates. Finally, we try to capture the general behaviour of the service process and propose using generalized functions to describe it when little information is available about the system at hand.
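The arrival-rate estimation mentioned above has a simple closed form: for a Poisson process with N arrivals observed over a total time T, the maximum-likelihood estimate is λ̂ = N/T. A quick sanity check on synthetic data (illustrative only; not the paper's data or code):

```python
import random

def estimate_arrival_rate(interarrival_times):
    # MLE for a Poisson process: number of arrivals / total observed time.
    return len(interarrival_times) / sum(interarrival_times)

# Synthetic Poisson arrivals: i.i.d. exponential gaps with a known true rate.
rng = random.Random(7)
true_rate = 2.0
gaps = [rng.expovariate(true_rate) for _ in range(20000)]
lam_hat = estimate_arrival_rate(gaps)   # should be close to 2.0
```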

Paper Nr: 92
Title:

Parallel-machine Scheduling with Precedence Constraints and Controllable Job-processing Times

Authors:

Kailiang Xu

Abstract: A parallel-machine scheduling problem with tree-formed precedence constraints is studied, where job-processing times can be compressed by allocating extra resources to the jobs. A tabu-search algorithm is developed that searches for near-optimal solutions based on the structural characteristics of the problem. Experiments show the algorithm is capable of providing satisfactory schedules for medium- and large-sized problems within acceptable computing time.

Paper Nr: 93
Title:

Towards Collaborative Optimisation in a Shared-logistics Environment for Pickup and Delivery Operations

Authors:

Timothy Curtois, Wasakorn Laesanklang and Dario Landa-Silva

Abstract: This paper gives an overview of research work in progress within the COSLE (Collaborative Optimisation in a Shared Logistics Environment) project between the University of Nottingham and Microlise Ltd. This is an R&D project that seeks to develop optimisation technology to enable more efficient collaboration in transportation, particularly real-world operational environments involving pickup and delivery problems. The overall aim of the project is to integrate various optimisation techniques into a framework that facilitates collaboration in a shared freight transport logistics environment with the overall goal of reducing empty mileage.

Area 2 - Applications

Full Papers
Paper Nr: 7
Title:

Search-and-Fetch with 2 Robots on a Disk - Wireless and Face-to-Face Communication Models

Authors:

Konstantinos Georgiou

Abstract: We introduce and study treasure-evacuation with 2 robots, a new problem on distributed searching and fetching related to well studied problems in searching, rendezvous and exploration. The problem is motivated by real-life search-and-rescue operations in areas of a disaster, where unmanned vehicles (robots) search for a victim (treasure) and subsequently bring (fetch) her to safety (the exit). One of the critical components in such operations is the communication protocol between the robots. We provide search algorithms and contrast two standard models, the face-to-face and the wireless model. Our main technical contribution pertains to the face-to-face model. More specifically, we demonstrate how robots can take advantage of some minimal information of the topology (i.e., the disk) in order to allow for information exchange without meeting. The result is a highly efficient distributed treasure-evacuation protocol which is minimally affected by the lack of distant communication.

Paper Nr: 48
Title:

New Scenario-based Stochastic Programming Problem for Long-term Allocation of Renewable Distributed Generations

Authors:

Ikki Tanaka and Hiromitsu Ohmori

Abstract: Large-scale installation of distributed generation (DG) from renewable energy sources (RESs) on distribution networks has been one of the challenging tasks of the last decade. In line with Japan's installation strategy, long-term visions for high penetration of RESs have been announced. However, specific installation plans have not been discussed and determined. In this paper, to support the decision-making of investors, a new scenario-based two-stage stochastic programming problem for the long-term allocation of DGs is proposed. The problem minimizes the total system cost under the power system constraints, taking into account incentives to promote DG installation. At the first stage, before realizations (scenarios) of the random variables are known, DG investment variables are determined. At the second stage, after the scenarios become known, operation and maintenance variables that depend on the scenarios are solved. Furthermore, a new scenario generation procedure with a clustering algorithm is developed. This method generates many scenarios from historical data. The uncertainties of demand, wind power, and photovoltaic (PV) output are represented as scenarios, which are used in the stochastic problem. The proposed model is tested on a 34-bus radial distribution network. The results provide the optimal long-term investment in DGs and substantiate the effectiveness of DGs.

Paper Nr: 51
Title:

On the Impact of using Mixed Integer Programming Techniques on Real-world Offshore Wind Parks

Authors:

Martina Fischetti and David Pisinger

Abstract: Wind power is a leading technology in the transition to sustainable energy. Being a new and increasingly competitive field, it is of major interest to investigate new techniques to solve the design challenges involved. In this paper, we consider optimization of the inter-array cable routing for offshore wind farms, taking power losses into account. Since energy losses in a cable depend on the load (i.e. wind), cable losses are estimated by considering a possibly large number of wind scenarios. In order to deal with different wind scenarios efficiently, we use a precomputing strategy. The resulting optimization problem considers two objectives: minimizing immediate costs (CAPEX) and minimizing costs due to power losses. This makes it possible to perform various what-if analyses to evaluate the impact of different preferences for CAPEX versus reduction of power losses. Thanks to close collaboration with a leading energy company, we have been able to report results on a set of real-world instances, based on six existing wind parks, studying the economic impact of considering power losses in the cable routing design phase.

Paper Nr: 54
Title:

Optimization of Integrated Batch Mixing and Continuous Flow in Glass Tube & Fluorescent Lamp

Authors:

Mina Faragallah and A. A. Elimam

Abstract: This paper deals with production planning of in-series continuous flow and discrete production plants. The work is applied to the glass and fluorescent lamp industry, where raw materials are mixed in batches, charged to a continuous furnace to produce glass tubes, and then assembled into discrete lamps. A non-linear programming model was formulated covering the stages from raw material mixing to the production of fluorescent lamps. Using the model, the amount of each raw material can be obtained at minimum cost, while satisfying the desired properties of the produced glass. The model also provides the optimum lamp production amounts, inventory levels, and the glass pull rate from the furnace, which determines the production amounts of glass tubes. An important factor in the continuous flow process is the amount of broken glass (cullet) added to the furnace, which has an impact on raw material cost and natural gas consumption. In order to solve the model, separable programming methods and linear approximations were used to transform the non-linear terms. Results are validated against actual production data from local glass and lamp factories, and the model proved to be an efficient tool for integrating the whole process at minimum cost.

Paper Nr: 55
Title:

A Dwell Time-based Container Positioning Decision Support System at a Port Terminal

Authors:

Myriam Gaete G. and Marcela C. González-Araya

Abstract: In this article, a methodology and a decision support system for container storage assignment at the yard of a container terminal are proposed. The motivation for the proposed methodology is the case of container terminals where inland flows present high levels of uncertainty and variability. This situation is typical of ports in developing countries, as in Latin America, where, due to a lack of automation, there are many paper-based procedures and little coordination with the hinterland. The proposed methodology is based on a dwell-time-segregated storage policy, considering only import containers (due to the difficulty of determining segregation criteria for this type of container). Dwell times are discretized in order to determine dwell time classes, or segregations, so that containers of the same segregation are assigned to nearby locations at the yard. As a case study, the port of Arica in Chile is considered. A discrete-event simulation model is also proposed to estimate the potential benefits of the proposed methodology. Numerical results for the case study show good performance, with a potential reduction of the rehandles incurred.

Paper Nr: 62
Title:

Two-level Approach for Scheduling Multiproduct Oil Distribution Systems

Authors:

Hossein Mostafaei and Pedro M. Castro

Abstract: A core component of the oil supply chain is the distribution of products. Of the different distribution modes used, transportation by pipeline is one of the safest and most cost-effective ways to connect large supply sources to local distribution centers, where products are loaded into tanker trucks and delivered to customers. This paper presents a two-level optimization approach for detailed scheduling of tree-like pipeline systems with a unique refinery and several distribution centers. A mixed-integer linear programming (MILP) formulation is tackled at each level, with the upper- and lower-level models providing the aggregate and detailed pipeline schedules, respectively. Neither model discretizes time or divides a pipeline segment into packs of equal size. Solutions to two case studies, one using real-life industrial data, show significant reductions in both operational cost and CPU time with regard to previous two-level approaches.

Short Papers
Paper Nr: 34
Title:

Applying Systems Thinking onto Emergency Response Planning - Using Soft Systems Methodology to Structure a National Act in Sweden

Authors:

Christine Große

Abstract: This paper outlines a soft systems methodology approach to modelling a national preparedness planning procedure for the case of an electrical power shortage. Through the model, we provide a new perspective on understanding and enhancing the joint decision-making environment for the actors involved in the planning procedure, as well as its underlying power structure. By a process of abstraction from the current implementation, a core root definition is presented, which provides a generic systems view that can be useful for the study of similar contexts. An action model dedicated to determining meaningful and valid activities is derived, providing insights for the improvement of collaborative emergency response planning in general. The paper thus aims to contribute to the communication and cooperation between actors and stakeholders in the development of appropriate decision processes and decision support in the context of emergency preparedness.

Paper Nr: 53
Title:

Applying Mathematical Programming to Planning Bin Location in Apple Orchards

Authors:

Marcela C. González-Araya

Abstract: In Chile, downtime has been observed during the apple harvest season. This is largely due to the long distances that workers must cover and the lack of bins in the orchards. Currently, administrators do not use methods that enable them to estimate the number of bins required or where they should be located. Taking these observations into consideration, this paper proposes a plan for bin placement in apple orchards by applying a location model with the objective of reducing the distances covered by harvest personnel. With data from an orchard in the Maule Region of Chile, the number of bins to be used is calculated taking into consideration the surface characteristics of the plantation and the apple variety maturity indicators. For the spatial distribution of the bins, the capacitated p-median model was used, because it gave better results in terms of reducing travel distance during the harvest and the ease of implementing the solutions.
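For intuition, the p-median objective used here can be stated in a few lines of brute force; the sketch below is the uncapacitated variant on invented data (the paper's model additionally limits how much demand each bin can absorb):

```python
from itertools import combinations

def p_median(dist, p):
    # dist[i][j]: distance from demand point i to candidate location j.
    # Choose p candidate locations minimizing total nearest-location distance.
    candidates = range(len(dist[0]))
    best_cost, best_sites = float("inf"), None
    for sites in combinations(candidates, p):
        cost = sum(min(row[j] for j in sites) for row in dist)
        if cost < best_cost:
            best_cost, best_sites = cost, sites
    return best_cost, best_sites

dist = [[0, 5], [5, 0], [1, 4]]   # 3 demand points, 2 candidate bin locations
```

Exhaustive enumeration is only viable for tiny instances; realistic orchards require the MILP or heuristic solvers the paper relies on.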

Paper Nr: 68
Title:

A Stackelberg Game Model between Manufacturer and Wholesaler in a Food Supply Chain

Authors:

Javiera A. Bustos, Sebastián H. Olavarria and Víctor M. Albornoz

Abstract: This paper describes an application of the Stackelberg game model to the food supply chain. Specifically, the focus of this work is on the pork industry, considering a production game. The game includes two players, a manufacturer and a wholesaler, who both aim to maximize profit. The role of leader is played by the manufacturer, and that of follower by the wholesaler. Decisions involved in the game are the level of production, the quantity to be sold by the leader, and the level of products purchased by the follower in each time period. This paper presents a case study, and the results show that coordination between these players yields cost savings and an improved service level.

Paper Nr: 89
Title:

Optimizing Supply Chain Management in Coal Power Generation

Authors:

Muhamad Iqbal Felani

Abstract: The Indonesian government launched the Fast Track Program Phase 1 in 2009 to increase the national electrification ratio by installing 35 coal power plants with a total capacity of 10,000 megawatts. However, only 25 coal power plants have been installed to date, spread all over Indonesia. Coal requirements are supplied by 14 domestic coal mining companies. Two factors affect the price of coal: distance and unit price. The distance between a supplier and a coal power plant determines the transportation cost, while the unit price determines the price of procurement. The aim of this research is to minimize the total price of coal by optimizing distance and unit price (USD/ton), allocating the coal requirements and scheduling the delivery. The optimization is simulated using the What'sBest software. In this simulation, 24 power plants were advised to change their existing suppliers, while only one power plant was already well matched. This change could save USD 27 million per year in the total price of coal.

Posters
Paper Nr: 11
Title:

A Surveillance Application of Satellite AIS - Utilizing a Parametric Model for Probability of Detection

Authors:

Cheryl Eisler

Abstract: The question of having sufficient surveillance capability to detect illicit behaviour and inform decision makers in a timely fashion is of the utmost importance to defence, security, law enforcement, and regulatory agencies. Quantifying such capability provides a means of informing asset allocation, as well as establishing the link to risk of mission failure. Individual sensor models can be built and integrated into a larger model that layers sensor performance using a set of metrics that can take into account area coverage, coverage times, revisit rates, detection probabilities, and error rates. This paper describes an implementation of a parametric model for Satellite Automated Identification System (S-AIS) sensor performance. Utilizing a real data feed, the model was able to determine the percentage of uncorrupted S-AIS messages and the probability of detection of at least one correct S-AIS message received during an observation interval. It is important to note that the model implementation did not actively calculate the effect of message overlap based on satellite altitude and footprint width, or reductions in collisions due to signal decollision algorithms.

Paper Nr: 16
Title:

Variable Neighbourhood Search Solving Sub-problems of a Lagrangian Flexible Scheduling Problem

Authors:

Alexander Hämmerle and Georg Weichhart

Abstract: New technologies allow the production of goods to be geographically distributed across multiple job shops. When optimising schedules of production jobs in such networks, transportation times between job shops and machines cannot be neglected and must be taken into account. We have researched a mathematical formulation and implementation for flexible job shop scheduling problems, minimising total weighted tardiness and considering transportation times between machines. Based on a time-indexed problem formulation, we apply Lagrangian relaxation, and the scheduling problem is decomposed into independent job-level sub-problems. This results in multiple single-job problems to be solved. For this problem, we describe a variable neighbourhood search (VNS) algorithm that efficiently solves a single flexible job sub-problem with many timeslots. The Lagrangian dual problem is solved with a surrogate subgradient search method aggregating the partial solutions. The performance of surrogate subgradient search with VNS is compared with a combination of dynamic programming for the sub-problems and a standard subgradient search for the overall problem. The algorithms are benchmarked with published problem instances for flexible job shop scheduling. Based on these instances, we present novel problem instances for flexible job shop scheduling with transportation times between machines, and lower and upper bounds on total weighted tardiness are calculated for these instances.

Paper Nr: 41
Title:

Application of the Six Sigma Method for Improving Maintenance Processes – Case Study

Authors:

Michał Zasadzień

Abstract: The article presents an attempt to implement the DMAIC method used in the Six Sigma concept for the improvement of production processes connected with maintenance. Thanks to the tools included therein (process map, FMEA, SIPOC chart), we were able to define: the problem, i.e. which types of breakdowns cause the most machine stoppage; the precise structure of the failure removal process and its needs, owners, resources, and client-supplier relationships in particular sub-processes; and the source causes of overly long stoppages. Learning the process and the causes of malfunctions allowed us to develop improvement procedures aimed at minimising fault removal times. The procedures developed have been implemented in the company alongside a control plan, which will ensure supervision and their efficient functioning in the future.

Paper Nr: 50
Title:

On a Traveling Salesman based Bilevel Programming Problem

Authors:

Pablo Adasme and Rafael Andrade

Abstract: In this paper, we consider a linear bilevel programming problem where both the leader and the follower maximize their profits subject to budget constraints. Additionally, we impose a Hamiltonian cycle topology constraint in the leader problem. Models of this type can be motivated by telecommunication companies dealing with traffic network flows from one server to another within a ring topology framework. We transform the bilevel programming problem into an equivalent single-level optimization problem that we further linearize in order to derive mixed integer linear programming (MILP) formulations. This is achieved by replacing the follower problem with the equivalent Karush-Kuhn-Tucker conditions and using a linearization approach to deal with the complementarity constraints. The topology constraint is handled by means of two compact formulations and an exponential one from the classic traveling salesman problem. Thus, we compute optimal solutions and upper bounds with linear programs. One of the compact models allows instances with up to 250 nodes to be solved to optimality. Finally, we propose an iterative procedure that computes optimal solutions with remarkably less computational effort than the compact models.
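The complementarity linearization mentioned in the abstract is commonly done with big-M constraints and binary variables; the generic sketch below is illustrative and the symbols M and z are not necessarily the paper's notation:

```latex
\lambda_i \,(b_i - a_i^{\top} x) = 0
\quad\Longrightarrow\quad
\lambda_i \le M z_i, \qquad
b_i - a_i^{\top} x \le M (1 - z_i), \qquad
z_i \in \{0, 1\},
```

with M a sufficiently large constant: z_i = 1 forces the i-th follower constraint to be active, while z_i = 0 forces its multiplier λ_i to zero, so every feasible point satisfies the original complementarity condition.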

Paper Nr: 56
Title:

Supporting Harvest Planning Decisions in the Tomato Industry

Authors:

Eduardo A. Alarcón Gerbier

Abstract: Tomato is a raw material that deteriorates easily once harvested and loaded onto trucks, losing juice and flesh. Therefore, reducing trucks' waiting times in the receiving area of a processing plant can reduce tomato waste. In this article, we develop a model that aims to keep a continuous flow of fresh tomato to a paste processing plant and to decrease trucks' waiting times in the plant's receiving area. The model is applied to a real case of a tomato paste company. The solutions obtained present a better allocation of the harvest shifts, allowing more uniform truck arrivals to the plant during the day. Therefore, trucks' waiting times are reduced, decreasing raw material deterioration.

Paper Nr: 59
Title:

A Fuzzy Chance-constraint Programming Model for a Home Health Care Routing Problem with Fuzzy Demand

Authors:

Yong Shi

Abstract: Home Health Care (HHC) companies are widespread in European countries and aim to serve patients at home to help them recover from illness and injury in a personal environment. Since transportation costs constitute one of the largest forms of expenditure in the HHC industry, it is of great significance to research the optimization of HHC logistics. This paper considers the Home Health Care Routing Problem with Fuzzy Demand, which comes from the logistics practice of home health care companies. A fuzzy chance-constraint programming model is proposed based on fuzzy credibility theory, and a hybrid genetic algorithm and a stochastic simulation method are integrated to solve the proposed model. First, the uncertain constraints are reduced to deterministic ones; experimental results for the benchmark test problem show the good efficiency of the proposed algorithm. Then the proposed hybrid algorithm is applied to solve the fuzzy model, and the influence of the parameters on the objective function is discussed. This research will help HHC companies make appropriate decisions when arranging their vehicle routes.

Paper Nr: 60
Title:

Assessment of Relative Technical Efficiency of Small Mental Health Areas in Bizkaia (Basque Country, Spain)

Authors:

Nerea Almeda, Carlos García-Alonso and José Alberto Salinas-Pérez

Abstract: Mental disorders cause an enormous burden to society. Considering the current economic context, an efficient use of scarce inputs, with an appropriate outcome production, is crucial. This situation defines a classical Relative Technical Efficiency (RTE) problem. A well-known methodology to assess RTE is Data Envelopment Analysis (DEA), although it presents some limitations. These may be overcome through a hybrid strategy that integrates Monte-Carlo simulation and artificial intelligence. This study aims to (1) design a Decision Support System for the assessment of the RTE of Small Mental Health Areas based on DEA; and (2) analyse 19 mental health areas of the Bizkaian healthcare system (Spain) to classify them and identify potential management improvements. The results show higher global RTE in the output orientation than in the input orientation. This suggests that a decision strategy based on improving input management, within the ranges of the expert-driven model of community healthcare, could be appropriate. A future research line will focus on the validation process through the analysis of micromanagement interventions and their potential impacts on the real system.

Paper Nr: 61
Title:

Measuring the Efficiency of the Food Industry in Central and East European Countries by using the Data Envelopment Analysis Approach

Authors:

Zrinka Lukač and Margareta Gardijan

Abstract: The food industry plays an important role in the economy of many countries. It is the leading manufacturing industry in the EU in terms of turnover, value added and employment. However, it has been facing a decrease in competitiveness lately. In this paper we study the competitiveness of very large companies from the food industry sector in central and east European (CEE) countries by measuring their efficiency with the Data Envelopment Analysis (DEA) approach. The efficiency analysis is conducted using the BCC model, with certain financial ratios as its inputs and outputs. The study includes more than 200 very large companies from 13 CEE countries over the period from 2005 to 2013. The results show that although some countries were more efficient than others during the entire research period, no patterns in the efficiency of the food industry subsectors could be recognised. On the other hand, the DEA approach enabled recognising sources of inefficiency at the national level.
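For reference, the input-oriented BCC envelopment program used in such studies has the standard textbook form below (x inputs, y outputs, unit o under evaluation; the notation is generic, not necessarily the paper's):

```latex
\min_{\theta,\,\lambda} \ \theta
\quad \text{s.t.} \quad
\sum_{j} \lambda_j x_{ij} \le \theta \, x_{io} \ \ \forall i, \qquad
\sum_{j} \lambda_j y_{rj} \ge y_{ro} \ \ \forall r, \qquad
\sum_{j} \lambda_j = 1, \quad \lambda_j \ge 0 .
```

The convexity constraint Σ_j λ_j = 1 is what distinguishes BCC (variable returns to scale) from the CCR model (constant returns to scale).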

Paper Nr: 87
Title:

Comparison of Theoretical and Simulation Analysis of Electricity Market for Integrative Evaluation of Renewable Energy Policy

Authors:

Masaaki Suzuki

Abstract: Governments have introduced various policies for promoting renewable energy technologies. In particular, feed-in tariff (FIT) and renewable portfolio standard (RPS) have been introduced in various countries. In this work, multi-agent simulations of electricity markets with FIT/RPS have been conducted for integrative analysis and rational design of renewable energy policies. We analyze the effects of the FIT price and RPS level on social welfare. By comparing the results obtained from the simulation and the equilibrium analysis, we have examined the policies from both bottom-up and top-down viewpoints comprehensively.
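As a toy illustration of the kind of market mechanism the abstract analyses (not the authors' simulation), the sketch below clears a single-period uniform-price market in which renewable generators bid their marginal cost minus a FIT premium; all capacities, costs and the FIT value are invented:

```python
from dataclasses import dataclass

@dataclass
class Generator:
    capacity: float      # MW
    cost: float          # marginal cost, $/MWh
    renewable: bool

def clear_market(gens, demand, fit=0.0):
    """Merit-order dispatch; renewables bid cost - fit.
    Returns (clearing price, dispatched MW per generator)."""
    bids = sorted(gens, key=lambda g: g.cost - (fit if g.renewable else 0.0))
    dispatch, remaining, price = {}, demand, 0.0
    for g in bids:
        q = min(g.capacity, remaining)
        if q > 0:
            price = g.cost - (fit if g.renewable else 0.0)  # marginal bid sets price
        dispatch[id(g)] = q
        remaining -= q
    return price, dispatch

gens = [Generator(50, 80, True),    # wind: high cost, but subsidised
        Generator(100, 30, False),  # coal
        Generator(60, 60, False)]   # gas
p0, _ = clear_market(gens, demand=140, fit=0.0)
p1, _ = clear_market(gens, demand=140, fit=30.0)
```

With the FIT in place, the subsidised wind unit displaces gas in the merit order and lowers the clearing price, which is the kind of welfare effect the paper examines from both the simulation and the equilibrium sides.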

Doctoral Consortium

DCORES 2017

Short Papers
Paper Nr: 1
Title:

The Route Network Development Problem based on QSI Models

Authors:

Assia Kamal Idrissi

Abstract: The growth of air passenger demand has forced airlines to improve their quality of service. Airlines have to choose flight schedules by considering demand, passengers' preferences and competitors. The problem of allocating a new flight involves the route network development problem: determining a set of origin-destination (O-D) pairs to serve and then choosing flight schedules with respect to the Quality of Service Index (QSI) model. In this PhD project, we work with a software tool developed by the company Milanamos that helps airline managers make decisions about destinations to serve. As a starting point, we define the flight radius problem related to this software. It is a sub-problem of the route network development problem and aims to optimize the visualization of the pertinent network by showing only the airports that are interesting with respect to the QSI model. In this paper, we present the problem of allocating a new flight and formulate the flight radius problem as one of finding a maximal sub-graph. Our objective is to locate in the network which routes represent business opportunities and are attractive with respect to competition, so that they can be visualized. We construct the graph from the Milanamos database using the time-independent approach and store it in Neo4j, a graph database. We describe the process of generating and storing the graph in Neo4j and conclude by outlining the expected outcome.
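The sub-graph filtering idea behind the flight radius problem can be sketched in a few lines: keep only routes whose QSI score passes a threshold, then show the airports still reachable from a hub. The edge scores, threshold and airport codes below are invented for illustration and are not the project's actual QSI data or formulation:

```python
from collections import deque

def flight_radius(edges, hub, qsi_min):
    """edges: {(origin, dest): qsi score}. Return the airports reachable
    from hub using only routes with QSI >= qsi_min (edges treated as
    bidirectional for visualisation purposes)."""
    adj = {}
    for (u, v), q in edges.items():
        if q >= qsi_min:            # drop unattractive routes
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
    seen, queue = {hub}, deque([hub])
    while queue:                    # BFS over the pruned network
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

edges = {("CDG", "JFK"): 0.9, ("CDG", "NCE"): 0.2, ("JFK", "LAX"): 0.7}
visible = flight_radius(edges, "CDG", 0.5)
```

In practice the pruning and traversal would run as queries against the Neo4j graph rather than in application memory, but the filtering logic is the same.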

Paper Nr: 2
Title:

Lagrangian Relaxation of the Multi-Level Capacitated Lot Sizing Problem with Consideration of Lead Time

Authors:

Hanaa Razki and Ahmed Moussa

Abstract: Tactical planning consists in developing production plans that determine the quantities of products manufactured per period so as to best meet customer demand at lower cost. This issue has been widely discussed along two lines: multi-level and single-level planning. The multi-level concept reflects the manufacturing structure well. We therefore propose in this work a new mathematical model of finite-capacity lot sizing (the Multi-Level Capacitated Lot Sizing Problem) based on a Lagrangian relaxation optimization approach. Comparisons of this new model with the traditional one demonstrate the efficiency of the new approach in both simulated cases and real situations. The generated production plans achieve 68%-98% optimality compared to classical models.
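The Lagrangian relaxation idea can be illustrated on a deliberately tiny capacitated lot-sizing instance: relax the shared capacity constraint with multipliers, solve each item's now-independent uncapacitated subproblem, and update the multipliers with a subgradient step. Everything below (the data, costs and the brute-force subproblem solver) is an illustrative sketch, not the authors' multi-level model:

```python
from itertools import product

T = 3
demand = {0: [10, 20, 15], 1: [5, 25, 10]}   # item -> demand per period
setup, hold = 50.0, 1.0                       # setup and holding costs
cap = [40.0, 40.0, 40.0]                      # shared capacity per period

def item_plan(d, lam):
    """Uncapacitated single-item subproblem under multipliers lam:
    enumerate setup patterns; each period's demand is served from the
    latest open setup at or before it."""
    best_cost, best_x = float("inf"), None
    for pattern in product([0, 1], repeat=T):
        x, ok = [0.0] * T, True
        for t in range(T):
            s = max((k for k in range(t + 1) if pattern[k]), default=None)
            if s is None:           # demand before the first setup
                ok = False
                break
            x[s] += d[t]
        if not ok:
            continue
        inv = sum(hold * d[t] * (t - max(k for k in range(t + 1) if pattern[k]))
                  for t in range(T))
        cost = setup * sum(pattern) + inv + sum(lam[t] * x[t] for t in range(T))
        if cost < best_cost:
            best_cost, best_x = cost, x
    return best_cost, best_x

lam, best_lb = [0.0] * T, float("-inf")
for it in range(50):                           # subgradient iterations
    total, load = 0.0, [0.0] * T
    for i in demand:
        c, x = item_plan(demand[i], lam)
        total += c
        load = [load[t] + x[t] for t in range(T)]
    lb = total - sum(lam[t] * cap[t] for t in range(T))  # valid lower bound
    best_lb = max(best_lb, lb)
    step = 1.0 / (it + 1)                      # diminishing step size
    lam = [max(0.0, lam[t] + step * (load[t] - cap[t])) for t in range(T)]
```

The best Lagrangian bound found this way can then be compared against a feasible plan to quantify the optimality gap, which is how statements like "68%-98% optimality" are typically certified.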

Open Communications

OCommICORES 2017

Full Papers
Paper Nr: 3
Title:

Modeling a Materials Feeding System in a Hospital Operating Room

Authors:

Veronique Limère and Lawrence Bonczar

Abstract: In an industrial context, different line feeding policies are used to supply assembly lines with the required parts. Over the past decade, the assembly line feeding problem has become a growing field of research. In this study, we investigate how existing research from a manufacturing context can be used to model a material delivery system in a hospital, more specifically in an operating room environment. A mathematical decision model is proposed to decide how materials should be provided. Results for a case study show a cost reduction of 15% with respect to current practice.
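The core of such a decision model is an assignment of each material to a feeding policy at minimum total cost under resource constraints. The sketch below is a minimal greedy illustration with invented materials, policies, costs and a hypothetical kitting-capacity limit, not the paper's actual formulation (which would typically be solved as an integer program):

```python
# material -> {feeding policy: cost per period}; all values invented
costs = {
    "sutures":   {"line_stocking": 12.0, "kitting": 9.0},
    "implants":  {"line_stocking": 30.0, "kitting": 18.0},
    "dressings": {"line_stocking":  8.0, "kitting": 7.5},
}
kitting_cap = 2   # hypothetical limit on items the kitting area can handle

# rank materials by the savings kitting offers, take the best up to capacity
by_savings = sorted(costs,
                    key=lambda m: costs[m]["line_stocking"] - costs[m]["kitting"],
                    reverse=True)
assign = {}
for m in by_savings:
    if kitting_cap > 0 and costs[m]["kitting"] < costs[m]["line_stocking"]:
        assign[m] = "kitting"
        kitting_cap -= 1
    else:
        assign[m] = "line_stocking"

total = sum(costs[m][assign[m]] for m in costs)
```

Because each material's cost here is independent of the others, the greedy choice is optimal for this toy instance; coupling constraints (shared logistics staff, storage space) are what push the real problem into integer programming.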

Paper Nr: 4
Title:

Application of Lean Methodology to Improve Sustainability in the Construction and Demolition Waste Industry

Authors:

Marta Castilho Gomes

Abstract: The construction industry is one of the largest consumers of natural resources and one of the main generators of construction and demolition waste (CDW). The incorporation of recycled aggregates in construction is therefore highly relevant to achieving sustainability in the construction industry. However, recycling of CDW is only feasible if aggregates are competitive with new construction materials. This work focuses on a Portuguese company that manages the CDW generated by its demolition works and aims to increase the efficiency of the production process for recycled aggregates. The main objective is to propose the implementation of improvement actions based on lean management following the DMAIC (Define - Measure - Analyze - Improve - Control) methodology, an organized, sequential problem-solving method that uses tools from the six sigma management philosophy. The study of recycling operations at the company led to identifying lean wastes, understanding their root causes and proposing improvement strategies and the respective means of sustaining them. The proposed improvement measures were incorporated into the future-state value stream map, showing increased production capacity, reduced lead time and, consequently, enhanced aggregate quality, lower production costs and a strengthened competitive position for the company.