ICORES 2015 Abstracts


Area 1 - Methodologies and Technologies

Full Papers
Paper Nr: 21
Title:

Scheduling Problem in Call Centers with Uncertain Arrival Rates Forecasts - A Distributionally Robust Approach

Authors:

Mathilde Excoffier, Céline Gicquel, Oualid Jouini and Abdel Lisser

Abstract: We focus on the staffing and shift-scheduling problem in call centers. We consider that the call arrival rates are subject to uncertainty and follow unknown continuous probability distributions. We assume that we only know the first and second moments of the distribution. We propose to model this stochastic optimization problem as a distributionally robust program with joint chance constraints. We consider a dynamic sharing out of the risk throughout the entire scheduling horizon. We propose a deterministic equivalent of the problem and solve linear approximations of the program to provide upper and lower bounds on the optimal solution. We apply our approach to a real-life instance and report numerical results.
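
The moment-only uncertainty model described above has a classical closed-form building block: for a single chance constraint, the one-sided Chebyshev (Cantelli) inequality gives the capacity level that is feasible for every distribution sharing the known mean and variance. The sketch below shows only this standard single-constraint intuition, with hypothetical numbers; it is not the authors' joint-chance-constrained program:

```python
import math

def robust_staffing_level(mu, sigma, epsilon):
    """Smallest capacity b with P(demand <= b) >= 1 - epsilon for *every*
    distribution with mean mu and standard deviation sigma, by the
    Cantelli bound: b = mu + sigma * sqrt((1 - epsilon) / epsilon)."""
    return mu + sigma * math.sqrt((1 - epsilon) / epsilon)

# Hypothetical half-hour period: forecast mean 100 calls, std 10, risk 5%.
b = robust_staffing_level(100.0, 10.0, 0.05)  # about 143.6
```

Note how much more conservative this is than a normal-distribution quantile; that is the price of requiring feasibility under all distributions with the given moments.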

Paper Nr: 22
Title:

Schedule Two-machine Flow-shop with Controllable Processing Times Using Tabu-search

Authors:

Kailiang Xu and Gang Zheng

Abstract: A two-machine flow-shop scheduling problem with controllable processing times, modeled by a non-linear convex resource consumption function, is considered in this paper. The objective is to minimize the resource consumption needed to keep the makespan from exceeding a given deadline. The problem is proved to be strongly NP-hard. An optimal resource allocation algorithm is designed to calculate the optimal processing times of the jobs, while the optimal or near-optimal processing sequence of the jobs is determined by a heuristic or a tabu-search algorithm.

Paper Nr: 44
Title:

A Decomposition Method for Frequency Assignment in Multibeam Satellite Systems

Authors:

Jean-Thomas Camino, Christian Artigues, Laurent Houssin and Stéphane Mourgues

Abstract: To comply with the continually growing demand for multimedia content and higher throughputs, the telecommunication industry has to keep improving the use of the bandwidth resources, leading to the well-known Frequency Assignment Problems (FAP). In this article, we present a new extension of these problems to the case of satellite systems that use a multibeam coverage. With the models we propose, we make sure that for each frequency plan produced there exists a corresponding satellite payload architecture that is cost-efficient and decently complex. Two approaches are presented and compared: a global constraint program that handles all the constraints simultaneously, and a decomposition method that involves both constraint programming and integer linear programming. For the latter approach, we show that the two identified subproblems can respectively be modeled as a multiprocessor scheduling problem and a path-covering problem, and this analogy is used to prove that they both belong to the category of NP-hard problems. We also show that, for the most common class of interference graphs in multibeam satellite systems, the maximal cliques can all be enumerated in polynomial time and their number is relatively low; it is therefore perfectly acceptable to rely on them in the scheduling model that we derived. Our experiments on realistic scenarios show that the decomposition method proposed can indeed provide a solution of the problem when the global CP model does not.

Paper Nr: 52
Title:

Selection-based Approach to Cooperative Interval Games

Authors:

Jan Bok and Milan Hladík

Abstract: Cooperative interval games are a generalized model of cooperative games in which the worth of every coalition corresponds to a closed interval representing the possible outcomes of its cooperation. Selections are all possible outcomes of the interval game with no additional uncertainty. We introduce new selection-based classes of interval games and prove their characterizations and relations to existing classes based on the weakly better operator. We show new results regarding the core and imputations. We then introduce the notions of strong imputation and strong core. We also examine the problem of equality of two different versions of the core, which is the main stability solution concept of cooperative games.

Paper Nr: 53
Title:

Re-aggregation Approach to Large Location Problems

Authors:

Matej Cebecauer and Lubos Buzna

Abstract: The majority of location problems are known to be NP-hard. Aggregation is a valuable tool that makes it possible to adjust the size of a problem and thus transform it into a problem that is computable in reasonable time. An inevitable consequence is the loss of optimality due to aggregation error. The aggregation error might be significant when solving spatially large problems with a huge number of customers. Typically, an aggregation method is used only once, in the initial phase of the solving process. Here, we propose a new re-aggregation approach. First, our method aggregates the original problem to a size that can be solved by the chosen optimization algorithm; then, in each iteration, the aggregated problem is adapted to achieve a more precise location of facilities for the original problem. We use simple heuristics to minimize the sources of aggregation error, known in the literature as sources A, B, C and D. To investigate the optimality error, we use problems that can be solved exactly. To test the efficiency of the proposed method, we compute large location problems with up to 80,000 customers.

Paper Nr: 63
Title:

PARSim, a Simulation Model of the Royal Canadian Air Force (RCAF) Pilot Occupation - An Assessment of the Pilot Occupation Sustainability under High Student Production and Reduced Flying Rates

Authors:

René Séguin

Abstract: Training personnel to operate extremely complex and expensive equipment requires a large monetary investment and takes lengthy periods of time. It goes without saying that careful planning is of the utmost importance. Such is the case for military pilots. The Pilot Production, Absorption, Retention Simulation (PARSim) model that was developed by the Centre for Operational Research and Analysis (CORA) simulates the flow of pilots from recruitment, through training, onto the operational training units and into the various operational fleets, accounting for attrition, instructor pilots and staff requirements. A key feature of the model is that it simulates the acquisition of experience, dynamically adjusting the experience acquisition rate in response to the existing experience level on the squadrons and the availability of resources. The model is a tool that can be used to perform what-if analysis quickly and easily. This paper describes the simulation model and reports on a study where the impact of high production combined with reduced budgets is analysed.

Paper Nr: 70
Title:

A Posteriori Approach of Real-time Ridesharing Problem with Intermediate Locations

Authors:

Kamel Aissat and Ammar Oulmara

Abstract: Ridesharing is a travel mode that provides several benefits, such as the reduction of travel cost, the reduction of traffic congestion and the provision of additional travel options. In the classical ridesharing approach, the driver makes a detour to the rider’s origin in order to pick up the rider, then drives him to his destination and finally continues to his own destination. This implies that the driver endures the whole detour and may not accept such a matching if the detour is too long. However, the matching could be accepted if the rider meets the driver at an intermediate location. In this paper, we present a general ridesharing approach in which a driver and a rider agree to meet at an intermediate pick-up location and to separate at an intermediate drop-off location, not necessarily their origin and destination locations, respectively. Thus, for a given rider, we propose exact and heuristic methods to determine the best driver and the best meeting locations that minimize the total travel cost. Finally, we perform a numerical study using a real road network and a real dataset. Our experimental analysis shows that our heuristics provide efficient performance within short CPU times and improve participants’ cost-savings and matching rate compared to classical ridesharing.

Paper Nr: 79
Title:

A Sampling Method to Chance-constrained Semidefinite Optimization

Authors:

Chuan Xu, Jianqiang Cheng and Abdel Lisser

Abstract: Semidefinite programming has been widely studied for the last two decades. Semidefinite programs are linear programs with a semidefinite constraint, generally studied with deterministic data. In this paper, we deal with stochastic semidefinite programs with chance constraints, which are a generalization of chance-constrained linear programs. Based on existing theoretical results, we develop a new sampling method to solve these chance-constrained semidefinite problems. Numerical experiments are conducted to compare our results with the state-of-the-art and to show the strength of the sampling method.

Paper Nr: 90
Title:

Partner Selection in Formation of Virtual Enterprises using Fuzzy Logic

Authors:

Shahrzad Nikghadam, Bahram Lotfi Sadigh, Ahmet Murat Ozbayoglu, Hakki Ozgur Unver and Sadik Engin Kilic

Abstract: A Virtual Enterprise (VE) is a temporary cooperation among independent enterprises that builds up a dynamic collaboration framework for manufacturing. One of the most important steps in constructing a successful VE is to select the most qualified partners to take a role in the project. This paper addresses the ranking of candidate companies with respect to four evaluation criteria: proposed unit price, delivery time, quality and the enterprises’ past performance. A fuzzy logic method is proposed to deal with these four conflicting criteria, considered as the input variables of the model. As each criterion differs in nature from the others, various membership functions are used to fuzzify the input values. The next step is to construct logical fuzzy rules that combine the inputs to conclude the output. Mamdani’s approach is adopted to evaluate the output in this Fuzzy Inference System. The result of the model is the partnership chance of each candidate to participate in the VE. The partner with the highest partnership chance wins the negotiation. Applying this model to an illustrative partner selection problem in a virtual enterprise and comparing it with the fuzzy-TOPSIS approach verifies the feasibility of the proposed approach, and the computational results are satisfactory.
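
As a concrete illustration of the fuzzification step mentioned above, the sketch below evaluates triangular membership functions for one criterion. The linguistic terms and breakpoints are hypothetical; the actual model uses different membership functions per criterion plus Mamdani rule evaluation, which is not reproduced here:

```python
def triangular(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical terms for a unit price normalized to [0, 1]; boundary terms
# would normally be trapezoidal ("shoulder") functions in a full model.
price_terms = {
    "low": (0.0, 0.25, 0.5),
    "medium": (0.25, 0.5, 0.75),
    "high": (0.5, 0.75, 1.0),
}

def fuzzify(x, terms):
    """Degree of membership of crisp value x in each linguistic term."""
    return {name: triangular(x, *abc) for name, abc in terms.items()}
```

A normalized price of 0.4, for example, belongs to "low" with degree 0.4 and to "medium" with degree 0.6; these degrees are what the fuzzy rule base then combines.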

Short Papers
Paper Nr: 13
Title:

Veto Values in Group Decision Making within MAUT - Aggregating Complete Rankings Derived from Dominance Intensity Measures

Authors:

Antonio Jiménez-Martín, Pilar Sabio and Alfonso Mateos

Abstract: We consider a group decision-making problem within multi-attribute utility theory, in which the relative importance of decision makers (DMs) is known and their preferences are represented by means of an additive function. We allow DMs to provide veto values for the attribute under consideration and build veto and adjust functions that are incorporated into the additive model. Veto functions check whether alternative performances are within the respective veto intervals, making the overall utility of the alternative equal to 0, whereas adjust functions reduce the utility of the alternative performance to match the preferences of other DMs. Dominance measuring methods are used to account for imprecise information in the decision-making scenario and to derive a ranking of alternatives for each DM. Specifically, ordinal information about the relative importance of criteria is provided by each DM. Finally, an extension of Kemeny’s method is used to aggregate the alternative rankings from the DMs accounting for their relative importance.

Paper Nr: 27
Title:

Existence of Fractional Solutions in NTU DEA Game

Authors:

Jing Fu and Shigeo Muto

Abstract: This paper deals with the problem of fairly allocating a certain amount of benefit among individuals or organizations with multiple criteria for their performance evaluation. It extends our earlier work on game-theoretic approaches to weight assignments in data envelopment analysis (DEA) problems. One of the main conclusions of our previous work is that the core of the TU (transferable utility) DEA game is non-empty if and only if the game is inessential, that is, the evaluation indices are identical for all the criteria for each player. This condition is equivalent to a trivial single-criterion setting, which motivates us to turn to the NTU (non-transferable utility) situation and check the existence of fractional solutions. In this study, we contribute by showing the existence of the α-core, and by giving two sufficient conditions under which the β-core exists and is identical to the α-core in the NTU DEA game. Our discussion is also interesting in light of a direction to improve the robustness of the β-core existence condition by relaxing the inessential condition in the TU DEA game.

Paper Nr: 36
Title:

Causes of Delays in Portuguese Construction Projects

Authors:

Amílcar Arantes and Luís Miguel D. F. Ferreira

Abstract: Delays are frequent in engineering and construction projects all over the world. Delays can have severe impacts, such as time and cost overruns and conflicts between stakeholders, and can also lead to total abandonment of the contract. This paper presents the findings of a survey conducted to identify the most important causes of delays in the Portuguese construction industry. Among the respondents were representatives from all stakeholders involved: contractors, consultants and developers. The Relative Importance Index was adopted to classify the relative importance of the 46 identified causes of the delays. The results show that the main causes of delay were slow decision making by developers, change orders, unrealistic contract duration and specifications in contracts, financial constraints on the contractor and the type of bidding and award of contract process. Additionally, factor analysis revealed eight high-level causes that aggregate 30 of the original causes. These findings are expected to contribute to expanding the knowledge in the scientific community of construction management, in particular in the field of supply chain management, and helping the Portuguese industry in the reduction and prevention of delays in construction projects.
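
The Relative Importance Index used above is a simple survey statistic, RII = ΣW / (A·N), where the W are the ratings given by respondents, A is the highest possible rating and N is the number of respondents. A minimal sketch with invented ratings (the cause names echo the abstract, but the numbers are hypothetical):

```python
def relative_importance_index(ratings, max_rating=5):
    """RII = sum(W) / (A * N); values closer to 1 mean a more important cause."""
    return sum(ratings) / (max_rating * len(ratings))

# Hypothetical 1-5 survey ratings for two of the delay causes.
causes = {
    "slow decision making by developers": [5, 4, 5, 3, 4],
    "change orders": [4, 3, 4, 3, 3],
}
ranked = sorted(causes, key=lambda c: relative_importance_index(causes[c]),
                reverse=True)
```

Sorting all 46 causes by this index is what produces the ranking reported in the paper.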

Paper Nr: 42
Title:

A Comparative Study of Network-based Approaches for Routing in Healthcare Wireless Body Area Networks

Authors:

Pablo Adasme, Rafael Andrade, Janny Leung and Abdel Lisser

Abstract: In this paper, we propose a minmax robust formulation for routing in healthcare wireless body area networks (WBAN). The proposed model minimizes the highest power consumption of each bio-sensor node placed in the body of a patient subject to flow rate and network topology constraints. We consider three topologies in the problem: a spanning tree, a star, and a ring topology as well. In particular, we use an equivalent polynomial formulation of the spanning tree polytope (Yannakakis, 1991) to avoid having an exponential number of cycle elimination constraints in the model. For the ring topology approach, we use constraints from the well known mixed integer linear programming (MILP) formulation of the traveling salesman problem (Pataki, 2003). Thus, we compute optimal solutions and lower bounds directly using the MILP and linear programming (LP) relaxations. Finally, we propose a Kruskal-based (Cormen et al., 2001) variable neighborhood search metaheuristic to improve the solutions obtained with the star topology approach. Our preliminary numerical results indicate that the tree approach is more convenient as it allows saving significantly more power while the ring approach is the most expensive one. They also indicate that the difference between the optimal objective function values for the tree and star formulations is not very large and that VNS can improve significantly the solutions obtained with the star configuration, although, at a higher computational cost.

Paper Nr: 46
Title:

Resource Allocation and Scheduling based on Emergent behaviours in Multi-Agent Scenarios

Authors:

Hanno Hildmann and Miquel Martin

Abstract: We present our observations regarding the emergent behaviour in a population of agents following a recently presented nature inspired resource allocation / scheduling method. By having agents distribute tasks among themselves based on their local view of the problem, we successfully balance the work across agents, while remaining flexible to adapt to dynamic scenarios where tasks are added, removed or modified. We explain the approach and within it the mechanisms that give rise to the emergent behaviour; we discuss the model used for the simulations, outline the algorithm and provide results illustrating the performance of the method.

Paper Nr: 66
Title:

Managing a Retail Perishable Product with RFID-enabled Inventory Visibility

Authors:

Özgen Karaer

Abstract: Radio Frequency Identification Technology (RFID) helps reduce or completely eliminate inventory record inaccuracy at retail stores, and thus facilitates inventory visibility in the system. In this paper, we investigate the value of RFID-enabled inventory visibility for a retailer that sells perishable/seasonal merchandise, such as fashion apparel. Because the retailer already commits to a total quantity for the item before the season begins, we cannot anticipate an increase in sales or a reduction in holding cost. Here, we formulate the value as a change in total revenue generated from the product. We characterize how this impact changes with respect to various factors in the model and identify its components.

Paper Nr: 67
Title:

Data Mining Analysis of Turn around Time Variation in a Semiconductor Manufacturing Line

Authors:

Il-Gyo Chong, Chenbo Zhu and Yanfeng Wu

Abstract: Variation reduction of Turn Around Time (TAT) in a manufacturing line is one of the important issues for line optimization. In a manufacturing line with many sequential process steps, such as semiconductor fabrication, it is not easy to find the root causes of TAT variation because (1) there might be a big time gap (more than 30 days) between cause and effect, and (2) there are many machines (or tools) related to a process. The purpose of this paper is to propose a data-mining-based method to identify the root causes of TAT variation. We also aim to validate the performance of the proposed method through a simulation study.

Paper Nr: 75
Title:

An Optimization Approach for Job-shop with Financial Constraints - In the Context of Supply Chain Scheduling Considering Payment Delay between Members

Authors:

S. Kemmoe, D. Lamy and N. Tchernev

Abstract: In this paper, the Job-Shop Scheduling Problem (JSSP) is addressed as a support for supply chain scheduling considering financial exchanges between different supply chain partners. The financial exchanges are considered as the cash flow exchanges between different upstream and downstream partners. Moreover, several suppliers are involved in operations. The problem under study can be viewed as an extension of the classical JSSP. Machines are considered as business or logistic units with their own treasury, and financial exchanges happen between the different partners. The goal is then to propose the best schedule, considering the initial cash flows in the treasuries as given data. The problem is formulated as an integer linear programming model, and a powerful GRASPxELS algorithm is then developed to solve large-scale instances of the problem. The experiments on instances with financial constraints show that the method addresses the problem efficiently, in less than a second on average.

Paper Nr: 76
Title:

Croujaction - A Novel Approach to Text-based Job Name Clustering with Correlation Analysis

Authors:

Zunhe Liu, Yan Liu, Xiao Yang, Shengyu Guo and Buyang Cao

Abstract: Job name clustering is becoming increasingly important for anomaly detection and the analysis of cloud performance. Unlike free text, a job name is a sequence of characters or tokens, which makes clustering based on job name text challenging. In this paper, we analyse the correlation between columns and use the user-job correlation to improve the classic TF-IDF algorithm. We optimize word tokenization and feature set generation, and use hierarchical clustering methods in our experiments. Finally, we develop a module, evaluate the performance of the optimized algorithm, and deliver it as a product to a prestigious e-commerce company.
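
The TF-IDF weighting the abstract builds on can be sketched in a few lines for token-structured job names. This is the textbook formulation only, without the user-job correlation term the authors add, and the example job names are invented:

```python
import math
import re
from collections import Counter

def tokenize(job_name):
    """Split a job name into lowercase tokens on non-alphanumeric boundaries."""
    return [t.lower() for t in re.split(r"[^A-Za-z0-9]+", job_name) if t]

def tfidf(job_names):
    """One sparse tf-idf vector (dict: token -> weight) per job name."""
    token_lists = [tokenize(name) for name in job_names]
    n = len(job_names)
    # Document frequency: in how many job names each token appears.
    df = Counter(t for tokens in token_lists for t in set(tokens))
    vectors = []
    for tokens in token_lists:
        tf = Counter(tokens)
        vectors.append({t: (c / len(tokens)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors
```

Note that with plain tf-idf, unique run counters such as "01" receive the highest weight even though they carry no cluster information; that weakness is one motivation for refining the weighting, as the paper does. Pairwise cosine distances between these vectors then feed the hierarchical clustering step.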

Paper Nr: 80
Title:

Chemo-inspired Genetic Algorithm for Optimizing the Piecewise Aggregate Approximation

Authors:

Muhammad Marwan Muhammad Fuad

Abstract: In a previous work we presented DEWPAA: an improved version of the piecewise aggregate approximation representation method of time series. DEWPAA uses differential evolution to set weights to different segments of the time series according to their information content. In this paper we use a hybrid of bacterial foraging and genetic algorithm (CGA) to set the weights of the different segments in our improved piecewise aggregate approximation. Our experiments show that the new hybrid gives better results in time series classification.
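
For reference, the unweighted piecewise aggregate approximation underlying DEWPAA reduces a time series to segment means; the authors' contribution, how the per-segment weights are chosen, is not reproduced in this minimal sketch:

```python
def paa(series, n_segments):
    """Piecewise Aggregate Approximation: represent a time series by the
    mean of each of n_segments (approximately) equal-width segments."""
    n = len(series)
    means = []
    for i in range(n_segments):
        start = i * n // n_segments
        end = (i + 1) * n // n_segments
        segment = series[start:end]
        means.append(sum(segment) / len(segment))
    return means

# An 8-point series compressed to 4 segment means.
print(paa([1, 2, 3, 4, 5, 6, 7, 8], 4))  # [1.5, 3.5, 5.5, 7.5]
```

A weighting scheme, as in DEWPAA, would multiply each segment's contribution to the distance measure by a learned weight instead of treating all segments equally.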

Paper Nr: 92
Title:

A Linear Physical Programming Approach to Power Flow and Energy Storage Optimization in Smart Grids Models

Authors:

Gabriella Dellino, Carlo Meloni and Saverio Mascolo

Abstract: The Optimal Power Flow problem (OPF) plays a crucial role in the successful energy management of modern smart grids. The diffusion of renewable energy sources poses new challenges to the power grid in which integrated energy storage combined with green generation solutions can help to address challenges associated with both power supply and demand variability. This work refers to a smart grid context and proposes a time indexed OPF model considering storage dynamics, adopting a preference-based optimization method with chance constraints to provide a suitable service level.

Posters
Paper Nr: 10
Title:

Multi-objective Evolutionary Method for Dynamic Vehicle Routing and Scheduling Problem with Customers' Satisfaction Level

Authors:

Seyed Farid Ghannadpour and Mohsen Hooshfar

Abstract: This paper studies the multi-objective dynamic vehicle routing and scheduling problem using an evolutionary method. In this model, not all data and information required for the routing process are known before planning; they are revealed dynamically during the routing process and the execution of the routes. Moreover, the model tries to characterize customers’ satisfaction and service-level issues by applying the concept of fuzzy time windows. The proposed model is a multi-objective problem in which the overall travelling distance, the fleet size and the waiting time imposed on vehicles are minimized, and the customers’ satisfaction, i.e. the service level of the supplier to customers, is maximized. To solve this multi-objective model, an evolutionary algorithm is developed to obtain the Pareto solutions, and its performance is analyzed on various test problems from the literature. The computational experiments on these data sets show the efficiency and effectiveness of the proposed approach.

Paper Nr: 43
Title:

Game Theoretic Models for Competition in Public Transit Services

Authors:

Eddie Y. S. Chan and Janny M. Y. Leung

Abstract: As metropolitan areas grow, the need to travel by the populace has increased the burden on the transport systems, leading to increased traffic congestion and environmental concerns. In this paper, we discuss some game-theoretic models that can be used to investigate the competitive situation when several service providers offer public transit services. The competition among the operators can be modelled by a class of games called potential games, and we discuss mathematical programmes that can be used to find the Nash equilibria for these games. By examining the equilibrium solutions, we can investigate the relative merits and tradeoffs for different structures of the transit networks, and the interplay between the services offered and the overall ridership of the system.

Paper Nr: 48
Title:

A Cooperative Game Approach to a Production Planning Problem

Authors:

D. G. Ramírez-Ríos, D. C. Landinez, P. A. Consuegra, J. L. García and L. Quintana

Abstract: This paper deals with a production planning problem formulated as a Mixed Integer Linear Programming (MILP) model that has a competition component, given that the manufacturers are willing to produce as many products as they can in order to fulfil the market’s needs. This corresponds to a typical game-theoretic problem applied to the productive sector, where a global optimization problem involves production planning in order to maximize the utilities of the different firms that manufacture the same type of products and compete in the market. This problem has been approached as a cooperative game, which involves a possible cooperation scheme among the manufacturers. The general problem was approached by Owen (1995) as the “production game” and the core was considered. This paper identifies the cooperative game-theoretic model for the production planning MILP optimization problem, and the Shapley value was chosen as the solution approach. The results obtained indicate the importance of cooperation among competitors. Moreover, this leads to economic strategies for small manufacturing companies that wish to survive in a competitive environment.
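
The Shapley value chosen as the solution approach averages each firm's marginal contribution over all orders in which the coalition could form. A brute-force sketch on a hypothetical two-firm game (the coalition values are invented for illustration, not taken from the paper):

```python
from itertools import permutations

def shapley(players, v):
    """Exact Shapley value: average the marginal contribution of each
    player over all orderings (factorial cost, fine for small games)."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: total / len(orders) for p, total in phi.items()}

# Hypothetical game: firms earn 1 and 2 alone, 4 together.
worth = {frozenset(): 0.0, frozenset({1}): 1.0,
         frozenset({2}): 2.0, frozenset({1, 2}): 4.0}
phi = shapley([1, 2], worth.__getitem__)  # {1: 1.5, 2: 2.5}
```

Here the extra unit of profit created by cooperating is split equally on top of each firm's stand-alone worth, which is exactly the kind of allocation argument the paper uses to justify cooperation among competitors.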

Paper Nr: 49
Title:

Robust Flight Scheduling in a Colombian Domestic Airline

Authors:

Alejandro Cadavid Tobón, Pablo Andrés Maya Duque and Juan Guillermo Villegas

Abstract: Air traffic has grown rapidly, increasing competition among airlines, generating complex planning problems and raising customers’ demands. An airline’s profitability is highly influenced by its planners’ ability to face these challenges and build efficient schedules. In this paper, we develop a bi-objective optimization model for the timetabling problem of a Colombian domestic airline. Preliminary results show an increase of 12% with respect to the current profitability of the airline.

Paper Nr: 51
Title:

Enumeration of Pareto Optima for a Bicriteria Evacuation Scheduling Problem

Authors:

Kaouthar Deghdak, Vincent T'kindt and Jean-Louis Bouquard

Abstract: In this paper, we consider a large-scale evacuation problem after a major disaster. We model the evacuation of a region from a set of collection points to a set of capacitated shelters with the help of buses, thus leading to scheduling the evacuation operations by buses (Bus Evacuation Problem, BEP). The goal is twofold: first, minimizing the total evacuation time needed to bring the residents out of the endangered region, and secondly, minimizing the total exposure to danger. The resulting problem is a bicriteria problem. We propose a time-indexed formulation, as well as several approaches for finding both upper and lower bounds for the BEP, used within a branch-and-bound algorithm. In computational experiments, we analyse and evaluate the efficiency of the proposed solution algorithms.

Paper Nr: 61
Title:

Multi-Objective Capacitated Disassembly Scheduling with Lost Sales

Authors:

Hajar Cherkaoui, Matthieu Godichaud and Lionel Amodeo

Abstract: Disassembly scheduling is one of the important problems in reverse logistics decisions. This paper focuses on this problem with capacity restrictions on disassembly resources, lost sales, multiple products and no part commonality. A model with two objectives is developed and optimized by a multi-objective approach. The first objective is a sum of several costs to minimize: setup cost, inventory cost, and over-capacity penalty cost. The second objective is a measure of the service level. Considering the complexity of this model, a genetic algorithm (NSGA-II) is developed to obtain a set of Pareto-optimal solutions, and the results are compared with those calculated by a mixed integer programming model. Results of computational experiments on randomly generated test instances indicate that the genetic algorithm gives good-quality solutions for all problem sizes in a reasonable amount of computation time, whereas linear programming solvers do not give solutions in reasonable time.

Paper Nr: 88
Title:

Data Mining for the Unique Identification of Patients in the National Healthcare Systems

Authors:

D. G. Ramírez-Ríos, Laura P. Manotas Romero, Heyder Paez-Logreira, Luis Ramírez and Yohany Andrés Jimenez Florez

Abstract: This paper considers the application of data mining (DM) algorithms as a feasible and necessary strategy for the optimal management of databases (DB) in national healthcare systems. Specifically, it deals with the management of multiple DBs containing patients’ affiliation information, under the supervision of the healthcare authorities, an issue that concerns not only every citizen but also the citizen’s integral right to be treated by any institution. We support the idea that the administrative part of the healthcare system should not obstruct the attention given to the patient and that total efficiency must be guaranteed. We believe that DM algorithms are appropriate for this task and that human intervention should be minimized. A case study was developed in Colombia that considered the multiple affiliation DBs and their integration into a unique DB managed by the District Health Secretary (DHS), which detected frauds and other types of duplicities. The mechanism used in this approach not only significantly reduces manual intervention in the DB, but also allows the extraction of data for future analysis, supporting the patient’s need for efficient and integral health attention, as well as the privacy of registered personal information.

Paper Nr: 89
Title:

Split-and-Merge Method for Accelerating Convergence of Stochastic Linear Programs

Authors:

Akhil Langer and Udatta Palekar

Abstract: Stochastic program optimizations are computationally very expensive, especially when the number of scenarios is large. The complexity of the focal application and the slow convergence rate add to the computational cost. We propose a split-and-merge (SAM) method for accelerating the convergence of stochastic linear programs. SAM splits the original problem into subproblems, and utilizes the dual constraints from the subproblems to accelerate the convergence of the original problem. Our initial results are very encouraging, giving up to 58% reduction in the optimization time. In this paper we discuss the initial results, the ongoing work and the future work.

Area 2 - Applications

Full Papers
Paper Nr: 16
Title:

Newsvendor Model in Rail Contract to Transport Gasoline in Thailand

Authors:

Kannapha Amaruchkul

Abstract: A long-term contract between a railroad company and a shipper who wants to transport gasoline daily is studied. The contract specifies an upfront payment for reserving bogies and a per-container freight rate. The shipper also uses a trucking company to transport the excess demand, but its per-liter transportation cost is higher. The shipper’s problem is to determine the number of bogies to reserve at the beginning of the contract duration, before daily demand is revealed. Since demand is perishable, we formulate the problem as a newsvendor model. The expected cost function is not convex. We show that the expected cost is unimodal and derive an optimal solution under certain conditions, which are not very restrictive. We also provide a sensitivity analysis with respect to the change in the contract parameter.
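
For intuition only, the classical newsvendor solution orders at the critical fractile cu/(cu+co) of the demand distribution. The abstract notes that the paper's expected cost is non-convex and only unimodal, so the sketch below, with hypothetical normal demand and costs, is just the textbook baseline rather than the authors' solution:

```python
from statistics import NormalDist

def newsvendor_quantity(cu, co, mu, sigma):
    """Textbook newsvendor with normal demand: order the cu / (cu + co)
    fractile, where cu is the per-unit underage cost (here, the extra cost
    of trucking unmet demand) and co the per-unit overage cost (an idle
    reserved bogie)."""
    return NormalDist(mu, sigma).inv_cdf(cu / (cu + co))

# Hypothetical contract: mean demand 100 containers/day, std 20;
# underage costs 3 per unit, overage costs 1 per unit.
q = newsvendor_quantity(cu=3.0, co=1.0, mu=100.0, sigma=20.0)
```

Because underage (trucking the excess) is assumed costlier than overage here, the reserved quantity lands above mean demand, at the 75th percentile.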

Paper Nr: 24
Title:

The Truck Scheduling Problem at Crossdocking Terminals - Exclusive versus Mixed Mode

Authors:

L. Berghman, C. Briand, R. Leus and P. Lopez

Abstract: In this paper we study the scheduling of the docking operations of trucks at a warehouse; each truck is either empty and needs to be loaded, or full and has to be unloaded (but not both). We focus on crossdocking, which is a recent warehouse concept that favors the transfers of as many incoming products as possible directly to outgoing trailers, without intermediate storage in the warehouse. We propose a time-indexed integer programming formulation for scheduling the loading and unloading of the trucks at the docks, and we distinguish between a so-called “mixed mode”, in which some or all of the docks can be used both for loading as well as unloading, and an “exclusive mode”, in which each dock is dedicated to only one of the two types of operations. Computational experiments are provided to compare the efficiency of the two modes.

Paper Nr: 33
Title:

A Mixed Integer Linear Program for Operational Planning in a Meat Packing Plant

Authors:

Víctor M. Albornoz, Marcela González-Araya, Matías C. Gripe and Sara V. Rodríguez

Abstract: This paper reports a mixed integer linear programming model to support operational-level planning in a meat packing plant. The deterministic formulation considers multiple products (yielded by applying different cutting patterns to the carcasses), multiple periods, a batch quality distribution over carcasses, and perishability. The perishability of the product is modeled through disaggregated inventory decision variables that take into account a given maximum number of days for fresh product. The main contribution of the present work is the development of an optimization model for a real planning problem. We also develop a sensitivity analysis on the quality of the carcasses, which is subject to large variability. We present two different scenarios and compare them to assess their economic impact.

Paper Nr: 37
Title:

Non Emergency Patients Transport - A Mixed Integer Linear Programming

Authors:

José A. Oliveira, João Ferreira, Luís Dias, Manuel Figueiredo and Guilherme Pereira

Abstract: This work presents a model and a heuristic to solve the non-emergency patient transport (NEPT) service problem arising from the new rules recently established in Portugal. The model follows the same principle as the Team Orienteering Problem, selecting the patients to be included in the routes so as to achieve the maximum reduction in costs compared with individual transportation. The model establishes the best sets of patients to be transported together. It was implemented in AMPL and a compact formulation was solved using the NEOS Server. A heuristic procedure based on iteratively solving single-vehicle problems is also presented; it provides good results in terms of accuracy and computation time.

Paper Nr: 41
Title:

Evaluation Heuristics for Tug Fleet Optimisation Algorithms - A Computational Simulation Study of a Receding Horizon Genetic Algorithm

Authors:

Robin T. Bye and Hans Georg Schaathun

Abstract: A fleet of tugs along the northern Norwegian coast must be dynamically positioned to minimise the risk of oil tanker drifting accidents. We have previously presented a receding horizon genetic algorithm (RHGA) for solving this tug fleet optimisation (TFO) problem. Here, we first present an overview of the TFO problem, the basics of the RHGA, and a set of potential cost functions with which the RHGA can be configured. The set of these RHGA configurations are effectively equivalent to a set of different TFO algorithms that each can be used for dynamic tug fleet positioning. In order to compare the merit of TFO algorithms that solve the TFO problem as defined here, we propose two evaluation heuristics and test them by means of a computational simulation study. Finally, we discuss our results and directions forward.

Paper Nr: 55
Title:

Mixed Integer Programming with Decomposition to Solve a Workforce Scheduling and Routing Problem

Authors:

Wasakorn Laesanklang, Dario Landa-Silva and J. Arturo Castillo Salazar

Abstract: We propose an approach based on mixed integer programming (MIP) with decomposition to solve a workforce scheduling and routing problem, in which a set of workers should be assigned to tasks that are distributed across different geographical locations. This problem arises in a number of home care planning scenarios in the UK faced by our industrial partner. We present a mixed integer programming model that incorporates important real-world features of the problem such as defined geographical regions and flexibility in the workers’ availability. Given the size of the real-world instances, we propose to decompose the problem based on geographical areas. We show that the quality of the overall solution is affected by the order in which the sub-problems are tackled. Hence, we investigate different ordering strategies for solving the sub-problems and show that such a decomposition approach is a very promising technique for producing high-quality solutions in practical computational times using an exact optimization method.

Paper Nr: 58
Title:

Multi-start Iterated Local Search for Two-echelon Distribution Network for Perishable Products

Authors:

S. Kande, C. Prins, L. Belgacem and B. Redon

Abstract: This article presents a planning problem in a distribution network that incorporates two-level inventory management of perishable products, lot-sizing, multi-sourcing, and transport capacity with a homogeneous fleet of vehicles. A mixed integer linear program (MILP) and a greedy heuristic were developed to solve this real planning problem. For some instances the solver cannot provide a good lower bound within the limited time, and for others solving the MILP takes a long time. The greedy heuristic is an alternative to the MILP that quickly solves some large instances while taking original and difficult constraints into account. For some instances, however, the gap between the solver (MILP) solutions and the heuristic becomes quite significant. A multi-start iterated local search (MS-ILS), using variable neighborhood descent (VND) and a greedy randomized heuristic, has therefore been implemented. It has been included in an APS (Advanced Planning System) and compared with an exact resolution of the MILP. The instances are derived from actual data or built using a random instance generator to obtain wider diversity for the computational evaluation. The MS-ILS significantly improves the quality of the solutions.

Paper Nr: 62
Title:

The Impact of Demand Correlation on Bullwhip Effect in a Two-stage Supply Chain with Two Retailers

Authors:

Jianhua Ji, Huafeng Li, Jie Zhang and Cuicui Meng

Abstract: In a two-stage supply chain with two retailers whose customer demands are correlated, forecasting based on each retailer's own order history can cause significant forecast inaccuracy. Current forecasting methods use only a supply chain member's own historical demand information; when multiple retailers have correlated demand, these common methods ignore the forecast error caused by the retailers' interaction. This raises the question of how this forecast error relates to the bullwhip effect. The present paper studies the relation between the demand correlation of multiple terminals and the bullwhip effect in a two-stage supply chain with two retailers. Under centralized and decentralized information, (1) the impact of the retailers' demand correlation on the retailers'/supplier's bullwhip effect is studied; and (2) the supplier's and retailers' bullwhip effects are contrasted under different information-sharing conditions. The studies show that demand correlation across multiple terminals is a cause of the supply chain's bullwhip effect.

Paper Nr: 84
Title:

Critical Activity Effect on Project Duration in Precedence Diagram Method Scheduling Network

Authors:

S. A. Nisar and K. Suzuki

Abstract: The Precedence Diagram Method (PDM) scheduling network, with its additional relationships between activities (start-to-start, finish-to-finish, and start-to-finish), provides a more flexible schedule than the traditional Critical Path Method (CPM). However, changing the duration of critical activities in a PDM network can have an anomalous effect on the critical path and the project duration. Researchers have proposed classifications of critical activity effects, but these classifications were incomplete and could not capture all the characteristics of critical activities. In this paper we study the classification of critical activity effects further and provide more detailed information. Furthermore, we determine the maximum amount of time for each class of critical activity effect, by which project managers can control the dynamic behaviour (shortening/lengthening) of critical activities and the project duration more efficiently.

Short Papers
Paper Nr: 28
Title:

An Integer Linear Programming Solution to the Telescope Network Scheduling Problem

Authors:

Sotiria Lampoudi, Eric Saunders and Jason Eastman

Abstract: Telescope networks are gaining traction due to their promise of higher resource utilization than single telescopes and as enablers of novel astronomical observation modes. However, as telescope network sizes increase, the possibility of scheduling them completely or even semi-manually disappears. In an earlier paper, a step towards software telescope scheduling was made with the specification of the Reservation formalism, through the use of which astronomers can express their complex observation needs and preferences. In this paper we build on that work. We present a solution to the discretized version of the problem of scheduling a telescope network. We derive a solvable integer linear programming (ILP) model based on the Reservation formalism. We show computational results verifying its correctness, and confirm that our Gurobi-based implementation can address problems of realistic size. Finally, we extend the ILP model to also handle the novel observation requests that can be specified using the more advanced Compound Reservation formalism.
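The discretized scheduling problem this abstract refers to can be pictured with a toy enumeration. The paper's actual model is an ILP solved with Gurobi; this brute-force sketch only shows the structure of the decision, and the reservations, slots and priorities below are hypothetical.

```python
from itertools import product

# Each reservation may start in one of its allowed discrete time slots or
# remain unscheduled (None); a slot can hold at most one reservation.
reservations = {              # name -> (allowed start slots, priority)
    "r1": ({0, 1}, 3),
    "r2": ({1, 2}, 2),
    "r3": ({0, 2}, 1),
}

def best_schedule(reservations):
    """Enumerate all slot assignments and return the highest-priority plan."""
    names = list(reservations)
    options = [sorted(reservations[r][0]) + [None] for r in names]
    best_value, best_plan = 0, {}
    for starts in product(*options):
        plan = {r: s for r, s in zip(names, starts) if s is not None}
        if len(set(plan.values())) != len(plan):   # two reservations share a slot
            continue
        value = sum(reservations[r][1] for r in plan)
        if value > best_value:
            best_value, best_plan = value, plan
    return best_value, best_plan

value, plan = best_schedule(reservations)
```

An ILP formulation replaces this enumeration with binary variables x[r, s] and the two constraint families checked above (at most one start per reservation, at most one reservation per slot), which is what makes realistic network sizes tractable.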

Paper Nr: 32
Title:

Models for Multimodal Freight Transportation Integrating Consolidation and Transportation Phases

Authors:

Leonardo Malta, Nicolas Jozefowiez and Fréderic Semet

Abstract: The ability to move freight in a cost-efficient, safe and quick fashion is important for economic development and international trade. This paper discusses the door-to-door freight transportation problem in its two phases: the consolidation phase and the transportation between platforms. In general terms, the problem is described as a set of orders that have release and delivery dates and must be consolidated and routed from a source to a destination point. Two models are proposed, both integrating several aspects of the problem such as long-haul transportation, freight consolidation, freight storage and intermodal transportation. The first is a time-space based model and the second uses an implicit time representation. The models are formulated as integer programs, and results on small practical instances are presented along with some considerations.

Paper Nr: 39
Title:

Steiner-based Energy Efficient Multicast Routing for Wireless Sensor Networks

Authors:

Valeri Katerinchuk and Celal Ozturk

Abstract: Wireless Sensor Networks (WSN) consist of a suite of sensor nodes capable of autonomous data collection and wireless communication, deployed over an area of interest. They constitute a popular means of automated data collection, with the sensors working together to form a wireless communication network and funnel collected data to nodes capable of transmitting it to one or more collection points for interpretation. The routing instance from a single source to multiple destinations is formalized as a Multicast Routing Problem (MRP). Recent applications of WSNs have focused on exploiting sink mobility (with multiple destinations) to extend network lifetime. As the nodes often rely on limited power sources, algorithms for efficient routing of data in these networks have attracted increasing research interest. However, few current algorithms consider the remaining energy levels of individual network nodes during a single routing instance, creating a situation where the network may become disconnected sooner than necessary. In this paper, we present an algorithm for multicast routing in wireless sensor networks that implements energy-reweighting techniques to simultaneously optimize the energy cost of a routing and the network lifetime.

Paper Nr: 45
Title:

A Network Model for the Hospital Routing Problem

Authors:

Arash Rafiey, Vladyslav Sokol, Ramesh Krishnamurti, Snezana Mitrovic Minic, Abraham Punnen and Krishna Teja Malladi

Abstract: We consider the problem of routing samples taken from patients to laboratories for testing. These samples are taken from patients housed in hospitals, and are sent to laboratories in other hospitals for testing. The hospitals are distributed in a geographical area, such as a city. Each sample has a deadline, and all samples have to be transported within their deadlines. We have a fixed number of vehicles as well as an unlimited number of taxis available to transport the samples. The objective is to minimize a linear function of the total distance travelled by the vehicles and the taxis. We provide a mathematical programming formulation for the problem using the multi-commodity network flow model, and solve the formulation using CPLEX, a general-purpose MIP solver. We also provide a computational study to evaluate the solution procedure.

Paper Nr: 56
Title:

Hazardous Materials Transportation using Bi-level Linear Programming - Case-study of Liquid Fuel Distribution

Authors:

Madalena S. Rodrigues, Marta C. Gomes, Alexandre B. Gonçalves and Sílvia Shrubsall

Abstract: Hazardous materials (hazmats) are essential for the competitiveness of contemporary societies; however, their transportation is potentially dangerous and expensive. In Portugal, despite the interest of both academia and industry, no studies enabling the identification of preferable road routes for hazmat distribution were identified. Hence, this research aims to contribute to knowledge on identifying these routes in the current national context by balancing two frequently and intrinsically conflicting aspects of hazmat transportation: the safety and the economic viability of the available routes. To this end, a bi-level linear programming model was implemented in the GAMS modelling system and applied to a real-world case study using petrol and diesel fuel delivery data from a prominent energy group operating in the country. The company shared data on distribution loads over one calendar year to both petrol stations and direct clients in Lisbon. A geographical information system (GIS) was used to map the Lisbon road network, which was found to be significantly larger than other networks used in similar studies described in the literature. The model was solved to optimality in a short computation time, leading to the clear identification of the preferable road routes for liquid fuel distribution in the Lisbon district of Olivais. The success of the methodology applied in this study, including the generic implementation of the bi-level linear programming model, offers an optimistic prospect for a gradual increase in the geographical coverage, assessed risks, and general complexity of the initial model.

Paper Nr: 77
Title:

Modeling a Public Hospital Outpatient Clinic in Peru using Discrete Simulation

Authors:

Valeria Quevedo and Javier Chapilliquen

Abstract: Having insurance or not makes a difference in the procedure patients must follow to be attended in public hospitals in Peru. Studies show high dissatisfaction with the service offered by public hospitals, mainly due to long waiting times, especially for patients with insurance. The initiatives implemented by the government to solve these problems were not supplemented with a rigorous analysis to help quantify their impact. The main objective of this study is to assess the quality of care at one of the most visited public hospitals in Peru. Discrete simulation was used to build a model, which was validated through historical data and hospital personnel. The model is capable of measuring the service level and facilitates the identification of bottlenecks. It identified the medical specialties that are most utilized and have the longest queues. The results also serve to identify services with a low utilization rate. High idle time during the insurance verification process was identified as a problem; it seems insurance verification could be integrated with admission tasks or with other services. The model can be applied to any public hospital in Peru, given that their outpatient processes are similar.

Paper Nr: 83
Title:

Proposal of a Framework to Assess the Supply Chain Performance in the Agri-food Sector

Authors:

Luis Miguel D. F. Ferreira and Amilcar Arantes

Abstract: Companies need to excel in many areas to achieve a competitive advantage, and supply chain management is critical for a company's overall performance. It is therefore crucial that organizations measure the performance of their supply chains in order to define strategies that maximize the impact of their operations. This paper proposes a novel framework for assessing and monitoring the supply chain performance of companies in the agri-food sector. The framework consists of six steps to evaluate a company's supply chain performance. A linear aggregation technique is suggested to aggregate the indicators into a unique value, giving rise to a composite index spanning five dimensions. The proposed framework can be used as a valuable instrument for monitoring the supply chain performance of agri-food companies.

Paper Nr: 91
Title:

Sales Forecasting Models in the Fresh Food Supply Chain

Authors:

Gabriella Dellino, Teresa Laudadio, Renato Mari, Nicola Mastronardi and Carlo Meloni

Abstract: We address the problem of supply chain management for a set of fresh and highly perishable products. Our activity mainly concerns forecasting sales. The study involves 19 retailers (small and medium size stores) and a set of 156 different fresh products. The available data consists of three years of sales (2011 to 2013) for each store. The forecasting activity started with a pre-processing analysis to identify seasonality, cycle and trend components, and with data filtering to remove noise. Moreover, we performed a statistical analysis to estimate the impact of prices and promotions on sales and customer behaviour. The filtered data is used as input for a forecasting algorithm designed to be interactive for the user, who is asked to specify the store ID, items, training set and planning horizon; the algorithm then provides the sales forecast. We used ARIMA, ARIMAX and transfer function models in which the parameter values range over predefined intervals. The best setting of these parameters is chosen via a two-step analysis, the first based on well-known indicators of information entropy and parsimony, and the second on standard statistical indicators. The exogenous components of the forecasting models take the impact of prices into account. The quality and accuracy of the forecasts are evaluated and compared on a set of real data, and some examples are reported.
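A minimal sketch of the autoregressive idea behind such forecasting models, assuming a synthetic sales series. The paper itself uses ARIMA, ARIMAX and transfer function models with exogenous price effects; this AR(1) least-squares fit only hints at the simplest special case.

```python
def fit_ar1(series):
    """Least-squares fit of the AR(1) model y[t] = a + b * y[t-1]."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def forecast(series, a, b, horizon):
    """Iterate the fitted recurrence forward from the last observation."""
    preds, last = [], series[-1]
    for _ in range(horizon):
        last = a + b * last
        preds.append(last)
    return preds

sales = [100, 104, 103, 107, 106, 110, 109, 113]   # synthetic daily sales
a, b = fit_ar1(sales)
preds = forecast(sales, a, b, horizon=3)
```

The full models in the paper add differencing, moving-average terms and exogenous regressors on top of this recurrence, with the orders chosen by the two-step entropy/statistical analysis the abstract describes.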

Posters
Paper Nr: 23
Title:

A Database-oriented Workflow Scheduler with Historical Data and Resource Substitution Possibilities

Authors:

Tibor Dulai and Ágnes Werner-Stark

Abstract: This paper presents the database of a novel workflow scheduler that is able to handle resource substitution and takes historical data into account. The generated schedule can be optimized either for time or for cost. The scheduler enables resource substitution in case of an immediate event or when cost- or time-efficiency-related reasons necessitate it. The underlying database is able to handle complex workflows, represents a fleet of various resources, and supports data mining from the logged execution of the schedule in order to further improve the schedule. The database and the scheduler are part of a complex project which schedules workflows described in XPDL by an agent system, taking into account real-time events and historical data served by process mining. Our scheduler system is intended to be applied in both business and industrial processes.

Paper Nr: 47
Title:

Accelerating the Performance of Parallel Depth-First-Search Branch-and-Bound Algorithm in Transportation Network Design Problem

Authors:

Amirali Zarrinmehr and Yousef Shafahi

Abstract: The Transportation Network Design Problem (TNDP) aims at selecting a subset of proposed urban projects under a budget constraint to minimize the network users’ total travel time. This is a well-known resource-intensive problem in the transportation planning literature. The application of parallel computing can therefore be useful for the exact solution of the TNDP. This paper investigates how the performance of a parallel Branch-and-Bound (B&B) algorithm with a Depth-First-Search (DFS) strategy can be accelerated. The paper suggests assigning greedy solutions to idle processors at the start of the algorithm. A greedy solution, as considered in this paper, is a budget-feasible selection of projects to which no further project can be added without violating the budget constraint. The paper evaluates the performance of the parallel algorithms through the theoretical speedup and efficiency, which are based on the number of parallel B&B iterations. It is observed, in four cases of the TNDP on the Sioux Falls transportation network, that achieving high-quality solutions with idle processors can notably improve the performance of the parallel DFS B&B algorithm. In all four cases, super-linear speedups are reported.

Paper Nr: 54
Title:

A Greedy Heuristic for Workforce Scheduling and Routing With Time-dependent Activities Constraints

Authors:

J. Arturo Castillo-Salazar, Dario Landa-Silva and Rong Qu

Abstract: We present a greedy heuristic (GHI) designed to tackle five time-dependent activity constraints (synchronisation, overlap, minimum difference, maximum difference and minimum-maximum difference) in workforce scheduling and routing problems. These types of constraints are important because they allow the modelling of situations in which activities relate to each other time-wise, e.g. synchronising two technicians to complete a job. These constraints often make the scheduling and routing of employees more difficult. GHI is tested on a set of benchmark instances from different workforce scheduling and routing problems (WSRPs). We compare the results obtained by GHI against the results from a mathematical programming solver; the comparison seeks to determine which solution method achieves more best solutions across all instances. Two parameters of GHI are discussed: the sorting of employees and the sorting of visits. We conclude that using the solver is adequate for instances with fewer than 100 visits, but for larger instances GHI obtains better results in less time.
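The flavour of such a greedy construction can be sketched as follows. The visits, employees and tie-breaking rule below are hypothetical; the real GHI additionally handles travel times and the five time-dependent activity constraints named in the abstract.

```python
# Each visit has a time window and a duration; each employee has a time at
# which they become free. Visits are processed in order of window start and
# assigned to whichever employee can begin them earliest.
visits = [            # (name, window_start, window_end, duration)
    ("v1", 0, 5, 2),
    ("v2", 1, 6, 2),
    ("v3", 2, 4, 1),
]
employees = {"e1": 0, "e2": 0}   # name -> time the employee becomes free

def greedy_assign(visits, employees):
    free = dict(employees)
    schedule = []
    for name, ws, we, dur in sorted(visits, key=lambda v: v[1]):
        # pick the employee who can start this visit earliest
        emp = min(free, key=lambda e: max(free[e], ws))
        start = max(free[emp], ws)
        if start + dur > we:          # cannot finish inside the window
            continue                   # visit stays unassigned
        free[emp] = start + dur
        schedule.append((name, emp, start))
    return schedule

schedule = greedy_assign(visits, employees)
```

The two sorting decisions the abstract discusses (ordering of employees and of visits) correspond here to the `min` tie-breaking over `free` and the `sorted` key over `visits`; changing either changes which solutions the heuristic reaches.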

Paper Nr: 68
Title:

KSF-CA Correlation Matrix for Probabilistic Cashflow Model on Construction Project Financing in South Korea

Authors:

Jin-hyuk Yoo, Dong-gun Lee and Hee-sung Cha

Abstract: In the construction industry, the main obstacle to successfully completing a project is a failure to identify and respond to project risk factors. Especially in construction project financing (PF), many practitioners struggle to develop a cash-flow model that integrates the key risk factors for the subject project. This study identified key success factors (KSFs) of construction project financing through an extensive literature review combined with an industry survey. The factors were derived using the Factor Analysis technique and qualified using the Fuzzy-AHP method. Through the evaluation of the derived success factors in real building construction projects, a strong correlation was identified between the score of each PF success factor and the level of success and/or the expected rate of return (ROR). Using the results of this investigation, this study has been developing a correlation matrix inter-relating each KSF and its corresponding cash account in order to effectively measure the financial viability of PF projects. With the help of this mechanism, project stakeholders can reach a more objective and transparent decision-making process. The contribution of this study will help decision makers in PF projects make better decisions and provide meaningful guidance towards more successful PF projects.

Paper Nr: 81
Title:

A Study of Storage Assignment and Order Picking Route Problem for the Warehouse of Distribution Center - Assuming Synchronized Zone Picking Operations

Authors:

Chikong Huang and You-Jen Liu

Abstract: The operational efficiency of a distribution centre is an important issue for most of the logistics industry, especially in a highly competitive environment. This study proposes an integrated model to solve the long-term storage assignment problem and the short-term picking route problem for the warehouse of a distribution centre. The model developed in this study assumes synchronized zone picking operations in the warehouse. The goal of this study therefore focuses on improving efficiency for both the individual zones and the overall system, which is represented in the objective function from a short-term and a long-term view. Heuristic solution algorithms using simulated annealing logic are also presented. Results from a numerical example indicate that the model and solution algorithm can effectively improve the performance of both the parallel zones and the overall system.

Paper Nr: 86
Title:

Police Response Officer Selection - Development of Tool to Aid the Dispatch of Police Response Officers

Authors:

Johanna Leigh, Sarah Dunnett and Lisa Jackson

Abstract: It is essential that the police force uses its resources with the highest possible efficiency to ensure adequate service in the face of major funding cuts. Automation of the response officer selection process can improve efficiency by assisting in selecting the most appropriate response officer to attend an incident. Currently, dispatchers are tasked with selecting the response officers to send to incidents, which may not result in the most efficient officer being selected. Providing a software tool to assist in the decision-making process will decrease uncertainty in the decision and hence increase the likelihood of the most efficient officer being selected to attend an incident. The selection considers response time, availability, area coverage, driving standard and traffic conditions. The tool is specific to the police dispatch process and hence accounts for factors which are not generally included in other dispatch tools.

Paper Nr: 87
Title:

Proposal of an SKU Classification Framework - A Multicriteria Approach

Authors:

Sara Santos, Luis Miguel D. F. Ferreira and Amílcar Arantes

Abstract: Changes to an organization’s internal and external environment may cause an increase in the number of Stock Keeping Units (SKUs) in inventory. Therefore, SKU classification and the corresponding grouping become highly important for improving the inventory management process. In this paper we propose a framework for SKU classification in an industrial context using a multicriteria approach with three criteria: value of usage, criticality, and demand variability. This approach emphasizes the importance of SKUs that, despite their small value, are of vital importance for the operations/production of the organization.