Republic of Iraq
Ministry of Higher Education
and Scientific Research
Thi-Qar University
College of Education for Pure Sciences
Minimizing Three Simultaneous Criteria in
Machine Scheduling Problem
A Thesis Submitted to
The Department of Mathematics, College of Education
for Pure Sciences, University of Thi-Qar in a Partial
Fulfillment of the Requirements for the Degree
of Master of Sciences in Mathematics
by Jafar Saleh Aneed
(B.Sc. 2010)
Supervised by
Assist. Prof. Dr. Mohammed K. Z. Al-Zuwaini
Prof. Dr. Kadhem M. H. Al-Mousawi
2013 A.D 1434 A.H
[Qur'anic epigraph; the ornamental verse script did not survive extraction]
"Allah, the Most High, the Great, has spoken the truth."
(Surah An-Naml, from verse 91)
Dedication

To the one who gave me the years of his life, who faced hardships with patience, and whose resolve time could not bend... the symbol of pride, my dear father.

To the one who nourished me from her blood, in whose sea of love and tenderness I grew up, who was the sleepless eye watching over me and the heart that prays for me... my dear mother.

To the candles of love and the sweet basil of my life... my dear brothers and sisters.

To all who helped me complete this research.

To all my friends.

To all of them I dedicate the fruit of my effort... hoping it carries some measure of gratitude.

Jafar Saleh
ACKNOWLEDGEMENTS
All praise and thanks be to ALLAH, the Most Gracious and Most
Merciful, for His grace, which enabled me to complete the
requirements of my study.
My sincerest gratitude is due to my supervisors, Dr. Mohammed
K. Al-Zuwaini and Dr. Kadhim M. Al-Mousawi, for their patience,
help, and prudent guidance throughout this work. Their support and
guidance have allowed me to achieve one of the goals of my life;
without their help, reaching it would have been far more difficult.
Words of thanks should go to the head and the staff of the
Department of Mathematics, and all my M.Sc. course teachers to
whom I owe much.
My thanks and appreciation also go to my colleagues in graduate studies.
Also, my thanks go to everyone who helped me throughout the
fulfillment of my research, especially Dr. Rabee Hadi Jari, Firas Sabar
alhussinawi, Sami Mezal Araibi, and Hussein Jameel.
I cannot find words to express my indebtedness to my father, my
mother, my brothers, and my sisters, for their love, support and
inspiration at every step of my success and failure, without which
none of this would have been possible.
Jafar
Supervision Certification
We certify that this thesis, entitled (Minimizing Three Simultaneous
Criteria in Machine Scheduling Problem), was prepared under our
supervision at the Department of Mathematics, College of Education for
Pure Sciences, University of Thi-Qar, in partial fulfillment of the
requirements for the degree of Master of Science (M.Sc.) in
Mathematics.
Signature:
Supervisor: Assist. Prof. Dr. Mohammed Kadhim Z. Al-Zuwaini
Date: / / 2013
Address: College of Computer Sciences and Mathematics, Thi-Qar University
Signature:
Supervisor: Professor Dr. Kadhem Mahdi Hashim Al-Mousawi
Date: / / 2013
Address: College of Education for Pure Sciences, Thi-Qar University
Recommendation of the Head of the Mathematics Department
In view of the available recommendations, I forward this thesis for
debate by the examining committee.
Signature:
Name: Assist. Prof. Dr. Rabee Hadi Jari
Head of the Department of Mathematics,
College of Education for Pure Sciences, Thi-Qar University
Date: / / 2013
Committee Certificate
We, the examining committee, certify that we have read this thesis,
entitled (Minimizing Three Simultaneous Criteria in Machine Scheduling
Problem), and have examined the student (Jafar Saleh Aneed) in its contents,
and that in our opinion it meets the requirements for the degree of Master of
Science (M.Sc.) in Mathematics, with ( ) grade.
Chairman
Signature:
Name:
Scientific Position:
Date: / / 2013

Member
Signature:
Name:
Scientific Position:
Date: / / 2013

Member
Signature:
Name:
Scientific Position:
Date: / / 2013

Member (Supervisor)
Signature:
Name: Dr. Mohammed K. Al-Zuwaini
Scientific Position: Assist. Professor
Date: / / 2013
Member (Supervisor)
Signature:
Name: Dr. Kadhim M. Al-Mousawi
Scientific Position: Professor
Date: / / 2013
Approved by the University Committee on Graduate Studies.
Signature:
Name: Prof. Dr. Mohammed Jassim Mohammed Al-Mousawi
Dean of the College of Education for Pure Sciences
Date: / / 2013
Abstract
In this thesis, the problem of scheduling n jobs on a single machine
to minimize a Multiple Objective Function (MOF) is considered. This
study has two aims. The first is to find the optimal solution for the
sum of completion times, maximum earliness, and maximum tardiness
with unequal release dates, where no preemption is allowed. This
problem is denoted by 1/ri/∑Ci + Emax + Tmax, or 1/ri/∑Ci + ETmax;
to the best of our knowledge, it has not been studied before. The
second aim is to find a near-optimal solution for the same problem by
using neural networks.

For the first aim, a Branch and Bound (BAB) algorithm is proposed
with two lower bounds (LB1, LB2) and four upper bounds (UB1, UB2, UB3,
UB4), all introduced in this thesis, in order to find the exact (optimal)
solution. Nine special cases that yield optimal solutions without using
the BAB algorithm are derived and proved. Three dominance rules, which
help reduce the number of branches in the search tree, are suggested
and proved. Results of extensive computational tests show that the
proposed BAB algorithm is effective in solving problems with up to 30
jobs in at most 30 minutes. In general, this problem is strongly NP-hard.

For the second aim, since our problem is strongly NP-hard, we apply
the neural network method to find a near-optimal solution. Computational
experience shows that this method can solve the problem with up to 8
jobs in reasonable time, and that it yields a near-optimal value for the
objective function.
List of Symbols and Abbreviations

ANN    Artificial Neural Network
BAB    Branch and Bound
Ci     Completion time of job i
Cmax   The maximum completion time (makespan)
DP     Dynamic Programming
di     Due date of job i
d̄i     Deadline of job i
EDD    Earliest Due Date
Emax   The maximum earliness
fmax   Maximum function
Fm     Flow shop scheduling
H1     First hidden layer
H2     Second hidden layer
ILB    Initial Lower Bound
I      Input layer
JIT    Just In Time
Jm     Job shop scheduling
LB     Lower Bound
MEDD   Modified Earliest Due Date
MOF    Multiple Objective Function
MST    Minimum Slack Time
MRST   Minimum Remaining Slack Time
m      Number of machines
NP     Non-deterministic Polynomial
NN     Neural Network
n      Number of jobs
O      Output layer
Om     Open shop scheduling
pi     Processing time of job i
pmtn   Preemption
prec   Precedence constraint
P      Polynomial time
ri     Release date of job i
SPT    Shortest Processing Time
si     Slack time of job i
SRD    Shortest Release Date
SRPT   Shortest Remaining Processing Time
Tmax   The maximum tardiness
TSP    Traveling Salesman Problem
UB     Upper Bound
wi     Weight of job i
∑Fi    The Total Flow Time
Contents
Abstract I
List of Symbols and Abbreviations II
Contents III
List of Figures VI
Introduction 1
1 Description of Machine Scheduling Problem 4
1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Machine Scheduling Problem . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.2.1 Single Machine Scheduling . . . . . . . . . . . . . . . . . . . . . . 5
1.2.2 Parallel Machine Scheduling . . . . . . . . . . . . . . . . . . . . . 6
1.2.3 Flow Shop Scheduling (Fm) . . . . . . . . . . . . . . . . . . . . . 6
1.2.4 Job Shop Scheduling (Jm) . . . . . . . . . . . . . . . . . . . . . . 6
1.2.5 Open Shop Scheduling (Om) . . . . . . . . . . . . . . . . . . . . . 7
1.3 Regular and Non-regular Performance Measure of Machine Scheduling
Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4 Basic Scheduling Concepts . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.5 The Classification Problem (α/β/γ) . . . . . . . . . . . . . . . . . . . . . 10
1.5.1 Machine Environment (α) . . . . . . . . . . . . . . . . . . . . . . 10
1.5.2 Job Characteristics (β) . . . . . . . . . . . . . . . . . . . . . . . . 11
1.5.3 Optimality Criteria (γ) . . . . . . . . . . . . . . . . . . . . . . . . 11
1.5.4 Examples of Scheduling Problems . . . . . . . . . . . . . . . . . . 12
1.6 Assumptions About Jobs . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.7 Assumptions About Machines . . . . . . . . . . . . . . . . . . . . . . . . 13
1.8 Sequence Rules for Machine Scheduling Problems . . . . . . . . . . . . . 14
1.9 Problem Classes and Computational Complexity . . . . . . . . . . . . . . 14
1.10 Solution Approaches for Machine Scheduling
Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
1.10.1 Complete Enumeration . . . . . . . . . . . . . . . . . . . . . . . . 16
1.10.2 Branch and Bound Methods . . . . . . . . . . . . . . . . . . . . . 17
1.10.3 Dynamic Programming Method . . . . . . . . . . . . . . . . . . . 18
1.10.4 Heuristic Methods . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2 Branch and Bound Method to Minimize Three Criteria 21
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
2.2 Problem Formulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.3 Decomposition of Problem (S) . . . . . . . . . . . . . . . . . . . . . . . . 24
2.4 Special Cases . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
2.5 Dominance Rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
2.6 Upper Bound . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
2.7 Lower Bound . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
2.7.1 The First Lower Bound . . . . . . . . . . . . . . . . . . . . . . . . 38
2.7.2 The second Lower Bound . . . . . . . . . . . . . . . . . . . . . . . 39
2.8 Branch and Bound Algorithm . . . . . . . . . . . . . . . . . . . . . . . . 40
2.9 Computational Experience . . . . . . . . . . . . . . . . . . . . . . . . . . 41
2.9.1 Test Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
2.9.2 Computational Experience with the Lower and Upper Bounds of
(BAB) Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . 42
3 Neural Networks to Solve Scheduling Problems 46
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.1.1 Artificial Neural Network (ANN) . . . . . . . . . . . . . . . . . . 46
3.1.2 Architecture of Neural Network . . . . . . . . . . . . . . . . . . . 49
3.1.3 Network Learning Methods . . . . . . . . . . . . . . . . . . . . . 53
3.1.4 The activation functions . . . . . . . . . . . . . . . . . . . . . . . 55
3.1.5 Applications of Neural Networks . . . . . . . . . . . . . . . . . . . 55
3.1.6 Beginning of Neural Nets . . . . . . . . . . . . . . . . . . . . . . . 56
3.2 Related Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
3.3 Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
3.4 Multilayer Perceptron NN for Single Machine . . . . . . . . . . . . . . . 59
3.5 Proposed Neural Network Design . . . . . . . . . . . . . . . . . . . . . . 64
3.6 Artificial Neural Network Training . . . . . . . . . . . . . . . . . . . . . 65
3.7 The Matrix of Weights . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
4 Conclusions and Future Work 71
4.1 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
4.2 Future Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
References 73
List of Figures
2.1 BAB method without dominance rule . . . . . . . . . . . . . . . . . . . . 35
2.2 BAB method with dominance rule . . . . . . . . . . . . . . . . . . . . . . 36
2.3 (SRPT) rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
2.4 (MEDD) rule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
3.1 A simple artificial neuron. . . . . . . . . . . . . . . . . . . . . . . . . . . 49
3.2 Single Layer Net. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
3.3 A multilayer neural net. . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
3.4 Error-correction learning diagram. . . . . . . . . . . . . . . . . . . . . . . 53
3.5 Activation Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
3.6 The proposed design of ANN . . . . . . . . . . . . . . . . . . . . . . . . 65
3.7 Input of 3 jobs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
3.8 Input of 4 jobs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Introduction
Operations Research (OR) is one of the modern applied sciences and
has witnessed wide success in different fields of life. OR arose during
the Second World War, when the military administration in Britain gave
a team of scientists and researchers the task of studying technological
and strategic problems, especially in land and naval defense. The aim
of this team was to find the best use of resources, in addition to
studying the effect of the new kinds of bombs. The formation of this
group is regarded as the birth of what became known as Operations
Research, in 1941 [34].
This study focuses on one of the most important subjects in
Operations Research, the Machine Scheduling Problem. To solve machine
scheduling problems, one tends to use optimization algorithms that
always find an optimal solution. However, polynomial-time optimization
algorithms cannot be constructed for all optimization problems, because
some of them are NP-hard. In practice this means that solving large
instances of such problems to optimality requires impracticable running
times. The NP-hardness of a problem suggests that an optimal solution
cannot be found without essentially implicit enumeration algorithms
(branch and bound or dynamic programming). But such enumeration
algorithms may be unable to solve problems with more than a handful of
jobs, and the solutions generated by simple heuristics may be far from
the optimum [12].
Because the one-machine problem provides a useful laboratory for the
development of ideas for heuristics and interactive procedures that may
prove useful in more general models, we consider the one-machine case,
with multiple criteria, in this study.
There are two approaches to the multi-criteria problem: the
hierarchical approach and the simultaneous approach. In the hierarchical
approach, one of the two criteria is considered the primary criterion
and the other the secondary criterion. The problem is then to minimize
the primary criterion while breaking ties in favor of the schedule with
the minimum secondary criterion value. The simultaneous approach comes
in two types: the first typically generates all efficient schedules and
selects the one that yields the best composite objective function value
of the criteria, while the second minimizes the sum of these objectives
[31].
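The two approaches can be contrasted on a toy single-machine instance. The sketch below uses hypothetical data (all jobs released at time 0) and selects a schedule hierarchically, with the sum of completion times as the primary criterion and maximum tardiness as the secondary one, and then simultaneously by minimizing the sum of both criteria:

```python
from itertools import permutations

# Hypothetical 3-job instance: processing time p and due date d per job.
jobs = {1: {"p": 3, "d": 4}, 2: {"p": 2, "d": 6}, 3: {"p": 4, "d": 5}}

def criteria(seq):
    """Return (sum of completion times, maximum tardiness) for a sequence."""
    t, total_c, t_max = 0, 0, 0
    for j in seq:
        t += jobs[j]["p"]                       # completion time of job j
        total_c += t
        t_max = max(t_max, t - jobs[j]["d"], 0) # tardiness never negative
    return total_c, t_max

schedules = list(permutations(jobs))

# Hierarchical: minimize the primary criterion (sum C_i); Python's
# lexicographic tuple comparison breaks ties by the secondary T_max.
hier = min(schedules, key=criteria)

# Simultaneous (second type): minimize the sum of both criteria.
simul = min(schedules, key=lambda s: sum(criteria(s)))
```

On this instance both approaches happen to agree, but in general the hierarchical winner may differ from the simultaneous one.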
Scheduling has received much attention in the literature since the
pioneering work of Johnson in 1954. For the first 30 years, it was
usual to consider only one objective function as the performance
criterion. However, in many practical situations a decision-maker has
to take several objectives into account simultaneously. Therefore, the
investigation of multi-criteria scheduling problems began about 30
years ago, and interest continues to grow [56].
The starting point of artificial neural networks was the McCulloch-
Pitts neuron in 1943, which demonstrated how a network of neurons could
exhibit learning behavior. Neural networks are categorized by their
architecture (number of layers), topology (connectivity pattern, feed-
forward or recurrent, etc.), and learning regime. The main advantages
of artificial neural network technology are that it is fast, it
possesses learning ability, it adapts to the data, it is robust, and it
is appropriate for nonlinear modelling [8].
This thesis consists of four chapters:
Chapter One gives a full description of machine scheduling problems,
including historical background and a number of assumptions about
machines, jobs, and other matters; the classification and representation
of scheduling problems are also covered. The well-known methods for
finding exact solutions, such as the BAB algorithm and DP, are discussed.
Chapter Two is devoted to the problem of single machine scheduling
with unequal release dates to minimize the sum of total completion time,
maximum earliness, and maximum tardiness by using a branch and bound
algorithm. Special cases, heuristic methods (upper bounds), and lower
bounds are also included, together with computational experience for the
BAB algorithm.
Chapter Three presents the neural network method for finding a near-
optimal solution. Computational experience shows that this method can
solve the problem with up to 8 jobs in reasonable time.
Chapter Four gives conclusions and an outlook on future work.
Chapter 1
Description of Machine Scheduling
Problem
1.1 Introduction
There are many definitions for machine scheduling, but the simplest
one for understanding is that, the allocation of resources over time to
perform a collection of tasks (Baker [9]). Resources and tasks are called
machines and jobs respectively and both of them can take many forms.
For example: We can consider a computer (or computers) as a machine
(or machines) and the programs that are to be run on that computer
(or computers ) as the jobs. Another example: We can consider hospital
equipments as a machines and the patients in that hospital as the jobs.
Generally speaking, scheduling means to assign machines to jobs in order
to complete all jobs under the imposed constraints. The problem is to find
the optimal processing order of these jobs on each machine to minimize
the given objective function. There are two general constraints in classical
scheduling theory [12]: each job is to be processed by at most one
machine at a time, and each machine can process at most one job at a
time. A schedule is feasible if it satisfies these two general
constraints, together with the various requirements relating to the
specific problem type. The problem type is specified by the machine
environment, the job characteristics, and an optimality criterion.
1.2 Machine Scheduling Problem
Within manufacturing scheduling, there are many different types of
problem classes. These include single machine, parallel machine, flow
shop, job shop, and open shop. Each of these problem classes is unique,
and each has its own constraints and objectives [21]. A more detailed
description of each problem class is given in the following subsections.
1.2.1 Single Machine Scheduling
The single machine scheduling problem involves scheduling a set of
tasks on a single resource. This is accomplished by determining a
sequence that includes each task and then assigning the tasks to the
resource.
Each task can be given a priority, ready time, processing time and due
date. The values of the performance measures can be computed on the
basis of this information and the sequence of tasks. This problem grows
in complexity at an exponential rate as the number of tasks to be
scheduled increases [3].
1.2.2 Parallel Machine Scheduling
Parallel machine scheduling involves scheduling a set of tasks on two
or more machines that work in parallel with each other. The machines
perform identical operations and may or may not operate at the same
pace [20].
1.2.3 Flow Shop Scheduling (Fm)
A flow shop consists of two or more machines and a set of jobs that
must be processed on each of these machines. This arrangement is called
a flow shop because the products flow along a specific unidirectional
path: each product must be processed on each machine in the same order,
e.g. first machine 1, then machine 2, . . ., and finally machine m. The
processing times for each job can vary from machine to machine, and the
processing times on each machine can vary from job to job [22].
1.2.4 Job Shop Scheduling (Jm)
A job shop consists of two or more machines that perform specific
operations, and a set of jobs that must be processed on some or all of these
machines. Unlike the flow shop, there is no fixed path that the products
must follow through the system. Therefore the order of operations is not
fixed. This type of layout is typically used when the product variety is
high and the product volume is low [22].
1.2.5 Open Shop Scheduling (Om)
In this case, the ordering of operations for each job is not fixed,
and the sequence of the n jobs on each machine may differ. Each job
has to be processed on each machine, but there is no particular order
to follow. The open shop is not as well researched as the flow shop
and job shop [32].
1.3 Regular and Non-regular Performance Measure
of Machine Scheduling Problems
A measure of performance is said to be regular if it is a non-decreasing
function of the job completion times and the scheduling objective is to
minimize the performance measure. Examples of regular measures are job
flow time F, schedule makespan (Cmax), and tardiness-based performance
measures. A non-regular performance measure is usually not a monotone
function of the job completion times; an example of such a measure is job
earliness [23]. Sometimes the objective function contains more than one
criterion, one regular and the other non-regular, as in the multiple
objective function (∑Fi + Emax).
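A quick numeric check (with hypothetical numbers) makes the distinction concrete: delaying a job's completion never decreases a regular measure such as flow time, but it can decrease the non-regular earliness term:

```python
d = 10                                   # due date of a single job (hypothetical)

def flow(C, r=0):
    return C - r                         # flow time F = C - r (regular)

def earliness(C):
    return max(d - C, 0)                 # earliness E = max{d - C, 0} (non-regular)

# Completing later (C: 6 -> 8) does not decrease the flow time...
assert flow(8) >= flow(6)
# ...but it does decrease the earliness, so E (and hence E_max) is non-regular.
assert earliness(8) < earliness(6)
```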
1.4 Basic Scheduling Concepts
We start by introducing some important notation, concentrating on
the performance criteria without elaborating on the machine
environments. We assume that there are n jobs, denoted by J1, . . . , Jn,
which are to be scheduled on a set of machines that are continuously
available from time zero onwards and that can handle only one job at a
time.
We state here the notation used for the single machine; each job
Ji (i = 1, ..., n) has:
Preemption (pmtn) [58]: Preemption (or job-splitting) means that the
processing of a job may be interrupted at any time and resumed at a
later time, even on a different machine. The amount of processing
already done on the preempted job is not lost.
Processing time (pij)[48]: The processing time of job j on machine i.
The subscript i is omitted if job j is only to be processed on one given
machine or on m parallel machines.
Due date (dj) [44]: The date by which the job should ideally be
completed. Completion of a job after its due date is allowed, but a
penalty is incurred. When the due date absolutely must be met, it is
referred to as a deadline (d̄j), and when the due date is the same for
all jobs, it is called a common due date.
Release date (rj) [44]: Also known as the ready time; the point in
time at which job j arrives at the machine and thus becomes available
for processing.
Weight (Wj) [3]: Denotes the importance of job j relative to the
other jobs.
Now for a given sequence of jobs (1, 2, · · · , n) the following can be
computed for job j.
1) The completion time Cj = ∑_{i=1}^{j} pi.
2) The flow time Fj = Cj − rj.
3) The lateness Lj = Cj − dj.
4) The tardiness Tj = max{Cj − dj, 0}.
5) The earliness Ej = max{dj − Cj, 0}.
6) The unit penalty
Uj = 1 if Cj > dj, and Uj = 0 otherwise.
7) The idle time
Ij = rj if j = 1; Ij = rj − Cj−1 if rj > Cj−1 (j = 2, ..., n); and
Ij = 0 otherwise.
Let σ be a given sequence, the following performance criteria appear
frequently in the literature [29].
• Cmax(σ)=maxj∈σ{Cj}(maximum completion time or makespan).
• Emax(σ)=maxj∈σ{Ej}(maximum earliness).
• Lmax(σ)=maxj∈σ{Lj}(maximum lateness).
• Tmax(σ)=maxj∈σ{Tj}(maximum tardiness).
• ∑j∈σ WjCj (total weighted completion times).
• ∑j∈σ WjEj (total weighted earliness).
• ∑j∈σ WjTj (total weighted tardiness).
• ∑j∈σ WjUj (weighted number of tardy jobs).
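For a fixed sequence, the quantities above follow directly from the definitions. A minimal sketch, using hypothetical processing times, release dates, and due dates, and assuming the machine starts at time zero:

```python
# Hypothetical 3-job instance, listed in sequence (schedule) order.
p = [2, 3, 1]   # processing times p_j
r = [1, 0, 4]   # release dates r_j
d = [3, 7, 6]   # due dates d_j

C, E, T, I = [], [], [], []
t = 0  # current time on the single machine
for j in range(len(p)):
    I.append(max(r[j] - t, 0))        # idle time I_j before job j starts
    t = max(t, r[j]) + p[j]           # completion time C_j under release dates
    C.append(t)
    E.append(max(d[j] - t, 0))        # earliness E_j = max{d_j - C_j, 0}
    T.append(max(t - d[j], 0))        # tardiness T_j = max{C_j - d_j, 0}

E_max, T_max = max(E), max(T)         # maximum earliness / maximum tardiness
total_C = sum(C)                      # total completion time
```

For this instance the schedule idles one unit waiting for the first job's release, and the thesis objective ∑Ci + Emax + Tmax evaluates to total_C + E_max + T_max.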
1.5 The Classification Problem (α/β/γ)
In this thesis, we adopt the terminology of Graham et al. [26] to
classify scheduling problems. Suppose that m machines Mi (i = 1, . . . , m)
have to process n jobs Jj (j = 1, . . . , n). A scheduling problem type
can be specified using a three-field classification (α/β/γ) composed of
the machine environment, the job characteristics, and the optimality
criterion.
1.5.1 Machine Environment (α)
The first field α = α1α2 describes the machine environment, where the
parameter α1 ∈ {φ, P, Q, R, O, F, J} characterizes the type of machine
used [15]:
• α1 = φ: for single machine.
• α1 = P: for identical parallel machine problem.
• α1 = Q: for uniform parallel machine problem.
• α1 = R: for unrelated parallel machine problem.
• α1 = O: for open shop problem.
• α1 = F: for flow shop problem.
• α1 = J: for job shop problem.
α2 ∈ {φ,m}
• α2 = φ: the number of machines is assumed to be variable.
• α2 = m: the number of machines is equal to m (m≥ 2).
1.5.2 Job Characteristics (β)
The second field β ⊆ {pmtn, rj, βprec} indicates certain job characteristics [2].
• If pmtn is present, then preemption is allowed: the processing of any
job may be interrupted at no cost and resumed at a later time. Otherwise,
no preemption is allowed; once a job is started on a machine Mi,
the job occupies the machine until it is completed.
• If rj is present, then each job may have a different release date. Otherwise,
all jobs arrive at time 0.
• If a precedence constraint is present, then there is a precedence rela-
tion (≺) among the jobs, i.e. if Jj ≺ Jk, then Jj must be completed
before Jk can be started. If βprec = chain, then (≺) forms chains.
If βprec = tree, then (≺) forms a tree. If βprec = prec, then (≺) is
an arbitrary partial order. If βprec is not present, then jobs can be
processed in any order.
1.5.3 Optimality Criteria (γ)
The third field (γ) specifies the objective function or the optimality
criterion, the value we wish to optimize. The parameter γ ∈ {fmax,∑fj}
is defined as follows:
fmax ∈ {Cmax, Fmax, Lmax, Emax, Tmax}
or ∑fj ∈ {∑Cj, ∑WjCj, ∑Ej, ∑WjEj, ∑Tj, ∑WjTj, ∑Uj, ∑WjUj}
or fj equals the sum or weighted sum of two or more of these objective functions.
1.5.4 Examples of Scheduling Problems
i. 1/ri/∑ni=1Ci + Tmax: the problem of scheduling jobs with release
dates on a single machine to minimize the bi-criteria objective (the total
completion time and the maximum tardiness).
ii. F2/rj/∑nj=1Cj: the problem of scheduling jobs with release dates
on a two-machine flow shop to minimize the total completion time.
iii. 1/prec/∑ni=1WiCi: the problem of scheduling jobs with precedence
constraints on a single machine to minimize the total weighted
completion time.
1.6 Assumptions About Jobs [49]
J1: The set of jobs is fixed and known.
J2: All jobs are available at the same time and are independent. This
assumption is sometimes relaxed: each job j = 1, · · · , n is characterized
by a release date rj and a due date dj (all data being integer).
J3: Each job has the same degree of importance. This assumption is
sometimes relaxed (i.e. job j has a weight Wj representing the
importance of job j).
J4: Any operation started is not interrupted by other operations and
continues to its completion. This assumption is sometimes relaxed
(i.e. job splitting is allowed).
J5: Each job can be in one of three states:
i. Waiting for the next machine.
ii. Being operated on by a machine.
iii. Having passed its last machine.
J6: Each job can be processed by only one machine at a time.
1.7 Assumptions About Machines [49]
M1: The number of machines is known and fixed.
M2: All machines are available at the same instant and are independent
of each other.
M3: Each machine can be in one of three statuses:
i. Waiting for the next job.
ii. Operating on a job.
iii. Having finished its last job.
M4: All machines are equally important.
M5: Each machine has to process all jobs assigned to it.
M6: Each machine can process not more than one job at a time.
1.8 Sequence Rules for Machine Scheduling Problems
1) The EDD rule, that is, sequencing the jobs in non-decreasing order of
their due dates dj, which solves the problem 1//Lmax. This rule also
minimizes Tmax for the 1//Tmax problem [35].
2) The SPT rule, that is, sequencing the jobs in non-decreasing order of
their processing times pj. This rule solves the problem 1//∑nj=1Cj [52].
3) The MST rule, that is, sequencing the jobs in non-decreasing order
of their slack times sj = dj − pj. In a single machine environment with
ready times set at zero, this rule solves the 1//Emax problem [30].
4) The SRPT rule, that is, sequencing the jobs in non-decreasing order
of their remaining processing times (this rule is well known to solve the
1/rj, pmtn/∑nj=1Cj problem) [9].
5) The MEDD rule, that is, sequencing the jobs in non-decreasing order
of their remaining due dates (this rule is well known to solve the
1/rj, pmtn/Tmax problem) [10].
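The first three rules above are plain sorts of the job list. A minimal sketch on illustrative jobs (the dictionary-based job representation is an assumption for the example):

```python
# EDD, SPT and MST orderings as sorts of the job list (illustrative data).
jobs = [dict(p=4, d=10), dict(p=2, d=6), dict(p=3, d=12)]

EDD = sorted(jobs, key=lambda j: j['d'])            # non-decreasing due dates
SPT = sorted(jobs, key=lambda j: j['p'])            # non-decreasing processing times
MST = sorted(jobs, key=lambda j: j['d'] - j['p'])   # non-decreasing slack s_j = d_j - p_j
```

SRPT and MEDD are dynamic variants of SPT and EDD: the sort key (remaining processing time, respectively the due date among released jobs) is re-evaluated at every release or completion event.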
1.9 Problem Classes and Computational Complexity
Some problems are polynomially solvable, which means that there exists
an algorithm requiring a computing effort that can be shown to grow
as a polynomial in the size of the problem. Other problems can only be
solved by an enumerative algorithm, which in essence has an exponential
time complexity [12]. The problems that have known polynomial algorithms
are said to be well solved. Complexity theory is concerned
with decision problems instead of optimization problems. A decision
problem is a question to which the answer is either "yes" or "no"; here the
question is: does there exist a solution with an objective function value
less than or equal to a fixed value k? (The optimization problem is transformed
into a finite series of decision problems by varying the value k.)
Decision problems can be classified into three classes:
1) The class P (Polynomial), which contains all decision problems that
are polynomially solvable.
2) The class NP (Non-deterministic Polynomial), which contains the decision
problems for which it can easily be verified whether a given solution
yields a "yes" or "no" answer. It is clear that the class P is a subclass
of the class NP. The scheduling problems that will be considered in
this thesis can be solved by a non-deterministic algorithm and thus are
members of NP. Cook [19] proved that there are hardest problems in
NP; such problems are called NP-complete.
3) The class Open, which contains all decision problems that have so far
been proved to be neither in P nor NP-complete; for instance, the problem
1/dj = d/∑nj=1(WjEj + W′jTj) is open [15].
A problem P is NP-complete if the existence of a polynomial algorithm
for P implies the existence of a polynomial algorithm for every problem in
NP (i.e. P = NP). The location of the borderline separating the easy
problems (in P) from the hard ones (NP-complete) has been under wide
investigation by many researchers, and it turns out that a minor change in the
parameters of an easy problem often transforms this problem into
a hard one. An optimization problem is called NP-hard if the associated
decision problem is NP-complete. Since the computation time needed to
solve a scheduling problem is very important, the recent development in the
theory of computational complexity, as applied to machine scheduling
problems, has aroused the interest of many researchers.
1.10 Solution Approaches for Machine Scheduling
Problems
In this section the most well-known methods that have been used
to solve machine scheduling problems are discussed. Both enumerative
approaches (BAB and DP) and heuristic approaches are considered for
solving machine scheduling problems.
1.10.1 Complete Enumeration
Complete enumeration methods generate, one by one, all feasible schedules
and then pick the best one. For a single machine problem of n jobs
there are n! different sequences; hence for the corresponding m-machine
problem there are (n!)^m different sequences. This method may take
considerable time, as the number (n!)^m is very large even for
relatively small values of n and m [34].
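For small n this exhaustive search can be written directly. A sketch minimizing the total completion time on a single machine, on illustrative data:

```python
# Complete enumeration: try all n! sequences and keep the best total
# completion time (illustrative data; feasible only for small n).
from itertools import permutations

p = [4, 1, 3]  # processing times

def total_completion(seq):
    t, total = 0, 0
    for i in seq:
        t += p[i]       # completion time of job i
        total += t
    return total

best = min(permutations(range(len(p))), key=total_completion)
```

Already for n = 10 there are 3,628,800 sequences, which is why implicit enumeration (BAB) or dynamic programming is preferred.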
1.10.2 Branch and Bound Methods
BAB methods can be used for solving many combinatorial optimization
problems. These methods are examples of the implicit enumeration approach,
which finds an optimal solution by examining subsets of the feasible
solutions. These procedures can be conveniently represented as a search
tree. Each node of the search tree corresponds to a subset of feasible solutions
to a problem. A branching rule specifies how the feasible solutions
at a node are partitioned into subsets, each corresponding to a descendant
node of the search tree.
The scheduling problems that we consider require an objective function
to be minimized. A lower bound scheme associates a lower bound (LB) with each
node of the search tree. The idea is to eliminate any node for which the
lower bound is greater than or equal to the value of the best known
feasible solution.
The branching rule describes how the feasible solutions at a node are
partitioned into subsets. There are several types of branching rules for scheduling
problems, the most common of which are forward branching and backward
branching. In a forward branching rule jobs are sequenced one by one
from the beginning, while in a backward branching rule the jobs are
sequenced one by one from the end.
The bounding rule calculates the LB on the optimal solution for each
subproblem generated by the branching rule. The well-known methods of
obtaining lower bounds for machine scheduling problems are:
1. Relaxation of constraints.
2. Relaxation of objectives.
3. Lagrangian relaxation.
4. Dynamic programming state-space relaxation.
To minimize the objective function of a particular scheduling problem,
an upper bound (UB) on the minimum of this objective function is needed first.
This UB is the value of a trial solution. At the beginning, the trial solution
may be found using a heuristic procedure.
Finally, the branch and bound method can be improved by applying
dominance rules that discard nodes before computing their lower bounds.
These dominance rules are computationally useful as they reduce storage
requirements on the computer as well as computation time [16].
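The interplay of branching, bounding and pruning can be sketched for the simpler single-criterion problem 1//∑Ci. The forward-branching scheme and the bound below (cost of the scheduled prefix plus an SPT-ordered completion of the remainder, each remaining job charged as if it could start immediately) are illustrative choices for this sketch, not the thesis's BAB for problem (S):

```python
# Minimal forward-branching BAB sketch for minimizing total completion time
# on one machine (illustrative data and illustrative lower bound).
p = [4, 1, 3, 2]

best = {'cost': float('inf'), 'seq': None}

def lower_bound(t, cost, remaining):
    # Optimistic completion: schedule the remaining jobs in SPT order from time t.
    lb = cost
    for q in sorted(p[i] for i in remaining):
        t += q
        lb += t
    return lb

def branch(seq, t, cost, remaining):
    if not remaining:                      # leaf: a complete sequence
        if cost < best['cost']:
            best['cost'], best['seq'] = cost, seq
        return
    for i in remaining:                    # forward branching: fix the next job
        nt = t + p[i]
        if lower_bound(nt, cost + nt, remaining - {i}) < best['cost']:
            branch(seq + [i], nt, cost + nt, remaining - {i})
        # else: node pruned, its LB cannot beat the incumbent UB

branch([], 0, 0, set(range(len(p))))
```

The incumbent `best['cost']` plays the role of the UB; a node survives only if its LB is strictly below it.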
1.10.3 Dynamic Programming Method
Fundamentals of dynamic programming were elaborated by Bellman
in the 1950s [11]. This method was used for machine scheduling problems
for the first time by Held and Karp [27]. The name Dynamic Programming
is slightly misleading, but generally accepted. A better description
would be recursive or multistage optimization, since it interprets an
optimization problem as a multistage decision problem. This means that the
problem is divided into a number of stages, and at each stage a decision
is required which impacts the decisions to be made at later stages [24].
If dynamic programming is applied to a combinatorial problem, then in
order to calculate the optimal criterion value for any subset of size K,
the optimal value for each subset of size K − 1 first has to be known.
Thus, if our problem is characterized by a set of n elements, the number
of subsets considered is 2^n. This means that dynamic programming algorithms
are of exponential computational complexity. More precisely, the dynamic
programming method starts with an initial subproblem, which is easy to
solve. At each iteration, it determines the optimal solution for a subproblem
which is larger than all previously solved subproblems by utilizing all
the information about the solutions of these subproblems. This continues
until the original problem is solved.
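A subset-based DP sketch for minimizing ∑Ci on one machine (illustrative; the state f[S] is the optimal total completion time of the jobs in S, built from subsets one job smaller):

```python
# DP over subsets (Held-Karp style): the optimal value of each subset is
# computed from subsets of size K-1 (illustrative data).
from itertools import combinations

p = [4, 1, 3]
n = len(p)
f = {frozenset(): 0}  # f[S] = min total completion time scheduling exactly S

for k in range(1, n + 1):
    for S in map(frozenset, combinations(range(n), k)):
        mks = sum(p[i] for i in S)  # makespan of S is order-independent here
        # choose the job j scheduled last in S: it completes at the makespan of S
        f[S] = min(f[S - {j}] + mks for j in S)

opt = f[frozenset(range(n))]
```

The table holds 2^n entries, one per subset, which is exactly the exponential growth discussed above.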
1.10.4 Heuristic Methods
It is clear that, to solve a scheduling problem, one tends to use implicit
enumerative approaches to find an optimal solution. However, these
approaches have two disadvantages. Firstly, they are mathematically complex
and thus a lot of time has to be invested. Secondly, when it concerns an
NP-hard problem, the computational requirements are enormous for large-sized
problems. To avoid these drawbacks one can appeal to heuristics. Reeves
[50] defined the heuristic method as follows:
A heuristic is a technique which seeks good (i.e. near-optimal) solutions
at a reasonable computational cost without being able to guarantee either
feasibility or optimality, or even, in many cases, to state how close to
optimality a particular feasible solution is.
In recent years, improvements in heuristic methods have been developed
under the name of Local Search Methods.
![Page 33: Minimizing Three Simultaneous Criteria in Machine ...utq.edu.iq/Final2/111.pdf · Minimizing Three Simultaneous Criteria in ... K. Al-Zuwaini and Dr. Kadhim M. Al-Mousawi for their](https://reader034.vdocuments.us/reader034/viewer/2022051523/5a7361667f8b9abb538e9024/html5/thumbnails/33.jpg)
Chapter 2
Branch and Bound Method to
Minimize Three Criteria
2.1 Introduction
In general, multi-criteria scheduling refers to the scheduling problem
in which the advantages of a particular schedule are evaluated using more
than one performance criterion. The managerial relevance of considering
multiple criteria for scheduling has been cited in the production and
operations management literature since the 1950s. Smith (1956)[52] shows that
the choice of a criterion will affect the characteristics of a ”best schedule”;
different optimizing criteria will result in very different schedules. Van
Wassenhove and Gelders (1980)[57] provide evidence that a schedule that
performs well using a certain criterion might yield a poor result using
other criteria. Hence, lack of consideration of various criteria may lead to
solutions that are very difficult to implement in practice. Although the
importance of multi-criteria scheduling has been recognized for many years
(French, 1982[23]; Nelson et al., 1986[47]; and George and Paul, 2007[25]),
little attention has been given in the literature to this topic. From the
problem complexity perspective, the multiple-criteria problem becomes
much more complex than its related single-criterion counterparts (Lenstra et
al., 1979[40]). Nagar et al. (1995)[46] review the problem in its general
form, whereas Lee and Vairaktarakis (1993)[41] review a special version
of the problem, where one criterion is set to its best possible value and the
other criterion is optimized under this restriction. Hoogeveen
(2005)[28] studies a number of bi-criteria scheduling problems. There are
also some other papers on this subject (Cheng et al., 2008[14]; [25]; and
Azizoglu et al., 2003[5]).
In this chapter the problem of scheduling n independent jobs on a
single machine is considered, minimizing a multiple objective function (MOF):
the sum of the total completion time, the maximum earliness and the maximum
tardiness, by using the BAB method. This problem is denoted by
1/ri/∑ni=1Ci + Emax + Tmax.
2.2 Problem Formulation
Single machine scheduling models seem very simple but are very
important for understanding and modeling multiple-machine models. A
set N = {1, 2, · · · , n} of n independent jobs has to be scheduled on a single
machine in order to optimize a given criterion. This study concerns
the one-machine scheduling problem with a multiple objective function,
denoted by 1/ri/∑ni=1Ci + Emax + Tmax. In this problem, preemption is
not allowed, no precedence relation among jobs is assumed, and only one job
can be processed at a time. Each job i has a release date ri before which
it cannot be processed, needs pi time units to be processed on the
machine, and ideally should be completed at its due date di. For each
job i the slack time si = di − pi can be calculated. The objective is to find
a schedule that minimizes the sum of the total completion time (∑ni=1Ci),
the maximum earliness (Emax) and the maximum tardiness (Tmax). The problem
1/ri/∑ni=1Ci + Emax + Tmax can be stated as follows:
A set of n independent jobs N = {1, 2, · · · , n} is given; job i
(i = 1, 2, · · · , n) is available for processing at time ri, is to be processed
without interruption on a single machine that can handle only one job at a
time, requires processing time pi, and ideally should be completed at its due
date di. For a given sequence π of the jobs, the completion time Cπ(i), the
earliness Eπ(i), and the tardiness Tπ(i) of job i are given by:

Cπ(1) = rπ(1) + pπ(1)
Cπ(i) = max{Cπ(i−1), rπ(i)} + pπ(i) ,  i = 2, · · · , n      (2.1)
Eπ(i) = max{dπ(i) − Cπ(i), 0} ,  i = 1, · · · , n            (2.2)
Tπ(i) = max{Cπ(i) − dπ(i), 0} ,  i = 1, · · · , n            (2.3)
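Recurrences (2.1)–(2.3) translate directly into a single pass over the sequence. A sketch on illustrative data:

```python
# Direct implementation of recurrences (2.1)-(2.3) for a given sequence pi
# (illustrative data: r = release dates, p = processing times, d = due dates).
r = [0, 2, 3]
p = [3, 2, 4]
d = [5, 6, 12]
pi = [0, 1, 2]  # the sequence under evaluation

C, E, T = [], [], []
t = 0
for i in pi:
    t = max(t, r[i]) + p[i]          # (2.1): wait for release, then process
    C.append(t)
    E.append(max(d[i] - t, 0))       # (2.2): earliness
    T.append(max(t - d[i], 0))       # (2.3): tardiness

cost = sum(C) + max(E) + max(T)      # objective value of problem (S) for pi
```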
The problem is strongly NP-hard because the problems 1/ri/Tmax [48]
and 1/ri/∑ni=1Ci [38] are strongly NP-hard and the problem
1//∑ni=1Ci + Emax is NP-hard ([1, 3, 37]).
The aim is to find a sequence π that minimizes the total cost
R = ∑ni=1Cπ(i) + Emax(π) + Tmax(π). The mathematical form of this problem,
denoted by (S), can be stated as follows:
min R = minπ∈δ {∑ni=1Cπ(i) + Emax(π) + Tmax(π)}
s.t.
Cπ(i) ≥ rπ(i) + pπ(i)      i = 1, 2, ..., n
Cπ(i) ≥ Cπ(i−1) + pπ(i)    i = 2, 3, ..., n
Eπ(i) ≥ dπ(i) − Cπ(i)      i = 1, 2, ..., n
Tπ(i) ≥ Cπ(i) − dπ(i)      i = 1, 2, ..., n
Cπ(i) > 0, Eπ(i) ≥ 0, Tπ(i) ≥ 0, rπ(i) ≥ 0, pπ(i) > 0      i = 1, 2, ..., n
. . . (S)
where π(i) denotes the job in position i of the ordering π and δ denotes
the set of all enumerated schedules.
2.3 Decomposition of Problem (S)
In this section, the problem (S) is decomposed into three subproblems
with a simple structure, and some results are stated which help in solving
the problem (S).
The problem (S) can be decomposed into three subproblems, say (SA1),
(SA2) and (SA3), where:
N1 = minπ∈δ {∑ni=1Cπ(i)}
s.t.
Cπ(i) ≥ rπ(i) + pπ(i)      i = 1, 2, ..., n
Cπ(i) ≥ Cπ(i−1) + pπ(i)    i = 2, 3, ..., n
rπ(i) ≥ 0, pπ(i) > 0       i = 1, 2, ..., n
. . . (SA1)
N2 = minπ∈δ {Emax(π)}
s.t.
Cπ(i) ≥ rπ(i) + pπ(i)      i = 1, 2, ..., n
Cπ(i) ≥ Cπ(i−1) + pπ(i)    i = 2, 3, ..., n
Eπ(i) ≥ dπ(i) − Cπ(i)      i = 1, 2, ..., n
Eπ(i) ≥ 0, rπ(i) ≥ 0, pπ(i) > 0      i = 1, 2, ..., n
. . . (SA2)

N3 = minπ∈δ {Tmax(π)}
s.t.
Cπ(i) ≥ rπ(i) + pπ(i)      i = 1, 2, ..., n
Cπ(i) ≥ Cπ(i−1) + pπ(i)    i = 2, 3, ..., n
Tπ(i) ≥ Cπ(i) − dπ(i)      i = 1, 2, ..., n
Tπ(i) ≥ 0, rπ(i) ≥ 0, pπ(i) > 0      i = 1, 2, ..., n
. . . (SA3)
Theorem (2.1)[4]
N1 + N2 + N3 ≤ R, where N1, N2, N3 and R are the minimum objective
function values of (SA1), (SA2), (SA3) and (S) respectively.
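The theorem can be checked by brute force on a small illustrative instance, computing N1, N2, N3 and R over all 3! sequences (the data below are an assumption for the example):

```python
# Brute-force check of Theorem (2.1): N1 + N2 + N3 <= R, where N1, N2, N3
# minimize sum C, Emax, Tmax separately and R minimizes their sum.
from itertools import permutations

r = [0, 1, 2]   # release dates (illustrative)
p = [3, 2, 2]   # processing times
d = [4, 8, 6]   # due dates

def evaluate(seq):
    t, C, E, T = 0, [], [], []
    for i in seq:
        t = max(t, r[i]) + p[i]
        C.append(t)
        E.append(max(d[i] - t, 0))
        T.append(max(t - d[i], 0))
    return sum(C), max(E), max(T)

seqs = list(permutations(range(3)))
N1 = min(evaluate(s)[0] for s in seqs)   # optimum of (SA1)
N2 = min(evaluate(s)[1] for s in seqs)   # optimum of (SA2)
N3 = min(evaluate(s)[2] for s in seqs)   # optimum of (SA3)
R  = min(sum(evaluate(s)) for s in seqs) # optimum of (S)
```

On this instance the bound happens to be tight: N1 + N2 + N3 = 16 = R, which is why N1 + N2 + N3 is useful as a lower bound in a BAB scheme.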
2.4 Special Cases
A machine scheduling problem of NP-hard type is not easily solvable,
and it is more difficult when the objective function is a multiple objective.
Mathematical programming techniques, such as dynamic programming and the
branch and bound method, are used to find the optimal solution for this kind
of problem. Sometimes special cases of this problem can be solved.
A special case of a scheduling problem means finding an optimal schedule
directly without using mathematical programming techniques. A special
case, if it exists, depends on satisfying some conditions that make
the problem easily solvable [33]. These conditions depend on the objective
function as well as on the jobs. In this section some special cases of problem
(S) are given.
Case(1): The SRD schedule gives an optimal solution for problem (S) if
pi = p and di = ip for all i in SRD.
Proof:
Since di = ip ∀i in SRD, then Emax = 0 and Tmax = ∑ni=1 Ii. Hence
the problem (S) reduces to 1/ri, pi = p/∑ni=1Ci + Tmax.
Now, since pi = p for all i in SRD, ∑ni=1Ci = ∑ni=1∑ij=1 Ij + ((n^2+n)/2)p.
But ((n^2+n)/2)p is constant, so ∑ni=1Ci ≡ ∑ni=1∑ij=1 Ij (i.e. a
schedule that is optimal with respect to ∑ni=1Ci is also optimal
with respect to ∑ni=1∑ij=1 Ij). But ∑ni=1∑ij=1 Ij ≡ Cmax [51].
Carlier (1982)[13] showed that the SRD schedule is an optimal schedule for Cmax.
Hence the SRD rule gives an optimal solution for problem (S). □
Case(2): If p1 ≤ p2 ≤ · · · ≤ pn, r1 ≤ r2 ≤ · · · ≤ rn and Ci = di ∀i
in the SPT schedule, then SPT gives an optimal solution for problem (S).
Proof:
Since Ci = di ∀i in SPT, then Emax = Tmax = 0. The problem (S)
reduces to 1/ri/∑ni=1Ci, which under these conditions is solved by the
SPT rule [2]. □
Case(3): If Ci = di ∀i in a schedule π and preemption is allowed,
then π gives an optimal solution for the problem
1/ri, pmtn/∑ni=1Ci + Emax + Tmax.
Proof:
Since Emax = Tmax = 0 in π, the problem (S) reduces to
1/ri, pmtn/∑ni=1Ci, and this problem is solved by the SRPT rule [9]. Then π
gives an optimal solution for the problem 1/ri, pmtn/∑ni=1Ci + Emax + Tmax,
provided that Ci = di ∀i ∈ π. □
Case(4): If ri = r ∀i and the SPT schedule satisfies Just-In-Time (JIT),
then SPT gives an optimal solution for the problem
1/ri = r/∑ni=1Ci + Emax + Tmax.
Proof:
From (JIT) we get Emax = Tmax = 0, so the problem (S) reduces
to 1/ri = r/∑ni=1Ci. But this problem is solved by the SPT rule. Then
SPT gives an optimal solution for the problem
1/ri = r/∑ni=1Ci + Emax + Tmax. □
Case(5): Any schedule gives an optimal solution for problem (S)
if ri = r, pi = p and di = d ∀i = 1, 2, · · · , n.
Proof:
In any schedule, ∑ni=1Ci = nr + ((n^2+n)/2)p, Emax = max{d − (r + p), 0}
and Tmax = max{(r + np) − d, 0}. Then any schedule is optimal for the
problem 1/ri = r, pi = p, di = d/∑ni=1Ci + Emax + Tmax (because the three
quantities are constants). □
Case(6): For problem (S), if ri = r, pi = p ∀i, then:
i. If Emax(EDD) = Emax(MST), then the EDD schedule gives the optimal
solution.
ii. If Tmax(MST) = Tmax(EDD), then the MST schedule gives the optimal
solution.
Proof (i):
Any sequence gives an optimal solution for the 1/ri = r, pi = p/∑ni=1Ci
problem, the EDD rule gives an optimal solution for the 1/ri = r, pi = p/Tmax
problem, and the MST rule gives an optimal solution for the
1/ri = r, pi = p/Emax problem. But Emax(EDD) = Emax(MST). So EDD
gives an optimal solution for 1/ri = r, pi = p/∑ni=1Ci + Emax + Tmax,
provided that Emax(EDD) = Emax(MST). □
Proof (ii):
Any sequence gives an optimal solution for the 1/ri = r, pi = p/∑ni=1Ci
problem, the MST rule gives an optimal solution for the 1/ri = r, pi = p/Emax
problem, and the EDD rule gives an optimal solution for the
1/ri = r, pi = p/Tmax problem. But Tmax(MST) = Tmax(EDD). So MST
gives an optimal solution for 1/ri = r, pi = p/∑ni=1Ci + Emax + Tmax,
provided that Tmax(MST) = Tmax(EDD). □
Case(7): For problem (S), if ri = r and di = d ∀i = 1, 2, · · · , n, then:
i. If ∑ni=1Ci(MST) = ∑ni=1Ci(SPT), then the MST schedule is the optimal
solution.
ii. If Emax(SPT) = Emax(MST), then the SPT schedule is the optimal
solution.
Proof (i):
From the conditions ri = r and di = d ∀i = 1, 2, · · · , n, any order gives an
optimal solution for the problem 1/ri = r, di = d/Tmax. Now, since
∑ni=1Ci(MST) = ∑ni=1Ci(SPT), MST minimizes the problem
1/ri = r, di = d/∑ni=1Ci + Emax + Tmax. □
Proof (ii):
From the conditions ri = r and di = d ∀i = 1, 2, · · · , n, any order gives an
optimal solution for the problem 1/ri = r, di = d/Tmax. Now, since
Emax(SPT) = Emax(MST), SPT minimizes the problem
1/ri = r, di = d/∑ni=1Ci + Emax + Tmax. □
Case(8): If ri = r, the SPT schedule gives di + pj ≤ dj ∀ i ≺ j, and
Tmax(SPT) = Tmax(EDD), then SPT is an optimal solution for problem (S).
Proof:
Since di + pj ≤ dj ∀ i ≺ j in the SPT schedule, then di − pi ≤ dj − pj
∀ i ≺ j (pi > 0). Thus SPT gives an optimal solution for both criteria Emax
and ∑ni=1Ci, and from the condition Tmax(SPT) = Tmax(EDD), SPT is an
optimal solution for problem (S). □
Case(9): For problem (S), if ri = r and di = d ∀i = 1, 2, · · · , n, then any
schedule σ = (1, 2, · · · , n) with C1 < d < Cn, where C1 is Cmin and Cn is
Cmax, is optimal, with the value of ∑ni=1Ci + Emax + Tmax equal to
∑ni=1Ci(σ) + Cn − C1.
Proof:
It is clear that Emax + Tmax is minimum if C1 < d < Cn, with value
Cn − C1, by considering the three possible positions for d:
1. If C1 < d < Cn ⇒ Emax + Tmax = d − C1 + Cn − d = Cn − C1.
2. If d < C1 < Cn ⇒ Emax + Tmax = 0 + Cn − d > Cn − C1.
3. If C1 < · · · < Cn < d ⇒ Emax + Tmax = d − C1 + 0 > Cn − C1.
Also, it is clear that the SPT schedule σ is optimal for ∑ni=1Ci(σ), as
required. □
2.5 Dominance Rule
Because of the branching scheme, the size of the search tree is directly
linked to the length of the current sequence (which determines the number
of nodes). Hence, a preprocessing step is performed in order to remove
as many positions as possible. Reducing the current sequence is done by
using several dominance rules. Dominance rules usually specify whether
a node can be eliminated before its lower bound is calculated. Clearly,
dominance rules are particularly useful when they eliminate a node
whose lower bound is less than the optimal solution value. Some
dominance rules are valid for the minimization of the sum of total completion
time, maximum earliness, and maximum tardiness.
As in the preprocessing step, similar dominance rules are also used
within the branch and bound procedure to cut nodes that are dominated
by others. These improvements lead to a very large decrease in the number
of nodes needed to obtain the optimal solution [33].
Below, three dominance rules are stated in order to decrease the
number of nodes in the search tree as well as the computation time.
Dominance Rule(1): Let δk be a partial sequence whose jobs are
scheduled, K ⊂ N. For i, j ∈ K̄ = N − K, let τ be the completion time
of the last job in δk. If pi ≤ pj, di ≤ dj, Si ≤ Sj, and τ > max{ri, rj}, then
job i precedes job j in the optimal solution for problem (S).
Proof:
Let (δk, j, i) be the schedule which is obtained by interchanging jobs i
and j in (δk, i, j). All jobs other than i and j have the same completion
times in (δk, i, j) as in (δk, j, i). So the difference in completion time between
(δk, i, j) and (δk, j, i) depends only on the completion times of jobs i
and j.
The total completion time of jobs i and j in (δk, i, j) is:
Ci + Cj = 2τ + 2pi + pj      (2.4)
The total completion time of jobs i and j in (δk, j, i) is:
C′i + C′j = 2τ + 2pj + pi      (2.5)
From (2.4) and (2.5), we get ∑i∈(δk,i,j)Ci − ∑i∈(δk,j,i)C′i = pi − pj ≤ 0.
Then
∑i∈(δk,i,j)Ci ≤ ∑i∈(δk,j,i)C′i      (2.6)
The maximum earliness of (δk, j, i) is:
    E′max = max{Ec, E′j, E′i}, where Ec = max_{d∈δk}{Ed},
    E′j = max{dj − C′j, 0},
    E′i = max{di − C′i, 0}
Since C′j ≤ C′i and di ≤ dj, then E′i ≤ E′j. So E′max = max{Ec, E′j}.
The maximum earliness in (δk, i, j) is:
    Emax = max{Ec, Ei, Ej}, where Ei = max{di − Ci, 0},
    Ej = max{dj − Cj, 0}
But Ei ≤ E′j because Si ≤ Sj, and since C′j ≤ Cj, then Ej ≤ E′j.
Therefore Emax = max{Ec, Ei, Ej} ≤ max{Ec, E′j} = E′max.
Then Emax ≤ E′max    (2.7)
For schedule (δk, i, j), the maximum tardiness is:
    Tmax = max{Tm, Tj, Ti}, where Tm = max_{s∈δk}{Ts},
    Tj = max{Cj − dj, 0}, and
    Ti = max{Ci − di, 0}
For schedule (δk, j, i), the maximum tardiness is:
    T′max = max{Tm, T′j, T′i}, where
    T′j = max{C′j − dj, 0} and
    T′i = max{C′i − di, 0}
Since C′j ≤ C′i and di ≤ dj, then T′i ≥ T′j. So T′max = max{Tm, T′i}.
Since Ci < C′i, we have Ti ≤ T′i. But T′i ≥ Tj because Cj − dj ≤ C′i − di.
Therefore T′max = max{Tm, T′i} ≥ max{Tm, Tj, Ti} = Tmax.
Then T′max ≥ Tmax    (2.8)
From (2.6), (2.7), and (2.8) we get
    ∑_{i∈(δk,i,j)} Ci + Emax + Tmax ≤ ∑_{i∈(δk,j,i)} C′i + E′max + T′max. □
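In code, the test of Dominance Rule (1) at a node is a direct translation of its four conditions. A minimal sketch (the function name and the dictionary layout for pi, ri, di are illustrative assumptions; the slack is Si = di − pi):

```python
def rule1_prefers(i, j, p, r, d, tau):
    """Dominance Rule (1) test: True when job i may be sequenced before
    job j after a partial schedule ending at time tau.  p, r, d map a
    job to its processing time, release date and due date."""
    si, sj = d[i] - p[i], d[j] - p[j]   # slacks Si, Sj
    return (p[i] <= p[j] and d[i] <= d[j]
            and si <= sj and tau > max(r[i], r[j]))
```

On jobs 1 and 2 of Example (2.1) with τ = 8 the rule holds, so nodes that place job 2 immediately before job 1 can be cut.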
Dominance Rule (2): Let δk be a partial sequence whose jobs are scheduled, K ⊂ N. For i, j ∈ K̄ = N \ K, let τ be the completion time of the last job in δk. If pi ≤ pj, ri ≤ rj ≤ τ and di ≤ dj ≤ τ, then job i precedes job j in an optimal solution of problem (S).
Proof:
Let (δk, j, i) be the schedule obtained by interchanging jobs i and j in (δk, i, j). The earliness of jobs i and j in both (δk, i, j) and (δk, j, i) is equal to zero, since τ ≥ dj.
From (2.4) and (2.5) we get
    ∑_{i∈(δk,i,j)} Ci ≤ ∑_{i∈(δk,j,i)} C′i    (2.9)
For schedule (δk, i, j), the maximum tardiness is:
    Tmax = max{Tm, Tj, Ti}, where Tm = max_{s∈δk}{Ts},
    Tj = max{Cj − dj, 0}, and
    Ti = max{Ci − di, 0}
For schedule (δk, j, i), the maximum tardiness is:
    T′max = max{Tm, T′j, T′i}, where
    T′j = max{C′j − dj, 0} and
    T′i = max{C′i − di, 0}
Since C′j < C′i and di ≤ dj, then T′i ≥ T′j. So T′max = max{Tm, T′i}.
Since Ci < C′i, we have Ti ≤ T′i. But T′i ≥ Tj since di ≤ dj.
Therefore T′max = max{Tm, T′i} ≥ max{Tm, Tj, Ti} = Tmax.
Then T′max ≥ Tmax    (2.10)
From (2.9) and (2.10) we get
    ∑_{i∈(δk,i,j)} Ci + Emax + Tmax ≤ ∑_{i∈(δk,j,i)} C′i + E′max + T′max. □
Dominance Rule (3): Let δk be a partial sequence whose jobs are scheduled, K ⊂ N. For i, j ∈ K̄ = N \ K, if pi ≤ pj, Si ≤ Sj, jobs j and i are early in (δk, i, j) and (δk, j, i) respectively, and τ > max{ri, rj}, then job i precedes job j in an optimal solution of problem (S).
Proof:
The tardiness of jobs i and j in (δk, i, j) and (δk, j, i) is equal to zero, because jobs j and i are early. Also, since pi ≤ pj, we have
    ∑_{i∈(δk,i,j)} Ci ≤ ∑_{i∈(δk,j,i)} C′i.    (2.11)
The maximum earliness in (δk, i, j) is:
    Emax = max{E, Ei, Ej}, where E = max_{s∈δk}{Es},
    Ei = max{Si − τ, 0},
    Ej = max{Sj − Ci, 0}
The maximum earliness in (δk, j, i) is:
    E′max = max{E, E′j, E′i}, where E′j = max{Sj − τ, 0},
    E′i = max{Si − C′j, 0}
Since Si ≤ Sj, then E′j ≥ E′i. So E′max = max{E, E′j}, and E′j ≥ Ei. Also E′j ≥ Ej (since Ci > τ).
Therefore Emax = max{E, Ei, Ej} ≤ max{E, E′j} = E′max.
Then Emax ≤ E′max    (2.12)
From (2.11) and (2.12) we get
    ∑_{i∈(δk,i,j)} Ci + Emax + Tmax ≤ ∑_{i∈(δk,j,i)} C′i + E′max + T′max. □
Example (2.1): The dominance rules are illustrated on a five-job scheduling problem:
i 1 2 3 4 5
ri 0 2 5 3 8
pi 3 5 6 2 1
di 6 9 11 5 4
Solution:
Figure 2.1: BAB method without dominance rule (root node: ILB = 51, UB = 63)
Figure 2.2: BAB method with dominance rule (root node: ILB = 51, UB = 63)
The optimal schedule is (1, 4, 2, 5, 3) with ∑_{i=1}^{5} Ci + Emax + Tmax = 56.
2.6 Upper Bound
In this section, four heuristic methods are used for ordering the jobs
and evaluating the cost of problem (S).
Heuristic (1): Order the jobs according to the SPT rule, and find UB1 = (∑_{i=1}^{n} Ci + Emax + Tmax)(SPT).
Heuristic (2): Order the jobs according to the MST rule, and find UB2 = (∑_{i=1}^{n} Ci + Emax + Tmax)(MST).
Heuristic (3): Order the jobs according to the EDD rule, and find UB3 = (∑_{i=1}^{n} Ci + Emax + Tmax)(EDD).
Heuristic (4): Order the jobs according to the SRD rule, and find UB4 = (∑_{i=1}^{n} Ci + Emax + Tmax)(SRD).
The heuristic which gives the minimum cost of problem (S) among these four is chosen as the upper bound, i.e. UB = min{UB1, UB2, UB3, UB4}. This UB is then used at the root node of the search tree in the branch and bound method.
Example (2.2): The upper bound is illustrated on a four-job scheduling problem:
i 1 2 3 4
ri 0 3 3 5
pi 4 2 6 5
di 8 12 11 10
Solution:
The SPT schedule is (2, 1, 4, 3); then UB1 = ∑_{i=1}^{4} Ci + Emax + Tmax = 64.
The MST schedule is (1, 3, 4, 2); then UB2 = ∑_{i=1}^{4} Ci + Emax + Tmax = 55.
The EDD schedule is (1, 4, 3, 2); then UB3 = ∑_{i=1}^{4} Ci + Emax + Tmax = 58.
The SRD schedule is (1, 2, 3, 4); then UB4 = ∑_{i=1}^{4} Ci + Emax + Tmax = 52.
Hence UB = min{UB1, UB2, UB3, UB4} = 52.
It should be noted that an optimal sequence for this example is (1, 2, 4, 3), with optimal value 50, which is obtained by complete enumeration.
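The four heuristics amount to sorting the jobs by pj (SPT), by the slack dj − pj (MST), by dj (EDD) and by rj (SRD), then evaluating the common cost. A minimal sketch (function names and the dictionary layout are illustrative assumptions; completion times respect the release dates, Cj = max(t, rj) + pj):

```python
def cost(seq, r, p, d):
    """Evaluate sum(Ci) + Emax + Tmax of a job order on one machine
    with release dates: each job starts at max(machine free, rj)."""
    t = total = emax = tmax = 0
    for j in seq:
        t = max(t, r[j]) + p[j]        # completion time Cj
        total += t
        emax = max(emax, d[j] - t)     # Ej = max(dj - Cj, 0)
        tmax = max(tmax, t - d[j])     # Tj = max(Cj - dj, 0)
    return total + emax + tmax

def upper_bound(r, p, d):
    """UB = min{UB1..UB4} over the SPT, MST, EDD and SRD orderings."""
    jobs = sorted(p)
    orders = [sorted(jobs, key=lambda j: p[j]),          # SPT
              sorted(jobs, key=lambda j: d[j] - p[j]),   # MST (slack)
              sorted(jobs, key=lambda j: d[j]),          # EDD
              sorted(jobs, key=lambda j: r[j])]          # SRD
    return min(cost(s, r, p, d) for s in orders)
```

On the data of Example (2.2) this reproduces UB1, …, UB4 = 64, 55, 58, 52, and hence UB = 52.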
2.7 Lower Bound
Deriving a lower bound for problem (S), which has a multiple-objective function, is very difficult since it is not easy to find the minimum cost for the three objectives simultaneously. Since problem (S) is strongly NP-hard, one may find a lower bound that gives the minimum value for one of the objectives but not for all of them. In this section two lower bounds, LB1 and LB2, are derived for problem (S).
2.7.1 The First Lower Bound
The first lower bound is based on decomposing (S) into three sub-
problems (SA1), (SA2) and (SA3) as shown in Section (2.3), then N1 was
calculated to be the lower bound for (SA1) by SRPT rule [9], N2 was
calculated to be the lower bound for (SA2) by MRST rule [10], N3 was
calculated to be the lower bound for (SA3) by MEDD rule [10] and then
applying Theorem(2.1) to get the first lower bound for problem (S).
Example (2.3): The first lower bound is illustrated on a four-job scheduling problem:
i 1 2 3 4
ri 0 2 12 8
pi 6 3 7 9
di 8 5 9 11
Solution:
For the relaxed problem 1/ri, pmtn/∑_{i=1}^{n} Ci, the SRPT rule is shown in Figure 2.3.
Figure 2.3: (SRPT) rule
C1 = 9, C2 = 5, C3 = 19, C4 = 25. So N1 = ∑_{i=1}^{4} Ci = 58.
For the relaxed problem 1/ri/Emax, we assume that r = max_{1≤i≤4}{ri} and ri = r, i = 1, …, 4; then the problem reduces to 1/ri = r/Emax, which is solved by the MST rule.
C1 = 37, C2 = 15, C3 = 22, C4 = 31
E1 = 0, E2 = 0, E3 = 0, E4 = 0. So N2 = Emax = 0.
For the relaxed problem 1/ri, pmtn/Tmax, the MEDD rule is shown in Figure 2.4.
Figure 2.4: (MEDD) rule
C1 = 9, C2 = 5, C3 = 19, C4 = 25
T1 = 1, T2 = 0, T3 = 10, T4 = 14. So N3 = Tmax = 14.
Then LB1 = N1 + N2 + N3 = 58 + 0 + 14 = 72.
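The SRPT rule for the relaxed problem 1/ri, pmtn/∑Ci can be sketched event-wise: at every instant the machine runs the released job with the shortest remaining processing time, preempting only at a release. A minimal sketch under that convention (function name and data layout are illustrative assumptions):

```python
import heapq

def srpt_sum_completion(r, p):
    """N1 for 1/ri,pmtn/sum(Ci): always run the released job with the
    shortest remaining processing time, preempting at release dates."""
    releases = sorted((r[j], j) for j in r)       # (release date, job)
    ready, t, total, k, n = [], 0, 0, 0, len(releases)
    while k < n or ready:
        if not ready:                             # idle until next release
            t = max(t, releases[k][0])
        while k < n and releases[k][0] <= t:      # admit released jobs
            heapq.heappush(ready, (p[releases[k][1]], releases[k][1]))
            k += 1
        rem, j = heapq.heappop(ready)             # shortest remaining first
        nxt = releases[k][0] if k < n else float("inf")
        if t + rem <= nxt:                        # j finishes before release
            t += rem
            total += t                            # add Cj
        else:                                     # preempt j at time nxt
            heapq.heappush(ready, (rem - (nxt - t), j))
            t = nxt
    return total
```

The returned value is a valid lower bound on ∑Ci for the non-preemptive problem, since the preemptive relaxation can only do better.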
2.7.2 The Second Lower Bound
The second lower bound can be calculated for problem (S) by relaxing the constraints of the objective function as follows:
For the problems (SA1) and (SA3), we assume that r* = min_{1≤i≤n}{ri} and ri = r*, i = 1, 2, …, n, to get the problems 1/ri = r*/∑_{i=1}^{n} Ci and 1/ri = r*/Tmax, which are solved by the SPT and EDD rules respectively. For the problem (SA2), we assume that r = max_{1≤i≤n}{ri} and ri = r, i = 1, 2, …, n, to get the problem 1/ri = r/Emax, which is solved by the MST rule; Theorem (2.1) is then applied to get the second lower bound for problem (S).
Hence the lower bound is LB = max{LB1, LB2}.
Example (2.4): The second lower bound is illustrated on a four-job scheduling problem:
i 1 2 3 4
ri 0 1 2 1
pi 4 2 6 5
di 8 12 11 10
Solution:
Let r* = min_{1≤i≤4}{ri} = 0; then SPT gives the schedule (2, 1, 4, 3) with N1 = ∑_{i=1}^{4} Ci = 36, and EDD gives the schedule (1, 4, 3, 2) with N3 = Tmax = 5.
Let r = max_{1≤i≤4}{ri} = 2; then MST gives the schedule (1, 3, 4, 2) with N2 = Emax = 2.
Then LB2 = N1 + N2 + N3 = 36 + 2 + 5 = 43.
2.8 Branch and Bound Algorithm
In this section, a description of the BAB algorithm and its implementation is given. A heuristic method is applied at the top of the search tree (root node) to provide the UB; the upper bound on the cost of an optimal schedule is obtained by
choosing the upper bound from Section (2.6). Also at the top of the search tree, an ILB on the cost of an optimal schedule is obtained by choosing the better of the two lower bounds from Section (2.7). The algorithm uses a forward sequencing branching rule, for which nodes at level k of the search tree correspond to initial sequences in which jobs are sequenced in the first k positions. The branching procedure describes how to partition a subset of possible solutions; these subsets can be treated as sets of solutions of corresponding subproblems of the original problem. The bounding procedure indicates how to calculate the LB on the optimal solution value for each subproblem generated in the branching process. The search strategy describes how to choose a node of the search tree to branch from; we usually branch from the node with the smallest LB among the most recently created nodes [4].
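The three ingredients above — branching by appending one unscheduled job, bounding each child, and best-first selection of the node with the smallest LB — can be sketched as follows. The callbacks lb and cost, and the function names, are assumptions for illustration, not the thesis's MATLAB implementation:

```python
import heapq
from itertools import count

def branch_and_bound(jobs, lb, cost, ub_val, ub_seq=None):
    """Best-first BAB sketch: nodes are partial sequences, branching
    appends one unscheduled job (forward sequencing), and a node is
    cut when lb(partial) >= best known value.  ub_val is the heuristic
    upper bound used at the root."""
    best_val, best_seq = ub_val, ub_seq
    tie = count()                            # heap tie-breaker
    heap = [(lb(()), next(tie), ())]
    while heap:
        bound, _, partial = heapq.heappop(heap)
        if bound >= best_val:                # dominated node: cut
            continue
        rest = [j for j in jobs if j not in partial]
        if not rest:                         # leaf: complete schedule
            value = cost(partial)
            if value < best_val:
                best_val, best_seq = value, partial
            continue
        for j in rest:                       # branch on position k+1
            child = partial + (j,)
            b = lb(child)
            if b < best_val:                 # keep only promising nodes
                heapq.heappush(heap, (b, next(tie), child))
    return best_val, best_seq
```

With the heuristic UB of Section (2.6) as ub_val and a bound such as the LB of Section (2.7) as lb, this mirrors the procedure described above.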
2.9 Computational Experience
An intensive set of numerical experiments has been performed. Subsection (2.9.1) shows how instances (test problems) are randomly generated.
2.9.1 Test Problems
There exists in the literature a classical way to randomly generate test problems for scheduling problems:
• The processing time pj is uniformly distributed in the interval [1, 10].
• The release date rj is uniformly distributed in the interval [0, αP], where α ∈ {0.125, 0.25, 0.50, 0.75, 1.00} and P = ∑_{i=1}^{n} pi.
• The due date dj is uniformly distributed in the interval [P(1 − TF − RDD/2), P(1 − TF + RDD/2)], where P = ∑_{i=1}^{n} pi, depending on the relative range of due dates (RDD) and the average tardiness factor (TF).
For both parameters, the values 0.2, 0.4, 0.6, 0.8 and 1.0 are considered.
For each selected value of n, where n is the number of jobs, five problems
were generated.
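The generation scheme above can be sketched as follows (the function name, seeding, and the rounding of interval endpoints to integers are illustrative assumptions):

```python
import random

def generate_instance(n, alpha=0.5, tf=0.6, rdd=0.8, seed=None):
    """Classical random test-problem generator: p ~ U[1,10],
    r ~ U[0, alpha*P], d ~ U[P(1-TF-RDD/2), P(1-TF+RDD/2)]."""
    rng = random.Random(seed)
    p = {j: rng.randint(1, 10) for j in range(1, n + 1)}
    P = sum(p.values())                       # total processing time
    r = {j: rng.randint(0, round(alpha * P)) for j in p}
    lo = round(P * (1 - tf - rdd / 2))        # due-date window
    hi = round(P * (1 - tf + rdd / 2))
    d = {j: rng.randint(lo, hi) for j in p}
    return r, p, d
```

For each combination of α, TF and RDD and each n, repeated calls with different seeds give the five problems per instance size used in the experiments.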
2.9.2 Computational Experience with the Lower and Upper
Bounds of (BAB) Algorithm
The BAB algorithm was tested by coding it in MATLAB 7.10.0 (R2010a) and implementing it on an Intel(R) Core(TM)2 Duo CPU T6670 @ 2.20 GHz personal computer with 2.00 GB of RAM.
Table (2.1) shows the results for problem (S) obtained by the BAB algorithm. The first column "n" refers to the number of jobs; the second column "EX" refers to the number of examples for each instance n, where n ∈ {5, 10, 15, 20, 25, 30}; the third column "Optimal" refers to the optimal values obtained by the BAB algorithm for problem (S); the fourth column "UB" refers to the upper bound; the fifth column "ILB" refers to the initial lower bound; the sixth column "Nodes" refers to the number of nodes; the seventh column "Time" refers to the time in seconds taken to solve the problem; and the last column "Status" refers to whether the problem was solved ('0') or not ('1'). The symbol "*" means that the UB gives an optimal solution, and "**" means that the ILB gives an optimal solution. The BAB algorithm was stopped when the sum of the status column ≥ 3. A problem is considered unsolved (status 1) when the BAB algorithm is stopped after a fixed period of time, here 1800 seconds (i.e. 30 minutes).
If UB = ILB, then the optimal value is the UB and there is no need to branch the search tree of the BAB algorithm.
From Table (2.1), it is noticed that the heuristic of the upper bound gives good results: it gives a value of the objective function equal or close to the optimal value.
Table (2.1): The performance of the initial lower bound, the upper bound, the number of nodes and the computational time in seconds of the (BAB) algorithm for (S).
n   EX  Optimal  UB     ILB    Nodes     Time       Status
5   1   127      127*   125    36        0.00673    0
5   2   57       58     57**   14        0.0035     0
5   3   87       93*    87**   22        0.0080     0
5   4   77       77*    76     14        0.0025     0
5   5   60       60*    60**   0         0.0008     0
10  1   352      401*   336    1396      0.1488     0
10  2   201      226    196    1547      0.1583     0
10  3   337      368    327    414       0.0449     0
10  4   389      408    376    1265      0.1385     0
10  5   146      195    143    67        0.0128     0
15  1   660      744    644    12529     1.5424     0
15  2   557      672    549    16211     2.3152     0
15  3   720      803    720**  210       0.0337     0
15  4   353      378    331    273       0.0473     0
15  5   567      599    532    252493    35.0969    0
20  1   939      1030   935    5277      0.8577     0
20  2   1074     1281   1064   7024      1.1478     0
20  3   769      912    762    7535      1.6953     0
20  4   943      1144   913    2676012   479.7449   0
20  5   777      929    769    12812     2.2995     0
25  1   1502     1908   1463   4230692   434.4436   0
25  2   1649     2021   1570   3974179   398.7797   0
25  3   1877     2040   1746   17151377  1800.0001  1
25  4   1363     1547   1359   487324    51.7407    0
25  5   1847     1963   1757   17791999  1800.0011  1
30  1   2883     3278   2767   16378467  1800.0003  1
30  2   2106     2578   2076   5231216   568.0532   0
30  3   1973     2420   1878   16315296  1800.0001  1
30  4   1873     2418   1796   4174669   470.0744   0
30  5   2086     2599   2056   665806    73.6937    0
Table (2.2): Summary of Table (2.1) of the (BAB) algorithm

n   Av. Nodes      Av. Time   Unsolved problems
5   17.2           0.164      0
10  937.8          0.1007     0
15  5.6343 × 10^4  7.8071     0
20  1031581        70.1028    0
25  8.7271 × 10^6  294.9880   2
30  8.5531 × 10^6  370.6000   2
Table (2.2) is the summary of Table (2.1); it shows the average number of nodes and the average computational time for the solved problems. It also shows the number of unsolved problems among the 5 problems of each n, where n ∈ {5, 10, 15, 20, 25, 30}.
Chapter 3
Neural Networks to Solve
Scheduling Problems
3.1 Introduction
3.1.1 Artificial Neural Network (ANN)
Artificial neural network is an information processing system that has
certain performance characteristics in common with biological neural net-
works [54]. Below are some characteristics that are similar in both artificial and biological neural networks [39]:
1. The processing element (neuron) receives many signals.
2. Signals may be modified by a weight at the receiving element.
3. The processing element sums the weighted inputs.
4. Under appropriate circumstances (sufficient input), the neuron trans-
mits a signal output.
5. The output from a particular neuron may go to many other neurons
(the axon branches).
6. Information processing is local.
7. Memory is distributed:
• Long-term memory resides in the neurons' synapses or weights.
• Short-term memory corresponds to the signals sent by the neu-
rons.
8. A synapse strength may be modified by experience.
9. Neurotransmitters for a synapse may be excitatory or inhibitory.
Also, an ANN can be thought of as a "black box" device that accepts inputs and produces outputs [42]. ANNs were inspired by the manner in which the heavily interconnected, parallel structure of the human brain processes information. They are collections of mathematical processing units, known as neurons, which have a propensity for storing experiential knowledge and making it easily available; they emulate some of the observed properties of biological nervous systems and draw on the analogies of adaptive biological learning [43]. A neural network element (neuron) is the smallest processing unit of the whole network, essentially forming a weighted sum and transforming it by an activation function to obtain the output; in order to gain sufficient computing power, several neurons are interconnected together [54]. An ANN resembles the brain in two respects [43]:
1. Knowledge is acquired by the network from its environment through a learning process; knowledge in a NN is represented in the values of the weights and biases, which form part of a large and distributed network.
2. Inter-neuron connection strengths, known as synaptic weights, are
used to store the acquired knowledge.
Artificial neural networks have been developed as generalizations of mathematical models of the human brain or neural biology, based on the assumptions that [39]:
1. Information processing occurs at many simple elements called neu-
rons.
2. Signals are passed between neurons over connection links.
3. Each connection link has an associated weight.
4. Each neuron applies an activation function (usually nonlinear).
A neural net consists of a large number of simple processing elements
called (neurons, units, cells, or nodes). Each neuron is connected to other
neurons by means of directed communication links, each with an associated weight. The weights represent information being used by the net to solve a problem. A neural network can be applied to a wide vari-
ety of problems, such as storing and recalling data or patterns, classify-
ing patterns, performing general mappings from input patterns to output
patterns, grouping similar patterns, or finding solutions to constrained
optimization problems. Each neuron has an internal state, called its acti-
vation or activity level, which is a function of the inputs it has received.
Typically, a neuron sends its activation as a signal to several other neu-
rons. It is important to note that a neuron can send only one signal at a
time, although that signal is broadcast to several other neurons.
For example, consider a neuron Y , illustrated in Figure (3.1), that
receives inputs from neurons X1, X2, and X3. The activations (output
signals) of these neurons are x1, x2, and x3, respectively. The weights on
the connections from X1, X2, and X3 to Y are w1, w2, and w3, respectively.
The net input y_in (also denoted net) to neuron Y is the sum of the input signals from neurons X1, X2, and X3 multiplied by the weights on the connections between them, i.e.,
    y_in = net = ∑_{i=1}^{3} wi xi.
Figure 3.1: A simple artificial neuron.
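A minimal sketch of this weighted sum (the function name is illustrative):

```python
def net_input(x, w):
    """y_in = sum(w_i * x_i): the net input to neuron Y from input
    neurons X1..Xn (Figure 3.1 has n = 3)."""
    return sum(wi * xi for wi, xi in zip(w, x))

net_input([1.0, 0.5, -1.0], [0.2, 0.4, 0.1])   # 0.2 + 0.2 - 0.1 ≈ 0.3
```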
3.1.2 Architecture of Neural Network
The architecture of the neural network refers to its framework as well
as its interconnection scheme. The number of layers and the number of
nodes per layer often specify the framework [54]. Often, it is convenient to visualize neurons as arranged in layers. Typically, neurons in the same layer behave in the same manner. The behavior of a neuron depends on its activation function and the pattern of weighted connections over which it sends and receives signals.
The arrangement of neurons into layers and the connection patterns
between layers is called the net architecture [39]. These layers are:
Input Layer: A layer of neurons, called input units or the sensory layer, that receives information from external sources and passes it to the network for processing. This information may be either sensory input or signals from other systems outside the one being modeled [8]. The input layer performs no computation; it only distributes the information to the units in the successive layer [54].
Hidden Layer: A layer of nodes, called hidden units or the processing layer, that receives information from the input layer and processes it internally. It has no direct connections to the outside world (inputs or outputs); all connections from the hidden layer go to other layers within the system [42]. This layer provides the network with the capability to map or classify nonlinear problems [54]. A neural network may have one or more hidden layers, and some have none.
Output Layer: A layer of nodes, called output units or the response layer, that receives the processed information and sends output signals out of the system [42]. This layer encodes the possible concepts (or values) to be assigned to the instance under consideration; for example, each output unit may represent a class of objects [54].
Hence, neural networks can be classified with respect to their layers as follows [39]:
i. Single Layer Net: A single layer net has one layer of connection weights.
Often, the units can be distinguished as input units, which receive
signals from the outside world, and output units, from which the re-
sponse of the net can be read. In the typical single layer net shown
in Figure (3.2), the input units are fully connected to output units
but the input units and the output units are not connected to other
units in the same layer.
Figure 3.2: Single Layer Net.
ii. Multilayer Net: A net with one or more hidden layers (or levels) of nodes (so-called hidden units) between the input units and the output units. Typically, there is a layer of weights between two adjacent levels of units (input, hidden, or output). Multilayer nets can solve more complicated problems than single layer nets, but their training may be more difficult. However, in some cases training may be more successful, because it is possible to solve a problem that a single layer net cannot be trained to perform correctly at all [54]. A multilayer net is illustrated in Figure (3.3).
Figure 3.3: A multilayer neural net.
3.1.3 Network Learning Methods [45]
Among the many interesting properties of a neural network is its ability to learn from its environment and to improve its performance through learning. A neural network learns about its environment through an iterative process of adjustments applied to its weights and thresholds. The type of learning is determined by the manner in which the weight changes take place.
The learning process implies the following sequence of events:
1. The neural network is stimulated by an environment.
2. The neural network undergoes changes as a result of this stimulation.
3. The neural network responds in a new way to the environment, be-
cause of the changes that have occurred in its internal structure.
3.1.3.1 Error-correction learning
Figure 3.4: Error-correction learning diagram.
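The diagram in Figure (3.4) typically depicts the error-correction (delta-rule) scheme, in which the weights move in proportion to the error between the desired and the actual response. A minimal single-neuron sketch (the linear neuron, the names and the learning rate eta are illustrative assumptions):

```python
def error_correction_step(w, x, target, eta=0.1):
    """One error-correction (delta-rule) update for a single linear
    neuron: e = target - y, then w_i <- w_i + eta * e * x_i."""
    y = sum(wi * xi for wi, xi in zip(w, x))   # actual response
    e = target - y                             # error signal
    return [wi + eta * e * xi for wi, xi in zip(w, x)]
```

Repeating this step over a set of training pairs drives the actual response toward the desired one, which is the iterative adjustment described above.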
3.1.3.2 Supervised learning
An essential ingredient of supervised learning is the availability of an external
teacher, which is able to provide the neural network with a desired or tar-
get response. The network parameters are adjusted under the combined
influence of the training vector and the error signal. This adjustment is
carried out iteratively in a step-by-step fashion with the aim of eventu-
ally making the neural network emulate the teacher. This form of super-
vised learning is in fact an error-correction learning, which was already
described.
3.1.3.3 Unsupervised learning
In unsupervised or self-organized learning there is no external teacher
to oversee the learning process. In other words, there are no specific
samples of the function to be learned by the network. Rather, provision is
made for a task-independent measure of the quality of representation that
the network is required to learn and the free parameters of the network are
optimized with respect to that measure. Once the network has become
tuned to the statistical regularities of the input data, it develops the ability
to form internal representations for encoding features of the input and
thereby creates new classes automatically.
3.1.4 The activation functions [39, 45]
Figure 3.5: Activation Functions
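The figure's contents did not survive extraction; the activation functions usually presented at this point in [39, 45] are the identity, the binary step, and the binary and bipolar sigmoids. A sketch of these common choices (an assumption, since the figure's exact contents are unrecoverable):

```python
import math

def identity(x):
    return x                                  # f(x) = x

def binary_step(x, theta=0.0):
    return 1.0 if x >= theta else 0.0         # Heaviside with threshold

def binary_sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))         # range (0, 1)

def bipolar_sigmoid(x):
    return 2.0 / (1.0 + math.exp(-x)) - 1.0   # range (-1, 1), = tanh(x/2)
```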
3.1.5 Applications of Neural Networks
The study of neural networks is an extremely interdisciplinary field,
both in its development and in its application. A brief sampling of some
of the areas in which neural networks are currently being applied suggests
the breadth of their applicability. The examples range from commercial
successes to areas of active research that show promise for the future. The ability of neural networks to solve complex problems has led to their use in many applications, such as [39]: signal processing, control, pattern recognition, medicine, speech production, speech recognition, and business.
3.1.6 Beginning of Neural Nets
The McCulloch-Pitts neuron, proposed in 1943, is perhaps the earliest artificial neuron. It displays several important features found in many neural networks. The requirements for McCulloch-Pitts neurons are as follows [39]:
1. The activation of McCulloch-Pitts neuron is binary, and the neuron
either fires (has an activation of 1) or does not fire (has an activation
of 0).
2. McCulloch-Pitts neurons are connected by directed, weighted paths,
where the neurons are connected by directed paths (has weights).
3. A connected path is excitatory if the weight on the path is positive;
otherwise it is inhibitory. All excitatory connections into particular
neuron (one neuron) have the same weights.
4. Each neuron has a fixed threshold such that if the sum of net inputs
to the neuron is greater than the threshold, then the neuron is fire.
5. The threshold is set so that the inhibition neuron is absolute. That
is, any nonzero inhibition input will prevent the neuron from firing.
6. A signal takes one time step to pass over one connection link from one neuron to another.
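The six rules above can be captured in a few lines of code. The following is a minimal sketch; the AND-gate weights and threshold in the illustration are our own choice, not taken from the text:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """A McCulloch-Pitts neuron following rules 1-5 above.

    Binary inputs and output; positive weights are excitatory, negative
    weights inhibitory, and inhibition is absolute (rule 5).
    """
    # Rule 5: any nonzero input on an inhibitory (negative-weight) path
    # prevents the neuron from firing outright.
    if any(x != 0 and wt < 0 for x, wt in zip(inputs, weights)):
        return 0
    # Rules 1 and 4: fire (output 1) iff the net input exceeds the threshold.
    net = sum(x * wt for x, wt in zip(inputs, weights))
    return 1 if net > threshold else 0

# Illustration (weights and threshold are our own choice): a two-input
# AND gate with excitatory weights 1 and threshold 1.
and_out = [mcculloch_pitts(x, [1, 1], 1) for x in ([0, 0], [0, 1], [1, 0], [1, 1])]
```

Note how rule 3 shows up here: both excitatory connections into the neuron carry the same weight 1.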
3.2 Related Works
Various applications, such as communications, routing, industrial control, operations research, and production planning, employ scheduling concepts. Most problems in these applications are known to be NP-complete or combinatorial problems, which implies that computing an optimal solution for a large scheduling problem is rather time consuming. The traveling salesman problem (TSP), which seeks a minimum-cost Hamiltonian tour, is a typical NP-complete problem; obtaining its optimal solution is very time consuming. Various
schemes have been developed for solving the scheduling problem. Linear
programming is a widely used scheme for determining the cost function
based on the specific scheduling problem. Willems and Rooda translated
the job-shop scheduling problem into a linear programming format, and
then mapped it into an appropriate neural network structure to obtain
a solution [53]. Furthermore, Foo and Takefuji employed integer linear
programming neural networks to solve the scheduling problem by mini-
mizing the total starting times of all jobs with a precedence constraint
[55]. Meanwhile, Zhang et al. proposed a neural network method derived from linear programming, in which preemptive jobs are scheduled based on their priorities and deadlines [18]. Additionally, Cardeira and Mammeri investigated multiprocessor real-time scheduling by applying the k-out-of-N rule to a neural network [17]. The above investigations concentrated on preemptive jobs (processes) executed on multiple machines (multiprocessors), with job transfer permitted, by applying a neural network.
Meanwhile, Hanada and Ohnishi [6] developed a parallel algorithm based
on a neural network for preemptive task scheduling problems by allowing
for a task transfer among machines. Park et al. [36] embedded a classical
local search heuristic algorithm into the TSP optimization neural network.
The next section introduces the problem statement; the neural network method and the scheduling of some small systems are presented in the sections that follow.
3.3 Problem Statement
The machine scheduling problem studied in this section requires n in-
dependent jobs Ji(i = 1, 2, · · · , n) to be processed on a single machine
with the following assumptions:
i. Each job i has a release time ri.
ii. The single machine can process one job at most at a time.
iii. No preemption is allowed.
Let

π     a schedule for the n jobs;
p_i   the processing time of job i on the machine;
d_i   the due date of job i;
C_i   the completion time of job i; for the schedule π, C_i = ∑_{j∈π, j=1}^{i} p_j;
E_i = max{0, d_i − C_i};
T_i = max{0, C_i − d_i}.
The objective function can be written as:

f(π) = ∑_{i=1}^{n} C_{π(i)} + E_max(π) + T_max(π)        (3.1)
In this chapter, we use multilayer perceptron neural networks as a
heuristic method for solving the problem (S).
3.4 Multilayer Perceptron NN for Single Machine
A multilayer perceptron network consists of a set of sensory units (source nodes) that constitute the input layer, one or more hidden associative layers of computation nodes, and an output layer of response units (computation nodes). The input signal propagates through the network in a forward direction, on a layer-by-layer basis. Multilayer perceptrons have been applied successfully to a number of diverse, difficult problems by training them in a supervised manner with a highly popular algorithm known as the back-propagation algorithm. This algorithm is based on the error-correction learning rule. Error back-propagation learning consists of two passes through the different layers of the network: a forward pass and a backward pass.
In the forward pass, an activity pattern (input vector) is applied to
the sensory nodes of the network, and its effect propagates through the
network layer by layer. Next, a set of outputs is produced as the actual
response of the network. During the forward pass the synaptic weights of
the networks are all fixed.
During the backward pass, on the other hand, the synaptic weights are all adjusted in accordance with an error-correction rule. Specifically, the actual response of the network is subtracted from a desired response to produce an error signal. This error signal is then propagated backward through the network, against the direction of the synaptic connections, hence the name error back-propagation. The synaptic weights are adjusted
to make the actual response of the network move closer to the desired
response in a statistical sense [7].
The structure of a Perceptron algorithm is presented in the following
steps [39]:
Step 0: Initialize weights and bias (for simplicity, set weights and bias to zero). Set the learning rate α (0 < α ≤ 1) and the threshold value θ.
Step 1: While stopping condition is false, do Steps 2-6.
Step 2: For each training pair s:t, do Steps 3-5.
Step 3: Set activations of input units:
xi = si.
Step 4: Compute the actual response of the output unit:

y_in = b + ∑_{i=1}^{n} x_i w_i,

where n is the number of nodes in the input layer, and

y =  1   if y_in > θ
     0   if −θ ≤ y_in ≤ θ
    −1   if y_in < −θ
Step 5: Update weights and bias if an error occurred for this pattern.
If y ≠ t,
wi(new) = wi(old) + αtxi,
b(new) = b(old) + αt.
else
wi(new) = wi(old),
b(new) = b(old).
Step 6: Test the stopping condition: if no weights changed in Step 5, stop; otherwise, continue with Steps 2-6.
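Steps 0-6 above translate directly into code. A minimal sketch follows; the AND-function training set, α = 1 and θ = 0.2 in the illustration are our own choices for a standard test problem, not values from the text:

```python
def train_perceptron(pairs, alpha=1.0, theta=0.2, max_epochs=100):
    """Perceptron learning (Steps 0-6 above) for a single output unit.

    pairs: list of (input vector, target) with bipolar targets in {-1, 1}.
    Weights and bias start at zero (Step 0); training stops as soon as an
    entire epoch produces no weight change (Step 6).
    """
    n = len(pairs[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(max_epochs):                              # Step 1
        changed = False
        for x, t in pairs:                                   # Steps 2-3
            y_in = b + sum(xi * wi for xi, wi in zip(x, w))  # Step 4
            y = 1 if y_in > theta else (-1 if y_in < -theta else 0)
            if y != t:                                       # Step 5
                w = [wi + alpha * t * xi for wi, xi in zip(w, x)]
                b += alpha * t
                changed = True
        if not changed:                                      # Step 6
            break
    return w, b

# Hypothetical example: learn the AND function with binary inputs
# and bipolar targets.
data = [([1, 1], 1), ([1, 0], -1), ([0, 1], -1), ([0, 0], -1)]
w, b = train_perceptron(data)
```

After convergence every training pair falls on the correct side of the threshold band, which is exactly the stopping condition of Step 6.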
The structure of Backpropagation algorithm is presented in the follow-
ing steps [39]:
Inputs: Training pairs of inputs and desired outputs; weight range, sigmoid alpha value, learning rate, and error limit.

Outputs: A learned neural network with updated weights.
Begin :
Step 0: Initialize weights. (Set to small random values).
Step 1: While stopping condition is false, do Steps 2-9.
![Page 74: Minimizing Three Simultaneous Criteria in Machine ...utq.edu.iq/Final2/111.pdf · Minimizing Three Simultaneous Criteria in ... K. Al-Zuwaini and Dr. Kadhim M. Al-Mousawi for their](https://reader034.vdocuments.us/reader034/viewer/2022051523/5a7361667f8b9abb538e9024/html5/thumbnails/74.jpg)
Chapter Three: Neural Networks to Solve Scheduling Problems j 62
Step 2: For each training pair, do Steps 3-8.
Feedforward:
Step 3: Each input unit (Xi, i = 1, · · · , n) receives input signal xi and
broadcasts this signal to all units in the layer above (the hidden
units).
Step 4: Each hidden unit (Z_j, j = 1, 2, · · · , p) sums its weighted input signals,

z_in_j = v_{0j} + ∑_{i=1}^{n} x_i v_{ij},

applies its activation function to compute its output signal,

z_j = f(z_in_j),
and sends this signal to all units in the layer above (output units).
Step 5: Each output unit (Y_k, k = 1, . . . , m) sums its weighted input signals,

y_in_k = w_{0k} + ∑_{j=1}^{p} z_j w_{jk},

and applies its activation function to compute its output signal,

y_k = f(y_in_k).
Error Backpropagation:
Step 6: Each output unit (Y_k, k = 1, · · · , m) receives a target pattern corresponding to the input training pattern and computes its error information term,

δ_k = (t_k − y_k) f′(y_in_k),

calculates its weight correction term (used to update w_{jk} later),

∆w_{jk} = α δ_k z_j,

calculates its bias correction term (used to update w_{0k} later),

∆w_{0k} = α δ_k,

and sends δ_k to the units in the layer below.
Step 7: Each hidden unit (Z_j, j = 1, · · · , p) sums its delta inputs (from the units in the layer above),

δ_in_j = ∑_{k=1}^{m} δ_k w_{jk},

multiplies by the derivative of its activation function to calculate its error information term,

δ_j = δ_in_j f′(z_in_j),

calculates its weight correction term (used to update v_{ij} later),

∆v_{ij} = α δ_j x_i,

and calculates its bias correction term (used to update v_{0j} later),

∆v_{0j} = α δ_j.
Update weights and biases:
Step 8: Each output unit (Y_k, k = 1, · · · , m) updates its bias and weights (j = 0, · · · , p):

w_{jk}(new) = w_{jk}(old) + ∆w_{jk}.

Each hidden unit (Z_j, j = 1, · · · , p) updates its bias and weights (i = 0, · · · , n):

v_{ij}(new) = v_{ij}(old) + ∆v_{ij}.
Step 9: Test stopping condition.
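The two-pass procedure of Steps 0-9 can be sketched directly in code. The following is a minimal sketch with one hidden layer and sigmoid units, for which f′(x) = f(x)(1 − f(x)); the layer sizes, learning rate, epoch count and the AND-function training set are our own illustrative choices, not taken from the text:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_backprop(samples, n, p, m, alpha=0.5, epochs=5000, seed=0):
    """One-hidden-layer back-propagation (Steps 0-9 above).

    samples: list of (x, t) training pairs; n, p, m: sizes of the input,
    hidden and output layers.  v (input->hidden) and w (hidden->output)
    carry the bias in row 0, initialised to small random values (Step 0).
    """
    rnd = random.Random(seed)
    v = [[rnd.uniform(-0.3, 0.3) for _ in range(p)] for _ in range(n + 1)]
    w = [[rnd.uniform(-0.3, 0.3) for _ in range(m)] for _ in range(p + 1)]
    for _ in range(epochs):                      # Step 1: epoch loop
        for x, t in samples:                     # Step 2: each training pair
            # Steps 3-5: feedforward
            z = [sigmoid(v[0][j] + sum(x[i] * v[i + 1][j] for i in range(n)))
                 for j in range(p)]
            y = [sigmoid(w[0][k] + sum(z[j] * w[j + 1][k] for j in range(p)))
                 for k in range(m)]
            # Step 6: output error terms, delta_k = (t_k - y_k) f'(y_in_k)
            dk = [(t[k] - y[k]) * y[k] * (1.0 - y[k]) for k in range(m)]
            # Step 7: hidden error terms, delta_j = (sum_k delta_k w_jk) f'(z_in_j)
            dj = [sum(dk[k] * w[j + 1][k] for k in range(m)) * z[j] * (1.0 - z[j])
                  for j in range(p)]
            # Step 8: update weights and biases
            for k in range(m):
                w[0][k] += alpha * dk[k]
                for j in range(p):
                    w[j + 1][k] += alpha * dk[k] * z[j]
            for j in range(p):
                v[0][j] += alpha * dj[j]
                for i in range(n):
                    v[i + 1][j] += alpha * dj[j] * x[i]
    return v, w

def predict(x, v, w):
    n, p, m = len(v) - 1, len(w) - 1, len(w[0])
    z = [sigmoid(v[0][j] + sum(x[i] * v[i + 1][j] for i in range(n)))
         for j in range(p)]
    return [sigmoid(w[0][k] + sum(z[j] * w[j + 1][k] for j in range(p)))
            for k in range(m)]

# Hypothetical example (not from the text): learn the logical AND function.
data = [([0, 0], [0]), ([0, 1], [0]), ([1, 0], [0]), ([1, 1], [1])]
v, w = train_backprop(data, n=2, p=4, m=1)
```

A fixed epoch count stands in here for the stopping test of Step 9; the text's version stops when the error falls below a limit.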
3.5 Proposed Neural Network Design
The proposed ANN design uses permutations as its input. The designed ANN is a supervised-learning neural net; the extracted values, the best values for each input, serve as the desired output for each learned object. The ANN system design consists of four main layers (input layer I, two hidden layers H1 and H2, and output layer O); it can be written as the completely connected net I | H1 | H2 | O, illustrated in Figure (3.6). The design of each layer is as follows:
• Input Layer (I): the input vector X is fed into layer I. It has l nodes, x_i (i = 1, · · · , l).

• First Hidden Layer (H1): it has m hidden nodes, h_j (j = 1, · · · , m). This number was chosen empirically, by running many training trials on the system until the highest system accuracy was reached.

• Second Hidden Layer (H2): it has t hidden nodes, z_k (k = 1, · · · , t). This number was chosen empirically in the same way.

• Output Layer (O): it has s nodes, y_r (r = 1, · · · , s).
• The interconnecting weights between the nodes of layer I and the nodes of layer H1 are denoted w_lm.

• The interconnecting weights between the nodes of layer H1 and the nodes of layer H2 are denoted v_mt.

• The interconnecting weights between the nodes of layer H2 and the nodes of layer O are denoted u_ts.

• An internal threshold value θ is used for every node in layers I, H1, H2, and O, taken from the interval [0.1, 0.9].

• An internal bias value (b) is used for each layer.
Figure 3.6: The proposed design of ANN
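A forward pass through the I | H1 | H2 | O net can be sketched as below. The sigmoid activation and the tiny weight matrices are our own assumptions for illustration; a real run for n = 3 jobs would use the 9 | 30 | 40 | 3 layer sizes reported later in Table (3.1):

```python
import math

def forward(x, W, V, U, b=(0.0, 0.0, 0.0)):
    """One forward pass through the completely connected I | H1 | H2 | O net.

    W: l x m weights (I -> H1), V: m x t weights (H1 -> H2), U: t x s
    weights (H2 -> O); b holds one bias per computing layer.  Sigmoid
    activations are assumed here (the text does not fix the choice).
    """
    sig = lambda a: 1.0 / (1.0 + math.exp(-a))
    h = [sig(b[0] + sum(xi * W[i][j] for i, xi in enumerate(x)))
         for j in range(len(W[0]))]                       # layer H1
    z = [sig(b[1] + sum(hj * V[j][k] for j, hj in enumerate(h)))
         for k in range(len(V[0]))]                       # layer H2
    return [sig(b[2] + sum(zk * U[k][r] for k, zk in enumerate(z)))
            for r in range(len(U[0]))]                    # layer O

# Tiny hypothetical weights, for shape only.
W = [[0.1, -0.2], [0.3, 0.4]]
V = [[0.2, -0.1], [-0.3, 0.5]]
U = [[0.4], [-0.2]]
y = forward([1.0, 0.0], W, V, U)
```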
3.6 Artificial Neural Network Training
In this phase the system is trained so that it can later solve new problems coming from the extracted data. The initial weights of the ANN are created randomly from the interval [-0.3, 0.3], with error limit = 10^-6. These values were chosen because they achieved better practical results.
The structure of the system's ANN training algorithm is presented in the following steps:

Inputs: Database buffer containing the training data set, with the recommended number of trials for each n and the optimal value as the desired output, where n = 3, 4, 5, 6, 7, 8.

Outputs: Trained ANN with updated weights, saved in the database buffer.

Begin

Step 0: Open the database buffer as BuffDB.

Step 1: Create the ANN with the I|H1|H2|O design.

Step 2: Create the sample learning-pair array SampleArr, containing the input vector and the desired output.

Step 3: Initialize the weights vector randomly from [-0.3, 0.3].

Step 4: Set the error limit e ←− 10^-6 and iteration ←− 10000.

Step 5: Initialize the index Index of the SampleArr array to its beginning.

Step 6: For each selected training-object record CurrRecord in BuffDB, do Steps 7-9 for SampleArr[Index]:

Step 7: Read the input vector from CurrRecord into SampleArr[Index].

Step 8: Read the optimal value from CurrRecord as DesiredOutput.

Step 9: Index ←− Index + 1.

Step 10: End For.

Step 11: Call the Backpropagation algorithm and save the current network with the new values in BuffDB.

Step 12: Update the BuffDB database file.

Step 13: End.
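The setup portion of these steps (building the learning-pair array and initializing the weights) can be sketched as follows. The record layout (keys "input_vector" and "optimal") is our own assumption; the weight interval [-0.3, 0.3], error limit 10^-6 and iteration cap 10000 are the values stated in the text:

```python
import random

def init_training(records, seed=None):
    """Steps 2-10 of the training algorithm above: build the learning-pair
    array SampleArr and the initial weight vector."""
    rnd = random.Random(seed)
    # Steps 5-10: one (input vector, desired output) pair per record.
    sample_arr = [(rec["input_vector"], rec["optimal"]) for rec in records]
    # Step 3: random initial weights, sized here for the 9|30|40|3 net (n = 3).
    n_weights = 9 * 30 + 30 * 40 + 40 * 3
    weights = [rnd.uniform(-0.3, 0.3) for _ in range(n_weights)]
    # Step 4: stopping parameters.
    error_limit, max_iterations = 1e-6, 10000
    return sample_arr, weights, error_limit, max_iterations

records = [{"input_vector": [1, 2, 3], "optimal": 10}]  # hypothetical record
pairs, w0, e_lim, iters = init_training(records, seed=1)
```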
Table (3.1): Results of neural network learning.

n | Nodes in I | H1 | H2 | No. permutations | Trial No. | Error Limit   | Iteration No. | Threshold | Accuracy %
--|------------|----|----|------------------|-----------|---------------|---------------|-----------|-----------
3 |  9         | 30 | 40 | 6                | 25        |  997 × 10^-6  |  614          | 0.8       |  97
3 |  9         | 30 | 40 | 6                | 50        |  995 × 10^-6  |  730          | 0.8       |  98
3 |  9         | 30 | 40 | 6                | 75        |  994 × 10^-6  |  800          | 0.8       |  98
3 |  9         | 30 | 40 | 6                | 100       |  992 × 10^-6  |  863          | 0.8       | 100
4 | 12         | 40 | 30 | 24               | 25        |  998 × 10^-6  | 1000          | 0.8       |  93
4 | 12         | 40 | 30 | 24               | 50        |  999 × 10^-6  | 1350          | 0.8       |  93
4 | 12         | 40 | 30 | 24               | 75        |  975 × 10^-6  | 1560          | 0.8       |  95
4 | 12         | 40 | 30 | 24               | 100       |  970 × 10^-6  | 2100          | 0.8       |  96
5 | 15         | 60 | 40 | 120              | 25        |  999 × 10^-6  |  613          | 0.7       |  92
5 | 15         | 60 | 40 | 120              | 50        |  983 × 10^-6  |  730          | 0.7       |  93
5 | 15         | 60 | 40 | 120              | 75        |  990 × 10^-6  |  819          | 0.7       |  95
5 | 15         | 60 | 40 | 120              | 100       |  950 × 10^-6  |  819          | 0.7       |  96
6 | 18         | 40 | 20 | 720              | 25        | 1795 × 10^-6  | 1830          | 0.9       |  95
6 | 18         | 40 | 20 | 720              | 50        | 1773 × 10^-6  | 1900          | 0.9       |  96
6 | 18         | 40 | 20 | 720              | 75        | 1705 × 10^-6  | 1960          | 0.9       |  97
6 | 18         | 40 | 20 | 720              | 100       | 1683 × 10^-6  | 2000          | 0.9       |  98
7 | 21         | 40 | 20 | 5040             | 25        |  975 × 10^-6  | 1350          | 0.8       |  95
7 | 21         | 40 | 20 | 5040             | 50        |  969 × 10^-6  | 1225          | 0.8       |  97
7 | 21         | 40 | 20 | 5040             | 75        |  963 × 10^-6  | 1900          | 0.8       |  98
7 | 21         | 40 | 20 | 5040             | 100       |  950 × 10^-6  | 2025          | 0.8       | 100
8 | 24         | 40 | 20 | 40320            | 25        |  889 × 10^-6  | 5300          | 0.9       |  93
8 | 24         | 40 | 20 | 40320            | 50        |  713 × 10^-6  | 6300          | 0.9       |  95
8 | 24         | 40 | 20 | 40320            | 75        |  725 × 10^-6  | 7020          | 0.9       |  97
8 | 24         | 40 | 20 | 40320            | 100       |  727 × 10^-6  | 10000         | 0.9       |  98
3.7 The Matrix of Weights
In this section we present the weight matrices for the three-job and four-job cases. For three jobs, the weight matrices are

W = [w_ij], i = 1, …, 9, j = 1, …, 30:

    |  0.1079  −0.6495  …  −0.5301 |
    | −0.8642  −1.1514  …  −1.1809 |
    |    ⋮        ⋮     ⋱     ⋮    |
    | −0.0182   1.2561  …   0.7903 |
Figure 3.7: Input of 3 jobs
V = [v_jk], j = 1, …, 30, k = 1, …, 40:

    | −0.4903  −0.3486  …  −0.6382 |
    |  0.6706   0.6023  …  −0.4330 |
    |    ⋮        ⋮     ⋱     ⋮    |
    |  0.0034   0.3282  …  −0.2753 |

U = [u_kr], k = 1, …, 40, r = 1, 2, 3:

    | 2.5975  2.2437  −0.9240 |
    | 1.7057  0.7443  −1.5584 |
    |   ⋮       ⋮        ⋮    |
    | 1.7064  1.9374   1.6869 |
For four jobs, the weight matrices are

W = [w_ij], i = 1, …, 12, j = 1, …, 40:

    | 2.1518  2.2450  …  −0.4642 |
    | 0.4515  4.3378  …  −0.4591 |
    |   ⋮       ⋮     ⋱     ⋮    |
    | 0.6582  1.9460  …  −0.2511 |
Figure 3.8: Input of 4 jobs
V = [v_jk], j = 1, …, 40, k = 1, …, 30:

    |  0.3416   0.9118  …  −0.1471 |
    | −1.9146  −1.8133  …  −0.2785 |
    |    ⋮        ⋮     ⋱     ⋮    |
    |  0.5719   0.1553  …   0.2418 |

U = [u_kr], k = 1, …, 30, r = 1, 2, 3:

    | 3.0227  −3.1546  −1.6656 |
    | 3.2392  −3.5525  −2.1573 |
    |   ⋮        ⋮        ⋮    |
    | 0.5255   0.2680  −0.8063 |
Chapter 4
Conclusions and Future Work
4.1 Conclusions
In this thesis, the problem of scheduling jobs on one machine under three simultaneous criteria was considered.
A branch and bound algorithm was proposed to find the optimal solution for the problem 1/r_i/∑_{i=1}^{n} C_i + E_max + T_max, with two lower bounds (LB1, LB2), four upper bounds (UB1, UB2, UB3, UB4) and three dominance rules. Nine special cases of this problem were derived and proved.

For the same multi-criteria scheduling problem 1/r_i/∑_{i=1}^{n} C_i + E_max + T_max, a neural network method was proposed to find near-optimal solutions for small-sized problems.
4.2 Future Works
An interesting future research topic would involve experimentation
with the following problems:
1) 1/r_i/F(∑_{i=1}^{n} C_i, E_max, ∑_{i=1}^{n} U_i)

2) 1/r_i/F(∑_{i=1}^{n} C_i, ∑_{i=1}^{n} U_i, T_max)

3) 1/r_i/F(C_max, T_max, E_max)
References
[1] Ahmet Burak and Koksalan M., "Using Genetic Algorithms for Single-Machine Bicriteria Scheduling Problems", European Journal of Operational Research, 145, 543-556 (2003).
[2] AL Zuwaini M.K,” A comparative Study of Local Search Methods
for Some Machine Scheduling Problems with the Usage of Hybridiza-
tion As A Tool”., Ph. D. thesis, University of Baghdad, College of
Education (Ibn Al-Haitham), Dep. of Mathematics (2006).
[3] AL-Assaf S.,” Solving Multiple Objectives Scheduling Problems”
M.Sc. thesis, Dept. of mathematics, College of Sciences, AL-
Mustansiriayah University (2007).
[4] Araibi S.M.,” Machine Scheduling Problem to Minimize Two and
Three Objectives Function”., M.Sc. thesis, Dept. of Mathematical,
College of Education for Pure Sciences, Thi-Qar University(2012).
[5] Azizoglu M.,Kondakci S., and Kokslan M.,” Single machine schedul-
ing with maximum earliness and number tardy”, Computers and
Industrial Engineering ,45 , 257-268 (2003).
[6] A. Hanada and K. Ohnishi, "Near optimal job shop scheduling using neural network parallel computing", International Conference on Proceedings of the Industrial Electronics, Control, and Instrumentation, Vol. 1, 1993, pp. 315-320.
[7] AL-samer and Fadhel Abbas Jumea'a, "Intelligent Techniques for Incipient Fault Diagnosis of Power", Ph.D. Thesis, University of Technology, Baghdad, 2006.
[8] Baqer T.A.,”Hybrid System Applications to Reactive Power Com-
pensation in Distribution Systems”, Ph. D. Thesis Department of
Electromechanical Engineering, University of Technology-Baghdad
(2008).
[9] Baker K.R.,”Introduction to Sequencing and Scheduling”, John
Wily, New York (1974).
[10] Baker K.R., and Z. Su, ”Sequencing with due-dates and early start
times to minimize maximum tardiness,” Naval Research Logistics,
vol. 21, no. 1, pp. 171-176, (1974).
[11] Bellman R. and Dreyfus, ”Applied Dynamic Programming Prince-
ton” University press, Princeton, N.J., (1962).
[12] Blazewicz J., Ecker K.H., Pesch E., Schmidt G., and Weglarz J., "Scheduling Computer and Manufacturing Processes", Springer-Verlag, Berlin-Heidelberg (1996).
[13] Carlier, J., "The one-machine sequencing problem", European Journal of Operational Research 11, pp. 42-47 (1982).
[14] Cheng ,et al.,” A survey of scheduling problems with setup times
or costs”, European Journal of Operational Research 187, 985-1032
(2008).
[15] Chen B., Potts C. N and Woeginger G.J.,”A Review of Machine
Scheduling: Complexity Algorithms and Approximability”, Hand-
book of Combinatorial Optimization D. Z. Du and P. M. Pardalos
(Eds) (1998).
[16] Chachan H.A,” Exact and Approximation Algorithms for Schedul-
ing with and without Setup times Problems”., Ph. D. thesis, AL-
Mustansiriayah University, College of Science, Dep. of Mathematics
(2007).
[17] Cardeira C. and Z. Mammeri, Neural networks for multiprocessor
real-time scheduling, Proceedings of the Sixth Euromicro Workshop
on Real-Time Systems, pp. 59-64, (1994).
[18] Chang-shui Zhang, Ping-fan Yan, and Tong Chang, Solving job-
shop scheduling problem with priority using neural network, IEEE
International Conference on Neural Networks, pp. 1361-1366, (1991).
[19] Cook S.A., "The Complexity of Theorem-Proving Procedures", Proc. 3rd Annual ACM Symposium on Theory of Computing, 151-158 (1971).
[20] David K., Cliff S., and Joel W.,”Scheduling Algorithms”, Mas-
sachusetts Institute of Technology, (1998).
[21] David M.A.,”A multi-objective genetic algorithm to solve single ma-
chine scheduling problems using a fuzzy fitness function ”, M.Sc.
thesis, Dept. of Industrial and systems Engineering and the Russ,
College of Engineering and Technology of Ohio University (2007).
[22] Delphi A.M., "Algorithms to Solve Multicriteria Scheduling Problems on Single Machine", M.Sc. thesis, University of Al-Mustansiriyah, College of Science, Dept. of Mathematics (2011).
[23] French, S. ”Sequencing and Scheduling an Introduction to Mathe-
matics of Job Shop”, John Wiley and Sons, New York (1982).
[24] Garey M.R., Johnson D.S., and Sethi R. ” The Complexity of flow
shop and job shop Scheduling”, Math. Oper. Res. 1, 117-129 , (1976).
[25] George S. and Paul S., ” Pareto optima for total weighted completion
time and maximum lateness on a single machine”, Discrete Applied
Mathematics 155, 2341 - 2354(2007).
[26] Graham R.L., Lawler E.L., Lenstra J.K., and Rinnooy K.A.,
Optimization and approximation in deterministic sequencing and
scheduling: A survey, Annals of Discrete Mathematics 5, 287-
326(1979).
[27] Held M., and Karp R.M., ”A dynamic programming approach to
sequencing problems”, Journal of the Society for Industrial and Ap-
plied Mathematics 10, 196-210 (1962).
[28] Hoogeveen J.A, ” Invited Review Multi-criteria scheduling ”, Euro-
pean Journal of Operational Research 167 , 592-623(2005).
[29] Hoogeveen J.A., ”Minimizing Maximum Promptness and Maximum
Lateness on a single Machine”, Mathematics of Operation Research
21, 100-114 (1996 a).
[30] Hoogeveen J.A., ”Minimizing maximum earliness and maximum
lateness on a single machine”, CWI, BS-R9001 (1990).
[31] Hoogeveen J.A, ”Single machine scheduling to minimize a function
of two or three maximum cost criteria”, Journal of Algorithms, 21,
415-433 (1996).
[32] Hussein Jameel, "Flow Shop Scheduling Problems Using Exact and Local Search Methods", M.Sc. thesis, University of Al-Mustansiriyah, College of Science, Dept. of Mathematics (2008).
[33] Husein N.A., ” Machine Scheduling Problem to Minimize Multiple
Objective Function ”, M.Sc. thesis, Dept of Mathematical, College
of Education ( Ibn AL Haitham), Baghdad University(2012).
[34] Hummady L. Z., ”Using Genetic algorithm to solve (NP-Compete)
problem”, M.Sc. thesis Univ. of Al- Mustansiriyah, College of Sci-
ence, Dep. of Mathematics (2005).
[35] Jackson J.R.,”Scheduling a production line to minimize maximum
tardiness”, Research Report 43, Management Sciences Research
Project, UCLA (1955).
[36] Jeon Gue Park, Jong Man Park, Dou Seok Kim, Chong Hyun Lee, Sang Weon Suh, and Mun Sung Han, "Dynamic neural network with heuristic", IEEE International Conference on Neural Networks, Vol. 7, 1994, pp. 4650-4654.
[37] Koksalan M., Azizoglu M., and Kondakci S., ” Minimizing flow
time and maximum earliness on a single machine ”, IIE Transac-
tion 30,192-200(1998).
[38] Lawler E.L., J.K. Lenstra, A.H.G. Rinnooy Kan and D.B. Shmoys
(1993), ”Sequencing and Scheduling: Algorithms and Complexity”
in S.C. Graves, A.H.G. Rinnooy Kan and P. Zipkin (Eds.): Hand-
books in Operations Research and Management Science vol 4: Lo-
gistics of Production and inventory, North-Holland, Amsterdam.
[39] L. Fausett,” Fundamentals of Neural Network; Architectures, Algo-
rithms and Applications ”, 1991.
[40] Lenstra J.K.,Rinnooy Kan, and Brucker P.,”Complexity of ma-
chine scheduling problems”, Annals of Discrete Mathematics 1, 343-
362(1979).
[41] Lee C.Y. and Vairaktarakis G.L.,” Complexity of single machine
hierarchical scheduling: A survey”, Complexity in Numerical Opti-
mization, 19, 269-298(1993).
[42] Long Cheng, Zeng-Guang Hou, and Min Tan,” A Delayed Projection
Neural Network for Solving Linear Variational Inequalities”, IEEE
TRANSACTIONS ON NEURAL NETWORKS, VOL.20, NO.6,
JUNE 2009.
[43] Manish, Kumar and Devendra, P. Garg,” Intelligent Learning of
Fuzzy Logic Controllers Via Neural Network and Genetic Algo-
rithm”, Proceedings of 2004 JUSFA 2004 Japan-USA Symposium
on Flexible Automation Denver, Colorado, July 19-21,2004.
[44] Mohammed, H.A.A., ”Using genetic and local search algorithms as
tools for providing optimality for jobs scheduling”, M.Sc. thesis,
Univ. of Al-Mustansiriyah, College of Science, Dept. of Mathematics
(2005).
[45] M.Hajek, ” Neural Networks” , 2005.
[46] Nagar A., Jorge H., and Sunderesh H., ” Multiple and bi-criteria
scheduling: A literature survey”, European Journal of Operational
Research North-Holland,81,88-104 (1995).
[47] Nelson R.T., Sarin R.K., and Daniels R.L., ”Scheduling with mul-
tiple performance measures: The one-machine case”, Management
Science 32, 464-479(1986).
[48] Pinedo, M.L.,”Scheduling theory, algorithms, and systems”,
Springer Science+Business Media, LLC., New York (2008).
[49] Ramadhan A.M.,”Single Machine Scheduling using Branch and
Bound Techniques”, M.Sc. thesis, Dept. of mathematics, College
of Science, AL-Mustansiriyah University (1998).
[50] Reeves, C.R.,”Modern Heuristic Techniques for Combinatorial Prob-
lems”, John Wiley and Sons Inc., New York (1993).
[51] Rinnooy Kan, A.H.G., "Machine Scheduling Problems: Classification, Complexity and Computations", Martinus Nijhoff, The Hague, Holland (1976).
[52] Smith W.E., ”Various optimizers for single stage production”, Naval
Research Logistics Quarterly 3/1, 59-66 (1956).
[53] Stanislaw H. Zak, Viriya Upatising, and Stefen Hui, "Solving Linear Programming Problems with Neural Networks: A Comparative Study", IEEE Transactions on Neural Networks, Vol. 6, No. 1, January 1995.
[54] Sukumar and Kamalasadan, ” Application of Artificial In-
telligence Techniques in Power System ”, Special Study
Report, Asian Institute of Technology, Bangkok, Thai-
land,http://www.uwf.edu/skamalasadan/specialstudymasters.pdf,
November 1998.
[55] Willems, T.M. and Rooda, J.E., "Neural networks for job-shop
scheduling", Control Engineering Practice 2/1, 31-39 (1994).
[56] T'kindt, V. and Billaut, J.-C., "Multicriteria Scheduling: Theory,
Models and Algorithms", Springer, Berlin (2002).
[57] Van Wassenhove, L.N. and Gelders, L.F., "Solving a bicriterion
scheduling problem", European Journal of Operational Research
4/1, 42-48 (1980).
[58] Yeleswarapu, R.M., "Scheduling of 2-operation jobs on a
single machine to minimize the number of tardy jobs", M.Sc. thesis,
University of South Florida, College of Engineering, Dept. of
Industrial and Management Systems Engineering (2003).
![Page 94: Minimizing Three Simultaneous Criteria in Machine ...utq.edu.iq/Final2/111.pdf · Minimizing Three Simultaneous Criteria in ... K. Al-Zuwaini and Dr. Kadhim M. Al-Mousawi for their](https://reader034.vdocuments.us/reader034/viewer/2022051523/5a7361667f8b9abb538e9024/html5/thumbnails/94.jpg)
Abstract (Arabic)

In this thesis, the problem of scheduling n jobs on a single machine to minimize a multiple objective function (MOF) is studied. The study has two aims. The first is to find the optimal solution for the problem of minimizing the sum of completion times plus the maximum earliness plus the maximum tardiness, when the jobs have unequal ready (preparation) times and preemption is not allowed. This problem is written as 1//(∑Cᵢ + E_max + T_max) and, to the best of our knowledge, has not been studied before. The second aim is to find approximate solutions for the same problem using neural networks.

For the first aim, we propose a branch and bound algorithm with two lower bounds and four upper bounds to find the optimal solution. We also derive and prove nine special cases that yield optimal solutions without applying the branch and bound algorithm, together with three dominance rules that help reduce the number of branches in the search tree. Computational results show that the proposed branch and bound algorithm is effective in solving problems with up to (30) jobs within a time of at most (30) minutes. This problem is, in general, strongly NP-hard.

For the second aim, since our problem is strongly NP-hard, we used the neural network technique to find approximate solutions. Computational experiments showed that the neural network approach can solve the problem with up to (8) jobs in reasonable time; we also observed that it performs well on some problems, giving objective values close to the optimal solution.
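The composite objective summarized in the abstract can be illustrated with a minimal Python sketch (not taken from the thesis): it evaluates ∑Cᵢ + E_max + T_max for one fixed job sequence on a single machine, assuming each job is represented as a hypothetical (ready time, processing time, due date) tuple.

```python
def objective(sequence):
    """Return sum of completion times + max earliness + max tardiness
    for the given job order on a single machine (no preemption)."""
    t = 0                   # current machine time
    total_completion = 0    # running value of sum(C_i)
    e_max = 0               # maximum earliness, max(0, d_i - C_i)
    t_max = 0               # maximum tardiness, max(0, C_i - d_i)
    for r, p, d in sequence:
        t = max(t, r) + p           # a job cannot start before its ready time
        total_completion += t
        e_max = max(e_max, d - t)   # positive only if the job finishes early
        t_max = max(t_max, t - d)   # positive only if the job finishes late
    return total_completion + e_max + t_max

# Example: three jobs as (ready time, processing time, due date) tuples.
jobs = [(0, 3, 5), (1, 2, 4), (0, 4, 10)]
print(objective(jobs))  # → 20  (sum C = 17, E_max = 2, T_max = 1)
```

A branch and bound search would call such an evaluation at the leaves of its tree, pruning partial sequences whose lower bound already exceeds the best objective found so far.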
Title page (Arabic)

Republic of Iraq
Ministry of Higher Education and Scientific Research
University of Thi-Qar / College of Education for Pure Sciences

Minimizing Three Simultaneous Criteria in a Machine Scheduling Problem

A thesis submitted to the Department of Mathematics, College of Education for Pure Sciences, University of Thi-Qar, as partial fulfillment of the requirements for the degree of Master of Science in Mathematics

by
Jafar Saleh Aneed

Supervised by
Assist. Prof. Dr. Mohammed K. Z. Al-Zuwaini and Prof. Dr. Kadhem M. H. Al-Mousawi

2013 A.D.    1434 A.H.