Lecture 2 – Intro to optimization, MOOP
DSc PhD Eng. Joanna Szłapczyńska, GUT Assoc. Prof.
PG WETI, KASK, Gdańsk 2025, e-mail: joanna.szlapczynska@pg.edu.pl
Multi-objective Optimization
Algorithms (MOOA)
algorytmy optymalizacji
wielokryterialnej (AOW)
Lecture plan
 Introduction to optimization and decision making
 Multi-Criteria Decision Making (MCDM)
 Optimization problems in general
 Multi-Objective Optimization (MOO) model
 Simplifying the MOO
 AI meta-heuristics used in MOO, selected meta-heuristic MOO algorithms
 Programming frameworks and libraries for MOO
 Methods of taking into account the decision-maker's preferences in MOO
 Performance and efficiency testing of MOO algorithms
Classification of optimization methods
http://dx.doi.org/10.1155/2021/5521951
Single-Objective Optimization (SOO)
 SOO is when we have just one optimization goal (objective) function
 In real problems there is usually no pure SOO
 Different objectives can be reduced to a single one by means of
• aggregated objective e.g. weighted average of different objectives
• handling some of the objectives as constraints
 Problems can be constrained
 The result is (in most cases) a single solution
Multi-Objective Optimization (MOO)
 MOO is when there is more than one objective (goal) function
• for five or more objectives it is often called many-objective optimization instead
 Every objective is taken into account separately
 MOO can also be constrained
 It is rarely possible for a single solution to optimize all the objective functions at once
• so-called Pareto dominance is utilized to find solutions to MOO
Classic Pareto dominance
 Concept of dominance was originally created by Vilfredo Pareto (1848–1923),
Italian civil engineer and economist
 In Pareto Multi-Objective Optimization Problem (MOOP)
• let’s say we have two MOOP solutions x and y
• we state that solution x dominates solution y if and only if
x is better than y for at least one objective
and x is not worse than y for all the other objectives
 Minimization of every goal is assumed by default
 The result is a set of solutions that are not Pareto dominated (non-dominated)
Pareto 80/20 rule (Pareto principle)
 Another concept, the "80/20 rule" (aka the "Pareto principle"), was also named after Vilfredo Pareto
• proposed in 1941 by J.M. Juran
• for many outcomes, roughly 80% of consequences come from 20% of causes
• V. Pareto published in one of his last works that approximately 80% of the land
in the Kingdom of Italy was owned by 20% of the population
 This principle applies to many economic, political and engineering issues
 Pareto 80/20 rule ≠ Pareto dominance !!!
Classic Pareto dominance (formally)
 Assumptions of classic Pareto dominance
• minimization for every goal function
• all the solutions are feasible (no direct constraint handling!)
 In Pareto MOOP (with n objectives) solution x dominates y if and only if
 The result is a set of Pareto non-dominated (optimal) solutions, for which a Pareto Front (PF)
is created
 Pareto non-dominated solutions are often called efficient
∀ i ∈ {1,…,n}: fi(x) ≤ fi(y) and ∃ j ∈ {1,…,n}: fj(x) < fj(y)
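The definition above translates directly into code; a minimal Python sketch (assuming minimization of every objective and objective vectors of equal length):

```python
def dominates(x, y):
    """Return True if objective vector x Pareto-dominates y (minimization).

    x dominates y iff x is no worse than y in every objective
    and strictly better in at least one.
    """
    return (all(xi <= yi for xi, yi in zip(x, y))
            and any(xi < yi for xi, yi in zip(x, y)))

print(dominates((1, 2), (2, 2)))  # True  - better in f1, equal in f2
print(dominates((1, 3), (2, 1)))  # False - the two vectors are incomparable
print(dominates((2, 1), (1, 3)))  # False - incomparable the other way round
```

Note that `dominates((1, 3), (2, 1))` and its reverse are both False: such mutually non-dominated solutions are exactly what ends up in the Pareto optimal set.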
Strong and weak Pareto dominance
 Strong Pareto dominance is when we get the improvement for all of the objectives
 Weak Pareto dominance is when Pareto dominance does occur, but it is not strong
 Classic Pareto dominance is either weak or strong
∀ i ∈ {1,…,n}: fi(x) < fi(y) ⟺ f(x) ≺ f(y) ⟺ x ≺ y
∀ i ∈ {1,…,n}: fi(x) ≤ fi(y) and ∃ j ∈ {1,…,n}: fj(x) < fj(y) and ∃ k ∈ {1,…,n}, k ≠ j: fk(x) = fk(y) ⟺
⟺ f(x) ≼ f(y) ⟺ x ≼ y
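The two cases can be distinguished in a small sketch, under the same minimization assumption (the function name is mine, not from the lecture):

```python
def classify_dominance(x, y):
    """Classify the Pareto relation of objective vectors x over y (minimization)."""
    strictly_better = sum(xi < yi for xi, yi in zip(x, y))
    no_worse = all(xi <= yi for xi, yi in zip(x, y))
    if no_worse and strictly_better == len(x):
        return "strong"   # x is strictly better in every objective
    if no_worse and strictly_better >= 1:
        return "weak"     # dominance holds, but some objectives are tied
    return "none"         # x does not dominate y

print(classify_dominance((1, 1), (2, 2)))  # strong
print(classify_dominance((1, 2), (2, 2)))  # weak (f2 is tied)
print(classify_dominance((1, 3), (2, 1)))  # none
```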
Pareto dominance – example 1 (strong dominance)
https://commons.wikimedia.org/w/index.php?curid=143545224
Pareto dominance – example 2 (weak dominance)
Pareto dominance – example 3 (indifference)
https://commons.wikimedia.org/w/index.php?curid=143545224
Decision space vs. objective space
 k: number of decision variables
 n: number of objective functions
 Decision space
• k-dimensional
• comprises the potential solutions to the problem (values of decision variables)
 Objective space (aka solution space)
• n-dimensional
• space in which the objective function vectors are represented
Definitions
Decision space vs. objective space
http://dx.doi.org/10.1109/IPIN.2017.8115908 https://doi.org/10.1016/B978-0-323-91781-0.00002-8
Pareto optimal set vs. Pareto front
https://doi.org/10.1016/j.asoc.2019.105631
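Given a finite sample of objective vectors, the Pareto non-dominated subset (an approximation of the Pareto front) can be filtered out with a simple quadratic-time sweep; a Python sketch assuming minimization:

```python
def pareto_front(points):
    """Return the Pareto non-dominated subset of objective vectors (minimization)."""
    def dominates(a, b):
        return (all(ai <= bi for ai, bi in zip(a, b))
                and any(ai < bi for ai, bi in zip(a, b)))
    # keep p only if no other sampled point dominates it
    return [p for p in points
            if not any(dominates(q, p) for q in points)]

pts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
print(pareto_front(pts))  # [(1, 5), (2, 3), (4, 1)]
```

Here (3, 4) is dominated by (2, 3) and (5, 5) by (1, 5), so only the trade-off points survive.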
Constraints in MOO
 Constraints can be considered for
• decision space
• objective space
• combined or elements in-between the decision and objective spaces
 All constraints should be satisfied for a feasible solution
 Feasible region represents the set of all solutions (in decision space) that satisfy the
constraints
 Feasible Pareto Front is the part of the PF (in objective space) that corresponds to the feasible
region
 Basic Pareto-dominance definition does not have constraint handling built in
Basic constraint handling techniques in MOO
 Penalty functions
• additional elements added to the objective function(s) to degrade their values in
cases when a constraint is violated
 Domination of constraints
• basic Pareto dominance can be extended to handle constraints by differentiating
the impact and level of violation for particular constraints
 Brute force
• infeasible solutions / PF elements are removed from the final solution set / PF
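A penalty function can be sketched as a wrapper that degrades the objective value whenever a constraint of the form g(x) ≤ 0 is violated. A minimal illustration; the quadratic penalty and the weight `rho` are my assumptions, not prescribed by the lecture:

```python
def penalized(f, constraints, rho=1000.0):
    """Wrap objective f with a quadratic penalty for constraints given as g(x) <= 0."""
    def f_pen(x):
        # sum of squared violations; zero when all constraints hold
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + rho * violation
    return f_pen

f = lambda x: (x - 3) ** 2   # toy objective: minimize (x - 3)^2
g = lambda x: x - 2          # toy constraint: x <= 2, i.e. g(x) = x - 2 <= 0
fp = penalized(f, [g])
print(fp(1.0))   # feasible point: no penalty, plain objective value 4.0
print(fp(2.5))   # infeasible point: 0.25 + 1000 * 0.25 = 250.25
```

In a multi-objective setting the same wrapper would be applied to each objective (or to the dominance comparison) so that infeasible solutions lose out.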
Can we solve MOO problems in a deterministic way?
 What does it mean to solve a MOO problem?
• to find the feasible Pareto optimal set of solutions (determine the feasible Pareto Front)
 Can we solve a MOO problem by a deterministic method?
• in general – no, we cannot
• in some particular cases – yes, we can, but only if strong assumptions are imposed on the model,
e.g. as in Multi-Objective Linear Programming (MOLP) or the multi-objective gradient method
Multi-Objective Linear Programming (MOLP)
 MOLP model
• objective (goal) functions, linearly dependent on decision variables
• constraints linearly dependent on decision variables
• MOLP model example
• Decision variables: x1, x2, x3
• Objective functions: f1: -x1 - 2x2 → min
f2: -x1 + 2x3 → min
f3: x1 - x3 → min
• Constraints: x1 + x2 ≤ 1
x2 ≤ 2
x1 - x2 + x3 ≤ 4
MOLP - deterministic solvers
 Benson’s algorithm
• finds the efficient extreme points in the outcome set
• free Benson’s solver (C, Matlab) for MOLP http://bensolve.org/
 Variations of SIMPLEX algorithm for MOLP
• pivot base solution until no further improvements are possible
MOLP – Benson’s algorithm (1)
 The main tasks of a typical iteration of Benson’s algorithm are as follows:
• solve a single linear program that is parameterized by a vertex of the current outer
approximation and
• apply vertex enumeration to obtain the vertices of the next outer approximation.
 In certain situations, the execution time of the algorithm is dominated by the time for
solving the sequence of linear programs
MOLP – Benson’s algorithm (2)
https://doi.org/10.1007/978-3-319-95165-2_46
Original LP SIMPLEX algorithm
1. Find a base LP solution
2. Improve the base LP solution
• pivot the base in one dimension – select a new base!
• recalculate the solution and current value of goal function
3. Check if current solution is optimal
• yes -> THE END
• no -> go to 2.
MOLP – extended SIMPLEX algorithm
 Similar to the original LP SIMPLEX, but
• how to select an initial base?
• how to pivot the base?
• how to check if the solution is optimal?
MOLP – extended SIMPLEX algorithm (M. Ehrgott)
https://www.lamsade.dauphine.fr/~projet_cost/ALGORITHMIC_DECISION_THEORY/pdf/Ehrgott/HanLecture2_ME.pdf
Can we simplify MOO problems to solve them more easily?
 Solving MOO in a deterministic fashion is not that easy
• we can try simplifying the MOO in order to utilize more straightforward solvers
 Possible techniques of simplifying MOO
• Weighted Objectives Method
• Lexicographic or Hierarchical Optimization Method
• Epsilon-Constrained (aka Trade-Off) Method
 Each simplifying technique converts MOO into SOO
Simplifying MOO - Weighted Objectives Method (1)
 In Weighted Objectives Method (WOM), instead of n objective functions, we introduce
and optimize a new single objective function G(x) using
• values of the original n objective functions fi(x), i = 1,…,n
• weights assigned to the objectives wi, i = 1,…,n
G(x) = Σi=1..n wi·fi(x)
where wi ∈ [0.0; 1.0] and Σi=1..n wi = 1.0
Simplifying MOO - Weighted Objectives Method (2)
 In WOM only G(x) is optimized
• we can use any method available for single-objective optimization
• graphically: the result is in the intersection of the feasible solutions region with
the hyperplane depending on the weights w
• applying different weights -> different results
• WOM is fast and very simple to implement
• yet the biggest challenge is to determine the weights w
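A minimal WOM sketch in Python; the two toy objectives and the coarse grid search standing in for a single-objective solver are illustrative assumptions:

```python
def weighted_objective(fs, ws):
    """Build G(x) = sum_i w_i * f_i(x) from objectives fs and weights ws summing to 1."""
    assert abs(sum(ws) - 1.0) < 1e-9, "weights must sum to 1.0"
    return lambda x: sum(w * f(x) for f, w in zip(fs, ws))

# Two toy objectives over a scalar decision variable
f1 = lambda x: x * x           # optimum alone at x = 0
f2 = lambda x: (x - 2) ** 2    # optimum alone at x = 2
G = weighted_objective([f1, f2], [0.5, 0.5])

# A coarse grid search stands in for any single-objective solver
xs = [i / 100 for i in range(-100, 301)]
best = min(xs, key=G)
print(best)  # 1.0 -- equal weights pick the midpoint trade-off
```

Re-running with weights such as [0.9, 0.1] shifts the result toward the optimum of f1, illustrating the "different weights -> different results" point above.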
Simplifying MOO – Lexicographic Optimization Method
 Lexicographic Optimization Method (LOM) always returns a Pareto optimal solution
 Instead of specifying weights we re-order the objective functions
• f1() the most important one
• …
• fn() the least important one
 We proceed by optimizing just a single objective function at a time!
• we first find the optimum for the f1() objective function only
• for the next (i-th) objective we find the optimum just for fi() and add new constraints fixing
the optimum values of all preceding objective functions:
fj(x) ≤ fj(xj*), where j = 1,…,i−1
• stop the procedure when at some step the solution is just a single point
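The procedure above can be sketched over a finite candidate set, where keeping only the candidates that attain the current optimum plays the role of the added constraints fj(x) ≤ fj(xj*). This is a simplification: a real LOM implementation solves a continuous single-objective problem at each stage.

```python
def lexicographic_minimize(objectives, candidates, tol=1e-9):
    """Lexicographic Optimization Method over a finite candidate set (a sketch).

    Objectives are ordered from most to least important; at each stage only the
    candidates attaining the current objective's optimum (within tol) survive.
    """
    pool = list(candidates)
    for f in objectives:
        best = min(f(x) for x in pool)
        pool = [x for x in pool if f(x) <= best + tol]
        if len(pool) == 1:
            break  # the solution is a single point -> stop early
    return pool

# Toy problem: points in the plane, minimize f1 = x + y first, then f2 = x
pts = [(0, 1), (1, 0), (0, 2), (2, 0)]
f1 = lambda p: p[0] + p[1]
f2 = lambda p: p[0]
print(lexicographic_minimize([f1, f2], pts))  # [(0, 1)]
```

Stage 1 keeps the two points with f1 = 1; stage 2 breaks the tie on f2, leaving a single Pareto optimal point.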
Simplifying MOO – LOM, example (1)
 Objectives:
• f1=4x1+x2 -> min
• f2=2x1+x2 -> min
• f3=x1+x2 -> min
 Constraints:
• C1: 7−x1−5x2 ≤0
• C2: 10−4x1−x2 ≤0
• C3: −7x1+6x2−9 ≤0
• C4: −x1+6x2−24 ≤0
Simplifying MOO – LOM, example (2)
 Step 1
• f1 = 4x1 + x2 -> min
• result: BC line segment, f1* = 10.02
 Step 2
• f2 = 2x1 + x2 -> min
• new constraint C5: f1 ≤ 10.02
• result: point B (x1* = 2.26; x2* = 0.947)
• f1* = 10.02; f2* = 5.47; f3* = 3.21
 END
Simplifying MOO – Hierarchical Optimization Method
 HOM extends LOM in the way the newly introduced constraints are built
• LOM: fj(x) ≤ fj(xj*), where j = 1,…,i−1
• HOM: fj(x) ≤ (1 − φj)·fj(xj*), where j = 1,…,i−1
• φj is the allowed deviation for the j-th objective function (if φj = 0 then HOM reduces to LOM)
Simplifying MOO – LOM & HOM: pros & cons
 Advantages
• quite handy: we can use it instead of specifying weights
• does not require normalized objective functions
• always provides a Pareto optimal solution
 Disadvantages
• several single-objective problems have to be solved to obtain just one solution
point
• additional constraints have to be imposed
• when we change the order of objective functions then a different result
(Pareto optimal solution) will be returned
Simplifying MOO – Epsilon-Constrained Method
 Again, as in LOM/HOM, we select just one (r-th) objective to be optimized
• all the other objectives (i = 1,…,n, i ≠ r) are turned into (n−1) new constraints
• εi (i = 1,…,n, i ≠ r) defines a threshold for the i-th objective
 Threshold values εi that can be accepted for the i-th objective function are established
• this limits the objective space
 The method is simple to implement and fast
 But we lose the possibility to really optimize most of the objectives
Simplifying MOO - Epsilon-Constrained, example
Original MOO
 Objectives:
• f1=4x1+x2 -> min
• f2=2x1+x2 -> min
• f3=x1+x2 -> min
 Constraints:
• C1: 7−x1−5x2 ≤0
• C2: 10−4x1−x2 ≤0
• C3: −7x1+6x2−9 ≤0
• C4: −x1+6x2−24 ≤0
Trade-off MOO
 Objectives:
• f1=4x1+x2 -> min
 Constraints:
• C1: 7−x1−5x2 ≤0
• C2: 10−4x1−x2 ≤0
• C3: −7x1+6x2−9 ≤0
• C4: −x1+6x2−24 ≤0
• C5 (new): f2 ≤ ε2
• C6 (new): f3 ≤ ε3
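The MOO-to-SOO conversion can be sketched in Python over a finite candidate set (the toy points and threshold values are illustrative assumptions):

```python
def epsilon_constrained(f_r, others, eps, candidates):
    """Epsilon-Constrained Method over a finite candidate set (a sketch).

    Only f_r is optimized; every other objective f_i becomes a
    constraint f_i(x) <= eps_i that limits the objective space.
    """
    feasible = [x for x in candidates
                if all(f(x) <= e for f, e in zip(others, eps))]
    return min(feasible, key=f_r)

pts = [(0, 3), (1, 1), (2, 0), (3, 2)]
f1 = lambda p: p[0]   # the objective kept for optimization
f2 = lambda p: p[1]   # turned into the constraint f2 <= 1
print(epsilon_constrained(f1, [f2], [1], pts))  # (1, 1)
```

The point (0, 3) has the best f1 value but violates f2 ≤ 1, so the method returns the best candidate within the thresholded region.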
How in general can we solve MOO?
 Deterministic MOO methods
• limited to specific types of models e.g. purely linear ones
 Simplification of MOO (MOO → SOO)
• cannot provide a complete non-dominated solution set
• requires some kind of preferences from the DM
 Thus a meta-heuristic approach can be a good choice for MOO
• good approximation of the Pareto front
• universal – able to handle any MOO
• not as fast or as simple as some deterministic/simplified approaches
• by definition cannot guarantee to provide a solution at all
Meta-heuristics for MOO
 Multi-Objective Meta-Heuristics (MOMH) are AI methods able to solve real-life
multi-objective problems, i.e. to approximate Pareto Fronts
• Genetic algorithm (GA)/ Evolutionary algorithm (EA)
• Ant Colony Optimization (ACO)
• Artificial Bee Colony (ABC)
• Grey Wolf Optimization (GWO)
• Particle Swarm Optimization (PSO)
• Differential Evolution (DE)
• Memetic Algorithm (MA)
• Local Search (LS)
• Harmony Search (HS)
• Simulated Annealing (SA)
Meta-heuristics in AI world
[Diagram: Meta-heuristics (GA & EA, ACO, PSO, SA & TS) shown as part of Artificial Intelligence (AI), alongside Machine Learning (ML), Deep Learning (DL), Reinforcement Learning (RL) and Generative AI]
AI timeline
ANN perspective: 1943 model of binary neuron; 1951 first neural network computer (SNARC); 1958 Perceptron & early MLP; 1959 ADALINE & MADALINE; 1972 early RNN; 1979–80 Neocognitron & early CNN; 1986 "Deep Learning"; 1992 early Transformer; 2017 Transformer; 2018–2024 GPT-1, …, GPT-4
M-H + RL perspective: 1952 stochastic optimization; 1963 random search; 1966 evolutionary programming; 1975 GA; 1981 RL; 1983 SA; 1986 TS; 1991 EA; 1992 ACO; 1995 PSO; 1996–2024 MOMH
Place of Multi-Objective Meta-Heuristics (MOMH) in AI
Thank you for your attention
joanna.szlapczynska@pg.edu.pl

More Related Content

PDF
MOO_Lectures for the information and introduction of multi objective optimiza...
PDF
Multi optimization lectures for the the understanding of the multi variable d...
PDF
igor-kupczynski-msc-put-thesis
PDF
Multiobjective optimization and trade offs using pareto optimality
PDF
Model-Based User Interface Optimization: Part IV: ADVANCED TOPICS - At SICSA ...
PPTX
MOMDPSO_IDETC_2014_Weiyang
PDF
Moea introduction by deb
PPTX
UNIT-2 Quantitaitive Anlaysis for Mgt Decisions.pptx
MOO_Lectures for the information and introduction of multi objective optimiza...
Multi optimization lectures for the the understanding of the multi variable d...
igor-kupczynski-msc-put-thesis
Multiobjective optimization and trade offs using pareto optimality
Model-Based User Interface Optimization: Part IV: ADVANCED TOPICS - At SICSA ...
MOMDPSO_IDETC_2014_Weiyang
Moea introduction by deb
UNIT-2 Quantitaitive Anlaysis for Mgt Decisions.pptx

Similar to Multi optimization lectures for the the understanding of the multi variable decision making (20)

PPTX
linearprogramingproblemlpp-180729145239.pptx
PPTX
Hyperheuristics in Logistics - kassem danach
PDF
Application Issues For Multiobjective Evolutionary Algorithms
PDF
Operation research history and overview application limitation
PDF
Anirban part1
PDF
Nonlinear Programming: Theories and Algorithms of Some Unconstrained Optimiza...
PDF
Lpp through graphical analysis
DOC
Assignment oprations research luv
PDF
VET4SBO Level 2 module 2 - unit 1 - v1.0 en
PDF
Optimization and its applications usefulness for researchers
PPT
1 resource optimization 2
PDF
Multi Objective Optimization and Pareto Multi Objective Optimization with cas...
PPTX
Linear Programming Presentation - 24-8-22 (1).pptx
PDF
Argumentation in Artificial Intelligence: From Theory to Practice (Practice)
PDF
A HYBRID COA/ε-CONSTRAINT METHOD FOR SOLVING MULTI-OBJECTIVE PROBLEMS
PDF
A Literature Survey of Benchmark Functions For Global Optimization Problems
PDF
A HYBRID COA/ε-CONSTRAINT METHOD FOR SOLVING MULTI-OBJECTIVE PROBLEMS
PPTX
OR-I_Lecture_Note_01.pptx
PPTX
Fdp session rtu session 1
PDF
linearprogramingproblemlpp-180729145239.pptx
Hyperheuristics in Logistics - kassem danach
Application Issues For Multiobjective Evolutionary Algorithms
Operation research history and overview application limitation
Anirban part1
Nonlinear Programming: Theories and Algorithms of Some Unconstrained Optimiza...
Lpp through graphical analysis
Assignment oprations research luv
VET4SBO Level 2 module 2 - unit 1 - v1.0 en
Optimization and its applications usefulness for researchers
1 resource optimization 2
Multi Objective Optimization and Pareto Multi Objective Optimization with cas...
Linear Programming Presentation - 24-8-22 (1).pptx
Argumentation in Artificial Intelligence: From Theory to Practice (Practice)
A HYBRID COA/ε-CONSTRAINT METHOD FOR SOLVING MULTI-OBJECTIVE PROBLEMS
A Literature Survey of Benchmark Functions For Global Optimization Problems
A HYBRID COA/ε-CONSTRAINT METHOD FOR SOLVING MULTI-OBJECTIVE PROBLEMS
OR-I_Lecture_Note_01.pptx
Fdp session rtu session 1
Ad

Recently uploaded (20)

PPTX
CARTOGRAPHY AND GEOINFORMATION VISUALIZATION chapter1 NPTE (2).pptx
PPTX
FINAL REVIEW FOR COPD DIANOSIS FOR PULMONARY DISEASE.pptx
PPTX
UNIT-1 - COAL BASED THERMAL POWER PLANTS
PPTX
Sustainable Sites - Green Building Construction
PPTX
Lecture Notes Electrical Wiring System Components
PDF
Operating System & Kernel Study Guide-1 - converted.pdf
PPTX
Internet of Things (IOT) - A guide to understanding
PPTX
UNIT 4 Total Quality Management .pptx
PPTX
Construction Project Organization Group 2.pptx
PDF
Embodied AI: Ushering in the Next Era of Intelligent Systems
PDF
BMEC211 - INTRODUCTION TO MECHATRONICS-1.pdf
PPT
Project quality management in manufacturing
DOCX
573137875-Attendance-Management-System-original
PDF
Mohammad Mahdi Farshadian CV - Prospective PhD Student 2026
PPT
CRASH COURSE IN ALTERNATIVE PLUMBING CLASS
PDF
R24 SURVEYING LAB MANUAL for civil enggi
PDF
composite construction of structures.pdf
PPTX
web development for engineering and engineering
PDF
PRIZ Academy - 9 Windows Thinking Where to Invest Today to Win Tomorrow.pdf
PDF
The CXO Playbook 2025 – Future-Ready Strategies for C-Suite Leaders Cerebrai...
CARTOGRAPHY AND GEOINFORMATION VISUALIZATION chapter1 NPTE (2).pptx
FINAL REVIEW FOR COPD DIANOSIS FOR PULMONARY DISEASE.pptx
UNIT-1 - COAL BASED THERMAL POWER PLANTS
Sustainable Sites - Green Building Construction
Lecture Notes Electrical Wiring System Components
Operating System & Kernel Study Guide-1 - converted.pdf
Internet of Things (IOT) - A guide to understanding
UNIT 4 Total Quality Management .pptx
Construction Project Organization Group 2.pptx
Embodied AI: Ushering in the Next Era of Intelligent Systems
BMEC211 - INTRODUCTION TO MECHATRONICS-1.pdf
Project quality management in manufacturing
573137875-Attendance-Management-System-original
Mohammad Mahdi Farshadian CV - Prospective PhD Student 2026
CRASH COURSE IN ALTERNATIVE PLUMBING CLASS
R24 SURVEYING LAB MANUAL for civil enggi
composite construction of structures.pdf
web development for engineering and engineering
PRIZ Academy - 9 Windows Thinking Where to Invest Today to Win Tomorrow.pdf
The CXO Playbook 2025 – Future-Ready Strategies for C-Suite Leaders Cerebrai...
Ad

Multi optimization lectures for the the understanding of the multi variable decision making

  • 1. Lecture 2 – Intro to optimization, MOOP DSc PhD Eng. Joanna Szłapczyńska, GUT Assoc. Prof. PG WETI, KASK, Gdańsk 2025, e-mail: joanna.szlapczynska@pg.edu.pl Multi-objective Optimization Algorithms (MOOA) algorytmy optymalizacji wielokryterialnej (AOW)
  • 2. Lecture plan 2  Introduction to optimization and decision making  Multi-Criteria Decision Making (MCDM)  Optimization problems in general  Multi-Objective Optimization (MOO) model  Simplifying the MOO  AI meta-heuristics used in MOO, selected meta-heuristic MOO algorithms  Programming frameworks and libraries for MOO  Methods of taking into account the decision-maker's preferences in MOO  Performance and efficiency testing of MOO algorithms Multi-objective Optimization Algorithms (MOOA/AOW) Lecture 2
  • 3. Classification of optimization methods 3 Multi-objective Optimization Algorithms (MOOA/AOW) http://dx.doi.org/10.1155/2021/5521951
  • 4. Single-Objective Optimization (SOO) 4  SOO is when we have just one optimization goal (objective) function  In real problems there is usually no pure SOO  Different objectives can be reduced to a single one by means of • aggregated objective e.g. weighted average of different objectives • handling some of the objectives as constraints  Problems can be constrained  The result is (in most cases) a single solution Multi-objective Optimization Algorithms (MOOA/AOW)
  • 5. Multi-Objective Optimization (MOO) 5  MOO is when there are more than 2 objectives • for 5 and more objectives it is called many-objective instead  Every objective is taken into account separately  MOO can also be constrained  It is hardly possible that a single solution would optimize all the objective functions at once • so-called Pareto dominance is utilized to find solutions to MOO Multi-objective Optimization Algorithms (MOOA/AOW)
  • 6. Classic Pareto dominance 6  Concept of dominance was originally created by Vilfredo Pareto (1848–1923), Italian civil engineer and economist  In Pareto Multi-Objective Optimization Problem (MOOP) • let’s say we have two MOOP solutions x and y • we state that solution x dominates solution y if and only if x is better than y for at least one objective and x is not worse than y for all the other objectives  Minimization of every goal is assumed by default  The result is a set of solutions that are not Pareto dominated (non-dominated) Multi-objective Optimization Algorithms (MOOA/AOW)
  • 7. Pareto 80/20 rule (Pareto principle) 7  Another concept – the „80/20 rule” aka „Pareto principle” - was named after Vilfredo Pareto • proposed in 1941 by J.M. Juran • for many outcomes, roughly 80% of consequences come from 20% of causes • V. Pareto published in one of his last works that approximately 80% of the land in the Kingdom of Italy was owned by 20% of the population  This principle applies to many economic, political and engineering issues  Pareto 80/20 rule ≠ Pareto dominance !!! Multi-objective Optimization Algorithms (MOOA/AOW)
  • 8. Classic Pareto dominance (formally) 8  Assumptions of classic Pareto dominance • minimization for every goal function • all the solutions are feasible (no direct constraint handling!)  In Pareto MOOP (with n objectives) solution x dominates y if and only if  The result is a set of Pareto non-dominated (optimal) solutions, for which a Pareto Front (PF) is created  Pareto non-dominated solutions are often called efficient ∀𝑖:1,..,𝑛 𝑓𝑖 𝑥 ≤ 𝑓𝑖 𝑦 and ∃𝑗:1,..,𝑛 𝑓𝑗 𝑥 < 𝑓𝑗 𝑦 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 9. Strong and weak Pareto dominance 9  Strong Pareto dominance is when we get the improvement for all of the objectives  Weak Pareto dominance is when Pareto dominance does occur, but it is not strong  Classic Pareto dominance is either weak or strong ∀𝑖:1,..,𝑛 𝑓𝑖 𝑥 < 𝑓𝑖 𝑦 ⟺ 𝑓(𝑥) ≺ 𝑓(𝑦) ⟺ 𝑥 ≺ 𝑦 ∀𝑖:1,..,𝑛 𝑓𝑖 𝑥 ≤ 𝑓𝑖 𝑦 and ∃𝑗:1,..,𝑛 𝑓𝑗 𝑥 < 𝑓𝑗 𝑦 and ∃𝑘:1,..,𝑛;𝑘≠𝑗 𝑓𝑘 𝑥 = 𝑓𝑘 𝑦 ⇔ ⇔ 𝑓(𝑥) ≼ 𝑓(𝑦) ⟺ 𝑥 ≼ 𝑦 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 10. Pareto dominance – example 1 (strong dominance) 10 https://commons.wikimedia.org/w/index.php?curid=143545224 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 11. Pareto dominance – example 2 (weak dominance) 11 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 12. Pareto dominance – example 3 (indifference) 12 https://commons.wikimedia.org/w/index.php?curid=143545224 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 13. Decision space vs. objective space 13  k: number of decision variables  n: number of objective functions  Decision space • k-dimensional • comprises of potential solutions to the problem (values of decision variables)  Objective space (aka solution space) • n-dimensional • space in which the objective function vectors are represented Definitions Multi-objective Optimization Algorithms (MOOA/AOW)
  • 14. Decision space vs. objective space 14 http://dx.doi.org/10.1109/IPIN.2017.8115908 https://doi.org/10.1016/B978-0-323-91781-0.00002-8 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 15. Pareto optimal set vs. Pareto front 15 https://doi.org/10.1016/j.asoc.2019.105631 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 16. Constraints in MOO 16  Constraints can be considered for • decision space • objective space • combined or elements in-between the decision and objective spaces  All constraints should be satisfied for a feasible solution  Feasible region represents the set of all solutions (in decision space) that satisfy the constraints  Feasible Pareto Front is the part of PF (in objective space) that is constructed by the feasible region  Basic Pareto-dominance definition does not have constraint handling built in Multi-objective Optimization Algorithms (MOOA/AOW)
  • 17. Basic constraint handling techniques in MOO 17  Penalty functions • additional elements added to the objective function(s) to degrade their values in cases when a constraint is violated  Domination of constraints • basic Pareto dominance can be extended to handle constraints by differentiating the impact and level of violation for particular constraints  Brute force • infeasible solutions/PF elements are removed from the final set of solution/PF Multi-objective Optimization Algorithms (MOOA/AOW)
  • 18. Can we solve MOO problems in a deterministic way? 18  What does it mean to solve MOO problem? • to find the feasible Pareto optimal set of solutions (determine feasible Pareto Front)  Can we solve MOO problem by a deterministic method? • in general – no, we cannot  • in some particular cases – yes, we can  only if strong assumptions are imposed on the model e.g. as in Multi-Objective Linear Programming (MOLP) multi-objective gradient method Multi-objective Optimization Algorithms (MOOA/AOW)
  • 19. Multi-Objective Linear Programming (MOLP) 19  MOLP model • objective (goal) functions, linearly dependent on decision variables • constraints linearly dependent on decision variables • MOLP model example • Decision variables: x1, x2, x3 • Objective functions: f1: -x1 -2x2  min f2:-x1 +2x3  min f3: x1 -x3  min • Constraints: x1 +x2  1 x2  2 x1 -x2 +x3  4 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 20. MOLP - deterministic solvers 20  Benson’s algorithm • finds the efficient extreme points in the outcome set • free Benson’s solver (C, Matlab) for MOLP http://bensolve.org/  Variations of SIMPLEX algorithm for MOLP • pivot base solution until no further improvements are possible Multi-objective Optimization Algorithms (MOOA/AOW)
  • 21. MOLP – Benson’s algorithm (1) 21 Multi-objective Optimization Algorithms (MOOA/AOW)  The main tasks of a typical iteration of the Benson’s algorithm are as follows: • solve a single linear program that is parameterized by a vertex of the current outer approximation and • apply vertex enumeration to obtain the vertices of the next outer approximation.  In certain situations, the execution time of the algorithm is dominated by the time for solving the sequence of linear programs
  • 22. MOLP – Benson’s algorithm (2) 22 https://doi.org/10.1007/978-3-319-95165-2_46 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 23. Original LP SIMPLEX algorithm 23 1. Find a base LP solution 2. Improve the base LP solution • pivot the base in one dimension – select a new base! • recalculate the solution and current value of goal function 3. Check if current solution is optimal • yes -> THE END • no -> go to 2. Multi-objective Optimization Algorithms (MOOA/AOW)
  • 24. MOLP – extended SIMPLEX algorithm 24  Similar to the original LP SIMPLEX, but • how to select an initial base? • how to pivot the base? • how to check if the solution is optimal? Multi-objective Optimization Algorithms (MOOA/AOW)
  • 25. MOLP – extended SIMPLEX algorithm (M. Ehrgot) 25 https://www.lamsade.dauphine.fr/~projet_cost/ALGORITHMIC_DECISION_THEORY/pdf/Ehrgott/HanLecture2_ME.pdf Multi-objective Optimization Algorithms (MOOA/AOW) !
  • 26. Can we simplify MOO problems to solve it more easily? 26  Solving MOO in deterministic fashion is not that easy • we can try simplifying MOO in order to utilize some more straight-forward solvers  Possible techniques of simplifying MOO • Weighted Objectives Method • Lexicographic or Hierarchical Optimization Method • Epsilon-Constrained (aka Trade-Off) Method  Each simplifying technique converts MOO into SOO Multi-objective Optimization Algorithms (MOOA/AOW)
  • 27. Simplifying MOO - Weighted Objectives Method (1) 27  In Weighted Objectives Method (WOM), instead of n objective functions, we introduce and optimize a new single objective function G(x) using • values of the original n objective functions fi, i=1..n (x) • weights assigned to the objectives wi, i=1..n 𝐺 𝑥 = ෍ 𝑖=1 𝑛 𝑤𝑖𝑓𝑖(𝑥) where 𝑤𝑖 ∈ 0.0; 1.0 , ෍ 𝑖=1 𝑛 𝑤𝑖 = 1.0 Multi-objective Optimization Algorithms (MOOA/AOW)
  • 28. Simplifying MOO - Weighted Objectives Method (2) 28  In WOM only G(x) is optimized • we can use any method available for single-objective optimization • graphically: the result is in the intersection of the feasible solutions region with the hyperplane depending on the weights w • applying different weights -> different results • WOM is fast and very simple in implementation  • Yet the biggest challenge is to determine the weights w  Multi-objective Optimization Algorithms (MOOA/AOW)
  • 29. Simplifying MOO – Lexicographic Optimization Method 29  Lexicographic Optimization Method (LOM) always returns a Pareto optimal solution  Instead of specifying weights we re-order the objective functions • f1() the most important one • … • fn() the least important one  We proceed with finding optimum just for single objective function! • we find optimum only for the f1() objective function • for the next (i-th) objective we find optimum just for the fi() and add new constraints for optimum values for all preceding objective functions • Stop the procedure if for step the solution is just a single point 𝑓𝑗 𝑥 ≤ 𝑓𝑗 𝑥𝑗 ∗ , 𝑤ℎ𝑒𝑟𝑒 𝑗 = 1, . . , 𝑖 − 1 Multi-objective Optimization Algorithms (MOOA/AOW)
Simplifying MOO – LOM, example (1)
30
 Objectives:
• f1 = 4x1 + x2 -> min
• f2 = 2x1 + x2 -> min
• f3 = x1 + x2 -> min
 Constraints:
• C1: 7 − x1 − 5x2 ≤ 0
• C2: 10 − 4x1 − x2 ≤ 0
• C3: −7x1 + 6x2 − 9 ≤ 0
• C4: −x1 + 6x2 − 24 ≤ 0
Simplifying MOO – LOM, example (2)
31
 Step 1
• f1 = 4x1 + x2 -> min
• result: the BC line segment, f1* = 10.0
 Step 2
• f2 = 2x1 + x2 -> min
• new constraint C5: f1 ≤ 10.0
• result: point B (x1* = 2.26; x2* = 0.947)
• f1* = 10.0; f2* = 5.47; f3* = 3.21
 END
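The lexicographic procedure on this example can be sketched with `scipy.optimize.linprog`, since the problem is purely linear. The tiny feasibility slack added to each frozen optimum is an implementation detail assumed here to keep the successive LPs numerically feasible:

```python
from scipy.optimize import linprog

# Feasible region from the example, rewritten as A_ub @ x <= b_ub
A = [[-1.0, -5.0],   # C1: 7 - x1 - 5x2 <= 0
     [-4.0, -1.0],   # C2: 10 - 4x1 - x2 <= 0
     [-7.0,  6.0],   # C3: -7x1 + 6x2 - 9 <= 0
     [-1.0,  6.0]]   # C4: -x1 + 6x2 - 24 <= 0
b = [-7.0, -10.0, 9.0, 24.0]

objectives = [[4.0, 1.0], [2.0, 1.0], [1.0, 1.0]]  # f1, f2, f3, by importance

x = None
for c in objectives:
    res = linprog(c, A_ub=A, b_ub=b)   # optimize one objective at a time
    x = res.x
    A = A + [c]                        # freeze the optimum just reached:
    b = b + [res.fun + 1e-9]           # f_i(x) <= f_i*  (tiny slack for FP)

print(x)   # point B: x1 ~ 2.263, x2 ~ 0.947
```

Each pass solves one single-objective LP and turns its optimum into a new constraint, which is exactly the LOM recipe from the previous slide; after the second pass the feasible set has shrunk to point B.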
Simplifying MOO – Hierarchical Optimization Method
32
 HOM extends LOM in the way the newly introduced constraints are built
• LOM: fⱼ(x) ≤ fⱼ(xⱼ*), where j = 1, .., i−1
• HOM: fⱼ(x) ≤ (1 − φⱼ) fⱼ(xⱼ*), where j = 1, .., i−1
• φⱼ is the variation allowed for the j-th objective function (if φⱼ = 0 then HOM reduces to LOM)
Simplifying MOO – LOM & HOM: pros & cons
33
 Advantages
• quite handy: we can use them instead of specifying weights
• they do not require normalized objective functions
• they always provide a Pareto optimal solution
 Disadvantages
• several single-objective problems have to be solved to obtain just one solution point
• additional constraints have to be imposed
• changing the order of the objective functions returns a different result (a different Pareto optimal solution)
Simplifying MOO – Epsilon-Constrained Method
34
 Again, as in LOM/HOM, we select just one (r-th) objective to be optimized
• all the other objectives (i = 1, .., n, i ≠ r) are turned into (n−1) new constraints
• εᵢ (i = 1, .., n, i ≠ r) defines a threshold for the i-th objective
 Acceptable threshold values εᵢ for the i-th objective functions are established
• this limits the objective space
 The method is simple to implement and fast
 But we lose the possibility to really optimize most of the objectives
Simplifying MOO – Epsilon-Constrained, example
35
Original MOO
 Objectives:
• f1 = 4x1 + x2 -> min
• f2 = 2x1 + x2 -> min
• f3 = x1 + x2 -> min
 Constraints:
• C1: 7 − x1 − 5x2 ≤ 0
• C2: 10 − 4x1 − x2 ≤ 0
• C3: −7x1 + 6x2 − 9 ≤ 0
• C4: −x1 + 6x2 − 24 ≤ 0
Trade-off SOO
 Objectives:
• f1 = 4x1 + x2 -> min
 Constraints:
• C1: 7 − x1 − 5x2 ≤ 0
• C2: 10 − 4x1 − x2 ≤ 0
• C3: −7x1 + 6x2 − 9 ≤ 0
• C4: −x1 + 6x2 − 24 ≤ 0
• C5 (new): f2 ≤ ε2
• C6 (new): f3 ≤ ε3
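The trade-off problem above can be solved the same way with `scipy.optimize.linprog`. The threshold values ε2 = 6 and ε3 = 4 below are hypothetical, chosen only so that the constrained problem stays feasible:

```python
from scipy.optimize import linprog

# Original constraints C1..C4 as A_ub @ x <= b_ub
A = [[-1.0, -5.0],   # C1
     [-4.0, -1.0],   # C2
     [-7.0,  6.0],   # C3
     [-1.0,  6.0]]   # C4
b = [-7.0, -10.0, 9.0, 24.0]

eps2, eps3 = 6.0, 4.0            # hypothetical thresholds for f2 and f3

A_eps = A + [[2.0, 1.0],         # C5: f2 = 2x1 + x2 <= eps2
             [1.0, 1.0]]         # C6: f3 =  x1 + x2 <= eps3
b_eps = b + [eps2, eps3]

res = linprog([4.0, 1.0], A_ub=A_eps, b_ub=b_eps)   # only f1 is optimized
print(res.fun, res.x)
```

Only f1 is truly optimized; f2 and f3 are merely kept below their thresholds, which is the loss noted on the previous slide.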
How in general can we solve MOO?
36
 Deterministic MOO methods
• limited to specific types of models, e.g. purely linear ones
 Simplification of MOO (MOO -> SOO)
• cannot provide a complete nondominated solution set
• requires some kind of preference information from the DM
 Thus a meta-heuristic approach can be a good choice for MOO
• good approximation of the Pareto front
• universal – able to handle any MOO problem
• not as fast or as simple as some deterministic/simplified approaches
• by definition cannot guarantee to provide a solution at all
Meta-heuristics for MOO
37
 Multi-Objective Meta-Heuristics (MOMH) are AI methods able to solve real-life multi-objective problems, i.e. to approximate Pareto fronts
• Genetic Algorithm (GA) / Evolutionary Algorithm (EA)
• Ant Colony Optimization (ACO)
• Artificial Bee Colony (ABC)
• Grey Wolf Optimization (GWO)
• Particle Swarm Optimization (PSO)
• Differential Evolution (DE)
• Memetic Algorithm (MA)
• Local Search (LS)
• Harmony Search (HS)
• Simulated Annealing (SA)
Meta-heuristics in the AI world
38
(diagram: nested fields of AI)
• Artificial Intelligence (AI)
• Machine Learning (ML)
• Deep Learning (DL)
• Reinforcement Learning (RL)
• Generative AI
• Meta-heuristics: GA & EA, ACO, PSO, SA & TS
AI timeline – place of Multi-Objective Meta-Heuristics (MOMH) in AI
39
 ANN perspective:
• 1943 model of the binary neuron
• 1951 first neural network computer (SNARC)
• 1958 perceptron & early MLP
• 1959 ADALINE & MADALINE
• 1972 early RNN
• 1979–80 Neocognitron & early CNN
• 1986 "Deep Learning"
• 1992 early Transformer
• 2017 Transformer
• 2018–2024 GPT-1, …, GPT-4
 Meta-heuristics + RL perspective:
• 1952 stochastic optimization
• 1963 random search
• 1966 evolutionary programming
• 1975 GA
• 1981 RL
• 1983 SA
• 1986 TS
• 1991 EA
• 1992 ACO
• 1995 PSO
• 1996–2024 MOMH
Thank you for your attention
joanna.szlapczynska@pg.edu.pl