Basic Concepts in Optimization – Part I
Benoît Chachuat <benoit@mcmaster.ca>
McMaster University
Department of Chemical Engineering
ChE 4G03: Optimization in Chemical Engineering
Outline
1 Local and Global Optima
2 Numerical Methods: Improving Search
3 Notions of Convexity
Local Optima
Neighborhood
The neighborhood Nδ(x◦) of a point x◦ consists of all nearby points; that
is, all points within a small distance δ > 0 of x◦:
Nδ(x◦) := {x : ‖x − x◦‖ < δ}
Local Optimum
A point x∗ is a [strict] local minimum for the function f : IRⁿ → IR on the set S if it is feasible (x∗ ∈ S) and if sufficiently small neighborhoods surrounding it contain no points that are both feasible and [strictly] lower in objective value:
∃δ > 0 : f(x∗) ≤ f(x), ∀x ∈ S ∩ Nδ(x∗)
[ ∃δ > 0 : f(x∗) < f(x), ∀x ∈ S ∩ Nδ(x∗) \ {x∗} ]
Illustration of a (Strict) Local Minimum, x∗
[Figure: graph of f(x) over S, with the neighborhood Nδ(x∗) of radius δ around x∗ and the value f(x∗) marked]
f(x∗) < f(x), ∀x ∈ S ∩ Nδ(x∗) \ {x∗}
Global Optima
Global Optimum
A point x∗ is a [strict] global minimum for the function f : IRⁿ → IR on the set S if it is feasible (x∗ ∈ S) and if no other feasible solution has [strictly] lower objective value:
f(x∗) ≤ f(x), ∀x ∈ S
[ f(x∗) < f(x), ∀x ∈ S \ {x∗} ]
Remarks:
1 Global minima are always local minima
2 Local minima may not be global minima
3 Analogous definitions hold for local/global maxima in maximization problems
Illustration of a (Strict) Global Minimum, x∗
[Figure: graph of f(x) over S, with the global minimum x∗ and its value f(x∗) marked]
f(x∗) < f(x), ∀x ∈ S \ {x∗}
Global vs. Local Optima
Class Exercise: Identify the various types of minima and maxima for f on S := [xmin, xmax]
[Figure: graph of f(x) on [xmin, xmax], with candidate points x1, x2, x3, x4, x5 marked]
How to Find Optima?
Review: Three Methods for Optimization
1 Graphical Solutions
Great display + reveals multiple optima
But impractical for nearly all practical problems
2 Analytical Solutions (e.g., Newton, Euler, etc.)
Exact solution + easy analysis of changes in (uncertain) parameters
But not possible for most practical problems
3 Numerical Solutions
The only practical method for complex models!
But only guarantees local optima + challenges in assessing the effects of (uncertain) parameters
Numerical Optimization: The Dilemma!
Consider the optimization problem:
min f(x1, x2) := 1/(1 + (x1 − 1)² + (x2 − 1)²) + 0.5/(1 + (x1 − 4)² + (x2 − 3)²)
s.t. 0 ≤ x1, x2 ≤ 5
[Figure: surface plot of f(x1, x2) over the box [0, 5] × [0, 5], showing two bumps of heights about 1 and 0.5]
Numerical Optimization: The Dilemma!
Typically, only local information about the objective function is known, at a current point x◦ = (x◦1, x◦2)!
Question: Which move do I make next?
[Figure: the same surface plot of f(x1, x2), with the current point marked]
Numerical Optimization: The Basic Approach
Improving Search
Improving search methods are numerical algorithms that begin at a
feasible solution to a given optimization model, and advance along a
search path of feasible points with ever-improving function value
[Figure: surface plot of f(x1, x2), with a search path of feasible points of ever-improving value]
Direction-Step Paradigm
At the current point x(k), how do I decide:
the direction of change
the magnitude of change
whether further improvement is possible?
The Basic Equation
Improving search advances from current point x(k) to new point x(k+1) as:
x(k+1) = x(k) + α ∆x,  i.e., componentwise,  xi(k+1) = xi(k) + α ∆xi,  i = 1, . . . , n
where:
∆x defines a move direction of solution change at x(k) (normalized so that ‖∆x‖ = 1)
α > 0 determines a move magnitude, how far to pursue this direction
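In code, the basic equation is a one-liner. A minimal sketch (assuming NumPy; the function name is ours):

```python
import numpy as np

def take_step(x, dx, alpha):
    """One improving-search move: x(k+1) = x(k) + alpha * dx.
    Assumes dx is a nonzero direction; it is renormalized so ||dx|| = 1."""
    dx = np.asarray(dx, dtype=float)
    return np.asarray(x, dtype=float) + alpha * dx / np.linalg.norm(dx)
```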
Direction of Change, ∆x
Improving Directions
Vector ∆x ∈ IRⁿ is an improving direction at current point x(k) if the objective function value at x(k) + α∆x is superior to that of x(k), for all α > 0 sufficiently small:
(maximize problem)  ∃ᾱ > 0 : f(x(k) + α∆x) > f(x(k)), ∀α ∈ (0, ᾱ]
[Figure: surface plot of f(x1, x2), with the current point and an improving direction ∆x marked]
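The definition cannot be verified exhaustively for all small α, but it can be spot-checked. A heuristic sketch (our own helper, assuming NumPy arrays), written for a maximize problem as in the displayed condition:

```python
import numpy as np

def is_improving(f, x, dx, alphas=(1e-1, 1e-2, 1e-3, 1e-4)):
    """Heuristic test: f(x + alpha*dx) > f(x) for a few shrinking alphas.
    Passing suggests, but does not prove, that dx is an improving direction."""
    x, dx = np.asarray(x, dtype=float), np.asarray(dx, dtype=float)
    fx = f(x)
    return all(f(x + a * dx) > fx for a in alphas)
```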
[Figure: the same definition illustrated in the (x1, x2)-plane, showing the sets of improving directions at x(k) and at x(k+1)]
Direction of Change, ∆x (cont’d)
Feasible Directions
Vector ∆x ∈ IRⁿ is a feasible direction at current point x(k) if the point x(k) + α∆x violates no model constraint for all α > 0 sufficiently small:
∃ᾱ > 0 : x(k) + α∆x ∈ S, ∀α ∈ (0, ᾱ]
[Figure: feasible set in the (x1, x2)-plane, with the set of feasible directions at x(k)]
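The same sampling idea applies to feasibility. A sketch, where `in_S` is a hypothetical membership test for the feasible set S:

```python
import numpy as np

def is_feasible_direction(in_S, x, dx, alphas=(1e-1, 1e-2, 1e-3, 1e-4)):
    """Heuristic test: x + alpha*dx remains in S for a few shrinking alphas."""
    x, dx = np.asarray(x, dtype=float), np.asarray(dx, dtype=float)
    return all(in_S(x + a * dx) for a in alphas)
```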
Optimality Criterion
Necessary Condition of Optimality (NCO)
No optimization model solution at which an improving feasible direction is
available can be a local optimum
[Figure: at a local optimum x∗, the set of feasible directions and the set of improving directions at x∗ have no direction in common]
Continuous Improving Search Algorithm
Step 0: Initialization.
◮ Choose any starting feasible point x(0) and let index k ← 0.
Step 1: Move Direction.
◮ If no improving feasible direction ∆x exists at current point x(k), stop.
◮ Otherwise, construct an improving feasible direction at x(k) as ∆x(k+1).
Step 2: Step Size.
◮ If there is no limit on step sizes for which direction ∆x(k+1) continues to both improve the objective function and retain feasibility, stop: the model is unbounded.
◮ Otherwise, choose the largest step size α(k+1) for which improvement and feasibility are retained.
Step 3: Update.
◮ x(k+1) ← x(k) + α(k+1) ∆x(k+1)
◮ Increment index k ← k + 1 and return to Step 1.
Remarks:
This basic algorithm may terminate at a suboptimal point
Moreover, it does not distinguish between local and global optima
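One concrete unconstrained instance of this template (a sketch, not the only realization the algorithm statement admits): take the normalized negative gradient as the improving direction for a minimize problem, and pick the step size by simple backtracking rather than by an exact largest-step computation:

```python
import numpy as np

def improving_search(f, grad, x0, tol=1e-6, alpha0=1.0, max_iter=1000):
    """Steps 0-3 for unconstrained minimization of f."""
    x = np.asarray(x0, dtype=float)               # Step 0: initialization
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:               # Step 1: no improving direction
            return x
        dx = -g / np.linalg.norm(g)               # normalized improving direction
        alpha = alpha0                            # Step 2: backtrack until improvement
        while f(x + alpha * dx) >= f(x) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * dx                        # Step 3: update and repeat
    return x

# Usage: a quadratic bowl with minimum at (1, 2)
f = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])
print(improving_search(f, grad, [5.0, -3.0]))     # approximately [1. 2.]
```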
A Word of Caution!
Caution: A point at which no improving feasible direction is available may
not be a local optimum!
[Figure: a point x∗ at which no improving feasible direction exists, yet x∗ is not a local optimum; the sets of feasible and improving directions at x∗ are shown]
Finding Optima!
Class Exercise: Determine whether each of the following points is apparently a local/global minimum, a local/global maximum, or neither.
[Figure: contour plot in the (x1, x2)-plane, with contour levels from 20 to 100 and several candidate points marked]
Convex Sets
A set S ⊂ IRⁿ is said to be convex if every point on the line segment connecting any two points x, y in S is itself in S:
γx + (1 − γ)y ∈ S, ∀γ ∈ (0, 1)
[Figure: a convex set S, in which the segment between any x and y stays inside S]
Nonconvex Set: Some points on the line segment connecting x, y do not lie in S
[Figure: a nonconvex set S, where part of the segment between x and y leaves S]
Nonconnected sets are nonconvex; e.g., the discrete set {0, 1, 2, . . .}²
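The definition also suggests a simple numerical spot check (our own sketch; `in_S` is a hypothetical membership predicate): sample points on the segment between two members of S. A violated sample proves nonconvexity; passing samples are only suggestive.

```python
import numpy as np

def segment_in_set(in_S, x, y, n=50):
    """Test gamma*x + (1-gamma)*y in S on a grid of gamma values in (0, 1)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return all(in_S(g * x + (1.0 - g) * y) for g in np.linspace(0.01, 0.99, n))

# Usage: the unit disk is convex, an annulus is not
disk = lambda p: p[0]**2 + p[1]**2 <= 1.0
annulus = lambda p: 0.5 <= p[0]**2 + p[1]**2 <= 1.0
print(segment_in_set(disk, [-1.0, 0.0], [1.0, 0.0]))     # True
print(segment_in_set(annulus, [-1.0, 0.0], [1.0, 0.0]))  # False: crosses the hole
```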
Convex and Concave Functions
Convex Functions
A function f : S → IR, defined on a convex set S ⊂ IRⁿ, is said to be convex on S if the line segment connecting f(x) and f(y) at any two points x, y ∈ S lies on or above the graph of f between x and y:
f(γx + (1 − γ)y) ≤ γf(x) + (1 − γ)f(y), ∀x, y ∈ S, ∀γ ∈ (0, 1)
Strict convexity:
f(γx + (1 − γ)y) < γf(x) + (1 − γ)f(y), ∀x, y ∈ S with x ≠ y, ∀γ ∈ (0, 1)
Concave Functions
f is said to be [strictly] concave on S if (−f) is [strictly] convex on S:
f(γx + (1 − γ)y) ≥ [>] γf(x) + (1 − γ)f(y), ∀x, y ∈ S, ∀γ ∈ (0, 1)
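This inequality, too, can be sampled numerically (a sketch of our own; a single violated sample certifies nonconvexity, while passing samples merely suggest convexity):

```python
import numpy as np

def looks_convex(f, sampler, trials=1000, seed=0):
    """Sample the inequality f(g*x + (1-g)*y) <= g*f(x) + (1-g)*f(y).
    Returns False if any sampled (x, y, gamma) violates it, else True."""
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        x, y = sampler(rng), sampler(rng)
        g = rng.uniform(0.01, 0.99)
        if f(g * x + (1 - g) * y) > g * f(x) + (1 - g) * f(y) + 1e-12:
            return False
    return True

# Usage on S = [-5, 5]: x**2 is convex there, sin(x) is not
sample = lambda rng: rng.uniform(-5.0, 5.0)
print(looks_convex(lambda x: x**2, sample))  # True
print(looks_convex(np.sin, sample))          # False
```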
Convex and Concave Functions (cont’d)
Case of a strictly convex function on the convex set S, and case of a function that is nonconvex on S, yet convex on the convex subset S′
[Figure: two graphs of f(x); in the first, the chord value γf(x1) + (1 − γ)f(x2) lies above f(γx1 + (1 − γ)x2) for all x1, x2 in S; in the second, this holds only for points in S′]
Sets Defined by Constraints
Define the set S := {x ∈ IRⁿ : g(x) ≤ 0}, with g a convex function on IRⁿ. Then, S is a convex set
Why?
[Figure: a convex set S bounded by the curve g(x) = 0]
Consider any two points x, y ∈ S. By the convexity of g,
g(γx + (1 − γ)y) ≤ γg(x) + (1 − γ)g(y), ∀γ ∈ (0, 1)
Since g(x) ≤ 0 and g(y) ≤ 0,
γg(x) + (1 − γ)g(y) ≤ 0, ∀γ ∈ (0, 1)
Combining the two inequalities, g(γx + (1 − γ)y) ≤ 0; therefore, γx + (1 − γ)y ∈ S for every γ ∈ (0, 1), i.e., S is convex
Class Exercise: Give a condition on g for the following set to be convex:
S := {x ∈ IRⁿ : g(x) ≥ 0}
Sets Defined by Constraints (cont’d)
What is the condition on h for the following set to be convex:
S := {x ∈ IRⁿ : h(x) = 0}
The set S is convex if h is affine; a nonlinear equality constraint generally destroys convexity
[Figure: a curve h(x) = 0 forming the set S; points on the segment between x and y lie off the curve, hence not in S]
Convex Sets Defined by Constraints
Consider the set
S := {x ∈ IRⁿ : g1(x) ≤ 0, . . . , gm(x) ≤ 0, h1(x) = 0, . . . , hp(x) = 0}
Then, S is convex if:
g1, . . . , gm are convex on IRⁿ
h1, . . . , hp are affine
Convexity and Global Optimality
Consider the constrained program:
min f(x)
s.t. gj(x) ≤ 0, j = 1, . . . , m
hj(x) = 0, j = 1, . . . , p
If f and g1, . . . , gm are convex on IRⁿ, and h1, . . . , hp are affine, then this program is said to be a convex program
Sufficient Condition for Global Optimality
A [strict] local minimum of a convex program is also a [strict] global minimum
On the other hand, a nonconvex program may or may not have local optima that are not global optima
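As an illustration (a sketch assuming SciPy is available; the particular objective and constraint are made up here, not taken from the slides), a small convex program can be handed to a local solver, and the local minimum it returns is guaranteed global:

```python
import numpy as np
from scipy.optimize import minimize

# Convex program: min (x1 - 2)^2 + (x2 + 1)^2  s.t.  x1 + x2 - 1 <= 0
obj = lambda x: (x[0] - 2.0)**2 + (x[1] + 1.0)**2
g = lambda x: x[0] + x[1] - 1.0

# SciPy's 'ineq' convention is fun(x) >= 0, so pass -g to encode g(x) <= 0
cons = [{"type": "ineq", "fun": lambda x: -g(x)}]
res = minimize(obj, x0=np.zeros(2), method="SLSQP", constraints=cons)
print(res.x, res.fun)  # the local solution found, here the global minimum (2, -1)
```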