Optimization Engineering For Machine Learning and AI
Optimization is a foundational area for machine learning and AI in general. Convex optimization, in particular, is the subfield of mathematical optimization that studies minimizing convex functions (or, equivalently, maximizing concave functions) over convex sets. Many classes of convex optimization problems admit polynomial-time algorithms, whereas mathematical optimization in general is NP-hard.
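Throughout the course, such problems are written in the standard form of Boyd and Vandenberghe [1]:

\begin{aligned}
\text{minimize}\quad & f_0(x) \\
\text{subject to}\quad & f_i(x) \le 0, \quad i = 1, \dots, m \\
& a_i^T x = b_i, \quad i = 1, \dots, p,
\end{aligned}

where the cost f_0 and the constraint functions f_1, ..., f_m are convex and the equality constraints are affine.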
In the first lecture of this course, we will talk about the following points:
What is Optimization?
Examples on Optimization
Factors of Optimization
Reliable/Efficient Problems
Goals & Topics of this Course
Brief History on Optimization
=======
In the second lecture, we will be covering important points on convex sets, which are the following:
00:00:00 Affine Combination
00:01:33 Affine Set
00:08:21 Convex Combination
00:09:25 Convex Set
00:13:45 Convex Hull
00:16:28 Example 1: Convex Cones
00:16:55 Conic Combination
00:20:47 Example 2: Hyperplanes
00:24:22 Example 3: Euclidean Ball
00:26:37 Example 4: Ellipsoid
00:30:40 Norms
00:35:51 Example 5: Polyhedra
00:41:18 Example 6: Positive Semidefinite Cone
00:54:31 Operations preserving convexity
01:15:10 Closed & Open set
01:21:10 Solid sets
01:26:25 Pointed set
01:26:57 Proper cones
01:27:28 Generalized Inequalities
01:34:12 Minimum & Minimal Elements
01:46:28 Partial Order
01:48:53 Properties of Generalized Inequalities
01:53:09 Dual Cones
02:04:31 Dual Inequalities
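For quick reference, the two definitions that the rest of this outline builds on can be stated compactly:

\begin{aligned}
C \text{ is convex} &\iff \theta x + (1 - \theta) y \in C \quad \text{for all } x, y \in C,\ \theta \in [0, 1], \\
\mathbf{conv}\, C &= \Big\{ \textstyle\sum_{i=1}^{k} \theta_i x_i \;:\; x_i \in C,\ \theta_i \ge 0,\ \textstyle\sum_{i=1}^{k} \theta_i = 1 \Big\}.
\end{aligned}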
=======
In the third lecture of this course on convex optimization, we will be covering important points on convex functions, which are the following:
00:01:14 Definition of Convex Function
00:03:31 Examples of Convex Function
00:13:50 Convexity in Higher Dimensions
00:24:30 First-order Condition
00:27:08 Second-order Conditions
00:35:17 Epigraphs
00:37:25 Jensen's Inequality
00:39:49 Operations preserving Convexity
00:52:21 Conjugate Convex function
01:02:09 Quasi Convex functions
01:11:14 Log-Convex functions
01:16:16 Convexity with respect to generalized inequalities
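The three characterizations of convexity from this outline, gathered in one place (for f differentiable or twice differentiable on a convex domain where needed):

\begin{aligned}
& f(\theta x + (1 - \theta) y) \le \theta f(x) + (1 - \theta) f(y), \quad \theta \in [0, 1] && \text{(definition)} \\
& f(y) \ge f(x) + \nabla f(x)^T (y - x) && \text{(first-order condition)} \\
& \nabla^2 f(x) \succeq 0 && \text{(second-order condition)}
\end{aligned}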
=======
In Lecture 4 of this course on convex optimization, we will be covering the fundamental principles of convex optimization, which include the following:
00:00 Standard form
04:19 Feasible point
05:07 Globally Optimum point
05:50 Locally Optimum point
15:04 Explicit & Implicit constraints
30:10 Optimality criterion for differentiable cost functions
34:48 Supporting Hyperplane
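The optimality criterion at 30:10, stated compactly: for a differentiable convex cost f_0, a feasible point x is optimal if and only if

\nabla f_0(x)^T (y - x) \ge 0 \quad \text{for all feasible } y,

which reduces to \nabla f_0(x) = 0 in the unconstrained case.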
=======
In Lecture 5 of this course on convex optimization, we will be covering Linear Programming and the Simplex algorithm, which was introduced by George Dantzig. The outline of the lecture is as follows:
00:00:00 What is a Linear Program (LP) ?
00:07:24 LP feasible set
00:10:22 LP forms
00:10:50 Standard form LP
00:11:24 Slack variables
00:13:08 Inequality form LP
00:13:34 Omitting inequality constraints
00:20:38 LP Example: The Diet Problem
00:25:49 The SIMPLEX Algorithm: Method and the usage of Non-basic, Slack, and Artificial variables
00:33:59 The SIMPLEX Algorithm - Example: Iteration 0
00:40:37 The SIMPLEX Algorithm - Example: Iteration 1
00:48:18 The SIMPLEX Algorithm - Example: Iteration 2
00:55:27 The SIMPLEX Algorithm - Example: Iteration 3
01:00:13 MATLAB: Implementing SIMPLEX
01:53:15 MATLAB: Verifying with linprog
01:58:48 George Bernard Dantzig
01:59:12 SIMPLEX: Geometric Interpretation
02:01:09 SIMPLEX: Time Complexity
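As a minimal sketch of the linprog verification step (with illustrative data, not the lecture's diet problem):

% Maximize 3*x1 + 5*x2 subject to x1 <= 4, 2*x2 <= 12, 3*x1 + 2*x2 <= 18, x >= 0.
% linprog minimizes, so we negate the objective.
c  = [-3; -5];
A  = [1 0; 0 2; 3 2];
b  = [4; 12; 18];
lb = zeros(2, 1);
[x, fval] = linprog(c, A, b, [], [], lb, []);   % expect x = [2; 6], fval = -36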
=======
In Lecture 6 of this course on convex optimization, we will cover the essentials of Quadratic Programming. The outline of the lecture is as follows:
00:00 Intro
00:32 What is a Quadratic Program (QP) ?
03:24 QP reformulation
06:05 Illustrating the optimal solution
16:54 Solving a QP on MATLAB
25:43 Outro
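A minimal quadprog sketch in the spirit of the MATLAB segment (illustrative data):

% Minimize (1/2)*x'*H*x + f'*x subject to A*x <= b, with H positive definite.
H = [1 -1; -1 2];
f = [-2; -6];
A = [1 1; -1 2; 2 1];
b = [2; 2; 3];
[x, fval] = quadprog(H, f, A, b);               % expect x approx [0.667; 1.333]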
=======
In Lecture 7 of this course on convex optimization, we will cover the essentials of Quadratically Constrained Quadratic Programs, i.e. QCQPs. The outline of the lecture is as follows:
00:00 Intro
00:33 What is a Quadratically Constrained Quadratic Program (QCQP) ?
05:16 QCQP Feasible Set
06:01 MATLAB Illustration of QCQP Feasible Set
13:39 QCQP Application 1: Minimizing a linear function over a centered ellipsoid
30:42 QCQP Application 2: Minimizing a linear function over an uncentered ellipsoid
37:16 QCQP Application 3: Minimizing a quadratic function over a centered ellipsoid
42:36 Outro
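A sketch of Application 1 (minimizing a linear cost over a centered ellipsoid) with illustrative data, together with the closed-form solution that follows from the KKT conditions:

% Minimize f'*x subject to x'*P*x <= 1, with P positive definite.
f = [1; -2];
P = [2 0.5; 0.5 1];
obj     = @(x) f'*x;
nonlcon = @(x) deal(x'*P*x - 1, []);            % c(x) <= 0, no equality constraints
x = fmincon(obj, zeros(2, 1), [], [], [], [], [], [], nonlcon);
% Closed form: the constraint is active at the optimum, which gives
xstar = -(P\f) / sqrt(f'*(P\f));                % should match x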
=======
In Lecture 8 of this course on convex optimization, we will cover Second-Order Cone Programs, i.e. SOCPs. The outline of the lecture is as follows:
00:00:00 What is Second Order Cone Programming (SOCP) ?
00:02:37 QCQP as an SOCP
00:20:25 Robust Linear Programming as an SOCP
00:31:06 Linear Programming with Random Constraints as an SOCP
00:41:09 Sum of Norms minimization as an SOCP
00:47:27 Max of Norms minimization as an SOCP
00:49:40 Hyperbolic Constrained Programs as SOCPs
00:58:59 Quadratic Fractional Problems as SOCPs
01:02:16 Outro
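For reference, the SOCP standard form used throughout the lecture:

\begin{aligned}
\text{minimize}\quad & f^T x \\
\text{subject to}\quad & \|A_i x + b_i\|_2 \le c_i^T x + d_i, \quad i = 1, \dots, m \\
& Fx = g.
\end{aligned}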
==========
In Lecture 9 of this course on convex optimization, we will cover Geometric Programs, i.e. GPs. The outline of the lecture is as follows:
00:00 Intro
01:51 Monomials and Posynomials
10:45 GP problem formulation (polynomial form)
19:50 Relevant papers
23:12 GP in convex form
29:27 Example 1: Frobenius Norm Diagonal Scaling
33:25 Example 2: Volume Maximization given aspect ratios and area limitations
38:12 Summary
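The key transformation from 23:12 in compact form: for a posynomial f(x) = \sum_k c_k x_1^{a_{1k}} \cdots x_n^{a_{nk}} with c_k > 0, the change of variables y_i = \log x_i and a log of the cost give

\log f(e^{y}) = \log \sum_{k} \exp\!\big(a_k^T y + b_k\big), \qquad b_k = \log c_k,

a log-sum-exp of affine functions, which is convex.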
=====
In Lecture 10 of this course on convex optimization, we will cover Generalized Geometric Programs, i.e. GGPs. The outline of the lecture is as follows:
00:00 Intro
01:16 Generalized Posynomials
08:46 Generalized Geometric Program (GGP)
09:45 GGP as a GP
17:40 Example 1: Floor Planning (GGP)
23:48 Example 2: Power Control (GP)
33:00 Example 3: Digital Circuit Design (GGP)
37:26 Mixed-Integer Geometric Program
39:27 Outro
======
In Lecture 11 of this course on convex optimization, we will cover Semidefinite Programming, i.e. SDPs. The outline of the lecture is as follows:
00:00 Intro
01:05 Generalized Inequality Constraints
05:18 Conic Programs
07:59 Linear Matrix Inequality (LMI)
09:56 LMI brief history (Lyapunov, Kalman, Riccati, etc.)
18:10 Semidefinite Programming (SDP)
21:56 SOCP as SDP
29:30 Eigenvalue Minimization
32:43 Matrix Norm Minimization
34:39 Outro
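For reference, the SDP standard form with an LMI constraint:

\begin{aligned}
\text{minimize}\quad & c^T x \\
\text{subject to}\quad & x_1 F_1 + \cdots + x_n F_n + G \preceq 0 \\
& Ax = b,
\end{aligned}

where F_1, ..., F_n, G are symmetric matrices.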
=====
In Lecture 12 of this course on convex optimization, we will cover various topics related to vector optimization, such as Pareto optimal points and the Pareto frontier, a well-known boundary studied in game theory, risk and trade-off analysis, portfolio analysis, and more. The topics covered are outlined as follows:
00:00:00 Intro
00:01:55 What is Vector Optimization ?
00:06:38 Optimal points & the set of achievable objective values
00:13:27 Pareto optimal points
00:18:56 BLUE estimator (example)
00:28:09 Scalarization
00:32:03 Pareto Frontier (Boundary)
00:38:28 Minimal Upper Bound on a set of matrices (example)
00:43:36 Plotting a Pareto front of regularized least squares on MATLAB (1st way: the genetic algorithm)
00:47:43 Plotting a Pareto front of regularized least squares on MATLAB (2nd way: using fminsearch)
00:53:43 Multicriterion optimization
01:01:39 Scalarization for Multicriterion optimization
01:06:51 Analytical Pareto Front of Regularized Least Squares
01:09:44 Plotting a Pareto front of regularized least squares on MATLAB (3rd way: Analytically)
01:12:08 Outro
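A compact sketch of the analytical approach (the lecture's 3rd way), with illustrative random data:

% Trace the trade-off between ||A*x - b||^2 and ||x||^2 by sweeping lambda.
rng(0);
A = randn(20, 5);  b = randn(20, 1);
lambdas = logspace(-3, 3, 50);
J1 = zeros(size(lambdas));  J2 = zeros(size(lambdas));
for k = 1:numel(lambdas)
    x = (A'*A + lambdas(k)*eye(5)) \ (A'*b);    % minimizer of ||A*x-b||^2 + lambda*||x||^2
    J1(k) = norm(A*x - b)^2;
    J2(k) = norm(x)^2;
end
plot(J1, J2, 'o-'); xlabel('||Ax - b||^2'); ylabel('||x||^2');  % Pareto front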
======
In Lecture 13 of this course on convex optimization, we continue our treatment of vector optimization with a closer look at multicriterion optimization and optimal trade-off analysis. The topics covered are outlined as follows:
00:00 Intro
00:29 Reminder: Multicriterion Optimization
03:17 Multicriterion Optimization: A closer look
09:02 Optimal Trade-off Analysis
12:38 Outro
========
In Lecture 14 of this course on Convex Optimization, we introduce Lagrangian duality theory. In essence, with each optimization problem (convex or not), we can associate a certain function referred to as the Lagrangian function. This function, in turn, yields a dual function (obtained as an infimum of the Lagrangian over the variable of interest x). It turns out that, for any optimization problem, the dual function is a lower bound on the optimal value of the problem at hand. This lecture focuses on many examples that derive the Lagrangian and the associated dual functions. MATLAB implementations are also presented to give useful insights. This lecture is outlined as follows:
00:00 Intro
01:00 Lagrangian function and duality
04:02 Lagrangian dual function
06:46 Lower bound on the optimal value
09:16 MATLAB: Lower bound verification
15:28 Example 1 - Least Squares
17:48 Example 2 - Linear Programming
20:48 Example 3 - Two-way Partitioning
26:04 Relationship between conjugate function and the dual function
31:22 Example 4 - Equality Constrained Norm minimization
33:37 Example 5 - Entropy Maximization
35:44 Outro
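The two central objects of the lecture, for a problem with inequality constraints f_i(x) \le 0 and equality constraints h_i(x) = 0:

L(x, \lambda, \nu) = f_0(x) + \sum_{i=1}^{m} \lambda_i f_i(x) + \sum_{i=1}^{p} \nu_i h_i(x),
\qquad
g(\lambda, \nu) = \inf_{x} L(x, \lambda, \nu),

and for any \lambda \succeq 0 we get the lower bound g(\lambda, \nu) \le p^\star on the optimal value p^\star.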
========
In Lecture 15 of this course on Convex Optimization, we talk about a central topic in convex optimization: the Lagrange Dual Problem. This lecture is outlined as follows:
00:00:00 Intro
00:00:44 Revision: Lagrange Dual Function
00:01:30 The Dual Problem
00:06:54 Example 1: Dual Problem of Standard LP
00:08:59 Example 2: Dual Problem of Inequality LP
00:13:59 Weak Duality
00:16:24 Example 3: The 2-way Partitioning Problem (Revisited)
00:21:42 Strong Duality
00:23:15 Slater’s Condition
00:24:32 What is a Relative Interior (Convex Analysis by R. Tyrrell Rockafellar) ?
00:28:16 Generalization of Slater’s Condition
00:29:26 Example 4: Duality of LS problems
00:38:33 Example 5: Duality of LP problems
00:54:52 Example 6: Duality of QCQP problems
00:59:22 Example 7 : Duality of the Entropy Maximization Problem
01:03:48 Example 8 : Duality of the Trust Region Problem (non-convex problem)
01:11:51 Outro
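In compact form, the Lagrange dual problem and Slater's condition from this outline:

\begin{aligned}
\text{maximize}\quad & g(\lambda, \nu) \\
\text{subject to}\quad & \lambda \succeq 0,
\end{aligned}

with weak duality d^\star \le p^\star always, and strong duality d^\star = p^\star for convex problems whenever Slater's condition holds: there exists an x \in \mathbf{relint}\, \mathcal{D} with f_i(x) < 0 for all i and Ax = b.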
======
In Lecture 16 of this course on Convex Optimization, we talk about a very practical topic when it comes to numerical optimization algorithms: the ε-suboptimality inequality, which reports how good an estimate we have. Said differently, dual feasible points (λ, ν) provide a proof, or certificate, bounding the duality gap.
00:00 Intro
01:59 Lagrangian & Dual Functions
03:40 How good of an estimate ? (Certificate)
07:09 ε-suboptimal
10:09 Stopping Criterion
17:23 Outro
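The certificate argument in one line: if x is primal feasible and (\lambda, \nu) is dual feasible, then

f_0(x) - p^\star \;\le\; f_0(x) - g(\lambda, \nu),

so an iterative algorithm can stop as soon as f_0(x^{(k)}) - g(\lambda^{(k)}, \nu^{(k)}) \le \varepsilon, which certifies that x^{(k)} is ε-suboptimal.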
==============
In Lecture 17 of this course on Convex Optimization, we talk about Complementary Slackness, which can be used as a test for optimality and can even tell us which constraints are active and which are not!
This lecture is outlined as follows:
00:00 Intro
00:46 What is Complementary Slackness ?
08:15 A Genie Example
14:45 Another Genie Example
16:00 Outro
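The condition itself, which underlies both genie examples: under strong duality,

\lambda_i^\star f_i(x^\star) = 0, \quad i = 1, \dots, m,

so \lambda_i^\star > 0 forces the i-th constraint to be active (f_i(x^\star) = 0), and f_i(x^\star) < 0 forces \lambda_i^\star = 0.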
=====
In Lecture 18 of this course on Convex Optimization, we talk about KKT conditions for nonconvex and convex optimization problems.
This lecture is outlined as follows:
00:00 Intro
00:57 KKT Conditions for Nonconvex Problems
04:32 KKT Conditions for Convex Problems
07:48 Example
10:47 Outro
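For reference, the KKT conditions discussed in this lecture:

\begin{aligned}
& f_i(x^\star) \le 0, \quad h_i(x^\star) = 0 && \text{(primal feasibility)} \\
& \lambda_i^\star \ge 0 && \text{(dual feasibility)} \\
& \lambda_i^\star f_i(x^\star) = 0 && \text{(complementary slackness)} \\
& \nabla f_0(x^\star) + \textstyle\sum_i \lambda_i^\star \nabla f_i(x^\star) + \sum_i \nu_i^\star \nabla h_i(x^\star) = 0 && \text{(stationarity)}
\end{aligned}

For convex problems these conditions are sufficient for optimality; under strong duality they are also necessary.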
=======
In Lecture 19 of this course on Convex Optimization, we talk about Perturbation and Sensitivity Analysis of general and convex optimization problems.
This lecture is outlined as follows:
00:00 Intro
02:34 Perturbed Optimization Problem
16:33 Global Perturbation Behavior
37:35 Local Perturbation Behavior
49:36 Shadow Price Interpretation
53:40 Outro
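The shadow price interpretation in one formula: writing the perturbed problem with constraints f_i(x) \le u_i and h_i(x) = v_i, and its optimal value as p^\star(u, v), differentiability of p^\star at (0, 0) gives

\lambda_i^\star = -\frac{\partial p^\star(0, 0)}{\partial u_i}, \qquad \nu_i^\star = -\frac{\partial p^\star(0, 0)}{\partial v_i},

so each optimal dual variable prices how much the optimal value improves when its constraint is loosened.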
=======
In Lecture 20 of this course on Convex Optimization, we talk about Equivalent Reformulations of general and convex optimization problems.
This lecture is outlined as follows:
00:00:00 Intro
00:01:46 Reformulation 1: Introducing new variables
00:25:06 Log-Sum-Exponential Cost
00:33:23 Norm Minimization
00:49:39 Reformulation 1 (cont'd): Introducing constraint variables
01:05:11 Reformulation 2: Cost Transformation
01:14:23 Reformulation 3: Constraint Absorption
01:32:05 Summary
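A one-line instance of Reformulation 1 applied to norm minimization: introducing an epigraph variable t turns

\text{minimize } \|Ax - b\| \quad \text{into} \quad \text{minimize } t \ \ \text{subject to } \|Ax - b\| \le t,

trading a nondifferentiable cost for a linear cost and one extra constraint.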
========
In Lecture 21 of this course on Convex Optimization, we talk about the theorem of weak alternatives of general optimization problems.
This lecture is outlined as follows:
00:00 Introduction
04:02 Feasibility Problem
05:41 Optimization Feasibility Problem
07:55 Dual Function
08:41 Note on Strong Alternatives
10:43 Dual Problem
13:12 Weak Duality
13:41 Relating (S) with (T)
15:16 Weak Alternatives
17:31 Why Weak Alternatives ?
19:33 Summary
23:18 Outro
======
In Lecture 22 of this course on Convex Optimization, we talk about the theorem of strong alternatives of convex optimization problems.
This lecture is outlined as follows:
01:43 Introduction
02:13 Strengthening Weak Alternatives
03:21 Strong Alternatives Conditions
08:27 Strict Inequality Case
09:49 Strong Alternatives of Linear Inequalities
13:31 Strong Alternatives of Strict Linear Inequalities
22:46 Strong Alternatives of Intersection of Ellipsoids
34:53 Summary
38:46 Outro
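The strict linear-inequality case at 13:31, stated compactly: the system Ax \prec b is infeasible if and only if there exists a \lambda with

\lambda \succeq 0, \quad \lambda \ne 0, \quad A^T \lambda = 0, \quad b^T \lambda \le 0,

and exactly one of the two systems is feasible, i.e. they are strong alternatives.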
=======
In Lecture 23 of this course on Convex Optimization, we focus on algorithms that solve unconstrained minimization type problems. The lecture evolves around unconstrained minimization problems that might or might not enjoy closed form solutions. Descent methods are discussed along with exact line search and backtracking. MATLAB implementations are given along the way.
This lecture is outlined as follows:
00:00:00 Introduction
00:01:06 Unconstrained Minimization
00:01:36 Iterative Algorithm Assumptions
00:04:28 Gradient Equivalence
00:09:04 Unconstrained Least Squares
00:20:13 Unconstrained Geometric Program
00:28:10 Initial Subset Assumption
00:35:16 Intuitive Solution of Logarithmic Barrier Minimization
00:40:42 Generalization of Logarithmic Barriers
00:42:57 Descent Methods
00:50:42 Gradient Descent
00:52:59 Exact Line Search
00:56:23 Backtracking
01:00:25 MATLAB: Gradient Descent with Exact Line Search
01:17:35 MATLAB: Gradient Descent with Backtracking
01:20:12 MATLAB: Gradient Descent with Explicit Step Size Update
01:28:07 Summary
01:30:59 Outro
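A minimal sketch of gradient descent with backtracking on an assumed smooth convex test function (not the lecture's exact code):

% Gradient descent with backtracking (Armijo) line search.
f     = @(x) 0.5*(x(1)^2 + 10*x(2)^2);          % ill-conditioned quadratic
gradf = @(x) [x(1); 10*x(2)];
x = [10; 1];                                     % starting point
alpha = 0.3;  beta = 0.7;                        % backtracking parameters
for iter = 1:1000
    g = gradf(x);
    if norm(g) < 1e-8, break; end                % stopping criterion
    t = 1;                                       % start from a unit step
    while f(x - t*g) > f(x) - alpha*t*(g'*g)     % sufficient-decrease test
        t = beta*t;                              % shrink the step
    end
    x = x - t*g;                                 % descent update
end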
===============
References:
[1] Boyd, Stephen, and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
[2] Nesterov, Yurii. Introductory Lectures on Convex Optimization: A Basic Course. Vol. 87. Springer Science & Business Media, 2013.
[3] Ben-Tal, Aharon, and Arkadi Nemirovski. Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications. Vol. 2. SIAM, 2001.
A master class to learn convex optimization for ML and its applications to different fields and areas of engineering
What you will learn
- Convex optimization theory and concepts for machine learning and AI
- Engineering mathematics of convex optimization for ML, DL, and AI
- Convex optimization methods and techniques in ML, DL, and AI
Level: All Levels
Duration: 22 hours
Instructor: RS Academy