The BRADLEY DEPARTMENT of ELECTRICAL and COMPUTER ENGINEERING

# ECE 5454: Optimization Techniques for Electrical and Computer Engineering (Virginia Tech)

### Course Information

#### Description

Convex optimization theory and algorithms and their applications to electrical and computer engineering, including sparse optimization methods, eigen-decomposition techniques, the expectation-maximization algorithm, stochastic optimization techniques, and special techniques relevant to large-scale optimization.

#### Why take this course?

Optimization theory and algorithms play a key role in a wide range of electrical and computer engineering problems. The need to master advanced optimization techniques has become even more pressing with the arrival of the big data era: optimization formulations and methods are proving vital for designing algorithms that extract essential knowledge from huge volumes of data. This course goes beyond conventional convex optimization techniques and covers recent developments driven by big data analysis, with applications tailored to ECE students. The topics are chosen to give students a unified framework for understanding the optimization problems that arise in electrical and computer engineering, especially in machine learning. The course will give students a solid foundation in the general concepts of optimization theory, equip them with an arsenal of techniques for solving real problems, and pave the way for them to develop new optimization techniques.

### Learning Objectives

• Analyze the advantages and disadvantages of large-scale optimization techniques when applied to problems in Electrical and Computer Engineering (ECE)
• Implement selected optimization algorithms commonly used in machine learning and other areas of ECE
• Design and implement appropriate optimization approaches for specific ECE applications

### Course Topics

Each topic is listed with the approximate percentage of the course it occupies.

1. Convex Optimization (10%)
2. Decomposition of Large-scale Optimization Problems into Sequential Simple Sub-problems (15%)
   - (a) Sequential Minimal Optimization (example: SVM)
   - (b) Coordinate Descent Methods (examples: LASSO, Graphical LASSO)
3. Solving Optimization Problems through Eigenvalue Decomposition (15%)
   - (a) Convexity of Functions of Eigenvalues
   - (b) Optimal Data Reconstruction (examples: PCA, LDA, ISOMAP)
   - (c) Normalized Cuts for Image Segmentation
4. Gradient Methods (15%)
   - (a) Conjugate Gradient
   - (b) Backpropagation and Dropout Techniques (example: deep learning)
5. Majorize-Minimization/Minorize-Maximization (15%)
   - (a) Expectation-Maximization Algorithm (using the Gaussian mixture model as an example)
   - (b) Non-negative Matrix Factorization
6. Stochastic Optimization Methods (20%)
   - (a) Metropolis-Hastings Algorithm and Gibbs Sampling
   - (b) Simulated Annealing
   - (c) Genetic Algorithms
   - (d) Tabu Search
   - (e) Swarm Algorithms
7. Emerging Optimization Techniques (10%)
   - (a) Topics chosen by surveying students; one example is submodular optimization
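To give a flavor of the coordinate descent material in topic 2, here is a minimal sketch of cyclic coordinate descent for the LASSO problem (minimize 0.5·||y − Xw||² + λ·||w||₁). This is an illustration, not course-provided code; the function names are our own.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator: the proximal map of t * |.|_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iters=100):
    """Minimize 0.5*||y - X w||^2 + lam*||w||_1 by cycling over coordinates.

    Each coordinate update has a closed form: project the partial
    residual onto column j, then soft-threshold.
    """
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)  # precompute ||X_j||^2 for each column
    for _ in range(n_iters):
        for j in range(d):
            # Residual with coordinate j's contribution removed
            r = y - X @ w + X[:, j] * w[j]
            w[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return w
```

Each inner update is exact for its coordinate, which is what makes coordinate descent attractive for separable regularizers like the L1 norm.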
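As a taste of the stochastic methods in topic 6, the following is a generic simulated-annealing sketch (again our own illustration, with hypothetical function names): random local moves are always accepted when they improve the objective, and accepted with probability exp(−Δ/T) otherwise, while the temperature T is gradually cooled.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, n_steps=10000, t0=1.0, cooling=0.999):
    """Minimize f via simulated annealing with a geometric cooling schedule.

    f        : objective to minimize
    x0       : starting point
    neighbor : function proposing a random nearby candidate
    """
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(n_steps):
        x_new = neighbor(x)
        f_new = f(x_new)
        delta = f_new - fx
        # Accept downhill moves always; uphill moves with prob exp(-delta/T)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = x_new, f_new
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling
    return best, fbest
```

The occasional uphill acceptance is what lets the method escape local minima, in contrast to the purely descent-based methods of topics 2-4.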