The BRADLEY DEPARTMENT of ELECTRICAL and COMPUTER ENGINEERING

ECE 6524 Deep Learning | ECE | Virginia Tech


Course Information

Description

Advanced concepts in Machine Learning and Deep Learning. Models (multi-layer perceptrons, convolutional neural networks, recurrent neural networks, long short-term memory networks, memory networks), learning algorithms (backpropagation, stochastic sub-gradient descent, dropout), connections to structured prediction (Boltzmann machines, "unrolled" belief propagation), and applications to perception and Artificial Intelligence (AI) problems (image classification, detection, and segmentation; image captioning; visual question answering; automatic game playing).

Why take this course?

Deep Learning is a branch of machine learning based on a set of algorithms and techniques that attempt to model high-level abstractions in data by using multiple processing layers. We are witnessing an explosion in data - from billions of images shared online to petabytes of tweets, medical records, and GPS tracks generated by companies, users, and scientific communities. Deep Learning is rapidly emerging as one of the most successful and widely applicable techniques across a range of applications. Many universities are expanding programs in deep learning, and employers are hiring at a frenzied pace in this domain. Students trained in a principled understanding of deep learning techniques will be better equipped to make fundamental contributions to research in machine learning and to applied areas such as perception (vision, text, speech), robotics, bioinformatics, etc.

Prerequisites

ECE 5424G or CS 5824

Major Measurable Learning Objectives

  • Analyze and contrast broad classes of deep learning models (multilayer perceptrons vs ConvNets vs RNNs)
  • Derive and implement backpropagation-based parameter learning and modern optimization techniques in such models
  • Summarize and review state-of-the-art approaches in deep learning
  • Discuss and critique research papers on these topics
  • Identify open research questions in these areas

Course Topics

Topic (percentage of course):

1. History of Neural Networks and Background: a) Perceptron; b) Multi-layer Perceptron; c) Backprop; d) Universal Function Approximators (10%)
2. Deep Learning Models:
   a) Convolutional Neural Networks (20%)
   b) Recurrent Neural Networks (10%)
   c) Long Short-Term Memory Networks (10%)
   d) Memory Networks and Boltzmann Machines (10%)
3. Modern Learning and Optimization Techniques: a) Rectified Linear Units; b) Dropout; c) Distillation and Model Compression (15%)
4. Applications to Perception, Robotics, and AI: a) Image Classification, Detection, Segmentation; b) Image-to-Sentence Generation; c) Robotics, Reinforcement Learning (10%)
5. Design and Implementation of a Technical Project (15%)
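As a small taste of topics 1 and 3 above, here is a minimal NumPy sketch of backpropagation in a multi-layer perceptron with ReLU hidden units. This is an illustration, not course material: the 2-8-1 architecture, random seed, and learning rate are arbitrary assumptions chosen for the example.

```python
import numpy as np

# Illustrative sketch only: a two-layer perceptron trained with
# backpropagation (topic 1c) and ReLU units (topic 3a) on XOR, the
# classic function a single perceptron cannot represent.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters of an assumed 2-8-1 network.
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

lr, losses = 0.1, []
for step in range(2000):
    # Forward pass.
    h_pre = X @ W1 + b1
    h = np.maximum(h_pre, 0.0)                # ReLU
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

    # Binary cross-entropy loss, then backward pass.
    losses.append(-np.mean(y * np.log(p + 1e-9)
                           + (1 - y) * np.log(1 - p + 1e-9)))
    d_logits = (p - y) / len(X)
    dW2 = h.T @ d_logits
    db2 = d_logits.sum(axis=0)
    d_h_pre = (d_logits @ W2.T) * (h_pre > 0)  # ReLU gradient
    dW1 = X.T @ d_h_pre
    db1 = d_h_pre.sum(axis=0)

    # Plain gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

preds = (p > 0.5).astype(int).ravel()
print("final loss:", losses[-1], "predictions:", preds)
```

Dropout (topic 3b) could be grafted onto this sketch by randomly zeroing entries of `h` during training; convolutional and recurrent models (topic 2) replace the dense layers while the backward pass follows the same chain-rule pattern.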