The BRADLEY DEPARTMENT of ELECTRICAL and COMPUTER ENGINEERING


ECE NEWS

ECE Alumnus Michael Henry to Host Seminar

Dr. Michael Henry

In this talk, ECE alumnus Michael Henry will dive deep into the technical innovations behind Mythic. He will also share his perspective on artificial intelligence market trends from a hardware standpoint, as well as the highs and lows of running a fast-scaling startup.

Seminar Information

  • Date: Friday, April 26, 2019
  • Time: 4:00 pm
  • Location: Goodwin Hall – Room 190

Title: Mythic: Pioneering Analog Compute at the Edge to Overcome the End of Digital Scaling

Abstract: Deep neural networks and deep learning have already shown a tremendous ability to serve as the foundation for highly advanced perceptual algorithms. They will be essential for the most important technological developments over the next 10 years: autonomous systems (cars, drones, last-mile delivery), factory robotics, AR/VR, security, smart buildings, and IoT, to name a few. One of the major challenges of realizing this future falls on the semiconductor industry: delivering the massive compute performance needed for real-time neural networks in a form factor suitable for devices with size, power, and cost constraints. When faced with massive compute loads and tough real-time requirements, the challenges come down to performance (both throughput and latency), power dissipation, cost, and ease of integration.

In deep neural networks (DNNs), the limiting factor is memory. The arithmetic itself is fast and low power: DNNs operate primarily on simple operations like multiplies and additions, and bit depths are low (especially for inference). The challenge is memory: getting neural network weights, which can number more than 50 million, to the right processing element at the right time to be multiplied against the input and intermediate data. Mythic's solution is built on new methods of analog computation: calibrated analog currents that are steered across a string of flash transistors and modulated by the stored threshold voltages. This effectively performs the matrix math of DNN inference in a massively parallel fashion that is both low power and low latency. Keeping the computation close to memory is key, and by turning flash memory cells into multiply-accumulate units, we have taken this to the extreme, resulting in massive performance, power, and cost advantages.
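The idea of weights stored as cell conductances and currents summing on a shared line can be illustrated with a toy numerical model. The sketch below is purely illustrative and assumes nothing about Mythic's actual circuit design: weights are quantized to a fixed bit depth (standing in for the resolution of a stored threshold voltage), the dot product happens "in memory" as a sum of per-cell currents, and a small additive noise term models analog imprecision. The function name `analog_mac` and all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_mac(inputs, weights, bits=8, noise_std=0.01):
    """Toy model of an analog in-memory multiply-accumulate (illustrative only).

    Each weight is stored as a quantized value, standing in for a flash
    cell's programmed threshold voltage; each input drives a current
    through its cell, and the per-cell currents sum on a shared output
    line (Kirchhoff's current law), yielding the dot product.
    """
    levels = 2 ** bits
    w_max = float(np.abs(weights).max())
    if w_max == 0.0:
        w_max = 1.0
    # Quantize weights to the cell's finite resolution.
    w_q = np.round(weights / w_max * (levels - 1)) / (levels - 1) * w_max
    # Ideal summed output current, plus additive analog noise.
    ideal = w_q @ inputs
    noise = rng.normal(0.0, noise_std * np.abs(ideal).max(), ideal.shape)
    return ideal + noise

# One small DNN layer: 4 outputs x 50 inputs, computed "in memory".
W = rng.standard_normal((4, 50))
x = rng.standard_normal(50)
y_analog = analog_mac(x, W)
y_exact = W @ x
print(np.max(np.abs(y_analog - y_exact)))  # small quantization + noise error
```

The point of the sketch is that the matrix-vector product never moves the weights: the data flows to where the weights already sit, which is the memory-locality advantage the abstract describes.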

Bio: Mike Henry is CEO and co-founder of Mythic, an AI hardware startup based in Redwood City, CA, and Austin, TX. Under his leadership, Mythic has raised $56M in investment from top-tier VCs, built a team of 80 engineers, and developed novel chip technology that beats incumbents by 100x. Mike received his B.S. in Computer Engineering in 2007 and his Ph.D. in Computer Engineering in 2011, both from Virginia Tech.