Song Han

Song Han received the Ph.D. degree in Electrical Engineering from Stanford University, advised by Prof. Bill Dally. Song's research focuses on energy-efficient deep learning, at the intersection of machine learning and computer architecture. He proposed Deep Compression, which can compress deep neural networks by an order of magnitude without losing prediction accuracy. He designed the Efficient Inference Engine (EIE), a hardware architecture that performs inference directly on the compressed sparse model, saving memory bandwidth and yielding significant speedup and energy savings. His work has been featured by TheNextPlatform, TechEmergence, Embedded Vision and O’Reilly. He led research efforts in model compression and hardware acceleration for deep learning that won the Best Paper Award at ICLR’16 and the Best Paper Award at FPGA’17. Before joining Stanford, Song graduated from Tsinghua University. Song's Ph.D. thesis, "Efficient Methods and Hardware for Deep Learning," can be found here.

Google Publications