Associate Professor, Department of Electrical Engineering and Computer Science
Song Han is an associate professor in MIT’s Department of Electrical Engineering and Computer Science. His research focuses on efficient deep learning computing. He proposed “deep compression,” a technique that shrinks neural networks by an order of magnitude, and designed the “efficient inference engine,” the first deep learning accelerator to exploit model compression and weight sparsity. He has received best paper awards at the International Conference on Learning Representations (ICLR) and the Symposium on Field-Programmable Gate Arrays (FPGA), and he is a recipient of an NSF CAREER Award and MIT Technology Review’s 35 Innovators Under 35 award. Many of his pruning, compression, and acceleration techniques have been integrated into commercial artificial intelligence chips. He earned a PhD in electrical engineering from Stanford University.
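The pruning stage of deep compression can be illustrated with a minimal sketch: keep only the largest-magnitude weights and zero out the rest. This is a simplified, illustrative example (the `magnitude_prune` helper and the 90% sparsity target are assumptions for demonstration), not the paper's full training-aware pipeline, which also retrains, quantizes, and entropy-codes the surviving weights.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude entries of `weights` until
    roughly `sparsity` of them are zero (one-shot magnitude pruning)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Demo on a random weight matrix standing in for a trained layer
rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)
pruned = magnitude_prune(w, sparsity=0.9)
achieved = 1.0 - np.count_nonzero(pruned) / pruned.size
print(f"achieved sparsity: {achieved:.3f}")
```

In practice the resulting sparse matrix is stored in a compressed format (e.g. CSR) and the model is fine-tuned to recover accuracy, which is where most of the size reduction comes from.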
- Wang, H., Zhang, Z., Han, S. (2021). SpAtten: Efficient Sparse Attention Architecture with Cascade Token and Head Pruning. IEEE International Symposium on High-Performance Computer Architecture (HPCA).
- Cai, H., Gan, C., Wang, T., Zhang, Z., Han, S. (2020). Once-for-All: Train One Network and Specialize It for Efficient Deployment on Diverse Hardware Platforms. International Conference on Learning Representations (ICLR).
- Li, M., Lin, J., Ding, Y., Liu, Z., Zhu, J., Han, S. (2020). GAN Compression: Learning Efficient Architectures for Conditional GANs. IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
- Zhang, Z., Wang, H., Han, S., Dally, W. (2020). SpArch: Efficient Architecture for Sparse Matrix Multiplication. IEEE International Symposium on High-Performance Computer Architecture (HPCA).
- Liu, Z., Tang, H., Lin, Y., Han, S. (2019). Point-Voxel CNN for Efficient 3D Deep Learning. Neural Information Processing Systems (NeurIPS).
- Wang, H., Wang, K., Yang, J., Shen, L., Sun, N., Lee, H.-S., Han, S. (2020). Transferable Transistor Sizing with Graph Neural Networks and Reinforcement Learning. Design Automation Conference (DAC).
- February 10, 2021: MIT News, A language learning system that pays attention — more efficiently than ever before.
- June 24, 2020: VentureBeat, MIT researchers claim augmentation technique can train GANs with less data.
- April 23, 2020: MIT News, Reducing AI’s carbon footprint.
- October 11, 2019: MIT News, Faster video recognition for the smartphone era.
- April 23, 2019: MIT News, Improving security as artificial intelligence moves to smartphones.