Associate Professor, Department of Electrical Engineering and Computer Science
Song Han is an associate professor in MIT’s Department of Electrical Engineering and Computer Science. His research focuses on efficient deep learning computing. He proposed “deep compression,” a technique that reduces neural network size by an order of magnitude, and designed the “Efficient Inference Engine,” the first hardware accelerator to exploit model compression and weight sparsity in deep learning. He has received best paper awards at the International Conference on Learning Representations (ICLR) and the International Symposium on Field-Programmable Gate Arrays (FPGA). He is also a recipient of an NSF CAREER Award and was named to MIT Technology Review’s 35 Innovators Under 35 list. Many of his pruning, compression, and acceleration techniques have been integrated into commercial artificial intelligence chips. He earned a PhD in electrical engineering from Stanford University.
- Xiao, G., Seznec, M., Wu, H., Demouth, J., Han, S. (2023) SmoothQuant: Accurate and Efficient Post-Training Quantization for Large Language Models. International Conference on Machine Learning (ICML).
- Liu, Z., Tang, H., Amini, A., Yang, X., Mao, H., Rus, D., Han, S. (2023) BEVFusion: Multi-Task Multi-Sensor Fusion with Unified Bird’s-Eye View Representation. IEEE International Conference on Robotics and Automation (ICRA).
- Liu, Z., Yang, X., Tang, H., Yang, S., Han, S. (2023) FlatFormer: Flattened Window Attention for Efficient Point Cloud Transformer. IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
- February 10, 2021: MIT News, A language learning system that pays attention — more efficiently than ever before.
- June 24, 2020: VentureBeat, MIT researchers claim augmentation technique can train GANs with less data.
- April 23, 2020: MIT News, Reducing AI’s carbon footprint.
- October 11, 2019: MIT News, Faster video recognition for the smartphone era.
- April 23, 2019: MIT News, Improving security as artificial intelligence moves to smartphones.