Cross-channel Communication Networks

Deep Learning


  • Jianwei Yang
  • Zhile Ren
  • Chuang Gan
  • Hongyuan Zhu
  • Ji Lin
  • Devi Parikh

In deep neural networks, neurons pass information to others by sending their feature responses. While much recent progress has come from making networks deeper, information is only propagated from lower levels to higher levels in a hierarchical feed-forward manner. Such schemes do not allow for interactions among neurons within the same layer — there typically are no connections among those neurons. In this paper, we propose Neural Communication Network (NCN), a simple yet effective module to encourage inter-neuron communication at the same layer. Concretely, NCN enables neurons to exchange information through a micro neural network, which consists of a feature encoder, a message transmitter and a feature decoder, before propagating the information to neurons in the next layer. Using NCN, neurons are able to acquire the feature responses of other neurons at the same layer, and learn to represent the input data with compact and discriminative features. Extensive experiments on multiple computer vision tasks, ablation studies and visualizations show that our proposed mechanism allows shallower networks to aggregate useful information within each layer, achieving performance on par with or even better than baseline deep networks.
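The encoder–transmitter–decoder scheme described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: `ncn_block`, the weight shapes, and the choice of mean-of-others as the transmitted message are all simplifying assumptions made here for clarity.

```python
import numpy as np

def ncn_block(x, w_enc, w_dec):
    """Hypothetical sketch of same-layer communication.

    x: (C, D) array of feature responses, one row per neuron/channel.
    Each row is encoded, receives a message (here: the mean of the
    other rows' encodings), and the decoded message is added back
    as a residual before propagating to the next layer.
    """
    encoded = np.tanh(x @ w_enc)                  # feature encoder
    total = encoded.sum(axis=0, keepdims=True)
    messages = (total - encoded) / (x.shape[0] - 1)  # message transmitter
    decoded = messages @ w_dec                    # feature decoder
    return x + decoded                            # residual update

rng = np.random.default_rng(0)
C, D, H = 8, 16, 4                                # channels, feature dim, hidden dim
x = rng.standard_normal((C, D))
w_enc = rng.standard_normal((D, H)) * 0.1
w_dec = rng.standard_normal((H, D)) * 0.1
y = ncn_block(x, w_enc, w_dec)
print(y.shape)  # (8, 16): output keeps the input shape
```

Because the update is residual and shape-preserving, a block like this can in principle be dropped between any two layers of an existing network.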

Please cite our work using the BibTeX below.

title = {Cross-channel Communication Networks},
author = {Yang, Jianwei and Ren, Zhile and Gan, Chuang and Zhu, Hongyuan and Parikh, Devi},
booktitle = {Advances in Neural Information Processing Systems 32},
editor = {H. Wallach and H. Larochelle and A. Beygelzimer and F. d\textquotesingle Alch\'{e}-Buc and E. Fox and R. Garnett},
pages = {1297--1306},
year = {2019},
publisher = {Curran Associates, Inc.},
url = {}