Deep Differentiable Logic Gate Networks
Authors
- Hildegard Kühne
- Felix Petersen
- Christian Borgelt
- Oliver Deussen
Published on
12/04/2022
Recently, research has increasingly focused on developing efficient neural network architectures. In this work, we explore logic gate networks for machine learning tasks by learning combinations of logic gates. These networks comprise logic gates such as "AND" and "XOR", which allow for very fast execution. The difficulty in learning logic gate networks is that they are conventionally non-differentiable and therefore do not allow training with gradient descent. Thus, to allow for effective training, we propose differentiable logic gate networks, an architecture that combines real-valued logics with a continuously parameterized relaxation of the network. The resulting discretized logic gate networks achieve fast inference speeds, e.g., beyond a million images of MNIST per second on a single CPU core.
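To illustrate the core idea, here is a minimal sketch (not the authors' implementation) of two ingredients the abstract mentions: real-valued relaxations of Boolean gates, and a continuously parameterized choice over gates that can be trained by gradient descent. The gate formulas are the standard probabilistic relaxations; the neuron mixes candidate gates via a softmax over learnable weights. All function and variable names here are illustrative, not from the paper.

```python
import math

# Real-valued relaxations of Boolean gates: inputs a, b in [0, 1] are
# read as probabilities of being True, and each formula gives the
# probability that the gate outputs True (assuming independence).
# At binary inputs (0 or 1) they reproduce the Boolean truth tables.
def and_gate(a, b):
    return a * b

def or_gate(a, b):
    return a + b - a * b

def xor_gate(a, b):
    return a + b - 2 * a * b

def nand_gate(a, b):
    return 1 - a * b

GATES = [and_gate, or_gate, xor_gate, nand_gate]

def softmax(weights):
    m = max(weights)
    exps = [math.exp(w - m) for w in weights]
    total = sum(exps)
    return [e / total for e in exps]

def soft_neuron(a, b, weights):
    """One relaxed logic neuron: a softmax-weighted mixture over the
    candidate gates. During training the weights are learned with
    gradient descent; the whole expression is differentiable in both
    the inputs and the weights."""
    probs = softmax(weights)
    return sum(p * g(a, b) for p, g in zip(probs, GATES))
```

After training, each neuron is discretized by keeping only its highest-weighted gate, which yields a network of hard logic gates and explains the fast inference speeds reported above.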
Please cite our work using the BibTeX below.
@inproceedings{petersen2022deep,
  title={Deep Differentiable Logic Gate Networks},
  author={Felix Petersen and Christian Borgelt and Hilde Kuehne and Oliver Deussen},
  booktitle={Advances in Neural Information Processing Systems},
  editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year={2022},
  url={https://openreview.net/forum?id=vF3WefcoePW}
}