difflogic: inference of over a million MNIST images per second on a single CPU core

2023-09-13 18:52:59

Logic gate networks allow for very fast classification, with speeds beyond a million images per second on a single CPU core (for MNIST at > 97.5% accuracy).

Compared to the fastest neural networks reaching 98.4% on MNIST, our method is more than 12× faster than the best binary neural networks and 2–3 orders of magnitude faster than the theoretical speed of sparse neural networks.

https://github.com/Felix-Petersen/difflogic

difflogic - A Library for Differentiable Logic Gate Networks

This repository includes the official implementation of our NeurIPS 2022 paper "Deep Differentiable Logic Gate Networks" (Paper @ arXiv).

The goal behind differentiable logic gate networks is to solve machine learning tasks by learning combinations of logic gates, i.e., so-called logic gate networks. Because logic gate networks are conventionally non-differentiable, they cannot be trained with methods such as gradient descent. Differentiable logic gate networks are therefore a differentiable relaxation of logic gate networks that allows them to be learned efficiently with gradient descent. Specifically, difflogic combines real-valued logics with a continuously parameterized relaxation of the network. This makes it possible to learn which logic gate (out of the 16 possible two-input gates) is optimal for each neuron. The resulting discretized logic gate networks achieve fast inference speeds, e.g., beyond a million MNIST images per second on a single CPU core.
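To make the idea concrete, here is a minimal sketch of a single relaxed logic-gate neuron in PyTorch: each of the 16 two-input gates is replaced by a real-valued (probabilistic) counterpart, and a learned softmax over the 16 choices decides which gate the neuron represents. This is an illustrative toy, not the library's implementation; the names `real_valued_gates` and `ToyDifferentiableLogicNeuron` are made up for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def real_valued_gates(a, b):
    """All 16 two-input logic gates, relaxed to real values on [0, 1]."""
    ab = a * b
    return torch.stack([
        torch.zeros_like(a),    # FALSE
        ab,                     # a AND b
        a - ab,                 # a AND NOT b
        a,                      # a
        b - ab,                 # NOT a AND b
        b,                      # b
        a + b - 2 * ab,         # a XOR b
        a + b - ab,             # a OR b
        1 - (a + b - ab),       # a NOR b
        1 - (a + b - 2 * ab),   # a XNOR b
        1 - b,                  # NOT b
        1 - b + ab,             # a OR NOT b
        1 - a,                  # NOT a
        1 - a + ab,             # NOT a OR b
        1 - ab,                 # a NAND b
        torch.ones_like(a),     # TRUE
    ], dim=-1)

class ToyDifferentiableLogicNeuron(nn.Module):
    """One relaxed logic-gate neuron: a learned categorical distribution
    over the 16 gates, applied to a fixed pair of input wires."""
    def __init__(self):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(16))   # one weight per gate

    def forward(self, a, b):
        w = F.softmax(self.logits, dim=-1)            # soft choice of gate
        return (real_valued_gates(a, b) * w).sum(-1)  # expected gate output
```

After training, each neuron is discretized by keeping only its highest-weighted gate, which is what enables the fast bit-level inference reported above.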

difflogic is a Python 3.6- and PyTorch 1.9.0-based library for training and inference with logic gate networks. The library can be installed with:

```bash
pip install difflogic
```

⚠️ Note that difflogic requires CUDA, the CUDA Toolkit (for compilation), and torch>=1.9.0 (matching the CUDA version).

For additional installation support, see INSTALLATION_SUPPORT.md.
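Once installed, a logic gate network can be assembled from the library's `LogicLayer` and `GroupSum` modules. The sketch below is based on the usage shown in the repository README; the layer widths, `tau`, and the learning rate are illustrative choices, not recommendations.

```python
import torch
from difflogic import LogicLayer, GroupSum

# A small logic gate network for MNIST (illustrative sizes).
model = torch.nn.Sequential(
    torch.nn.Flatten(),
    LogicLayer(784, 16_000),      # input: flattened 28x28 image in [0, 1]
    LogicLayer(16_000, 16_000),
    LogicLayer(16_000, 16_000),
    GroupSum(k=10, tau=30),       # aggregate gate outputs into 10 class scores
).cuda()                          # difflogic requires CUDA (see note above)

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()

def train_step(x, y):
    # x: batch of images in [0, 1], y: integer class labels
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```

During training the network is evaluated in its relaxed, real-valued form; for deployment it is discretized to plain logic gates, which is where the single-core CPU throughput comes from.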
