LUT Neural Networks
Background
- The basic building block of an FPGA is the LUT (look-up table). If the LUT is viewed as a logic compute unit (see the first sketch after this list):
  - the truth table of the LUT can represent static weights
  - some of the LUT's inputs can represent dynamic weights
- Compared with AI processors, the FPGA's reconfigurability means that:
  - some or all of the dynamic weights (operands) can be baked into static truth tables, greatly improving PPA (power, performance, area); see the second sketch after this list
  - different models and parameters can be handled simply by reconfiguring the FPGA's logic cells
- This approach only suits inference: in principle, weights that are still being trained cannot be made static, so the PPA advantage there is small
- Making the FPGA's LUTs the core compute units of an AI chip helps standardize the hardware
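
The first two bullets can be made concrete with a small sketch. Below, a k-input LUT is modeled simply as a truth table (a dict from input bit tuples to an output bit), and the neuron is an XNOR-popcount unit in the spirit of the binarized-network papers listed under References; the function names and the 3-activation + 3-weight input split are illustrative assumptions, not taken from any of the cited designs.

```python
# Minimal sketch: a LUT modeled as a truth table (dict). The table itself
# encodes the "static" part of a binarized XNOR-popcount neuron (the threshold),
# while half of the LUT inputs carry the "dynamic" weight bits.
# (Illustrative only; names and the 3+3 input split are assumptions.)
from itertools import product

def build_lut(fn, n_inputs):
    """Enumerate a boolean function over all input patterns -> truth table."""
    return {bits: fn(bits) for bits in product((0, 1), repeat=n_inputs)}

def neuron_with_dynamic_weights(bits, threshold=2):
    """First 3 bits: activations, last 3 bits: weight bits fed in at run time.
    Output 1 if at least `threshold` activation/weight pairs match (XNOR + popcount)."""
    x, w = bits[:3], bits[3:]
    matches = sum(1 for a, b in zip(x, w) if a == b)
    return 1 if matches >= threshold else 0

# One physical 6-input LUT holds this 64-entry table; the threshold is "static"
# (baked into the table), while the weight bits remain run-time inputs.
lut6 = build_lut(neuron_with_dynamic_weights, 6)
print(lut6[(1, 0, 1, 1, 0, 0)])  # activations (1,0,1), weights (1,0,0) -> 1
```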
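The PPA point can be sketched the same way: once the binary weights are fixed after training, they can be folded into the truth table, so the same toy neuron needs a 3-input, 8-entry table instead of a 6-input, 64-entry one. Again, names and sizes below are illustrative assumptions.

```python
# Minimal sketch of weight folding: bake fixed post-training weights into a
# static truth table. A neuron that needs a 6-input LUT when its 3 weight bits
# arrive as run-time inputs fits in a 3-input LUT once the weights are constants.
from itertools import product

def xnor_popcount_neuron(x_bits, w_bits, threshold=2):
    """Output 1 if at least `threshold` activation/weight pairs match."""
    matches = sum(1 for a, b in zip(x_bits, w_bits) if a == b)
    return 1 if matches >= threshold else 0

def fold_weights(w_bits, threshold=2):
    """Bake fixed weights into a truth table indexed only by the activations."""
    n = len(w_bits)
    return {x: xnor_popcount_neuron(x, w_bits, threshold)
            for x in product((0, 1), repeat=n)}

trained_w = (1, 0, 1)            # weights known after training
lut3 = fold_weights(trained_w)   # the weights are now "static"

print(len(lut3))                 # 8 entries (versus 64 if the weights stayed dynamic)
print(lut3[(1, 0, 1)])           # all pairs match -> 1
```

The table shrink (64 entries to 8) is the toy version of the claim above: specializing dynamic operands into static truth tables uses fewer LUT inputs and less logic, which is where the PPA gain comes from.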
References
- BinaryBrain
  https://github.com/ryuz/BinaryBrain
- バイナリニューラルネットとハードウェアの関係 (Binary neural networks and hardware)
  https://www.slideshare.net/kentotajiri/ss-77136469
- BinaryConnect: Training Deep Neural Networks with binary weights during propagations
  https://arxiv.org/pdf/1511.00363.pdf
- Binarized Neural Networks
  https://arxiv.org/abs/1602.02505
- Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1
  https://arxiv.org/abs/1602.02830
- XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks
  https://arxiv.org/abs/1603.05279
- Xilinx UltraScale Architecture Configurable Logic Block User Guide
  https://japan.xilinx.com/support/documentation/user_guides/ug574-ultrascale-clb.pdf