This document summarizes a presentation on 1-bit semantic segmentation. The presentation covers quantizing neural networks to 1 bit so that on-device AI can run on small, low-power processors: building and training binarized neural networks, comparing their accuracy against FP32 baselines, and implementing a hardware architecture for real-time 1-bit semantic segmentation on an FPGA board. The results show that neural network quantization combined with specialized hardware design makes low-cost, embedded semantic segmentation practical.
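The presentation itself is summarized without code, but a minimal sketch may help illustrate what "training a binarized neural network" typically involves: weights and activations are constrained to {-1, +1} in the forward pass, while gradients flow through a straight-through estimator. The class names (`BinarizeSTE`, `BinaryConv2d`) and the use of PyTorch are assumptions for illustration, not the presentation's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """Binarize to {-1, +1} in the forward pass; pass gradients straight through
    (a common approximation for training 1-bit networks, assumed here)."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Straight-through estimator: zero the gradient where |x| > 1
        return grad_output * (x.abs() <= 1).float()

class BinaryConv2d(nn.Conv2d):
    """Convolution whose weights and activations are quantized to 1 bit."""
    def forward(self, x):
        w_bin = BinarizeSTE.apply(self.weight)
        x_bin = BinarizeSTE.apply(x)
        return F.conv2d(x_bin, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

# Example: a 1-bit conv block as it might appear in a segmentation encoder
layer = BinaryConv2d(16, 32, kernel_size=3, padding=1, bias=False)
out = layer(torch.randn(1, 16, 64, 64))
print(out.shape)  # torch.Size([1, 32, 64, 64])
```

On hardware such as an FPGA, the appeal of this formulation is that the 1-bit multiply-accumulate reduces to XNOR and popcount operations, which is what enables the small, low-power real-time implementation described in the presentation.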