Apports des alternatives à la rétropropagation dans l'apprentissage des réseaux de neurones binaires (Contributions of alternatives to backpropagation for training binary neural networks)
In EGC 2023, vol. RNTI-E-39, pp. 449-458
Abstract
Current artificial neural networks are trained with parameters encoded as floating-point numbers, which take up a large amount of memory. As these models grow larger, it becomes increasingly difficult to train and use them on edge devices such as smartphones. Binary neural networks promise to reduce the size of deep neural network models, increase inference speed, and decrease energy consumption, thus allowing more powerful models to be deployed on edge devices. However, binary neural networks remain difficult to train with the usual backpropagation algorithm. We provide comparative experimental results for three algorithms, including the backpropagation baseline, on the MNIST and CIFAR-10 datasets. The results show that binary neural networks can be trained with alternatives to backpropagation and can achieve better performance.
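For context on the backpropagation baseline mentioned in the abstract, the sketch below illustrates the common straight-through-estimator approach to training a binary layer: weights are binarized with sign() in the forward pass, and the gradient is passed through the non-differentiable sign in the backward pass. This is a minimal illustrative example, not the paper's implementation; the class names BinarizeSTE and BinaryLinear are hypothetical.

```python
# Minimal sketch of training a binary layer with backpropagation via the
# straight-through estimator (STE). Not the paper's code; for illustration only.
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """sign() in the forward pass, clipped identity gradient in the backward pass."""

    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_output):
        (w,) = ctx.saved_tensors
        # STE: pass the gradient where |w| <= 1, zero it elsewhere.
        return grad_output * (w.abs() <= 1).float()


class BinaryLinear(nn.Linear):
    """Linear layer whose real-valued weights are binarized to {-1, +1} at each forward pass."""

    def forward(self, x):
        return nn.functional.linear(x, BinarizeSTE.apply(self.weight), self.bias)


# Usage: a tiny MNIST-style classifier built from binary layers.
model = nn.Sequential(nn.Flatten(), BinaryLinear(784, 256), nn.ReLU(), BinaryLinear(256, 10))
```

The real-valued weights are kept only during training for gradient accumulation; at inference time the binarized {-1, +1} weights can be stored with one bit each, which is the source of the memory and speed gains discussed above.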