Deep Learning and Neural Architecture Search for Optimizing Binary Neural Network Image Super Resolution

Yuanxin Su, Li Minn Ang*, Kah Phooi Seng, Jeremy Smith

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The evolution of super-resolution (SR) technology has seen significant advancements through the adoption of deep learning methods. However, deploying such models on resource-constrained devices requires networks that not only perform well but also conserve computational resources. Binary neural networks (BNNs) offer a promising solution by reducing data precision to binary levels, thus lowering computational complexity and memory requirements. Yet for BNNs, an effective architecture is essential due to their inherent limitations in representing information, and designing such architectures traditionally requires extensive computational resources and time. With the advancement of neural architecture search (NAS), differentiable NAS has emerged as an attractive solution for efficiently crafting network structures. In this paper, we introduce a novel and efficient binary network search method tailored for image super-resolution tasks. We adapt the search space specifically for super-resolution to ensure it is optimally suited to the requirements of such tasks. Furthermore, we incorporate Libra Parameter Binarization (Libra-PB) to maximize information retention during forward propagation. Our experimental results demonstrate that the network structures generated by our method require only a third of the parameters of conventional methods, yet deliver comparable performance.
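To make the Libra-PB step concrete, the following is a minimal sketch of what Libra-PB-style forward binarization might look like, based on the published IR-Net formulation (zero-mean balancing, standardization, then sign with an integer power-of-two scale). The function name and exact numerical details are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def libra_pb_binarize(w: np.ndarray) -> np.ndarray:
    """Illustrative Libra-PB-style binarization (assumed form).

    Balances weights to zero mean and standardizes them so the binary
    codes carry maximal information, then binarizes with sign() and a
    power-of-two scaling factor that preserves magnitude cheaply.
    """
    w_bal = w - w.mean()                       # balance: zero-mean weights
    w_std = w_bal / (w_bal.std() + 1e-12)      # standardize for stability
    n = w.size
    # integer (power-of-two) scale, so inference needs only bit shifts
    s = np.round(np.log2(np.abs(w_std).sum() / n))
    return np.sign(w_std) * (2.0 ** s)
```

A power-of-two scale is chosen here (rather than a floating-point scalar) because it can be applied with a bit shift on hardware, which matches the resource-constrained deployment setting the abstract describes.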

Original language: English
Article number: 369
Journal: Biomimetics
Volume: 9
Issue number: 6
DOIs
Publication status: Published - Jun 2024

Keywords

  • binary neural network
  • deep learning
  • image super resolution
  • neural architecture search
