Fraunhofer-Gesellschaft
2021
Master Thesis
Title

Training Mixed Precision Neural Networks with Energy Constraints for a FeFET-Crossbar-Based Accelerator

Abstract
Dedicated hardware accelerators, quantization of network parameters, and optimized dataflows can increase the efficiency of deep convolutional neural network (CNN) inference on mobile or embedded devices. Mixed-precision networks in particular achieve high network compression with only small accuracy degradation. Learning a precision distribution policy across all layers' weights and activations requires feedback during the training process. Therefore, a differentiable energy consumption model of a scalable, two-level, FeFET-crossbar-based accelerator is developed and a differentiable, pre-defined, accelerator-specific dataflow algorithm is implemented. Both are directly integrated into a standard, gradient-descent, mixed-precision training method via an energy consumption loss. The additional loss enables the training process to learn optimal precision parameters with respect to a trade-off between model accuracy and energy consumption. The proposed training method is evaluated on the MNIST dataset, and the limits induced by quantization are discussed. It is also shown how the training method must be parametrized to ensure a stable training process. Compared to an energy-unaware, mixed-precision training method that uses memory constraints to learn the precision distribution, the energy-aware method proposed in this work achieves a 1.5 times higher energy reduction.
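The approach summarized above amounts to adding a differentiable energy term to the usual task loss, so that gradient descent can update soft per-layer precision (bit-width) parameters alongside the network weights. The following minimal sketch illustrates that idea only; it is not the thesis implementation, and energy_model, total_loss, the bit-width tensor, and the energy_weight factor are hypothetical placeholders standing in for the FeFET-crossbar energy model and dataflow described in the abstract.

# Minimal illustrative sketch (hypothetical, not the thesis implementation):
# a differentiable energy term computed from learnable soft per-layer
# bit-widths is added to the task loss, so standard gradient descent trades
# off model accuracy against estimated energy consumption.
import torch
import torch.nn.functional as F

def energy_model(bitwidths: torch.Tensor) -> torch.Tensor:
    # Placeholder for a differentiable accelerator energy model; here energy
    # is simply assumed to grow with each layer's (soft) bit-width.
    return bitwidths.clamp(min=1.0).sum()

def total_loss(logits: torch.Tensor,
               targets: torch.Tensor,
               bitwidths: torch.Tensor,
               energy_weight: float = 1e-3) -> torch.Tensor:
    task_loss = F.cross_entropy(logits, targets)    # accuracy objective
    energy_loss = energy_model(bitwidths)           # differentiable energy estimate
    return task_loss + energy_weight * energy_loss  # accuracy/energy trade-off

# Usage sketch: the soft bit-widths are ordinary learnable parameters.
bitwidths = torch.tensor([8.0, 6.0, 4.0], requires_grad=True)
logits = torch.randn(16, 10, requires_grad=True)    # stand-in for network output
targets = torch.randint(0, 10, (16,))
loss = total_loss(logits, targets, bitwidths)
loss.backward()                                     # gradients also reach the bit-widths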
Thesis Note
München, TU, Master Thesis, 2021
Author(s)
Pscheidl, Ferdinand  
Fraunhofer-Einrichtung für Mikrosysteme und Festkörper-Technologien EMFT  
Publishing Place
München
Project(s)
ANDANTE
Funder
European Commission EC  
Bundesministerium für Bildung und Forschung BMBF (Deutschland)  
File(s)
Download (2.87 MB)
Rights
Use according to copyright law
DOI
10.24406/publica-fhg-283548
Language
English
Keyword(s)
  • Neuromorphics
  • Artificial Neural Networks
  • mixed-precision training
  • supervised learning
  • neural network hardware
  • edge computing
  • accelerator architectures
  • hardware-aware training