Year
2020
Document Type
Conference Paper
Title
Neural Network Compression via Learnable Wavelet Transforms
Abstract
Wavelets are well known for data compression, yet they have rarely been applied to the compression of neural networks. This paper shows how the fast wavelet transform can be used to compress linear layers in neural networks. Linear layers still occupy a significant portion of the parameters in recurrent neural networks (RNNs). Through our method, we can learn both the wavelet bases and the corresponding coefficients to efficiently represent the linear layers of RNNs. Our wavelet-compressed RNNs have significantly fewer parameters yet still perform competitively with the state of the art on synthetic and real-world RNN benchmarks. Wavelet optimization adds basis flexibility without a large number of extra weights. Source code is available at https://github.com/v0lta/Wavelet-network-compression.
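
The mechanics behind the abstract can be illustrated with a short sketch. The following PyTorch snippet is a minimal, hypothetical one-level version of the idea, not the authors' implementation (that lives in the repository linked above): a linear layer's weight matrix is synthesized row by row from learnable coefficients through a learnable two-channel filter bank initialized to the Haar wavelet. The class name WaveletLinear and all parameter names are illustrative. The paper's actual savings come from a deeper multi-level transform with sparse coefficients, and the learned filters are kept wavelet-like with extra regularization terms; both are omitted here for brevity.

    import math

    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class WaveletLinear(nn.Module):
        """Hypothetical sketch: a linear layer whose weight matrix is
        synthesized from learnable wavelet coefficients through a
        learnable two-channel filter bank (Haar initialization).
        One-level only; the paper uses a multi-level fast wavelet
        transform plus coefficient sparsity for actual compression."""

        def __init__(self, in_features: int, out_features: int):
            super().__init__()
            assert in_features % 2 == 0, "one-level FWT needs even width"
            # Learnable synthesis filters, initialized to the Haar wavelet.
            s = 1.0 / math.sqrt(2.0)
            self.g_lo = nn.Parameter(torch.tensor([s, s]))   # low-pass
            self.g_hi = nn.Parameter(torch.tensor([s, -s]))  # high-pass
            # Learnable coefficients: one (approximation, detail) pair of
            # half-length bands per output row of the weight matrix.
            self.coeffs = nn.Parameter(
                0.02 * torch.randn(out_features, 2, in_features // 2))
            self.bias = nn.Parameter(torch.zeros(out_features))

        def weight(self) -> torch.Tensor:
            # Inverse (synthesis) step of a one-level fast wavelet
            # transform as a stride-2 transposed convolution: the two
            # coefficient channels are upsampled and filtered back into
            # one full-resolution row of the weight matrix.
            kernel = torch.stack((self.g_lo, self.g_hi)).unsqueeze(1)  # (2, 1, 2)
            rows = F.conv_transpose1d(self.coeffs, kernel, stride=2)   # (out, 1, in)
            return rows.squeeze(1)                                     # (out, in)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return F.linear(x, self.weight(), self.bias)


    # Usage: drop-in replacement for nn.Linear(64, 32).
    layer = WaveletLinear(64, 32)
    y = layer(torch.randn(8, 64))
    print(y.shape)  # torch.Size([8, 32])

Because only the parameterization of the weight changes while the forward pass stays a standard matrix multiply, such a module trains end to end with ordinary backpropagation, which is what lets the wavelet bases themselves be learned.
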
Author(s)
Wolter, Moritz  
Fraunhofer-Institut für Algorithmen und Wissenschaftliches Rechnen SCAI  
Lin, Shaohui
School of Computing, National University of Singapore, Singapore, Singapore
Yao, Angela
School of Computing, National University of Singapore, Singapore, Singapore
Mainwork
Artificial Neural Networks and Machine Learning - ICANN 2020. Proceedings. Pt. II
Conference
International Conference on Artificial Neural Networks (ICANN) 2020  
DOI
10.1007/978-3-030-61616-8_4
Language
English
Institute
Fraunhofer-Institut für Algorithmen und Wissenschaftliches Rechnen SCAI
Keyword(s)
  • Wavelets
  • Network compression