2020
Conference Paper
Title
ProxSGD: Training Structured Neural Networks under Regularization and Constraints
Abstract
In this paper, we consider the problem of training structured neural networks (NN) with nonsmooth regularization (e.g., the ℓ1-norm) and constraints (e.g., interval constraints). We formulate training as a constrained nonsmooth nonconvex optimization problem, and propose a convergent proximal-type stochastic gradient descent (ProxSGD) algorithm. We show that under properly selected learning rates, with probability 1, every limit point of the sequence generated by the proposed ProxSGD algorithm is a stationary point. Finally, to support the theoretical analysis and demonstrate the flexibility of ProxSGD, we show by extensive numerical tests how ProxSGD can be used to train either sparse or binary neural networks through a suitable choice of the regularization function and constraint set.
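To illustrate the update described in the abstract, below is a minimal NumPy sketch of a ProxSGD-style iteration for the ℓ1-regularized, box-constrained case, where the per-iteration convex subproblem has a closed form (soft-thresholding followed by clipping onto the interval). The step-size schedules, the constant quadratic weight tau, and the toy least-squares usage are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def prox_l1_box(z, lam, lo, hi):
    # Soft-threshold for the l1 term, then clip onto the interval [lo, hi];
    # for this separable 1-D convex subproblem the composition is exact.
    return np.clip(np.sign(z) * np.maximum(np.abs(z) - lam, 0.0), lo, hi)

def proxsgd_step(x, v, grad, t, lam=1e-3, tau=1.0, lo=-1.0, hi=1.0):
    # Decaying schedules; the exponents are illustrative choices that satisfy
    # the usual diminishing step-size conditions, not the paper's exact values.
    rho = 1.0 / (t + 1) ** 0.6         # weight for the momentum gradient estimate
    eps = 1.0 / (t + 1) ** 0.9         # step size for the convex combination
    v = (1.0 - rho) * v + rho * grad   # running (momentum) estimate of the gradient
    # Convex subproblem around x: with a diagonal quadratic term, l1 penalty,
    # and box constraints, its solution is soft-threshold-then-clip.
    x_hat = prox_l1_box(x - v / tau, lam / tau, lo, hi)
    x = x + eps * (x_hat - x)          # move toward the subproblem solution
    return x, v

# Toy usage: min 0.5*||Ax - b||^2 + lam*||x||_1 subject to x in [-1, 1]^n
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 10)), rng.standard_normal(50)
x, v = np.zeros(10), np.zeros(10)
for t in range(2000):
    idx = rng.integers(0, 50, size=8)           # stochastic mini-batch
    grad = A[idx].T @ (A[idx] @ x - b[idx])     # mini-batch gradient
    x, v = proxsgd_step(x, v, grad, t, lam=0.1)
```

Solving the subproblem exactly (rather than taking a plain subgradient step) is what lets iterates land exactly on sparse or binary-feasible points; for other regularizers or constraint sets the closed form above would be replaced by the corresponding proximal map.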
Author(s)
Yang, Yang  
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM  
Yuan, Yaxiong
University of Luxembourg
Chatzimichailidis, Avraam  
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM  
Sloun, Ruud van
Eindhoven University of Technology
Lei, Lei
University of Luxembourg
Chatzinotas, Symeon
University of Luxembourg
Mainwork
International Conference on Learning Representations, ICLR 2020. Online resource  
Conference
International Conference on Learning Representations (ICLR) 2020  
Language
English
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM  