Year
2015
Document Type
Paper (Preprint, Research Paper, Review Paper, White Paper, etc.)
Title
Asynchronous parallel stochastic gradient descent
Title Supplement
A numeric core for scalable distributed machine learning algorithms. Published on arXiv
Abstract
The implementation of the vast majority of machine learning (ML) algorithms boils down to solving a numerical optimization problem. In this context, Stochastic Gradient Descent (SGD) methods have long proven to provide good results, both in terms of convergence and accuracy. Recently, several parallelization approaches have been proposed to scale SGD to very large ML problems. At their core, most of these approaches follow a map-reduce scheme. This paper presents a novel parallel updating algorithm for SGD that utilizes the asynchronous single-sided communication paradigm. Compared to existing methods, Asynchronous Parallel Stochastic Gradient Descent (ASGD) provides faster (or at least equal) convergence, close to linear scaling, and stable accuracy.
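The abstract describes the idea only at a high level. The following is a minimal, hypothetical Python sketch of asynchronous (lock-free) parallel SGD on shared memory; it is not the paper's ASGD algorithm, which targets a distributed setting with single-sided communication. The synthetic regression data, the hyperparameters, and all names are assumptions made purely for illustration.

import threading
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data (assumed for illustration): y = X @ w_true + noise
n_samples, n_features = 10_000, 20
X = rng.standard_normal((n_samples, n_features))
w_true = rng.standard_normal(n_features)
y = X @ w_true + 0.01 * rng.standard_normal(n_samples)

w = np.zeros(n_features)   # shared parameter vector, updated without locks
lr = 0.01                  # step size (assumed value)
n_workers, epochs = 4, 5

def worker(shard):
    # Each worker runs plain SGD on its own shard of sample indices,
    # writing into the shared vector w while the other workers do the same.
    global w
    for _ in range(epochs):
        for i in shard:
            grad = (X[i] @ w - y[i]) * X[i]   # gradient of 0.5 * (x.w - y)^2
            w -= lr * grad                    # asynchronous, unsynchronized update

shards = np.array_split(rng.permutation(n_samples), n_workers)
threads = [threading.Thread(target=worker, args=(s,)) for s in shards]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("distance to w_true:", np.linalg.norm(w - w_true))

In the paper's distributed-memory setting, the shared state would be exchanged between nodes via one-sided communication rather than shared-memory threads as sketched here.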
Author(s)
Keuper, Janis  
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM  
Pfreundt, Franz-Josef  
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM  
Language
English
Institute
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM