Year
2015
Type
Paper (Preprint, Research Paper, Review Paper, White Paper, etc.)
Title

Balancing the communication load of asynchronously parallelized machine learning algorithms

Title Supplement
Published on arXiv
Abstract
Stochastic Gradient Descent (SGD) is the standard numerical method used to solve the core optimization problem for the vast majority of machine learning (ML) algorithms. In the context of large-scale learning, as required by many Big Data applications, efficient parallelization of SGD is the focus of active research. Recently, we were able to show that the asynchronous communication paradigm can be applied to achieve a fast and scalable parallelization of SGD. Asynchronous Stochastic Gradient Descent (ASGD) outperforms other, mostly MapReduce-based, parallel algorithms for solving large-scale machine learning problems. In this paper, we investigate the impact of asynchronous communication frequency and message size on the performance of ASGD applied to large-scale ML in HTC cluster and cloud environments. We introduce a novel algorithm for the automatic balancing of the asynchronous communication load, which allows ASGD to adapt to changing network bandwidths and latencies.
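
The trade-off the abstract describes, between how often workers exchange parameter updates and how fast the optimization converges, can be illustrated with a small self-contained simulation. The sketch below is not the authors' ASGD implementation: it serializes the workers and merges their models with a plain average, whereas the paper's method exchanges updates asynchronously. Every name in it (simulate_asgd, comm_interval, the toy regression problem) is a hypothetical stand-in chosen for illustration only.

import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression problem: find w minimizing ||X w - y||^2.
n_samples, n_features = 1000, 10
X = rng.normal(size=(n_samples, n_features))
w_true = rng.normal(size=n_features)
y = X @ w_true + 0.01 * rng.normal(size=n_samples)

def sgd_step(w, idx, lr=0.01):
    """One stochastic gradient step on a single random sample."""
    grad = 2.0 * (X[idx] @ w - y[idx]) * X[idx]
    return w - lr * grad

def simulate_asgd(n_workers=4, comm_interval=10, n_rounds=50):
    """Each worker runs comm_interval local SGD steps, then all models
    are merged. A larger interval means fewer, larger messages; the
    merge here is a simple synchronous average for clarity."""
    workers = [np.zeros(n_features) for _ in range(n_workers)]
    for _ in range(n_rounds):
        for k in range(n_workers):
            for _ in range(comm_interval):      # local computation phase
                idx = rng.integers(n_samples)
                workers[k] = sgd_step(workers[k], idx)
        merged = np.mean(workers, axis=0)       # communication phase
        workers = [merged.copy() for _ in range(n_workers)]
    return np.mean(workers, axis=0)

for interval in (1, 10, 100):
    w = simulate_asgd(comm_interval=interval)
    print(f"comm_interval={interval:3d}  ||w - w*|| = {np.linalg.norm(w - w_true):.4f}")

Raising comm_interval cuts the number of exchanges at the cost of letting the workers' models drift apart between merges; balancing that cost against the available network bandwidth and latency is precisely what the paper's automatic load-balancing algorithm addresses.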
Author(s)
Keuper, Janis  
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM  
Pfreundt, Franz-Josef  
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM  
File(s)
Download (831.95 KB)
Rights
Use according to copyright law
DOI
10.24406/h-413961
Language
English
Institute
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM