Title: A distributed machine learning framework
Conference Paper, 2009

Abstract
A distributed online learning framework for support vector machines (SVMs) is presented and analyzed. First, the generic binary classification problem is decomposed into multiple relaxed subproblems. Each subproblem is then solved iteratively through parallel update algorithms with minimal communication overhead. The computation can be performed by individual processing units, such as separate computers or processor cores, working in parallel and possibly with access to only a subset of the data. Convergence properties of continuous- and discrete-time variants of the proposed parallel update schemes are studied. A sufficient condition is derived under which synchronous and asynchronous gradient algorithms converge to the approximate solution. Subsequently, a class of stochastic update algorithms, which may arise due to distortions in the information flow between units, is shown to be globally stable under similar sufficient conditions. Active set methods are utilized to decrease communication and computational overhead. A numerical example comparing centralized and distributed learning schemes indicates favorable properties of the proposed framework, such as configurability and fast convergence. ©2009 IEEE.
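The abstract does not spell out the paper's relaxation or update rules, so the following is only a loose illustrative sketch of a synchronous parallel SVM update of the general kind described: each processing unit takes a subgradient step of the regularized hinge loss on its own data shard (only margin-violating points contribute, echoing the active-set idea), and the units then average their iterates as the communication step. The function names (local_subgradient, distributed_svm), the averaging scheme, and all parameters are assumptions for illustration, not the paper's actual method.

    import numpy as np

    def local_subgradient(w, X, y, lam):
        """Subgradient of the regularized hinge loss on one unit's data shard."""
        margins = y * (X @ w)
        active = margins < 1.0                 # margin-violating points ("active set")
        g = lam * w
        if active.any():
            g -= (y[active, None] * X[active]).mean(axis=0)
        return g

    def distributed_svm(shards, dim, lam=0.01, step=0.1, rounds=200):
        """Illustrative synchronous parallel scheme (not the paper's algorithm):
        each unit descends on its shard, then units average their iterates,
        which plays the role of the communication step."""
        ws = [np.zeros(dim) for _ in shards]
        for t in range(rounds):
            # Local subgradient step on each unit; these could run in parallel.
            ws = [w - step / (1 + t) * local_subgradient(w, X, y, lam)
                  for w, (X, y) in zip(ws, shards)]
            # Synchronization: replace each local iterate with the average.
            w_avg = np.mean(ws, axis=0)
            ws = [w_avg.copy() for _ in shards]
        return np.mean(ws, axis=0)

    # Example usage on synthetic linearly separable data split across four units.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 5))
    y = np.sign(X @ rng.normal(size=5))
    shards = [(X[i::4], y[i::4]) for i in range(4)]
    w = distributed_svm(shards, dim=5)
    print("training accuracy:", np.mean(np.sign(X @ w) == y))

Averaging after every round keeps the sketch simple; the abstract's emphasis on minimal communication overhead and on asynchronous and stochastic variants suggests the paper's actual schemes communicate less frequently and tolerate delayed or distorted information between units.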