
Lifted online training of relational models with stochastic gradient methods

Ahmadi, B.; Kersting, K.; Natarajan, S.


Flach, P.A.; Bie, T.; Cristianini, N.:
Machine learning and knowledge discovery in databases. European conference, ECML PKDD 2012, Pt. 1: Bristol, UK, September 24-28, 2012; proceedings
Berlin: Springer, 2012 (Lecture Notes in Computer Science 7523)
ISBN: 978-3-642-33459-7
ISBN: 3-642-33459-8
ISBN: 978-3-642-33460-3
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD) <2012, Bristol>
Fraunhofer IAIS

Lifted inference approaches have rendered large, previously intractable probabilistic inference problems quickly solvable by employing symmetries to handle whole sets of indistinguishable random variables. Still, in many, if not most, situations training relational models will not benefit from lifting: symmetries within models easily break, since variables become correlated by virtue of depending asymmetrically on evidence. An appealing idea for such situations is to train and recombine local models. This breaks long-range dependencies and makes it possible to exploit lifting within and across the local training tasks. Moreover, it naturally paves the way for online training of relational models. Specifically, we develop the first lifted stochastic gradient optimization method with gain vector adaptation, which processes each lifted piece one after the other. On several datasets, the resulting optimizer converges to a solution of the same quality over an order of magnitude faster, simply because, unlike batch training, it starts optimizing long before having seen the entire mega-example even once.
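The core optimization idea from the abstract (stochastic gradient steps over local "pieces" combined with a per-parameter gain vector) can be illustrated in isolation. The following is a minimal, hypothetical sketch, not the paper's actual algorithm: it omits lifting entirely and uses a simple delta-bar-delta-style gain rule (gains grow when successive gradients agree in sign, shrink when they disagree). The function names, hyperparameters, and the least-squares example are all assumptions made for illustration.

```python
import numpy as np

def piecewise_sgd(pieces, grad_fn, dim, epochs=20, eta0=0.1,
                  gain_up=1.05, gain_down=0.5, seed=0):
    """Online training over local "pieces" with gain-vector adaptation.

    Hypothetical sketch: each piece contributes the gradient of its
    local objective; per-parameter gains are multiplied by gain_up when
    successive gradients agree in sign and by gain_down when they
    disagree. Lifting (grouping indistinguishable variables) is NOT
    modeled here.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(dim)            # parameter vector
    gains = np.full(dim, eta0)   # one adaptive step size per parameter
    prev_grad = np.zeros(dim)
    for _ in range(epochs):
        # Visit the pieces in random order, one gradient step per piece.
        for idx in rng.permutation(len(pieces)):
            g = grad_fn(w, pieces[idx])
            agree = g * prev_grad > 0
            gains[agree] *= gain_up
            gains[~agree] *= gain_down
            w -= gains * g
            prev_grad = g
    return w

# Toy usage: least squares on y = 2x, split into one piece per example.
pieces = [(1.0, 2.0), (2.0, 4.0)]
grad = lambda w, p: 2 * p[0] * (p[0] * w - p[1])
w = piecewise_sgd(pieces, grad, dim=1)
```

Because each step touches only one piece, the optimizer starts making progress after the first piece rather than after a full pass over the data, which is the effect the abstract attributes to online training.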