2017
Conference Paper
Title
A neural network implementation of Frank-Wolfe optimization
Abstract
We revisit the Frank-Wolfe algorithm for constrained convex optimization and show that it can be implemented as a simple recurrent neural network with softmin activation functions. As an example of a practical application of this result, we discuss how to train such a network to act as an associative memory.
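To make the connection in the abstract concrete, the following is a minimal sketch (not the paper's implementation) of Frank-Wolfe over the probability simplex, where the linear minimization oracle selects a vertex and is relaxed to a softmin so that each iteration becomes a differentiable, recurrent-style update. The objective, inverse temperature `beta`, and step-size schedule are illustrative assumptions.

```python
import numpy as np

def softmin(z, beta=50.0):
    # Smooth approximation of vertex selection: as beta -> infinity,
    # this tends to a one-hot indicator of argmin_i z_i.
    w = np.exp(-beta * (z - z.min()))
    return w / w.sum()

def frank_wolfe(grad, x0, steps=200, beta=50.0):
    # Frank-Wolfe over the probability simplex: the exact oracle
    # min_{s in simplex} <grad(x), s> returns a basis vector; here it is
    # relaxed to a softmin, giving a simple recurrent update rule.
    x = x0.copy()
    for t in range(steps):
        s = softmin(grad(x), beta)        # (soft) linear minimization oracle
        gamma = 2.0 / (t + 2.0)           # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s   # convex combination stays feasible
    return x

# Illustrative example: minimize ||x - y||^2 over the simplex,
# with y itself a point in the simplex.
y = np.array([0.1, 0.2, 0.7])
x = frank_wolfe(lambda x: 2.0 * (x - y), np.ones(3) / 3.0)
```

Because every iterate is a convex combination of simplex points, feasibility is maintained automatically; the softmin merely smooths the otherwise discrete vertex choice, which is what makes a neural-network realization with softmin activations plausible.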