2022
Conference Paper
Title
Neural Network Verification with DSE
Abstract
Neural networks with Linear and ReLU nodes can be represented as sequential linear programs that are simple in structure but have many program paths: different combinations of ReLU activations correspond to paths in the corresponding program. Naive applications of conventional program analysis techniques for proving properties of such networks are hampered by the exponential number of activation patterns (i.e., program paths). In this paper, we explore a technique for scaling verification by decomposing the verification task into first finding feasible paths and then proving properties for individual paths, resulting in multiple small verification tasks (compared to a monolithic analysis of the network). Moreover, this decomposition enables horizontal scaling, i.e., parallel execution, further decreasing analysis time. Finally, it allows us to reuse a once-computed set of feasible paths for the verification of multiple properties, compounding performance gains when checking multiple properties on the same network.
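The following is a minimal illustrative sketch of the idea described above, not the paper's implementation: for a toy single-hidden-layer ReLU network it enumerates activation patterns (program paths), uses an LP solver to keep only the feasible ones, and then bounds the output separately on each feasible path. All network weights, the input box, and the property bound are invented for the example.

```python
# Illustrative sketch only (assumed toy network, not the paper's tool):
# phase 1 finds feasible ReLU activation patterns via LP feasibility checks,
# phase 2 proves an output bound on each feasible path independently.
import itertools
import numpy as np
from scipy.optimize import linprog

# Toy network: y = W2 @ relu(W1 @ x + b1) + b2, input x constrained to a box.
W1 = np.array([[1.0, -1.0], [0.5, 2.0]]); b1 = np.array([0.0, -1.0])
W2 = np.array([[1.0, 1.0]]);              b2 = np.array([0.5])
bounds = [(-1.0, 1.0), (-1.0, 1.0)]       # box for each input dimension

def feasible(pattern):
    """LP feasibility: does some input in the box realize this ReLU pattern?"""
    # Active unit i requires (W1 x + b1)_i >= 0, inactive requires <= 0;
    # both are encoded as A_ub @ x <= b_ub by flipping the sign of the row.
    signs = np.where(np.array(pattern, dtype=bool), -1.0, 1.0)
    res = linprog(np.zeros(W1.shape[1]), A_ub=signs[:, None] * W1,
                  b_ub=-signs * b1, bounds=bounds, method="highs")
    return res.status == 0  # status 0: an optimum (i.e., a feasible point) was found

def max_output_on_path(pattern):
    """Upper bound of the scalar output over all inputs with this pattern."""
    # With the pattern fixed, relu(W1 x + b1) = D (W1 x + b1) for diagonal D,
    # so the output is affine in x and can be maximized by a single LP.
    D = np.diag(np.array(pattern, dtype=float))
    c = -(W2 @ D @ W1).ravel()            # negate because linprog minimizes
    signs = np.where(np.array(pattern, dtype=bool), -1.0, 1.0)
    res = linprog(c, A_ub=signs[:, None] * W1, b_ub=-signs * b1,
                  bounds=bounds, method="highs")
    return -res.fun + (W2 @ D @ b1 + b2)[0]

# Phase 1: compute the feasible paths once (reusable across properties).
paths = [p for p in itertools.product([0, 1], repeat=2) if feasible(p)]
# Phase 2: check a property (here y <= 5 on the box) path by path.
print(all(max_output_on_path(p) <= 5.0 for p in paths))
```

Because each path check is independent, the per-path LPs in phase 2 could be dispatched to separate workers, and the `paths` list computed in phase 1 can be reused unchanged when checking a different output bound on the same network.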
Author(s)