2026
Conference Paper
Title
The Local Convexification Method and its Application to Learning Weakly Convex Boolean Functions
Abstract
We study the problem of finding consistent hypotheses over finite metric spaces, focusing on hypothesis classes formed by weakly convex subsets of the domain. These hypotheses are closed under geodesics of length below a given threshold and exhibit a natural partitioning. Their generalization performance is strongly correlated with the number of blocks in the partition: fewer blocks yield greater generalization power. We prove that finding consistent weakly convex hypotheses with a minimum number of blocks is NP-hard. To address this negative result, we propose a novel greedy heuristic for computing compact solutions across a broad class of metric spaces and analyze its formal properties. Unlike standard approaches that calculate a single global distance threshold, our heuristic dynamically adjusts multiple local thresholds to seek compact hypotheses. To evaluate our method, we consider the specific case where the underlying metric space is the Hamming space, corresponding to learning weakly convex Boolean functions. Our empirical results demonstrate that our general-purpose algorithm outperforms the method specifically designed for learning this kind of Boolean function in both model compactness and predictive performance. In fact, our approach generates hypotheses that are near-optimal with respect to the number of blocks in most cases.
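To make the central notion concrete, the sketch below illustrates weak convexity in the Hamming space: a set of bit vectors is weakly convex (for threshold theta) if, for every pair of its points at Hamming distance at most theta, every point on a geodesic between them also belongs to the set. This is a minimal illustration of the definition only; the function names and the brute-force pair-and-interval check are our own and do not reproduce the paper's heuristic.

```python
from itertools import combinations

def hamming(x, y):
    """Hamming distance between two equal-length bit tuples."""
    return sum(a != b for a, b in zip(x, y))

def interval(x, y):
    """All points lying on some geodesic between x and y in the Hamming
    space: every string that agrees with x and y on their common bits."""
    diff = [i for i, (a, b) in enumerate(zip(x, y)) if a != b]
    points = []
    for k in range(len(diff) + 1):
        for flipped in combinations(diff, k):
            z = list(x)
            for i in flipped:
                z[i] = y[i]
            points.append(tuple(z))
    return points

def is_weakly_convex(S, theta):
    """Check closure under geodesics of length at most theta."""
    S = set(S)
    for x, y in combinations(S, 2):
        if hamming(x, y) <= theta:
            if any(z not in S for z in interval(x, y)):
                return False
    return True

# Example: S is closed under geodesics of length 1, but the pair
# (0,0,0) and (0,1,1) at distance 2 has (0,1,0) on a geodesic
# between them, which is missing from S.
S = [(0, 0, 0), (0, 0, 1), (0, 1, 1)]
print(is_weakly_convex(S, 1))  # True
print(is_weakly_convex(S, 2))  # False
```

For small thresholds more sets qualify as weakly convex (with theta = 1 the condition is vacuous on non-adjacent points), which is why the number of blocks in the induced partition, rather than the threshold alone, governs generalization in the abstract's framing.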
Author(s)