2026
Conference Paper
Title
SABSE: Segmentation-Assisted Baseline Shapley Values
Abstract
We introduce Segmentation-Assisted Baseline Shapley Values (SABSE), a model-agnostic framework for explaining black-box object detection models. Unlike conventional methods that rely on generic superpixels, our approach leverages semantically meaningful segments generated by the Segment Anything Model (SAM) to more accurately capture critical image regions. By integrating SHAP-based attribution with advanced segmentation, SABSE produces explanations that better align with human perception and improve the interpretability of object detection models' decisions. Additionally, we explore several segment replacement strategies to assess the impact of feature removal on detection performance, quantified by an insertion score adapted to object detection. Experimental evaluations across multiple object categories demonstrate that SABSE not only improves explanation fidelity but also provides deeper insight into the decision-making process of detection models. These findings open promising avenues for enhancing transparency and trust in real-world AI applications.
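To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of how baseline Shapley values might be estimated over precomputed SAM segments for a black-box detector. The names `detector_score` (e.g., the detector's confidence for a target object) and `segment_masks` (boolean masks produced by SAM) are assumptions, and the segment replacement strategy shown here is a single fixed baseline image such as a blurred or mean-color version of the input; the paper explores several such strategies.

```python
# Hypothetical sketch: permutation-based Monte Carlo estimate of baseline
# Shapley values over SAM segments for a black-box detector score.
import numpy as np

def composite(image, baseline, masks, present):
    """Keep the segments listed in `present` from `image`; fill all other
    pixels from the `baseline` image (the segment replacement strategy)."""
    out = baseline.copy()
    for i in present:
        out[masks[i]] = image[masks[i]]
    return out

def shapley_over_segments(image, baseline, masks, detector_score,
                          n_perm=50, seed=0):
    """Estimate each segment's Shapley contribution to `detector_score`,
    measured relative to the baseline image, by averaging marginal
    score gains over random segment orderings."""
    rng = np.random.default_rng(seed)
    k = len(masks)
    phi = np.zeros(k)
    for _ in range(n_perm):
        order = rng.permutation(k)
        present = []
        prev = detector_score(composite(image, baseline, masks, present))
        for idx in order:
            present.append(idx)
            cur = detector_score(composite(image, baseline, masks, present))
            phi[idx] += cur - prev   # marginal contribution of this segment
            prev = cur
    return phi / n_perm
```

An insertion-style evaluation in this setting would then re-add segments in decreasing order of their estimated values and track how quickly the detector's score recovers, which is the adapted insertion score referenced in the abstract.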
Author(s)