2022
Conference Paper
Title
History Dependent Significance Coding for Incremental Neural Network Compression
Abstract
This paper presents an improved probability estimation scheme for the entropy coder of Incremental Neural Network Coding (INNC), which is currently under standardization in ISO/IEC MPEG. More specifically, the paper first analyzes the compression performance of INNC and how the bitstream size relates to the neural network (NN) layers. For the layers requiring the most bits, it analyzes the coded NN weight updates and their temporal dependencies. A major finding is that the probability of a significant (i.e., non-zero) update for a weight can depend considerably on whether the weight has been updated before. Based on this finding, the paper proposes a new probability estimation scheme: Depending on whether a significant update has been received before (i.e., based on the weight's history), the entropy coder models the probability for a current significant update differently. This scheme achieves a bitstream size reduction of about 2% and 1% in a transfer and a federated learning scenario, respectively, without any accuracy loss or significant complexity increase. Therefore, MPEG adopted our history dependent significance probability (HDSP) scheme into its emerging standard for INNC.