On the importance of temporal dependencies of weight updates in communication-efficient federated learning

Abstract

This paper studies the effect of exploiting the temporal dependencies between successive weight updates to compress communications in Federated Learning (FL). To this end, we propose residual coding for FL, which exploits temporal dependencies by communicating compressed residuals of the weight updates whenever doing so saves bandwidth. We further consider Temporal Context Adaptation (TCA), which compares co-located elements of consecutive weight updates to select the optimal setting for compressing the bitstream in the DeepCABAC encoder. Following the experimental settings of the MPEG standard on Neural Network Compression (NNC), we demonstrate that both temporal-dependency-based technologies reduce communication overhead, with the maximum reduction obtained when both are used simultaneously.
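As a rough illustration of the residual-coding decision described above, the following sketch sends the residual between consecutive weight updates only when it compresses smaller than the update itself. This is not the paper's implementation: the uniform quantizer, the function names, and zlib standing in for the DeepCABAC entropy coder are all assumptions made for the sake of a runnable example.

import zlib
import numpy as np

def quantize(x: np.ndarray, step: float = 0.01) -> np.ndarray:
    """Uniformly quantize values to integer levels (illustrative stand-in
    for the quantization applied before entropy coding)."""
    return np.round(x / step).astype(np.int32)

def compressed_size(x: np.ndarray) -> int:
    """Byte size after entropy coding; zlib stands in for DeepCABAC."""
    return len(zlib.compress(quantize(x).tobytes()))

def encode_update(curr_update: np.ndarray, prev_update: np.ndarray):
    """Residual coding for FL: transmit the residual between consecutive
    weight updates whenever it is cheaper than coding the update itself."""
    residual = curr_update - prev_update
    if compressed_size(residual) < compressed_size(curr_update):
        return "residual", residual   # temporal prediction pays off
    return "update", curr_update      # fall back to coding the update directly

# Toy example: two correlated rounds of weight updates.
rng = np.random.default_rng(0)
prev = rng.normal(size=10_000).astype(np.float32)
curr = prev + 0.05 * rng.normal(size=10_000).astype(np.float32)
mode, payload = encode_update(curr, prev)
print(mode, compressed_size(payload), "vs", compressed_size(curr))

TCA would analogously condition the choice of the entropy coder's context model on the co-located element of the previous weight update; that logic lives inside the DeepCABAC encoder and is not sketched here.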

Publication
International Conference on Visual Communications and Image Processing (VCIP)