Federated Learning (FL) has emerged as a privacy-preserving machine learning paradigm that enables collaborative training across multiple clients without sharing local data. Despite advances in edge-device capabilities, communication bottlenecks make it difficult to aggregate updates from a large number of clients: only a fraction of the clients can deliver their parameter updates at each global aggregation round. This gives rise to the critical challenge of stragglers in FL, where the client scheduling policy profoundly affects the convergence and stability of the global model. Existing scheduling strategies address staleness but predominantly consider either timeliness or content, not both. Motivated by this, we introduce the concept of Version Age of Information (VAoI) to FL. Unlike traditional Age of Information metrics, VAoI captures both timeliness and content staleness: each client's version age is updated discretely, indicating the freshness of its information. We incorporate VAoI into the client scheduling policy to minimize the average VAoI, mitigating the impact of outdated local updates and enhancing the stability of the FL system.
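The abstract does not specify the scheduling algorithm, but the idea of tracking a discretely updated version age per client and scheduling to reduce the average VAoI can be sketched as follows. This is a minimal illustrative simulation, not the paper's method: the client count, the greedy "schedule the stalest clients" rule, the 0.9 delivery probability (modeling stragglers), and all function names are assumptions introduced here for illustration.

```python
import random

def select_clients(version_age, k):
    # Greedy heuristic (assumed, not from the paper): schedule the k
    # clients whose last delivered update lags the most model versions.
    return sorted(version_age, key=version_age.get, reverse=True)[:k]

def simulate(num_clients=10, k=3, rounds=5, seed=0):
    rng = random.Random(seed)
    # version_age[i]: number of global model versions by which client i's
    # last delivered update is stale; updated discretely once per round.
    version_age = {i: 0 for i in range(num_clients)}
    for _ in range(rounds):
        scheduled = select_clients(version_age, k)
        for i in range(num_clients):
            if i in scheduled and rng.random() < 0.9:
                # Update delivered before the new global version: age resets.
                version_age[i] = 0
            else:
                # A new global version is published without this client's
                # contribution, so its version age increments by one.
                version_age[i] += 1
    avg_vaoi = sum(version_age.values()) / num_clients
    return version_age, avg_vaoi
```

Under this sketch, scheduling the stalest clients bounds how far any single client's content can fall behind, which is the intuition behind minimizing the average VAoI.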
Funding Agencies: National Natural Science Foundation of China [62201504]; Zhejiang Provincial Natural Science Foundation of China [LGJ22F010001]; Zhejiang-Singapore Innovation and AI Joint Research Lab; Swedish Research Council (VR); ELLIIT; European Union [101096526]; European Union's Horizon Europe research and innovation programme under the Marie Sklodowska-Curie Grant [101131481]; Horizon Europe/JU SNS project ROBUST-6G [101139068]