Title: Energy-Efficient Federated Learning in Edge Networks Using Sparse Update Compression and ADMM-Convergent Scheduling Under Varying Load Conditions
Authors: Biju Balakrishnan, Amudha R, E. Saraswathi, K. Ramya, Sabitha K
Journal: Journal of Neonatal Surgery
Publisher: EL-MED-Pub Publishers
Country: Pakistan
Year: 2025
Volume: 14
Issue: 7
Language: en
Federated Learning (FL) at the network edge offers a promising route to privacy-preserving model training across distributed devices. However, the stringent energy budgets and highly variable computational loads of edge nodes pose significant challenges: frequent gradient exchanges incur heavy communication overhead, and naive client scheduling can stall convergence or exhaust device batteries. In this work, we introduce a joint sparse-update compression and ADMM-convergent scheduling framework that minimizes overall energy consumption while preserving learning accuracy under time-varying load conditions. First, each client applies a tunable sparsification and error-feedback scheme to its local model updates, reducing uplink traffic by up to 90% with negligible impact on convergence. Second, we cast client selection and aggregation timing as an Alternating Direction Method of Multipliers (ADMM) subproblem, deriving provably convergent update rules that adaptively prioritize low-energy or under-loaded nodes. Through simulations on CIFAR-10 and FEMNIST benchmarks with realistic edge-cloud latency and load traces, our approach achieves up to a 35% reduction in per-round energy cost and 20% faster convergence compared to state-of-the-art FL protocols. These results demonstrate that integrated compression and scheduling are key to energy-efficient, robust FL in resource-constrained edge networks.
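The sparsification-with-error-feedback step described above can be illustrated with a minimal sketch. The abstract does not specify the exact compressor, so this assumes a common top-k magnitude scheme in which dropped coordinates are accumulated into a residual and folded into the next round's update; the function name and the `sparsity` parameter are illustrative, not the authors' API.

```python
import numpy as np

def sparsify_with_error_feedback(update, residual, sparsity=0.9):
    """Illustrative top-k sparsifier with error feedback (assumed scheme).

    Keeps only the largest-magnitude entries of (update + residual);
    everything dropped is carried forward as the new residual, so no
    gradient mass is permanently lost across rounds.
    """
    corrected = update + residual                 # fold in previously dropped mass
    k = max(1, int(round(corrected.size * (1 - sparsity))))  # entries to transmit
    keep = np.argpartition(np.abs(corrected), -k)[-k:]       # indices of top-k values
    sparse = np.zeros_like(corrected)
    sparse[keep] = corrected[keep]                # sparse update sent uplink
    new_residual = corrected - sparse             # error-feedback memory kept locally
    return sparse, new_residual
```

With `sparsity=0.9`, only about 10% of the coordinates are transmitted each round, which is consistent with the up-to-90% uplink reduction claimed in the abstract; the residual term is what keeps this aggressive compression from degrading convergence.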
 
 
 
 