Title: Enhancing Model Robustness in Federated Learning: A Systematic Literature Review of Byzantine-Resilient Aggregation Methods
Authors: Muhammad Ahmad, Shaista Habib, Fatima Tariq
Journal: VFAST Transactions on Software Engineering
Publisher: VFAST-Research Platform
Country: Pakistan
Year: 2025
Volume: 13
Issue: 2
Language: en
Abstract: The demand for privacy-preserving machine learning has led to the rise of Federated Learning (FL), in which multiple clients collaboratively train a model without sharing raw data. Despite its privacy benefits, FL is vulnerable to Byzantine failures, where malicious or faulty participants inject corrupted updates that threaten model integrity. To address this, a range of Byzantine-resilient aggregation techniques has been proposed, including statistical filters (e.g., Trimmed Mean, Krum), trust-based weighting, cryptographic protocols, and hybrid strategies. This paper presents a systematic literature review (SLR) of these defenses, evaluating their robustness, scalability, and suitability for real-world applications. Challenges such as non-IID data, adaptive attacks, and trade-offs between security and efficiency are critically examined. In addition, we explore emerging trends such as domain-specific defenses, energy-aware FL, quantum-resilient methods, and federated zero-knowledge proofs. A novel classification of hybrid approaches and a standardized benchmarking framework are proposed to guide future research. This review aims to support the development of resilient, efficient, and scalable decentralized learning systems in adversarial environments.
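For readers unfamiliar with the statistical filters mentioned in the abstract, the following is a minimal sketch of coordinate-wise Trimmed Mean aggregation, one of the cited defenses. It assumes client updates arrive as equal-shaped NumPy arrays; the function name, the trim_ratio parameter, and the toy data are illustrative and do not reflect the paper's own implementation or evaluation.

```python
import numpy as np

def trimmed_mean_aggregate(client_updates, trim_ratio=0.1):
    """Coordinate-wise Trimmed Mean: for each parameter, discard the
    largest and smallest trim_ratio fraction of client values before
    averaging, limiting the pull of Byzantine (corrupted) updates."""
    updates = np.stack(client_updates)         # shape: (num_clients, num_params)
    k = int(trim_ratio * updates.shape[0])     # values trimmed per side
    sorted_updates = np.sort(updates, axis=0)  # sort each coordinate independently
    if k > 0:
        sorted_updates = sorted_updates[k:-k]
    return sorted_updates.mean(axis=0)

# Toy example: 8 honest clients near the true update, 2 Byzantine outliers.
rng = np.random.default_rng(0)
honest = [rng.normal(loc=1.0, scale=0.1, size=4) for _ in range(8)]
byzantine = [np.full(4, 100.0), np.full(4, -100.0)]
print(trimmed_mean_aggregate(honest + byzantine, trim_ratio=0.2))  # ~1.0 per coordinate
```

With trim_ratio=0.2 the two extreme values on each coordinate are discarded, so the aggregate stays close to the honest mean despite the outliers; a plain average would be dominated by them.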