Abstract
The integration of federated learning (FL) in healthcare enables collaborative intelligence across hospitals while ensuring patient data privacy. However, conventional FL frameworks often disregard energy efficiency, limiting their practicality for resource-constrained edge hospitals. This paper proposes Energy-Aware FedDeepRiskNet (EA-FedDeepRiskNet), an optimized federated learning framework that enhances energy sustainability and communication efficiency in multi-hospital systems. The proposed model extends FedDeepRiskNet with three adaptive modules: Energy-Aware Client Selection (EACS), Adaptive Local Epoch Control (ALEC), and a Gradient Compression Module (GCM). Together, these modules jointly minimize computation and communication energy without sacrificing diagnostic accuracy. A comprehensive energy consumption model guides the training process, dynamically adjusting client participation and the number of local epochs based on residual energy and communication costs. Experiments on the PhysioNet and MIMIC-III datasets demonstrate that EA-FedDeepRiskNet reduces energy consumption by up to 35% and communication overhead by 28%, while maintaining over 91% accuracy in patient risk prediction. These results confirm the feasibility of energy-optimized, privacy-preserving federated learning for sustainable healthcare AI. The framework represents a step toward green, adaptive, and trustworthy distributed medical intelligence, supporting scalable, real-time analytics across heterogeneous hospital networks.
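The abstract describes energy-aware client selection and adaptive local epoch control driven by residual energy and communication cost. The paper's exact scoring rule is not given here, so the following is a minimal illustrative sketch under assumed definitions: a hypothetical score that favors clients with high residual energy and low communication cost, and a hypothetical epoch rule that scales local training with remaining energy.

```python
def select_clients(clients, fraction=0.5, min_energy=0.2):
    """Energy-aware client selection (illustrative sketch, not the
    paper's actual EACS rule): drop clients below an assumed energy
    threshold, rank the rest by residual energy divided by
    (1 + communication cost), and keep the top fraction."""
    eligible = [c for c in clients if c["energy"] >= min_energy]
    ranked = sorted(
        eligible,
        key=lambda c: c["energy"] / (1.0 + c["comm_cost"]),
        reverse=True,
    )
    k = max(1, int(len(clients) * fraction))
    return ranked[:k]


def local_epochs(energy, base_epochs=5, floor=1):
    """Adaptive local epoch control (hypothetical rule): scale the
    number of local epochs by residual energy in [0, 1], never
    dropping below a floor of one epoch."""
    return max(floor, round(base_epochs * energy))
```

For example, a client with 90% residual energy and low communication cost would be selected ahead of a full-battery client on an expensive link, and a nearly depleted client would still run a single local epoch rather than being trained at full length.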
