The rapid expansion of IoT ecosystems introduces severe challenges in
scalability, security, and real-time decision-making. Traditional centralized
architectures struggle with latency, privacy concerns, and excessive resource
consumption, making them unsuitable for modern large-scale IoT deployments.
This paper presents a novel Federated Learning-driven Large Language Model
(FL-LLM) framework, designed to enhance IoT system intelligence while ensuring
data privacy and computational efficiency. The framework integrates Generative
IoT (GIoT) models with a Gradient Sensing Federated Strategy (GSFS),
dynamically optimizing model updates based on real-time network conditions. By
leveraging a hybrid edge-cloud processing architecture, our approach balances
intelligence, scalability, and security in distributed IoT environments.
Evaluations on the IoT-23 dataset demonstrate that our framework improves model
accuracy, reduces response latency, and enhances energy efficiency,
outperforming traditional FL techniques (e.g., FedAvg and FedOpt). These findings
highlight the potential of integrating LLM-powered federated learning into
large-scale IoT ecosystems, paving the way for more secure, scalable, and
adaptive IoT management solutions.
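The abstract does not detail how the Gradient Sensing Federated Strategy (GSFS) weights client updates against real-time network conditions. The following Python sketch is one plausible reading, not the paper's implementation: it assumes a server that scales each client's contribution by its gradient norm and discounts clients on slow links, contrasted with FedAvg's uniform averaging. The function names, the latency proxy, and the weighting rule are illustrative assumptions.

    # Hypothetical sketch of a gradient-sensing aggregation step (illustrative only).
    # Assumes GSFS-style weighting by gradient norm, penalized by observed latency,
    # compared against plain FedAvg uniform averaging.
    import numpy as np

    def fedavg(client_updates):
        """FedAvg baseline: unweighted mean of the client model deltas."""
        return np.mean(client_updates, axis=0)

    def gsfs_aggregate(client_updates, latencies_ms, eps=1e-8):
        """Gradient-sensing aggregation (assumed form, not the paper's algorithm).

        client_updates : list of 1-D arrays, one model delta per client
        latencies_ms   : per-client round-trip latency, a proxy for network state
        """
        updates = np.stack(client_updates)            # (n_clients, n_params)
        grad_norms = np.linalg.norm(updates, axis=1)  # "gradient sensing" signal
        # Favor informative (large-gradient) clients, discount slow links.
        weights = grad_norms / (np.asarray(latencies_ms) + eps)
        weights = weights / weights.sum()
        return weights @ updates                      # weighted aggregate update

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        deltas = [rng.normal(scale=s, size=10) for s in (0.1, 0.5, 1.0)]
        latencies = [20.0, 80.0, 300.0]               # ms, per client
        print("FedAvg update:    ", fedavg(deltas)[:3])
        print("GSFS-style update:", gsfs_aggregate(deltas, latencies)[:3])

Under these assumptions, a client with a large update but a congested link contributes less than under FedAvg, which is one way "dynamically optimizing model updates based on real-time network conditions" could be realized.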