Paper Title
FedSpace: An Efficient Federated Learning Framework at Satellites and Ground Stations
Paper Authors
Paper Abstract
Large-scale deployments of low Earth orbit (LEO) satellites collect massive amounts of Earth imagery and sensor data, which can empower machine learning (ML) to address global challenges such as real-time disaster navigation and mitigation. However, it is often infeasible to download all the high-resolution images and train these ML models on the ground because of limited downlink bandwidth, sparse connectivity, and regulatory constraints on imagery resolution. To address these challenges, we leverage Federated Learning (FL), where ground stations and satellites collaboratively train a global ML model without sharing the captured images on the satellites. We show fundamental challenges in applying existing FL algorithms among satellites and ground stations, and we formulate an optimization problem that captures a unique trade-off between staleness and idleness. We propose a novel FL framework, named FedSpace, which dynamically schedules model aggregation based on the deterministic and time-varying connectivity determined by satellite orbits. Extensive numerical evaluations based on real-world satellite images and satellite networks show that FedSpace reduces the training time by 1.7 days (38.6%) over state-of-the-art FL algorithms.
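The core FL mechanism the abstract builds on — a ground station combining locally trained satellite models into one global model without seeing the raw images — is conventionally a weighted average of client parameters (FedAvg-style). A minimal sketch for intuition only; the function and variable names here are illustrative and not from the paper, which additionally schedules *when* this aggregation happens based on orbital connectivity:

```python
def fedavg_aggregate(client_weights, client_sizes):
    """FedAvg-style aggregation: average client parameter vectors,
    weighted by each client's local training-set size.

    client_weights: one flat parameter vector (list of floats) per client
    client_sizes:   number of local training samples per client
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_w = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            global_w[i] += (n / total) * w[i]  # size-proportional weighting
    return global_w

# Example: two "satellites" whose local models are averaged 1:3
w_global = fedavg_aggregate([[1.0, 2.0], [3.0, 4.0]], [1, 3])
# -> [2.5, 3.5]
```

The staleness/idleness trade-off the abstract mentions arises because each satellite can only upload its update during brief ground-station contacts: aggregating rarely leaves fresh updates idle, while aggregating eagerly folds in stale models trained against an old global state.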