Paper Title
DNN-Driven Compressive Offloading for Edge-Assisted Semantic Video Segmentation
Paper Authors
Paper Abstract
Deep learning has shown impressive performance in semantic segmentation, but it remains unaffordable for resource-constrained mobile devices. While offloading the computation is promising, the resulting traffic overwhelms the limited bandwidth. Existing compression algorithms are ill-suited to semantic segmentation: lacking obvious, concentrated regions of interest (RoIs), they fall back on uniform compression strategies, which yield either low compression ratios or low accuracy. This paper introduces STAC, a DNN-driven compression scheme tailored for edge-assisted semantic video segmentation. STAC is the first to exploit DNN gradients as a spatial sensitivity metric for spatially adaptive compression, achieving superior compression ratios and accuracy. Adapting this content-customized compression to video, however, is challenging: spatial sensitivity varies across frames, and feeding back compression strategies and offloading frames consume substantial bandwidth. We tackle these issues with a spatiotemporal adaptive scheme that (1) moves part of the strategy generation offline to reduce communication load, (2) propagates compression strategies and segmentation results across frames via dense optical flow, and (3) adaptively offloads keyframes to accommodate the video content. We implement STAC on a commodity mobile device. Experiments show that STAC saves up to 20.95% of bandwidth without losing accuracy, compared with the state-of-the-art algorithm.
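The core idea of gradient-driven spatially adaptive compression can be illustrated with a minimal, self-contained sketch. The toy linear "model", the 4x4 block partition, and the sensitivity-to-quality mapping below are all illustrative assumptions (the paper uses real DNN gradients and its own strategy generation); here, per-block sensitivity is approximated with finite differences, and more sensitive blocks are assigned higher (lighter) compression quality:

```python
import random

# Toy stand-in for a segmentation DNN: a weighted sum of pixels.
# STAC derives sensitivity from actual DNN gradients; this sketch
# approximates d(output)/d(pixel) with finite differences instead.

W, H, BLOCK = 8, 8, 4  # tiny frame, partitioned into 4x4 blocks (assumed sizes)

random.seed(0)
frame = [[random.random() for _ in range(W)] for _ in range(H)]
weights = [[(x + y) / (W + H) for x in range(W)] for y in range(H)]

def model(img):
    # Hypothetical model output; real systems would run a segmentation DNN.
    return sum(weights[y][x] * img[y][x] for y in range(H) for x in range(W))

def block_sensitivity(img, bx, by, eps=1e-3):
    """Mean |d output / d pixel| over one block, via finite differences."""
    total = 0.0
    base = model(img)
    for y in range(by, by + BLOCK):
        for x in range(bx, bx + BLOCK):
            img[y][x] += eps
            total += abs(model(img) - base) / eps
            img[y][x] -= eps  # restore the pixel
    return total / (BLOCK * BLOCK)

# Spatially adaptive strategy: map each block's sensitivity to a
# JPEG-like quality level in [20, 95] (an assumed mapping) --
# sensitive blocks get light compression, insensitive blocks heavy.
sens = {(bx, by): block_sensitivity(frame, bx, by)
        for by in range(0, H, BLOCK) for bx in range(0, W, BLOCK)}
lo, hi = min(sens.values()), max(sens.values())
strategy = {key: (20 + int(75 * (s - lo) / (hi - lo)) if hi > lo else 50)
            for key, s in sens.items()}
```

With the toy weights growing toward the bottom-right, the bottom-right block receives the highest quality and the top-left block the lowest, i.e. compression effort follows the model's sensitivity rather than a uniform RoI heuristic.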