Paper Title
Stochastic Adaptive Line Search for Differentially Private Optimization
Paper Authors
Paper Abstract
The performance of private gradient-based optimization algorithms is highly dependent on the choice of step size (or learning rate), which often requires a non-trivial amount of tuning. In this paper, we introduce a stochastic variant of the classic backtracking line search algorithm that satisfies Rényi differential privacy. Specifically, the proposed algorithm adaptively chooses a step size satisfying the Armijo condition (with high probability) using noisy gradient and function estimates. Furthermore, to improve the probability with which the chosen step size satisfies the condition, it adjusts the per-iteration privacy budget at runtime according to the reliability of the noisy gradients. A naive implementation of the backtracking search algorithm may end up using an unacceptably large privacy budget, because the ability to adaptively select step sizes comes at the cost of extra function evaluations. The proposed algorithm avoids this problem by using the sparse vector technique combined with a recent privacy amplification lemma. We also introduce a privacy budget adaptation strategy in which the algorithm adaptively increases the budget when it detects that the directions of consecutive gradients differ drastically. Extensive experiments on both convex and non-convex problems show that the adaptively chosen step sizes allow the proposed algorithm to use its privacy budget efficiently and to perform competitively against existing private optimizers.
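The core idea of a backtracking line search with noisy oracle access can be sketched as follows. This is a minimal illustration, not the paper's algorithm: Gaussian noise merely simulates the private gradient and function estimates, and the parameter names (`sigma_f`, `sigma_g`, `c`, `beta`, `alpha0`) are assumptions for the sketch. A real private implementation would also have to account for the privacy budget consumed by each noisy function evaluation, which is the cost the paper addresses with the sparse vector technique.

```python
import numpy as np

def noisy_backtracking_step(f, grad_f, x, sigma_f=0.01, sigma_g=0.01,
                            alpha0=1.0, beta=0.5, c=1e-4, max_trials=20,
                            rng=None):
    """One step of backtracking line search using noisy estimates.

    Starts from step size alpha0 and shrinks it by beta until the
    (noisy) Armijo sufficient-decrease condition holds.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = grad_f(x) + rng.normal(0.0, sigma_g, size=x.shape)  # noisy gradient
    fx = f(x) + rng.normal(0.0, sigma_f)                    # noisy f(x)
    alpha = alpha0
    for _ in range(max_trials):
        x_new = x - alpha * g
        f_new = f(x_new) + rng.normal(0.0, sigma_f)         # noisy f(x_new)
        # Armijo condition with noisy estimates: sufficient decrease
        # relative to the squared gradient norm.
        if f_new <= fx - c * alpha * (g @ g):
            return x_new, alpha
        alpha *= beta  # shrink the step size and retry
    return x, 0.0  # no acceptable step found within max_trials

# Usage: minimize a simple quadratic f(x) = ||x||^2.
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x
rng = np.random.default_rng(0)
x = np.array([2.0, -1.5])
for _ in range(50):
    x, _ = noisy_backtracking_step(f, grad, x, rng=rng)
```

Because each rejected trial step costs an extra noisy function evaluation, a run with many backtracking trials is exactly the scenario where the naive privacy accounting described in the abstract becomes unacceptably expensive.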