Paper Title

Learning linear operators: Infinite-dimensional regression as a well-behaved non-compact inverse problem

Authors

Mattes Mollenhauer, Nicole Mücke, T. J. Sullivan

Abstract

We consider the problem of learning a linear operator $\theta$ between two Hilbert spaces from empirical observations, which we interpret as least squares regression in infinite dimensions. We show that this goal can be reformulated as an inverse problem for $\theta$ with the feature that its forward operator is generally non-compact (even if $\theta$ is assumed to be compact or of $p$-Schatten class). However, we prove that, in terms of spectral properties and regularisation theory, this inverse problem is equivalent to the known compact inverse problem associated with scalar response regression. Our framework allows for the elegant derivation of dimension-free rates for generic learning algorithms under Hölder-type source conditions. The proofs rely on the combination of techniques from kernel regression with recent results on concentration of measure for sub-exponential Hilbertian random variables. The obtained rates hold for a variety of practically relevant scenarios in functional regression as well as nonlinear regression with operator-valued kernels and match those of classical kernel regression with scalar response.
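To make the regression-as-inverse-problem viewpoint concrete, here is a minimal sketch in standard second-moment notation; the symbols $C$, $T$, $\hat{C}$, $\hat{T}$, and $\hat{\theta}_\lambda$ are illustrative conventions from the kernel-regression literature, not notation fixed by the abstract. Given i.i.d. pairs $(x_i, y_i) \in H_1 \times H_2$, the population objective and a Tikhonov-regularised empirical estimator read

$$\min_{\theta} \; \mathbb{E}\big[\lVert y - \theta x \rVert_{H_2}^2\big], \qquad \hat{\theta}_\lambda = \hat{T}\,(\hat{C} + \lambda\,\mathrm{Id})^{-1}, \qquad \lambda > 0,$$

where $\hat{C} = \frac{1}{n}\sum_{i=1}^{n} x_i \otimes x_i$ is the empirical covariance operator on $H_1$ and $\hat{T} = \frac{1}{n}\sum_{i=1}^{n} y_i \otimes x_i$ the empirical cross-covariance operator from $H_1$ to $H_2$. The associated normal equation $\theta C = T$ is one natural route to the inverse-problem formulation mentioned above: the forward map $\theta \mapsto \theta C$ acts by composition on a space of operators between Hilbert spaces, which is where the non-compactness discussed in the abstract can arise.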
