Paper Title


SNUG: Self-Supervised Neural Dynamic Garments

Authors

Igor Santesteban, Miguel A. Otaduy, Dan Casas

Abstract

We present a self-supervised method to learn dynamic 3D deformations of garments worn by parametric human bodies. State-of-the-art data-driven approaches to model 3D garment deformations are trained using supervised strategies that require large datasets, usually obtained by expensive physics-based simulation methods or professional multi-camera capture setups. In contrast, we propose a new training scheme that removes the need for ground-truth samples, enabling self-supervised training of dynamic 3D garment deformations. Our key contribution is to realize that physics-based deformation models, traditionally solved on a frame-by-frame basis by implicit integrators, can be recast as an optimization problem. We leverage this optimization-based scheme to formulate a set of physics-based loss terms that can be used to train neural networks without precomputing ground-truth data. This allows us to learn models for interactive garments, including dynamic deformations and fine wrinkles, with a two-orders-of-magnitude speedup in training time compared to state-of-the-art supervised methods.
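The abstract's central observation — that an implicit time step is the minimizer of an "incremental potential" (inertia term plus physical energies), so that potential can serve directly as a training loss — can be illustrated on a toy system. The sketch below is not the paper's cloth model: it uses a single 1-D mass on a spring under gravity, with step size `h`, mass `m`, stiffness `k`, and gravity `g` all chosen for illustration. Minimizing the potential over the next position reproduces exactly one backward-Euler step, which is why a network predicting positions can be trained on this quantity without ground-truth simulation data.

```python
import numpy as np

def incremental_potential(x_next, x, v, h=1/30, m=1.0, k=100.0, g=-9.81):
    """Objective whose minimizer over x_next is one implicit-Euler step.

    inertia: penalizes deviation from the inertial trajectory x + h*v
    elastic: spring potential 0.5*k*x^2 (rest position at 0)
    gravity: potential energy -m*g*x (g is the signed acceleration)
    """
    inertia = 0.5 * (m / h**2) * (x_next - x - h * v) ** 2
    elastic = 0.5 * k * x_next ** 2
    gravity = -m * g * x_next
    return inertia + elastic + gravity

def implicit_euler_step(x, v, h=1/30, m=1.0, k=100.0, g=-9.81):
    """Backward Euler for m*x'' = -k*x + m*g, solvable in closed form here."""
    x_next = (x + h * v + h**2 * g) / (1.0 + h**2 * k / m)
    v_next = (x_next - x) / h
    return x_next, v_next

# The implicit step is exactly the minimizer of the incremental potential:
# perturbing it in either direction can only increase the objective.
x0, v0 = 0.2, 0.0
x1, _ = implicit_euler_step(x0, v0)
E_star = incremental_potential(x1, x0, v0)
assert E_star < incremental_potential(x1 + 1e-3, x0, v0)
assert E_star < incremental_potential(x1 - 1e-3, x0, v0)
```

In a learning setting one would replace `x_next` with a network's predicted garment vertices and sum analogous inertia, elastic, gravity, and collision terms over the mesh; here the closed-form step only serves to verify the equivalence.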
