Paper Title
Sample Efficient Learning of Factored Embeddings of Tensor Fields
Paper Authors
Paper Abstract
Data tensors of order 2 and greater are now routinely being generated. These data collections are increasingly large and keep growing. Many scientific and medical data tensors are tensor fields (e.g., images, videos, geographic data), in which the spatial neighborhood carries important information. Directly accessing such large data tensor collections for information has become increasingly prohibitive. We learn approximate full-rank and compact tensor sketches with factored representations that provide compact space, time, and spectral embeddings of tensor fields. All information querying and post-processing on the original tensor field can now be performed more efficiently and with customizable accuracy, since they operate on these compact factored sketches in latent generative space. We produce optimal rank-r sketchy Tucker decompositions of arbitrary-order data tensors by building compact factor matrices from a sample-efficient sub-sampling of tensor slices. Our sample-efficient policy is learned via adaptive stochastic Thompson sampling using Dirichlet distributions with conjugate priors.
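The abstract outlines a two-part procedure: compact Tucker factor matrices built from a sub-sample of the tensor, with the sampling schedule driven by Thompson sampling over a Dirichlet posterior. Below is a minimal, illustrative Python/NumPy sketch of that kind of pipeline, not the authors' implementation: the function name `sketchy_tucker`, the fiber-based sub-sampling, and the residual-energy reward used to update the Dirichlet concentrations are assumptions made for this example.

```python
import numpy as np


def mode_fiber(T, mode, idx_other):
    """Extract a mode-`mode` fiber: keep that mode free, fix all other indices."""
    index = list(idx_other)
    index.insert(mode, slice(None))
    return T[tuple(index)]


def sketchy_tucker(T, ranks, budget=200, seed=0):
    """Rank-r Tucker sketch from sub-sampled fibers, with the next mode to
    sample chosen by Thompson sampling over a Dirichlet posterior (illustrative)."""
    rng = np.random.default_rng(seed)
    order = T.ndim
    alpha = np.ones(order)               # Dirichlet concentrations (conjugate prior over modes)
    fibers = [[] for _ in range(order)]  # sub-sampled fibers collected per mode

    def random_fiber(mode):
        idx_other = [rng.integers(T.shape[m]) for m in range(order) if m != mode]
        return mode_fiber(T, mode, idx_other)

    # Seed every mode with one fiber so each factor matrix is well defined.
    for mode in range(order):
        fibers[mode].append(random_fiber(mode))

    for _ in range(budget):
        # Thompson step: draw mode probabilities from the Dirichlet posterior,
        # then pick which mode's fiber set to grow next.
        probs = rng.dirichlet(alpha)
        mode = int(rng.choice(order, p=probs))

        f = random_fiber(mode)
        prev = np.stack(fibers[mode], axis=1)  # fibers collected so far for this mode
        fibers[mode].append(f)

        # Illustrative reward: fraction of the new fiber's energy not already
        # captured by the span of previously sampled fibers for this mode.
        Q, _ = np.linalg.qr(prev)
        residual = f - Q @ (Q.T @ f)
        alpha[mode] += np.linalg.norm(residual) / (np.linalg.norm(f) + 1e-12)

    # Compact factor matrices: top-r left singular vectors of each mode's fibers.
    factors = []
    for mode in range(order):
        F = np.stack(fibers[mode], axis=1)
        U, _, _ = np.linalg.svd(F, full_matrices=False)
        factors.append(U[:, : ranks[mode]])

    # Tucker core: contract each mode of T with its factor basis.
    core = T
    for U in factors:
        core = np.tensordot(core, U, axes=([0], [0]))  # consumed modes rotate to the back
    return core, factors
```

For example, on a random third-order array `T = np.random.default_rng(1).standard_normal((30, 40, 20))`, calling `sketchy_tucker(T, ranks=(5, 5, 5))` returns a 5×5×5 core and three orthonormal factor matrices; contracting them back together gives the compact approximation of T on which downstream queries would run, and the Dirichlet concentrations end up favoring the modes whose fibers contributed the most unexplained energy.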