Paper Title

Challenges and Thrills of Legal Arguments

Paper Authors

Anurag Pallaprolu, Radha Vaidya, Aditya Swaroop Attawar

Paper Abstract

State-of-the-art attention-based models, mostly centered around the transformer architecture, solve the problem of sequence-to-sequence translation using the so-called scaled dot-product attention. While this technique is highly effective for estimating inter-token attention, it does not answer the question of inter-sequence attention when we deal with conversation-like scenarios. We propose an extension, HumBERT, that attempts to perform continuous contextual argument generation using locally trained transformers.
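For reference, the scaled dot-product attention the abstract mentions is Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, as defined by Vaswani et al. (2017). Below is a minimal NumPy sketch of that inter-token mechanism; the function name, shapes, and toy tensors are illustrative and do not come from the paper's code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for one sequence pair."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (len_q, len_k) similarity matrix
    # Row-wise softmax, shifted by the row max for numerical stability
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # (len_q, d_v) attended values

# Toy usage: 3 query tokens attending over 4 key/value tokens
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

Note that this operates within a single token sequence; the inter-sequence attention the abstract raises for conversation-like settings is the gap HumBERT is proposed to address.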
