
Tripod: Learning Latent Representations for Sequences

EasyChair Preprint no. 3196

9 pages
Date: April 18, 2020


We propose a new model for learning and extracting latent representations from sequences, which generates a tripartite representation composed of a global-style partition, a memory-based partition, and a summary-based partition. We show the relevance of these representations on mainstream tasks such as text similarity and natural language inference. We argue that the generic nature of this approach makes it applicable to many other tasks that involve modelling discrete-valued (time-ordered) sequences and, with some modifications, even to image and speech processing. We encourage everyone to try our open-source code and our Python3 API.
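To make the tripartite idea concrete, here is a minimal illustrative sketch of an encoder that splits a token-embedded sequence into the three kinds of partitions the abstract names. The function name, the pooling and attention choices, and all dimensions are assumptions for illustration only, not the authors' actual architecture or API.

```python
import numpy as np

def tripod_encode(token_embeddings, style_dim=4, memory_slots=3,
                  summary_dim=4, seed=0):
    """Hypothetical sketch of a tripartite sequence representation:
    a global-style vector, a memory-based partition, and a
    summary-based partition. Not the authors' implementation."""
    rng = np.random.default_rng(seed)
    x = np.asarray(token_embeddings, dtype=float)  # (seq_len, emb_dim)

    # Global style: mean-pool over the sequence, then project.
    w_style = rng.standard_normal((x.shape[1], style_dim))
    style = x.mean(axis=0) @ w_style               # (style_dim,)

    # Memory-based partition: soft attention over fixed memory slots.
    w_mem = rng.standard_normal((x.shape[1], memory_slots))
    attn = np.exp(x @ w_mem)
    attn /= attn.sum(axis=0, keepdims=True)        # softmax over tokens
    memory = attn.T @ x                            # (memory_slots, emb_dim)

    # Summary-based partition: project the final token as a running summary.
    w_sum = rng.standard_normal((x.shape[1], summary_dim))
    summary = x[-1] @ w_sum                        # (summary_dim,)

    return style, memory, summary
```

Given a sequence of, say, 5 tokens with 8-dimensional embeddings, this returns a style vector of shape `(4,)`, a memory partition of shape `(3, 8)`, and a summary vector of shape `(4,)`; a real model would learn the projection matrices rather than sample them randomly.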

Keyphrases: embeddings, latent representations, machine learning, natural language processing, paragraphs, sentences, sequence

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:3196,
  author = {Tiberiu Boroş and Andrei Cotaie and Alexandru Meterez and Paul Ilioaica},
  title = {Tripod: Learning Latent Representations for Sequences},
  howpublished = {EasyChair Preprint no. 3196},
  year = {EasyChair, 2020}}