Comparing Dependency-based Compositional Models with Contextualized Word Embedding

Title: Comparing Dependency-based Compositional Models with Contextualized Word Embedding
Authors: Pablo Gamallo, Manuel Prada Corral, Marcos Garcia
Type: Conference poster
Source: 13th International Conference on Agents and Artificial Intelligence, SCITEPRESS – Science and Technology Publications, Lda, 2021.
Abstract: In this article, we compare two different strategies for contextualizing the meaning of words in a sentence: distributional models that use syntax-based methods following the Principle of Compositionality, and Transformer technology such as BERT-like models. As the former methods require controlled syntactic structures, the two approaches are compared on datasets with syntactically fixed sentences, namely subject-predicate and subject-predicate-object expressions. The results show that syntax-based compositional approaches working with syntactic dependencies are competitive with neural-based Transformer models, and could have greater potential when trained and developed using the same resources.
Keywords: Compositional Distributional Models, Contextualized Word Embeddings, Transformers, Compositionality, Dependency-based Parsing.
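To illustrate the compositional strategy described in the abstract, the following is a minimal sketch of dependency-based contextualization. All vectors and the weighted-addition composition function are toy assumptions for illustration; the paper's actual models are trained from corpora and use more sophisticated dependency-based composition.

```python
import numpy as np

# Toy static word vectors (hand-made assumption, not corpus-derived).
vectors = {
    "coach": np.array([0.9, 0.1, 0.2]),
    "train": np.array([0.2, 0.8, 0.3]),  # ambiguous word: vehicle vs action
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def compose(head, dependent, alpha=0.5):
    """Contextualize a word along a dependency arc by combining its vector
    with that of the related word (plain weighted addition, an assumption)."""
    return alpha * head + (1 - alpha) * dependent

# Subject-predicate expression "coach trains": the subject "coach"
# shifts the static vector of the predicate "train" in context.
train_in_context = compose(vectors["train"], vectors["coach"])

# The composed vector lies closer to the contextualizing subject than
# the out-of-context static vector did.
print(cosine(vectors["train"], vectors["coach"]))
print(cosine(train_in_context, vectors["coach"]))
```

A BERT-like model, by contrast, produces such context-sensitive vectors directly from the full sentence, without an explicit syntactic composition step; the paper compares these two routes on fixed subject-predicate(-object) datasets.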