
FLAP 2019

Compositionality for Recursive Neural Networks.

Journal Article, Number 4 (Logic in Computer Science)

Abstract

Modelling compositionality has been a longstanding area of research in the field of vector space semantics. The categorical approach to compositionality maps grammar onto vector spaces in a principled way, but comes under fire for requiring the formation of very high-dimensional matrices and tensors, and therefore being computationally infeasible. In this paper I show how a linear simplification of recursive neural tensor network models can be mapped directly onto the categorical approach, giving a way of computing the required matrices and tensors. This mapping suggests a number of lines of research for both categorical compositional vector space models of meaning and for recursive neural network models of compositionality.
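The abstract's "linear simplification" of a recursive neural tensor network can be illustrated concretely. The sketch below is an assumption based on the standard RNTN composition rule (a bilinear tensor term plus a linear term, passed through a nonlinearity); the paper's exact formulation may differ. Dropping the nonlinearity leaves a multilinear map, which is the kind of tensor the categorical approach requires.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # word-vector dimension (illustrative only)

# Parameters of a toy RNTN-style composition:
# an order-3 tensor V (one d x d slice per output dimension)
# and a matrix W acting on the concatenated child vectors.
V = rng.standard_normal((d, d, d))
W = rng.standard_normal((d, 2 * d))
a, b = rng.standard_normal(d), rng.standard_normal(d)

def rntn_compose(a, b):
    # Standard RNTN rule: p = tanh(a^T V[k] b + W [a; b]) per output dim k.
    bilinear = np.einsum('i,kij,j->k', a, V, b)
    return np.tanh(bilinear + W @ np.concatenate([a, b]))

def linear_compose(a, b):
    # Linear simplification: drop the nonlinearity, keeping only the
    # multilinear tensor contraction and the linear term. This is the
    # part that maps directly onto a categorical (multilinear) model.
    return np.einsum('i,kij,j->k', a, V, b) + W @ np.concatenate([a, b])

p_nonlinear = rntn_compose(a, b)
p_linear = linear_compose(a, b)
print(p_nonlinear.shape, p_linear.shape)
```

Because `linear_compose` is multilinear in its arguments, it can be represented by a single fixed tensor, which is exactly the object the categorical compositional model asks for; the nonlinearity in `rntn_compose` is what blocks that identification.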

Keywords

No keywords are indexed for this paper.

Context

Venue: IfCoLog Journal of Logics and their Applications
Archive span: 2014-2026
Indexed papers: 633
Paper id: 116731771137105631