Representational Strengths and Limitations of Transformers
- URL: http://arxiv.org/abs/2306.02896v2
- Date: Thu, 16 Nov 2023 14:48:16 GMT
- Title: Representational Strengths and Limitations of Transformers
- Authors: Clayton Sanford, Daniel Hsu, Matus Telgarsky
- Abstract summary: We establish both positive and negative results on the representation power of attention layers. We show the necessity and role of a large embedding dimension in a transformer, and we present natural variants of a hard detection task that attention layers can solve efficiently.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Attention layers, as commonly used in transformers, form the backbone of
modern deep learning, yet there is no mathematical description of their
benefits and deficiencies as compared with other architectures. In this work we
establish both positive and negative results on the representation power of
attention layers, with a focus on intrinsic complexity parameters such as
width, depth, and embedding dimension. On the positive side, we present a
sparse averaging task, where recurrent networks and feedforward networks both
have complexity scaling polynomially in the input size, whereas transformers
scale merely logarithmically in the input size; furthermore, we use the same
construction to show the necessity and role of a large embedding dimension in a
transformer. On the negative side, we present a triple detection task, where
attention layers, in turn, have complexity scaling linearly in the input size; as
this scenario seems rare in practice, we also present natural variants that can
be efficiently solved by attention layers. The proof techniques emphasize the
value of communication complexity in the analysis of transformers and related
models, and the role of sparse averaging as a prototypical attention task,
which even finds use in the analysis of triple detection.
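The abstract names the two tasks but does not define them; the sketch below gives one plausible reading of each, for intuition only. The function names `sparse_average` and `has_zero_triple`, the exact input format, and the modular-sum form of triple detection are assumptions inferred from the abstract's framing, not the paper's formal definitions.

```python
import numpy as np

def sparse_average(values: np.ndarray, index_sets: list[set[int]]) -> np.ndarray:
    """Sketch of a sparse averaging task: for each position i, output the
    average of the value vectors at the small index set index_sets[i].
    `values` has shape (n, d); each index_sets[i] is a q-element subset of
    range(n). Details of the paper's formal task may differ."""
    return np.stack([values[sorted(s)].mean(axis=0) for s in index_sets])

def has_zero_triple(x: list[int], modulus: int) -> bool:
    """Sketch of a triple detection task: is there a triple (i, j, k) with
    x[i] + x[j] + x[k] == 0 (mod modulus)? Brute-force O(n^3) reference;
    the modular-sum form is an assumption, not quoted from the paper."""
    n = len(x)
    return any(
        (x[i] + x[j] + x[k]) % modulus == 0
        for i in range(n)
        for j in range(i, n)
        for k in range(j, n)
    )
```

Under this reading, the positive result says a transformer can realize something like `sparse_average` with complexity parameters growing only logarithmically in the input size n, where recurrent and feedforward networks need polynomial scaling; the negative result says attention layers need resources scaling linearly in n for something like `has_zero_triple`.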