arxiv:2401.06104

Transformers are Multi-State RNNs

Published on Jan 11
· Submitted by akhaliq on Jan 12

Abstract

Transformers are considered conceptually different from the previous generation of state-of-the-art NLP models - recurrent neural networks (RNNs). In this work, we demonstrate that decoder-only transformers can in fact be conceptualized as infinite multi-state RNNs - an RNN variant with unlimited hidden state size. We further show that pretrained transformers can be converted into finite multi-state RNNs by fixing the size of their hidden state. We observe that several existing transformer cache compression techniques can be framed as such conversion policies, and introduce a novel policy, TOVA, which is simpler than these policies. Our experiments with several long-range tasks indicate that TOVA outperforms all other baseline policies while being nearly on par with the full (infinite) model, in some cases using only 1/8 of the original cache size. Our results indicate that transformer decoder LLMs often behave in practice as RNNs. They also lay out the option of mitigating one of their most painful computational bottlenecks - the size of their cache memory. We publicly release our code at https://github.com/schwartz-lab-NLP/TOVA.
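
To make the "finite multi-state RNN" view concrete, here is a minimal, hypothetical sketch of a TOVA-style eviction step for a single attention head, written in PyTorch. It assumes the policy keeps the cache (the multi-state) at a fixed size by dropping the token the newest query attends to the least; the function name, tensor shapes, and API are illustrative, not the authors' implementation (see the linked repository for that).

import torch

def tova_compress(keys, values, attn_weights, cache_size):
    """Illustrative TOVA-style cache compression (not the authors' exact code).

    keys, values:  [num_tokens, head_dim] cached states for one attention head
    attn_weights:  [num_tokens] attention weights of the newest query over the cache
    cache_size:    maximum number of multi-states (tokens) to keep
    """
    if keys.shape[0] <= cache_size:
        # Cache still fits: behave like the regular (unbounded) transformer.
        return keys, values

    # Evict the token the current query attends to the least; all others stay,
    # so the hidden state size never exceeds cache_size (a finite MSRNN).
    drop = torch.argmin(attn_weights).item()
    keep = torch.tensor([i for i in range(keys.shape[0]) if i != drop],
                        device=keys.device)
    return keys[keep], values[keep]

# Example: a 9-token cache compressed back to 8 multi-states.
keys = torch.randn(9, 64)
values = torch.randn(9, 64)
attn = torch.softmax(torch.randn(9), dim=0)
keys, values = tova_compress(keys, values, attn, cache_size=8)

Applied at every decoding step, a rule of this kind keeps memory constant in the sequence length, which is what lets the decoder be read as an RNN with a fixed-size hidden state.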

Community

Paper author

It should work now

https://github.com/schwartz-lab-NLP/TOVA page not found...

The repository was reuploaded! However, it is still under construction. Please check it out again!

Paper author

The repository is ready to use! Go ahead and convert your LLaMA into a finite MSRNN :)

https://github.com/schwartz-lab-NLP/TOVA

Unlocking Transformers: The Secret Connection to RNNs Revealed!

Links 🔗:

👉 Subscribe: https://www.youtube.com/@Arxflix
👉 Twitter: https://x.com/arxflix
👉 LMNT (Partner): https://lmnt.com/

By Arxflix
