Researchers from Instituto de Telecomunicações, DeepMind, the Institute of Systems and Robotics, Instituto Superior Técnico, and Unbabel propose the ∞-former, a transformer model with an unbounded long-term memory (LTM) that can attend to arbitrarily long contexts.
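The key idea behind the unbounded LTM is to represent past token embeddings as a continuous signal over [0, 1] (an expansion in radial basis functions fitted by ridge regression) and to attend over that signal with a continuous probability density, so memory cost stays fixed regardless of context length. Below is a minimal, simplified sketch of that mechanism in NumPy; the function names, basis width, and the single-Gaussian attention density are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def rbf_basis(t, centers, width=0.05):
    # Evaluate N radial basis functions psi_j(t) at positions t in [0, 1].
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def compress_to_ltm(X, num_basis=32, width=0.05, ridge=1e-3):
    # Fit coefficients B so X[i] ~= B^T psi(t_i): multivariate ridge regression.
    # The L-step context is squeezed into num_basis coefficient vectors.
    L, _ = X.shape
    t = np.linspace(0.0, 1.0, L)
    F = rbf_basis(t, np.linspace(0.0, 1.0, num_basis), width)   # (L, N)
    # Closed-form ridge solution: B = (F^T F + lambda*I)^-1 F^T X, shape (N, d)
    return np.linalg.solve(F.T @ F + ridge * np.eye(num_basis), F.T @ X)

def continuous_attention(B, mu, var, num_basis=32, width=0.05, grid=1000):
    # Context vector c = integral of p(t) * x(t) dt over [0, 1], where
    # x(t) = B^T psi(t) reconstructs the signal and p is a Gaussian density
    # (here approximated on a discrete grid for simplicity).
    t = np.linspace(0.0, 1.0, grid)
    F = rbf_basis(t, np.linspace(0.0, 1.0, num_basis), width)   # (grid, N)
    x = F @ B                                                    # (grid, d)
    p = np.exp(-((t - mu) ** 2) / (2 * var))
    p /= p.sum()                                                 # normalise on the grid
    return p @ x                                                 # (d,)

# Toy usage: compress 512 past "token embeddings" into 32 basis coefficients,
# then attend near the end of the sequence (mu = 0.9).
rng = np.random.default_rng(0)
X = rng.normal(size=(512, 64))
B = compress_to_ltm(X)
c = continuous_attention(B, mu=0.9, var=0.01)
```

Note that `B` has shape (32, 64) however long the original context was, which is what makes the memory "unbounded": extending the context changes the fit, not the storage.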

Here is a quick read: Infinite Memory Transformer: Attending to Arbitrarily Long Contexts Without Increasing Computation Burden.

The paper ∞-former: Infinite Memory Transformer is on arXiv.


