AesVoy
Infini-Attention Paper Review
3 May 2024 · 438 words · 3 mins
Infini-Attention introduces a novel approach that lets Transformer models process infinitely long inputs with a bounded memory footprint and bounded computation.
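At its core, the paper combines standard local dot-product attention within each segment with a compressive memory carried across segments; a learned gate mixes the two outputs, and the memory is updated with each segment's keys and values. Below is a minimal single-head NumPy sketch of that idea. The function name, the scalar gate `beta`, and the small epsilon for numerical stability are my own simplifications, not the paper's exact multi-head formulation.

```python
import numpy as np

def elu_plus_one(x):
    # sigma(x) = ELU(x) + 1, the nonlinearity used for the linear-attention memory
    return np.where(x > 0, x + 1.0, np.exp(x))

def infini_attention_segment(Q, K, V, M, z, beta):
    """One segment of Infini-Attention (simplified, single head).

    Q, K, V : (L, d) projections for the current segment
    M       : (d, d) compressive memory carried over from previous segments
    z       : (d,) normalization term accumulated alongside M
    beta    : scalar gate logit mixing memory retrieval and local attention
    """
    d = Q.shape[-1]
    sQ, sK = elu_plus_one(Q), elu_plus_one(K)

    # Retrieve from compressive memory: A_mem = sigma(Q) M / (sigma(Q) z)
    A_mem = (sQ @ M) / (sQ @ z + 1e-6)[:, None]

    # Standard causal dot-product attention within the segment
    scores = (Q @ K.T) / np.sqrt(d)
    scores += np.triu(np.full(scores.shape, -np.inf), k=1)  # causal mask
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    A_local = weights @ V

    # Learned gate (sigmoid of beta) blends memory retrieval with local attention
    gate = 1.0 / (1.0 + np.exp(-beta))
    A = gate * A_mem + (1.0 - gate) * A_local

    # Update memory and normalization with this segment's keys/values;
    # memory size stays (d, d) no matter how many segments are processed
    M_new = M + sK.T @ V
    z_new = z + sK.sum(axis=0)
    return A, M_new, z_new
```

Because `M` is a fixed `(d, d)` matrix regardless of how many segments have been consumed, memory stays constant while the effective context grows without bound.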