
EM-LLM: Human-Inspired Episodic Memory for Infinite Context LLMs

mountainriver

TTT, Canon layers, and Titans seem like a stronger approach IMO.

Information needs to be compressed into latent space, or it becomes computationally intractable.
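
To make the tractability point concrete: a minimal sketch (not TTT, Titans, or Canon layers themselves, and all names and shapes here are illustrative) of a linear-attention-style memory that compresses the entire history into a fixed-size latent state. Per-token cost stays O(d^2) no matter how long the stream gets, whereas a standard KV cache grows with sequence length and attention cost grows with it.

```python
# Hedged sketch: a fixed-size latent memory in the linear-attention family.
# The state S absorbs every (key, value) pair, so reads never touch a growing
# cache -- this is the "compress into latent space" argument in miniature.
import numpy as np

d = 64                            # head dimension (hypothetical)
rng = np.random.default_rng(0)

S = np.zeros((d, d))              # fixed-size latent memory state
for t in range(10_000):           # arbitrarily long token stream
    k = rng.standard_normal(d)    # stand-in key projection for token t
    v = rng.standard_normal(d)    # stand-in value projection
    q = rng.standard_normal(d)    # stand-in query projection
    S += np.outer(v, k)           # write: fold (k, v) into the state
    y = S @ q                     # read: output depends on all history via S only

print(S.shape)                    # (64, 64): constant, however long the stream runs
```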

searchguy

Do you have references for

> TTT, Canon layers, and Titans

MacsHeadroom

So, infinite context length by making it compute-bound instead of memory-bound. Curious how much longer this takes to run, and when it makes sense to use vs RAG.
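
For intuition on where the compute goes: a rough sketch of retrieval over stored segments (not EM-LLM's actual algorithm; the segment keys, shapes, and top-k lookup are assumptions for illustration). If each episodic segment is summarized by a representative key and only the top-k most similar segments are pulled back into attention per step, per-token work is bounded by k rather than by total context length, much like a RAG lookup over an external store.

```python
# Hedged sketch: top-k retrieval over per-segment representative keys.
import numpy as np

rng = np.random.default_rng(1)
d, n_segments, k = 64, 1_000, 4                 # hypothetical sizes

seg_keys = rng.standard_normal((n_segments, d)) # one representative key per segment
q = rng.standard_normal(d)                      # current query

scores = seg_keys @ q                           # similarity to every stored segment
top_k = np.argsort(scores)[-k:]                 # k segments to bring back into attention
print(top_k)                                    # per-step cost scales with k, not context
```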