HMT: Hierarchical Memory Transformer for Efficient Long Context Language Processing

Computer Science

Z. He, Y. Cao, et al.

The Hierarchical Memory Transformer (HMT) imitates the human memory hierarchy to improve long-context processing: it preserves tokens from early segments, passes memory embeddings forward across segments, and recalls relevant history through memory-augmented segment-level recurrence. HMT improves language modeling, question answering, and summarization while using far fewer parameters and much less inference memory. Research conducted by Zifan He, Yingqi Cao, Zongyue Qin, Neha Prakriya, Yizhou Sun, and Jason Cong.
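To make the recurrence concrete, here is a minimal PyTorch sketch of memory-augmented segment-level recurrence under stated assumptions: a backbone processes one segment at a time, a cross-attention step recalls relevant embeddings from a cache of earlier segment summaries, and each processed segment is summarized back into the cache. The class name, dimensions, summary-by-mean pooling, and recall mechanism are all illustrative assumptions, not the authors' implementation; HMT's actual architecture differs in detail.

```python
# Illustrative sketch of memory-augmented segment-level recurrence.
# All names and design choices are assumptions, not HMT's actual code.
import torch
import torch.nn as nn

class MemoryRecurrence(nn.Module):
    def __init__(self, d_model=64, n_heads=4, cache_size=8):
        super().__init__()
        # Backbone stand-in: any segment-level encoder would do here.
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        # Cross-attention used to recall relevant cached memory embeddings.
        self.recall = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cache_size = cache_size
        self.memory_cache = []  # embeddings summarizing earlier segments

    def forward(self, segment):  # segment: (batch, seg_len, d_model)
        if self.memory_cache:
            # Recall: attend from the current segment's summary over the
            # cached memories, then prepend the recalled memory embedding.
            cache = torch.stack(self.memory_cache, dim=1)  # (B, n_mem, d)
            query = segment.mean(dim=1, keepdim=True)      # (B, 1, d)
            recalled, _ = self.recall(query, cache, cache)
            segment = torch.cat([recalled, segment], dim=1)
        hidden = self.backbone(segment)
        # Summarize this segment into one memory embedding and cache it,
        # keeping only the most recent `cache_size` entries. Detaching
        # keeps the sketch simple (no backprop across segments).
        self.memory_cache.append(hidden.mean(dim=1).detach())
        self.memory_cache = self.memory_cache[-self.cache_size:]
        return hidden

# Usage: process a long sequence as a stream of fixed-size segments,
# so memory of earlier segments carries forward across calls.
model = MemoryRecurrence()
long_seq = torch.randn(2, 4 * 32, 64)  # batch=2, 4 segments of length 32
for seg in long_seq.split(32, dim=1):
    out = model(seg)
```

The key design point this sketch illustrates is that per-segment cost stays constant: each step attends over a fixed-size memory cache rather than the full history, which is what keeps inference memory low for long inputs.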
Citation Metrics
Citations: 2
Influential Citations: 0
Reference Count: 52

Note: The citation metrics presented here have been sourced from Semantic Scholar and OpenAlex.
