Attention
Collection by amenur • Mar 1, 2024

- System 2 Attention (is something you might need too) — Paper • 2311.11829 • Published Nov 20, 2023 • 40 upvotes
- Transformers are Multi-State RNNs — Paper • 2401.06104 • Published Jan 11, 2024 • 37 upvotes
- The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits — Paper • 2402.17764 • Published Feb 27, 2024 • 608 upvotes