FalconMamba 7B Collection This collection features the FalconMamba 7B base model, the instruction-tuned version, their 4-bit and GGUF variants, and the demo. • 15 items • Updated Nov 6, 2025 • 35
Falcon3 Collection The Falcon3 family of Open Foundation Models is a set of pretrained and instruction-tuned LLMs ranging from 1B to 10B parameters. • 40 items • Updated Nov 6, 2025 • 89
Falcon Edge series Collection A series of powerful, universal, and fine-tunable small language models. • 7 items • Updated Nov 6, 2025 • 25
Falcon-H1 Collection The Falcon-H1 family of hybrid-head (Transformer-SSM) language models, available in 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B sizes (pretrained & instruction-tuned). • 39 items • Updated 1 day ago • 58
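The collections above ship base, instruction-tuned, 4-bit, and GGUF checkpoints. As a minimal sketch of how one of these checkpoints could be pulled in with Hugging Face transformers: the repo id below is an assumption inferred from the collection names (substitute the exact id from the collection card), and the published 4-bit variants may use different quantization settings than the generic config shown here.

```python
# Sketch only: loading a Falcon checkpoint from the collections above with
# Hugging Face transformers. The repo id is an assumption based on the
# collection names, not taken from the listing itself.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "tiiuae/falcon-mamba-7b-instruct"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Full-precision load; device_map="auto" requires the accelerate package.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Alternative: an on-the-fly 4-bit load (requires bitsandbytes). The prebuilt
# 4-bit checkpoints in the collection may use different quantization settings.
quant_config = BitsAndBytesConfig(load_in_4bit=True)
model_4bit = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", quantization_config=quant_config
)

prompt = "Give a one-sentence summary of state-space language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The GGUF variants mentioned in the FalconMamba collection are intended for llama.cpp-style runtimes rather than transformers, so they are not covered by this sketch.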
Learnable Multipliers: Freeing the Scale of Language Model Matrix Layers Paper • 2601.04890 • Published 2 days ago • 31
Mem0: Building Production-Ready AI Agents with Scalable Long-Term Memory Paper • 2504.19413 • Published Apr 28, 2025 • 38
AT^2PO: Agentic Turn-based Policy Optimization via Tree Search Paper • 2601.04767 • Published 2 days ago • 21
Can LLMs Predict Their Own Failures? Self-Awareness via Internal Circuits Paper • 2512.20578 • Published 18 days ago • 68
Digital Twin AI: Opportunities and Challenges from Large Language Models to World Models Paper • 2601.01321 • Published 7 days ago • 15
UniCorn: Towards Self-Improving Unified Multimodal Models through Self-Generated Supervision Paper • 2601.03193 • Published 4 days ago • 41
EXAONE-3.0 Collection The EXAONE 3.0 7.8B instruction-tuned language model. • 3 items • Updated Jul 7, 2025 • 6