The Smol Training Playbook 📚: The secrets to building world-class LLMs
The Ultra-Scale Playbook 🌌: The ultimate guide to training LLMs on large GPU clusters
DBRX Collection: DBRX is a mixture-of-experts (MoE) large language model trained from scratch by Databricks. 3 items, updated Mar 27, 2024.