- **LLM Model VRAM Calculator** 📈 — calculate VRAM requirements for running large language models (384 likes)
- **DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters** — updated 24 days ago (89 likes)
- **Open LLM Leaderboard** 🏆 — track, rank and evaluate open LLMs and chatbots (12.6k likes)
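The VRAM calculator's exact inputs are not shown here, but the core arithmetic behind such estimates is simple: weight memory is parameter count times bytes per weight, plus extra headroom for activations and KV cache. A minimal sketch under those assumptions (the function name and the 20% overhead factor are illustrative, not the calculator's actual method):

```python
def estimate_vram_gb(n_params_billion: float,
                     bits_per_weight: int,
                     overhead_factor: float = 1.2) -> float:
    """Rough VRAM estimate in GB.

    weights: n_params (billions) * bits / 8 gives GB of weight storage;
    overhead_factor is an assumed ~20% allowance for activations/KV cache.
    """
    weight_gb = n_params_billion * bits_per_weight / 8
    return weight_gb * overhead_factor

# Example: a 7B model in fp16 needs ~14 GB for weights alone,
# so roughly 16-17 GB with overhead; a 4-bit quant needs far less.
print(estimate_vram_gb(7, 16))  # fp16
print(estimate_vram_gb(7, 4))   # 4-bit quantized
```

Real calculators also account for context length (KV cache grows linearly with it), batch size, and framework overhead, so treat this as a lower-bound sanity check.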