LiquidAI/LFM2.5-1.2B-Instruct ⚡ — LFM2.5 (1.2B) runs on Ollama using only a dual-core CPU
It's a good question. Experimentally, it generalizes very well to other languages, but adding prompts in your target languages will always be more effective and fine-grained.
LFM2.5-VL-1.6B WebGPU 🧠 — In-browser vision-language inference with LFM2.5-VL-1.6B