3090s?

#39 opened by xiaotianyu2025

Can it be deployed on one or two 3090s?

I think you might need at least 45 GB of VRAM if you are using the 20B model.

I am running it on my 3090 with 24 GB of VRAM, both in Ollama and LM Studio, without any issues. I haven't tried the tensor model.

Ollama uses about 15 GB.
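
For anyone testing the Ollama route, here is a minimal sketch of calling a locally served model over Ollama's REST API. The `gpt-oss:20b` tag is an assumption, substitute whatever tag your local pull actually uses:

```python
# Minimal sketch: query a model served locally by Ollama over its REST API.
# Assumes Ollama is running on the default port (11434) and that the model
# tag is "gpt-oss:20b" -- replace it with the tag you actually pulled.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "gpt-oss:20b",   # assumed tag, adjust to your local model name
    "prompt": "Say hello in one sentence.",
    "stream": False,          # return one JSON object instead of a stream
}

resp = requests.post(OLLAMA_URL, json=payload, timeout=300)
resp.raise_for_status()
print(resp.json()["response"])
```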
