Story

#2 by Noose1 - opened

Story writing

Can we fine-tune it with mlx_lm.lora?
It is giving this error, and the model repo doesn't seem to have a config file.
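(For context, a typical mlx_lm.lora training run is launched roughly as in the sketch below; the model id, data folder, and hyperparameters are placeholders, not values confirmed for this repo, and mlx_lm generally expects a Transformers/MLX-format repo that includes a config.json.)

```python
# Rough sketch of a typical mlx_lm.lora run; all paths and values are placeholders.
import subprocess

subprocess.run(
    [
        "python", "-m", "mlx_lm.lora",
        "--model", "some-org/some-mlx-or-hf-model",  # placeholder: repo must contain config.json
        "--train",
        "--data", "path/to/data",    # folder holding train.jsonl / valid.jsonl
        "--iters", "600",
        "--batch-size", "1",         # small batch size to stay within 64 GB of unified memory
    ],
    check=True,
)
```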

Thanks, David.

Can you fine-tune it on custom data? Even with 64 GB of RAM, it is not starting.

Sorry, I cannot help here - please contact the software/app provider about fine-tuning.
I don't tune MOEs; just the models that go into them (on a case-by-case basis - otherwise I use merge/"dna" swap techniques).
Tuning MOEs is a bit more involved than tuning a single model.

I'm new to this. Is there anyone providing API access for this?

@3Zon Do you mean the model or fine-tuning?
For the model, you need an app like LM Studio, Text Generation WebUI, or KoboldCpp to "run" the model, so to speak.

For fine-tuning: Unsloth; for model merges: Mergekit.
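(A minimal sketch of what an Unsloth LoRA setup usually looks like; the base model name, sequence length, and LoRA settings below are placeholder assumptions, not a recipe for this specific repo.)

```python
# Hedged sketch of a typical Unsloth LoRA setup; names and values are placeholders.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/mistral-7b-bnb-4bit",  # placeholder base model
    max_seq_length=2048,
    load_in_4bit=True,                         # 4-bit loading keeps memory use low
)

model = FastLanguageModel.get_peft_model(
    model,
    r=16,                                      # LoRA rank
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
# From here, the usual path is to train with trl's SFTTrainer on your custom dataset.
```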

@DavidAU I mean, is there anyone already running the model somewhere and providing API access that I can try out?

@3Zon
Click on the "Use this model" tab (this repo) -> see "vLLM"; Backyard may also have it available.
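(If a provider does host it behind vLLM's OpenAI-compatible server, querying it usually looks roughly like the sketch below; the base URL, API key, and model id are placeholders, not a real hosted deployment of this repo.)

```python
# Sketch of querying a vLLM OpenAI-compatible endpoint; base_url, api_key, and
# model id are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

resp = client.chat.completions.create(
    model="DavidAU/this-model",  # placeholder; use whatever id the server exposes
    messages=[{"role": "user", "content": "Write the opening of a short story."}],
    max_tokens=200,
)
print(resp.choices[0].message.content)
```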
