Do these source files work for this model too?

#1
by ElvisM

https://huggingface.co/DavidAU/Llama-3.2-8X4B-MOE-V2-Dark-Champion-Instruct-uncensored-abliterated-21B-GGUF

It's version 2, I know, but I just wanted to confirm. I haven't gotten good results from the 8X4B version. I'm not sure whether, even though it's a MOE, it simply can't achieve the same quality as a good Nemo finetune.

I'm not sure if I'm reading your comment correctly - are you comparing this to V1 or to a Nemo finetune?

If the latter ->
A Nemo finetune will likely be stronger, due to its 12B parameters VS 8 copies of a 4B model working together, especially in terms of nuance / specific prompts.
In other words, this MOE will have a lot of brainpower, but not the "depth" of a 12B, so to speak.

However, you may be able to get stronger performance from this MOE if your prompt(s) are more detailed and carry more directives or requirements VS the same
prompt for a 12B Nemo. Smaller models need more direction - they infer less / don't "understand" unwritten nuance in instructions the same way
larger models do. (See the sketch below for the kind of difference I mean.)
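As a rough illustration, here's a minimal sketch using llama-cpp-python comparing a terse prompt against a directive-rich one. The GGUF filename, context size, and sampling settings are assumptions for the example, not tested values - check the actual file list in the repo.

```python
# Minimal sketch: terse vs. directive-rich prompting of a small MOE GGUF.
# Assumption: llama-cpp-python is installed and the quant filename below
# matches one of the files in the repo (adjust to the real file name).
from llama_cpp import Llama

llm = Llama(
    model_path="Llama-3.2-8X4B-MOE-V2-Dark-Champion-21B.Q4_K_M.gguf",  # assumed name
    n_ctx=4096,
    n_gpu_layers=-1,  # offload all layers if VRAM allows
)

# Terse prompt - leaves tone, length, and structure for the model to infer.
terse = "Write a scene where two rivals meet."

# Directive-rich prompt - spells out what a larger model might infer on its own.
detailed = (
    "Write a 300-400 word scene where two rivals meet in a rain-soaked alley. "
    "Third-person past tense. Show tension through body language and clipped "
    "dialogue; no internal monologue. End on an unresolved threat."
)

for prompt in (terse, detailed):
    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": prompt}],
        max_tokens=512,
        temperature=0.8,
    )
    print(out["choices"][0]["message"]["content"], "\n---")
```

The idea is that the second prompt hands the 4B experts the constraints a 12B Nemo would often fill in by itself.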
