p1atdev/dart-v2-moe-sft
Tags: Text Generation · Transformers · Safetensors · mixtral · trl · sft · optimum · danbooru · text-generation-inference
Dataset: isek-ai/danbooru-tags-2024
License: apache-2.0
History: 3 commits, 1 contributor. Latest commit: 322a6a2 (verified), "Upload MixtralForCausalLM" by p1atdev, 10 months ago.
| File | Size | Last commit | Age |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 10 months ago |
| README.md | 5.18 kB | Upload tokenizer | 10 months ago |
| config.json | 817 Bytes | Upload MixtralForCausalLM | 10 months ago |
| generation_config.json | 162 Bytes | Upload MixtralForCausalLM | 10 months ago |
| model.safetensors | 331 MB (LFS) | Upload MixtralForCausalLM | 10 months ago |
| special_tokens_map.json | 889 Bytes | Upload tokenizer | 10 months ago |
| tokenizer.json | 996 kB | Upload tokenizer | 10 months ago |
| tokenizer_config.json | 19.7 kB | Upload tokenizer | 10 months ago |