---
library_name: transformers
tags:
- mergekit
- merge
- llama-cpp
- gguf-my-repo
- not-for-all-audiences
base_model: v000000/L3-8B-Poppy-Sunspice
---

Why choose? The full model name is "Llama-3-8B-Poppy-Sunspice". An RP model, and a pretty creative one.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f74b6e6389380c77562762/Og-dxkbY6ICxoQv4r1vjg.png)

# v000000/L3-8B-Poppy-Sunspice-Q8_0-GGUF

This model was converted to GGUF format from [`v000000/L3-8B-Poppy-Sunspice`](https://huggingface.co/v000000/L3-8B-Poppy-Sunspice) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/v000000/L3-8B-Poppy-Sunspice) for more details on the model.

# Update/Notice:

This model has a tendency toward endless generations. Adding a bit of repetition penalty to your sampling parameters helps if this occurs.

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.
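Conceptually, the linear method computes each parameter of the merged model as a weight-normalized average of the corresponding parameters from the source models. A minimal sketch in plain Python, with lists standing in for tensors (the `linear_merge` function is illustrative only, not mergekit's actual API):

```python
# Illustrative sketch of a linear model merge: every named parameter in the
# result is the weighted average of that parameter across the source models.
# Plain Python lists stand in for weight tensors.

def linear_merge(models, weights):
    """Weighted average of matching parameter vectors across models.

    models:  list of dicts mapping parameter name -> list of floats
    weights: one merge weight per model (normalized by their sum)
    """
    total = sum(weights)
    merged = {}
    for name in models[0]:
        merged[name] = [
            sum(w * m[name][i] for w, m in zip(weights, models)) / total
            for i in range(len(models[0][name]))
        ]
    return merged
```

With the weights from the configuration below summing to 1.0, the division by `total` is a no-op, but normalizing keeps the sketch correct for arbitrary weights.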
### Models Merged

The following models were included in the merge:

* [Sao10K/L3-8B-Stheno-v3.1](https://huggingface.co/Sao10K/L3-8B-Stheno-v3.1)
* [Nitral-Archive/Poppy_Porpoise-Biomix](https://huggingface.co/Nitral-Archive/Poppy_Porpoise-Biomix)
* [Hastagaras/HALU-8B-LLAMA3-BRSLURP](https://huggingface.co/Hastagaras/HALU-8B-LLAMA3-BRSLURP)
* [crestf411/L3-8B-sunfall-abliterated-v0.1](https://huggingface.co/crestf411/L3-8B-sunfall-abliterated-v0.1)
* [cgato/L3-TheSpice-8b-v0.8.3](https://huggingface.co/cgato/L3-TheSpice-8b-v0.8.3)
* [Nitral-AI/Poppy_Porpoise-0.72-L3-8B](https://huggingface.co/Nitral-AI/Poppy_Porpoise-0.72-L3-8B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: crestf411/L3-8B-sunfall-abliterated-v0.1
    parameters:
      weight: 0.1
  - model: Nitral-AI/Poppy_Porpoise-0.72-L3-8B
    parameters:
      weight: 0.3
  - model: Nitral-Archive/Poppy_Porpoise-Biomix
    parameters:
      weight: 0.1
  - model: Sao10K/L3-8B-Stheno-v3.1
    parameters:
      weight: 0.2
  - model: Hastagaras/HALU-8B-LLAMA3-BRSLURP
    parameters:
      weight: 0.1
  - model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      weight: 0.2
merge_method: linear
dtype: float16
```

# Prompt Template:

```bash
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>

{input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

{output}<|eot_id|>
```
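For frontends that expect a raw prompt string rather than a chat template, the template above can be assembled programmatically. A minimal sketch (the `build_prompt` helper is hypothetical, not part of any library; it fills the system and user slots and leaves the assistant turn open for generation):

```python
# Hypothetical helper that fills the Llama-3 style prompt template shown above.
# The assistant header is left open so the model generates the reply.

def build_prompt(system_prompt, user_input):
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )
```

The resulting string can be passed directly as the prompt; the model's reply should then be terminated at its own `<|eot_id|>` token.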