Update README.md
README.md
CHANGED
@@ -1,47 +1,36 @@
 ---
-base_model:
-- Ppoyaa/LlumiLuminRP-8B-Instruct-262k-v0.3
-- ChaoticNeutrals/Poppy_Porpoise-v0.7-L3-8B
 library_name: transformers
 tags:
 - mergekit
 - merge
-
 ---
-# …
-…
-  t:
-    - filter: self_attn
-      value: [0, 0.5, 0.3, 0.7, 1]
-    - filter: mlp
-      value: [1, 0.5, 0.7, 0.3, 0]
-    - value: 0.5
-dtype: bfloat16
-```
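The removed `t:` block above is the tail of a mergekit merge configuration (the preamble was not captured in the diff). A minimal sketch of the kind of slerp config it plausibly came from: only the `parameters.t` values and `dtype` are taken from the removed lines, while `merge_method`, `slices`, `layer_range`, and the choice of base model (drawn from the removed `base_model` frontmatter) are assumptions.

```yaml
# Hypothetical reconstruction: only parameters.t and dtype appear in the
# removed lines above; merge_method, slices, and layer ranges are assumed.
slices:
  - sources:
      - model: Ppoyaa/LlumiLuminRP-8B-Instruct-262k-v0.3
        layer_range: [0, 32]
      - model: ChaoticNeutrals/Poppy_Porpoise-v0.7-L3-8B
        layer_range: [0, 32]
merge_method: slerp
base_model: Ppoyaa/LlumiLuminRP-8B-Instruct-262k-v0.3
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```

In mergekit's slerp method, `t` is the interpolation weight between the two source models: the lists give per-layer-group values for the self-attention and MLP tensors, and the bare `value: 0.5` is the default for all other tensors.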
 ---
 library_name: transformers
 tags:
 - mergekit
 - merge
+license: apache-2.0
 ---
+# LlumiLuminRP-8B-Instruct-262k-v0.4
+
+***
+## Description
+An update to v0.3 that improves coherence and the roleplaying experience. This model is the result of merging several Llama-3-8B RP/ERP models and uses a 262k-token context window.
+***
+## 💻 Usage
+```python
+!pip install -qU transformers accelerate
+
+from transformers import AutoTokenizer
+import transformers
+import torch
+
+model = "Ppoyaa/LlumiLuminRP-8B-Instruct-262k-v0.4"
+messages = [{"role": "user", "content": "What is a large language model?"}]
+
+tokenizer = AutoTokenizer.from_pretrained(model)
+prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
+pipeline = transformers.pipeline(
+    "text-generation",
+    model=model,
+    torch_dtype=torch.float16,
+    device_map="auto",
+)
+
+outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
+print(outputs[0]["generated_text"])
+```
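Note that the `!pip` line in the added snippet is notebook shell syntax; in a plain Python environment, run `pip install -U transformers accelerate` first. As an alternative to the pipeline API, a minimal `generate()`-based sketch follows; the model ID is the one from the card, and the sampling settings are illustrative, not prescribed by it.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ppoyaa/LlumiLuminRP-8B-Instruct-262k-v0.4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "What is a large language model?"}]
# Apply the chat template and get token IDs directly as a tensor.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```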