---
tags:
- text-generation
- poetry
- gpt2
- rl
license: mit
datasets:
- biglam/gutenberg-poetry-corpus
model_name: GPT2-124 Poetry RL
library_name: transformers
---

# GPT2-124 Fine-Tuned on Poetry with Reinforcement Learning (RL)

## Model Description

This model, **GPT2-124 Poetry RL**, is a fine-tuned version of GPT-2 trained on the **Gutenberg Poetry Corpus** with **Reinforcement Learning (RL)**. It is optimized for poetic generation with enhanced stylistic qualities such as **rhyme, coherence, and creativity**.
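
The card does not state which RL algorithm was used. As a rough illustration only, the reward-driven loop that such fine-tuning follows can be sketched as a minimal REINFORCE-style update on a toy two-action problem (the rewards, baseline, and learning rate below are hypothetical, not the authors' settings):

```python
import math
import random

def softmax(logits):
    """Turn raw preference scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def train(steps=2000, lr=0.1, seed=0):
    """Toy REINFORCE loop: sample an action, score it, nudge probabilities."""
    rng = random.Random(seed)
    logits = [0.0, 0.0]      # preference for each action
    rewards = [0.2, 1.0]     # hypothetical rewards; action 1 is better
    baseline = 0.6           # mean reward, subtracted to reduce variance
    for _ in range(steps):
        probs = softmax(logits)
        action = 0 if rng.random() < probs[0] else 1
        advantage = rewards[action] - baseline
        # REINFORCE: d/d_logit_i log pi(action) = 1[i == action] - probs[i]
        for i in range(len(logits)):
            indicator = 1.0 if i == action else 0.0
            logits[i] += lr * advantage * (indicator - probs[i])
    return softmax(logits)

probs = train()
```

After training, the policy concentrates on the higher-reward action; in the real setting the "actions" are generated poem continuations and the reward is the rhyme/coherence/creativity score.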

## Training Details
- **Base Model**: [GPT-2 (124M)](https://huggingface.co/gpt2)
- **Dataset**: [Gutenberg Poetry Corpus](https://huggingface.co/datasets/biglam/gutenberg-poetry-corpus)
- **Fine-Tuning Approach**:
  - Supervised fine-tuning on poetry lines.
  - RL with custom reward functions:
    - **Rhyme Reward**: Encourages rhyming lines.
    - **Coherence Reward**: Ensures logical flow.
    - **Creativity Reward**: Penalizes repetition and rewards unique wording.
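
The actual reward functions are not published with this card. As a hedged sketch of the idea, simple surrogate versions of the rhyme and creativity rewards could look like this (suffix matching and unique-word ratio are assumptions for illustration, not the authors' definitions):

```python
def rhyme_reward(line_a: str, line_b: str, suffix_len: int = 3) -> float:
    """Crude rhyme check: 1.0 if the final words share a suffix, else 0.0."""
    last_a = line_a.strip().split()[-1].lower().strip(".,;:!?")
    last_b = line_b.strip().split()[-1].lower().strip(".,;:!?")
    if len(last_a) >= suffix_len and len(last_b) >= suffix_len:
        return 1.0 if last_a[-suffix_len:] == last_b[-suffix_len:] else 0.0
    return 0.0

def creativity_reward(lines: list[str]) -> float:
    """Fraction of unique words across the poem; repetition lowers the score."""
    words = [w.lower() for line in lines for w in line.split()]
    return len(set(words)) / len(words) if words else 0.0
```

In an RL fine-tuning loop these per-poem scores would be combined (e.g. a weighted sum) into the scalar reward that the policy update maximizes.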

You can generate poetry using the `transformers` library:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("ayazfau/GPT2-124-poetry-RL")
model = GPT2LMHeadModel.from_pretrained("ayazfau/GPT2-124-poetry-RL")

def generate_poetry(prompt, max_length=50):
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    # Generate a continuation up to max_length tokens.
    output = model.generate(input_ids, max_length=max_length)
    return tokenizer.decode(output[0], skip_special_tokens=True)

print(generate_poetry("fear kill dreams,"))
```
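
The snippet above uses `model.generate` defaults; poetry often reads better with sampling (`do_sample=True`, `top_k`, `temperature` arguments to `generate`). What top-k sampling with temperature does to a next-token distribution can be sketched on a toy example (the vocabulary and logits below are made up):

```python
import math
import random

def top_k_sample(logits: dict[str, float], k: int, temperature: float,
                 rng: random.Random) -> str:
    """Sample a token from the k highest-scoring candidates."""
    # Keep only the k highest-scoring tokens.
    top = sorted(logits.items(), key=lambda kv: kv[1], reverse=True)[:k]
    # Temperature < 1 sharpens the distribution; > 1 flattens it.
    weights = [math.exp(score / temperature) for _, score in top]
    r = rng.random() * sum(weights)
    for (token, _), w in zip(top, weights):
        r -= w
        if r <= 0:
            return token
    return top[-1][0]

rng = random.Random(0)
logits = {"night": 3.0, "song": 2.5, "dream": 2.0, "the": 0.5}
print(top_k_sample(logits, k=3, temperature=0.8, rng=rng))
```

With `k=3`, the low-probability token `"the"` can never be drawn; `transformers` applies the same filtering internally when you pass `top_k` to `generate`.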

## Model Performance

This model is released under the **MIT License**. Feel free to use it for research.

- Special thanks to the **Gutenberg Poetry Corpus** for providing high-quality literary data.

---

_If you use this model, please consider citing it or leaving a star on Hugging Face!_ ⭐