---
license: apache-2.0
datasets:
  - knkarthick/samsum
language:
  - en
metrics:
  - rouge
base_model:
  - google-t5/t5-base
pipeline_tag: summarization
tags:
  - generated_from_text
---

# Summarizer

This model is a fine-tuned version of google-t5/t5-base on the knkarthick/samsum dataset. It achieves the following results on the evaluation set:

- ROUGE-1: 51.41
- ROUGE-2: 26.72
- ROUGE-L: 42.15
- ROUGE-Lsum: 42.17
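
A minimal usage sketch with the Transformers `pipeline` API is shown below. The model id `Keyurjotaniya007/summarizer` and the example dialogue are placeholders for illustration; substitute the actual repository id when loading the model.

```python
from transformers import pipeline

# Model id is an assumed placeholder; replace it with this repository's id.
summarizer = pipeline("summarization", model="Keyurjotaniya007/summarizer")

# A short SAMSum-style dialogue (placeholder example).
dialogue = (
    "Amanda: I baked cookies. Do you want some?\n"
    "Jerry: Sure!\n"
    "Amanda: I'll bring you some tomorrow :-)"
)

summary = summarizer(dialogue, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```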

## Training hyperparameters

The following hyperparameters were used during training; a sketch of the corresponding training arguments follows the list:

- learning_rate: 5e-5
- train_batch_size: 2
- gradient_accumulation_steps: 2
- seed: 42
- weight_decay: 0.01
- lr_scheduler_type: linear
- warmup_ratio: 0.1
- num_epochs: 3
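
This is a hedged sketch of how the settings above map onto `Seq2SeqTrainingArguments`; the `output_dir` value is an assumption and anything not listed above is left at its default, so it is not an exact reproduction of the original training script.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the hyperparameters listed above; output_dir is an assumed placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-samsum",       # assumption, not taken from the card
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    gradient_accumulation_steps=2,
    seed=42,
    weight_decay=0.01,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
)
```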

## Training results

| Training Loss | Epoch | Step  |
|:-------------:|:-----:|:-----:|
| 1.491800      | 1     | 3650  |
| 1.404900      | 2     | 7350  |
| 1.322800      | 3     | 11000 |
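
The ROUGE scores reported above can be computed with the `evaluate` library; the sketch below uses placeholder strings rather than actual model outputs and is not the exact evaluation script used for this model.

```python
import evaluate

# Minimal ROUGE computation sketch; predictions/references are placeholders.
rouge = evaluate.load("rouge")

predictions = ["amanda baked cookies and will bring jerry some tomorrow"]
references = ["Amanda baked cookies and will bring Jerry some tomorrow."]

scores = rouge.compute(predictions=predictions, references=references)
# Scale to percentages to match the reporting style above.
print({k: round(v * 100, 2) for k, v in scores.items()})
```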

## Framework versions

- transformers: 4.56.0
- pytorch: 2.0.1+cu118
- datasets: 2.14.4
- tokenizers: 0.13.3