Summarizer

This model is a fine-tuned version of google-t5/t5-base on the knkarthick/samsum dataset. It achieves the following results on the evaluation set (a usage example follows the scores):

• ROUGE-1: 51.41

• ROUGE-2: 26.72

• ROUGE-L: 42.15

• ROUGE-Lsum: 42.17
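
The snippet below is a minimal usage sketch that loads this checkpoint with the transformers summarization pipeline. The example dialogue and the generation settings (max_length, min_length, do_sample) are illustrative assumptions, not values taken from this card.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
summarizer = pipeline(
    "summarization",
    model="Keyurjotaniya007/t5-base-samsum-summarizer",
)

# Illustrative SAMSum-style dialogue (not from this card).
dialogue = (
    "Amanda: I baked cookies. Do you want some?\n"
    "Jerry: Sure!\n"
    "Amanda: I'll bring you some tomorrow :-)"
)

# Generation settings are assumptions; tune them for your inputs.
result = summarizer(dialogue, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```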

Training hyperparameters

The following hyperparameters were used during training (a corresponding Seq2SeqTrainingArguments sketch follows the list):

• learning_rate: 5e-5

• train_batch_size: 2

• gradient_accumulation_steps: 2

• seed: 42

• weight_decay: 0.01

• lr_scheduler_type: linear

• warmup_ratio: 0.1

• num_epochs: 3
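
As a sketch, these settings map onto Seq2SeqTrainingArguments roughly as follows. The output_dir name and the per-device interpretation of the batch size are assumptions not stated in the card.

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameter list above; output_dir is a hypothetical name.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-samsum-summarizer",  # assumed, not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=2,       # train_batch_size: 2
    gradient_accumulation_steps=2,       # effective batch size of 4
    seed=42,
    weight_decay=0.01,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
)
```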

Training results

| Training Loss | Epoch | Step  |
|--------------:|------:|------:|
| 1.491800      | 1     | 3650  |
| 1.404900      | 2     | 7350  |
| 1.322800      | 3     | 11000 |
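
The ROUGE scores above can be checked against the SAMSum test split with the evaluate and datasets libraries; the sketch below makes assumptions about the generation settings (max_length, truncation), which the card does not specify, so exact numbers may differ.

```python
import evaluate
from datasets import load_dataset
from transformers import pipeline

rouge = evaluate.load("rouge")
test_set = load_dataset("knkarthick/samsum", split="test")
summarizer = pipeline(
    "summarization",
    model="Keyurjotaniya007/t5-base-samsum-summarizer",
)

# Truncate long dialogues to the model's input limit; generation settings are assumptions.
predictions = [
    summarizer(dialogue, max_length=60, truncation=True)[0]["summary_text"]
    for dialogue in test_set["dialogue"]
]

scores = rouge.compute(predictions=predictions, references=test_set["summary"])
print({name: round(value * 100, 2) for name, value in scores.items()})
```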

Framework versions

• transformers: 4.56.0

• pytorch: 2.0.1+cu118

• datasets: 2.14.4

• tokenizers: 0.13.3

Model size

• 223M params (F32, Safetensors)
