CTMAE-P2-V4-S5

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.9568
  • Accuracy: 0.6889
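A minimal inference sketch, assuming the checkpoint is loaded through the standard `transformers` VideoMAE classes; the repo id `beingbatman/CTMAE-P2-V4-S5` and the 16-frame, 224×224 clip shape are assumptions (the latter is VideoMAE's default), not details stated by this card:

```python
# Hypothetical usage sketch: classify a 16-frame clip with this checkpoint.
import numpy as np
import torch
from transformers import AutoImageProcessor, VideoMAEForVideoClassification

model_id = "beingbatman/CTMAE-P2-V4-S5"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(model_id)
model = VideoMAEForVideoClassification.from_pretrained(model_id)

# Dummy clip: 16 RGB frames of 224x224 (replace with real decoded video frames).
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]
inputs = processor(video, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```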

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6500
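The hyperparameters above can be sketched as keyword arguments for `transformers.TrainingArguments`; the `output_dir` name and the mapping of each bullet onto an argument name are assumptions, and dataset loading plus `Trainer` wiring are omitted:

```python
# Hyperparameters from the list above, expressed as TrainingArguments kwargs.
training_kwargs = dict(
    output_dir="CTMAE-P2-V4-S5",       # assumed output directory name
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=6500,                    # "training_steps" above
)

# With transformers (and its accelerate dependency) installed, these would
# be passed on directly:
# from transformers import TrainingArguments
# args = TrainingArguments(**training_kwargs)
```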

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|---------------|---------|------|-----------------|----------|
| 0.6674        | 0.0202  | 131  | 0.8697          | 0.5556   |
| 0.4329        | 1.0202  | 262  | 2.0814          | 0.5556   |
| 1.1234        | 2.0202  | 393  | 1.8388          | 0.5556   |
| 0.6448        | 3.0202  | 524  | 0.8365          | 0.5556   |
| 1.422         | 4.0202  | 655  | 1.5742          | 0.5556   |
| 0.8229        | 5.0202  | 786  | 1.4841          | 0.5556   |
| 1.0158        | 6.0202  | 917  | 1.6325          | 0.5556   |
| 1.5449        | 7.0202  | 1048 | 1.0645          | 0.5556   |
| 0.7285        | 8.0202  | 1179 | 1.6570          | 0.5556   |
| 1.0003        | 9.0202  | 1310 | 1.3149          | 0.5556   |
| 0.6923        | 10.0202 | 1441 | 1.4487          | 0.5556   |
| 0.6469        | 11.0202 | 1572 | 1.5096          | 0.5556   |
| 0.6441        | 12.0202 | 1703 | 0.6743          | 0.5778   |
| 1.3927        | 13.0202 | 1834 | 1.5688          | 0.5556   |
| 0.7324        | 14.0202 | 1965 | 1.8057          | 0.5556   |
| 0.8535        | 15.0202 | 2096 | 0.9958          | 0.5556   |
| 0.6217        | 16.0202 | 2227 | 1.8299          | 0.5556   |
| 0.6316        | 17.0202 | 2358 | 0.9438          | 0.5778   |
| 0.6872        | 18.0202 | 2489 | 1.0891          | 0.5111   |
| 1.5154        | 19.0202 | 2620 | 1.5606          | 0.5778   |
| 0.5531        | 20.0202 | 2751 | 2.0657          | 0.5111   |
| 0.4864        | 21.0202 | 2882 | 1.5141          | 0.5778   |
| 1.0543        | 22.0202 | 3013 | 1.6743          | 0.5333   |
| 1.4812        | 23.0202 | 3144 | 1.6335          | 0.5111   |
| 0.5504        | 24.0202 | 3275 | 2.0608          | 0.5778   |
| 0.0955        | 25.0202 | 3406 | 2.5849          | 0.5111   |
| 0.4181        | 26.0202 | 3537 | 1.5860          | 0.5778   |
| 0.0738        | 27.0202 | 3668 | 1.9424          | 0.6      |
| 0.2064        | 28.0202 | 3799 | 1.8362          | 0.6667   |
| 0.2696        | 29.0202 | 3930 | 2.2124          | 0.5111   |
| 0.5201        | 30.0202 | 4061 | 1.8246          | 0.6444   |
| 0.3376        | 31.0202 | 4192 | 1.9568          | 0.6889   |
| 0.2752        | 32.0202 | 4323 | 1.7191          | 0.6667   |
| 0.2664        | 33.0202 | 4454 | 2.6353          | 0.5778   |
| 0.641         | 34.0202 | 4585 | 2.3520          | 0.5778   |
| 0.9121        | 35.0202 | 4716 | 2.0164          | 0.5556   |
| 0.1392        | 36.0202 | 4847 | 2.5261          | 0.5778   |
| 0.3777        | 37.0202 | 4978 | 2.2239          | 0.5778   |
| 0.0005        | 38.0202 | 5109 | 2.4557          | 0.5778   |
| 0.3699        | 39.0202 | 5240 | 2.6679          | 0.5333   |
| 0.1673        | 40.0202 | 5371 | 2.7295          | 0.5778   |
| 0.2676        | 41.0202 | 5502 | 2.5027          | 0.5778   |
| 0.0822        | 42.0202 | 5633 | 2.8017          | 0.5556   |
| 0.2881        | 43.0202 | 5764 | 2.9313          | 0.5556   |
| 0.364         | 44.0202 | 5895 | 2.7614          | 0.5333   |
| 0.3917        | 45.0202 | 6026 | 2.7807          | 0.5111   |
| 0.0007        | 46.0202 | 6157 | 2.7458          | 0.5333   |
| 0.0582        | 47.0202 | 6288 | 2.7683          | 0.5556   |
| 0.6675        | 48.0202 | 6419 | 2.8154          | 0.5556   |
| 0.0003        | 49.0125 | 6500 | 2.8304          | 0.5556   |

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0
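To reproduce this environment, the versions above can be pinned with pip; this is a sketch, and note that the `+cu117` PyTorch build listed above normally comes from the PyTorch wheel index, so a plain `pip install` of 2.0.1 is only an approximation:

```shell
pip install "transformers==4.46.2" "datasets==3.0.1" "tokenizers==0.20.0"
pip install "torch==2.0.1"
```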
Model size

  • 304M params (F32, safetensors)
  • Repository: beingbatman/CTMAE-P2-V4-S5