---
library_name: transformers
tags: []
---
# Model Card for Model ID
This is a model I accidentally trained with too small a batch size, which caused the training loss to spike and the run to essentially fail. I found it amusing that it nevertheless performs very well on EWoK, Entity Tracking, Adjective Nominalization, COMPS, and AoA. Maybe that says something about us, and about how so many in society fail upward... food for thought.
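
Since the card declares `library_name: transformers`, here is a minimal loading sketch. The repo id below is a placeholder (the actual model id is not given in this card), and the checkpoint is assumed to be a causal language model, which the card does not confirm.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual model id for this card.
model_id = "your-username/your-model-id"

# Assumes a causal LM head; swap the Auto class if the architecture differs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to sanity-check the checkpoint.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```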