---
license: mit
language:
- en
---

This model is a submission to the 2024 BabyLM challenge, trained on [Baby-cosmo-fine-100M](https://huggingface.co/datasets/ltg/babylm-2024-baby-cosmo-fine-100m).

The training scripts are published at https://github.com/ltgoslo/gpt-bert.
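A minimal usage sketch with the 🤗 Transformers library is given below. It assumes the model follows the standard Hugging Face API with custom modelling code (hence `trust_remote_code=True`), that it exposes a masked-language-modelling head, and that `[MASK]` is the mask token; the repository id in the snippet is a hypothetical placeholder, so substitute the actual id of this model.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Hypothetical repository id -- replace with this model's actual id.
model_id = "ltg/gpt-bert-babylm"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# trust_remote_code is needed because the repository ships custom model code.
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)

# Score candidate fillers for a masked position (assuming "[MASK]" is the mask token).
inputs = tokenizer("The capital of Norway is [MASK].", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocabulary_size)
```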

```bibtex
@misc{charpentier2024gptbertboth,
      title={GPT or BERT: why not both?},
      author={Lucas Georges Gabriel Charpentier and David Samuel},
      year={2024},
      eprint={2410.24159},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2410.24159},
}
```