|
---
language:
- en
- de
tags:
- translation
- opus-mt
license: cc-by-4.0
model-index:
- name: opus-mt-tc-big-eng-deu
  results:
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: Tatoeba-test.eng-deu
      type: tatoeba_mt
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 45.7
---
|
|
|
# Opus Tatoeba English-German |
|
|
|
*This model was obtained by running the script [convert_marian_to_pytorch.py](https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_to_pytorch.py) ([instructions available here](https://github.com/huggingface/transformers/tree/main/scripts/tatoeba)). The original models were trained by [Jörg Tiedemann](https://blogs.helsinki.fi/tiedeman/) using the [MarianNMT](https://marian-nmt.github.io/) library. See all available `MarianMTModel` models on the profile of the [Helsinki-NLP](https://huggingface.co/Helsinki-NLP) group.*
|
|
|
This model is the conversion of the checkpoint [opus+bt-2021-04-13.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-deu/opus+bt-2021-04-13.zip).
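Once converted, the checkpoint can be used like any other Marian model in 🤗 Transformers. The snippet below is a minimal sketch; the repository id `Helsinki-NLP/opus-mt-tc-big-eng-deu` is an assumption based on this card's metadata, so substitute the actual id if it differs:

```python
from transformers import MarianMTModel, MarianTokenizer

# Assumed repository id for this converted checkpoint; adjust if needed.
model_name = "Helsinki-NLP/opus-mt-tc-big-eng-deu"

tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

src_text = ["I am hungry.", "The weather is nice today."]
batch = tokenizer(src_text, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```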
|
|
|
|
|
--- |
|
|
|
### eng-deu |
|
|
|
* source language name: English
* target language name: German
* OPUS readme: [README.md](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-deu/README.md)
* model: transformer-align
* source language code: en
* target language code: de
* dataset: opus+bt
* release date: 2021-02-22
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus+bt-2021-04-13.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-deu/opus+bt-2021-04-13.zip)
* test set translations: [opus+bt-2021-04-13.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-deu/opus+bt-2021-04-13.test.txt)
* test set scores: [opus+bt-2021-04-13.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-deu/opus+bt-2021-04-13.eval.txt)
|
* Benchmarks

|Test set|BLEU|chr-F|
|---|---|---|
|newssyscomb2009.eng-deu|22.8|0.538|
|news-test2008.eng-deu|23.7|0.533|
|newstest2009.eng-deu|22.6|0.532|
|newstest2010.eng-deu|25.5|0.552|
|newstest2011.eng-deu|22.6|0.527|
|newstest2012.eng-deu|23.4|0.530|
|newstest2013.eng-deu|27.1|0.556|
|newstest2014-deen.eng-deu|29.6|0.599|
|newstest2015-ende.eng-deu|31.6|0.600|
|newstest2016-ende.eng-deu|37.2|0.644|
|newstest2017-ende.eng-deu|30.6|0.595|
|newstest2018-ende.eng-deu|45.6|0.696|
|newstest2019-ende.eng-deu|41.3|0.659|
|Tatoeba-test.eng-deu|45.7|0.654|
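The BLEU and chr-F numbers above come from the linked evaluation files. As a rough sketch, scores in this style can be computed with [sacrebleu](https://github.com/mjpost/sacrebleu), assuming you have the model's detokenized outputs and the reference translations as parallel lists of strings (the sentences below are hypothetical placeholders):

```python
from sacrebleu.metrics import BLEU, CHRF

# Hypothetical system outputs and one reference stream (a list per reference set).
hypotheses = ["Ich habe Hunger.", "Das Wetter ist heute schön."]
references = [["Ich bin hungrig.", "Das Wetter ist heute schön."]]

bleu = BLEU()
chrf = CHRF()
print(bleu.corpus_score(hypotheses, references))
# Note: sacrebleu reports chrF on a 0-100 scale; the table above uses 0-1.
print(chrf.corpus_score(hypotheses, references))
```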
|
|