roberta-large-english-ud-goeswith

Model Description

This is a RoBERTa model for POS tagging and dependency parsing (subword tokens are tagged with the goeswith relation), derived from roberta-large.

How to Use

from transformers import pipeline
nlp = pipeline("universal-dependencies", "KoichiYasuoka/roberta-large-english-ud-goeswith", trust_remote_code=True, aggregation_strategy="simple")
print(nlp("I saw a horse yesterday which had no name"))
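
The "universal-dependencies" task is provided by a custom pipeline shipped with the model repository, which is why trust_remote_code=True is required; it assembles the model's token-classification scores into a full dependency parse. As a minimal sketch, assuming the checkpoint exposes a standard token-classification head, the underlying model can also be loaded directly to inspect its label inventory:

from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/roberta-large-english-ud-goeswith")
model = AutoModelForTokenClassification.from_pretrained("KoichiYasuoka/roberta-large-english-ud-goeswith")

# Inspect the label set; these labels carry the POS/dependency information
# that the custom pipeline above turns into a parse.
print(len(model.config.id2label))
print(sorted(model.config.id2label.values())[:10])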