Ambiguity-aware RoBERTa
This model was trained on the SemEval-2007 Task 14 (Affective Text) dataset and represents the ambiguity that arises in emotion analysis tasks as a probability distribution over emotions (i.e., the softmax output). It was introduced in the paper "Deep Model Compression Also Helps Models Capture Ambiguity" (ACL 2023).
Usage
```python
from transformers import RobertaTokenizer, RobertaForSequenceClassification

tokenizer = RobertaTokenizer.from_pretrained('hancheolp/ambiguity-aware-roberta-emotion')
model = RobertaForSequenceClassification.from_pretrained('hancheolp/ambiguity-aware-roberta-emotion')

news_headline = "Amateur rocket scientists reach for space."
encoded_input = tokenizer(news_headline, return_tensors='pt')
output = model(**encoded_input)

# Convert the logits into a probability distribution over the six emotions
distribution = output.logits.softmax(dim=-1)
```
Each index of the output distribution corresponds to the following emotion:
- 0: anger
- 1: disgust
- 2: fear
- 3: joy
- 4: sadness
- 5: surprise
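To turn the distribution into labeled scores, you can pair each probability with the emotion at its index. The sketch below uses hypothetical logits in place of `output.logits` so it runs without downloading the model; the softmax and index-to-label mapping are the same as above.

```python
import math

# Index-to-label mapping from the list above
EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

# Hypothetical logits standing in for output.logits[0] (not real model output)
logits = [0.1, 0.2, 0.1, 2.5, 0.3, 1.0]

# Softmax: exponentiate each logit and normalize so the probabilities sum to 1
exps = [math.exp(x) for x in logits]
total = sum(exps)
distribution = [e / total for e in exps]

# Pair each emotion label with its probability
scores = dict(zip(EMOTIONS, distribution))
top_emotion = max(scores, key=scores.get)
```

Because the model is ambiguity-aware, the full `scores` dictionary is often more informative than `top_emotion` alone: a headline may legitimately spread probability mass across several emotions.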
Base model: FacebookAI/roberta-base