---
license: mit
language:
  - en
library_name: transformers
pipeline_tag: text-ranking
widget:
  - text: Prince Raoden went to Elantris. [SEP] Elantris is a great city.
tags:
  - sentence-transformers
---

# bert-base-cased-NER-reranker

A BERT model trained on the synthetic literary NER context retrieval dataset of Amalvy et al. (2023).

To use this model, construct an input of the form `NER-sentence [SEP] context-sentence`. The model predicts the positive class if `context-sentence` is useful for predicting the entities of `NER-sentence`, and the negative class otherwise.
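A minimal usage sketch with the `transformers` library; the repository id below is an assumption based on the card title, so substitute the actual model id:

```python
# Sketch: score a (NER sentence, context sentence) pair with the reranker.
# NOTE: the model id is assumed from the card title, not confirmed by it.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "compnet-renard/bert-base-cased-NER-reranker"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

ner_sentence = "Prince Raoden went to Elantris."
context_sentence = "Elantris is a great city."

# Passing the two sentences as a pair makes the tokenizer insert [SEP]
# between them, matching the input format described above.
inputs = tokenizer(ner_sentence, context_sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()  # positive class => context is useful
```

To rerank several candidate contexts for one NER sentence, score each pair and sort by the positive-class logit.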

## Performance Metrics

The model obtains 98.34 F1 on the synthetic test set. See Amalvy et al. (2023) for details on the NER performance gains obtained when this reranker assists a NER model at inference.

## How to Reproduce Training

See the training script here.

## Citation

If you use this model in your research, please cite:

```bibtex
@InProceedings{Amalvy2023,
  title     = {Learning to Rank Context for Named Entity Recognition Using a Synthetic Dataset},
  author    = {Amalvy, A. and Labatut, V. and Dufour, R.},
  booktitle = {2023 Conference on Empirical Methods in Natural Language Processing},
  year      = {2023},
  doi       = {10.18653/v1/2023.emnlp-main.642},
  pages     = {10372--10382},
}
```