BERT NER — Fine-tuned Named Entity Recognition Model

Model: ELHACHYMI/bert-ner
Base model: bert-base-uncased
Task: Token Classification — Named Entity Recognition (NER)
Dataset: CoNLL-2003 (English)


Model Overview

This model is a fine-tuned version of BERT Base Uncased on the CoNLL-2003 Named Entity Recognition (NER) dataset.
It predicts the following entity types:

  • PER — Person
  • ORG — Organization
  • LOC — Location
  • MISC — Miscellaneous
  • O — Outside any entity

The model is suitable for information extraction, document understanding, chatbot entity detection, and structured text processing.


Labels

The model uses the standard IOB2 tagging scheme:

  ID  Label
   0  O
   1  B-PER
   2  I-PER
   3  B-ORG
   4  I-ORG
   5  B-LOC
   6  I-LOC
   7  B-MISC
   8  I-MISC
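The table above can be expressed as a small Python mapping for decoding raw model outputs. This is a minimal sketch based on the label IDs listed here; the authoritative mapping ships in the model's config.json as id2label:

```python
# IOB2 label mapping for the CoNLL-2003 label set (from the table above).
id2label = {
    0: "O",
    1: "B-PER", 2: "I-PER",
    3: "B-ORG", 4: "I-ORG",
    5: "B-LOC", 6: "I-LOC",
    7: "B-MISC", 8: "I-MISC",
}
label2id = {label: i for i, label in id2label.items()}

# Decode a sequence of predicted class IDs into IOB2 tags.
predicted_ids = [1, 2, 0, 3, 0]
labels = [id2label[i] for i in predicted_ids]
print(labels)  # → ['B-PER', 'I-PER', 'O', 'B-ORG', 'O']
```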

How to Load the Model

Using Hugging Face Pipeline

from transformers import pipeline

# aggregation_strategy="simple" merges sub-word pieces and contiguous
# B-/I- tags into whole entity spans (e.g. "Bill Gates" → one PER entity).
ner = pipeline("ner", model="ELHACHYMI/bert-ner", aggregation_strategy="simple")

text = "Bill Gates founded Microsoft in the United States."
print(ner(text))
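Conceptually, aggregation_strategy="simple" groups contiguous B-/I- tokens into single entities. The grouping logic can be sketched in plain Python; this is a simplified illustration that assumes word-level tokens and ignores the sub-word merging and confidence-score averaging the real pipeline performs:

```python
def group_entities(tokens, labels):
    """Group IOB2-tagged tokens into (entity_type, text) spans.

    Simplified sketch: assumes word-level tokens (no sub-word pieces)
    and drops the per-entity score averaging the pipeline reports.
    """
    entities = []
    current_type, current_words = None, []
    for token, label in zip(tokens, labels):
        if label.startswith("B-"):
            # A B- tag always starts a new entity, closing any open one.
            if current_type:
                entities.append((current_type, " ".join(current_words)))
            current_type, current_words = label[2:], [token]
        elif label.startswith("I-") and current_type == label[2:]:
            # An I- tag of the same type continues the current entity.
            current_words.append(token)
        else:
            # O (or a mismatched I-) closes the current entity.
            if current_type:
                entities.append((current_type, " ".join(current_words)))
            current_type, current_words = None, []
    if current_type:
        entities.append((current_type, " ".join(current_words)))
    return entities

tokens = ["Bill", "Gates", "founded", "Microsoft", "in", "the", "United", "States", "."]
labels = ["B-PER", "I-PER", "O", "B-ORG", "O", "O", "B-LOC", "I-LOC", "O"]
print(group_entities(tokens, labels))
# → [('PER', 'Bill Gates'), ('ORG', 'Microsoft'), ('LOC', 'United States')]
```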

Model Details

Model size: 0.1B parameters
Tensor type: F32
Format: Safetensors
