Pre-BERT-SL1000

Model Description

Pre-BERT-SL1000 is a BERT-based sequence labeling model fine-tuned on the HiFi-KPI dataset to extract financial key performance indicators (KPIs) from SEC earnings filings (10-K and 10-Q). It identifies KPI entities such as revenue and earnings.

The model targets the presentation-layer taxonomy of HiFi-KPI with n=1.
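A minimal usage sketch, assuming the model is published on the Hugging Face Hub under `AAU-NLP/Pre-BERT-SL1000` (the repository name shown on this page) and loads with the standard `transformers` token-classification pipeline; the input sentence is an illustrative example, not from the dataset:

```python
# Hedged usage sketch: load the fine-tuned sequence labeler via the
# standard transformers token-classification pipeline and run it on a
# sample earnings sentence. aggregation_strategy="simple" merges
# subword tokens into word-level entity spans.
from transformers import pipeline

kpi_tagger = pipeline(
    "token-classification",
    model="AAU-NLP/Pre-BERT-SL1000",
    aggregation_strategy="simple",
)

results = kpi_tagger("Total revenue was $5.2 billion, up 12% year over year.")
print(results)  # list of dicts with entity group, score, and character offsets
```

The exact label set depends on the 1,000 presentation-taxonomy labels the model was trained on, so inspect `results` to see which entity groups it emits for your filings.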

Use Cases

  • Extracting financial KPIs using iXBRL presentation taxonomy
  • Financial document parsing with entity recognition
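To make the entity-recognition use case concrete, token-level BIO labels from a sequence labeler can be grouped into contiguous KPI spans. This is a generic post-processing sketch; the tag names (`B-Revenue`, `B-Amount`, etc.) are hypothetical illustrations, not the model's actual label set:

```python
# Hypothetical post-processing sketch: group token-level BIO labels
# (as emitted by a sequence-labeling model) into contiguous entity
# spans. Tag names here are illustrative only.

def group_bio_spans(tokens, labels):
    """Collect (entity_type, text) pairs from BIO-tagged tokens."""
    spans, current = [], None
    for token, label in zip(tokens, labels):
        if label.startswith("B-"):
            if current:
                spans.append(current)
            current = (label[2:], [token])  # start a new span
        elif label.startswith("I-") and current and label[2:] == current[0]:
            current[1].append(token)  # continue the current span
        else:  # "O" or a mismatched I- tag closes any open span
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [(etype, " ".join(toks)) for etype, toks in spans]

tokens = ["Total", "revenue", "was", "$", "5.2", "billion"]
labels = ["B-Revenue", "I-Revenue", "O", "B-Amount", "I-Amount", "I-Amount"]
print(group_bio_spans(tokens, labels))
# → [('Revenue', 'Total revenue'), ('Amount', '$ 5.2 billion')]
```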

Performance

  • Trained on the 1,000 most frequent labels from the HiFi-KPI presentation taxonomy (n=1)

Dataset & Code

Model Details

  • Format: Safetensors
  • Model size: 110M parameters
  • Tensor type: F32
