Pre-BERT-SL1000
Model Description
Pre-BERT-SL1000 is a BERT-based sequence labeling model fine-tuned on the HiFi-KPI dataset for extracting financial key performance indicators (KPIs) from SEC earnings filings (10-K and 10-Q). It identifies financial entities such as revenue and earnings.
The model is trained on the presentation-layer taxonomy of HiFi-KPI with n=1.
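A minimal inference sketch with the Transformers library is shown below. It assumes the checkpoint AAU-NLP/Pre-BERT-SL1000 exposes a standard token-classification head and that its labels can be merged into entity spans by the pipeline's simple aggregation strategy; adjust to the actual label scheme if needed.

```python
# Hedged usage sketch: repo id from the model tree below; label scheme is assumed
# to be compatible with aggregation_strategy="simple".
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "AAU-NLP/Pre-BERT-SL1000"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Token-classification pipeline that merges sub-word predictions into spans.
kpi_tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

text = "Total revenue for the quarter was $4.2 billion, up 12% year over year."
for entity in kpi_tagger(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```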
Use Cases
- Extracting financial KPIs using the iXBRL presentation taxonomy
- Financial document parsing with entity recognition
Performance
- Trained on the 1,000 most frequent labels from the HiFi-KPI dataset (presentation taxonomy, n=1)
Dataset & Code
- Dataset: HiFi-KPI on Hugging Face
- Code example: HiFi-KPI GitHub Repository
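The sketch below shows one way the dataset could be loaded with the datasets library. The repo id "AAU-NLP/HiFi-KPI" and the split and field layout are assumptions; check the HiFi-KPI dataset card for the actual configuration.

```python
# Hedged sketch: dataset repo id and split names are assumptions,
# not confirmed by this model card.
from datasets import load_dataset

ds = load_dataset("AAU-NLP/HiFi-KPI")  # assumed repo id
print(ds)              # inspect the available splits
print(ds["train"][0])  # inspect the fields of one example
```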
Model Tree
- Model: AAU-NLP/Pre-BERT-SL1000
- Base model: google-bert/bert-base-uncased