Patrick Johnson
committed on
Update README.md
README.md CHANGED
@@ -19,7 +19,7 @@ CausalBERT (C-BERT) is a multi-task fine-tuned German BERT that extracts causal
 2. Relation classification (CAUSE, EFFECT, INTERDEPENDENCY)
 
 ## Usage
-Find the custom [library](https://github.com/
+Find the custom [library](https://github.com/norygami/causalbert). Once installed, run inference like so:
 ```python
 from transformers import AutoTokenizer
 from causalbert.infer import load_model, analyze_sentence_with_confidence
@@ -34,7 +34,7 @@ result = analyze_sentence_with_confidence(
 
 - **Base model**: `google-bert/bert-base-german-cased`
 - **Epochs**: 3, **LR**: 2e-5, **Batch size**: 8
-- See [train.py](https://github.com/
+- See [train.py](https://github.com/norygami/causalbert/blob/main/causalbert/train.py) for details.
 
 ## Limitations
 
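The Usage hunk above shows only the top of the README snippet: the two imports and, from the second hunk's context line, a call assigned as `result = analyze_sentence_with_confidence(`. Purely as orientation for readers of the diff, here is a minimal sketch of how that call might be wired together. The model identifier, the argument names and order, the return values of `load_model`, and the German example sentence are all assumptions, not the confirmed `causalbert` API.

```python
# Hedged sketch only: the diff shows just the imports and the start of the
# analyze_sentence_with_confidence(...) call. The model path, argument order,
# load_model's return values, and the example sentence are assumptions.
from causalbert.infer import load_model, analyze_sentence_with_confidence

# Assumed: load_model returns the fine-tuned model together with its tokenizer
# and config; the identifier below is a placeholder for the published checkpoint.
model, tokenizer, config = load_model("norygami/causalbert")

# Assumed call shape; only the assignment to `result` is visible in the diff.
result = analyze_sentence_with_confidence(
    model, tokenizer, config,
    "Steigende Zinsen dämpfen die Nachfrage nach Wohnimmobilien.",  # invented example sentence
)

# Per the card, the result should contain cause/effect spans and a relation
# label (CAUSE, EFFECT, INTERDEPENDENCY) with confidence scores.
print(result)
```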
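The training bullets (base model, epochs, learning rate, batch size) are the only configuration visible in this diff; the actual multi-task setup lives in the linked `train.py`. As an illustration of what those numbers mean, the sketch below maps them onto standard Hugging Face `TrainingArguments`. The output directory, label count, and the single token-classification head are placeholders and are not taken from the repository.

```python
# Illustration only: maps the card's listed hyperparameters onto Hugging Face
# TrainingArguments. The real train.py builds a multi-task model (span tagging
# plus relation classification) and will differ from this single-head skeleton.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    TrainingArguments,
)

base = "google-bert/bert-base-german-cased"  # base model named on the card
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForTokenClassification.from_pretrained(base, num_labels=3)  # label count assumed

args = TrainingArguments(
    output_dir="causalbert-out",         # placeholder path
    num_train_epochs=3,                  # Epochs: 3
    learning_rate=2e-5,                  # LR: 2e-5
    per_device_train_batch_size=8,       # Batch size: 8
)
# Trainer(model=model, args=args, train_dataset=...) would then run the fine-tune.
```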