Commit 6ac63a2 (verified) by nielsr (HF Staff) · Parent: ec50acf

Add project page link to model card


This PR adds a direct link to the project page, https://itay1itzhak.github.io/planted-in-pretraining, to the model card, giving users a more complete set of resources for the paper "Planted in Pretraining, Swayed by Finetuning: A Case Study on the Origins of Cognitive Biases in LLMs".
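For maintainers who want to script this kind of model-card update rather than edit the README by hand, here is a minimal sketch using the `ModelCard` API from `huggingface_hub`. The repo id is a placeholder, not the actual repository this commit targets, and the snippet appends the link at the end of the card body rather than splicing it into the resource list as the diff below does.

```python
# Minimal sketch: open a model-card PR that appends a project-page link,
# using huggingface_hub's ModelCard API. The repo id is a placeholder.
from huggingface_hub import ModelCard

repo_id = "your-org/your-model"  # hypothetical; substitute the real repo

card = ModelCard.load(repo_id)
link_line = "- **Project Page**: https://itay1itzhak.github.io/planted-in-pretraining"

if link_line not in card.text:
    # Append the link to the card body (everything after the YAML front matter).
    card.text += f"\n{link_line}\n"
    card.push_to_hub(
        repo_id,
        commit_message="Add project page link to model card",
        create_pr=True,  # open a pull request instead of committing directly
    )
```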

Files changed (1): README.md (+10, -9)
README.md CHANGED

@@ -1,20 +1,20 @@
  ---
- license: apache-2.0
- tags:
- - language-modeling
- - causal-lm
- - bias-analysis
- - cognitive-bias
+ base_model:
+ - allenai/OLMo-7B
  datasets:
  - allenai/tulu-v2-sft-mixture
  language:
  - en
+ library_name: transformers
+ license: apache-2.0
  metrics:
  - accuracy
- base_model:
- - allenai/OLMo-7B
  pipeline_tag: text-generation
- library_name: transformers
+ tags:
+ - language-modeling
+ - causal-lm
+ - bias-analysis
+ - cognitive-bias
  ---

  # Model Card for OLMo-Tulu

@@ -33,6 +33,7 @@ This is one of 3 identical versions trained with different random seeds.
  - **Finetuned from**: `allenai/OLMo-7B`
  - **Paper**: https://arxiv.org/abs/2507.07186
  - **Repository**: https://github.com/itay1itzhak/planted-in-pretraining
+ - **Project Page**: https://itay1itzhak.github.io/planted-in-pretraining

  ## Uses
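The reordered front matter above also pins down how the model is meant to be consumed (`library_name: transformers`, `pipeline_tag: text-generation`). Below is a minimal usage sketch under those assumptions; the repo id is a placeholder, and depending on the checkpoint version, OLMo models may require `trust_remote_code=True` or the `hf_olmo` package to load.

```python
# Minimal sketch of text generation with the metadata's declared stack
# (transformers + text-generation). The repo id is a placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/OLMo-Tulu"  # hypothetical; substitute the real repo

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

prompt = "Language models can exhibit cognitive biases because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```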
39