Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

SEA-E - bnb 8bits
- Model creator: https://huggingface.co/ECNU-SEA/
- Original model: https://huggingface.co/ECNU-SEA/SEA-E/
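
This repository hosts an 8-bit bitsandbytes quantization of SEA-E, so it can be loaded directly with 🤗 Transformers. The snippet below is a minimal sketch rather than an official loading recipe: the repository id is a placeholder that should be replaced with this model's actual Hugging Face id, and it assumes the `bitsandbytes` and `accelerate` packages are installed.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: substitute the actual Hugging Face id of this 8-bit repository.
repo_id = "RichardErkhov/SEA-E-8bits"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# For a checkpoint saved in bitsandbytes 8-bit format, the quantization settings are
# read from the model config, so no extra quantization arguments should be needed.
# device_map="auto" dispatches the layers onto available GPUs (requires accelerate).
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")
```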

Original model description:
---
license: apache-2.0
tags:
- Automated Peer Reviewing
- SFT
---

## Automated Peer Reviewing in Paper SEA: Standardization, Evaluation, and Analysis

Paper Link: https://arxiv.org/abs/2407.12857

Project Page: https://ecnu-sea.github.io/

## 🔥 News
- 🔥🔥🔥 SEA has been accepted to EMNLP 2024!
- 🔥🔥🔥 The SEA series models (7B) are now publicly available!

## Model Description
The SEA-E model uses [Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) as its backbone. It is obtained by supervised fine-tuning (SFT) on a high-quality peer-review instruction dataset that was standardized with the SEA-S model. **This model can provide comprehensive and insightful review feedback for submitted papers!**

## Review Paper With SEA-E

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# `system_prompt_dict`, `read_txt_file`, and `mmd_file_path` are provided by the SEA
# repository (https://github.com/ecnu-sea/sea): they supply the reviewing system prompt
# and load the paper that has been converted to Markdown (.mmd).
instruction = system_prompt_dict['instruction_e']
paper = read_txt_file(mmd_file_path)

# Drop the reference list (and everything after it) to shorten the prompt.
idx = paper.find("## References")
if idx != -1:
    paper = paper[:idx].strip()

# Local path (or Hugging Face id) of the SEA-E checkpoint.
model_name = "/root/sea/"
tokenizer = AutoTokenizer.from_pretrained(model_name)
chat_model = AutoModelForCausalLM.from_pretrained(model_name)
chat_model.to("cuda:0")

messages = [
    {"role": "system", "content": instruction},
    {"role": "user", "content": paper},
]

# Build the chat prompt and generate the review.
encodes = tokenizer.apply_chat_template(messages, return_tensors="pt")
encodes = encodes.to("cuda:0")
len_input = encodes.shape[1]
generated_ids = chat_model.generate(encodes, max_new_tokens=8192, do_sample=True)

# Decode only the newly generated tokens, i.e. the review itself.
response = tokenizer.batch_decode(generated_ids[:, len_input:])[0]
```
The code provided above is an example. For detailed usage instructions, please refer to https://github.com/ecnu-sea/sea.
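
Because full papers converted to .mmd can be very long, it may be worth checking that the tokenized paper leaves room for the review before calling `generate`. The helper below is a rough sketch; the 32k-token figure is an assumption carried over from the Mistral-7B-Instruct-v0.2 backbone, not a documented limit of SEA-E.

```python
def check_prompt_budget(prompt_tokens: int,
                        max_new_tokens: int = 8192,
                        max_context: int = 32768) -> bool:
    """Return True if the prompt plus the review budget fits the assumed context window."""
    fits = prompt_tokens + max_new_tokens <= max_context
    if not fits:
        print(f"Prompt uses {prompt_tokens} tokens; consider trimming appendices or "
              f"tables from the .mmd file before requesting a review.")
    return fits

# Example: call with `len_input` from the snippet above, e.g. check_prompt_budget(len_input).
```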

## Additional Clauses

The additional clauses for this project are as follows:

- Commercial use is not allowed.
- The SEA-E model is intended solely to provide informative reviews that help authors polish their papers, not to directly recommend acceptance or rejection of papers.
- Currently, the SEA-E model is only applicable to the field of machine learning and does not guarantee insightful comments for other disciplines.


## Citation

If you find our paper or models helpful, please consider citing our work as follows:

```bibtex
@inproceedings{yu2024automated,
  title={Automated Peer Reviewing in Paper SEA: Standardization, Evaluation, and Analysis},
  author={Yu, Jianxiang and Ding, Zichen and Tan, Jiaqi and Luo, Kangyang and Weng, Zhenmin and Gong, Chenghua and Zeng, Long and Cui, RenJing and Han, Chengcheng and Sun, Qiushi and others},
  booktitle={Findings of the Association for Computational Linguistics: EMNLP 2024},
  pages={10164--10184},
  year={2024}
}
```