Fill-Mask · Transformers · PyTorch · esm
pranamanam committed a099e14 · verified · 1 Parent(s): dfac9f0

Update README.md

Files changed (1)
  1. README.md +0 -7
README.md CHANGED
@@ -30,13 +30,6 @@ In total, PepMLM enables the generative design of candidate binders to any targe
 - Colab Notebook: [Link](https://colab.research.google.com/drive/1u0i-LBog_lvQ5YRKs7QLKh_RtI-tV8qM?usp=sharing)
 - Preprint: [Link](https://arxiv.org/abs/2310.03842)
 
-# Apply for Access
-As of February 2024, the model has been gated on HuggingFace. If you wish to use our model, please visit our page on the HuggingFace site ([Link](https://huggingface.co/ChatterjeeLab/PepMLM-650M)) and submit your access request there. An active HuggingFace account is necessary for both the application and subsequent modeling use. Approval of requests may take a few days, as we are a small lab with a manual approval process.
-
-Once your request is approved, you will need your personal access token to begin using this notebook. We appreciate your understanding.
-
-- How to find your access token: https://huggingface.co/docs/hub/en/security-tokens
-
 ```
 # Load model directly
 from transformers import AutoTokenizer, AutoModelForMaskedLM
 
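The hunk's context lines show the start of the README's quick-start snippet. Since the model is gated (per the removed "Apply for Access" text), loading it requires authentication. Below is a minimal sketch, not the repo's official snippet: it assumes the `ChatterjeeLab/PepMLM-650M` repo id from the removed section and a read-scoped personal access token; recent transformers releases accept a `token=` keyword, while older ones used `use_auth_token=`.

```
# Minimal sketch (assumed, not the repo's official snippet): load the gated
# PepMLM checkpoint with a personal Hugging Face access token.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "ChatterjeeLab/PepMLM-650M"  # repo id from the removed README text
hf_token = "hf_..."  # your read-scoped token (see the security-tokens docs)

# `token=` in recent transformers; older versions used `use_auth_token=`.
tokenizer = AutoTokenizer.from_pretrained(model_id, token=hf_token)
model = AutoModelForMaskedLM.from_pretrained(model_id, token=hf_token)
```

Alternatively, running `huggingface-cli login` once stores the token locally, after which the explicit `token=` arguments can be omitted.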