---
license: cc-by-nc-nd-4.0
extra_gated_fields:
  Name: text
  Company: text
  Country: country
  Specific date: date_picker
  I want to use this model for:
    type: select
    options: 
      - Research
      - Education
      - label: Other
        value: other
  I agree to include the authors of the code (Tianlai Chen and Pranam Chatterjee) as authors on manuscripts with data from designed peptides: checkbox
  I agree to share generated sequences and associated data with authors before publishing: checkbox
  I agree not to file patents on any sequences generated by this model: checkbox
  I agree to use this model for non-commercial use ONLY: checkbox
---
**PepMLM: Target Sequence-Conditioned Generation of Peptide Binders via Masked Language Modeling**

![image/png](https://cdn-uploads.huggingface.co/production/uploads/63df6223f351dc0745681f77/_U66d78-GCwZ5Z6dF2KOE.png)
In this work, we introduce **PepMLM**, a purely target sequence-conditioned *de novo* generator of linear peptide binders. 
By employing a novel masking strategy that uniquely positions cognate peptide sequences at the terminus of target protein sequences, 
PepMLM tasks the state-of-the-art ESM-2 pLM with fully reconstructing the binder region, 
achieving low perplexities that match or improve upon those of previously validated peptide-protein sequence pairs. 
After successful *in silico* benchmarking with AlphaFold-Multimer, we experimentally verify PepMLM’s efficacy via fusion of model-derived peptides to E3 ubiquitin ligase domains, demonstrating endogenous degradation of target substrates in cellular models. 
In total, PepMLM enables the generative design of candidate binders to any target protein, without the requirement of target structure, empowering downstream programmable proteome editing applications.     

- Demo: [HuggingFace Space](https://huggingface.co/spaces/TianlaiChen/PepMLM) (temporarily unavailable)
- Colab Notebook: [Link](https://colab.research.google.com/drive/1u0i-LBog_lvQ5YRKs7QLKh_RtI-tV8qM?usp=sharing)
- Preprint: [Link](https://arxiv.org/abs/2310.03842)
- Nature Biotechnology: [Link](https://www.nature.com/articles/s41587-025-02761-2)

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("TianlaiChen/PepMLM-650M")
model = AutoModelForMaskedLM.from_pretrained("TianlaiChen/PepMLM-650M")
```
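The snippet below is a minimal sketch of how the masking strategy described above can be used to draft a binder: mask tokens are appended to the end of the target sequence, and ESM-2 reconstructs them in a single forward pass. The target sequence and binder length are hypothetical placeholders, and greedy argmax decoding is a simplification; the Colab notebook linked above implements the authors' full generation procedure.

```python
import torch

# Hypothetical target sequence and binder length, for illustration only
target_seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
binder_length = 12

# Append masked binder positions to the terminus of the target sequence,
# mirroring the masking strategy described above
masked_binder = tokenizer.mask_token * binder_length
inputs = tokenizer(target_seq + masked_binder, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedily fill each masked position with its highest-scoring token
# (a simplification; the released notebook samples candidates instead)
mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
binder = tokenizer.decode(predicted_ids).replace(" ", "")
print(binder)
```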
![Logo](logo.png)