---
license: mit
language:
- en
inference: true
base_model:
- FacebookAI/roberta-base
pipeline_tag: fill-mask
tags:
- fill-mask
- smart-contract
- web3
- software-engineering
- embedding
- codebert
library_name: transformers
---
|
|
|
|
|
# SmartBERT V1 RoBERTa (2022) |
|
|
|
|
|
## Overview |
|
|
|
|
|
This **pre-trained smart contract model** encodes smart contract _function-level_ code into embeddings.
|
|
|
|
|
It was trained by **[Sen Fang](https://github.com/TomasAndersonFang)** in 2022 on more than **40,000** smart contracts.
|
|
|
|
|
The model is initialized with **RoBERTa** (`FacebookAI/roberta-base`) weights.
|
|
|
|
|
Please use the newer [SmartBERT V2](https://huggingface.co/web3se/SmartBERT-v2) instead.
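
## Usage

Below is a minimal sketch of extracting a function-level embedding with the `transformers` library. The model id and the example Solidity snippet are placeholders (substitute this repository's actual id), and mean pooling over the last hidden states is one common way to obtain a single vector per function, not a prescribed step of this model.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder id: replace with this repository's model id.
model_id = "web3se/SmartBERT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

# A function-level Solidity snippet to embed (illustrative example).
code = "function transfer(address to, uint256 amount) public returns (bool) { balances[to] += amount; return true; }"

inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden states over non-padding tokens to get one vector per function.
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # (1, 768) for a roberta-base backbone
```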
|
|
|
|
|
## Citations |
|
|
|
|
|
```tex
@article{huang2025smart,
  title={Smart Contract Intent Detection with Pre-trained Programming Language Model},
  author={Huang, Youwei and Li, Jianwen and Fang, Sen and Li, Yao and Yang, Peng and Hu, Bin and Zhang, Tao},
  journal={arXiv preprint arXiv:2508.20086},
  year={2025}
}
```
|
|
|
|
|
## Thanks |
|
|
|
|
|
- [Institute of Intelligent Computing Technology, Suzhou, CAS](http://iict.ac.cn/) |