---
license: mit
language:
- en
inference: true
base_model:
- FacebookAI/roberta-base
pipeline_tag: fill-mask
tags:
- fill-mask
- smart-contract
- web3
- software-engineering
- embedding
- codebert
library_name: transformers
---
# SmartBERT V1 RoBERTa (2022)

## Overview
This pre-trained model converts smart contract function-level code into embeddings.

It was trained by Sen Fang in 2022 on over 40,000 smart contracts and initialized from RoBERTa (FacebookAI/roberta-base).

Please update to SmartBERT V2.
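
Below is a minimal sketch of how function-level code could be turned into an embedding with the `transformers` library. The repository id `web3se/SmartBERT` and the mean-pooling step are assumptions for illustration, not the officially documented usage; replace the id with this model's actual Hub id.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# NOTE: placeholder repo id (assumption); replace with this model's actual Hub id.
MODEL_ID = "web3se/SmartBERT"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

# A toy Solidity function; in practice, pass function-level smart contract code.
code = "function transfer(address to, uint256 amount) public returns (bool) { return true; }"

inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden states to obtain a single function-level embedding.
embedding = outputs.last_hidden_state.mean(dim=1)  # shape: (1, hidden_size)
print(embedding.shape)
```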
## Citations

```bibtex
@article{huang2025smart,
  title={Smart Contract Intent Detection with Pre-trained Programming Language Model},
  author={Huang, Youwei and Li, Jianwen and Fang, Sen and Li, Yao and Yang, Peng and Hu, Bin and Zhang, Tao},
  journal={arXiv preprint arXiv:2508.20086},
  year={2025}
}
```