---
license: apache-2.0
tags:
- segmentation
- mamba
- wafer
- electron-microscopy
- tokenunify
---

# TokenUnify Models

This repository contains TokenUnify segmentation models at four parameter scales (100M to 1B), trained on wafer electron microscopy data, along with a Superhuman baseline model.

## Available Models

- **TokenUnify-1B.pth**: 1B-parameter TokenUnify model
- **TokenUnify-500M.pth**: 500M-parameter TokenUnify model
- **TokenUnify-200M.pth**: 200M-parameter TokenUnify model
- **TokenUnify-100M.pth**: 100M-parameter TokenUnify model
- **superhuman.pth**: Superhuman baseline model
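
The checkpoints can be fetched programmatically with the `huggingface_hub` client. The snippet below is a minimal sketch: the `repo_id` is a placeholder for wherever this repository lives on the Hub, and `filename` must match one of the `.pth` files listed above.

```python
from huggingface_hub import hf_hub_download

# Placeholder repo_id -- replace with the actual Hub path of this repository.
checkpoint_path = hf_hub_download(
    repo_id="<namespace>/TokenUnify",
    filename="TokenUnify-100M.pth",
)
print(checkpoint_path)  # local path of the cached .pth file
```
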
## Model Details

- **Architecture**: TokenUnify (based on Mamba)
- **Training Data**: Wafer electron microscopy images
- **Task**: Image Segmentation
- **Framework**: PyTorch
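
For orientation, segmentation inference with these weights follows the usual PyTorch pattern; the sketch below is purely illustrative and assumes the network is a standard `torch.nn.Module` built from the TokenUnify training code, with the actual input layout and normalization defined there.

```python
import torch

def segment(model: torch.nn.Module, patch: torch.Tensor) -> torch.Tensor:
    """Run one forward pass on an EM patch and return the raw network output.

    `patch` is assumed to be a float tensor already shaped and normalized the
    way the TokenUnify training code expects (an assumption, not a guarantee).
    """
    model.eval()
    with torch.no_grad():
        return model(patch)
```
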
## Usage

```python
import torch

# Load a checkpoint into CPU memory
model_path = "TokenUnify-1B.pth"  # or any other model file
checkpoint = torch.load(model_path, map_location="cpu")

# Build the TokenUnify (or Superhuman) network from the training code,
# then load the weights with model.load_state_dict(...)
```
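
The exact layout of these `.pth` files depends on how training was checkpointed. The helper below is a hedged sketch that unwraps two common PyTorch conventions (a nested `state_dict`/`model` key, and the `module.` prefix added by `(Distributed)DataParallel`) before handing the weights to `load_state_dict`; the key names are assumptions, not a description of these specific files.

```python
import torch

def extract_state_dict(path: str) -> dict:
    """Return a flat name -> tensor mapping from a .pth checkpoint.

    Assumes the file holds either a raw state dict or a dict wrapping it
    under a "state_dict" or "model" key (a common convention, not verified
    against these specific checkpoints).
    """
    ckpt = torch.load(path, map_location="cpu")
    if isinstance(ckpt, dict):
        for key in ("state_dict", "model"):
            if key in ckpt and isinstance(ckpt[key], dict):
                ckpt = ckpt[key]
                break
    # Strip the "module." prefix left behind by (Distributed)DataParallel, if any.
    return {k.removeprefix("module."): v for k, v in ckpt.items()}

state_dict = extract_state_dict("TokenUnify-1B.pth")
# model = ...                        # instantiate from the TokenUnify training code
# model.load_state_dict(state_dict)  # use strict=False if auxiliary heads differ
```
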
## Model Sizes

| Model | Parameters | File Name |
|-------|------------|-----------|
| TokenUnify Large | 1B | TokenUnify-1B.pth |
| TokenUnify Medium | 500M | TokenUnify-500M.pth |
| TokenUnify Small | 200M | TokenUnify-200M.pth |
| TokenUnify Tiny | 100M | TokenUnify-100M.pth |
| Superhuman Baseline | - | superhuman.pth |
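
The parameter counts above can be sanity-checked directly from a checkpoint by summing tensor sizes. This is a small sketch under the same assumption as before, namely that the file is (or wraps) a plain state dict:

```python
import torch

ckpt = torch.load("TokenUnify-100M.pth", map_location="cpu")
state = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

# Sum the elements of every tensor stored in the checkpoint
total = sum(v.numel() for v in state.values() if torch.is_tensor(v))
print(f"{total / 1e6:.1f}M parameters")
```
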
## Citation

If you use these models in your research, please cite the TokenUnify paper.