---
license: apache-2.0
datasets:
- common-pile/comma_v0.1_training_dataset
language:
- en
base_model: common-pile/comma-v0.1-2t
pipeline_tag: text-generation
tags:
- mlx
library_name: mlx
---
|
# common-pile/comma-v0.1-2t ported to MLX
|
|
|
See [common-pile/comma-v0.1-2t](https://huggingface.co/common-pile/comma-v0.1-2t) for the original model card.
|
|
|
Try this model out using [uv](https://github.com/astral-sh/uv). The first run will download the 15GB model:
|
```bash
uv run --python 3.12 \
  --with mlx-lm \
  mlx_lm.generate \
  --model simonw/comma-v0.1-2t-mlx \
  --prompt 'Facts about pelicans:'
```
|
|
|
More notes [on my blog](https://simonwillison.net/2025/Jun/7/comma/). I created the MLX port using this command:
|
|
|
```bash
uv run --python 3.12 \
  --with mlx-lm \
  python -m mlx_lm convert \
  --hf-path common-pile/comma-v0.1-2t \
  --mlx-path ./comma-v0.1-2t-mlx
```
|