---
license: mit
base_model:
- google/paligemma-3b-pt-224
tags:
- openpi0
- jax
datasets:
- IPEC-COMMUNITY/bridge_orig_lerobot
---

Download the model (substitute this repository's id for `${model}`):

```bash
huggingface-cli download --resume-download --local-dir-use-symlinks False ${model} --local-dir $(basename ${model})
```
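Alternatively, the download can be scripted with the `huggingface_hub` Python API (a sketch; the repo id below is a placeholder to replace with this repository's id):

```python
from huggingface_hub import snapshot_download

def fetch_checkpoint(repo_id: str, local_dir: str) -> str:
    # Downloads the full repository snapshot to local_dir,
    # resuming any partially downloaded files, and returns the path.
    return snapshot_download(repo_id=repo_id, local_dir=local_dir)

# Example (replace the placeholder before running):
# fetch_checkpoint("YOUR_NAMESPACE/YOUR_MODEL", "./checkpoint")
```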

Launch the openpi0 server. Please set up the [openpi](https://github.com/Physical-Intelligence/openpi/) environment first:

```bash
export OPENPI_DATA_HOME=/PATH/TO/OPENPI_DATA_HOME
export LEROBOT_HOME=/PATH/TO/LEROBOT_HOME

uv run scripts/serve_policy.py policy:checkpoint \
    --policy.config=pi0_bridge_lora \
    --policy.dir=$THE_MODEL_PATH
```
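Once the server is up, a client sends it observations and receives action chunks. A minimal sketch of building an observation payload — the key names and shapes here are assumptions based on openpi's bridge setup (224×224 RGB for PaliGemma, a low-dimensional robot state, a language prompt), so verify the exact schema against the openpi repo:

```python
import numpy as np

# Sketch of an observation payload for the policy server.
# Key names and shapes are assumptions; check openpi's bridge
# policy code for the exact schema expected by this checkpoint.
def make_observation(image: np.ndarray, state: np.ndarray, prompt: str) -> dict:
    assert image.shape == (224, 224, 3), "PaliGemma expects a 224x224 RGB frame"
    return {
        "observation/image": image,
        "observation/state": state,
        "prompt": prompt,
    }

obs = make_observation(
    np.zeros((224, 224, 3), dtype=np.uint8),  # camera frame
    np.zeros(7, dtype=np.float32),            # robot state (dimension is an assumption)
    "pick up the spoon",
)
```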

The training configuration used for this checkpoint:

```python
TrainConfig(
    name="pi0_bridge_low_mem_finetune",
    model=pi0.Pi0Config(paligemma_variant="gemma_2b_lora", action_expert_variant="gemma_300m_lora"),
    data=LeRobotBridgeDataConfig(
        repo_id="local/bridge_lerobot",
        base_config=DataConfig(
            local_files_only=True,  # Set to True for local-only datasets.
            prompt_from_task=True,
        ),
    ),
    weight_loader=weight_loaders.CheckpointWeightLoader("s3://openpi-assets/checkpoints/pi0_base/params"),
    num_train_steps=30_000,
    freeze_filter=pi0.Pi0Config(
        paligemma_variant="gemma_2b_lora", action_expert_variant="gemma_300m_lora"
    ).get_freeze_filter(),
    ema_decay=None,
    num_workers=8,
)
```
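To reproduce the fine-tune, a config like the one above can be launched with openpi's training script (a sketch following the openpi README; the experiment name is hypothetical, and paths should be adjusted to your setup):

```shell
# Let JAX pre-allocate most of the GPU memory, as openpi recommends.
export XLA_PYTHON_CLIENT_MEM_FRACTION=0.9

# Config name matches the TrainConfig above; --exp-name is hypothetical.
uv run scripts/train.py pi0_bridge_low_mem_finetune --exp-name=bridge_lora_lowmem
```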