Refaçade: Editing Object with Given Reference Texture

Youze Huang1,* Penghui Ruan2,* Bojia Zi3,* Xianbiao Qi4,† Jianan Wang5 Rong Xiao4
* Equal contribution. † Corresponding author.

Huggingface Model · GitHub · arXiv · Huggingface Space · Demo Page

🚀 Overview

Refaçade is a unified image–video retexturing model built upon the Wan2.1-based VACE framework. It edits the surface material of specified objects in a video using user-provided reference textures, while preserving the original geometry and background. We use a Jigsaw Permutation to decouple structural information from the reference image and a Texture Remover to disentangle the original object's appearance. Together, these components let users explore diverse texture variations of the same object.
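
The jigsaw permutation itself is a standard idea: cut the reference image into a grid of patches and shuffle them, destroying global structure while keeping local texture statistics. The snippet below is a minimal conceptual sketch of that idea in NumPy; the function name, grid size, and cropping behaviour are illustrative assumptions, not the released implementation.

# Conceptual sketch of a jigsaw permutation (not the official code).
import numpy as np

def jigsaw_permute(image: np.ndarray, grid: int = 4, seed: int = 0) -> np.ndarray:
    """Shuffle a grid x grid patch decomposition of an HxWxC image."""
    rng = np.random.default_rng(seed)
    h, w, c = image.shape
    ph, pw = h // grid, w // grid
    # Crop so both dimensions divide evenly, then split into patches:
    # shape becomes (grid * grid, ph, pw, c).
    patches = (
        image[: ph * grid, : pw * grid]
        .reshape(grid, ph, grid, pw, c)
        .transpose(0, 2, 1, 3, 4)
        .reshape(grid * grid, ph, pw, c)
    )
    rng.shuffle(patches)  # permute the patch order
    # Stitch the shuffled patches back into a single image.
    return (
        patches.reshape(grid, grid, ph, pw, c)
        .transpose(0, 2, 1, 3, 4)
        .reshape(ph * grid, pw * grid, c)
    )

A finer grid (e.g., jigsaw_permute(texture, grid=8)) removes more of the reference image's structure while still preserving its texture statistics.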


🛠️ Installation

Our project is built upon Wan2.1-based VACE.

pip install -r requirements.txt
pip install wan@git+https://github.com/Wan-Video/Wan2.1

🏃‍♂️ Gradio Demo

You can use this Gradio demo to retexture objects. Note that you do not need to compile SAM2.

python app.py

📂 Download

First, download our checkpoints:

huggingface-cli download --resume-download fishze/Refacade --local-dir models

Next, download the SAM2 checkpoint sam2_hiera_large.pt and place it at:

sam2/SAM2-Video-Predictor/checkpoints/
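
If you prefer to fetch this checkpoint programmatically, the sketch below uses huggingface_hub; the repo id facebook/sam2-hiera-large is an assumption about where the file is hosted, so adjust it if you obtain the checkpoint from another source.

# Sketch: fetch the SAM2 checkpoint with huggingface_hub.
# The repo id below is an assumption; change it if needed.
from huggingface_hub import hf_hub_download

hf_hub_download(
    repo_id="facebook/sam2-hiera-large",
    filename="sam2_hiera_large.pt",
    local_dir="sam2/SAM2-Video-Predictor/checkpoints",
)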

We recommend organizing the local directories as follows (a small sanity check for this layout is sketched after the tree):

Refacade
├── ...
├── examples
├── models
│   ├── refacade
│   │   └── ...
│   ├── texture_remover
│   │   └── ...
│   └── vae
│       └── ...
├── sam2
└── ...
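
Before launching the demo, a quick sanity check along these lines (a sketch; the paths simply mirror the layout above) confirms the checkpoints are in place:

# Sketch: verify the expected checkpoint layout before running app.py.
from pathlib import Path

expected = [
    Path("models/refacade"),
    Path("models/texture_remover"),
    Path("models/vae"),
    Path("sam2/SAM2-Video-Predictor/checkpoints/sam2_hiera_large.pt"),
]
missing = [p for p in expected if not p.exists()]
if missing:
    print("Missing:")
    for p in missing:
        print(" ", p)
else:
    print("All expected checkpoints found.")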
