Original model: xiaol/RWKV-v5.2-7B-Role-play-16k

You can run this model with ai00_rwkv_server.

Although ai00_rwkv_server is aimed mainly at low-end PCs, you can also run it on servers that support Vulkan.

To try it in Colab, first install the libnvidia-gl-* package:

!apt -y install libnvidia-gl-535
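Once the server is running, you can talk to it over HTTP. A minimal Python sketch is below; the port (65530) and the OpenAI-compatible endpoint path are assumptions, so check your ai00_rwkv_server version's documentation for the actual values.

```python
# Sketch: build a request for a locally running ai00_rwkv_server through
# an OpenAI-compatible HTTP API. The base URL and endpoint path are
# assumptions -- verify them against your ai00_rwkv_server build.
import json
import urllib.request


def build_chat_request(prompt, base_url="http://localhost:65530"):
    """Build an OpenAI-style chat completion request (endpoint path assumed)."""
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
        "temperature": 0.8,
    }
    return urllib.request.Request(
        base_url + "/api/oai/chat/completions",  # assumed path
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


if __name__ == "__main__":
    req = build_chat_request("Introduce yourself in character.")
    # Send with urllib.request.urlopen(req) once the server is up.
    print(req.full_url)
```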

Original Model Card:

RWKV v5.2 offers better (Claude-style) role play, with more logical and reasonable output, and can follow instructions.
